# [Official] NVIDIA GTX 1080 Owner's Club



## Zurv

Is the page to unlock the 1080 for >2-way SLI up yet?


----------



## OverK1LL

I haven't seen one yet, but I'd be willing to guess a lot of the discussion will happen here if my Haswell-E thread is any indication.

If it does, I will link it in the first post.


----------



## dpoverlord

Trying to figure out whether the EVGA ACX 3.0 is that much better than the Founders Edition. Currently the Founders Edition is $79 more.


----------



## traxtech

They use the same PCB; the only difference would be temps/sound. If you're putting it under water, either one; if air, definitely the ACX 3.0.

My Founders Edition EVGA 1080 will be here in a few days with an EK block/backplate on the way too, can't wait.


----------



## dpoverlord

Quote:


> Originally Posted by *traxtech*
> 
> They use the same PCB; the only difference would be temps/sound. If you're putting it under water, either one; if air, definitely the ACX 3.0.
> 
> My Founders Edition EVGA 1080 will be here in a few days with an EK block/backplate on the way too, can't wait.


With their site crashing from demand, I had no chance to see the differences until now. Like a bad consumer, I thought "higher price = better card." It's a $79 difference between the ACX 3.0 and the Founders Edition.

I gave them a call, and they said I would have to cancel my step-up (getting out of the queue) and try to get through the site again.

I guess I could just call them again once they approve the step-up and see if that helps?

What would you do? Just be happy I got through?

#BuyersRemorse


----------



## OverK1LL

Has anybody fired on the Founders Editions? I see stock fluctuating for the Asus ones on NewEgg. Looks like they had them in stock three different times.


----------



## OverK1LL

@westlake

Do you know how to get one of them fancy SLi Bridges for your cards?


----------



## westlake

@OverK1LL
I have no idea.








Soon (EVGA).
ASIC quality? GPU-Z 0.8.8 doesn't support it yet, but the test version of AIDA64 works fine.

Confidential for now. 92.5%.









Old SLI bridge:
http://prohardver.hu/dl/upc/2016-05/188652_sli_png_2.jpg
(not my pic)

ASUS GTX 1080 FE model number is *GTX1080-8G*.


----------



## TK421

Here's a guide to use EVGA Hybrid cooler with GTX 1080 FE/REF

http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look

Runs compared: both stock, OC'd CPU with stock GPU, and both OC'd:

http://www.3dmark.com/compare/fs/8597249/fs/8597350/fs/8597452


----------



## AllGamer

I'm still waiting for mine to arrive, sometime next week.

MSI GTX 1080 FE stock on air.


----------



## shilka

I am really split between the Asus Strix and the new Gigabyte Gaming GTX 1080 cards.
Going to wait for reviews of both before I make a final decision.

Not sure if I should upgrade my CPU/motherboard and RAM first and get video cards later, or upgrade the video cards first.
Is a non-overclocked 3820 going to hold one or two GTX 1080 cards back?

The only site in Denmark that has GTX 1080 cards is sold out and won't have cards in until 2-3 weeks from now.


----------



## TK421

Please add my hybrid install guide to main page http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


----------



## sherlock

I got one coming in two weeks. My 980 Ti Hybrid (bought for $739 in April, so I only had to pay ~$30 for shipping both ways) is on its way to EVGA for step-up to a GTX 1080 Founders Edition.


----------



## Zurv

Quote:


> Originally Posted by *OverK1LL*
> 
> Has anybody fired on the Founders Editions? I see stock fluctuating for the Asus ones on NewEgg. Looks like they had them in stock three different times.


I have 8 coming, 4 from Amazon and 4 from Newegg. The Newegg ones shipped already, but are coming next Friday.

The Amazon cards will be here on Tuesday (I pre-ordered last week).
The EK blocks are ordered, but still "Processing".

edit: yep, I took all your cards!


----------



## Maintenance Bot

Hope to join the club tomorrow if my card gets here.

For those who don't care for EVGA Precision X, there's a 1080-ready Afterburner beta here: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

A good write-up about it here: http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3.html

Quote:


> Originally Posted by *shilka*
> 
> I am really split between the Asus Strix and the new Gigabyte Gaming GTX 1080 cards.
> Going to wait for reviews of both before I make a final decision.
> 
> Not sure if I should upgrade my CPU/motherboard and RAM first and get video cards later, or upgrade the video cards first.
> Is a non-overclocked 3820 going to hold one or two GTX 1080 cards back?
> 
> The only site in Denmark that has GTX 1080 cards is sold out and won't have cards in until 2-3 weeks from now.


That's a tough decision. Get the card; any bottleneck from the CPU will probably be minimal, if there is one at all.


----------



## Maintenance Bot

Quote:


> Originally Posted by *OverK1LL*
> 
> Has anybody fired on the Founders Editions? I see stock fluctuating for the Asus ones on NewEgg. Looks like they had them in stock three different times.


Yeah I pulled the trigger on one as well.

Quote:


> Originally Posted by *Zurv*
> 
> I have 8 coming, 4 from Amazon and 4 from Newegg. The Newegg ones shipped already, but are coming next Friday.
> 
> The Amazon cards will be here on Tuesday (I pre-ordered last week).
> The EK blocks are ordered, but still "Processing".
> 
> edit: yep, I took all your cards!


You building a rig in every room?


----------



## OverK1LL

Quote:


> Originally Posted by *Zurv*
> 
> I have 8 coming, 4 from Amazon and 4 from Newegg. The Newegg ones shipped already, but are coming next Friday.
> 
> The Amazon cards will be here on Tuesday (I pre-ordered last week).
> The EK blocks are ordered, but still "Processing".
> 
> edit: yep, I took all your cards!


It sounds like you have something wicked planned! What are you doing with them?


----------



## Zurv

Quote:


> Originally Posted by *OverK1LL*
> 
> It sounds like you have something wicked planned! What are you doing with them?


I have two systems. Both will be running 4-way. I'm not too worried about "support" for SLI from NVIDIA. It isn't like they supported it that much in the past (and in the end they normally do if the engine supports it). Also, the "limits" to SLI have been focused on DX12 games (of which zero have SLI support now anyway).

I'm really looking forward to 4K G-Sync 32" monitors with DP 1.4 (hopefully OLED too). My second system is connected to a 65" 4K OLED with HDR.

I'll play around with the Fire Strike rankings too. For a bit with the 980s I was #1. My Titan build was in the top 5 for a long time.
http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1 (seems I'm still #10, but I think that will only last the next few days)


----------



## c0nsistent

Quote:


> Originally Posted by *Zurv*
> 
> I have two systems. Both will be running 4-way. I'm not too worried about "support" for SLI from NVIDIA. It isn't like they supported it that much in the past (and in the end they normally do if the engine supports it). Also, the "limits" to SLI have been focused on DX12 games (of which zero have SLI support now anyway).
> 
> I'm really looking forward to 4K G-Sync 32" monitors with DP 1.4 (hopefully OLED too). My second system is connected to a 65" 4K OLED with HDR.
> 
> I'll play around with the Fire Strike rankings too. For a bit with the 980s I was #1. My Titan build was in the top 5 for a long time.
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1 (seems I'm still #10, but I think that will only last the next few days)


I ordered 5 Titans at launch as well, but reality set in when I tried to play my favorite games in quad SLI. First I switched to tri-SLI and had SOME success, with good scaling in about 50% of the games I played. Then I switched to 2 cards, and for the most part the FPS I attained was within 10% or so, so I stayed there.

Even for 4K, I think two 1080s would suffice. Why not wait until the 1080 Ti or Titan (?) comes out?


----------



## TK421

The 3rd-ranked user on that list had a house fire incident after benching.


----------



## Jpmboy

Quote:


> Originally Posted by *c0nsistent*
> 
> I ordered 5 Titans at launch as well, but reality set in when I tried to play my favorite games in quad SLI. First I switched to tri-SLI and had SOME success, with good scaling in about 50% of the games I played. Then I switched to 2 cards, and for the most part the FPS I attained was within 10% or so, so I stayed there.
> 
> Even for 4K, I think two 1080s would suffice. Why not wait until the 1080 Ti or Titan (?) comes out?


Two 1080s, or even two Titan Xs for that matter, are plenty for 4K.


----------



## c0nsistent

Quote:


> Originally Posted by *Jpmboy*
> 
> Two 1080s, or even two Titan Xs for that matter, are plenty for 4K.


Yeah, I mean, I've seen a guy playing DOOM @ 4K with everything on ultra and never dipping under 60fps, with a SINGLE 1080 overclocked to 2.1GHz.


----------



## Hackslash

Any guesses what will be the best custom gtx 1080 in terms of overclocking and cooling?


----------



## OverK1LL

Quote:


> Originally Posted by *c0nsistent*
> 
> I ordered 5 Titans at launch as well, but reality set in when I tried to play my favorite games in quad SLI. First I switched to tri-SLI and had SOME success, with good scaling in about 50% of the games I played. Then I switched to 2 cards, and for the most part the FPS I attained was within 10% or so, so I stayed there.
> 
> Even for 4K, I think two 1080s would suffice. Why not wait until the 1080 Ti or Titan (?) comes out?


I did the same thing! I always used to roll tri SLi for every GPU iteration since the 8800 Ultras. Then I fired on all four for Quad SLi.

I had them running for about a week and decided to downgrade to Tri SLi and pop in a 512GB NVMe to use the last of my 40 lanes. Quad SLi was unplayable. Not surprised NVIDIA dropped official support for it.

These in SLi should work perfectly with the PG348Q. Excited to see how they work!


----------



## AllGamer

Quote:


> Originally Posted by *Zurv*
> 
> I have 8 coming, 4 from amazon and 4 from newegg. The newegg one's shipped already - but coming next week on Friday
> 
> 
> 
> 
> 
> 
> 
> The Amazon cards will be here on Tuesday (i pre-order last week).
> the EK blocks .. ordered but "Processing"
> 
> edit: ** yep, i took all your cards!


Since quad SLI is no longer officially supported, are you trying to do an octo-SLI build instead?


----------



## TK421

Quote:


> Originally Posted by *AllGamer*
> 
> Since Quad-SLI is no longer officially supported
> 
> Are you trying to do an Octo-SLI build instead ?


Stack ROG expanders on top of each other?


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> Two 1080s, or even two Titan Xs for that matter, are plenty for 4K.


Maybe two 1080s could cut it, but two Titan Xs don't. Sadly, I was planning to have only two in my HTPC to play games on my 4K TV. Games like Tomb Raider, The Division, and The Witcher 3 were sub-60fps with only 2. (Yes, not counting the times when NVIDIA broke 3-4 way SLI by not putting traffic over the PCIe bus, but they fixed that a few months ago.)

Of course, playing at 4K and playing at 4K with everything maxed are different things. For me, playing at 4K means all the sparkly stuff on.


----------



## Jpmboy

Quote:


> Originally Posted by *TK421*
> 
> Stack ROG expanders on top of each other?


either BS or binning cards.








Quote:


> Originally Posted by *Zurv*
> 
> Maybe two 1080s could cut it, but two Titan Xs don't. Sadly, I was planning to have only two in my HTPC to play games on my 4K TV. Games like Tomb Raider, The Division, and The Witcher 3 were sub-60fps with only 2. (Yes, not counting the times when NVIDIA broke 3-4 way SLI by not putting traffic over the PCIe bus, but they fixed that a few months ago.)
> 
> Of course, playing at 4K and playing at 4K with everything maxed are different things. For me, playing at 4K means all the sparkly stuff on.


Yeah, I have 2 4K monitors (one for over 2 years, piggybacked on a skid from China, and a Samsung). 2 water-cooled overclocked TXs (1545/8100) have not failed me or the kids in anything so far. Games are the COD series, BF4, and a few other titles. Sometimes driver "upgrades" are not... upgrades.









Right now, the 1080 is the strongest card available (though if we're talking single-card solutions, a 295X2 still keeps up very well).

I should have one here mid next week... no shipping over the holiday.


----------



## Zurv

Quote:


> Originally Posted by *OverK1LL*
> 
> I did the same thing! I always used to roll tri SLi for every GPU iteration since the 8800 Ultras. Then I fired on all four for Quad SLi.
> 
> I had them running for about a week and decided to downgrade to Tri SLi and pop in a 512GB NVMe to use the last of my 40 lanes. Quad SLi was unplayable. Not surprised NVIDIA dropped official support for it.
> 
> These in SLi should work perfectly with the PG348Q. Excited to see how they work!


I'm kinda in the same boat. I have plans to run 4-way for a bit, but if it doesn't go super well I think I might drop to 3-way with the 1080s and put an Intel 750 NVMe 1.2TB PCIe card in. In fact, that is what my systems are running now (all NVMe; the other drive is a Samsung 950 Pro). One of my Titan Xs broke and I had to RMA it, and I never put it back in the system (ugh, I hate redoing the water!)

If that is the case, my brother is going to get a sweet upgrade to his PC (WHY DOES HE STILL PLAY ON THE XBOX WHEN I GAVE HIM A SWEET PC!)


----------



## mypickaxe

Quote:


> Originally Posted by *Zurv*
> 
> (WHY DOES HE STILL PLAY ON THE XBOX WHEN I GAVE HIM A SWEET PC!)


It's ok to do both. Some games, especially for the online multiplayer aspect, are more robust in the XBOX ecosystem.


----------



## bfedorov11

Quote:


> Originally Posted by *Zurv*
> 
> Maybe two 1080s could cut it, but two Titan Xs don't. Sadly, I was planning to have only two in my HTPC to play games on my 4K TV. Games like Tomb Raider, The Division, and The Witcher 3 were sub-60fps with only 2. (Yes, not counting the times when NVIDIA broke 3-4 way SLI by not putting traffic over the PCIe bus, but they fixed that a few months ago.)
> 
> Of course, playing at 4K and playing at 4K with everything maxed are different things. For me, playing at 4K means all the sparkly stuff on.


I had 2 TXs and they were fine for 4K 60fps. I could max pretty much every game. Some games, like Syndicate, I had to drop AA settings about halfway down, but everything else was maxed. GTA V, you name it. They were also under water and game-stable at ~1525.

Just ordered an EVGA from Newegg. I didn't want to wait for all the AIB reviews. I honestly don't think the better coolers will make a huge difference anyway. The problem is the die size/surface area, not lack of cooling. Scaling seems meh anyway.

Anyone get tracking from Newegg? I got 2 emails, but the order doesn't show up in my account.


----------



## TK421

OK, ran into a problem.

The card is being held back by the vRel limit; seems to be a VRM temp issue.

Another thing is that the card intermittently ramps the blower fan up and down for very short periods, IDK why.


----------



## Zurv

Quote:


> Originally Posted by *bfedorov11*
> 
> I had 2 TX and they were fine for 4k 60 fps. I could max pretty much every game. Some games, like syndicate, I had to drop AA settings about half way down, but everything else was maxed. GTA5, you name it. They were also under water and game stable at ~1525.
> 
> Just ordered an evga from newegg. I didn't want to wait for all the aib reviews. I honestly don't think the better coolers will make a huge difference anyway. The problem is the die size/surface area, not lack of cooling. Scaling seems meh anyway.
> 
> Anyone get tracking from newegg? I got 2 emails, but the order doesn't show up in my account.




I got tracking from Newegg: 2 Asus and 2 EVGA cards. Both will come Friday, but I picked slow shipping. Tracking took hours to show up.


----------



## Seyumi

Quote:


> Originally Posted by *Zurv*
> 
> I have 8 coming, 4 from Amazon and 4 from Newegg. The Newegg ones shipped already, but are coming next Friday.
> 
> The Amazon cards will be here on Tuesday (I pre-ordered last week).
> The EK blocks are ordered, but still "Processing".
> 
> edit: yep, I took all your cards!


Will you please get us some quick 3-way and 4-way benchmarks at 4K? (Minimum FPS mostly, and games mostly as well; I don't give a **** about synthetic benchmarks.) We have very similar systems (4 Titan Xs on water and a 5960X), and I've been itching to downgrade to a 2-way 1080 Z170 setup ever since the whole "only 2-way SLI officially supported" announcement. My criterion is being able to play modern games with max settings without ever dropping below 60FPS (nothing ridiculous like 8x MSAA, but 2-4x is fine). If two 1080s can pull this off, then I'd love to save myself some thousands of dollars. My decision literally rests on people like you who are getting more than 2 cards. Thanks!


----------



## Zurv

Quote:


> Originally Posted by *Seyumi*
> 
> Will you please get us some quick 3-way and 4-way benchmarks at 4K? (Minimum FPS mostly, and games mostly as well; I don't give a **** about synthetic benchmarks.) We have very similar systems (4 Titan Xs on water and a 5960X), and I've been itching to downgrade to a 2-way 1080 Z170 setup ever since the whole "only 2-way SLI officially supported" announcement. My criterion is being able to play modern games with max settings without ever dropping below 60FPS. If two 1080s can pull this off, then I'd love to save myself some thousands of dollars. My decision literally rests on people like you who are getting more than 2 cards. Thanks!


I won't have EK blocks till mid next week. Also, NVIDIA hasn't put up the site where we can unlock SLI (unless I missed something), and sadly I can't close my loop without the video cards in the mix.

But I'll test whatever you like. Just note, the limits they talk about are for DX12 games, not DX11. So any DX11 testing (which will be all the games, as no DX12 games support SLI) might look better than what we'll see in the future.

I'm sure 2 1080s will do fine, but maybe wait for the EVGA Classified if you want more voltage options. I don't plan on running more than 2GHz for everyday use, so the normal 1080 under water should do the job.


----------



## Seyumi

Quote:


> Originally Posted by *Zurv*
> 
> I won't have EK blocks till mid next week. Also, NVIDIA hasn't put up the site where we can unlock SLI (unless I missed something), and sadly I can't close my loop without the video cards in the mix.
> 
> But I'll test whatever you like. Just note, the limits they talk about are for DX12 games, not DX11. So any DX11 testing (which will be all the games, as no DX12 games support SLI) might look better than what we'll see in the future.
> 
> I'm sure 2 1080s will do fine, but maybe wait for the EVGA Classified if you want more voltage options. I don't plan on running more than 2GHz for everyday use, so the normal 1080 under water should do the job.


Appreciate it. I'm hoping two 1080s can pull it off, but I'm kind of doubting it. I think we'll still need to go 3-way SLI until the Titan replacement drops; then I'm confident two of those can maintain 4K 60 FPS. Still no 120Hz 4K monitors anywhere in sight; we'll definitely need 3-4 Pascal Titans to pull that off.


----------



## DADDYDC650

I'll be in this club in a few weeks. A custom 1080 is the way to go. Why spend more money for less?

BTW, the Gainward GeForce GTX 1080 Phoenix has awesome box art. Reminds me of the old-school artwork on gfx cards.


----------



## OverK1LL

What's it take to get an "Official" slapped on the front of this here thread?


----------



## Jpmboy

Quote:


> Originally Posted by *OverK1LL*
> 
> What's it take to get an "Official" slapped on the front of this here thread?


Need to have a Mod make it an [Official] thread AFAIK... or just call it that and see.

We need to get our hands on a Pascal BIOS tweaker. Running the Titan X at stock voltage is fine, but a BIOS tweaked to 1.274V really unleashed the TX...


----------



## steveTA1983

Quote:


> Originally Posted by *Jpmboy*
> 
> Need to have a Mod make it an [Official] thread AFAIK... or just call it that and see.
> 
> We need to get our hands on a Pascal BIOS tweaker. Running the Titan X at stock voltage is fine, but a BIOS tweaked to 1.274V really unleashed the TX...


Is that 1.274V BIOS OK if your temps are in check and you have passive cooling on the RAM on the back of the card? (I have copper heat sinks on all of them and a fan lying on top; it gets nice and cold.) I'm on a stock cooler too.


----------



## Silent Scone

Picked up one, but missed the rush. Hoping to have it by Thursday!

Quote:


> Originally Posted by *c0nsistent*
> 
> Yeah I mean I've seen a guy playing DOOM @ 4k with everything on ultra, and never dipping under 60 fps with a SINGLE 1080 overclocked to 2.1ghz.


As nice as that would be, you'll see around 40fps quite a lot. I'll let you off though, as the hype train has barely left the station.


----------



## shilka

I was really looking forward to the new Gigabyte G1 cards, and now I am not.
http://www.tweaktown.com/news/52296/gigabyte-announces-geforce-gtx-1080-g1-gaming-video-card/index.html

I think that card is ugly as sin, so that's a pass from me.
Think I am going to end up with either the Asus Strix, or maybe I will try EVGA for the first time.


----------



## MrDerrikk

To be honest I was going to go with the Gigabyte G1 too, but the single 8-pin has turned me off. Might have to get the Xtreme or another brand (gasp!).


----------



## Hackslash

Quote:


> Originally Posted by *MrDerrikk*
> 
> To be honest I was going to go with the Gigabyte G1 too, but the single 8-pin has turned me off. Might have to get the Xtreme or another brand (gasp!).


Same here...

I fear that all 1x 8-pin boards are reference boards, and I want a higher power target.


----------



## r0l4n

I received the eVGA FE today, and the fan seems to be acting up. During load, it ramps up, I'd say randomly, for a couple of seconds and goes back to normal. Anybody else experiencing this? See the spikes below.


----------



## shilka

Quote:


> Originally Posted by *MrDerrikk*
> 
> To be honest I was going to go with the Gigabyte G1 too, but the single 8-pin has turned me off. Might have to get the Xtreme or another brand (gasp!).


Forgot about the Xtreme, so I really, really hope they don't put freaking orange on that card!


----------



## MrDerrikk

Quote:


> Originally Posted by *shilka*
> 
> Forgot about the Xtreme so i really really hope they dont put freaking orange on that card!


It looks like they'll go for the black/RGB style with it, at least I'm hoping so as I need it in blue.


----------



## shilka

Quote:


> Originally Posted by *MrDerrikk*
> 
> It looks like they'll go for the black/RGB style with it, at least I'm hoping so as I need it in blue.


Think I am going to push getting GTX 1080 cards back to autumn and upgrade my motherboard, CPU, and RAM first.
All the cards should be out by then, prices should be lower, and drivers should be better.

As always, I am going to do a mega review of the cards once I get them.
But since I am not made of money, I need to save up for 2-3 months before I have enough for two cards.

Anyway, I am going to wait for reviews before I pick the card that I want.
The Gigabyte G1 is out, as it's way too ugly for my taste.


----------



## Clockster

I7 [email protected], Gigabyte GTX1080FE (+180Core, +575 mem)





http://www.3dmark.com/3dm/12204180?

http://www.3dmark.com/3dm/12204127?

http://www.3dmark.com/fs/8599032


----------



## zetoor85

So you're playing poker with NVIDIA? Even when they tell you NOT to go 4-way, you still do it? Okaaaay....

NVIDIA states themselves that 3-way/4-way is a no-no. You can do it, but NVIDIA won't bother with driver support; it's the game devs that have to come up with it. With this series it's 2 cards in SLI; anything over that you won't find very useful for 99% of the games that will be released.


----------



## r0l4n

On stock voltage, it seems the one I got is stable at +182 core and +602 mem. At +200 core I start getting artifacts in Doom. I maxed out the power limit and temp limit (120%/92C), and temps stay around 85C (with 20C ambient). Will soon start playing around with the overvoltage offset and see where it takes me.


----------



## traxtech

So excited for the custom BIOS gods to get working on these; hoping the chip itself isn't the limiting factor.


----------



## Jpmboy

Quote:


> Originally Posted by *steveTA1983*
> 
> Is that 1.274V BIOS OK if your temps are in check and you have passive cooling on the RAM on the back of the card? (I have copper heat sinks on all of them and a fan lying on top; it gets nice and cold.) I'm on a stock cooler too.


I've been running that BIOS since the cards launched (well, within a week) using EK blocks. Cards are still running strong.
You can get it *here*.
Quote:


> Originally Posted by *Silent Scone*
> 
> Picked up one, but missed the rush. Hoping to have it by Thursday!
> As nice as that would be, you'll see around 40fps quite a lot. I'll let you off though as the hype train has barely left the station


Nice!! Yeah - just one for me too right now. I'll put a uniblock on it to play around. Really want a GP100 card (or 2).


----------



## westlake

@r0l4n
Yes. This is really annoying. (2 EVGA FE.)


----------



## r0l4n

Quote:


> Originally Posted by *westlake*
> 
> @r0l4n
> Yes. This is really annoying. (2 EVGA FE.)


A BIOS update from EVGA will hopefully fix it.


----------



## Hackslash

What will be faster: ordering from EVGA directly, or ordering from an e-tailer?


----------



## r0l4n

Quote:


> Originally Posted by *traxtech*
> 
> So excited for the custom BIOS gods to get working on these; hoping the chip itself isn't the limiting factor.


The overvoltage percentage is not doing much, tbh; even cranking it up to 100% doesn't seem to add more than a few mV, resulting in perhaps a 2 or 3 bin increase. I reckon a custom BIOS will do wonders.
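For rough scale on what "2 or 3 bins" means: Pascal's GPU Boost moves clocks in discrete steps, and ~13 MHz per bin is the commonly reported granularity (treat that figure as an assumption, not an official spec). A quick sketch of the arithmetic:

```python
# Estimate the clock gain from a given number of boost bins on Pascal.
# BIN_MHZ is the commonly reported ~13 MHz step size; not an official figure.
BIN_MHZ = 13

def clock_gain_mhz(bins: int) -> int:
    """MHz gained by moving up `bins` boost steps."""
    return bins * BIN_MHZ

print([clock_gain_mhz(b) for b in (2, 3)])  # -> [26, 39]
```

So even maxing the slider buys well under 50 MHz here, which matches the "not doing much" impression.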


----------



## Jpmboy

Hey guys.... if you have your cards, run some of the OCN benchmarks. The datasets are helpful in many ways:
http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30


----------



## Clockster

@Jpmboy I posted some of my benches on the previous page (Post#50).
I suspect you want the full details though?

If I get some time I'll rerun and do it the proper way.


----------



## Jpmboy

Quote:


> Originally Posted by *Clockster*
> 
> @Jpmboy I posted some of my benches on the previous page (Post#50).
> I suspect you want the full details though?
> 
> If I get some time I'll rerun and do it the proper way.


Thanks bro,
Yeah, the OCN threads require a specific (default) setting so that the data are all comparable. In aggregate, it really helps in understanding the different generations and improvements over time.


----------



## r0l4n

Fire Strike (graphics score):
23305 (http://www.3dmark.com/fs/8604008)

Fire Strike Extreme (graphics score):
11025 (http://www.3dmark.com/fs/8604122)

Fire Strike Ultra (graphics score):
5486 (http://www.3dmark.com/3dm/12210698)

Max "stable" on stock overvoltage:
OV 0%
Core +182 (1975-2050 MHz)
Mem +602 (11200 MHz effective)
PL 120%
TL 92C
Fan 100%
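For context on how a +602 memory offset lines up with the ~11200 MHz reading: the reference GTX 1080's GDDR5X runs at an effective 10008 MHz, and the overclocking tools appear to count the offset twice toward the effective data rate. A minimal sketch of that arithmetic (the 2x multiplier is an assumption about how the tool reports it, not a spec):

```python
# Rough mapping from an Afterburner-style memory offset (MHz) to the
# effective GDDR5X data rate on a reference GTX 1080. Assumption: the
# tool applies the offset to the double-data-rate clock, so it counts
# twice toward the effective rate shown by monitoring software.
STOCK_EFFECTIVE_MHZ = 10008  # reference GTX 1080 GDDR5X data rate

def effective_mem_clock(offset_mhz: int) -> int:
    """Estimate the effective memory data rate for a given offset."""
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_mem_clock(602))  # -> 11212, close to the ~11200 reported above
```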


----------



## Jpmboy

Quote:


> Originally Posted by *r0l4n*
> 
> Fire Strike (graphics score):
> 23305 (http://www.3dmark.com/fs/8604008)
> 
> Max "stable" on stock overvoltage:
> ov 0%
> core +182 (1975-2050mhz)
> mem +602 (11200mhz)
> pl 120%
> tl 92C
> fan 100%


Fire Strike is the most CPU-dependent of the three, Extreme less so, and Ultra the least. A 2700K is at the top of the chart for single-card FSU.


----------



## r0l4n

Quote:


> Originally Posted by *Jpmboy*
> 
> Firestrike is the most CPU dependent of the three. Extreme less and Ultra is the least CPU dependent. A 2700K is at the top of the chart for single card FSU.


I'll be updating the post as I run them


----------



## Clockster

Quote:


> Originally Posted by *Jpmboy*
> 
> Thanks bro,
> Yeah - the OCN threads require a specific (default) setting so that the data are all comparable. In aggregate, It really helps understanding the different generations and improvements over time.


OK, sweet mate.

I'll be getting a Gigabyte GTX 1080 Xtreme Gaming in the next 2 weeks or so; will add those results as well.


----------



## zetoor85

So you guys are doing 23K FS already? That's Titan X domain. And at only 2000MHz? That's pretty cool.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *zetoor85*
> 
> So you guys are doing 23K FS already? That's Titan X domain. And at only 2000MHz? That's pretty cool.


Notice r0l4n specified that the score was the "Graphics Score". Still, his 18K FS raw score is right in 980 Ti/TX territory!


----------



## zetoor85

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> Notice r0l4n specified that the score was "Graphics Score". Still, his 18K FS raw score is right in 980Ti/ TX territory!


Hello sir, I'm looking at the GS only... but yes, spot on.


----------



## bfedorov11

Quote:


> Originally Posted by *r0l4n*
> 
> I received the eVGA FE today, and the fan seems to be acting up. During load, it ramps up, I'd say randomly, for a couple of seconds and goes back to normal. Anybody else experiencing this? See the spikes below.


How bad is it? Can you hear it? Someone else just mentioned it too. That is unacceptable for a $700 card. I have an EVGA FE coming next week.


----------



## r0l4n

Quote:


> Originally Posted by *bfedorov11*
> 
> How bad is it? Can you hear it? Someone else just mentioned it too. That is unacceptable for a $700 card. I have an evga fe coming next week.


You can definitely hear it. It happens as well when the GPU is at rest and suddenly gets load: it spikes for a couple of seconds and then gets back in check. This happens even at idle temps.

I agree, this should not happen, but it's not so annoying as to RMA the card (personal opinion). I expect they'll address it in a future vBIOS.


----------



## Jpmboy

Group hunt for PASCAL BIOS TWEAKER


----------



## bfedorov11

Quote:


> Originally Posted by *r0l4n*
> 
> You can definitely hear it. Happens as well when the gpu is at rest but gets load suddenly, it spikes for a couple of seconds and gets back in check. This happens even at idle temps.
> 
> I agree, this should not happen, but it's not so annoying as to RMA the card (this is a personal opinion). I expect they'll address this in a future vBios.


Basically when you're not gaming. Does it happen with Precision X or Afterburner open? What happens if you make a fan curve similar to stock? Maybe the polling rate is too high?

They would never release that directly to the public though. Can't ask people to risk bricking their card. I'm tempted to cancel my order since it doesn't ship till Monday.


----------



## bfedorov11

Quote:


> Originally Posted by *TK421*
> 
> Another thing is that the card intermittently ramps the blower fan up and down for a very short period of time, idk why.


What brand card?


----------



## sebasm

I can't speak about the 1080, but my travel laptop (Lenovo Thinkpad ..something) has the same hair dryer issue with its cooler. Every few minutes it just spikes up for no reason, keeps spinning hard for a while (20+ seconds), then goes back to normal RPM. It is supremely annoying and Lenovo has refused to fix or even acknowledge it for years now.

Hopefully this will not be the case with the 1080. Keeping my fingers crossed!


----------



## r0l4n

Quote:


> Originally Posted by *bfedorov11*
> 
> Basically when your not gaming. Does it happen with precision x or afterburner open? What happens if you make a similar to stock fan curve? Maybe the polling rate is too high?
> 
> They would never release that directly to the public though. Can't ask people to risk bricking their card. I'm tempted to cancel my order since it doesn't ship till Monday.


It only happens when you start gaming. On idle, and after a few minutes of gaming, the fan is stable. The problem shows itself during the first moments, when the fan will spike up by itself and back down again (then follow the fan curve normally). Setting up a custom fan curve doesn't help, it still spikes up a few times before it reaches a stable temperature (since you asked, polling rate is 2s with 2C hysteresis).

It's an EVGA FE.
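For the curious, the poll-plus-hysteresis behaviour described above can be sketched in a few lines. This is a toy model, not the card's actual firmware logic; the 1:1 curve and the 30/100% clamp are assumptions, only the 2-degree hysteresis band comes from the post:

```python
def make_fan_controller(curve, hysteresis_c=2.0):
    """Toy model of a polled fan controller with temperature hysteresis:
    the fan target only moves once the GPU temperature has drifted more
    than hysteresis_c degrees since the last adjustment."""
    state = {"temp": None, "fan": None}

    def poll(temp_c):
        if state["temp"] is None or abs(temp_c - state["temp"]) >= hysteresis_c:
            state["temp"] = temp_c
            state["fan"] = curve(temp_c)
        return state["fan"]

    return poll

# Illustrative curve: fan % tracks temp 1:1, clamped between 30% and 100%
poll = make_fan_controller(lambda t: min(100, max(30, t)))
```

Small temperature wobbles inside the hysteresis band leave the fan speed alone, which is exactly why a revving fan despite a smooth custom curve points at firmware behaviour rather than the curve itself.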


----------



## dVeLoPe

thinking of stepping up an 84% 980 Ti Classified that OCs like a beast for a 1080 ACX 3.0, is this a stupid move?

i have until June 7th for my 90 day thing, so i was gonna wait it out a few more days to see if they add the FTW etc to their list


----------



## Vellinious

Quote:


> Originally Posted by *dVeLoPe*
> 
> thinking of stepping up a 84% 980ti classified that oc like a beast for a 1080 acx 3.0 is this a stupid move?
> 
> i have until june 7th for my 90 day thing so i was gonna wait it out a few more days to see if they add FTW etc to their list


The FTW isn't expected to release until about then....I doubt they add it to the step up program until late June. Still...worth waiting just in case.


----------



## iARDAs

Congrats to all 1080 owners. I hope you guys are very happy with your purchase. Those of you who upgraded from a few generations back, I can feel your enthusiasm all the way here in Turkey









I have one question for a specific group: those who gamed on a single 980 Ti before. How much did your experience change after upgrading to the 1080? It would be great to hear it from people who tried both GPUs.

Once again people, enjoy your new GPUs


----------



## romanlegion13th

when will we see real benches with a 1080 AND a Titan X, both at stock and both with an OC?


----------



## gamingarena

Quote:


> Originally Posted by *romanlegion13th*
> 
> when will we see real benches with 1080 AND a Titan X? both at stock and both with a OC?


Soon


----------



## Zurv

note on 3 and 4 way SLI
Quote:


> Hello,
> 
> Thank you for contacting NVIDIA Customer care.
> 
> This is a follow-up email in reference to your contact to NVIDIA. Regarding the 3 and 4 Way SLI enabling option for the latest GeForce GTX 1080 card.
> 
> Currently the link to download the application to enable 3 and 4 Way SLI is not available yet, We would request you to wait for sometime till the website is active.
> 
> Best regards
> Rajath,
> NVIDIA Customer Care


blah!


----------



## bfedorov11

Quote:


> Originally Posted by *r0l4n*
> 
> It only happens when you start gaming. On idle, and after a few minutes of gaming, the fan is stable. The problem shows itself during the first moments, when the fan will spike up by itself and down again (to follow the fan curve normally). Setting up a custom fan curve doesn't help, it still spikes up a few times before it reaches a stable temperature (since you asked, poll rating is 2s with 2C hysteresis).
> 
> It's an EVGA FE.


Doesn't seem too bad then. As long as it isn't constantly doing it while using a web browser or something other than gaming. If I stick to the stock heatsink, I normally manually set the fan to ~70% anyway.

Is the fan as loud as the old reference blower? Some reviews said it isn't as bad as the old one.


----------



## r0l4n

Quote:


> Originally Posted by *bfedorov11*
> 
> Doesn't seem too bad then. As long as it isn't constantly doing it while using a web browser or something other than gaming. If I stick to the stock heatsink, I normally manually set the fan to ~70% anyway.
> 
> Is the fan as loud as the old reference blower? Some reviews said it isn't as bad as the old one.


No, it's not that bad. The new blower feels slightly quieter, but don't quote me on this, I've only had the previous reference blower for a few days.


----------



## Clockster

Quote:


> Originally Posted by *iARDAs*
> 
> Congrats to all 1080 owners. I hope you guys are very happy with your purchase. Those of you who upgraded from few generations back I can feel your enthusiasm all the way here in Turkey
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to a specific group. Those who gamed at a single 980ti before. How much did your experience change after upgrading to 1080? It would be great to hear it from people who tried both GPUs..
> 
> Once again people, enjoy your new GPUs


No difference from what I can see (@1440p), I reckon we'll have to wait for the 1080 Ti to see a noticeable difference.


----------



## iARDAs

Quote:


> Originally Posted by *Clockster*
> 
> No difference from what I can see(@1440P), I reckon we'll have to wait for the 1080Ti to see a noticeable difference.


Keep me posted my friend. Maybe future drivers can take things further


----------



## romanlegion13th

Quote:


> Originally Posted by *iARDAs*
> 
> Keep me posted my friend. Maybe future drivers can take things further


Means they don't update Maxwell drivers to make the card faster, only the 1080's.

So really it's not much faster then, unless they manipulate the drivers


----------



## westlake

@Jpmboy
And where is the nvflash?


----------



## TommyHere

Quote:


> Originally Posted by *iARDAs*
> 
> Congrats to all 1080 owners. I hope you guys are very happy with your purchase. Those of you who upgraded from few generations back I can feel your enthusiasm all the way here in Turkey
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to a specific group. Those who gamed at a single 980ti before. How much did your experience change after upgrading to 1080? It would be great to hear it from people who tried both GPUs..
> 
> Once again people, enjoy your new GPUs


I've got an EVGA 1080 FTW pre-ordered, will see what clocks I can get out of it, hoping for a solid 2100MHz. I've owned a 980 overclocked to 1600MHz, a 780 Ti at 1250MHz, a 970, and a 980 Ti at 1500MHz, so I'll be able to compare for you. Since I'm playing at 4K I'm willing to bet I'll see a difference, since any performance uplift will be welcome at that resolution.


----------



## c0nsistent

Quote:


> Originally Posted by *Silent Scone*
> 
> Picked up one, but missed the rush. Hoping to have it by Thursday!
> As nice as that would be, you'll see around 40fps quite a lot. I'll let you off though as the hype train has barely left the station


Well, it was documented on video so....


----------



## iARDAs

Quote:


> Originally Posted by *TommyHere*
> 
> I've got a 1080 FTW EVGA pre-ordered, will see what clocks I can get out of it, hoping for a solid 2100mhz, I've owned a 980 overclocked to 1600mhz, 780 Ti to 1250mhz, 970, 980 Ti 1500mhz so I'll be able to compare for you, since I'm playing at 4k I'm willing to bet I'll see a difference since any performance uplift will be welcome at that resolution


You are my man. I am also a 4K gamer and have a 980 Ti with an OC similar to yours









Keep me posted.


----------



## Cyclops

Got a BIOS for me?


----------



## Naennon

some different BIOSes are needed for the Pascal BIOS editor (OC versions and FE/ref BIOS)

don't know if GPU-Z / nvflash can read those by now


----------



## almrender

Is anyone interested in comparing their GTX1080 against our Titan X? We already ran some 3d rendering benchmarks on Titan and wanted to run the same on your GTX 1080. Will probably take 1-2 hours of your time and we are happy to pay $20-50 for this. Message me if interested.


----------



## westlake

Quote:


> Originally Posted by *almrender*
> 
> Is anyone interested in comparing their GTX1080 against our Titan X? We already ran some 3d rendering benchmarks on Titan and wanted to run the same on your GTX 1080. Will probably take 1-2 hours of your time and we are happy to pay $20-50 for this. Message me if interested.


@almrender
I have:
GTX 970*2
GTX 980*2
GTX 980 Ti*2
GTX TX*2
GTX 1080*2

Where are the benchmarks?


----------



## looniam

Quote:


> Originally Posted by *Cyclops*
> 
> Got a BIOS for me?


^somebody GIVE that man a bios!

btw, might be worth a read:
http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,1.html


----------



## westlake

Comparison of custom GeForce GTX 1080 PCBs
What a list!


----------



## almrender

Quote:


> Originally Posted by *westlake*
> 
> Comparison of custom GeForce GTX 1080 PCBs
> What a list!


Quote:


> Originally Posted by *westlake*
> 
> @almrender
> I have:
> GTX 970*2
> GTX 980*2
> GTX 980 Ti*2
> GTX TX*2
> GTX 1080*2
> 
> Where is the benchmarks.


here are the instructions: https://docs.google.com/document/d/1C_VV89hgAxQ05cVgJ-N-Ib3_ZBG4Ei_y1f4vtTZS1Nk/edit

can you PM me your contact details so I can help guide you through the tests?


----------



## TK421

http://www.3dmark.com/fs/8611073

19144 score

as is regular NVIDIA fashion, adding voltage will sometimes harm performance without a custom BIOS


----------



## MrTOOSHORT

Quote:


> Originally Posted by *TK421*
> 
> http://www.3dmark.com/fs/8611073
> 
> 19144 score
> 
> as in regular nvidia fashion, adding voltage will sometimes harm performance without custom bios


Nice, that's like a 1550Mhz Titan-X for gpu score. Considering how much the 1080 just sips power, that's fantastic.


----------



## TK421

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Nice, that's like a 1550Mhz Titan-X for gpu score. Considering how much the 1080 just sips power, that's fantastic.


Yep yep, with an unlocked BIOS, 20K+ Fire Strike is possible!


----------



## dVeLoPe

count me in for an EVGA 1080 ACX whenever step up decides my turn is up lol


----------



## MrTOOSHORT

Quote:


> Originally Posted by *dVeLoPe*
> 
> count me in for a evga 1080 acx whenever step up decides my turn is up lol


I saw your question earlier about the step up; I think I'd just stick with your golden 980 Ti Classy. That's my vote.


----------



## TK421

For daily driver
+25mv
+170
+500

until modded BIOSes show up

@Cyclops - GPU-Z doesn't support BIOS extract


----------



## dVeLoPe

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I seen your question earlier about the trade up, I think I'd just stick with your golden 980 TI Classy. That's my vote.


why would you say this though?

i believe 1500ish was its max oc on air, as more MHz needed more volts, which = more hots

even a stock 1080 will smash a fully overclocked 980 Ti, right?? my best scores for comparison

http://www.3dmark.com/fs/7946701

Fire Strike

http://www.3dmark.com/3dm11/11094940

3DMark 11

look at gfx score only as the cpu is very old (P55 LOL) and is ready to be put into HTPC mode when the 1080 arrives for X99 to come out and play =b


----------



## rootmoto

Quote:


> Originally Posted by *TK421*
> 
> For daily driver
> +25mv
> +170
> +500
> 
> until mod bios bios show up
> 
> @Cyclops - GPUz not support bios extract


Try using JoeDirt's 5.278 nvflash and using the command nvflash --save GP104.rom
http://s000.tinyupload.com/index.php?file_id=06252844614392607306
Nvflash has had support for Pascal since 5.265


----------



## TK421

Quote:


> Originally Posted by *rootmoto*
> 
> Try using JoeDirt's 5.278 nvflash and using the command nvflash --save GP104.rom
> http://s000.tinyupload.com/index.php?file_id=06252844614392607306
> Nvflash has had support for Pascal since 5.265


tinyupload doesn't work for me, got mega?


----------



## rootmoto

Quote:


> Originally Posted by *TK421*
> 
> tinyupload doesn't work for me, got mega?


Download it here: https://drive.google.com/file/d/0B-dolA3yEsLWYzBUbURORkdXa2s/view


----------



## MrTOOSHORT

Quote:


> Originally Posted by *dVeLoPe*
> 
> why would you say this though?
> 
> i believe 1500ish was its max oc on air as more mhz needed more volts which = more hots
> 
> even a stock 1080 will smash a fully overclocked 1080 right?? my best score for comparasion
> 
> http://www.3dmark.com/fs/7946701
> 
> first strike
> 
> http://www.3dmark.com/3dm11/11094940
> 
> 3dm
> 
> look at gfx score only as the cpu is very old (p55 LOL) and is ready to be put into HTPC mode when the 1080 arries for X99 to come out and play =b


Couple reasons. You have a good card. You have to ship yours in for the step up and pay the difference, and you can only step up to the single 8-pin 1080, not the FTW (not sure if this is still the case).

But I'd also water cool your 980 Ti; since you're on air and staying on air, maybe switch.

If you had a better CPU, your GPU score would be better in 3DMark. In reality, your 980 Ti isn't far off from a GTX 1080.


----------



## TK421

Quote:


> Originally Posted by *rootmoto*
> 
> Download it here: https://drive.google.com/file/d/0B-dolA3yEsLWYzBUbURORkdXa2s/view


CD C:\nvflash

nvflash --save 1080fe.rom

error: no nvidia display adapters found


----------



## rootmoto

Quote:


> Originally Posted by *TK421*
> 
> CD C:\nvflash
> 
> nvflash --save 1080fe.rom
> 
> error: no nvidia display adapters found


Try the unmodded one and see if it works: http://www.techpowerup.com/downloads/2678/nvflash-5-278-0-for-windows If not, it looks like we'll have to wait for the next version of nvflash to be released.


----------



## TK421

Quote:


> Originally Posted by *rootmoto*
> 
> Try the unmodded one here and see if it works. Here it is: http://www.techpowerup.com/downloads/2678/nvflash-5-278-0-for-windows Looks like we'll have to wait for the next version of Nvflash to be released then


Same thing, GPU enabled or disabled through device manager.


----------



## Jpmboy

Quote:


> Originally Posted by *TK421*
> 
> CD C:\nvflash
> 
> nvflash --save 1080fe.rom
> 
> error: no nvidia display adapters found


Open device manager, disable the video driver, then issue the save command. You must open the command prompt in the folder with the flash exe... for Win10 it's under the File menu; on Win7, right-click - open command prompt here...


----------



## TK421

Quote:


> Originally Posted by *Jpmboy*
> 
> Open device manager, disable the video driver, then issue the save command. You must openthe cmnd prompt in the folder with the flash exe... For win10 its under the file menu, win 7, rt click - open cmnd promot here...


exactly what I did with the 1080 disabled/enabled in device manager, doesn't seem to make a difference


----------



## Clockster

@Jpmboy

For the life of me I can't get GPU-Z to show me my overclocked clocks, it just shows stock.

So for now I'll have to post here.

i7 5930K @ 4.6Ghz Gigabyte GTX1080 FE @2101 Boost(+210)/ 5468 Mem (+460)

MSI Afterburner 4.3.0 Beta 3
Core V +50, Power Limit 120%, Temp Limit 92c, Fan speed 100%.
I didn't see max temp during these runs, I did however see them during my heaven run @1440P and max was 63c, exact same clocks.

20 357
http://www.3dmark.com/fs/8613682

10861
http://www.3dmark.com/fs/8613770

5798
http://www.3dmark.com/fs/8613612


----------



## r0l4n

I get the same results from nvflash, it's not able to read the vbios from the 1080. Even when specifying where to read it from.

Code:





nvflash.exe --save --pcisegbus 00:0001:00.00 -o 3 gp104.rom

NVIDIA Firmware Update Utility (Version 5.278.0)
Modified Version By Joe Dirt

Specified device by PCI Segment# = 0x0000, Bus# = 0x01.
Device:01:00:00=10DE:1B80:3842:6180 GPU
ERROR: IDevice::Create: Unknown/unsupported device type


----------



## r0l4n

Quote:


> Originally Posted by *Clockster*
> 
> For the life of me I can't get GPU-Z to show me my overclocked clocks, it just shows stock.
> 
> So for now I'll have to post here.
> 
> i7 5930K @ 4.6Ghz Gigabyte GTX1080 FE @2101 Boost(+210)/ 5468 Mem (+460)
> 
> 20 357
> http://www.3dmark.com/fs/8613682
> 
> 10861
> http://www.3dmark.com/fs/8613770
> 
> 5798
> http://www.3dmark.com/fs/8613612


Great results. Please post overvoltage, powerlimit, temp. limit and fan speed. Any word on stability?


----------



## Clockster

Quote:


> Originally Posted by *r0l4n*
> 
> Great results. Please post overvoltage, powerlimit, temp. limit and fan speed. Any word on stability?


I'll add it to the original post.
Completely stable, ran 15 benches in a row, no issues whatsoever. No crashes, no artifacts, etc.
I'll test some games a little later on.


----------



## cyenz

Quote:


> Originally Posted by *Clockster*
> 
> I'll add it to the original post.
> Completely stable, ran 15 benches in a row, no issues whatsoever. No crashes, no artifacts ect.
> I'll test some games a little later on.


I'm getting similar results, currently testing +175 on the core, it seems fine. Stock voltage, already getting power limited at times. Custom fan curve where fan % matches temp (ex: 70 degrees = 70% fan). It seems this reference cooler is a bit quieter than the 980 Ti's at the same %.
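A temp-matching curve like that is easy to write down. This is a sketch only; the floor and ceiling values are assumptions, not cyenz's actual Afterburner curve points:

```python
def fan_percent(temp_c, floor=40, ceiling=100):
    """1:1 fan curve as described above (70 C -> 70% fan), clamped to
    an assumed floor and ceiling so the fan never stops or over-speeds."""
    return min(ceiling, max(floor, temp_c))
```

The clamp matters in practice: most blowers have a minimum duty cycle below which they stall, so a raw 1:1 mapping at idle temps would be unusable without the floor.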


----------



## Clockster

Quote:


> Originally Posted by *cyenz*
> 
> Im getting similar results, currently testing +175 on the core, it seems fine. Stock voltage, already getting power limited at times. Custom fan curve where fan % matches temp (ex: 70 degrees = 70% fan). It seems that at the same % this reference cooler is a bit quieter than the 980ti at the same %.


This cooler is definitely quieter than the 980Ti/Titan X reference cooler.
We were actually talking about it yesterday, and a friend of mine who has a 980Ti reference said "That can't be 100% fan speed". lol


----------



## cyenz

Quote:


> Originally Posted by *Clockster*
> 
> This cooler is definitely quieter than the 980Ti/Titan X reference cooler.
> We were actually talking about it yesterday, and a friend of mine who has a 980Ti reference said "That can't be 100% fan speed". lol


That confirms it then. It could be my memory failing me, but it really seems it's quieter, since I could not stand 80% fan on the 980 Ti but I can tolerate 80% on the 1080.


----------



## r0l4n

New run trying to find the limit. These settings are not stable: Fire Strike runs fine but Heaven artifacts.
ov 100%
core +220 (2025-2100mhz)
mem +594 (11188mhz)
pl 120%
tl 92C
fan 100%

Fire Strike (graphics score):
23834 (http://www.3dmark.com/3dm/12223474)

Fire Strike Extreme (graphics score):
11238 (http://www.3dmark.com/fs/8614123)

Fire Strike Ultra (graphics score):
5524 (http://www.3dmark.com/fs/8614153)


----------



## Clockster

Quote:


> Originally Posted by *r0l4n*
> 
> New run trying to find the limit. This settings are not stable, Fire Strike runs fine but Heaven artifacts.
> ov 100%
> core +220 (2025-2100mhz)
> mem +594 (11188mhz)
> pl 120%
> tl 92C
> fan 100%
> 
> Fire Strike (graphics score):
> 23834 (http://www.3dmark.com/3dm/12223474)
> 
> Fire Strike Extreme (graphics score):
> 11238 (http://www.3dmark.com/fs/8614123)
> 
> Fire Strike Ultra (graphics score):
> 5524 (http://www.3dmark.com/fs/8614153)


Drop your memory to +450 and try again.


----------



## r0l4n

Quote:


> Originally Posted by *Clockster*
> 
> Drop your memory to +450 and try again.


Same approx. score, doesn't help stability.

Fire Strike Ultra (graphics score):
5533 (http://www.3dmark.com/fs/8614363)

Seems temperatures have a greater impact than on Maxwell; clocks go up towards 2120 when you start the bench without artifacts, and only when it heats up does it downclock and begin artifacting.


----------



## Clockster

Quote:


> Originally Posted by *r0l4n*
> 
> Same approx. score, doesn't help stability.
> 
> Fire Strike Ultra (graphics score):
> 5533 (http://www.3dmark.com/fs/8614363)
> 
> Seems temperatures have greater impact than on Maxwell, clocks go up towards 2120 when you start the bench, without artifacts, only when it heats up it down locks and begins artifacting.


Keep mem @ +450, drop core down to +210

Every card is different, best to start low and work your way up.


----------



## minisale

My first impressions:

EVGA GTX 1080

- max OC +220 MHz
- max mem OC +500 MHz
- [email protected] = 40°C max with OC
- no coil whine


----------



## TK421

Quote:


> Originally Posted by *minisale*
> 
> My first impressions:
> 
> EVGA GTX 1080
> 
> - max oc +220 Mhz
> - max mem oc +500 Mhz
> - [email protected] = 40°C max with oc
> - no coil whine


>no coil whine

lucky bastard


----------



## Spiriva

My Evga 1080 Founders Edition and EK waterblock will be incoming this week, Time to play with some new hardware


----------



## SweWiking

I got two EVGA "FE" 1080s incoming; the EK blocks will ship on Monday, I hope. It says "processing" under my order, but I know last time I placed an order it said "processing" for a day or two before they shipped, so I guess it's all good.

Anyhow, how's it looking on flashing these bad boys with a custom/modded BIOS? From what I understand GPU-Z and nvflash don't fully support Pascal yet, and today it's impossible to either save, edit or flash your 1080's BIOS?


----------



## TK421

Quote:


> Originally Posted by *SweWiking*
> 
> I got two Evga "FE" 1080 incoming, EK blocks will ship on monday I hope it says "processing" under my order but I know last time i placed an order it said "processing" for a day or two before they shipped my order so I guess its all good.
> 
> Anyhow hows it looking on *flashing these bad boys with a custom/modded bios* ? From what I understand Gpu Z and nvflash doesnt fully support pascal yet and today its impossible to either save/edit or flash your 1080 card ?


not yet


----------



## Spiriva

Btw, one question for you guys who have already replaced the original cooler with a water block: what size were the hex screws under the original backplate? Hopefully no smaller than 3mm?


----------



## westlake

NVIDIA GTX 1080 Founders Edition Owners Complain of Fan Revving Issues

Ehh..


----------



## Cyclops

These cards run a bit hot. You sure FE doesn't stand for.... Fermi Edition?


----------



## cyenz

Quote:


> Originally Posted by *westlake*
> 
> NVIDIA GTX 1080 Founders Edition Owners Complain of Fan Revving Issues
> 
> Ehh..


This only happens to me above a certain temperature and just for a few seconds, then it stabilizes; at idle it has never happened (so far).


----------



## r0l4n

Quote:


> Originally Posted by *westlake*
> 
> NVIDIA GTX 1080 Founders Edition Owners Complain of Fan Revving Issues
> 
> Ehh..


Reported this happening to me earlier in this thread.


----------



## TK421

Quote:


> Originally Posted by *westlake*
> 
> NVIDIA GTX 1080 Founders Edition Owners Complain of Fan Revving Issues
> 
> Ehh..


Happens to me, temps below 51°C on an AIO cooler


----------



## cyenz

Must be something firmware related, since it can't be overridden by custom fan profiles.


----------



## TK421

Quote:


> Originally Posted by *cyenz*
> 
> Must be something FW related since it cant be overrided by custom fan profiles.


yes it can

set fan to 100% lol


----------



## bfedorov11

Quote:


> Originally Posted by *minisale*
> 
> My first impressions:
> 
> EVGA GTX 1080
> 
> - max oc +220 Mhz
> - max mem oc +500 Mhz
> - [email protected] = 40°C max with oc
> - no coil whine


Did you test max OC before using water?


----------



## minisale

Quote:


> Originally Posted by *bfedorov11*
> 
> Did you test max OC before using water?


no.


----------



## Clockster

Quote:


> Originally Posted by *westlake*
> 
> NVIDIA GTX 1080 Founders Edition Owners Complain of Fan Revving Issues
> 
> Ehh..


This only happens on the stock fan curve, adjusting it in MSI afterburner stops this from happening.


----------



## HeliXpc

Got mine running at 2ghz, evga founders edition































980 TI for comparison


----------



## TK421

Quote:


> Originally Posted by *Clockster*
> 
> This only happens on the stock fan curve, adjusting it in MSI afterburner stops this from happening.


no


----------



## VSG

I'll do reference PCB waterblock coverage, and possibly stock/aftermarket cooler coverage as well, depending on how things go. Just waiting for something like that $610 EVGA basic blower version to be released for sale. I also think I know how to do an easy hardmod to unlock more volts for Vcore, but we'll see.


----------



## JoeDirt

https://mega.nz/#!LwwVhLTB!FSHt73kPLJjcCOC77kaAYOqf75GhKzbE08IVBiyK8ik
NVFlash 5.278 x64. Can y'all test to see if this is compatible with extracting the BIOS?

*Never mind. Just got to reading a few pages back. Will have to wait for a new version to land.


----------



## kcuestag

Joining the club, with a pair of EVGA GTX 1080 Founders Editions.










Currently using an EVGA Pro V2 SLI Bridge, has anyone heard any news about the Nvidia HB SLI bridge? A bit weird that no store or website says anything about it.


----------



## r0l4n

Overclocking the 1070:


----------



## nexxusty

Hey boys, my 1080 is ordered; it should be here whenever Amazon.ca gets stock.

Question, I have an H110i GT just sitting there and I can't stand air cooling, at all.

Until I can afford an EK waterblock on top of my $1,152 CDN GTX 1080 purchase (We get gouged like CRAZY in Canada) I'd like to use this AIO.

Anybody know which brackets will fit on the 1080 reference board? NZXT or Corsair? I haven't paid attention as I never thought I'd use one... lol.

Thanks boys!


----------



## Clockster

Quote:


> Originally Posted by *TK421*
> 
> no


No what? lol
I've got mine on a custom fan curve and it's perfect.


----------



## TK421

Quote:


> Originally Posted by *Clockster*
> 
> No what? lol
> I've got mine on a custom fan curve and it's perfect.


mine still kicks up and down


----------



## nexxusty

Quote:


> Originally Posted by *TK421*
> 
> Happen to me, temps below 51c on AIO cooler


Which AIO cooler bracket are you using? Also what GTX 1080 is it on? Evga Founders?

Thx.
Quote:


> Originally Posted by *HeliXpc*
> 
> Got mine running at 2ghz, evga founders edition
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 980 TI for comparison


Very interesting numbers.


----------



## Clockster

Quote:


> Originally Posted by *TK421*
> 
> mine still kick up and down


Ah that sux bud.

I've had a few people asking me to run some heaven benchmarks. I don't have much time right now, so only did one run. Will add more tomorrow.

i7 5930K @ 4.6Ghz Gigabyte GTX1080 FE @2101 Boost(+210)/ 5468 Mem (+460)

1080P


1440P


----------



## DADDYDC650

2GHz 1080 vs a 1400MHz 980 Ti


----------



## r0l4n

Quote:


> Originally Posted by *Clockster*
> 
> No what? lol
> I've got mine on a custom fan curve and it's perfect.


Mine still does it with a custom fan curve. The TechPowerUp article states this is the normal behavior; @Clockster I think you got lucky.


----------



## looniam

Quote:


> Originally Posted by *r0l4n*
> 
> Mine still does it with a custom fan curve. In the article of techpowerup it states this is the normal behavior, @Clockster I think you got lucky.


btw:
Quote:


> Originally Posted by *r0l4n*
> 
> It works. Attached is the vBios for the eVGA GTX 1080 FE.
> 
> gp104.zip 148k .zip file


tried opening it:


----------



## r0l4n

Quote:


> Originally Posted by *looniam*
> 
> btw:
> tried opening it:


I guess we'll have to wait for the Pascal Bios Tweaker.


----------



## Zurv

Quote:


> Originally Posted by *SweWiking*
> 
> I got two Evga "FE" 1080 incoming, EK blocks will ship on monday I hope it says "processing" under my order but I know last time i placed an order it said "processing" for a day or two before they shipped my order so I guess its all good.
> 
> Anyhow hows it looking on flashing these bad boys with a custom/modded bios ? From what I understand Gpu Z and nvflash doesnt fully support pascal yet and today its impossible to either save/edit or flash your 1080 card ?


mine are processing too.. I'm "happy" to see others are like that. I was worried because I ordered 7 blocks. (cards for 2 systems, one 4-way SLI and one 3-way - of course the site to unlock SLI isn't up yet.. ugh)

I'm also looking forward to a modded BIOS. The cards are locked to 180W, but the PCIe slot gives 75W and the 8-pin does 150W (225W total), so there is some headroom without needing a card with more power inputs.
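The headroom arithmetic above, written out (the 75 W slot and 150 W 8-pin figures come straight from the post; this just sums them against the stock limit):

```python
# Power budget for a reference single-8-pin GTX 1080, per the post above
PCIE_SLOT_W = 75    # PCI Express slot power budget
EIGHT_PIN_W = 150   # one 8-pin PCIe auxiliary connector
BIOS_LIMIT_W = 180  # stock FE power limit

total_available = PCIE_SLOT_W + EIGHT_PIN_W   # what the connectors can deliver
headroom = total_available - BIOS_LIMIT_W     # room a modded BIOS could claim
print(total_available, headroom)  # 225 45
```

So a modded BIOS has roughly 45 W to play with before the card would exceed spec on its existing power inputs.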


----------



## TK421

does anyone know if this is a legit 3dmark adv/pro key?

https://www.g2a.com/3dmark-steam-cd-key-global.html


----------



## AdamK47

Bought mine on Friday at ye olde Micro Center.


----------



## JoeDirt

NVFlash 5.287 with cert check bypassed x64:
http://s000.tinyupload.com/index.php?file_id=92479283567177038264

Might not flash an unmodded BIOS.

Naennon is working on updating the editor, so I would suggest holding off on modding until he releases that.

I cannot test this because I have no idea when I will be able to get a 1080. Best of luck, my friends.


----------



## TK421

Quote:


> Originally Posted by *JoeDirt*
> 
> NVFlash 5.287 with cert check bypassed x64:
> http://s000.tinyupload.com/index.php?file_id=92479283567177038264
> 
> Might not flash a unmodded BIOS.
> 
> Naennon is working on updating the editor so I would suggest holding off modding until he releases that.
> 
> I can not test this because I have no idea when I will be able to get a 1080. Best of luck my friends.


If only you lived in NY, I'd let you borrow mine!


----------



## Trys0meM0re

Wrong thread


----------



## shilka

http://www.techpowerup.com/222920/gigabyte-gtx-1080-xtreme-gaming-detailed-some-more

I was looking forward to this card and it would have been the card to get for me.
With orange and no backplate it's a no-go for me, and I think I will go with the Asus Strix instead.

The G1 Gaming is pretty ugly as well, so Gigabyte has really taken a huge step backwards this time around.

Lastly, the damned OCN picture uploader is yet again broken, so I can't upload any pictures.


----------



## KickAssCop

Quote:


> Originally Posted by *DADDYDC650*
> 
> 2Ghz 1080 vs a 1400Mhz 980 Ti


Good video. Thanks for the post


----------



## pharcycle

Anyone played with a Rift (CV1) and one or a pair of these cards? Are they noticeably better for VR than Maxwell?

I have ordered a pair to replace my 3x GTX 780Ti's mainly because one of the best games I've found for the Rift is Project Cars which doesn't support SLI. I'd get the benefits in that and the 8GB VRAM sounds tasty if I take the plunge from 1440p to 4k.

They're not going to be shipped till next week (when I ordered them on Friday 10 mins after launch they were in stock but apparently they weren't) so debating a few options now I've had a couple of days to think about it:

1. Cancel the entire order and wait for the 1080Ti (although of course by then the 1180 may be rumoured etc etc) and right now the Ti are only rumours although given the last few generations it seems fairly likely.
2. Cancel 1x card since in general I'm always disappointed with SLI support and as mentioned earlier the one game I really want these for doesn't support it anyway. I'd still get a general benefit in a lot of games but overall I'll be sacrificing raw power. However I reckon I can sell each 780 Ti for about £200 (or a bit more with their water blocks) so selling these could cover the cost of the 1080.
3. Keep the current order and plan to sell them if/when the 1080Ti drops. I reckon I'll lose £200-£300 from both cards doing this BUT.... I reckon the 780Ti's will lose another £50 in value in that time so I can offset that loss against the upgrade now.

The lure of the shiny sure is strong! I'm leaning to getting the pair of 1080 FE's but this probably really isn't the most sensible thing to do!

Thoughts?
Cheers


----------



## jommy999

I think the card with no backplate was just for the show at Computex. The G1 has a backplate, and I am certain that the Xtreme will as well.


----------



## shilka

Quote:


> Originally Posted by *jommy999*
> 
> I think the card with no backplate was just for the show at Computex. The G1 has a backplate, and I am certain that the Xtreme will as well.


I sure hope so, as I am not buying a card without a backplate.
Right now I am leaning towards the Asus Strix the most.


----------



## jommy999

@pharcycle For VR, I believe the game needs to use VRWorks in order to get the doubled performance (as Nvidia advertised) out of the 1080/1070 Pascal cards; without games supporting Nvidia VRWorks, there won't be any extra benefit over Maxwell. Having said that, I have preordered my Vive and a 1080, and I hope to get both of them in 2-3 weeks' time.


----------



## Spiriva

Wish I'd get off work soon; my 1080 is waiting for me at home. Although the block only shipped from EK today, so I guess it will be a few days before I can get it into the loop. But at least I can install the 1080 in my girlfriend's computer and make sure it works before putting it under water.


----------



## jommy999

Quote:


> Originally Posted by *shilka*
> 
> I sure hope so, as I am not buying a card without a backplate.
> Right now I am leaning towards the Asus Strix the most.


At first I leaned toward the Asus Strix too:

- The Strix is cheaper
- 8+2 phase power
- 1x 6-pin + 1x 8-pin
- Boost clock out of the box is 1936 MHz

For the Gigabyte Xtreme:

- 12+2 phase power
- 2x 8-pin
- Boost clock still N/A, but it will likely be similar to the Strix (I hope it will reach 2 GHz out of the box though)

Both have LED lighting.

But somehow I preordered the Xtreme just because it has more power phases, lol. I hope I made the right choice; at least I am in the queue to get the card early, and if I change my mind I can call the shop and change it. I am thinking I might change to the EVGA 1080 Hybrid, which is £40 more expensive than the Xtreme in the UK.


----------



## shilka

Quote:


> Originally Posted by *jommy999*
> 
> At first I leaned toward the Asus Strix too:
> 
> - The Strix is cheaper
> - 8+2 phase power
> - 1x 6-pin + 1x 8-pin
> - Boost clock out of the box is 1936 MHz
> 
> For the Gigabyte Xtreme:
> 
> - 12+2 phase power
> - 2x 8-pin
> - Boost clock still N/A, but it will likely be similar to the Strix (I hope it will reach 2 GHz out of the box though)
> 
> Both have LED lighting.
> 
> But somehow I preordered the Xtreme just because it has more power phases, lol. I hope I made the right choice; at least I am in the queue to get the card early, and if I change my mind I can call the shop and change it. I am thinking I might change to the EVGA 1080 Hybrid, which is £40 more expensive than the Xtreme in the UK.


Since I am going to upgrade my motherboard, CPU, and RAM first, new video cards are 2-4 months away, as I am not made of money.
By that time there should be reviews of both cards, so I am going to read all the reviews and make a final decision when I actually have the money for two GTX 1080 cards.

Off topic, but has anyone seen the new Rampage V board?
http://www.pcper.com/news/Motherboards/Computex-2016-ASUS-ROG-Rampage-V-Edition-10-Extreme-performance-gaming-motherboard?utm_source=twitterfeed&utm_medium=facebook

I am thinking about getting this board instead of the cheaper Asus X99 Strix.
The M.2 slot is out on the far right, so it's not choked under the video cards.

And all the PCI-E slots are reinforced with metal, instead of just the top slot on the Strix.


----------



## jodasanchezz

I'm disappointed with the 1080s. I sold my 980 Ti AMP Extreme (1500 MHz stable) and my 980 Ti SC+ (1450 MHz stable);
I was hoping for at least 20% more performance plus OC headroom...

And now I read this:

http://videocardz.com/60631/asus-rog-strix-geforce-gtx-1080-offers-poor-overclocking

Is the OC potential so bad? Can no one hit 2100 MHz+? (The LN2 record is about 2400 MHz, which is very poor.)

What do you guys think?
Should we wait for a 1080 Ti?


----------



## Glottis

Quote:


> Originally Posted by *jodasanchezz*
> 
> I'm disappointed with the 1080s. I sold my 980 Ti AMP Extreme (1500 MHz stable) and my 980 Ti SC+ (1450 MHz stable);
> I was hoping for at least 20% more performance plus OC headroom...
> 
> And now I read this:
> 
> http://videocardz.com/60631/asus-rog-strix-geforce-gtx-1080-offers-poor-overclocking
> 
> Is the OC potential so bad? Can no one hit 2100 MHz+? (The LN2 record is about 2400 MHz, which is very poor.)
> 
> What do you guys think?
> Should we wait for a 1080 Ti?


Funny link. It should read "1080 offers poor overclocking"; it's nothing to do with the Strix. You think the Gigabyte G1 Gaming will reach 2500 MHz? Hehe, keep dreaming.


----------



## jommy999

Quote:


> Originally Posted by *shilka*
> 
> I sure hope so, as I am not buying a card without a backplate.
> Right now I am leaning towards the Asus Strix the most.


I just asked on the Gigabyte FB page, and they confirmed that the card has a backplate:

'*GIGABYTE Xtreme Gaming*: If you look closely, there's a soldered-up white bracket on the back because this is an early engineering sample. The card has a backplate; you'll see more pictures tomorrow when Computex starts.'


----------



## jodasanchezz

Quote:


> Originally Posted by *Glottis*
> 
> Funny link. It should read "1080 offers poor overclocking"; it's nothing to do with the Strix. You think the Gigabyte G1 Gaming will reach 2500 MHz? Hehe, keep dreaming.


No, I'm not hoping the G1 will reach 2500 MHz,
but Nvidia showed off 2100 MHz+ at the show on an air-cooled Founders Edition, and nobody can reach 2100+ MHz stable so far... so what is going on here?
I would like to see 2200 MHz on water...


----------



## shilka

Quote:


> Originally Posted by *jommy999*
> 
> I just asked on the Gigabyte FB page, and they confirmed that the card has a backplate:
> 
> '*GIGABYTE Xtreme Gaming*: If you look closely, there's a soldered-up white bracket on the back because this is an early engineering sample. The card has a backplate; you'll see more pictures tomorrow when Computex starts.'


Yes, I saw that on TPU.
More pictures will be up tomorrow.


----------



## smicha

Could you please run OctaneBench on a 1080 and post the results here?

https://render.otoy.com/octanebench/


----------



## shilka

It turns out the new Gigabyte Xtreme Gaming DOES have a backplate, and a damn good-looking one as well.

http://www.techpowerup.com/forums/threads/gigabyte-gtx-1080-xtreme-gaming-detailed-some-more.222920/#post-3465677


Are those stripes red or orange? It's hard for me to tell.


----------



## Shadowdane

Quote:


> Originally Posted by *jodasanchezz*
> 
> No, I'm not hoping the G1 will reach 2500 MHz,
> but Nvidia showed off 2100 MHz+ at the show on an air-cooled Founders Edition, and nobody can reach 2100+ MHz stable so far... so what is going on here?
> I would like to see 2200 MHz on water...


Well, part of the reason they could likely hit 2100 MHz is that they had VSync on; the demo they showed was locked to 60 fps. Since the demo never once dipped below 60 fps, I bet the GPU wasn't being stressed at all.






Also notice how the overlay didn't list GPU usage, power usage, or fan speed. I bet they also had the fan cranked up to 100%!









I've seen the same thing with my GTX 980 Ti: when I force a game to 60 fps, the GPU can run at higher clocks because it isn't maxed out at 99% usage. For example, in GTA V, if I lock to 60 fps my card can hit 1580 MHz at about 60-70% GPU usage. With the fps uncapped at 99% usage, I had to drop clocks by about 100 MHz to keep stability.


----------



## skline00

I'm thinking of replacing two R9 290s in CrossFire with a single GTX 1080. Any thoughts?


----------



## trickeh2k

Quote:


> Originally Posted by *jodasanchezz*
> 
> I'm disappointed with the 1080s. I sold my 980 Ti AMP Extreme (1500 MHz stable) and my 980 Ti SC+ (1450 MHz stable);
> I was hoping for at least 20% more performance plus OC headroom...
> 
> And now I read this:
> 
> http://videocardz.com/60631/asus-rog-strix-geforce-gtx-1080-offers-poor-overclocking
> 
> Is the OC potential so bad? Can no one hit 2100 MHz+? (The LN2 record is about 2400 MHz, which is very poor.)
> 
> What do you guys think?
> Should we wait for a 1080 Ti?


I wouldn't draw too many conclusions from one test of one card. It's still down to the silicon lottery, and the chip is brand new, which means both drivers and BIOSes need to mature. The review doesn't mention what driver version was used, either. I think we need to hold off for one or two driver versions, and maybe one or two BIOS revisions, before we can draw any conclusions. I wouldn't worry too much at this stage.


----------



## superkyle1721

Hey guys, I need a favor from you 1080 SLI owners. I'm considering selling my Z170 rig for the new X99 chips for the additional PCIe lanes. I'm currently running the Maximus VIII Hero, and when run in SLI the cards drop from x16 to x8, as many are aware. I'm waiting it out for the 1080 Ti. I'm curious if someone with X99 can do a Fire Strike run at x16 SLI and repeat the run with both cards at x8. I want to know whether the new cards take a harder hit at x8 than the 980 Ti does. If it's substantial, it will help support my decision to make the jump to X99. Thanks.

Sent from my iPhone using Tapatalk
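Once both runs exist, the x16 vs. x8 comparison is just a percentage delta between the two graphics scores. A quick sketch (the scores in the example are invented, not real results):

```python
def pcie_scaling_hit(score_x16, score_x8):
    """Percentage of graphics-score performance lost going from x16 to x8.

    A positive result means x8 was slower; near zero means the link
    width made no measurable difference.
    """
    return (score_x16 - score_x8) / score_x16 * 100.0


# Hypothetical numbers only, to show the arithmetic:
print(pcie_scaling_hit(10000, 9700))  # -> 3.0 (a 3% hit)
```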


----------



## minisale

Quote:


> Originally Posted by *smicha*
> 
> Could you please run Octane Bench on 1080 and post results here?
> 
> https://render.otoy.com/octanebench/


"no supported GPU found"


----------



## smicha

Quote:


> Originally Posted by *minisale*
> 
> "no supported GPU found"


Could you run the demo?
https://home.otoy.com/render/octane-render/demo/

and the demo scene from the benchmark?


----------



## Spiriva

Got the card; now I'm just waiting for the waterblock and the backplate. EK shipped the waterblock today, but the backplate will apparently ship tomorrow.
I hope that by Thursday I will be up and running with everything under water, with my old Titan X going to a friend as a gift.


----------



## Bigm

Just paid my step-up fee; hope to be in the club by the end of the week.


----------



## Scrimstar

Of the cards releasing soon, which will have extra PSU pins, which will be clocked the highest, and which will have the best air cooling? I am deciding between the Asus Strix, EVGA Classified, and Gigabyte Xtreme Gaming.


----------



## Naennon

Quote:


> Originally Posted by *skline00*
> 
> I'm thinking of replacing two R9 290s in CrossFire with a single GTX 1080. Any thoughts?


the best you can EVER do


----------



## 5150 Joker

Quote:


> Originally Posted by *Clockster*
> 
> Ah, that sux bud.
> 
> I've had a few people asking me to run some Heaven benchmarks. I don't have much time right now, so I only did one run. Will add more tomorrow.
> 
> i7 5930K @ 4.6 GHz, Gigabyte GTX 1080 FE @ 2101 boost (+210) / 5468 mem (+460)
> 
> 1080p
> 
> 
> 1440p


Nice scores, is that the max your card gets? It's about 9% faster than one of my Titan X's at 1480 MHz (my 24/7 setting) in the same benchmark at 1440p.


----------



## trickeh2k

Quote:


> Originally Posted by *Scrimstar*
> 
> Of the cards releasing soon, which will have extra PSU pins, which will be clocked the highest, and which will have the best air cooling? I am deciding between the Asus Strix, EVGA Classified, and Gigabyte Xtreme Gaming.


The Classy will definitely be far better when it comes to power delivery, with its 14+3 power phases. It also features a triple vBIOS and EVGA's super awesome warranty and support. Buuuut, there's no telling really until the cards start arriving with end consumers and we get some initial testing and benching.


----------



## Silent Scone

Quote:


> Originally Posted by *Spiriva*
> 
> 
> 
> Got the card; now I'm just waiting for the waterblock and the backplate. EK shipped the waterblock today, but the backplate will apparently ship tomorrow.
> I hope that by Thursday I will be up and running with everything under water, with my old Titan X going to a friend as a gift.


Don't get complacent; remember to check the card on air first. I'm not sure I'll be going water with these personally.


----------



## Naked Snake

What's up guys. I'm from Argentina, and there is a store where I can get an EVGA Founders Edition for $1200 USD. They only have one, and they told me that custom AIB cards are going to be priced at $1500 USD. It doesn't make any sense, because custom cards should be cheaper, but whatever; my country is really the worst.

Anyway, I want to know if you guys have fixed the problem with the fan going up to 3000 RPM, and what temps I should expect with the fan at 100%.

My country is really hot. The last reference card I had was a 780 Ti, which ran at 88 °C with the fan at 100%, so I could not OC at all. My case has good airflow, but we usually have an ambient temperature of 42 °C, so yeah...

Thanks for any input; it will help me decide whether to grab the Founders right now or wait and pay more for the custom cards.


----------



## kcuestag

Quote:


> Originally Posted by *Naked Snake*
> 
> What's up guys. I'm from Argentina, and there is a store where I can get an EVGA Founders Edition for $1200 USD. They only have one, and they told me that custom AIB cards are going to be priced at $1500 USD. It doesn't make any sense, because custom cards should be cheaper, but whatever; my country is really the worst.
> 
> Anyway, I want to know if you guys have fixed the problem with the fan going up to 3000 RPM, and what temps I should expect with the fan at 100%.
> 
> My country is really hot. The last reference card I had was a 780 Ti, which ran at 88 °C with the fan at 100%, so I could not OC at all. My case has good airflow, but we usually have an ambient temperature of 42 °C, so yeah...
> 
> Thanks for any input; it will help me decide whether to grab the Founders right now or wait and pay more for the custom cards.


That problem is easily solved with software like MSI Afterburner, by simply setting a manual fan curve profile.

I use a 1 °C = 1% profile after 50 °C and it's all good.
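Afterburner applies the curve itself, but one reading of that 1 °C = 1% mapping can be sketched in a few lines (the 30% idle floor below 50 °C is an assumption; use whatever minimum your card idles at):

```python
def fan_percent(temp_c):
    """Sketch of a 1 degC = 1% fan curve starting at 50 degC:
    below 50 degC hold an idle floor, above it match the fan
    percentage to the temperature, capped at 100%."""
    IDLE_FLOOR = 30  # assumed idle speed; tune to taste
    if temp_c <= 50:
        return IDLE_FLOOR
    return min(100, int(temp_c))
```

So at 60 °C the fan sits at 60%, and the curve pins at 100% for anything hotter than 100 °C (which you should never see anyway).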


----------



## Benjiw

Quote:


> Originally Posted by *traxtech*
> 
> They use the same PCB; the only difference would be temps/noise. If you're putting it under water it doesn't matter either way; if on air, definitely the ACX 3.0.
> 
> My Founders Edition EVGA 1080 will be here in a few days, with an EK block/backplate on the way too. Can't wait!


They do? Great information. Thanks for this, but also no thanks for this, lol; my urge to get 2x 1080s and EK blocks has just grown tenfold.


----------



## Bogga

When are you guys getting your custom versions? The 21st of June is when I'll receive my two Strix... feels like a lifetime away.


----------



## Spiriva

Quote:


> Originally Posted by *Silent Scone*
> 
> Don't get complacent; remember to check the card on air first. I'm not sure I'll be going water with these personally.


Hehe, absolutely! Gonna hook it up to the girlfriend's computer tomorrow and check it out, run a few benchmarks, etc., just to make sure the card is okay.


----------



## bfedorov11

Quote:


> Originally Posted by *pharcycle*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Has anyone played with a Rift (CV1) and one or a pair of these cards? Are they noticeably better for VR than Maxwell?
> 
> I have ordered a pair to replace my 3x GTX 780 Tis, mainly because one of the best games I've found for the Rift is Project CARS, which doesn't support SLI. I'd get the benefit in that game, and the 8GB of VRAM sounds tasty if I take the plunge from 1440p to 4K.
> 
> They're not going to be shipped till next week (when I ordered them on Friday, 10 minutes after launch, they showed as in stock, but apparently they weren't), so I'm debating a few options now that I've had a couple of days to think about it:
> 
> 1. Cancel the entire order and wait for the 1080 Ti (although of course by then the 1180 may be rumoured, etc.). Right now the Ti is only a rumour, although given the last few generations it seems fairly likely.
> 2. Cancel one card, since in general I'm always disappointed with SLI support, and as mentioned earlier the one game I really want these for doesn't support it anyway. I'd still get a general benefit in a lot of games, but overall I'd be sacrificing raw power. However, I reckon I can sell each 780 Ti for about £200 (or a bit more with their water blocks), so selling these could cover the cost of the 1080.
> 3. Keep the current order and plan to sell them if/when the 1080 Ti drops. I reckon I'll lose £200-£300 on both cards doing this, BUT... I reckon the 780 Tis will lose another £50 in value in that time, so I can offset that loss against upgrading now.
> 
> The lure of the shiny sure is strong! I'm leaning towards getting the pair of 1080 FEs, but this probably really isn't the most sensible thing to do!
> 
> Thoughts?
> Cheers


Vive owner here. Here is a good breakdown of the Pascal changes and how they benefit VR.

Just get one card unless you are playing games on a regular monitor. VR SLI is just like SLI: support will be game dependent, just like SLI profiles today. The same goes for the other VR features NV showed, such as single-pass stereo. Any game with a regular SLI profile will not work in VR due to the increased frame times caused by SLI. I haven't even heard of a single game that is confirmed to support VR SLI; we will be lucky to see one game with it this year.

Regular SLI support is the worst it has ever been. I dumped my two TXs that did 1500/8000+ on water for a single 1080 for these exact reasons. I'll buy a second 1080 when I actually see SLI/VR SLI support in games at release. It most likely won't get better until DX12/Vulkan is mainstream, since they handle multi-card setups differently.


----------



## bfedorov11

Quote:


> Originally Posted by *Bogga*
> 
> When are you guys getting your custom versions? 21st of June is the day when I'll receive my two strix... feels like a lifetime


Where can you order them from? I got an EVGA from Newegg. That is a long wait. And I was upset when I forgot today is a holiday... no shipping.


----------



## Bogga

Quote:


> Originally Posted by *bfedorov11*
> 
> Where can you order them from? I got an evga from newegg. That is a long wait. And I was upset I forgot today is a holiday.. no shipping.


Various places here in *Sweden*...


----------



## Asmola

Quote:


> Originally Posted by *Bogga*
> 
> When are you guys getting your custom versions? 21st of June is the day when I'll receive my two strix... feels like a lifetime


Asus Nordic said that retailers will get Strix cards on 7-9 June, so not long to wait.


----------



## fat4l

8 or 20 phases aren't going to help you with OC. We saw the same thing with the 980 Ti, where some cards had 10+ phases and couldn't OC any better than ones with 6 phases...
It mostly comes down to the chip itself...


----------



## Clockster

Quote:


> Originally Posted by *5150 Joker*
> 
> Nice scores, is that the max your card gets? It's about 9% faster than one of my Titan X's at 1480 MHz (my 24/7 setting) in the same benchmark at 1440p.


I reckon there is a little bit more left in the tank; I've already beaten those scores, I just haven't had a chance to upload them.
I reckon the card will be out of puff around 2120 core boost. I'm fine with that though; I'll be selling it in the next week or so, once my Xtreme Gaming or FTW card arrives.


----------



## r0l4n

Another go at Fire Strike.

Fire Strike (graphics score):
24949 (http://www.3dmark.com/fs/8632279)

Fire Strike Extreme (graphics score):
11804 (http://www.3dmark.com/fs/8632244)

Fire Strike Ultra (graphics score):
5775 (http://www.3dmark.com/fs/8632088)

Overvoltage: 50%
Core: +212
Mem: +585
Power limit: 120%
Temp limit: 92 °C
Fan: 100%


----------



## Bogga

Quote:


> Originally Posted by *Asmola*
> 
> Asus Nordic said that retailers will get Strix cards at 7-9.6, so not long time to wait.


Hmm, the place I ordered from has the 18th as their date...

They sold 214 Strix in 3 days...

That's 1 of 11 versions available at that store...

And that store is maybe 1 out of 10 places where people usually order from. Guess we've bought loads of these cards here in Sweden.

(Keep in mind that we're only ~9 million over here.)


----------



## SweWiking

Quote:


> Originally Posted by *fat4l*
> 
> 8 or 20 phases aren't going to help you with OC. We saw the same thing with the 980 Ti, where some cards had 10+ phases and couldn't OC any better than ones with 6 phases...
> It mostly comes down to the chip itself...


100% agree!


----------



## looniam

announcement left here for no reason:

Phanteks Glacier G1080 Water Block


----------



## Gib007

I've just pre-ordered my *EVGA GeForce GTX 1080 Classified* from *Overclockers.co.uk*. Looking forward to receiving it once it actually comes out!


----------



## HeliXpc

In my experience the reference cards have always overclocked better; this goes back to the GTX 680, 780, 780 Ti, 980, and 980 Ti. My reference card is nice at 2.05 GHz.


----------



## Jpmboy

lol,

@clockster just took 1st place in the Official OCN Heaven 4.0 Benchmark for Single Card.









http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/0_20


----------



## Vellinious

Quote:


> Originally Posted by *HeliXpc*
> 
> In my experience the reference cards have always overclocked better; this goes back to the GTX 680, 780, 780 Ti, 980, and 980 Ti. My reference card is nice at 2.05 GHz.


The Matrix, Extreme OC, KPE and HOF cards would all disagree with that...and so would I.


----------



## Jpmboy

Quote:


> Originally Posted by *Bogga*
> 
> Hmm, the place I ordered from has the 18th as their date...
> 
> They sold 214 Strix in 3 days...
> 
> That's 1 of 11 versions available at that store...
> 
> And that store is maybe 1 out of 10 places where people usually order from. Guess we've bought loads of these cards here in Sweden.
> 
> (Keep in mind that we're only ~9 million over here.)


Seems you Swedish guys like to send your money to vendors with the promise they will ship product... sometime in the future.


----------



## fitzy-775

I've been looking to upgrade my GPU for a while now, and I want to be able to play GTA V and The Witcher 3 at max settings at 1440p. Would the GTX 1080 be a good card to upgrade to from a GTX 780, or should I wait for the Ti version?


----------



## bfedorov11

Quote:


> Originally Posted by *fitzy-775*
> 
> I've been looking to upgrade my GPU for a while now, and I want to be able to play GTA V and The Witcher 3 at max settings at 1440p. Would the GTX 1080 be a good card to upgrade to from a GTX 780, or should I wait for the Ti version?


I would say it is probably around 80-100 fps on average for the latest games, roughly maxed out, with an overclocked 6700K. You might have to drop a few settings if you want to stay around 100. The drivers are also new, so it will only get better with time. I'll be using a single card for 4K 60 Hz with G-Sync.


----------



## Vellinious

I'm seeing a tiff here and there about the voltage limits, namely that they're at around 1.25 V. Anyone have anything that shows different?


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Vellinious*
> 
> I'm seeing a tiff here and there about the voltage limits, namely that they're at around 1.25 V. Anyone have anything that shows different?


The voltage is limited on pretty much every other Nvidia card, isn't it? That it's only 1.25 V is likely because of the smaller 16 nm process.

A modded BIOS might get it up a little, but there'll probably still be a hard ceiling like on the (normal) 980 Ti/TX.


----------



## Vellinious

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> The voltage is limited on pretty much every other Nvidia card, isn't it? That it's only 1.25 V is likely because of the smaller 16 nm process.
> 
> A modded BIOS might get it up a little, but there'll probably still be a hard ceiling like on the (normal) 980 Ti/TX.


Most Maxwell cards were limited to around 1.312 V, with obvious exceptions. It was just noted that even with add-on boards, they have yet to find a way around that restriction. I was just curious whether anyone here had seen or heard anything different.


----------



## Benjiw

Quote:


> Originally Posted by *Vellinious*
> 
> Most Maxwell cards were limited to around 1.312 V, with obvious exceptions. It was just noted that even with add-on boards, they have yet to find a way around that restriction. I was just curious whether anyone here had seen or heard anything different.


I too will be very interested in this; my golden 970 needs some volt mods, once I understand what I'm doing, to compare against some... volt-modded 1080s?


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Vellinious*
> 
> Most Maxwell cards were limited to around 1.312 V, with obvious exceptions. It was just noted that even with add-on boards, they have yet to find a way around that restriction. I was just curious whether anyone here had seen or heard anything different.


1.274 V on a TX without a hard mod - is the 1.312 V after the hard mod? Never tried that. I thought the EVGA EVBot (or something, I might have the wrong brand) could bypass the entire voltage-regulation part of the card and run as much voltage as you could pour into it?


----------



## greg1184

Who else is in the queue for a Step-Up from EVGA? I got a 980 Ti a few weeks ago and put myself in the queue for the ACX 3.0. Might as well get back what I paid for the 980 Ti before it depreciates.


----------



## gree

Besides EVGA and MSI, are any other partners selling a blower-style 1080 (non-FE, though)?


----------



## bfedorov11

Yeah, the TX limit was around 1.27 V with a BIOS unlock. You could hard-mod it with a trim pot; I did it on my two TXs, but it was completely pointless. Maxwell needs cold, not voltage. I figured that would be the case, but did it more for fun.

If you go back a day or so in the 1080 review thread (I believe it was der8auer), he couldn't get the card to boot when giving it more than 1.25 V with a power board attached during LN2 runs. It sounds like the chip just doesn't work with more voltage, or there is another hardware lock somewhere. I believe he needed a custom BIOS from Nvidia to unlock 1.25 V, so that will probably be the limit. He couldn't break 2500 MHz with LN2. This is all hearsay; I'm sure we'll find out this week... hopefully we'll see some more AIB cards.


----------



## Lays

Quote:


> Originally Posted by *bfedorov11*
> 
> Yeah, the TX limit was around 1.27 V with a BIOS unlock. You could hard-mod it with a trim pot; I did it on my two TXs, but it was completely pointless. Maxwell needs cold, not voltage. I figured that would be the case, but did it more for fun.
> 
> If you go back a day or so in the 1080 review thread (I believe it was der8auer), he couldn't get the card to boot when giving it more than 1.25 V with a power board attached during LN2 runs. It sounds like the chip just doesn't work with more voltage, or there is another hardware lock somewhere. I believe he needed a custom BIOS from Nvidia to unlock 1.25 V, so that will probably be the limit. He couldn't break 2500 MHz with LN2. This is all hearsay.


The custom BIOS didn't unlock the 1.25 V, he said in another thread. The people I've talked to over on HWBOT (der8auer, strong island, and a few other guys) have said that despite trying EPOWER boards, the XOC BIOS, and some other stuff, it has problems past 1.25 V.


----------



## TK421

Need the unlocked power limits.

200 W is just killing the card's potential.

Guess the waiting game is extremely hard this time.


----------



## Bogga

Quote:


> Originally Posted by *Jpmboy*
> 
> Seems you Swedish guys like to send your money to vendors with the promise they will ship product... sometime in the future.


Well... if we don't, we'll have to wait for the second shipment. Remember the 6700K shortage last fall?


----------



## Silent Scone

Quote:


> Originally Posted by *Bogga*
> 
> Well... if we don't, we'll have to wait for the second shipment. Remember the 6700K shortage last fall?


That was different; someone made a booboo.


----------



## Bogga

Quote:


> Originally Posted by *Silent Scone*
> 
> That was different, someone made a booboo.


Anywhoo... I had money tucked away for these cards, so whether it's sitting in my PayPal account for another 4-6 weeks or in the hands of the store doesn't matter to me. What matters is how fast I can get these cards into my case.


----------



## Spiriva

Just tried the card in my girlfriend's computer, and it seems to work just fine. It boosted up to ~1920 MHz on its own, without me touching anything in MSI Afterburner or such. DHL managed to get the waterblock from Slovenia to Sweden in just one day; however, I'm still waiting for the backplate from EK to ship. I hope they get around to doing it today!

Size comparison: the 1080 and her R9 Fury Strix:










I ran the Valley benchmark (Extreme preset) with her GPU clocked at 1050 MHz vs. the 1080 at stock (GPU Boost 3.0 did the work), and the score difference was about 1800.

I really want to get this thing under water now and start playing with it in my machine.


----------



## PasK1234Xw

I will have my 1080s today.


----------



## Clockster

Quote:


> Originally Posted by *Jpmboy*
> 
> lol,
> 
> @clockster just took 1st place in the Official OCN Heaven 4.0 Benchmark for Single Card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/0_20


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Clockster*


Thanks for posting all the benches with your GTX 1080.


----------



## Silent Scone

Quote:


> Originally Posted by *Bogga*
> 
> Anywhoo... I had money tucked away for these cards, so whether it's sitting in my PayPal account for another 4-6 weeks or in the hands of the store doesn't matter to me. What matters is how fast I can get these cards into my case.


I can't really talk; I'm not entirely sure when I'll be allocated one! Hopefully this week.


----------



## OverK1LL

Finally! Can't wait to get these fired up


----------



## keesgelder

After a postage mix-up, I finally got my 1080s yesterday. I will upload some 3440x1440 SLI benchmarks when I get home, comparing them to my 780 Tis, for those of you who are interested. I've done some brief runs so far, but I'm not too enthusiastic yet. It seems the Heaven benchmark went from around 60 fps (slightly overclocked 780 Tis) to around 90 fps (stock 1080s) at 3440x1440, which is a bit less than I might have hoped for.

I'm using this SLI bridge, btw. Not exactly sure about the scaling so far, though GPU usage seems to be just fine.


----------



## kcuestag

Here are a couple of benches I did the other day when I received my SLI setup:





This is using an EVGA Pro V2 SLI bridge. It will be interesting to see whether the new HB SLI bridges offer better performance.


----------



## trickeh2k

Quote:


> Originally Posted by *Clockster*


Standard air cooling?


----------



## Clockster

Quote:


> Originally Posted by *trickeh2k*
> 
> Standard air cooling?


Yip, standard Gigabyte GTX 1080 FE, fan set to 100% for benchmarking.


----------



## trickeh2k

Quote:


> Originally Posted by *Clockster*
> 
> Yip standard Gigabyte GTX1080 FE, Fan set to 100% for benchmarking.


Alright, coolio







Is 2.2GHz also game stable? What were the temps after one run?


----------



## Clockster

Quote:


> Originally Posted by *trickeh2k*
> 
> Alright, coolio
> 
> 
> 
> 
> 
> 
> 
> Is 2.2GHz also game stable? What were the temps after one run?


This is about as far as my card wants to go; I don't have much time to check if there is a bit more left in it.
I can pump up the memory a lot, but it doesn't change the score at all, or the score drops.

This run max temps were 63C @ 100% fan speed.

The card is game stable @ +210 core / +460 mem. Played an hour of Overwatch and the highest temp recorded was 68C at max stable overclock.


----------



## trickeh2k

Quote:


> Originally Posted by *Clockster*
> 
> This is about as far as my card wants to go, I don't have much time to check if there is a bit more left in it.
> I can pump up the memory alot but it doesn't change the score at all or score drops.
> 
> This run max temps were 63c @100% fanspeed
> 
> The card is game stable @+210 core/ +460mem. Played an hour of Overwatch and highest temps recorded was 68c at max stable overclock.


Alright, thx for that







Were the fans running at 100% when gaming as well?


----------



## Jpmboy

Quote:


> Originally Posted by *Vellinious*
> 
> Most of the Maxwells were limited at like 1.312v, with obvious exceptions. It was just noted that even with add on boards, they have yet to find a way around that restriction. Was just curious if anyone here had seen / heard anything different.


True. Maxwell does not scale with voltage like some previous generations. Getting an actual 1.312V delivered without a hard-mod is not possible due to the hard-coded vdroop. The best one can do through the bios is 1.264V, +/- a few mV depending on the ASIC.
Quote:


> Originally Posted by *Lays*
> 
> The custom bios didn't unlock the 1.25v he said on another thread. The people I've talked to over on HWBOT (der8auer, strong island and a few other guys) have said despite trying e-powers, XOC bios and some other stuff it has problems past 1.25v.


Actually, the TitanX bios can be modded to request 1.274V (that's the max, even if the bios has 1.312V set). A bios setting of 1.274V will run in the 1.26 range under load when measured from the back of the PCB with a DMM (~10mV droop).
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Thanks for posting all the benches with your GTX 1080.


^^ This. Really helps with folks' decision processes, and I find our results more "reliable" than many review sites.







Quote:


> Originally Posted by *GnarlyCharlie*
> 
> 1.274v on a TX without hard mod - is the 1.312v after the hard-mod? Never tried that. I thought the EVGA bot board (or something, I might have the wrong brand) could bypass the entire VR part of the card and run as much voltage as you could pour to it?


Yes! With an EVBOT you can ramp up the voltage on a 980 Ti Kingpin... it can help some cards, but not all. Generally Maxwell does not scale with high voltage.

I've been running my TXs at 1.274V (cyclops3 bios, measured in the 1.26 range depending on the load) since near launch. Still going strong. I certainly tried lower voltages while making the cyclops3 bios mod of cyclops' original bios, and with my two cards it was the best compromise between one 64% and one 76% card.


----------



## mouacyk

Quote:


> Originally Posted by *fitzy-775*
> 
> I've been looking to upgrade my GPU for a while now, and I want to be able to play GTA5 and The Witcher 3 at max settings at 1440p. Would the GTX 1080 be a good card to upgrade to from a GTX 780, or should I wait for the Ti version?


GTA5 is less demanding than Witcher 3, unless heavily modded of course. A 1080 at stock is about as fast as a maxed-out 980 Ti. An OC'ed 2.1GHz 1080 can be up to 1.2x faster (generous) than a maxed-out 980 Ti.

At 1440p with maxed details and HairWorks, I get avg 50fps on my 1.5GHz 980 Ti (looking over Novigrad from a hill). If you don't plan to OC a 1080, expect the same performance.
If you OC the 1080 to 2.1GHz, that's 50fps * 1.2 = 60fps. In either case, it's quite playable since the avg is >45fps.
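The back-of-envelope projection above can be sketched in a couple of lines; note that the 1.2x factor is the poster's own (generous) estimate, not a measured result:

```python
# Rough fps projection: scale a measured baseline by an estimated
# speedup factor. The default 1.2x is an assumption from the post
# above (2.1GHz 1080 vs maxed-out 980 Ti), not benchmark data.
def projected_fps(baseline_fps: float, scaling: float = 1.2) -> float:
    """Scale a measured baseline fps by an estimated speedup factor."""
    return baseline_fps * scaling

print(projected_fps(50))  # 980 Ti baseline of 50 fps -> ~60 fps projected
```

Swapping in your own baseline and a less generous factor gives a quick sanity check before buying.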


----------



## Clockster

Quote:


> Originally Posted by *trickeh2k*
> 
> Alright, thx for that
> 
> 
> 
> 
> 
> 
> 
> Where the fans running at 100% when gaming as well?


Yeah, I was running my overclocked profile; the fans automatically ramp up to 100% when it's selected.
I doubt the card would be able to maintain those clocks if the fans weren't maxed. I think that's where the appeal of the AIB partner cards will come into play: better temps while offering similar overclocking performance. I reckon only cards like the Classified/KPE/Lightning etc. will be able to reach 2200... well, hopefully reach it.


----------



## Shadowdane

Quote:


> Originally Posted by *kcuestag*
> 
> Here's a couple of benches I did the other day when I recieved my SLI:
> 
> 
> 
> 
> 
> This is using an EVGA Pro V2 SLI bridge. It will be interesting to see if the new HB SLI bridges offer better performance.


Nice!! Yeah, I'm wondering if the HB bridge will offer any improvements. It wouldn't surprise me, honestly; the old SLI bridges haven't changed for years now.

On another note, your FireStrike score seems a little low, but that might be down to my CPU overclock if yours is running stock.
FireStrike 2x 980 Ti with i7-6700K - *24,505*: http://www.3dmark.com/fs/7881203


----------



## kcuestag

Quote:


> Originally Posted by *Shadowdane*
> 
> Nice!! Yah I'm wondering if the HB bridge will offer any improvements. It wouldn't surprise me honestly the old SLI bridges haven't changed for years now.
> 
> On another note your FireStrike score seems a little low, but might be my CPU overclock though if your CPU is running stock.
> FireStike 2x980Ti with i7-6700K - *24,505*: http://www.3dmark.com/fs/7881203


Yes, my CPU was at stock while running that benchmark.

That said, was that result with OC'd 980 Ti's or stock? I always prefer to compare stock vs stock.


----------



## kcuestag

Anyhow, don't think I posted a picture yet? Here they are, installed:


----------



## trickeh2k

Quote:


> Originally Posted by *Clockster*
> 
> Yeah was running my overclocked profile, fans automatically ramp up to 100% when it's selected.
> I doubt the card would be able to maintain those clocks if the fans weren't maxed. I think that's where the appeal of the aib partner cards will come into play. Better temps while offering similar overclocking performance. I reckon only cards like the classified/KPE/Lightning ect will be able to reach 2200...well hopefully reach it.


I'm looking to replace my current 780 Classy, which has performed remarkably well under water and served me well for almost two years, but it's defo time to move on now. The early reports of the cards don't look promising when it comes to overclocking, though; I was hoping it would be more in line with my card... However, drivers and vbioses most likely need to mature a bit before we can draw any real conclusions about the Pascal chips' OC possibilities.


----------



## GRABibus

Quote:


> Originally Posted by *kcuestag*
> 
> Anyhow, don't think I posted a picture yet? Here they are, installed:


Great and nice machine









I think I will keep my TITAN X until they release a Pascal card with at least 12GB RAM.


----------



## Jpmboy

Quote:


> Originally Posted by *kcuestag*
> 
> Anyhow, don't think I posted a picture yet? Here they are, installed:


looks good!


----------



## N17 dizzi

Quote:


> Originally Posted by *GRABibus*
> 
> Great and nice machine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think I will keep my TITAN X until they release a Pascal card with at least 12GB RAM.


As a former SLI TX owner myself, in what game have you seen memory usage above 8GB?

In other words, why do you need 12GB?


----------



## GiveMeHope

Quote:


> Originally Posted by *gree*
> 
> besides Evga and Msi are any other partners selling a blower style 1080 (non FE tho)


KFA2 as well.


----------



## GRABibus

Quote:


> Originally Posted by *N17 dizzi*
> 
> As a former SLI TX owner myself, in what game have you seen memory usage above 8GB?
> 
> In other words, why do you need 12GB?


What I wanted to say, in fact, is that for me it's not worth spending 800€ on a GTX 1080 versus the Titan X that I bought only five months ago...
I play at 1440p (Battlefront, Black Ops 3, DOOM) and the Titan X will stay a monster for many months.

The performance difference between the 1080 and Titan X doesn't justify buying a 1080 at the moment for me.
I will wait for the next Pascal releases and check what they propose.

But maybe we will need more VRAM in the near future: new games? 4K? More than 4K? I don't know...
If you're talking about running games, of course 12GB is not necessary... but I'm not gonna cut my Titan X in half lol


----------



## TK421

What memory clock are y'all hitting? I'm now stable at +600


----------



## Trys0meM0re

I'm around +580; above that it works for a minute or so and then artifacts, maybe temp related, I dunno (fan @ 100% though)


----------



## Lays

Quote:


> Originally Posted by *Jpmboy*
> 
> True. Maxwell does not scale with voltage like some previous generations. Getting an actual 1.312V delivered without a hard-mod is not possible due to the hard-coded vdroop. Best one can do thru bios is 1.264V+/- a few depending on the ASIC.
> Acually, TitanX bios can be modded to request 1.274V (that's the max, even if the bios has 1.312V set). A bios setting of 1.274V will run in the 1.26 range under load when measured from the back on the PCB with a DMM (~ 10mV droop).
> ^^ This. Really helps with folks decision processes, and I find our results more "reliable" than many review sites.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes! With an EVBOT you can ramp up the voltage on a 980Ti kingpin... can help some cards, but not all. Generally Maxwell does not scale with high voltage.
> 
> I've been running my TXs at 1.274V (cyclops3 bios, measured in the 1.26 range depending on the load)) since near launch. Still going strong. I certainly tried lower voltages while making the cyclop3 bios mod of cyclops original bios and with my two cards it was the best compromise with one 64% and one 76% card.


I was talking about the 1080, not the Titan X.


----------



## Trys0meM0re

Do you guys think a waterblock will be beneficial overclocking-wise? Thinking about getting one, but I can't justify it on aesthetics and sound alone.


----------



## Zurv

EK blocks are here.. now if some cards would show up. I'm kicking myself for picking ground shipping from Newegg. I did pick overnight for the cards from Amazon.. but.. *sigh* those 4 cards changed from "come today by 8pm" to.. "got me. I don't know when these cards are going to ship"

The 4 Newegg cards come on Thursday. I DID pick overnight for the 6950X (whenever that ships)


----------



## keesgelder

Here are some of my benchmark results for the 1080 SLI. All benchmarks except the 3DMark ones were run at 3440x1440 using the PC in my signature. As for game settings, only presets were used, no manual maxing of settings. The 1080's were run at stock settings (even the fan profile), whereas the 780 Ti's were slightly overclocked. All reference models on air. All benchmarks were performed using this SLI bridge.





On AVERAGE, the 1080 SLI setup was 1.57 times faster than the 780 Ti setup. Especially for the Unigine benchmarks I find the results relatively disappointing; given the reviews I've seen, I expected closer to 2x the performance of the 780 Ti's. I haven't performed any tests without SLI yet (except the Tomb Raider DX12 test, obviously), so I can't really say anything reasonable regarding scaling yet.
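For anyone curious how an average speedup like that falls out, here is a quick sketch. Only the Heaven pair (60fps on the 780 Ti's, 90fps on the 1080's, from the earlier post) is real data; the helper itself is generic and you would feed it the rest of the per-benchmark pairs:

```python
# Average speedup across benchmarks, computed as the mean of the
# per-benchmark (new_fps / old_fps) ratios. Only the Heaven pair below
# comes from the thread; other pairs would be supplied by the reader.
def average_speedup(pairs):
    """pairs: iterable of (old_fps, new_fps) tuples."""
    ratios = [new / old for old, new in pairs]
    return sum(ratios) / len(ratios)

heaven_only = [(60, 90)]
print(average_speedup(heaven_only))  # 1.5x for the Heaven result alone
```

A 1.5x Heaven result pulling the mean below the overall 1.57x is consistent with Unigine being the weak spot in the set.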


----------



## Shadowdane

Quote:


> Originally Posted by *kcuestag*
> 
> Yes, my CPU was at stock while running that benchmark.
> 
> That s aid, was that result with OC'd 980Ti's or stock? I always prefer to compare stock vs stock.


Yeah, that was an overclocked benchmark.. I don't have my cards anymore, otherwise I'd run them through a new benchmark.

i7-6700K @ 4.6Ghz
980Ti SLI @ ~1460Mhz


----------



## Shadowdane

Just a FYI!

*EVGA 1080 FTW* is up for pre-order on Amazon!









I just ordered 2, can't wait!

http://www.amazon.com/EVGA-GeForce-GAMING-Graphics-08G-P4-6286-KR/dp/B01GAI64GO/


----------



## bfedorov11

Quote:


> Originally Posted by *Zurv*
> 
> EK blocks here.. now if some cards would show up. I'm kicking myself for picking ground from Newegg. I did pick overnight for the cards from amazon.. but.. *sigh* those 4 cards changed from "come today by 8pm" to.. "got me. I don't know when these cards are going to ship"
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> the 4 newegg cards come on thrusday. I DID pick overnight for the 6950x (whenever that ships
> 
> 
> 
> 
> 
> 
> 
> )


Yeah, Newegg always gets new products in their California warehouse first. If they use UPS ground it's like 7 business days across the country. I learned that lesson. I got tracking this morning for my card; I missed the Friday cutoff.

Also, it seems like Amazon always oversells and pushes dates back. I would rather F5 Newegg all day than preorder from Amazon.
Quote:


> Originally Posted by *keesgelder*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Here are some of my benchmark results for the 1080 SLI. All benchmarks except the 3DMARK benchmarks were run in 3440*1440 using the PC in my signature. As for game settings, only presets were used, no manual maxing of settings. The 1080's were run on stock settings (even the fan profile), whereas the 780Ti's were slightly overclocked. All reference models on air. All benchmarks were performed using this SLI bridge.
> 
> 
> 
> 
> 
> 
> 
> On AVERAGE, the 1080 SLI setup was 1.57 times faster than the 780 Ti setup. Especially for the Unigine benchmarks I find the results relatively disappointing. Given the reviews I've seen I expected closer to 2x the performance of the 780Ti's. I haven't performed any test without SLI yet (except the Tomb Raider DX12 test, obviously), so I can't really say anything reasonable regarding scaling yet.


Kepler cards were strong in Unigine benchmarks. I remember when the 980 first launched, gains were not impressive out of the gate. FS Ultra is double.. where it counts IMO.


----------



## Zurv

Quote:


> Originally Posted by *Shadowdane*
> 
> Just a FYI!
> 
> *EVGA 1080 FTW* is up for pre-order on Amazon!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just ordered 2, can't wait!
> 
> http://www.amazon.com/EVGA-GeForce-GAMING-Graphics-08G-P4-6286-KR/dp/B01GAI64GO/


Hrmm.. I wonder if the EK block will fit this (it isn't listed on EK's site.. but that could be because EK doesn't list cards that aren't out yet.. and they didn't update..)

*update: I don't think so. Blah to full blocks. I don't care about the see-thru parts of it, but that would cause a problem on the FTW cards.


----------



## Alwrath

Quote:


> Originally Posted by *Shadowdane*
> 
> Just a FYI!
> 
> *EVGA 1080 FTW* is up for pre-order on Amazon!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just ordered 2, can't wait!
> 
> http://www.amazon.com/EVGA-GeForce-GAMING-Graphics-08G-P4-6286-KR/dp/B01GAI64GO/


Thanks for the info bro, just ordered one myself. Will be a great upgrade coming from my radeon 290.


----------



## Shadowdane

And Amazon just listed 2 more models! Both of these are reference boards, just with different coolers.

EVGA 1080 ACX3.0: $619.99 - http://www.amazon.com/dp/B01GE9RFNK/
EVGA 1080 SC ACX3.0: $649.99 - http://www.amazon.com/dp/B01GAI6478/


----------



## Rhuarc86

Quote:


> Originally Posted by *Zurv*
> 
> hrmm.. i wonder if the EK block will fit this (it isn't listed on EK's site.. but that could be that EK doesn't list cards that aren't yet.. and they didn't update..)
> 
> *update: i don't think so. blah to full blocks. I don't care about the see thru parts of it, but that would cause a problem on the ftw cards.


I believe, unless they've changed their mind, that EK will not be making full cover blocks for EVGA non-reference cards.


----------



## Spiriva

Waterblock came in today from EK











Now I'm just waiting for the EK backplate for the card; hopefully it will come on Thursday so I can hook it up to the loop.

The backplate in the picture is another custom one, but I think I'm gonna end up using the EK one that is all black, and it might (?) add some passive cooling to the card.


----------



## Zurv

Quote:


> Originally Posted by *Rhuarc86*
> 
> I believe, unless they've changed their mind, that EK will not be making full cover blocks for EVGA non-reference cards.


They do make them.. just not for FTW cards. They will do ones for the Classy or Kingpin


----------



## Zurv

Quote:


> Originally Posted by *Spiriva*
> 
> Waterblock came in today from EK
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now im just waiting for the EK backplate for the card, hopefully it will come on thursday, so i can hook it up to the loop
> 
> 
> 
> 
> 
> 
> 
> 
> The backplate on the picture is another custom one, but i think im gonna end up using the EK one that is all black and might (?) add some passive cooling to the card


You can't use the backplate that came with the card? I.e., do I have to order backplates now?









EDIT:
from EK's site:
Quote:


> NVIDIA® GeForce GTX 1080 factory backplate is not compatible with this water block!
> •New NVIDIA® SLI HB bridges are not compatible with this water block. You can still use regular SLI bridge that comes with every SLI supported motherboard.


----------



## Spiriva

Quote:


> Originally Posted by *Zurv*
> 
> You can't use the blackplate that came with the card? ie, do i have to order backplates now?


I think the "original" backplate will work just fine; I just find the Nvidia backplate pretty ugly and wanted something that looked a bit better.

The EK backplate is black with no weird lines or anything on it; I think that looks a lot better than the Nvidia one.


----------



## iARDAs




----------



## Outcasst

Apparently the EK Block isn't compatible with the HB SLI Bridge.


----------



## Spiriva

Quote:


> Originally Posted by *Outcasst*
> 
> Apparently the EK Block isn't compatible with the HB SLI Bridge.


I think someone from EK said in another thread that EK will make their own SLI bridge though.


----------



## VSG

The port terminal will hit the HB bridge on the EK block, so they (EK) are making their own HB bridge for this. I did let some of the other block makers know so they are aware of it. Bitspower might have a similar issue, and it's too late to change there either as their block is coming out this week, but Watercool, Koolance and XSPC are making sure their blocks have no issues, and Aquacomputer blocks are compatible out of the box.


----------



## Bogga

Quote:


> Originally Posted by *Zurv*
> 
> they do make them.. just not FTW cards. They will do ones for classy or kingpin


From what I've read, EK will only make blocks that fit the reference PCB for EVGA, no other cards... not the FTW, not the Classified, and they won't do the block for the HC.

Check EVGA forum...
Quote:


> Originally Posted by *Spiriva*
> 
> I think someone from EK said in another thread that EK will make thier own sli bridge tho.


Yes, I read that as well... since I'll be going ek/barrow only in my build an EK-bridge will be perfect


----------



## Vellinious

Quote:


> Originally Posted by *Bogga*
> 
> From what I've read. EK will only make blocks that fit the reference pcb for EVGA, no other cards... not FTW, not Classified and won't do the block for the HC.
> 
> Check EVGA forum...
> Yes, I read that as well... since I'll be going ek/barrow only in my build an EK-bridge will be perfect


Yup, confirmed by the EK rep in the EVGA forums.


----------



## kcuestag

Quote:


> Originally Posted by *iARDAs*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Phanteks making waterblocks? That's something new for me.


----------



## Shaded War

What's the deal with overclocking on the 1080? I've read several accounts of them being incapable of going over 2050-2100 because of a voltage issue, even with the custom cards that have an extra 6/8-pin. Also saw a post that with LN2 they only got it to ~2300, which seems like another confirmation that this is an issue with the GPU and nothing can fix it.

I'm looking to buy the 1080 once some custom cards release, but from what I've been seeing the GPU is stuck at a 2100 wall, so spending extra money on a high-end variant with lots of power management would be a waste of money. The Zotac had 16+3 phases, which caught my interest, but supposedly it won't matter, since with the voltage issues it will never OC above ~2100MHz anyway.

Should I just look for the best silent heatsink design and not even bother looking at power management / phases etc., since it won't make a difference anyway? I need some pointers here to give me an idea of what we have going on.


----------



## Scrimstar

Honestly, it would only yield 5% extra fps at max. Get the quietest one.


----------



## BehindTimes

Quote:


> Originally Posted by *Shadowdane*
> 
> Nice!! Yah I'm wondering if the HB bridge will offer any improvements. It wouldn't surprise me honestly the old SLI bridges haven't changed for years now.
> 
> On another note your FireStrike score seems a little low, but might be my CPU overclock though if your CPU is running stock.
> FireStike 2x980Ti with i7-6700K - *24,505*: http://www.3dmark.com/fs/7881203


Hopefully it will.

Just got my SLI earlier this morning. Ran a couple tests, but everything on stock.

Fire Strike: http://www.3dmark.com/fs/8642394
Fire Strike Ultra: http://www.3dmark.com/fs/8642155

The Ultra is a bigger jump than the regular, but still only about 10% over stock Titan X SLI. No overclocking yet, as I'm waiting for the new bridge.

Volume wise, I'm pleased with the cards. They're fairly quiet even under load.


----------



## Jpmboy

Quote:


> Originally Posted by *Lays*
> 
> I was talking about the 1080, not the Titan X.


Oh.. because the quote in your post was about a TX. So, there's a 1080 bios editor and flasher???


----------



## Shadowdane

Regarding overclocking.. it appears the 1080 has a hard voltage limit of 1.25v.
At this point no one has said whether this is just a BIOS limitation or something that would require physical mods to overcome.

I don't think there are any programs out yet that can even read the BIOS files for editing.


----------



## Zurv

From EK re: water blocks for custom cards. Short version: EVGA won't be getting any (unlike in the past, when the Classy and Kingpin did get versions)
Quote:


> Unfortunately, the non-reference edition EVGA cards will not be getting a waterblock this time around. As of now, we can confirm that the ASUS STRIX, MSI Gaming X, and the Gigabyte G1 Gaming GTX 1080 cards will be getting waterblocks. If you have any other questions, please let us know!


----------



## Lays

Quote:


> Originally Posted by *Jpmboy*
> 
> oh.. cause the quote in your post was to a TX. So, there's a 1080 bios editor and flasher???


I'd imagine so. Der8auer said they had tried an XOC bios on a 1080 (I think even with an e-power?) and when volts went above 1.25v the driver reset.

Quote:


> Originally Posted by *Shadowdane*
> 
> Regarding overclocking.. it appears the 1080 has a hard limit on voltage of 1.25v.
> At this point no one has said if this is just a BIOS limitation or something that would require physical mods to overcome.
> 
> I don't think there are any programs out yet that can even read the BIOS files for editing.


From what I've read in all these random threads in HWBOT forums, the voltage can go above 1.25v. But the driver "crashes" or "resets" when this happens.


----------



## emett

W00t, EVGA FTW 1080 ordered. Back in the game ladies.


----------



## MrDerrikk

Quote:


> Originally Posted by *emett*
> 
> W00t, EVGA FTW 1080 ordered. Back in the game ladies.


Where'd you order from, being is WA and all? I'm having trouble finding any Australian sites that even have the preorders available...


----------



## OverK1LL

Quote:


> Originally Posted by *kcuestag*
> 
> Anyhow, don't think I posted a picture yet? Here they are, installed:


Yo Kevin! Long time no see. Hope all is well. Rig is looking SICK!!!


----------



## Jpmboy

Quote:


> Originally Posted by *Lays*
> 
> I'd imagine so, Der8auer said they had tried an XOC bios on a 1080,(I think even with an e-power?) and when volts went above 1.25v the driver reset.
> From what I've read in all these random threads in HWBOT forums, the voltage can go above 1.25v. But the driver "crashes" or "resets" when this happens.


Yeah, we can thank nvidia for the increasing bios/driver voltage-control clash. I'm not sure we can draw any conclusions from the behavior of a Franken-1080 about the performance of a reference board with a bios voltage tweak. The lore was that the TX (Maxwell) didn't respond to voltage either in the extreme setting - and it doesn't, but it sure helped many folks in the ambient range, as evidenced by the success of the "hi-voltage" bioses in the TX thread.

Unfortunately, the effects of cryogenic cooling on circuit behavior kinda negate any relevance to ambient conditions. We'll have to see how voltage affects Pascal under normal conditions with some better bioses to be sure.


----------



## RedRumy3

Hey guys, just got my card today, installed it, and ran the 3DMark demo. Was wondering if the graphics score seems about right for my stock 1080 and my i5 @ 4.6GHz?


----------



## Mistwalk

With my crappy 4770k we're sitting here:

http://www.3dmark.com/fs/8643728

Glad to see where we will be later on when I change out to Broadwell-E.


----------



## Burke888

Not water cooling, but I did add the Hi-Flow bracket to the rear I/O on each card. Should help keep temps a little lower for SLI.

Maybe tomorrow we will hear something about AMD, and Nvidia will release the Ti's in response. One can only hope!


----------



## VSG

Quote:


> Originally Posted by *Burke888*
> 
> Not water cooling but I did add the Hi-Flow bracket to the rear IO on each card. Should help keep temps a little lower for SLI.
> 
> 
> 
> Maybe tomorrow we will hear something about AMD and Nvidia will release the Ti's in response. One can only hope!


AMD's Computex event just finished an hour ago. Nothing that affects the GTX 1080 - yet. More on June 29.

I do like that high-flow bracket though. Hope it helps you out


----------



## Benjiw

Quote:


> Originally Posted by *Lays*
> 
> I'd imagine so, Der8auer said they had tried an XOC bios on a 1080,(I think even with an e-power?) and when volts went above 1.25v the driver reset.
> From what I've read in all these random threads in HWBOT forums, the voltage can go above 1.25v. But the driver "crashes" or "resets" when this happens.


Hey Lays, are you the same one from LTT? I got banned by Godly gamer so I'm not on there. Looking to pick up 2 of these 1080s in September! Maybe a 3rd for the gf.


----------



## Lays

Quote:


> Originally Posted by *Benjiw*
> 
> Hey Lays, you the same one from LTT? I got banned by Godly gamer so I'm not on there, looking to pick up 2 of these 1080s in september! Maybe a 3rd for the gf.


Who else would have this baller name?


----------



## gamingarena

Quote:


> Originally Posted by *Burke888*
> 
> Not water cooling but I did add the Hi-Flow bracket to the rear IO on each card. Should help keep temps a little lower for SLI.
> 
> 
> 
> Maybe tomorrow we will hear something about AMD and Nvidia will release the Ti's in response. One can only hope!


Where did you get that high-flow bracket from? And is it for the 980 Ti? I guess it's a perfect fit for the 1080.
Let me know, I'm interested in getting 2.
Thanks


----------



## FaStVtEc

EVGA Back in stock!

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6180-KR


----------



## TK421

Not sure if I'm tripping balls or not, but I just did a run with +825 on the mem :|

http://www.3dmark.com/3dm/12265798


----------



## Clockster

Quote:


> Originally Posted by *TK421*
> 
> Not sure if I'm tripping balls or not, but I just did a run with +825 on the mem :|
> 
> http://www.3dmark.com/3dm/12265798


Yeah, and you lost score because of it.
I was also able to push my memory sky high and ended up losing score thanks to it.

Anyway

System Specs:

i7 5930K @ 4.5GHz
MSI X99A Gaming 9 ACK
Kingston HyperX Fury DDR4 2400 16GB
Gigabyte GTX 1080 FE, stock + overclocked 210 core / 460 mem, game stable.
Corsair Neutron XT 480GB

*The Witcher 3 (Blood and Wine Expansion)
3-minute benchmark, incl. a fight and horseback riding.

1440p (all settings maxed, incl. Nvidia HairWorks etc.)*

Stock
Avg: 60.656 - Min: 54 - Max: 73

Overclocked
Avg: 66.272 - Min: 56 - Max: 78


----------



## MrTOOSHORT

Must be error correction setting in when the memory is pushed too high.


----------



## TK421

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Must be error correction setting in if memory pushed too high.


True, I get lower benchmark scores at +800 compared to +500, though the AIDA64 GPGPU benchmark shows an improvement in copy.

So for now, stick to the +400-500 range?
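The "sweet spot" behavior described above is usually attributed to the memory's error detection and retry kicking in: raw bandwidth keeps rising with the clock, but retransmitted transfers eat the gains. A toy Python model of that trade-off (the retry curve and the +500 knee are made-up numbers purely for illustration, not measured behavior of GDDR5X):

```python
# Toy model of why benchmark scores can drop past a memory-offset sweet
# spot: error detection retries corrupted transfers, so
#   effective bandwidth = raw bandwidth * (1 - retry_fraction).
# The retry curve below is hypothetical, chosen only to illustrate the
# peak-then-drop shape reported in the thread.
def effective_bandwidth(offset_mhz: float) -> float:
    raw = 1.0 + offset_mhz / 5000.0                 # raw bandwidth grows with clock
    retry = 0.0 if offset_mhz <= 500 else (offset_mhz - 500) / 1000.0
    return raw * (1.0 - retry)

for off in (0, 250, 500, 650, 800):
    print(off, round(effective_bandwidth(off), 3))  # peaks at the +500 knee
```

In this model the score peaks at the knee and falls off past it, which matches the "+800 benches lower than +500" observation, even though a pure-copy test can still show higher raw throughput.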


----------



## traxtech

Anyone else getting flickering when the P-state changes?? It's getting really annoying.


----------



## phalae

Quote:


> Originally Posted by *Clockster*
> 
> Yeah and you lost score because of it.
> I was also able to push my memory sky high and ended up losing score thanx to it.
> 
> Anyway
> 
> System Specs:
> 
> i7 5930K @ 4.5Ghz
> MSI X99A Gaming 9 ACK
> Kingston HyperX Fure DDR4 2400 16GB
> *Gigabyte GTX 1080FE Stock + Overclocked 210 Core/ 460 mem Game stable.*
> Corsair Neutron XT 480GB


Quick question about your OC: is it at default vcore?


----------



## TK421

Quote:


> Originally Posted by *phalae*
> 
> Quick question about your OC, is it default vcore ?


I can't seem to clock above +190 without artifacts, unless I add +100 vcore


----------



## Clockster

Quote:


> Originally Posted by *phalae*
> 
> Quick question about your OC, is it default vcore ?


Should have mentioned this: that's at +50 core voltage in Afterburner.


----------



## Silent Scone

Just picked up EK Acetal for the FE. Card should be here on the 3rd with the block. This thing will be mounted on the wall in front of me. Only one cure for blowers, and that's blocks. Blocks every time.


----------



## VSG

Scone, we need to have a talk


----------



## Silent Scone

Quote:


> Originally Posted by *geggeg*
> 
> Scone, we need to have a talk


Blocks, blocks everywhere...

*rocks back and forth in the corner*


----------



## TheNoseKnows

So, has anyone managed to reach 2114MHz at 67C yet?


----------



## Maintenance Bot

Quote:


> Originally Posted by *TheNoseKnows*
> 
> So, has anyone managed to reach 2114MHz at 67C yet?


I can hold those clocks for about 30 seconds


----------



## OverK1LL

NVIDIA did a great job designing these. Anyone have any intel on the SLi bridges? Not that I really need them... A regular SLi bridge is giving me 172fps in BF4 at 3440x1440, but I'd still like one.


----------



## Outcasst

Mine arrived today. Founders edition.

Managed to get a constant 2010MHz on the core without throttling, fan speed at 75%. It's fairly loud, but I mostly have headphones on and a regular desk fan running anyway, so I can't hear it.


----------



## TK421

http://www.overclock.net/forum/newestpost/1601329

Now that Pascal has nvflash, can anyone cure the limit on these cards?


----------



## Zurv

Looks like EVGA will be selling HB SLI bridges.. including 3- and 4-way...
Interesting....


[ source: http://www.pcper.com/news/Graphics-Cards/Check-out-what-EVGA-has-store-you ]

Now if NVidia would put out that tool to unlock SLI.

I'm guessing the new NVflash will help with that before NVidia officially releases something.


----------



## Spiriva

Quote:


> Originally Posted by *topway*
> 
> *EVGA GTX1080 SC BIOS (86.04.17.00.80)*
> 
> EVGAGTX1080SC86.04.17.00.80.zip 149k .zip file
> 
> 
> 1080_1.jpg 49k .jpg file


Oh, so the BIOS can be saved and edited now?


----------



## TK421

Quote:


> Originally Posted by *Spiriva*
> 
> Oh so the bios could be saved and edited now ?


Save, yes.
Flash, maybe.
Edit, no.


----------



## trickeh2k

http://forums.guru3d.com/showpost.php?p=5282076&postcount=960

This... is kinda bad.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *trickeh2k*
> 
> This... is kinda bad.


Yeah, it's been mentioned several times so far in this thread. BIOS hackers are on the job as we speak; here's hoping it can eventually be overcome. That would be... kinda good!


----------



## Menno

2 EVGA 1080 FEs ordered, arriving Friday. Then I can say goodbye to my Fury X. I have a Dell 5K screen now, so let's see how SLI performs at that res. If both cards can hit 1.9-2GHz on a somewhat louder fan profile I'll be happy enough. Now I just have to search for an SLI bridge.









----------



## trickeh2k

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> Yeah, it's been mentioned several times so far in this thread. Bios hackers are on the job as we speak, here's hoping it can eventually be overcome. That would be ... kinda good!


They've bypassed limitations before, but this seems to be on another level. If I remember correctly, the hardware limitation of 1.25V has been around since at least Kepler. My card was also locked to 1.25V, but the limit was set to 1.212V anyway; skyn3t's awesome BIOSes and the overvolt mod tool removed those limitations. Here's hoping it'll work out once people are able to hack BIOSes and whatnot, but the fact that he's talking about making physical changes to the PCB to bypass its limits makes me kinda worried.


----------



## Vellinious

So for guys running LN2 this card sucks. Anyone with any sense should have had that figured out before they bought the FE (single 8 pin) and found out about the hardware voltage restriction..... /shrug

This isn't news..... The voltage restriction sucks, but....we'll see what the custom boards look like. lol


----------



## Naennon

Quote:


> Originally Posted by *trickeh2k*
> 
> skyn3t's awesome bioses


----------



## trickeh2k

Quote:


> Originally Posted by *Naennon*


...? They've been awesome for my Classy. Did I miss something?


----------



## shilka

At the end of the video you can see the new Gigabyte HB SLI bridge.


----------



## SynchroSCP

Has anyone checked whether the reference 980 waterblocks fit the reference 1080? From EK's website they look very similar.


----------



## maverikv

I just dropped mine into my SG13 last night and set it to +200 / +400 and played some witcher. I'll tweak more later but no issues so far.


----------



## TK421

Quote:


> Originally Posted by *maverikv*
> 
> I just dropped mine into my SG13 last night and set it to +200 / +400 and played some witcher. I'll tweak more later but no issues so far.


+200 craps out on me without a +100mV overvolt


----------



## OverK1LL

Quote:


> Originally Posted by *SynchroSCP*
> 
> Has anyone checked to see if the reference 980 waterblocks fit the reference 1080? From EK's website they look very very similar.


I doubt it, but this would be awesome since I still have all my blocks. Are you willing to take your 1080 apart?

Maybe I can look for a photo of the reference design and see if they match up to the block. I'm sure they don't


----------



## ilgello

Quote:


> Originally Posted by *OverK1LL*
> 
> I doubt it, but this would be awesome since I still have all my blocks. Are you willing to take your 1080 apart?
> 
> Maybe I can look for a photo of the reference design and see if they match up to the block. I'm sure they don't


Don't I wish this were true... Let us know, guys


----------



## gamingarena

Quote:


> Originally Posted by *Zurv*
> 
> Looks like EVGA will be selling HB SLI bridges.. including 3 and 4 way...
> interesting....
> 
> 
> [ source: http://www.pcper.com/news/Graphics-Cards/Check-out-what-EVGA-has-store-you ]
> 
> now if NVidia would put that tool out to unlock SLI.
> 
> I'm guessing the new NVflash will help with that before NVidia officially releases something.


Those are not 3- or 4-way SLI bridges; they're the same as NVIDIA's 2-way SLI bridges, just in three different sizes for different slot spacings, from 40mm to 80mm.


----------



## nexxusty

Quote:


> Originally Posted by *Naennon*


I know right?

zoson did the 900 series custom bioses. Skyn3t promised and didn't deliver.


----------



## SynchroSCP

Quote:


> Originally Posted by *OverK1LL*
> 
> I doubt it, but this would be awesome since I still have all my blocks. Are you willing to take your 1080 apart?
> 
> Maybe I can look for a photo of the reference design and see if they match up to the block. I'm sure they don't


Same here, I have a block ready to go if it does. I preordered the ACX 3.0 1080 so I won't have it until mid-June, but I'll definitely check then. If not, I think I'll just use a Thermosphere on it, since I already have one, with a baseplate/heatsink/fan on the VRMs.

I looked at the reference pictures on EK's website and it looks really close, so here's hoping. Found this comparison of the 1080 and 980 reference boards...


----------



## bfedorov11

My main concern would be a height difference. Those blocks are pretty exact and you have to tighten the screws all the way down. It looks like that top-right cap, where the 1080 is missing the second power connector, could be a problem depending on whether it's a full-size block. No die guard (or whatever it's called) on the 1080, either.


----------



## SynchroSCP

I'll definitely give it a try when my board comes: put thermal pads and TIM on and check contact and clearances. As long as the die has good contact and the cutouts fit, it's possible the rest can be made up with thermal pads.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Blocks, blocks everywhere...
> 
> *rocks back and forth in the corner*


Post a pic when it's _hanging._ I'll use a uni-block for a while to see if the temps buy anything.











Quote:


> Originally Posted by *Naennon*


Quote:


> Originally Posted by *nexxusty*
> 
> Skyn3t promised and didn't deliver.


Sometimes life and family get in the way.


----------



## looniam

Quote:


> Originally Posted by *nexxusty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Naennon*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know right?
> 
> zoson did the 900 series custom bioses. Skyn3t promised and didn't deliver.

something called _life_ happens ya know.









just saying.


----------



## nexxusty

Quote:


> Originally Posted by *looniam*
> 
> something called _life_ happens ya know.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just saying.


I understand that. Just stating a fact.


----------



## Noshuru

Quote:


> Originally Posted by *TK421*
> 
> Not sure if I'm tripping balls or not, but I just did a run with +825 on the mem :|
> 
> http://www.3dmark.com/3dm/12265798


Hey, I know you from Battlelog. Cool.


----------



## Spiriva

My EK backplate came today. Sadly, I only had time to put the block/backplate on the card; too much at work tomorrow to be able to drain the loop, take the Titan X out, put the 1080 in, and fill the loop back up again. Hopefully I'll have some time after work tomorrow for that









I took some pics as I went along today; sorry for the bad quality, ****ty iPhone cam









The card all naked.









I added more thermal pads than the manual said, to match the pads from the FE cooler.









Again, I added one more little square of thermal pad to match the FE backplate.









EK waterblock installed on the 1080.









EK Backplate installed on the 1080.









NVIDIA has used some new hex screws on the FE cards under the original backplate. If anyone needs to know, the hex key size that fits is 4mm. The manual says to use pliers; I would absolutely not do that, though

















*Had to reupload the pics so they didn't go upside down.


----------



## looniam

Quote:


> Originally Posted by *nexxusty*
> 
> I understand that. Just stating a fact.


sorry, looked more like a pot shot. but hey, when someone's wife gets sick they still ought to follow through, right?


----------



## Jpmboy

Quote:


> Originally Posted by *Spiriva*
> 
> My EK backplate did come today, sadly I just had time to put the block/backplate on the card, to much at work tomorrow to be abel to drain the loop, take the Titan X out, put the 1080 in and fill the loop back up again, hopefully I will have some time over tomorrow after work for that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did take some pics while i went along today, sorry for the bad quality, ****ty iphone cam
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> The card all naked.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I added some more thermal pads then it said in the manual, I did that to match the FE pads from the FE cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Again i added one more little square of thermal pad to match the FE backplate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EK waterblock installed on the 1080.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EK Backplate installed on the 1080.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nvidia have used some new hex screws on the FE cards under the original backplate, if anyone needs to know what size of hexagon key that fits its 4mm, the manual state to use pliers, i would absolutly not do that tho
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Had to reupload the pics so they didnt go upside down.


QDCs. You gotta use QDCs!


----------



## N17 dizzi

Could one or more people indulge me. In percentage terms, how much faster is a 1080 over say a Titan X? (Overclocked or otherwise) I am hearing different things.

Many thanks.


----------



## Spiriva

Quote:


> Originally Posted by *Jpmboy*
> 
> QDCs. You gotta use QDCs!


Hehe yes, I'd just thought about that today when I realized time wasn't on my side


----------



## bfedorov11

Quote:


> Originally Posted by *N17 dizzi*
> 
> Could one or more people indulge me. In percentage terms, how much faster is a 1080 over say a Titan X? (Overclocked or otherwise) I am hearing different things.
> 
> Many thanks.


Without looking over the reviews again, I would guess around 5%, since the TX has more OC headroom and scaling. Stock vs. stock seems to average 15-20%. Don't forget future driver updates and how NVIDIA treats its EOL hardware.


----------



## VSG

Quote:


> Originally Posted by *Jpmboy*
> 
> QDCs. You gotta use QDCs!


Speaking of which,










Can't post the review links here, but those results should help.


----------



## Silent Scone

Quote:


> Originally Posted by *geggeg*
> 
> Speaking of which,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't link the review links here but those results should help.


Shup.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Jpmboy*


I need glasses - I thought that was a Manicorn there for a second









Where is the Manicorn? Waiting for the big block Pascal I reckon.


----------



## TK421

Quote:


> Originally Posted by *Noshuru*
> 
> Hey, I know you from Battlelog. Cool.


I don't hang around anymore, community has evolved from mere cancer to world epidemic.

STR8_AN94BALLER or dr0neya (alt) is my username


----------



## FaStVtEc

Asus in stock
http://www.newegg.com/Product/Product.aspx?Item=N82E16814126101


----------



## traxtech

Pretty weird how higher memory clocks negate performance but won't artifact, lol. Mem +600 seems to be the best I can do without going backwards.


----------



## TK421

Quote:


> Originally Posted by *traxtech*
> 
> Pretty weird how higher memory negates performance but wont artifact lol. Mem +600 seems to be the best I can do without going backwards.


I'll try 600


----------



## Burke888

Quote:


> Originally Posted by *gamingarena*
> 
> Where did you get that high flow bracket from? and is it for 980Ti? i guess its perfect fit for 1080.
> let me know im interested in getting 2
> Thanks


I actually pulled these off of my old GTX 980 SC reference EVGA cards. It's the same fit.


----------



## PasK1234Xw

Finally got around to installing mine. I got them from EVGA and they are garbage for OC; can't even hit 2GHz stable. This is unreal.


----------



## Rhuarc86

Thermal limit?


----------



## TK421

Quote:


> Originally Posted by *Rhuarc86*
> 
> Thermal limit?


power consumption limit


----------



## N17 dizzi

Quote:


> Originally Posted by *bfedorov11*
> 
> Without looking over reviews again, I would guess around 5% since the TX has more OC headroom and scaling. Stock vs stock seems to average 15-20%. Don't forget future driver updates and how nvidia treats it's EOL hardware.


5%? Is that for real? Why would anyone buy a 1080?

So if an overclocked Titan X is pulling 60fps, an overclocked 1080 is at 63fps?


----------



## criznit

EVGA Founders Edition available, if anyone is interested!


----------



## bfedorov11

Quote:


> Originally Posted by *N17 dizzi*
> 
> 5%? Is that for real? Why would anyone buy a 1080.
> 
> So if an overclocked Titan X is pulling 60fps, an overclocked 1080 is at 63fps?


A 1500MHz TX would only be obtainable under water, while it seems the 1080 can hit its max OC on air cooling. That could become 10% in a few months after driver updates... who knows. Still need more comparisons from users and not review sites.
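
To put the percentages being thrown around into concrete frame rates, here's a quick sketch. The 60fps baseline and the 5% / 15-20% figures are just the numbers from this thread, not measured benchmarks:

```python
# Apply a relative performance difference (in percent) to a frame rate.
# The baseline fps and percentages are illustrative thread numbers only.

def scaled_fps(base_fps: float, percent_faster: float) -> float:
    """Return base_fps scaled up by percent_faster percent."""
    return base_fps * (1 + percent_faster / 100)

# OC Titan X at 60 fps vs. an OC 1080 ~5% faster:
print(round(scaled_fps(60, 5), 1))   # 63.0
# Stock vs. stock at ~15-20% faster:
print(round(scaled_fps(60, 15), 1))  # 69.0
print(round(scaled_fps(60, 20), 1))  # 72.0
```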


----------



## Benjiw

Quote:


> Originally Posted by *N17 dizzi*
> 
> 5%? Is that for real? Why would anyone buy a 1080.
> 
> So if an overclocked Titan X is pulling 60fps, an overclocked 1080 is at 63fps?


Let's not forget that people are still dealing with early release drivers and BIOS-set limits. I imagine once custom PCBs start hitting the shops, that 10% for the standard cards will increase somewhat.


----------



## Jpmboy

Quote:


> Originally Posted by *geggeg*
> 
> Speaking of which,
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't link the review links here but those results should help.


QD4 !! (stay with the plain jane metal... the anodized ones can get funky)
Quote:


> Originally Posted by *GnarlyCharlie*
> 
> I need glasses - I thought that was a Manicorn there for a second
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Where is the Manicorn? Waiting for the big block Pascal I reckon.


There's only one Manicorn!








Quote:


> Originally Posted by *traxtech*
> 
> Pretty weird how higher memory negates performance but wont artifact lol. Mem +600 seems to be the best I can do without going backwards.


Error correction is going on more than us overclockers want to admit.


----------



## VSG

Quote:


> Originally Posted by *Jpmboy*
> 
> QD4 !! (stay with the plain jane metal... the anodized ones can get funky)


Not anymore! New black Koolance QDCs have black paint on the outside and nickel plating (they weren't anodized, they were chrome plated before) where coolant comes in contact. They've been fine during my testing period of 10+ weeks.










Heck, even the new Alphacool black Eiszapfen QDCs are mostly fine too (brass as-is on the inside):


----------



## darkphantom

So I'm sitting on an EVGA GTX 1080 FE, but I'm not sure whether I should keep it or just wait for the aftermarket cards. Not sure the extra $100 is worth it; what do you guys think? I don't game at 4K right now, but I plan on upgrading my monitor to at least 1440p...

My other thought is just to SLI the 1070s when they come out. Ideas?


----------



## Jpmboy

Quote:


> Originally Posted by *geggeg*
> 
> Not anymore! New black Koolance QDCs have black paint on the outside and nickel plating (they weren't anodized, they were chrome plated before) where coolant comes in contact. They've been fine during my testing period of 10+ weeks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heck, even the new Alphacool black Eizapfen QDCs are mostly fine too (brass as-is on the inside):


That's good news. A couple of guys posted problems with the early black QD4s in a couple of these threads. I happen to like the polished metal/chrome look (on more than just QDCs)
Quote:


> Originally Posted by *darkphantom*
> 
> So I'm sitting on an EVGA FE 1080 GTX but not sure if I should keep it or just wait for the after market? Not sure the extra $100 is worth it - what do you guys think? I don't game at 4k right now but plan on upgrading my monitor to atleast 1440p...
> 
> My thoughts are just to SLi the 1070s when they come out, ideas?


One 1080 at 1440p may be good enough depending on what you play. A single 980 Ti KPE (1565/8300) was gaming very well at 1440p/120Hz here.


----------



## D749

So I installed my two GTX 1080 FEs. With SLI disabled I can enable VSYNC = "Fast," but with SLI enabled the "Fast" option is not listed. I sure as heck hope NVIDIA didn't pull a fast one on us SLI users.









*SLI = disabled*


*SLI = enabled*


----------



## Bogga

Quote:


> Originally Posted by *D749*
> 
> So I installed my two GTX 1080 FE and with SLI disabled I can enabled VSYNC = "Fast," but with SLI enabled the "Fast" option is not listed. I sure as heck hope Nvidia didn't pull a fast one on us SLI user.


Hmmm hope not. I'm going SLI and with a 60hz monitor this fast sync sounded like an exciting thing...


----------



## D749

Quote:


> Originally Posted by *Bogga*
> 
> Hmmm hope not. I'm going SLI and with a 60hz monitor this fast sync sounded like an exciting thing...


It seems so. Go back to my original post where I added images.


----------



## Bogga

Quote:


> Originally Posted by *D749*
> 
> It seems so. Go back to my original post where I added images.


Hmm, do you think it's a hard limitation, or something that can be fixed in newer drivers?

If it were enabled, what settings would you use in-game? Vsync on?


----------



## JoeGuy

I just wanted to ask a quick question on prices before I decide what to buy.

I've been looking at stores everywhere for decent prices, and the EVGA ACX 3.0 for €679, direct from the site, is the only card that matches US prices.

€553 (minus VAT) works out at $620, just like the US. No other card works out equally; some are crazy. Is this a good job from EVGA, or is the price difference just early-adopter inflation that EVGA, as a direct seller, bypasses?

Would the EVGA be a good choice or will the cards level off to US prices soon in your experience?

Thanks.


----------



## D749

Quote:


> Originally Posted by *D749*
> 
> So I installed my two GTX 1080 FE and with SLI disabled I can enabled VSYNC = "Fast," but with SLI enabled the "Fast" option is not listed. I sure as heck hope Nvidia didn't pull a fast one on us SLI user.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *SLI = disabled*
> 
> 
> *SLI = enabled*


Well, an NVIDIA rep just confirmed that this is "by design": https://forums.geforce.com/default/topic/938259/geforce-drivers/official-geforce-gtx-1080-368-25-whql-display-driver-feedback-thread-released-5-26-16-/post/4893255/#4893255. That really sucks.









Anyone have contacts at NVIDIA they can reach out to about this?


----------



## Silent Scone

Quote:


> Originally Posted by *D749*
> 
> Well an NVIDIA rep. just confirmed that this is "by design:" https://forums.geforce.com/default/topic/938259/geforce-drivers/official-geforce-gtx-1080-368-25-whql-display-driver-feedback-thread-released-5-26-16-/post/4893255/#4893255. That really sucks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Anyone have contacts at NVIDIA they can reach out to about this?*


lol, like whom? AFR probably isn't compatible with this function, hence it's disabled when SLI is detected. I'd recommend buying a G-Sync panel.


----------



## D749

Quote:


> Originally Posted by *Silent Scone*
> 
> lol, like whom? AFR probably isn't compatible with this function, hence it's disabled when SLI is detected. I'd recommend buying a G-Sync panel.


I have a PG348Q, which is a G-SYNC panel. What does that have to do with it? The benefit of Fast Sync comes when you go past 100Hz, which in my case happens.

Without Fast Sync it's either tearing or VSYNC. Neither is a great option.


----------



## Silent Scone

Quote:


> Originally Posted by *D749*
> 
> I have a PG348Q which is a GSYNC panel. What does that have anything to do with it? The benefit of Fast VSYNC is when you go past 100Hz in my case, which happens.
> 
> 
> 
> 
> 
> 
> 
> Without Fast SYNC it's either tearing or VSYNC. Neither are great options.


And what benefit is that exactly? I never had issues with tearing above 100hz.


----------



## traxtech

What voltage is this card meant to show? I'm only seeing 1.0310V in the new GPU-Z; I assume it's not reading it correctly?


----------



## traxtech

Really don't like the way voltage now works







guess i don't like change


----------



## jeanjean15

Hi .

Does anybody have an overclocking result with the FE edition + waterblock (EK, for example), please?


----------



## fernlander

Quote:


> Originally Posted by *N17 dizzi*
> 
> 5%? Is that for real? Why would anyone buy a 1080.
> 
> So if an overclocked Titan X is pulling 60fps, an overclocked 1080 is at 63fps?


Which is why I won't be buying one.


----------



## VSG

Quote:


> Originally Posted by *jeanjean15*
> 
> Hi .
> 
> Nobody has an overclocking result with FE edition + waterblock ( EK for example ) please ?




https://www.reddit.com/r/4lzkr9/first_person_with_a_water_cooled_gtx1080/

Not much to say: a big power and voltage limit to go with poor scaling. 2000-2100 MHz seems to be the norm. No idea about VRM temps, but they should be good for at least 40 A each at water-cooled temps.
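
As a back-of-the-envelope sanity check on that per-phase current figure, here's a rough estimate. The 180 W board power, ~1.05 V core voltage, and 5-phase count are assumptions about the FE board for illustration, not measured values:

```python
# Rough per-phase VRM output current estimate for a GPU power stage.
# Ignores VRM conversion losses and power drawn by memory/fan, so the
# real per-phase current would differ; all inputs here are assumptions.

def per_phase_current(board_power_w: float, vcore_v: float, phases: int) -> float:
    """Estimate average output current per VRM phase, ignoring losses."""
    total_current = board_power_w / vcore_v   # I = P / V
    return total_current / phases

# Assumed FE-like numbers: 180 W limit, ~1.05 V under load, 5 core phases.
amps = per_phase_current(180, 1.05, 5)
print(f"{amps:.1f} A per phase")
```

Even with everything lumped onto the core rail, the estimate stays comfortably under a 40 A-per-phase rating.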


----------



## Bogga

Quote:


> Originally Posted by *geggeg*
> 
> 
> __
> https://www.reddit.com/r/4lzkr9/first_person_with_a_water_cooled_gtx1080/
> 
> Not much to say, big power and volt limit to go with poor scaling. 2000-2100 MHz seems to be the norm. No idea about VRM temps but they should be enough for 40 A min each at water-cooled temps.


From what I can tell, he got 2.1GHz core and 11.4Gbps mem "without fuss"... but no more testing, clocks, or failures? Didn't he try more?


----------



## Spiriva

I'm sitting at work, just wanting to get home to put the card in the loop and have fun, but I guess I have to finish before I can go home.
Oh well, back to work


----------



## Bogga

Quote:


> Originally Posted by *Spiriva*
> 
> Im sitting at work just wanna get home to put the card in the loop now and have fun, but I guess I have to finnish before I can go home.
> Oh well back to work


Flex? Sudden massive headache? BTW... live near Gävle? I wanna see it


----------



## jeanjean15

Quote:


> Originally Posted by *geggeg*
> 
> 
> __
> https://www.reddit.com/r/4lzkr9/first_person_with_a_water_cooled_gtx1080/
> 
> Not much to say, big power and volt limit to go with poor scaling. 2000-2100 MHz seems to be the norm. No idea about VRM temps but they should be enough for 40 A min each at water-cooled temps.


OK, thanks.

The main takeaway seems to be that the card no longer reduces its frequency, thanks to the low GPU temperature.


----------



## jeanjean15

error


----------



## TK421

+475 is the highest mem clock I can do before the Firestrike score takes a hit


----------



## gree

If you don't want to deal with the 1080 OC mess, is getting a card with a high factory overclock a good alternative?


----------



## trickeh2k

Quote:


> Originally Posted by *gree*
> 
> If you don't want to mess with the 1080 OC-ing mess, is getting a card with a high factory overlock a good alternative?


You kinda answered your own question, didn't you?


----------



## TK421

Quote:


> Originally Posted by *gree*
> 
> If you don't want to mess with the 1080 OC-ing mess, is getting a card with a high factory overlock a good alternative?


Sissy...


----------



## gree

Quote:


> Originally Posted by *trickeh2k*
> 
> You kinda answered your own question, didn't you?


It's just that I've been given mixed answers.

Some say if you're not into OCing GPUs it's a waste to get an aftermarket card; get an FE.
Some say just to get a partner card with a high factory OC and you're good.
And then I've also heard it's silly to think you can get a plug-and-play card, that you have to OC to be able to play all the high-end games with decent fps.

I don't mind learning how to overclock; it's just that a lot of review sites are saying NVIDIA isn't giving much OC room in the new cards


----------



## trickeh2k

Quote:


> Originally Posted by *gree*
> 
> It's just I've been told mixed answers.
> 
> Some say if you're not into OCing gpus it's waste to get an aftermarket card, get a FE
> Some say just to get a Partner card with a high factory OC and you're good
> And then I've also heard it's silly to think you can get a plug and play card, that you have to have to OC to be able to play all the high end games with decent fps
> 
> I don't mind learning how to overclock, just a lot of review sites are saying nvidia isn't giving much OC room in the new cards


Overclocking a graphics card is the easiest kind of overclocking to do yourself. Overclocking a CPU is on a whole different level; I can totally understand why people would stay off that, but a GPU? Not really, since it's so easy to learn and do. At the moment, though, there's no point in buying one of the more expensive aftermarket cards, since the overclocking headroom is weak to say the least. Grabbing something like the EVGA FTW would probably suffice for most people.

Kinda weird question to ask in a forum for enthusiast overclockers, though. It's called OVERclock.net and not STOCKclock.net


----------



## gree

Quote:


> Originally Posted by *trickeh2k*
> 
> Overclocking a graphics card is the easiest way to do it by yourself. Overclocking a CPU is on a whole different level, I can totally understand why people would stay off that but for GPU? Not really since it's so easy to learn and do. At the moment though, there's no point in buying one of the more expensive aftermarket cards since the overclocking abilities are weak to say the least. Grabbing something like the EVGA FTW would probably suffice for most people though.
> 
> Kinda weird question though to ask in a forum for enthusiast overclockers. It's called OVERclock.net and not STOCKclock.net


Funny enough I'm on OC net cos I want to OC my cpu haha. But that's because I'm trying to get into editing and would like good rendering times.

I was thinking about risking the MSI 1080 Gaming Z, or just getting the MSI Aero OC, since my HTPC build has no airflow in the GPU section.


----------



## bfedorov11

Was on the early Fedex truck today







Won't have time to play with it till later


----------



## Zurv

If anyone wanted to see the side by side EK 980, titan x and 1080 blocks.



GOD DAMN NVIDIA! Where is the SLI unlock!?!?!?!?


----------



## VSG

^ That's a great picture. It's fairly easy to see that the Titan X/980Ti block won't work here but that GTX 980 block looks similar. The VRM/VRAM positions are still off relative to the core though, so that's why a new block is needed.


----------



## Baasha

So has anyone else gotten 4 1080s and "unlocked" 4 way SLI?

I just picked up 2 and want to know how/what to do to make sure I can run 4-way SLI on my UberRig before I buy another 2.

Also, what is the status of the HB bridge?


----------



## Zurv

Quote:


> Originally Posted by *Baasha*
> 
> So has anyone else gotten 4 1080s and "unlocked" 4 way SLI?
> 
> I just picked up 2 and want to know how/what to do to make sure I can run 4-way SLI on my UberRig before I buy another 2.
> 
> Also, what is the status of the HB bridge?


The unlock site isn't up yet, and it makes me really mad, as I have 4 cards just sitting on my table


----------



## Spiriva

The Titan X is now out of my case, I've got the 1080 installed and the loop filled back up, and now I'm working on getting all the bubbles out.

Good bye Mr Titan









Quote:


> Originally Posted by *Bogga*
> 
> Flex? Sudden massive headache? BTW... live near Gävle? I wanna see it


Hehe, I'm from Kungälv, so it's quite a few km up to Gävle


----------



## minisale

Quote:


> Originally Posted by *jeanjean15*
> 
> Hi .
> 
> Nobody has an overclocking result with FE edition + waterblock ( EK for example ) please ?


http://www.3dmark.com/fs/8618974

http://www.3dmark.com/fs/8618919

http://www.3dmark.com/fs/8618308

GFX Settings in Afterburner:
- Voltage: +100%
- PowerLimit: +120%
- Core clock: +230MHz
- Memory clock: +500MHz
- Temp max: 40°C

CPU: Intel [email protected],7GHz
Water cooling: EK Fullcover+Backplate


----------



## Playboyer670

Quote:


> Originally Posted by *minisale*
> 
> http://www.3dmark.com/fs/8618974
> 
> http://www.3dmark.com/fs/8618919
> 
> http://www.3dmark.com/fs/8618308
> 
> GFX Settings in Afterburner:
> - Voltage: +100%
> - PowerLimit: +120%
> - Core clock: +230MHz
> - Memory clock: +500MHz
> - Temp max: 40°C
> 
> CPU: Intel [email protected],7GHz
> Water cooling: EK Fullcover+Backplate


Sweet, I'll be ordering an EK waterblock. BTW, what were the temps like on the stock cooler before the waterblock?


----------



## mypickaxe

Quote:


> Originally Posted by *minisale*
> 
> http://www.3dmark.com/fs/8618974


Nice graphics score.


----------



## Hackslash

Still no EVGA FTW PCB shots?


----------



## Naennon

Quote:


> Originally Posted by *jeanjean15*
> 
> Hi .
> 
> Nobody has an overclocking result with FE edition + waterblock ( EK for example ) please ?


My 980 Ti vs. 1080, same board/same CPU/same RAM:

CPU @4500
Ram @2800
Rampage V Extreme
980ti @1450 / 4000
1080 @ 2088 / 5400

980ti Fire Strike NVIDIA GeForce GTX 980 Ti video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
1080 Fire Strike NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
980ti Fire Strike Extreme NVIDIA GeForce GTX 980 Ti video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
1080 Fire Strike Extreme NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME


----------



## gree

Quote:


> Originally Posted by *Naennon*
> 
> my 980ti vs 1080 same board/same cpu/same ram
> 
> CPU @4500
> Ram @2800
> Rampage V Extreme
> 980ti @1450 / 4000
> 1080 @ 2088 / 5400
> 
> 980ti Fire Strike NVIDIA GeForce GTX 980 Ti video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
> 1080 Fire Strike NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
> 980ti Fire Strike Extreme NVIDIA GeForce GTX 980 Ti video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
> 1080 Fire Strike Extreme NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5820K Processor,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME


Well, you definitely beat the 980 SLI score.
Do you have a 4K monitor to test whether it can actually get decent fps at 4K?


----------



## Spiriva

Okay, so on air the card boosted up to ~1920MHz with the fan spinning like nuts and the temp around 65-70C. On water it boosted to 1990MHz, and then I decided to overclock it some:

2202MHz on the first try. I didn't try higher yet, but so far no weird graphical errors or anything. I've been letting it run the Valley bench at this speed for about 40 minutes


----------



## TK421

Quote:


> Originally Posted by *Spiriva*
> 
> Okay, so on air the card boosted up to ~1920MHz with the fan spinning like nuts and the temp around 65-70C; on water it boosted to 1990MHz, and then I decided to overclock it some:
> 
> 
> Didn't try higher yet, but so far no weird graphical errors or anything


daaamn

mine goes to +200 with +100mV but gets some artifacts; works fine in Firestrike though


----------



## Spiriva

Quote:


> Originally Posted by *TK421*
> 
> daaamn
> 
> mine goes to +200 with +100mv but some artifact, works fine in firestrike though


I set mine to +300 to reach 2200; needless to say I'm pretty happy with it so far








What did you set your memory to? I haven't touched mine yet.


----------



## TK421

Quote:


> Originally Posted by *Spiriva*
> 
> I set mine to +300 to reach 2200; needless to say I'm pretty happy with it so far
> 
> 
> 
> 
> 
> 
> 
> 
> What did you set your memory to? I haven't touched mine yet.


+475, anything higher I start losing points in firestrike


----------



## Spiriva

Quote:


> Originally Posted by *TK421*
> 
> +475, anything higher I start losing points in firestrike


Gonna have to play around with the memory tomorrow and see what I can get it up to. At +200, what MHz is your card running?

I picked up a "HyperX Predator SSD 240GB M.2 PCIe", so I'm sitting here installing Windows 10 on my computer again. Wish I'd had time to do this before; I really wanna play around with the 1080 instead


----------



## mouacyk

For those of you pushing memory, it would be interesting to see the performance impact of GDDR5X error correction. To reveal it, OC the memory until it artifacts, then back down slowly until it doesn't anymore. Bench that memory speed and keep backing the clock down until the scores stop increasing. Error correction costs around 10-15% on Maxwell, and it looks like a mystery to people unaware of it.
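A minimal sketch of that back-down sweep. The score curve here is entirely made up (`simulated_bench` is a hypothetical stand-in; in practice you'd substitute a real benchmark run such as a Fire Strike graphics score):

```python
# Sketch of the sweep described above. GDDR5X error correction silently
# retries bad transfers, so inside the "corrected" zone a LOWER memory
# clock can score HIGHER. Walk the offset down until the score stops
# improving; that point is the real sweet spot.

def simulated_bench(offset_mhz):
    # Hypothetical score curve, not real data.
    base = 21000
    if offset_mhz <= 450:              # clean zone: score scales with clock
        return base + 6 * offset_mhz
    # corrected zone: retries cost more than the extra clock gains
    return base + 6 * 450 - 30 * (offset_mhz - 450)

def find_sweet_spot(bench, start_offset, step=25):
    """Back the offset down until the score stops increasing."""
    best_offset, best_score = start_offset, bench(start_offset)
    offset = start_offset - step
    while offset >= 0:
        score = bench(offset)
        if score <= best_score:
            break                      # lowering the clock no longer helps
        best_offset, best_score = offset, score
        offset -= step
    return best_offset, best_score

print(find_sweet_spot(simulated_bench, 500))  # -> (450, 23700)
```

Starting from an artifact-free +500 that is still inside the corrected zone, the sweep walks down to the +450 knee and stops there.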


----------



## D13mass

Guys, where are you buying non-reference cards?


----------



## Radox-0

Thought I would share my results so far from a comparison between my Titan X and 1080, each as a single card.









System specs were the same throughout, so just my day-to-day clocks on the 5960X @ 4.6GHz, cache @ 4.4GHz, DDR4 @ 2800MHz.

Titan X Settings
Stock Settings - 1316 Mhz core / 7012 Memory throughout tests
Overclock Setting - 1474 Mhz core / 8020 Memory throughout tests

1080
Stock Settings - 1848 Mhz core / 10012 Memory throughout tests
Overclock Setting - 2101 Mhz core / 11030 Memory throughout tests

Clocks can go higher on both; in fact the 1080 can get a notch above 2200MHz, but 2100MHz is a nice figure for it, and my second 1080 is somewhat terrible, falling over at 1100 (so back it goes) and having that lovely annoying fan spinning up and down bug. The Titan X's 1474MHz was my daily gaming profile; it could do higher for benching, but that was a decent figure overall for the TX. I also benched the memory at a couple of different speeds, and the settings above for the 1080 did not decrease the score.

All tests were at 3440 x 1440. Scores below are average fps with Min / Max in brackets, though for some tests the Max / Min are pretty crap representations as you can most likely tell.

*Far Cry Primal*
Titan X Stock - 53 fps (45/59)
Titan X OC'd - 60 fps (52/67)
1080 Stock - 61 fps (54/71)
1080 OC'd - 68 fps (59/78)

*Arkham Knight*
Titan X Stock - 63 fps (47/89)
Titan X OC'd - 68 fps (48/98)
1080 Stock - 80 fps (43/120)
1080 OC'd - 94 fps (75/136)

Checked this game 3 times as that's a pretty large and unexpected jump, but results are correct!

*Tomb Raider*
Titan X Stock - 88 fps (70/110)
Titan X OC'd - 99 fps (73/122)
1080 Stock - 97 fps (70/116)
1080 OC'd - 104 fps (80/128)

*Shadow Of Mordor*
Titan X Stock - 83 fps (55/150)
Titan X OC'd - 93 fps (48/141)
1080 Stock - 95 fps (52/181)
1080 OC'd - 105 fps (49/277)

*Metro Last Light*
Titan X Stock - 34 fps (10/130)
Titan X OC'd - 39 fps (12/162)
1080 Stock - 38 fps (20/78)
1080 OC'd - 42 fps (14/82)

*GTA 5*
Titan X Stock - 69 fps (26/147)
Titan X OC'd - 81 fps (31/134)
1080 Stock - 81 fps (23/150)
1080 OC'd - 87 fps (31/130)

*Hitman Absolution*
Titan X Stock - 68 fps (62/105)
Titan X OC'd - 78 fps (66/92)
1080 Stock - 85 fps (72/104)
1080 OC'd - 95 fps (78/114)

*Dragon Age Inquisition*
Titan X Stock - 61 fps (50/87)
Titan X OC'd - 68 fps (54/96)
1080 Stock - 73 fps (68/114)
1080 OC'd - 80 fps (68/118)

*Witcher 3 - Hierarch square (same spot with NPC's)*
Titan X Stock - 50
Titan X OC'd - 53
1080 Stock - 55
1080 OC'd - 57

*Valley - Extreme HD Preset*
Titan X Stock - 95 fps (24/190)
Titan X OC'd - 106 fps (36/211)
1080 Stock - 100 fps (32/201)
1080 OC'd - 115 fps (34/217)

*Firestrike*
Titan X Stock - Graphics score: 18596
Titan X OC'd - Graphics score: 20800
1080 Stock - Graphics score: 21371
1080 OC'd - Graphics score: 23700

*Firestrike - Extreme*
Titan X Stock - Graphics score: 8830
Titan X OC'd - Graphics score: 9721
1080 Stock - Graphics score: 10286
1080 OC'd - Graphics score: 11451

*Firestrike - Ultra*
Titan X Stock - Graphics score: 4488
Titan X OC'd - Graphics score: 4917
1080 Stock - Graphics score: 5101
1080 OC'd - Graphics score: 5624
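To put the OC'd game numbers above side by side, here's a quick script over the figures transcribed from the post (the geometric mean is just one way to summarize them):

```python
import math

# OC'd average fps from the results above, as (Titan X, GTX 1080) pairs.
results = {
    "Far Cry Primal": (60, 68),
    "Arkham Knight": (68, 94),
    "Tomb Raider": (99, 104),
    "Shadow of Mordor": (93, 105),
    "Metro Last Light": (39, 42),
    "GTA 5": (81, 87),
    "Hitman Absolution": (78, 95),
    "Dragon Age Inquisition": (68, 80),
    "Witcher 3": (53, 57),
}

ratios = [gtx / tx for tx, gtx in results.values()]
for (game, (tx, gtx)), r in zip(results.items(), ratios):
    print(f"{game}: +{(r - 1) * 100:.1f}%")

# Geometric mean of the per-game ratios (requires Python 3.8+ for math.prod).
geomean = math.prod(ratios) ** (1 / len(ratios))
print(f"Geometric mean uplift: +{(geomean - 1) * 100:.1f}%")
```

The spread is wide (about +5% in Tomb Raider up to +38% in Arkham Knight), with a mean uplift in the low-to-mid teens, which matches the overall tone of the post.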


----------



## traxtech

My best score to date: 5704 in Firestrike Ultra. Not bad considering my score is held back by my 4790K lol

http://www.3dmark.com/fs/8665839

My card does not seem to like the voltage slider at all, so I'm currently at +182 core / +520 memory with +0 on voltage.

Same thing happened with my Titan X on the stock BIOS; it was a slug, but when I got my hands on a custom vBIOS it really shone. Hoping it's the same for this card, considering I seem to be stuck at 2088 core.

I still don't fully understand how overclocking works on these new cards, especially the voltage slider and how the rule of +13 on the core is now gone.


----------



## rv8000

So I haven't been paying super close attention so far; what seems to be the general consensus on OC limitations, BIOS, TDP?


----------



## mouacyk

It's not an overclocker's card. There's a 1.25V hard limit even on LN2. You basically paid a premium for Boost 3.0, which took away all of the OC headroom. When P100 was revealed at nearly 1.5GHz, I mentioned that this time around Nvidia would have Boost do all the work. And they're charging a premium for it.


----------



## Derpinheimer

Have we seen anyone overvolt the card and OC yet? All I see is water cooled at stock voltage?


----------



## mouacyk

Quote:


> Originally Posted by *Derpinheimer*
> 
> Have we seen anyone overvolt the card and OC yet? All I see is water cooled at stock voltage?


Hardware Locked to 1.25v: source http://forum.hwbot.org/showthread.php?p=447721#post447721


----------



## Derpinheimer

Quote:


> Originally Posted by *mouacyk*
> 
> Hardware Locked to 1.25v: source http://forum.hwbot.org/showthread.php?p=447721#post447721


Yeah, I mean on water or air though? 2500 on LN2 is one thing, but what about the other 99.99%?


----------



## MrDerrikk

Would it be possible to bypass/short out the power limit part of the PCB? I saw an interesting video yesterday of a guy suggesting this is possible on the 1080 to bypass that particular limitation.


----------



## Testier

Quote:


> Originally Posted by *Spiriva*
> 
> I sat mine to +300 to reach 2200, needless to say im pretty happy with it so far
> 
> 
> 
> 
> 
> 
> 
> 
> What did you set ur memory too ? i havent touched mine yet.


What's your ASIC rating?

I'm thinking that with the hard-locked voltage, higher-ASIC cards might be preferable.


----------



## TK421

Quote:


> Originally Posted by *Testier*
> 
> What's your ASIC rating?
> 
> I'm thinking that with the hard-locked voltage, higher-ASIC cards might be preferable.


ASIC reading isn't supported by GPU-Z yet


----------



## Neon Lights

Quote:


> Originally Posted by *Spiriva*
> 
> I sat mine to +300 to reach 2200, needless to say im pretty happy with it so far
> 
> 
> 
> 
> 
> 
> 
> 
> What did you set ur memory too ? i havent touched mine yet.


Do you get 2200 game-stable?

Also, have you tried reducing the memory clock in order to get a higher GPU clock? Doing so, more power becomes available to the GPU, and I've seen a few people achieve higher GPU clocks that way.


----------



## Neon Lights

Has anyone tried shorting the shunt resistors (perhaps by using Coollaboratory Liquid Ultra liquid metal thermal paste, as shown in a video going around) in order to get more power to the GPU?


----------



## SweWiking

Quote:


> Originally Posted by *Spiriva*
> 
> Okay, so on air the card boosted up to ~1920MHz with the fan spinning like nuts and the temp around 65-70C; on water it boosted to 1990MHz, and then I decided to overclock it some:
> 
> 
> 2202MHz on the first try. Didn't try higher yet, but so far no weird graphical errors or anything. Been letting it run the Valley bench at this speed for about 40 mins


Darn, 2200MHz! Seems like a kickass card you got there! What did you end up setting your mem to? I really hope I get such a card too


----------



## Spiriva

Quote:


> Originally Posted by *Neon Lights*
> 
> Do you get 2200 game-stable?
> 
> Also, have you tried to reduce the memory clock in order to get a higher GPU clock (doing so more power becomes available to the GPU and I have seen that a few people have achieved higher GPU clocks that way)?


Yes, I'm running it at 2200 (+300 in AB) in games, well, at least so far. Played Doom this morning for almost an hour without any problems. The card never goes above 34C and sits stable at 2202MHz.
So far I just put the memory to +300 and that seems to work fine. I haven't had time to play around more with the memory or the core clock, but I hope I get some time this weekend to fine-tune it some more








Quote:


> Originally Posted by *SweWiking*
> 
> Darn, 2200MHz! Seems like a kickass card you got there! What did you end up setting your mem to? I really hope I get such a card too


I set the memory to +300 so far.


----------



## Neon Lights

Quote:


> Originally Posted by *Spiriva*
> 
> Yes, I'm running it at 2200 (+300 in AB) in games, well, at least so far. Played Doom this morning for almost an hour without any problems. The card never goes above 34C and sits stable at 2202MHz.
> So far I just put the memory to +300 and that seems to work fine. I haven't had time to play around more with the memory or the core clock, but I hope I get some time this weekend to fine-tune it some more


What are your exact OC settings (including Power Limit and Voltage) and what program did you use?


----------



## Spiriva

Quote:


> Originally Posted by *Neon Lights*
> 
> What are your exact OC settings (including Power Limit and Voltage) and what program did you use?


I use "MSI Afterburner 4.3.0 beta 4" with the following settings:

Core Voltage +100
Power limit 120
Temp Limit 92c
Core Clock +300
Memory clock +300


----------



## Bogga

What driver are you using? After the drivers that came after 362.00, I've become a bit wary of Nvidia's drivers


----------



## PasK1234Xw

My Firestrike run with the GPUs at 2000; I'll push more later



http://www.3dmark.com/3dm/12298819


----------



## Spiriva

Quote:


> Originally Posted by *Bogga*
> 
> What driver are you using? After the drivers post 362.00 one has become a bit afraid of the Nvidia drivers


I use 368.25; the only ones that work with the 1080, I think?


----------



## criznit

I just snagged a Gigabyte G1 1080 from amazon and now I just play the waiting game...


----------



## AllGamer

This bothers me...

https://www.ekwb.com/news/ek-unveils-new-nvidia-geforce-gtx-1080-water-blocks/

*PLEASE NOTE:

New NVIDIA® SLI HB bridges are not compatible with EK-FC1080 GTX water blocks! You can still use the regular SLI bridge that comes with every SLI supported motherboard.*

I clearly remember that during the live stream of the GTX 1080, Nvidia said that to SLI the new GTX 1080 we need to use the new hard bridges, not the cheap ones that come with motherboards.

.... so does that mean that if we go water with EK blocks, we won't be able to SLI two GTX 1080s?


----------



## VSG

Everyone else, including EK, is making their own version of the HB SLI bridge, and they all seem to occupy a slimmer profile. Those should fit. The stock Nvidia bridge, not even out yet anyway, is really going to look out of place with non-FE cards.


----------



## TK421

What difference does it make between the HB SLI and the old SLI bridge? Not much, right?


----------



## VSG

Quote:


> Originally Posted by *TK421*
> 
> What difference does it make between HB SLI and old SLI bridge? Not much right?


We don't know since the bridge isn't even out yet.


----------



## SynchroSCP

As I sit here waiting for mid-June when my EVGA 1080 ships, I'm rethinking my plan to use a Thermosphere on it. I'll suck it up and get a full block from EK, but I really prefer the EVGA backplate... is there a way to keep the EVGA backplate when using an EK reference block? The EK backplate for my TX was a nightmare: it never fit quite right, I had to use extra pads to make contact, and it was an additional $35 when the EVGA 1080 will come with a perfectly good one that I prefer the looks of.

Looking at pictures of both, it looks like there are 4 common screw holes, and the EK backplate comes with M2.5x7 screws. Should be possible; I'll go to the hardware store and get a couple of types of screws in that size to be ready.


----------



## jfro63

Quote:


> Originally Posted by *TK421*
> 
> 3rd ranked user on that pic had a house fire incident after benching


So what was the cause of the house fire?


----------



## Derpinheimer

Quote:


> Originally Posted by *Spiriva*
> 
> I use "Msi Afterburner 4.3.0 beta 4" wit hthe fallowing settings:
> 
> Core Voltage +100
> Power limit 120
> Temp Limit 92c
> Core Clock +300
> Memory clock +300


So this is 1.15V out of max 1.25V?


----------



## TK421

Quote:


> Originally Posted by *Derpinheimer*
> 
> So this is 1.15V out of max 1.25V?


At this point we don't even know if adding voltage is beneficial or not.


----------



## Derpinheimer

Quote:


> Originally Posted by *TK421*
> 
> At this point we don't even know if adding voltage is beneficial or not.


True. I keep seeing "voltage +100%", but I thought voltage was locked without a custom BIOS? Is it even doing anything?


----------



## TK421

Quote:


> Originally Posted by *Derpinheimer*
> 
> True. I keep seeing "voltage +100%" but I thought voltage was locked without a custom bios? Is it even doing anything?


There's a small window where you can adjust voltage, from default up to whatever max Nvidia allows us.


----------



## Jpmboy

Quote:


> Originally Posted by *PasK1234Xw*
> 
> My firestrike GPUs at 2000 ill push more later
> 
> 
> 
> http://www.3dmark.com/3dm/12298819


eh - a time error usually reflects unstable clocks (assuming you are not tweaking the HPET/RTC)


----------



## Menthol

Quote:


> Originally Posted by *geggeg*
> 
> We don't know since the bridge isn't even out yet.


Shouldn't 2 single flexible bridges be pretty much the same as an HB bridge?


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> eh - a time error usually reflects unstable clocks (assuming you are not tweaking the HPET/RTC)


2 at stock frequencies http://www.3dmark.com/fs/8640974

compared to 980ti's http://www.3dmark.com/compare/fs/5774797/fs/8640974#

so with some overclocking the 1080 should come out a little on top


----------



## Zurv

Quote:


> Originally Posted by *Menthol*
> 
> Shouldn't 2 single flexible bridges pretty much be the same as a HB bridge?


the flexible ones are really crappy

From worst to best: flex, hard (came with mobo), LED, HB.

HB is only needed for 5K or more; Hard or LED is fine for 4K. LED has more bandwidth than Hard. For LED or Hard, use the 3-way bridge for SLI vs the 2-way, since the 2-way only uses 2 bridges. (Also, if the bridge doesn't have the bandwidth needed, the cards will fall back to the PCIe bus.)


----------



## velocd

Quote:


> Originally Posted by *Zurv*
> 
> HB is only needed for 5k or more. Hard or LED is fine for 4k. LED has more bandwidth than Hard. For LED or Hard use the 3 way for SLI vs 2 way. 2 way only uses 2 bridges. (also, if the bridge doesn't have the bandwidth needed the cards will use the PCI bus.


I wonder about this, because the Nvidia chart doesn't specify Hz for 4K.

[Nvidia's SLI bridge recommendation chart]

I'm assuming it's at least 4K @ 60Hz, but what about 90Hz or 120Hz+? The HB bridge might be useful there.

Also notice the small text specifying smoother gameplay above 2560x1440 @ 120Hz.
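One rough way to see why the bridge matters more as resolution and refresh climb: in alternate-frame rendering, every frame the secondary GPU draws has to cross to the display GPU. A back-of-envelope sketch (the 4 bytes/pixel figure and the fall-back to PCIe are assumptions for illustration, not Nvidia's published bridge specs):

```python
# Per-second frame traffic the secondary GPU must push across the bridge
# (or the PCIe bus, if the bridge can't keep up) in AFR SLI.
def afr_transfer_gbs(width, height, fps, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * fps / 1e9

configs = {
    "2560x1440 @ 120Hz": (2560, 1440, 120),
    "3840x2160 @ 60Hz": (3840, 2160, 60),
    "3840x2160 @ 120Hz": (3840, 2160, 120),
    "5120x2880 @ 60Hz": (5120, 2880, 60),
}
for name, (w, h, hz) in configs.items():
    print(f"{name}: {afr_transfer_gbs(w, h, hz):.2f} GB/s")
```

By this estimate 5K @ 60Hz carries nearly double the traffic of 4K @ 60Hz, which lines up with the claim that the HB bridge only becomes necessary at 5K and above.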


----------



## AllGamer

Quote:


> Originally Posted by *velocd*


Yup... that's me right there, Surround....
I'll need that hard bridge to get Surround + SLI + 3D + VR to work nicely


----------



## Zurv

argh! unlock me NVidia!


----------



## velocd

Quote:


> Originally Posted by *Zurv*
> 
> argh! unlock me NVidia!


Has there been any word anywhere (since launch) when the request key website will be available?


----------



## AllGamer

Quote:


> Originally Posted by *Zurv*
> 
> argh! unlock me NVidia!


ah! thanks for the reminder.

I forgot to ask about exactly that part.

I've been using AIOs for a while, but this time I want to go full pro using a real EK kit for both the CPU and the SLI 1080s.

So, which parts do I need to order from the EK website for the *"Water SLI"*, to connect the two GTX 1080 blocks to the rest of the kit?

This is the list I've come up with so far:

- EK-KIT X240
- Two EK-FC1080 GTX
- extra bottle of cooling liquid
- (unknown size adapters and fittings to branch out for GPU cooling)
- (unknown parts for Water SLI?)


----------



## zGunBLADEz

Quote:


> Originally Posted by *Zurv*
> 
> argh! unlock me NVidia!


You do know EK's blocks aren't compatible with the HB SLI bridges from Nvidia, right?

Hopefully they do something about it..


----------



## Zurv

Quote:


> Originally Posted by *zGunBLADEz*
> 
> You do know EK's blocks aren't compatible with the HB SLI bridges from Nvidia, right?


I also know that there are no HB bridges for more than 2-way SLI (and EK is making HB bridges)


----------



## zGunBLADEz

Can somebody do an experiment: before and after temps, with and without the backplate, with active cooling on it?

Since the 1080 doesn't have any VRM/memory chips on the back of the PCB, other than 2 pieces behind the GPU and 1 that only has a thermal pad, I think the backplate is doing more harm than good on this card.

Talking about heat dissipation: it's not transferring heat to the backplate (a bigger metal surface area) at all, and the back of the card gets no ventilation, probably raising temps even more.


----------



## JonnyBigBoss

I have the EVGA GTX 1080 FE arriving at my home in just a few minutes. I'm struggling to decide whether or not to keep it.

*I don't plan to overclock the card, but I prefer to have the fan speed on a 1:1 curve (i.e. 60C temperature = 60% fan speed).

Does the reference design still get ridiculously hot under these conditions?* I really don't want my GPU running at over 70C consistently, but I find the custom coolers announced thus far visually unattractive, and I've had bad experiences with minor quirks on previous custom cards (whining noises and such).
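For what it's worth, the 1:1 curve described above is just a clamped identity map from temperature to duty cycle. A tiny sketch (the 30% floor is an assumption, since most cards won't spin the fan below some minimum):

```python
def fan_speed(temp_c, floor=30, ceiling=100):
    """1:1 fan curve: 60C -> 60% duty, clamped to the card's usable range."""
    return max(floor, min(ceiling, round(temp_c)))

for t in (45, 60, 72, 95):
    print(f"{t}C -> {fan_speed(t)}%")
```

Tools like Afterburner let you draw exactly this line in the fan-curve editor; the open question in the post is whether the FE cooler stays quiet when the curve is that aggressive.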


----------



## Spiriva

Quote:


> Originally Posted by *Zurv*
> 
> argh! unlock me NVidia!


That looks beautiful!








Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I have the EVGA GTX 1080 FE arriving at my home in just a few minutes. I'm struggling to decide whether or not to keep it.
> 
> *I don't plan to overclock or boost the card, but prefer to have the fan speeds on a 1:1 curve (i.e. 60C temperature = 60% fan speed).
> 
> Does the reference design still get ridiculously hot under these conditions?* I really don't want my GPU running at over 70 consistently, but I find the custom coolers announced thus far visually unattractive and have had bad experiences with minor quirks with previous custom card solutions (whining noises and such).


Mine was around 65-70C with the fan on auto when I tried it on air before installing the waterblock. If you don't mind the sound the fan makes, it shouldn't be a problem.


----------



## trickeh2k

Quote:


> Originally Posted by *Zurv*
> 
> argh! unlock me NVidia!


That's... just awesome!


----------



## Bogga

Quote:


> Originally Posted by *SynchroSCP*
> 
> As I sit here waiting for mid-June when my EVGA 1080 ships, I'm rethinking my plan to use a Thermosphere on it. I'll suck it up and get a full block from EK, but I really prefer the EVGA backplate... is there a way to keep the EVGA backplate when using an EK reference block? The EK backplate for my TX was a nightmare: it never fit quite right, I had to use extra pads to make contact, and it was an additional $35 when the EVGA 1080 will come with a perfectly good one that I prefer the looks of.
> 
> Looking at pictures of both, it looks like there are 4 common screw holes, and the EK backplate comes with M2.5x7 screws. Should be possible; I'll go to the hardware store and get a couple of types of screws in that size to be ready.


If I'm not mistaken, the backplate for the FE version doesn't play well with the EK block.


----------



## Zurv

leak hunting time

(and.. blah.. I found one, but on an old QD, which I removed)

I also put a 6950X in too.

I didn't use the backplates because the EK ones don't come till Monday. (I'll use them for my other system with the 3-way 1080s. Heat really isn't the issue; I worry about knocking chips off the back when I bang around in the case.)


----------



## Spiriva

Quote:


> Originally Posted by *Zurv*
> 
> leak hunting time
> 
> 
> 
> 
> 
> 
> 
> (and.. blah .. i found one.. but on an old ****ty QD.. which i removed)
> 
> i also put a 6950x in too


What an awesome system! Did Nvidia say anything yet about when they will let you run all 4 cards in SLI?
Will be fun to see some benchmarks!


----------



## Zurv

not a god damn thing!!!! I hope I can live with 2-way @ 4K with all the fancy on while I wait

Good thing Overwatch can do 4K with one card (and I'll play The Witcher on my other system with 3 Titan Xs).

I'm also going to focus on how much I can OC the 6950X in the meantime.

(And of course I won't wait long enough for the water to dry from the leak... cause I'm a dummy)


----------



## JonnyBigBoss

Can I technically step up an EVGA 1080 FE to one of their custom-cooler 1080s in a couple of months?


----------



## DADDYDC650

Just placed an order for an EVGA 1080 FTW! No taxes and free shipping at B&H which is awesome. They still have the ACX, SC and FTW available for pre-order.


----------



## stocksux

Quote:


> Originally Posted by *DADDYDC650*
> 
> Just placed an order for an EVGA 1080 FTW! No taxes and free shipping at B&H which is awesome. They still have the ACX, SC and FTW available for pre-order.


When will they ship?


----------



## RGSPro

Quote:


> Originally Posted by *stocksux*
> 
> When will they ship?


Their website says 16th of June.


----------



## stocksux

Thanks. I really want to hop on a pre-order so I'm not waiting forever, but I want to run a 1080 on water and EK isn't making blocks for the EVGA cards this go-around.

Gonna wait on a Strix, probably.


----------



## DADDYDC650

Quote:


> Originally Posted by *stocksux*
> 
> Thanks. I really want to hop on a pre order so I'm not waiting forever, but I want to run a 1080 on water and EK isn't making blocks for EVGA this go around
> 
> 
> 
> 
> 
> 
> 
> Gonna wait on a Strix probably.


I was going to go for the Strix but I'd rather have an EVGA 1080 since they offer amazing customer service and it'll be easier to sell. GL on your quest!


----------



## rv8000

Quote:


> Originally Posted by *DADDYDC650*
> 
> Just placed an order for an EVGA 1080 FTW! No taxes and free shipping at B&H which is awesome. They still have the ACX, SC and FTW available for pre-order.


Just bit the bullet on the pre-order; probably the last time I'll be buying a card for a long while. Excited, but I hate waiting. I've had nowinstock bleeping at me all day, a bit too frustrating! Thanks for the heads up!


----------



## DADDYDC650

Quote:


> Originally Posted by *rv8000*
> 
> Just bit the bullet on the pre-order, probably the last time I'll be buying a card for a long time, excited but I hate waiting. I've had nowinstock bleepin at me all day, a bit too frustrating! Thanks for the heads up!


Congrats, and no problem. I've been checking nowinstock as well since before launch. I almost bit the bullet on an FE, but why pay more for less? Now let's hope our cards ship on the 16th!


----------



## RGSPro

Quote:


> Originally Posted by *DADDYDC650*
> 
> Congrats and no problem. I've been checking out nowinstock as well since before launch. I almost bit the bullet on a FE but why pay more for less? Now let us hope our cards ship on the 16th!


Or sooner! I haven't seen the EVGA custom cards for sale anywhere except Amazon, but they aren't taking pre-orders and you have to pay tax with them as well. EVGA's email notification system isn't ideal, either.

The ASUS Strix card has 1 less DisplayPort; why, ASUS? EK's block fits all of the EVGA cards except the FTW.


----------



## stocksux

so...hard...not...to...pre-order... Ugh, I should just do it. I read that the EVGA cards will be getting a water block, just not from EK. But I love EK and my rig is all EK... what to do, what to do


----------



## DADDYDC650

Quote:


> Originally Posted by *stocksux*
> 
> so...hard...not...to...pre order... UH I should just do it. I read the EVGA cards will be getting a water block, just not from EK. But I love EK and my rig is all EK...What to do what to do


Support another company that makes blocks not made by EK. Competition is always good.


----------



## stocksux

Quote:


> Originally Posted by *DADDYDC650*
> 
> Support another company that makes blocks not made by EK. Competition is always good.


As is loyalty. Of course, only as long as what you're loyal to doesn't become complacent. EK seems to be staying on top of things with new products, so I can't really knock them.


----------



## RGSPro

Quote:


> Originally Posted by *DADDYDC650*
> 
> Just placed an order for an EVGA 1080 FTW! No taxes and free shipping at B&H which is awesome. They still have the ACX, SC and FTW available for pre-order.


Thanks for the heads up! In for 2 SC!


----------



## iCrap

Where are you guys buying the cards? I literally cannot find one to buy


----------



## RGSPro

Quote:


> Originally Posted by *iCrap*
> 
> Where are you guys buying the cards? I literally cannot find one to buy


http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/

Nvidia supposedly has them in stock. If you watch nowinstock long enough you can see them go in and out of stock on Newegg for the various FE cards.


----------



## Asus11

Can anyone post a Firestrike result using Windows 7 and an i7 6700K (or a close CPU)?

I want to compare it to my Titan X


----------



## criznit

Quote:


> Originally Posted by *DADDYDC650*
> 
> Just placed an order for an EVGA 1080 FTW! No taxes and free shipping at B&H which is awesome. They still have the ACX, SC and FTW available for pre-order.


I'm glad I saw this post lol. I cancelled my pre-order for the Gigabyte and got the SC version. I prefer EVGA over Gigabyte, so this worked out great!


----------



## DADDYDC650

Quote:


> Originally Posted by *stocksux*
> 
> As is loyalty. Of course as long as what you are loyal to doesn't become complacent. EK seems to be staying on top of things with new products so can't really knock them.


Have you seen these blocks for the reference 1080? http://www.overclock.net/t/1600401/various-gtx-1080-reviews/4770#post_25226811


----------



## DADDYDC650

Quote:


> Originally Posted by *criznit*
> 
> I'm glad I saw this post lol. I cancelled my pre order for the gigabyte and got the sc version. I prefer evga over gigabyte so this worked out great!


You can't go wrong with EVGA. Easier to sell once Vega/Big Pascal arrive.


----------



## Benjiw

Quote:


> Originally Posted by *DADDYDC650*
> 
> You can't go wrong with EVGA. Easier to sell once Vega/Big Pascal arrive.


What's Big pascal/Vega?


----------



## PasK1234Xw

Quote:


> Originally Posted by *Jpmboy*
> 
> eh - time error, usually reflects unstable clocks. (assuming you are not tweaking the HPET/RTC
> 
> 
> 
> 
> 
> 
> 
> )


Gotcha, haven't really messed with it. I bumped the voltage a little more.

http://www.3dmark.com/3dm/12309510

Edit: it shows higher individual scores than before, but a lower overall score?


----------



## mercs213

Does anyone know what material the heatsink on the Founders Edition coolers for the 1080/1070 is made of (i.e. copper, nickel-plated, etc.)? I want to apply Coollaboratory Liquid Ultra to the die and see if it drops temperatures, but I'm unsure whether it's aluminum or not. CLU eats aluminum, nom nom. Thanks in advance


----------



## DADDYDC650

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Gotcha havent really messed with. I bumped voltage little more
> 
> http://www.3dmark.com/3dm/12309510
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit
> 
> it shows higher scores than before but have lower overall score?


Nice score. Here are my TXs in SLI: http://www.3dmark.com/fs/5378668


----------



## PasK1234Xw

you want a cookie?


----------



## DADDYDC650

Quote:


> Originally Posted by *Benjiw*
> 
> What's Big pascal/Vega?


AMD and Nvidia's next top cards silly.
Quote:


> Originally Posted by *PasK1234Xw*
> 
> you want a cookie?


You must be hangry... just wanted to post for comparison. Other people might be interested.


----------



## nersty

Quote:


> Originally Posted by *DADDYDC650*
> 
> AMD and Nvidia's next top cards silly.
> You must be hangry... just wanted to post for comparison. Other people might be interested.


What were they clocked at? I know 3dmark has given some wonky speeds in the summary (Core clock 1,102 MHz is listed)


----------



## DADDYDC650

Quote:


> Originally Posted by *nersty*
> 
> What were they clocked at? I know 3dmark has given some wonky speeds in the summary (Core clock 1,102 MHz is listed)


I think the cores were around 1470-1510MHz and the memory around 8.0-8.2GHz.

I pre-ordered an EVGA 1080 FTW. Kinda depressing seeing that the scores are almost identical between 1080 OC SLI and TX OC SLI.


----------



## AllGamer

Quote:


> Originally Posted by *DADDYDC650*
> 
> Kinda depressing seeing that the scores are almost identical between a 1080 OC SLI and TX OC SLI.


depressing score for the 1080 SLI?
or
depressing score for the TX SLI?


----------



## DADDYDC650

Quote:


> Originally Posted by *AllGamer*
> 
> depressing score for the 1080 SLI?
> or
> depressing score for the TX SLI?


Depressing for the 1080 SLI, don't you think? Look at the graphics scores.


----------



## CallsignVega

Has anyone found a way to get this ridiculously crappy FE fan to stop spiking up to 2900 RPM and back down? Whoever did the BIOS for the FE at Nvidia is a moron.

My FE on the stock BIOS at 1.08V is maxing out at 2100MHz. That's only netting me about 13% better performance over my 1520MHz 980 Ti Lightning.









Any way to increase the voltage yet?


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Anyone find a way to get this ridiculously crappy FE fan to stop spiking up to 2900 and back down? Whoever did the BIOS for the FE at NVIDIA is a moron.
> 
> My FE on stock BIOS at 1.08v is maxing out at 2100 MHz. This is only netting me about 13% better performance over my 1520MHz 980Ti Lightning.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any way to increase the voltage yet?


I believe the next driver update will fix that issue, no?


----------



## CallsignVega

Quote:


> Originally Posted by *DADDYDC650*
> 
> I believe the next driver update will fix that issue no?


I don't think drivers have anything to do with fan controllers. I'm thinking a BIOS update. Doing custom fan curves and manual settings for the fan in Precision doesn't do anything. The only way I can get rid of the super annoying revving of the fan is to set it at 2900 RPM (73%). Then I just listen to a steady roar of air in my open bench.

This FE cooler is junk.
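As an aside on why a custom curve alone may not stop the revving: a fan curve only maps temperature to a target speed, and if the controller applies the full jump every tick, the fan spikes. Slew-limiting the per-tick change is the usual fix. A minimal sketch in Python; the curve points and the `curve_target`/`fan_step` helpers are illustrative, not NVIDIA's actual firmware logic:

```python
def curve_target(temp_c, curve):
    """Piecewise-linear lookup over sorted (temp_c, fan_pct) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

def fan_step(temp_c, prev_pct, curve, max_delta=3):
    """One control tick: move toward the curve's target, but by at most
    max_delta percent per tick so the fan ramps instead of revving."""
    target = curve_target(temp_c, curve)
    return prev_pct + max(-max_delta, min(max_delta, target - prev_pct))

# Example: a temp spike to 85C no longer slams the fan to 100% at once.
curve = [(30, 20), (60, 40), (85, 100)]
speed = 40
for _ in range(3):
    speed = fan_step(85, speed, curve)
# speed has ramped 40 -> 43 -> 46 -> 49 instead of jumping straight to 100
```

A real controller would feed this from the GPU temperature sensor each tick; the point is only that the ramp rate, not the curve shape, is what stops the audible revving.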


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> I don't think drivers have anything to do with fan controllers. I'm thinking a BIOS update. Doing custom fan curves and manual settings for the fan in Precision doesn't do anything. The only way I can get rid of the super annoying revving of the fan is to set it at 2900 RPM (73%). Then I just listen to a steady roar of air in my open bench.
> 
> This FE cooler is junk.


I figured the 1080 cooler would kinda suck, just like the last reference cooler. That's just my opinion, though. Others thought it was great for whatever reason.

Check this out,

https://www.reddit.com/r/4ma1g2/psa_founders_edition_fan_revving_issue_will_be/


----------



## bfedorov11

Quote:


> Originally Posted by *CallsignVega*
> 
> I don't think drivers have anything to do with fan controllers. I'm thinking a BIOS update. Doing custom fan curves and manual settings for the fan in Precision doesn't do anything. The only way I can get rid of the super annoying revving of the fan is to set it at 2900 RPM (73%). Then I just listen to a steady roar of air in my open bench.
> 
> This FE cooler is junk.


Mine doesn't have the fan issue. Use the Afterburner beta.


----------



## Bogga

Quote:


> Originally Posted by *bfedorov11*
> 
> Mine doesn't have the fan issue. Use afterburner beta.


Custom fan curves don't fix the revving


----------



## CallsignVega

Looks like, according to the NVIDIA forums, the fan issue happens on both BIOSes, but not everyone is affected.

Oh, and Fast Sync is quite a disappointment. Nice stuttering even though FPS is way above the refresh rate.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Look's like the fan issue according to NVIDIA forums happens on both BIOS's. but everyone is not affected.
> 
> Oh, and FAST sync is quite a disappointment. Nice stuttering even though FPS is way above refresh rate.


I had no idea about Fast Sync. Do you think the stuttering is because of SLI?


----------



## CallsignVega

Quote:


> Originally Posted by *DADDYDC650*
> 
> I had no idea about Fast Sync. Do you think the stuttering is because of SLI?


Just using one card at the moment.


----------



## axiumone

Quote:


> Originally Posted by *CallsignVega*
> 
> Just using one card at the moment.


Oh dang, that doesn't sound good. Another feature that was advertised and undercooked by nvidia? Shocking. :\


----------



## hellasinc

I wish I knew how people get such high numbers when overclocking. I've got mine at +100 core voltage and +120 power and I can't go past +159 core and +399 on memory.


----------



## spencer785

Quote:


> Originally Posted by *hellasinc*
> 
> I wish I knew how people get such high number when trying to overclock . I got mine on +100cv and +120 power v and I cant go past +159 core and +399 on memory .


I'm with you, man. I'm running 1080 SLI and can only get +125 core and +500 memory stable with +100 core voltage and +120 power.


----------



## Zurv

Whatever settings I put in, the cards kinda want to sit at 2000-2100 MHz.

I'm pretty impressed. 4K in The Witcher 3 and The Division is working much better than I thought. Not "I have a 980 and 4K works great", because that is a lie.

Both those games are rock solid over 60 FPS with all the fancy on.


----------



## Petet1990

need help..just picked up a Zotac 1080 and the voltage is locked...any thoughts?


----------



## fat4l

Is the voltage locked/ limited to 1.25v even on custom pcb cards like Asus strix 1080 ?


----------



## spencer785

Quote:


> Originally Posted by *fat4l*
> 
> Is the voltage locked/ limited to 1.25v even on custom pcb cards like Asus strix 1080 ?


Yes, and all 1080s are hitting a wall at 2.1 GHz.


----------



## D749

What pisses me off the most about Fast Sync is that it doesn't work in SLI. It wouldn't have stopped me from picking up two GTX 1080 FEs, but it would have taken the sting out a bit.

NVIDIA confirmed on the official forum that they're working on the FE fan issue and that it should be fixed in the next driver release.


----------



## fat4l

Quote:


> Originally Posted by *spencer785*
> 
> Yes and all 1080's are hitting a wall at 2.1ghz


Ugh.

That's pretty bad... I guess we have to wait and see if this limit gets removed or not.

+rep


----------



## spencer785

Quote:


> Originally Posted by *Petet1990*
> 
> need help..just picked up a Zotac 1080 and the voltage is locked...any thoughts?


Download the latest MSI Afterburner beta and go to settings to unlock voltage.


----------



## CallsignVega

Quote:


> Originally Posted by *spencer785*
> 
> Yes and all 1080's are hitting a wall at 2.1ghz


My 2100 MHz FE is only 13% faster on average than my 1520 MHz 980Ti. If the 1080 tops out around 2100, even on custom cards, that's a horrid upgrade.

To add insult to injury, I went from one of the best-cooled/quietest 980Tis (the Lightning) to this 50-cent piece of crap cooler on the FE making me deaf.


----------



## TK421

https://xdevs.com/guide/pascal_oc/#voltsc

Seems like voltage is useless in this case, and on LN2 the 1080 performs worse than the best 980Ti on LN2.

Welp.


----------



## escalibur

Quote:


> Originally Posted by *spencer785*
> 
> Yes and all 1080's are hitting a wall at 2.1ghz


Some explanation regarding the limit...

*Voltage scaling and "1.25V limit"*
Quote:


> There were some rumors spreading wildly these days about a "1.25V limitation" on modified GTX 1080 cards, which deserves a few words of explanation.
> 
> The hardware itself is well capable of reaching that voltage and above on the GPU core, but the GP104 chip is now more sensitive to voltage than even the previous Maxwell generation. Part of that is due to the thinner process node; the rest is the challenge of removing heat from all those tightly packed 7.2B transistors quickly enough from 21% less surface area. Overclockers who did 2200+ MHz on GTX 980 Tis are well aware of everything required to reach those clocks, and the same principle applies to Pascal. So if you can keep the GPU well cooled and have good voltage delivery to it, you can indeed push higher voltages. Cards cooled by liquid nitrogen during the test work for this guide were able to run 1.35-1.4V, reaching speeds over 2500 MHz.
> 
> The fact that GTX 1080s can reach 2.1GHz on air cooling without any modifications confuses a lot of people into thinking these chips can overclock well past 3GHz on liquid nitrogen. But it's still silicon, with a similar architecture, so reality is a bit sour. Yes, it delivers good performance without extreme cooling, but that hides the fact that an LN2-cooled 980Ti is still much faster than an overclocked GTX 1080, due to more shader cores and better clock-for-clock performance.
> 
> This also answers the question of whether overvolting helps OC on air or water cooling. It does not, because of thermals, which only get worse. Higher temperature degrades stability and performance: the GPU literally overheats and cannot run at high frequency anymore, even though the temperature is below the specified maximum. So just as with the 980/980Ti/Titan X, over-volting is not recommended, as it gains no performance improvement.


https://xdevs.com/guide/pascal_oc/#voltsc


----------



## Silent Scone

My FE card turned up yesterday. Time to get cracking. Plugged it into my 6700K ITX system for now.


----------



## Benjiw

Quote:


> Originally Posted by *TK421*
> 
> https://xdevs.com/guide/pascal_oc/#voltsc
> 
> Seems like voltage = useless in this case, and with LN2 the 1080 performs worse than the best 980Ti on LN2.
> 
> Welp.


Can we all stop the naysaying? It's still going through driver revisions etc.; the 980Ti is very mature and has had its drivers finely tuned.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Benjiw*
> 
> Can we all stop ney saying, it's still going through driver revisions etc, the 980TI is very much mature and had it's drivers finely tuned.


Those are zombie cards that don't follow any hardware limitations, FYI, so it's good to keep an eye on those statements.

We are not going to LN2, or most of us aren't; not me at least! But it's an indication of what to expect from all the GPU revisions coming.

Not looking good. I got myself a Founders and called it a day; there's no point waiting for better VRMs and more power plugs, it just doesn't matter.


----------



## mouacyk

Quote:


> Originally Posted by *Benjiw*
> 
> Can we all stop ney saying, it's still going through driver revisions etc, the 980TI is very much mature and had it's drivers finely tuned.


In case you haven't been reading: the 1080 is just a Maxwell shrink with faster memory and added VR features, so there isn't anything new architecturally that drivers can improve upon for non-VR gaming. You can't just write drivers that say "hey, you're 16nm now, give me more perf".


----------



## zGunBLADEz

Quote:


> Originally Posted by *CallsignVega*
> 
> My 2100 MHz FE is only 13% faster on average than my 1520MHz 980Ti. If the 1080 tops out around 2100, even on custom cards, that's a horrid upgrade.
> 
> To add insult to injury I went from one of the best cooled/quietest 980Ti's (Lightning) to this .50 cent piece of crap cooler on the FE making me deaf.


That's about right, lol.

980Ti vs 1080, both overclocked to roughly 1500 and 2100 MHz: the 1080 is 13% faster.
Look at this


----------



## bfedorov11

The 1080 replaced the 980, not the 980Ti.


----------



## JonnyBigBoss

I got my EVGA GTX 1080 FE! I love it so far. I had to turn down the target GPU temp to get it down to the temps I look for, though.

Anyone know when Ansel is expected to release? That was a major selling point for me.


----------



## PasK1234Xw

Quote:


> Originally Posted by *DADDYDC650*
> 
> You must be hangry... just wanted to post for comparison. Other people might be interested.


Telling me I have a nice score and then posting yours... your reason wasn't to help. Anyway, no, not angry. You're at 4.9 GHz on the CPU.
I upgraded from 980s, so this is a huge boost for me.
Also, I'm using an old-style single floppy bridge at the moment, so that could be a bottleneck. Drivers are also much more mature for Maxwell than for Pascal at the moment.

Edit:

Here is someone else's 45k score with the same specs, 5960X @ 4.7 GHz:

http://www.3dmark.com/fs/8677840


----------



## zetoor85

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I got my EVGA GTX 1080 FE! I love it so far. I had to turn down the target GPU temp in order to get it down to the temps that I look for, though.
> 
> Anyone know when Ansel is expected to release? That was a major selling point for me.


May I ask why it was a selling point for you? As I read it on the NVIDIA homepage, Ansel has support way down into NVIDIA's roots, all the way to the 600 series:

http://www.geforce.com/hardware/technology/ansel/supported-gpus

No problem.

But to clearly answer your question: like everyone else, we don't know. Hopefully soon, though!


----------



## dVeLoPe

Can anyone confirm that the EVGA ACX 3.0 (6181) works with a waterblock? Or will only the Founders/blower-style card be coolable that way?


----------



## fat4l

I can't decide what to get, guys...
The 1080 seems to be a bit limited and provides about 10% more performance than a 980Ti when both are OCed.
Also, I'm on water, and it's a very high-end loop; I'm not sure the 1080 can use it to its full potential.

The price is about £800 for EK blocks and an Asus Strix 1080 (custom PCB)...

When I look at eBay, I can get an EK-watercooled 980Ti Strix/MSI Gaming (both custom PCB) for half the cost, ~£400.

Having said that, I'm not really sure spending 100% more money to get 10% more performance is adequate.

Also, the 980Ti utilizes water better than the 1080, right? What sort of clocks can we expect for both cards when run on water at <40°C temps?

Thanks


----------



## BackwoodsNC

I don't see how drivers will help performance, since Pascal had no IPC improvements. Maybe in certain AAA games, but in older games I bet you got what you got.


----------



## Vellinious

Quote:


> Originally Posted by *fat4l*
> 
> I cant decide what to get guys .......
> 1080 seems to be a bit limited and provides about 10% more performance than 980Ti when both oced.
> Also I'm on water and ....its a very high end one and I'm not sure if 1080 can utilize it for it's full potential.
> 
> Also the price is about 800£ for EK blocks and Asus Strix 1080(custom pcb)...
> 
> When I look at ebay, I can get EK watercooled 980Ti strix/msi gaming(both custom pcb) for 1/2 of the cost.... ~400£.
> 
> Having said that I'm not rly sure if spending +100% of money to get 10% more performance is adequate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, 980Ti utilizes water better than 1080 right ? What sort of clocks can we expect for both cards when run on water <40C temps ?
> 
> THanks


I've been debating the same thing myself....good question.


----------



## romanlegion13th

Quote:


> Originally Posted by *CallsignVega*
> 
> My 2100 MHz FE is only 13% faster on average than my 1520MHz 980Ti. If the 1080 tops out around 2100, even on custom cards, that's a horrid upgrade.
> 
> To add insult to injury I went from one of the best cooled/quietest 980Ti's (Lightning) to this .50 cent piece of crap cooler on the FE making me deaf.


Not a happy man... I'm waiting for the next Titan or Ti.

So only 13%, not much really, and that loud fan is no good; I can't play games with it without a headset.

Glad I did not upgrade.


----------



## caliking420

Quote:


> Originally Posted by *romanlegion13th*
> 
> Not a happy man.. Im waiting for the next Titan or Ti,
> 
> so only 13% not much really and that loud fan is no good can't play games with it out a headset..
> 
> Glad i did not upgrade


I'm with you on this one.

Glad I decided against upgrading, at least until the Ti variant comes out.


----------



## Menno

If I add 2 single SLI bridges, it won't give me the SLI HB performance warning anymore. Dunno if it does anything, though. Coming from a Fury X, it's loud to run 1.9-2.0 GHz; it needs at least 70% fan to keep temps down. At 2.05 GHz it crashes for me, and that's with 120% power, a 92°C temp target, and +175 on the core. Anything lower will hit 90°C+. Everything stock with the default fan profile, it drops all the way down to the base clock. That's with a slot between the cards. I am running 5K though, which puts a lot more load on them than 1440p, I've noticed.

Time to put them on water when EK releases their SLI bridge.


----------



## axiumone

Quote:


> Originally Posted by *Menno*
> 
> If I add 2 single sli bridges it won't give me the sli heb performance warning anymore. Dunno if it does anything though. Comming from a Fury X, its loud though to run 1.9-2.0ghz. Needs at least 70% fan to keep temps down. At 2.05ghz it crashes for me. Thats only with 120% and temp target 92c and +175 on the core. Everything lower will hit 90c+. Everything stock and default fan profile it goes all the way down to the base clock. Thats with a slot between the cards. I am running 5k though. That puts up a lot more load then 1440p I noticed.
> 
> Time to put them on water when EK releases their sli bridge.


There have been a few reports from users that two old ribbon SLI connectors may function at the new high-bandwidth rates. I think someone with access to frame-rating tools may need to explore that further.


----------



## AllGamer

Quote:


> Originally Posted by *bfedorov11*
> 
> The 1080 replaced the 980 not the 980ti.


Exactly.

The GTX 1080 Ti will be a direct replacement for the GTX 980 Ti.

It's typical of nVidia; it's the same every year when they release a new model.

It always takes them about *2 years* to polish all the bugs out of the *base* model, and then they call it the *Ti* model,
and in between they release some other cards, like the Titans, to bridge the gap.

Then the next year, the *Ti* model becomes the new "*thing*" that is just a little faster than the *Titan*, and much faster than the *base* model.

That pattern has repeated endlessly since the GTX 200 series.

The only time nVidia doesn't release a *Ti* model is for the dual-GPU cards like the GTX 590, GTX 690, GTX 790, and all the previous dual models.

We probably won't see a GTX 990, if it was ever on any road map; the Titans were introduced alongside the 900 series, which took that crown away, and there was no further need for a dual-GPU card in the 900 series.


----------



## GRABibus

And do they plan a GTX 1080 TX to replace the GTX TITAN X?


----------



## bfedorov11

IMO it is pretty remarkable they got it to beat an overclocked 980Ti/TX at stock speeds. It's a double-edged sword for them: beat all current cards, high price tag, complaints about the price. Come in under the 980Ti at $500, complaints the new card is crap.


----------



## AllGamer

Quote:


> Originally Posted by *GRABibus*
> 
> And do they plan the GTX 1080 TX to replace the GTX TITAN X ?


The Titan series came out of nowhere a few years back; it was like an illegitimate child from a love affair.

The 900 series and Titan series are like Maxwell stepbrothers.

So for a proper comparison, we'll need some sort of Pascal stepbrother of the 1080. Maybe call it Goliath or something.


----------



## GiveMeHope

They'll release a Titan 10, given the focus on the number 10 in this generation.


----------



## CallsignVega

Quote:


> Originally Posted by *axiumone*
> 
> There have been a few reports from users that using two old ribbon sli connectors may function at the new high bandwidth rates. I think someone with access to frame rating tools may need to explore that further.


That was one thing I was going to test if the new bridges aren't out by then: just connect two floppy cables and see how I get on. Of course, neither NVIDIA nor anyone else had the proper bridges ready for launch.


----------



## Azazil1190

I'll join the club.

My first quick run on F.S.:

0 mV
+220 core
+400 memory
120% TDP
100% fan
Max temp 53°C with air conditioning
Ambient room temp 24°C

I'm fine, I think, but we need a BIOS with more TDP.

http://www.3dmark.com/3dm/12324707


----------



## Bogga

Has anyone tried Surround and that new technique they showed?


----------



## axiumone

Quote:


> Originally Posted by *Bogga*
> 
> Have anyone tried surround and that new technique that they showed?


It has to be implemented by the game devs. So it's going to be a while, if the devs have any interest at all, that is.


----------



## Benny89

I managed to break 1555 on air today with my 980Ti, so I am not gonna upgrade to a 1080 for sure, considering its price.

But I really hope the 1080Ti will be a total killer. For 1440p my 980Ti is more than enough, but if the 1080Ti can offer at least 50% more power, I am going for it.

For the 1080... nah, not really. I would upgrade from a 970, though.


----------



## Outcasst

Anyone having an issue where horizontal lines flash for a second when the card switches between power states? It happens whenever the clocks are changed. Looks like it's related to the memory clock.

I can trigger it on purpose by changing the memory clock in MSI Afterburner. It doesn't seem to be related to stability, as it happens when the memory isn't overclocked at all.

Here's a video:


----------



## Zurv

Ultra in 2-way SLI...


http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/2+gpu

I'm sure I'll be bumped by the end of the weekend.


----------



## looniam

^GG









enjoy a +REP


----------



## gree

Quote:


> Originally Posted by *Zurv*
> 
> ultra in 2 way sli...
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/2+gpu
> 
> i'm sure i'll be bumped by the end of the weekend.


Wow, your 980Ti SLI is kicking all those 1080 SLIs. Is it because the 1080 drivers are still new?
Also, I was gonna upgrade my 980Ti HOF to a 1080.

Edit: just noticed that 6950X. Damn, that guy spent a lot on parts.


----------



## Maintenance Bot

Quote:


> Originally Posted by *Outcasst*
> 
> Anyone having an issue where you get horizontal lines flash for a second when the card switches between power states? It happens whenever the clocks are changed. Looks like its related to the memory clock.
> 
> I can activate it on purpose by changing the memory clock in MSI Afterburner. Doesn't seem to be related to stability as it happens when the memory isn't overclocked at all.
> 
> Here's a video:


Yes, this happens to me intermittently; I thought I was the only one.

Your video is exactly what I have seen, maybe a dozen times or so in the past week.


----------



## traxtech

Quote:


> Originally Posted by *Outcasst*
> 
> Anyone having an issue where you get horizontal lines flash for a second when the card switches between power states? It happens whenever the clocks are changed. Looks like its related to the memory clock.
> 
> I can activate it on purpose by changing the memory clock in MSI Afterburner. Doesn't seem to be related to stability as it happens when the memory isn't overclocked at all.
> 
> Here's a video:


Does it do it as it goes through the power states, i.e. from load to idle and at each P-state? I had the same issue, and the only thing that fixed it was uninstalling EVGA Precision X (in your case, MSI Afterburner), restarting the PC, and reinstalling. Fixed it for me; hope it does for you.

Edit: forgot to mention it started when I was messing with the automatic overclock JUNK they now have. Don't touch it, lol.


----------



## Outcasst

Quote:


> Originally Posted by *traxtech*
> 
> Does it do it as it goes through the power states, as in load to idle and each P step?? I had the same issue and the only thing that fixed it was uninstall EVGA Precision X(In your case, MSI afterburner) and reset the PC and reinstall. Fixed it for me, hope it does for you.
> 
> Edit : Forgot to mention it started when messing with the automatic overclock JUNK they now have, don't touch it lol


Yeah, it basically does it whenever the memory clock changes. After a reboot it disappears for a while, but then it comes back. It happens in The Witcher 3 when switching from in-game to the map.

BIOS modding tools can't come soon enough. I've never liked having overclocking utilities installed.


----------



## -terabyte-

Quote:


> Originally Posted by *axiumone*
> 
> It has to be implemented by the game devs. So, its going to be a while, if the devs have any interest at all that is.


Wasn't that only for Ansel? I thought the multi-monitor angle fix is just something you set up in the drivers (by manually specifying the angle your monitors are at).


----------



## GnarlyCharlie

Quote:


> Originally Posted by *gree*
> 
> Wow your 980ti sli kicking all those 1080sli s. Is because the 1080 drivers are still new?
> Also I was gonna upgrade my 980ti HOF to a 1080


He's running 1080s.

Those things are really showing their legs in Ultra.


----------



## Visceral

Regarding the flickering: Yes, mine does this on occasion as well.


----------



## axiumone

Quote:


> Originally Posted by *-terabyte-*
> 
> Wasn't that only for ANSEL ? I thought that the multi-monitor setup angle fix is just something you setup in the drivers (by manually specifying the angle you have your monitors at).


Both Ansel and SMP have to be explicitly programmed in by the devs.

Here, check this out.


----------



## PasK1234Xw

Not sure if it's been mentioned, but the author of GPU-Z confirmed that an ASIC reading is not possible on the 1080.

https://www.reddit.com/r/4lmpg5/gtx_1080_holy_grail_100_asic/


----------



## carlhil2

Quote:


> Originally Posted by *Zurv*
> 
> ultra in 2 way sli...
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/2+gpu
> 
> i'm sure i'll be bumped by the end of the weekend.


The 1080 has taken over the HOF, and the custom cards aren't even out yet... reference cards competing with BIOS-flashed cards under exotic cooling. Can't wait for the big cards to hit...


----------



## CallsignVega

Quote:


> Originally Posted by *Outcasst*
> 
> Anyone having an issue where you get horizontal lines flash for a second when the card switches between power states? It happens whenever the clocks are changed. Looks like its related to the memory clock.
> 
> I can activate it on purpose by changing the memory clock in MSI Afterburner. Doesn't seem to be related to stability as it happens when the memory isn't overclocked at all.
> 
> Here's a video:


Both my 1080 and 980Ti's do it.


----------



## traxtech

Okay, so, as most of you know, the memory on these cards has diminishing returns.

I opened a windowed version of Unigine Valley and checked the FPS at a fixed point (paused so there was no FPS fluctuation): 106 FPS with the memory at stock. Raising it to +600 made it drop to 102 FPS, so I slowly decreased it by 10 each time, and then by 1 once I was near the problem area. I ended up at +457, which gave me 113 FPS; the SECOND I raised it to +458 it dropped to 102 again.

I recommend you all do this. You could be getting less FPS than you should just by being +1 over on memory.
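The coarse-then-fine search described above is easy to script. Here `measure_fps` is a hypothetical callback (something that applies the offset and reads an FPS counter at a paused, repeatable scene); the +600 start and 10/1 step sizes mirror the post, but every number is illustrative:

```python
def sweep_mem_offset(measure_fps, hi=600, coarse_step=10):
    """Find the highest memory offset below the error-correction cliff.

    measure_fps(offset) must apply the given memory offset and return
    FPS at a fixed, repeatable scene (e.g. paused Unigine Valley).
    """
    baseline = measure_fps(0)
    off = hi
    # Coarse pass: walk down until FPS clearly beats stock again,
    # i.e. we're back below the cliff where error correction kicks in.
    while off > 0 and measure_fps(off) <= baseline:
        off -= coarse_step
    # Fine pass: creep back up 1 at a time until FPS drops.
    while measure_fps(off + 1) > measure_fps(off):
        off += 1
    return off
```

With a cliff just above +457 (stock 106 FPS, post-cliff 102 FPS), this lands on +457, matching the behaviour described in the post.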


----------



## AllGamer

Quote:


> Originally Posted by *traxtech*
> 
> Okay so, as most of you know the memory on these cards have diminishing returns.
> 
> I opened a window version of Unigine Valley and checked the FPS at a certain point(paused so there was no FPS fluctuation) and it was 106 FPS with my memory on stock. Raising it up to +600 made it drop to 102 FPS so i slowly decreased it by -10 each time and then -1 when i was between the areas of the issue. Ended up with +457 with gave me 113 FPS, the SECOND i raised it to 458 it dropped to 102 again.
> 
> Recommend you all do it.
> 
> 
> 
> 
> 
> 
> 
> Could be getting less FPS than you're meant to just by being +1 memory over.


I was actually expecting this to happen.

During the NVIDIA 1080 live stream, they showed how tight the timing of the processing signal runs, at the "atomic level" (they claim), so any deviation from that already-optimized signaling will actually slow it down rather than speed it up.

Your tests just seem to have confirmed what they said.


----------



## Radox-0

Quote:


> Originally Posted by *Zurv*
> 
> ultra in 2 way sli...
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/2+gpu
> 
> i'm sure i'll be bumped by the end of the weekend.


Woop, @ 27 myself. Did not even realise, as I was not going for absolute clocks, so I did not check. Nice to see, though.


----------



## KickAssCop

Quote:


> Originally Posted by *CallsignVega*
> 
> My 2100 MHz FE is only 13% faster on average than my 1520MHz 980Ti. If the 1080 tops out around 2100, even on custom cards, that's a horrid upgrade.
> 
> To add insult to injury I went from one of the best cooled/quietest 980Ti's (Lightning) to this .50 cent piece of crap cooler on the FE making me deaf.


Thanks for the informative post


----------



## Benny89

Quote:


> Originally Posted by *Outcasst*
> 
> Anyone having an issue where you get horizontal lines flash for a second when the card switches between power states? It happens whenever the clocks are changed. Looks like its related to the memory clock.
> 
> I can activate it on purpose by changing the memory clock in MSI Afterburner. Doesn't seem to be related to stability as it happens when the memory isn't overclocked at all.
> 
> Here's a video:


I have it on my 980Ti each time I apply an overclock in Afterburner before playing a game. I think it is pretty normal; I've had no issues at all with the card for months.


----------



## trickeh2k

I get the same flickering at times when I apply the overclock on my 780 Classy. I never experience it otherwise, only when applying or disabling the OC.


----------



## Outcasst

I'm getting them constantly in The Witcher 3 when switching from the game to the map. I'm going to try high-performance mode in the control panel so the clocks don't change while in-game.


----------



## Spiriva

I've never gotten any flashes when I apply the memory overclock, neither on the Titan X nor now on the 1080.


----------



## romanlegion13th

Quote:


> Originally Posted by *AllGamer*
> 
> exactly
> 
> The GTX-1080Ti will be a direct replacement for the GTX-980Ti
> 
> it's typical of nVidia, it's the same every year when they release a new model
> 
> it always takes them about *2 years* to polish all the bugs from the *base* model, then they call it the *Ti* model.
> and in between they release some other cards like the Titans and such to breach the gap.
> 
> then next year, the *Ti* model becomes the new "*thing*" that is just a little faster than the *Titan*, and much faster than the *base* model.
> 
> That example has been repeated endless time since the GTX 200 series
> 
> The only time nVidia doesn't release a *Ti* model are for the dual GPU cards like the GTX 590 and GTX 690, GTX 790 and all the previous models.
> 
> We probably might not see the GTX 990 if it was ever on any road maps, since this generation of 900s the Titans were introduced, which kind of took that crown away, and there was no further need for the Dual GPU for the 900 series


The 980Ti was not faster than the Titan X; it was a cut-down card. The Titan X was still about 5% faster and has 12GB of VRAM.
The Titan X OCs well.


----------



## Menno

http://www.3dmark.com/fs/8699046

Ya! I am also in the 10K score club now.


----------



## Noshuru

Quote:


> Originally Posted by *romanlegion13th*
> 
> The 980ti was not faster then the Titan it was a cut down card the Titan X was still about %5 faster and 12gb vram
> Titan X OC well


That guy has no idea what he's talking about, lol.


----------



## bfedorov11

Quote:


> Originally Posted by *Noshuru*
> 
> That guy has no idea what he's talking about, lol.


Clock for clock, the TX will win every time. The reason people think the 980Ti is faster is that custom cards come with higher clock speeds out of the box. Most reviews bench the custom cards against the stock TX, which has a base clock of 1000 MHz with boost at 1075 MHz. The TX has 3072 cores; the 980Ti is the same chip with 2816 cores.
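For what it's worth, the core-count argument can be turned into a back-of-the-envelope number: relative shader throughput scales with cores × clock. This sketch ignores memory bandwidth, IPC differences, and boost behaviour, so treat it as a loose upper bound rather than a benchmark:

```python
def rel_throughput(cores_a, mhz_a, cores_b, mhz_b):
    """Naive relative FP32 throughput: shader cores x clock."""
    return (cores_a * mhz_a) / (cores_b * mhz_b)

# Titan X (3072 cores) vs 980Ti (2816 cores), clock for clock:
ratio = rel_throughput(3072, 1400, 2816, 1400)
print(f"TX ahead by {100 * (ratio - 1):.1f}%")  # ~9.1%
```

That ~9% gap is why a 980Ti only "wins" when its factory overclock outruns the stock TX clocks.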


----------



## richie_2010

Could someone advise me whether the GTX 1080 shares the same screw locations as the GTX 980?

Regards


----------



## Noshuru

Quote:


> Originally Posted by *bfedorov11*
> 
> Clock vs clock the TX will win every time. The reason people think the 980ti is faster is because custom cards come with higher clock speeds out of the box. Most reviews bench the custom cards against the stock TX which has a base clock of 1000mhz with boost at 1075mhz. TX has 3072 cores.. 980ti is the same chip with 2816 cores..


What? When I reply to someone and say "that guy has no idea what he's talking about", I'm obviously talking to them, not about them. Here I was talking about the guy quoted by the person I replied to.


----------



## yousef17

I was able to get my Founders Editions to a 100 MHz core overclock and a 400 MHz memory overclock with the regular fans. I just set the power limit to 120% and the fan speed to 90%. It's a little loud, but on air they stay at 78°C. The performance gain is very good.


----------



## Noshuru

Quote:


> Originally Posted by *yousef17*
> 
> I was able to get my founders editions on 100mhz core clock over clock and 400mhz memory overlock with the regular fans. I just set the power limit to 120 and the fan speed to 90%. its a little loud but on air they stay at 78 degrees C. The performance gain is very good.


This may be of interest to you:

https://www.reddit.com/r/4mm5zt/for_those_overclocking_1080_memory_please_read/


----------



## yousef17

Thank you, I read that. I've got them at a 200MHz core overclock and 400 on memory now. I'm +9fps now. Not bad for air cooling.


----------



## CallsignVega

Quote:


> Originally Posted by *traxtech*
> 
> Okay so, as most of you know the memory on these cards have diminishing returns.
> 
> I opened a window version of Unigine Valley and checked the FPS at a certain point(paused so there was no FPS fluctuation) and it was 106 FPS with my memory on stock. Raising it up to +600 made it drop to 102 FPS so i slowly decreased it by -10 each time and then -1 when i was between the areas of the issue. Ended up with +457 with gave me 113 FPS, the SECOND i raised it to 458 it dropped to 102 again.
> 
> Recommend you all do it.
> 
> 
> 
> 
> 
> 
> 
> Could be getting less FPS than you're meant to just by being +1 memory over.


I've found the same thing. There are multiple "valleys" (pun intended) where the memory doesn't kick into error correction or loosen timings. The highest memory clock before you see artifacts is definitely not the way to go.
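traxtech's tune-down procedure quoted above is basically a coarse-then-fine search for the last offset before the error-correction cliff. A minimal sketch in Python, assuming you have some way to read back a repeatable FPS number at a given memory offset (the `measure_fps` callback and the `fake_fps` stand-in below are hypothetical, not a real benchmarking API):

```python
def find_best_offset(measure_fps, start=600, coarse=10, fine=1):
    """Search for the highest memory offset that still beats stock FPS.

    Coarse phase: step down from a deliberately-too-high offset until
    FPS rises above the stock baseline (i.e. we've dropped out of the
    error-correction region). Fine phase: creep back up one step at a
    time while FPS stays above the baseline.
    """
    baseline = measure_fps(0)  # stock FPS for comparison
    offset = start
    while offset > 0 and measure_fps(offset) <= baseline:
        offset -= coarse
    while measure_fps(offset + fine) > baseline:
        offset += fine
    return offset

# Hypothetical stand-in for a real benchmark run: FPS improves with
# offset up to a cliff at +457, then error correction costs ~10%.
def fake_fps(offset):
    return 106 + offset * 0.015 if offset <= 457 else 102

print(find_best_offset(fake_fps))  # → 457
```

With the numbers traxtech reported (+457 good, +458 bad), a -10 coarse step followed by a +1 fine step finds the edge in a few dozen benchmark runs instead of hundreds.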


----------



## fewness

What's with that Fury X (Lethenar) on the single card FSU and FSE list? Is that a real one on fire under liquid nitrogen, or actually CrossFire recognized as 1 card by mistake?

There are 24 980Ti ahead of me on the FSE single card ranking, most of them from GALAX GOC 2015 Qualifier, then there is this Fury X .... so annoying


----------



## spencer785

Hit 45th place in the hall of fame ultra 2x with my specs in my sig.

http://www.3dmark.com/fs/8681142


----------



## skline00

Quote:


> Originally Posted by *Zurv*
> 
> ultra in 2 way sli...
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/2+gpu
> 
> i'm sure i'll be bumped by the end of the weekend.


Goodness, Zurv that truly is impressive for brand new cards and just released drivers!


----------



## bfedorov11

I applied some kryonaut on the FE and temps dropped 5-8 degrees at idle. Probably difficult to tell if it makes a difference under load with boost 3.0. It also seems more logical to leave the back plate off. It is covered with plastic and has two open spots that use thermal pads to make contact with the card. It didn't even look like the pads were making contact so I just left it off.


----------



## zGunBLADEz

Quote:


> Originally Posted by *bfedorov11*
> 
> I applied some kryonaut on the FE and temps dropped 5-8 degrees at idle. Probably difficult to tell if it makes a difference under load with boost 3.0. It also seems more logical to leave the back plate off. It is covered with plastic and has two open spots that use thermal pads to make contact with the card. It didn't even look like the pads were making contact so I just left it off.


That's what I was saying earlier - the backplate has no use and probably does more harm than good, as it isn't transferring heat to a bigger surface area (the backplate metal).. so it would just trap heat..


----------



## bfedorov11

Trying for 5900 FS ultra.. almost there. http://www.3dmark.com/fs/8705079

Think I'm gonna try the PT mod tomorrow.


----------



## Silent Scone

You youngsters. When will you get bored of testing your shiny new cards on synthetic benchmarks?









Quote:


> Originally Posted by *skline00*
> 
> Goodness, Zurv that truly is impressive for brand new cards and just released drivers!


Actually it's more likely that the vendors tend to optimise for FM tests in the release drivers. I broke the 3-way world record on air-cooled reference GTX 980s at launch.

I'm just thankful for how this card performs in games this time round!


----------



## romanlegion13th

Quote:


> Originally Posted by *Noshuru*
> 
> What? When I reply to someone and I say 'that guy has no idea what he's talking about' I'm obviously talking to them, not about them. I was talking about the guy that the person I replied to quoted.


Clearly you do not know what you're talking about lol.. The T-X has 3072 cores.. the 980ti is the same chip with 2816 cores.. T-X 12GB of VRAM, 980ti 6GB.
My T-X is running at 1480MHz so I would like to bench against an OC 1080 to see the difference.. I will get the next Titan also, as I like to have the fastest card.. It took 14 months for a faster card to come out and people are saying it's only 10% faster for $300 less.


----------



## Silent Scone

Faster is faster...is faster







.

It's not an ideal upgrade path for a Titan user, that is to come - obviously.


----------



## romanlegion13th

Quote:


> Originally Posted by *Silent Scone*
> 
> Faster is faster...is faster
> 
> 
> 
> 
> 
> 
> 
> .
> 
> It's not an ideal upgrade path for a Titan user, that is to come - obviously.


Yeah, it's not an upgrade for a T-X, I am waiting for that.. but it took 14 months for 10-13% faster.

I bet the next Titan is a beast


----------



## webmi

greetz from germany


----------



## MrDerrikk

Preordered my EVGA 1080 FTW yesterday, hopefully I'll get it before the month is out. Looking forward to having better drivers by then though!


----------



## buellersdayoff

WOW, can't believe people are paying ridiculous prices for these cards, especially stock coolers! What's the go?


----------



## fat4l

I don't get it either.
A 1080 OC vs a 980ti OC is about a 10% difference.
Paying double the price for 10% more performance is meh.


----------



## Benny89

Quote:


> Originally Posted by *fat4l*
> 
> I don't get it as well.
> 1080 oc vs 980ti oc is about 10% difference.
> Paying double the price for 10% performance is meh


If people have too much money, they can spend it on what they want. If I had enough money not to care, hell, I would upgrade even for this 10%. Because why not, if you have the money.

Since I do not have money to waste, I am waiting for the 1080Ti. I hope for AT LEAST 30% more performance than my 980Ti OCed, and 50% when I OC the hell out of the 1080Ti.

But the only thing I do not get personally: why do people buy reference 1080s?? That is the only enigma for me.


----------



## keesgelder

Quote:


> Originally Posted by *Benny89*
> 
> But the only thing I do not get personally: why do people buy reference 1080s?? That is the only enigma for me.


I agree that in general, people should wait for the aftermarket cards (especially when you don't plan to put a waterblock on your card).

Personally, I just had to have these cards for my In Win Tou build (see sig for a slightly outdated picture). These cards fit into the theme so perfectly... My PC is a prominent piece in the living room, and while many would disagree, I care about the looks more than I care about the advantages of the aftermarket models.


----------



## PinoJet

Which prices are you comparing? Does the 1080 cost more than 1k$ or do you know where to buy a new cheap 980ti?
I would love to get 1 or 2 980ti for under 500$ ea


----------



## Benjiw

Quote:


> Originally Posted by *fat4l*
> 
> I don't get it as well.
> 1080 oc vs 980ti oc is about 10% difference.
> Paying double the price for 10% performance is meh


Gotta go fast!!!!

Honestly though, don't forget that people are building new rigs atm, the drivers still have time to mature and what not.


----------



## techguymaxc

1080 OC is NOT only 10% faster than 980 Ti OC. Average performance gain from 980 Ti to 1080 is 22.6% across 9 games @ 1080p/1440/4k. If your basis for comparison is just 3dmark you don't have the whole picture.

Data compiled from overclockersclub recent overclocked 1070 review: http://www.overclockersclub.com/reviews/nvidia_geforcegtx_1070_overclocking/

https://docs.google.com/spreadsheets/d/10mwNtNsQXNJCjHQPtE5K636f-rXjZTRrmHszOUWyI6Y/edit?usp=sharing


----------



## Silent Scone

Love reading the annual 'previous gen buyers try to justify _not_ purchasing next gen to themselves' spew.

Because that's what it is, spew. Every single year lol. Nobody is forcing you to upgrade, but the 1080 is undeniably faster. In all scenarios in fact. As is to be expected.


----------



## Bear907

I'll be joining the club soon-ish myself. Hopefully in the next couple of months. Finally upgrading from a 780. This should be quite the noticeable upgrade.


----------



## Noshuru

Quote:


> Originally Posted by *romanlegion13th*
> 
> Clearly you do not know what your talking about lol.. T-X has 3072 cores.. 980ti is the same chip with 2816 cores.. T-X 12gb of Vram 980ti 6gb
> My T-X is running at 1480mhz so i would like to bench with a OC 1080 see the difference.. I will get the Next Titan also as i like to have the fastest card.. Took 14 months for a faster card to come out and people are saying its only 10% faster for $300 less


I guess you don't know how quotes work either. I was saying that the guy you quoted has no idea what he's talking about.


----------



## MCFC

So how long before we get some benchmarks with all the third party 1080 cards (asus, evga, galax, zotac etc.)?


----------



## fripon

Quote:


> So how long before we get some benchmarks with all the third party 1080 cards (asus, evga, galax, zotac etc.)?


http://www.pcgameshardware.de/Asus-GTX-1080-Strix8G-Grafikkarte-264461/Videos/Review-Test-1197288/

http://www.pcgameshardware.de/Nvidia-Geforce-GTX-10808G-Grafikkarte-262111/Videos/Review-Test-1197576/

http://www.computerbase.de/2016-05/asus-geforce-gtx-1080-strix-oc-test/

http://www.computerbase.de/2016-06/inno3d-ichill-geforce-gtx-1080-x3-test/

And yes, all these cards hit the 2.1GHz wall.


----------



## MCFC

Quote:


> Originally Posted by *fripon*
> 
> http://www.pcgameshardware.de/Asus-GTX-1080-Strix8G-Grafikkarte-264461/Videos/Review-Test-1197288/
> 
> http://www.pcgameshardware.de/Nvidia-Geforce-GTX-10808G-Grafikkarte-262111/Videos/Review-Test-1197576/
> 
> http://www.computerbase.de/2016-05/asus-geforce-gtx-1080-strix-oc-test/
> 
> http://www.computerbase.de/2016-06/inno3d-ichill-geforce-gtx-1080-x3-test/
> 
> And yes all these Cards hit the 2,1 GHZ Wall.


thanks good first post


----------



## Zurv

Quote:


> Originally Posted by *Silent Scone*
> 
> Faster is faster...is faster
> 
> 
> 
> 
> 
> 
> 
> .
> 
> It's not an ideal upgrade path for a Titan user, that is to come - obviously.


feh.. voodoo -> voodoo 2, voodoo 3 -> riva..something (i don't recall some of these middle ones) 680, 9800 gx2 -> 9780 -> 780 -> titan -> 980 -> titan x -> 1080 -> new titan







(the 980 was a waste. Those cards were totally gimped by the limited amount of VRAM)

here in overclock.net forums we play for keeps! (i beat the hell out of those titan Xs too.. so i was happy to clear them out. They were launch cards)
faster is faster









now if they would let me god damn unlock 3/4 way SLI!


----------



## fat4l

Quote:


> Originally Posted by *techguymaxc*
> 
> 1080 OC is NOT only 10% faster than 980 Ti OC. Average performance gain from 980 Ti to 1080 is 22.6% across 9 games @ 1080p/1440/4k. If your basis for comparison is just 3dmark you don't have the whole picture.
> 
> Data compiled from overclockersclub recent overclocked 1070 review: http://www.overclockersclub.com/reviews/nvidia_geforcegtx_1070_overclocking/
> 
> https://docs.google.com/spreadsheets/d/10mwNtNsQXNJCjHQPtE5K636f-rXjZTRrmHszOUWyI6Y/edit?usp=sharing


Nice data


----------



## Fahrenheit85

I'm tapped out at +200 on the core and +400 on memory. 225 crashes the core and 425 gives less FPS. I had the fan speed @100% while testing. Waiting on my EK block so I can actually game on it while OCed (right now the noise is way too much for me).

Dumb question, but how is memory speed calculated? Heaven shows it at 5400, GPU-Z shows 1251, and the card is advertised as 10010 MHz. I'm very confused.


----------



## techguymaxc

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Dumb question but how is memory speed calculated? Heaven bench shows it at 5400, GPUz shows 1251, and the card is advertised as 10010 MHz. I'm very confused


Memory clock speeds can be reported in different ways by different tools. 1251 and 10010 are just different representations of the same thing: 10010 is in fact a perfect multiple of 1251 (8x, if the decimals weren't rounded off), since GDDR5X transfers eight bits per clock. Those two numbers don't reflect your overclock, whereas the 5400 number does.
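For what it's worth, the three numbers relate by fixed multipliers. A small sketch, assuming (as the numbers above suggest) that GPU-Z reports the actual command clock, Heaven/Afterburner the 4x "data clock" view, and the marketing figure the 8x effective transfer rate of GDDR5X; the exact representation each tool uses isn't formally documented:

```python
# GDDR5X clock views for a stock GTX 1080:
#   actual command clock : 1251.25 MHz (GPU-Z shows this, rounded to 1251)
#   "data clock" (x4)    : 5005 MHz    (what Heaven / Afterburner show)
#   effective rate (x8)  : 10010 MT/s  (the advertised "10 GHz")

def gddr5x_views(actual_mhz):
    return {
        "actual": actual_mhz,
        "data_clock": actual_mhz * 4,
        "effective": actual_mhz * 8,
    }

stock = gddr5x_views(1251.25)
print(stock["effective"])  # → 10010.0

# A +400 Afterburner offset appears to apply to the data-clock view,
# so Heaven reading ~5400 would correspond to:
oc_actual = (5005 + 400) / 4                 # 1351.25 MHz actual
print(gddr5x_views(oc_actual)["effective"])  # → 10810.0 MT/s effective
```

That also explains why Fahrenheit85's +400 offset shows up as roughly 5400 in Heaven while GPU-Z and the box quote 1251 and 10010.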


----------



## GunnzAkimbo

GDDR5XXX (Ladies)


----------



## CallsignVega

It's kinda sad my overclocked 1080s are only 13% faster than my overclocked 980 Tis. Kind of a wasted upgrade, but I wanted to try out the new SLI bridge.


----------



## Chargeit

So when are the aftermarket 1080's hitting?

Been thinking about it, and even gaining 10fps or so at my current res of 3440x1440 would put me above the 60 min fps mark in enough games to make it worth it. Also, I doubt the 1080 Tis will be out in less than 9-12 months.

Would also be kind of interesting to check out a lower TDP gpu since the last two main GPU's I've owned have been 250w.

Can anyone with an X34 who moved from a 980 Ti to a 1080 relay their experience in newer games? Was the move worth it? G-Sync does a good job of covering dips, but I still find myself wanting to be at 60+. Oh, and my 980 Ti is also a crap OCer that runs hot.


----------



## Baasha

Got my 4x 1080 Founder's Edition cards.









Now, only if we can actually use 4-Way SLI!?!?!









Any updates on ETA for the unlock? And what about the new HB SLI bridges?


----------



## Silent Scone

Quote:


> Originally Posted by *Baasha*
> 
> Got my 4x 1080 Founder's Edition cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, only if we can actually use 4-Way SLI!?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any updates on ETA for the unlock? And what about the new HB SLI bridges?


It's not available yet. Seriously, why would you even consider this anymore? Just stop...stop and think what you are doing, man!


----------



## Menno

Pushing above +190 on the core crashes my cards in SLI, even though the temps stay below 83c (100% fan during testing, just to be sure I'm not temp-limited). +175 seems to be the sweet spot for me: it results in 1950~2025 in most games and a fan profile that doesn't sound like my PC is taxiing for takeoff. 70-80 fps with Overwatch @5k is awesome. My front intake is only 1 fan though, getting 2 extra fans today for some more airflow to the cards. If 2.1GHz is really a wall, it isn't worth buying EK blocks and another radiator for me.


----------



## Silent Scone

Quote:


> Originally Posted by *Menno*
> 
> Pushing above +190 on the core crashes my cards in SLI, even though the temps stay below 83c (100% fan during testing, just to be sure I'm not temp-limited). +175 seems to be the sweet spot for me: it results in 1950~2025 in most games and a fan profile that doesn't sound like my PC is taxiing for takeoff. 70-80 fps with Overwatch @5k is awesome. My front intake is only 1 fan though, getting 2 extra fans today for some more airflow to the cards. If 2.1GHz is really a wall, it isn't worth buying EK blocks and another radiator for me.


I've seen similar results with my FE card on the stock cooler. Around 2050mhz I get the odd artefact.


----------



## muhd86

Quote:


> Originally Posted by *Baasha*
> 
> Got my 4x 1080 Founder's Edition cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, only if we can actually use 4-Way SLI!?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any updates on ETA for the unlock? And what about the new HB SLI bridges?


i wanted to get a similar setup, but:

1. Where do you get the new SLI bridges? Will the old 4-way SLI bridge that came with the Rampage V work?

2. How do you unlock the 4-way SLI option via key? There is no way to unlock them.


----------



## Menno

Quote:


> Originally Posted by *Silent Scone*
> 
> I've seen similar results with my FE card on the stock cooler. Around 2050mhz I get the odd artefact.


I don't get any artifacting the driver just crashes / stops working and resets.


----------



## Bogga

Quote:


> Originally Posted by *Chargeit*
> 
> So when are the aftermarket 1080's hitting?


If I'm lucky I'll get my cards on friday (Strix)


----------



## Silent Scone

Quote:


> Originally Posted by *Menno*
> 
> I don't get any artifacting the driver just crashes / stops working and resets.


Plenty of fireworks when I pushed the core too far here. Red dots before driver recovery.


----------



## traxtech

WTB bios mods, will give candy


----------



## Zurv

Quote:


> Originally Posted by *Baasha*
> 
> Got my 4x 1080 Founder's Edition cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, only if we can actually use 4-Way SLI!?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any updates on ETA for the unlock? And what about the new HB SLI bridges?


i'm with ya man. I have 2 systems filled up with 1080s - and waiting for the unlock.

there will never be an HB bridge for more than 2-way. That said, it doesn't really matter (at least to me), because the hard and LED bridges can handle it for 4K.

Do use the LED bridge over the hard one. It is better. How much? No idea. But in the pcper video with an Nvidia rep he talked about bridges - and the LED one had more bandwidth than the hard one.


----------



## TK421

Quote:


> Originally Posted by *traxtech*
> 
> WTB bios mods, will give candy


*candy given inside my van

Best one I can do so far with a single 1080

http://www.3dmark.com/fs/8717046


----------



## Frutek

So I got my MSI Gaming X 1080 and the stable clocks are 2126/11114 without additional voltage.

http://www.3dmark.com/fs/8717469


----------



## nexxusty

My EVGA FE arrives Thursday... excited.

Feeling like I'll finally get the performance I want at 1080p 144hz. Might even be able to play Ark now too... lol.


----------



## gree

The FE cards are dual-slot, correct? Trying to see if my case with 4 slots can fit two 1080s.


----------



## fat4l

So guys.....
how is it with the 1.25v lock?

What's the default voltage, and can you increase it with Afterburner to some point?

Does the card scale with voltage at least a bit?


----------



## zGunBLADEz

Leaving this here for you guys


----------



## TK421

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> Leaving this here for you guys


and we're still stuck here with gimped BIOSes


----------



## D13mass

Guys, where I can buy 1080 in USA?


----------



## TK421

Quote:


> Originally Posted by *D13mass*
> 
> Guys, where I can buy 1080 in USA?


out of stock everywhere right now


----------



## Mrip541

Quote:


> Originally Posted by *buellersdayoff*
> 
> WOW, can't believe people are paying ridiculous prices for these cards, especially stock coolers! What's the go?


Insanity. Insanity is the explanation.


----------



## Maintenance Bot

Quote:


> Originally Posted by *D13mass*
> 
> Guys, where I can buy 1080 in USA?


Newegg now http://www.newegg.com/Product/Product.aspx?Item=N82E16814126101&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-Veeralava%20LLC-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6202798&SID=

Or check here https://www.nowinstock.net/computers/videocards/nvidia/gtx1080/


----------



## looniam

new drivers out :
Quote:


> Changes and Fixed Issues in Version *368.39*
> The following sections list the important changes and the most common issues resolved in this version. This list is only a subset of the total number of changes made in this driver version. The NVIDIA bug number is provided for reference.
> 
> Windows 10 Fixed Issues
> - [368.25] GeForce GTX 1080 Founders Edition cards spin fan up and down rapidly. [1771960]





Spoiler: dont mind me im just trolling



http://forums.guru3d.com/showthread.php?t=407972&page=3


----------



## D13mass

I mean not reference - maybe Gigabyte or EVGA...
Thanks for the link https://www.nowinstock.net/computers/videocards/nvidia/gtx1080/ - I will monitor it; I didn't know about this service before.


----------



## jase78

When are the non reference cards being released to the masses?


----------



## jase78

Really liking the Strix card. I'm really new to PC gaming; I currently have an FTW 970 and am looking forward to upgrading. I figured the reference model 1080 would be $500 like the 980 was. Was I wrong for assuming this? Or do they always jump 200 bux between new models?


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> It's kinda sad my overclocked 1080's are only 13% faster than my overclocked 980 Ti's. Kinda of a wasted upgrade, but I wanted to try out the new SLI bridge.


yeah - about th same here... one card comparison.
Quote:


> Originally Posted by *Baasha*
> 
> Got my 4x 1080 Founder's Edition cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, only if we can actually use 4-Way SLI!?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any updates on ETA for the unlock? And what about the new HB SLI bridges?


4 -way SLI for what? 4K surround?
Quote:


> Originally Posted by *Silent Scone*
> 
> It's not available yet. Seriously, why would you even consider this anymore? Just stop...stop and think what you are doing, man!


Not many scenarios (or games) where more than 2 cards will scale.
Quote:


> Originally Posted by *Silent Scone*
> 
> I've seen similar results with my FE card on the stock cooler. Around 2050mhz I get the odd artefact.


Artifacts in what game/bench?


----------



## XCalinX

Just ordered mine. Any links to downloading custom bios? Already got the sexy EK block for it.


----------



## superkyle1721

Quote:


> Originally Posted by *XCalinX*
> 
> Just ordered mine. Any links to downloading custom bios? Already got the sexy EK block for it.


No custom bios out yet. Usually takes about a month or so before a tool is released to be able to mod the bios.

Always destroying exergy


----------



## MrDerrikk

Quote:


> Originally Posted by *jase78*
> 
> Really liking the strix card. I'm really new to pc gaming and currently have a ftw 970 and looking forward to upgrading. I figured the reference model 1080 would be 500 like the 980 was. Was I wrong for assuming this? Or do they always jump 200 bux between new models.


Ah, but this isn't the reference model - it's the mystical "Founders Edition" that is just beckoning you to pay extra for its sharp polygon lines.

If you want the reference model cheap though, just wait a week or so for MSI to bring out the Aero cards, the cheapest of which is at that price mark. I think they look pretty snappy myself.


----------



## Jared Pace

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> Leaving this here for you guys


wow almost 2.9ghz GTX 1080. Maybe 3.0+ ghz GTX 1080 is possible


----------



## gree

102C tho, would that be stable for gaming?

2.8 man that's going to destroy the hall of fame


----------



## mouacyk

Quote:


> Originally Posted by *CallsignVega*
> 
> It's kinda sad my overclocked 1080's are only 13% faster than my overclocked 980 Ti's. Kinda of a wasted upgrade, but I wanted to try out the new SLI bridge.


Awww comon -- you don't like the 50% TDP, 2x perf/watt, VR, 8GB, wot? I blame magnek for not trying hard enough to show that oc 1080 is only 12.5% faster than oc 980 ti.


----------



## VSG

Quote:


> Originally Posted by *gree*
> 
> 102C tho, would that be stable for gaming?
> 
> 2.8 man that's going to destroy the hall of fame


-102 °C, not +102 °C. That's a straight up LN2 run.


----------



## gree

Oh xD that's still insane. hopefully somebody can get past 2200 on air


----------



## CallsignVega

At least they fixed the fan revving issue with the new drivers. I also switched back to MSI AB, so much better than that alpha state EVGA OC-X.


----------



## Jpmboy

Quote:


> Originally Posted by *gree*
> 
> 102C tho, would that be stable for gaming?
> 
> 2.8 man that's going to destroy the hall of fame


you missed the "negative" part of the temperature.









ANYWAY - 30min OOB heaven run:


----------



## darkphantom

So i'm still sitting on an unopened Evga GTX 1080 FE and I really want to just buy a FTW 1080 but can't find any of those for sale, nor can I find an Asus Strix 1080 - any idea if those are even out?


----------



## MrDerrikk

Quote:


> Originally Posted by *darkphantom*
> 
> So i'm still sitting on an unopened Evga GTX 1080 FE and I really want to just buy a FTW 1080 but can't find any of those for sale, nor can I find an Asus Strix 1080 - any idea if those are even out?


Not that I know of, I had to preorder my FTW from B&H and apparently they'll get stock in on the 16th


----------



## dagget3450

Deleted - posted in heaven thread.


----------



## Baasha

Quote:


> Originally Posted by *Zurv*
> 
> i'm with ya man. I have 2 systems filled up with 1080s - and waiting for the unlock.
> 
> there will never be a HB bridge for more than 2 way. That said, it doesn't really matter (at least to me) because the hard and LED brdiges can do it for 4k.
> 
> Do use the LED bridge over the hard. It is better. How much? no idea. but in the pcper video with an NVidia he talked about bridges - and the LED had more bandwidth than the hard.


Nice!

Yea, I'm using the Nvidia hard LED 3-way SLI bridge.

Is the 4-Way thing going to come any time soon? Really sick of waiting (although it's only been a couple of days).

Also noticed you have the new 6950X. How is it compared to the 5960X? Perceptible difference? I wonder if that CPU can also handle much faster RAM(?).

Got any pics of your setup(s)? This thread needs more pics!









Here's mine:


----------



## bp7178

I just got my two EVGA 1080 Founders cards yesterday. Hit 25654 in Firestrike with my 6700k running at 4.8ghz, stock clocks on the 1080s in SLI. Once I get the 1080s installed with my EK blocks and get the loop going, I'll see if I can hit 2ghz with the 1080s and retest.

http://www.3dmark.com/fs/8714533

With 2x 980Ti cards in SLI OC'd to 1506MHz, I got 25549 in Firestrike, but the 6700k was at 4.9GHz for that.

http://www.3dmark.com/fs/8538994

The 980Ti cards were on air and fans cranked all the way up for that run. The 1080s managed to beat them at significantly lower noise levels. The air cooler on the Founders cards pumps out a lot of heat, but is much lower in volume at full blast than the EVGA 980Ti cards at full blast.


----------



## dagget3450

What are you guys going to do if Nvidia decides not to allow 3/4-way SLI? I am a huge quad-GPU fanatic, but four GTX 1080s when you can only enable 2-way? No HB bridges, and even if they decide to allow more than 2-way SLI, it's currently locked. To me this is like the guys buying Hydro EVGA cards because they cost the most, so they thought they were the best, not knowing they require a water-cooling setup. I can't wrap my head around this; leaving $1400 of GPUs unusable while waiting on the possibility of using them kills my brain. It's not about the money involved so much as the almost certain outcome when (IF) they enable it: knowing what 3/4-way scaling is like on the 980ti/Titan X, I just don't see why. It looks to me like you guys will be gaming, not benchmarking much, right?

Good luck, no doubt, but I cannot understand this path at all.


----------



## MCFC

Quote:


> Originally Posted by *gree*
> 
> Oh xD that's still insane. hopefully somebody can get past 2200 on air


Don't count on it, nvidia spent more time locking this card down than actually making it work well


----------



## Menno

I can get a pretty stable 1952 boost now with the stock fan profile, power target at 120%, and +175 on the core. The cards hit about 85c and 80c. That's an awesome compromise between performance and noise for me. (I know AIB cards can do it at lower temps, but I don't want two cards dumping heat in the case.)


----------



## nexxusty

Quote:


> Originally Posted by *TK421*
> 
> and we're still stuck here with gimped bioses


That's hard modded.

Been around long or?


----------



## webmi

[email protected] & [email protected]/5.600


----------



## KickAssCop

Quite an impressive run.


----------



## Spiriva

You guys who have two 1080s running in SLI, how do you like it so far? Have you noticed any micro stuttering or such?

I was gonna get two EVGA 1080 FEs, but ended up just getting one of them (the store sold them out in a matter of minutes). They will soon come back in stock, so I figured maybe I'll place an order and get the second card now, along with a new water block/backplate.

I know that the new Nvidia SLI bridge won't work with two EK blocks, but I'm hoping that EK will release their own bridge soon.

I'm planning on getting the new Kaby Lake CPU along with a new 200-series motherboard and some DDR4 to replace my old Z97 setup. The plan was to get two 1080Tis too, but since I have a monitor that can do 100Hz max (PG348Q), I really don't need more than 100fps in games at max settings. One 1080 can almost handle that; a few titles run under 100fps (GTA 5, for example).

But now I'm thinking about getting an extra 1080 and just skipping the 1080Ti altogether. Another thought would be to get two 1080Tis when they come out, but every fps over 100 would just be a waste.


----------



## Menthol

Quote:


> Originally Posted by *webmi*
> 
> [email protected] & [email protected]/5.600


Does Valley still report higher than actual clocks or has that now changed?


----------



## Vellinious

Quote:


> Originally Posted by *Menthol*
> 
> Does Valley still report higher than actual clocks or has that now changed?


None of the benchmarks really report the clocks correctly. I noticed though, that once you flash to a custom bios, at least on Maxwell, they DID show correctly. At least on the Unigine benchmarks.


----------



## pauly94

Thinking about buying a 1080 but don't want to wait for aftermarket ones. Should I just go for the FE despite its mediocre temps? I much prefer the look of the FE too.


----------



## webmi

Quote:


> Originally Posted by *Menthol*
> 
> Does Valley still report higher than actual clocks or has that now changed?


gpu clock was 2.202, valley shows 2.202 = correct clocks are shown


----------



## webmi

meh


----------



## Zurv

Quote:


> Originally Posted by *webmi*
> 
> [email protected] & [email protected]/5.600


what cooling are you using? or other fancy trick to get to 2200mhz on the gpu?


----------



## Jpmboy

Ghetto water cooling.


----------



## Associated

Quote:


> Originally Posted by *Jpmboy*
> 
> Ghetto water cooling.


Not cooling VRMs at all?


----------



## Jpmboy

Quote:


> Originally Posted by *Associated*
> 
> Not cooling VRMs at all?


120V fan (a screamer, just for testing). I'm measuring the temps of the RAM and power section with an IR thermometer. So far, looping Heaven 4.0, one of the chokes (top R22) hits 58C. Honestly, the stock air cooler probably doesn't cool these as well as forced air directly on the parts. GPU core max temp is 28C.


----------



## SsXxX

http://www.overclock.net/t/1602413/gtx-1080-power-target

hello guys, can u kindly check the link and assist me

thanks


----------



## Associated

Quote:


> Originally Posted by *SsXxX*
> 
> http://www.overclock.net/t/1602413/gtx-1080-power-target
> 
> hello guys, can u kindly check the link and assist me
> 
> thanks


Try with MSI Afterburner

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


----------



## Jpmboy

Quote:


> Originally Posted by *Associated*
> 
> Try with MSI Afterburner
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


^^This. 4.3 beta is working very well.

_____________
I mean _forced_ air:


----------



## SsXxX

already tried the msi afterburner, same issue









hmm . . . some body just replied now and his explanation seems logical:

http://www.overclock.net/t/1602413/gtx-1080-power-target


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^This. 4.3 beta is working very well.
> 
> _____________
> I mean _forced_ air:


Thought you weren't gonna upgrade :-D

Where is your 6950x?


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> Thought you weren't gonna upgrade :-D
> 
> Where is your 6950x?


I didn't; it seems more like a side-grade.








6950x on this other rig.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> I didn't. seems more like a side grade.
> 
> 
> 
> 
> 
> 
> 
> 
> 6950x on this other rig.


Hahaha nice man! I can't make myself do it. I also sold my Titans and have a 560 Ti 448...

I have ventured down a different road


Spoiler: Warning: Spoiler!


----------



## webmi

Quote:


> Originally Posted by *Zurv*
> 
> what cooling are you using?


3x480mm + mora420 @ 300 rpm noctua P12/A14 PWM (room 24°C, water 30°C, gpu 42°C)


----------



## Trys0meM0re

Have any of you guys noticed improved overclocking when switching from air to water on the GTX 1080?


----------



## Jquala

I'm having an odd issue. I was stable at +220/+600 yesterday, and now I can't even reach +150/0 after the driver update. When I turn SLI on, Firestrike just freezes up, or I get gnarly black streaking artifacts as low as +100 on the core. The cards were overclocking beautifully from launch until yesterday, but SLI has never worked for me.


----------



## AllGamer

Well, the new drivers were made to take care of the fan spin-up/spin-down issue.

Whatever they changed to prevent the fan from going crazy might have affected the OC-ing capabilities, perhaps as a safety measure to prevent overheating.

That'd be my guess.


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> Hahaha nice man! I cant make myself do it. I also sold my titans and have a 560ti 448...
> 
> I have ventured down a different road
> 
> 
> Spoiler: Warning: Spoiler!


lol - watch out for the fork. Nice kit!








Quote:


> Originally Posted by *Jquala*
> 
> I'm having an odd issue. I was stable with 220/600 yesterday and now I can't even reach 150/0 after the driver update. When I turn sli on firestrike just decides to freeze up or I get gnarly black streaking artifacts as low as +100 on the core. They were over locking beautifully until yesterday since launch but sli has never worked for me


If it was an auto-update via GeForce Experience, delete the drivers with DDU and reinstall manually. That's the best way to update NV drivers.


----------



## fat4l

I wonder what is locking that voltage and how long it will take to bypass it.









Even this guy doesn't seem to be sure what it is...


----------



## Creizai

I don't want to make a separate thread, just a quick question for anyone who got a Founders Edition from Nvidia: how long until you got a tracking number? I ended up buying one last Friday from Nvidia's page and it didn't say pre-order, so I was wondering what the deal was.


----------



## nexxusty

Ugh... one more day you lucky bastards.

Lol.
Quote:


> Originally Posted by *MCFC*
> 
> Don't count on it, nvidia spent more time locking this card down than actually making it work well


So true. Sux...


----------



## pez

Snagged a G1 from NE this morning/last night. Will be able to join you fine folks on Monday









----------



## Zurv

Quote:


> Originally Posted by *Spiriva*
> 
> You guys who got two 1080 running in SLI, how do you like it so far ? Have you notice any micro stuttering or such ?


Nope, none. That said, even when running 4-way SLI I never really noticed it either. Maybe I'm bad at seeing it? Maybe the X99 platform and extra lanes helped with it? Maybe I'm not playing the games others saw stuttering in?

I'd be happy to test something - here is my steam game list
http://steamcommunity.com/id/zurv/games/?tab=all


----------



## Zurv

sadly my PC looks 100% the same as when 4 Titan Xs were in there with a 5960X









but here it is with 4 1080s and a 6950x







(WHERE IS THE UNLOCK NVIDIA!!!!)


----------



## Bogga

Quote:


> Originally Posted by *Zurv*
> 
> sadly my PC looks 100% the same when 4 titan Xs were in there with a 5960x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but here it is with 4 1080s and a 6950x
> 
> 
> 
> 
> 
> 
> 
> (WHERE IS THE UNLOCK NVIDIA!!!!)
> 
> 
> 
> Spoiler: Warning: Spoiler!


External rads?


----------



## Zurv

Quote:


> Originally Posted by *Trys0meM0re*
> 
> Any of you guys noticed improved overclocking when switchin from air to water on the GTX1080 ?


Yes and no. No, the max number is about the same, but on air, thermal limits kick in after a while and down-clock the GPU. That isn't an issue with water.
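To illustrate the air vs. water difference: the peak clock is set by the silicon, but sustained clocks depend on staying under the temperature limit. Here's a toy model (the limit, step size, and temps are made up for illustration; this is not NVIDIA's actual GPU Boost algorithm), where the core sheds boost bins once it passes the thermal limit:

```python
# Toy model of boost-bin throttling: once the core temp passes the limit,
# the clock steps down one ~13 MHz bin per degree over (illustrative only).

def sustained_clock(peak_mhz, temp_c, temp_limit_c=83, step_mhz=13):
    """Return the clock the card settles at for a given steady-state temp."""
    if temp_c <= temp_limit_c:
        return peak_mhz  # under the limit: the full boost clock holds
    bins_lost = temp_c - temp_limit_c
    return peak_mhz - bins_lost * step_mhz

# Air cooler creeping up to 90C vs. a water loop holding 45C:
print(sustained_clock(2100, 90))  # air: down-clocked below the peak
print(sustained_clock(2100, 45))  # water: holds the peak
```

Same max number either way; water just never gives any of it back.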


----------



## viking21

Is anyone here running a 1080 in a Node 202 or Raven 02?


----------



## Zurv

Quote:


> Originally Posted by *Bogga*
> 
> External rads?


Yeah, I rebuild the system a few times a year and got sick of all the crap in the case.

I went with the koolance ERM-3k3uc:
http://koolance.com/erm-3k3uc-liquid-cooling-system-copper

I have another one for my PC connected to the TV. On its side it fits nicely into the AV furniture.
Here is an old pic of it before I mounted it in the cabinet - imagine that open space with the Koolance unit.


----------



## Weber

This card reminds me of the good vibe from when Maxwell was new. I'll get another as soon as I can figure out which water block to use.
Room is 26C in a stock heat test at 100% load.


----------



## looniam

Quote:


> Originally Posted by *Jpmboy*
> 
> Ghetto water cooling.


dat wall socket doe!


----------



## Jpmboy

Quote:


> Originally Posted by *looniam*
> 
> dat wall socket doe!


Dial 911?

lol - it's only phone charger crap and the like.








Quote:


> Originally Posted by *Zurv*
> 
> yes and no. No, the max number is about the same, but on air thermal limits kick in after a while and down-clock the GPU. That isn't an issue with water.


I got a slightly different result testing only one card though - recognizing that getting 2 or more cards to overclock "as one" is rare at best and a compromise with the weakest card at worst. I was able to hit mid-2000s core / 10800s memory with the air cooler at a peak sustained core temp below 60C. With a uniblock the core is set to 2185, RAM to 11000 - and it's good through Heaven 4.0 (loops, even)... need to continue stress testing.









Nice cooler! Love Koolance stuff.








When needed I hook in the aquarium chiller on top of the 4x420 Aquacomputer external rad.


Spoiler: Warning: Spoiler!


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> Dial 911?
> 
> lol - it's only phone charger crap and the like.
> 
> 
> 
> 
> 
> 
> 
> 
> I got a little different result testing only one card tho - recognizing that getting 2 or more cards to overclock "as one" is rare at best and a compromise with the weakest card at worst. I was able to hit mid 2000's / 10800s with the air cooler @ a peak sustained core temp below 60C. With a uniblock the core is set to 2185, ram to 11000 - and it's good thru Heaven 4.0 (loops even).... need to continue stress testing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice cooler! Love Koolance stuff.
> 
> 
> 
> 
> 
> 
> 
> 
> When needed I hook in the aquarium chiller on top of the 4x420 Aquacomputer external rad.
> 
> 
> Spoiler: Warning: Spoiler!


you scare me sir... and i'm totally ok with it









I have the Koolance chiller. (I keep trying to give it away to friends, but they are all goddamn slackers who are making the right choice and not water cooling.) But the hum of the unit drove me crazy. (And the fact that the CPU didn't run any better when it was colder. I always have bad luck with CPU OCs... no matter the voltage or how cold, they still OC like poo.)


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> you scare me sir... and i'm totally ok with it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i have the koolance chiller. (i keep trying to give it away to friends but they are all god damn slackers that are making the right choice and not water cooling) But the hum of the unit drove me crazy. (and the fact that the CPU didn't run any better when it was colder. I always have bad luck with CPU OC'd... no matter the volt or how cold - they still OC like poo )


lol, for sure. Cold only goes so far... until you cross that Tc (cryogenic transition thing). Running Heaven 4.0 loops (idk, habit mostly) and scanning the PCB with an IR gun, it would seem that a good mount on the chokes - the 2 circled in red - may be important; they're the hottest things on the open PCB.











I gotta ask... what resolution are you driving with 4 1080s?


----------



## bigdaddytexel

I have a card and am trying to register for the owners club, but it keeps saying my GPU-Z validation is incorrect. I have put in the string that GPU-Z validated as well as my member name. No luck. Any ideas?


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> I gotta ask... what resolution are you driving with 4 1080s?


Nothing right now! Nvidia's ungimp tool isn't out yet, so only 2 are being used. But this PC has a 4K 32" G-Sync Acer screen, and the other PC drives a 65" 4K HDR OLED TV.

I'm looking forward to DP 1.3 (or 1.4) 32"+ 4K screens above 60 Hz. Maybe that Dell 30" 4K OLED will be released at some point; Nvidia has had one for testing for a few months.


----------



## bigdaddytexel

http://www.techpowerup.com/gpuz/details/ar799
EVGA
1 Card Founders

Haven't OCed much, but I just installed the EK block. So for the voltage offsets, do you drag them down in order to ramp up voltage quicker at lower clock speeds? I seem to be hitting a voltage wall at anything over +225 on the core.


----------



## Zurv

Quote:


> Originally Posted by *bigdaddytexel*
> 
> http://www.techpowerup.com/gpuz/details/ar799
> EVGA
> 1 Card Founders
> 
> Havent OCed much but just installed EK block. So for the voltage offsets do you drag them down in order to ramp up voltage quicker at lower clock speeds? I seems to be hitting possibly a voltage wall at anthing over +225 on core.


If it makes you feel any better, that is around where the card poos out on me too. It makes me feel better that others have the same issue. (I have 8 cards, so I'm sure some MUST be better... but I'm not redoing all the loops again.)

I even flashed all my cards to the EVGA FE SC BIOS








it didn't help.. but i feel a small victory knowing my base speed is a little faster


----------



## Jpmboy

Quote:


> Originally Posted by *bigdaddytexel*
> 
> http://www.techpowerup.com/gpuz/details/ar799
> EVGA
> 1 Card Founders
> 
> Havent OCed much but just installed EK block. So for the voltage offsets do you drag them down in order to ramp up voltage quicker at lower clock speeds? I seems to be hitting possibly a voltage wall at anthing over +225 on core.
> 
> 
> Spoiler: Warning: Spoiler!


Download Afterburner 4.3 beta and use the "curve" editor (Ctrl+F). http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,2.html
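For anyone unsure what working the curve actually accomplishes, here's a rough sketch of the idea in Python (the function and the curve points are invented for illustration; Afterburner exposes this as a drag-and-drop editor, not an API): shift the whole voltage/frequency curve up by your offset, then flatten every point above the voltage where the card walls out, so it stops requesting voltage it can't use.

```python
# Illustrative sketch of "flattening" a V/F curve: apply a core offset,
# then clamp all points above a chosen voltage to the clock at that voltage.
# All numbers below are made up; real cards have many more curve points.

def flatten_curve(curve, offset_mhz, vmax_mv):
    """curve: list of (voltage_mv, clock_mhz) points in ascending voltage."""
    shifted = [(v, clk + offset_mhz) for v, clk in curve]
    # Highest clock at or below the voltage cap; everything above is clamped.
    cap_clock = max(clk for v, clk in shifted if v <= vmax_mv)
    return [(v, clk if v <= vmax_mv else cap_clock) for v, clk in shifted]

stock = [(800, 1700), (900, 1850), (1000, 1950), (1062, 2000), (1093, 2025)]
for point in flatten_curve(stock, offset_mhz=150, vmax_mv=1000):
    print(point)
```

The payoff is that the card holds its best stable clock at the capped voltage instead of boosting into the unstable region above it.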


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> if it makes you feel any better that is around where the card poo's out on me too. It makes me feel better that others have the same issues. (i have 8 cards so i'm sure some MUST be better.. but i'm not redoing all the loops again.)
> 
> I even flashed all my cards to EVGA FE SC
> 
> 
> 
> 
> 
> 
> 
> 
> it didn't help.. but i feel a small victory knowing my base speed is a little faster


how did you do the Flash?????


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> how did you do the Flash?????


I just used the latest version (5.287):
http://www.techpowerup.com/downloads/Utilities/BIOS_Modding/ (which isn't working for me now)

What was odd was I did the dance: protection off... flash... blah blah...
Shut down, looked at the cards, and they looked the same. But 5 minutes ago I checked GPU-Z again and, poof, all are EVGA SC. Very strange. Everything works fine; confirmed that base and boost match the SC.

(2 are normal evga fe and 2 are asus FE)


----------



## bigdaddytexel

My question is: do you drag the sliders down in order to provide more voltage at a given clock earlier? So say I want peak voltage at +255, I drag the sliders down to match, right? I've tried this in Precision and Afterburner, but it seems buggy; I almost get worse performance after messing with it. Wondering what I am doing wrong. The card has a ton of cooling headroom.


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> i just used the latest version (5.287)
> http://www.techpowerup.com/downloads/Utilities/BIOS_Modding/ (which isn't working for me now)
> 
> What was odd was i did the dance. protectionoff.. flash.. blah blah..
> shutdown.. looked at the cards and they looked the same. But 5min ago i check gpu-z again.. and poof.. all are evga SC. Very strange. everything works fine. confirmed that base and boost match the SC.
> 
> (2 are normal evga fe and 2 are asus FE)


Well, it worked once at least... that's half the puzzle. Now we only need a Pascal BIOS Tweaker! I keep hitting the power limit.



Valley has not PL'ed yet....


----------



## bigdaddytexel

So you are taking a Founders Edition and flashing it to a Superclocked? Does that push more voltage or just raise the base/boost curve? Is it worth it? I have an EVGA Founders with an EK block. Looking to get the most out of it, obviously.


----------



## Zurv

Quote:


> Originally Posted by *bigdaddytexel*
> 
> So you are taking a founders edition and flashing to a superclocked? Does that push more voltage or just up the base boost curve? Is it worth it? I have a EVGA founders w EK block. Looking to get the most out of it. Obviously


pretty much just changes the base and boost speeds. It won't change how well the card OCs.


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> well it worked once at least... that's half the puzzle. Now we only need Pascal Bios Tweaker! I keep hitting the power limit.


... and there is a good amount of room. The 8-pin plus PCIe slot power will get us to 225 W; right now the cards are limited to 180 W.


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> ... and there is a good amount of room. 8 pin + PCI bus power will get us to 225w, right now the cards are limited to 180w


The 8-pin can deliver quite a bit more than 150 W... that's just the nominal rating.
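For reference, the spec-level arithmetic behind those numbers (a sketch only; as noted, real connectors can deliver more than their nominal rating):

```python
# Power-budget math for a single-8-pin Founders Edition card:
# PCIe slot is specced for 75 W, an 8-pin connector for 150 W,
# while the stock BIOS caps the card at 180 W.

PCIE_SLOT_W = 75
EIGHT_PIN_W = 150
BIOS_LIMIT_W = 180

spec_budget = PCIE_SLOT_W + EIGHT_PIN_W        # what the connectors allow
headroom = spec_budget - BIOS_LIMIT_W          # room a BIOS mod could unlock
power_target_pct = 100 * spec_budget / BIOS_LIMIT_W

print(spec_budget)             # 225
print(headroom)                # 45
print(round(power_target_pct)) # 125
```

So even without exceeding nominal connector ratings, a raised limit would allow a ~125% power target versus stock.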


----------



## Zurv

ARGH!!!
Quote:


> NVIDIA will enable support for three and four card configurations in future drivers (without a key) for specific overclocking/benchmarking tools only, as a way to make sure the GeForce brand doesn't fall off the 3DMark charts. Only those specific applications will be able operate in the 3-Way and 4-Way SLI configurations that you have come to know. There are no profiles to change manually and even the rare games that might have "just worked" with three or four GPUs will not take advantage of more than two GTX 10-series cards. It's fair to say at this point that except for the benchmarking crowd, NVIDIA 3-Way and 4-Way SLI is over.


http://www.pcper.com/news/Graphics-Cards/GeForce-GTX-1080-and-1070-3-Way-and-4-Way-SLI-will-not-be-enabled-games

HEY! I ALREADY BOUGHT THESE CARDS. This is pretty F'd. You can't promise to unlock the cards then say FU.. you can't use them for games!!!


----------



## VSG

Wow.. If that is true, and no reason to think otherwise at this point, that is absolutely terrible for the consumer. This should have been mentioned during reviews, but I bet Nvidia did not inform the media either.


----------



## nexxusty

Quote:


> Originally Posted by *Zurv*
> 
> oh ***?!!?
> http://www.pcper.com/news/Graphics-Cards/GeForce-GTX-1080-and-1070-3-Way-and-4-Way-SLI-will-not-be-enabled-games
> 
> HEY! I ALREADY BOUGHT THESE CARDS. This is pretty F'd. You can't promise to unlock the cards then say FU.. you can't use them for games!!!


3/4 way SLi is useless for 99% of games.


----------



## Nark96

Quote:


> Originally Posted by *Zurv*
> 
> oh ***?!!?
> http://www.pcper.com/news/Graphics-Cards/GeForce-GTX-1080-and-1070-3-Way-and-4-Way-SLI-will-not-be-enabled-games
> 
> HEY! I ALREADY BOUGHT THESE CARDS. This is pretty F'd. You can't promise to unlock the cards then say FU.. you can't use them for games!!!


Next time maybe you should think and do some thorough research before you splash out $2800 on reference GPUs... just a thought, buddy


----------



## VSG

Quote:


> Originally Posted by *Nark96*
> 
> Next time maybe you should think and do some thorough research before you splash out $2800 on reference GPU's... just a thought buddy


Please do help me out here - what was out before this that mentioned what is in the PCPer article?


----------



## Nark96

Quote:


> Originally Posted by *geggeg*
> 
> Please do help me out here- what was out before this that mentioned what is in the PCPer article?


It was said on one of the tech channels on YouTube, can't remember if it was Linus/HardwareCanucks etc., but one of them said the GTX 1080 will only support 2-way SLI in the majority of applications... 3/4-way SLI for gaming is just silly regardless of that anyway.


----------



## MrDerrikk

Does that mean lots of second-hand FE models will be hitting the market from the 3-4 way peeps?


----------



## Zurv

Quote:


> Originally Posted by *Nark96*
> 
> Next time maybe you should think and do some thorough research before you splash out $2800 on reference GPU's... just a thought buddy


Hrmm... I have 8 (!) cards.

I also have 7 Titan Xs, and almost every game that was important to me worked just great (maybe after some time) with 3/4-way SLI - including The Witcher 3, The Division, BF4, Fallout 4, Rise of the Tomb Raider, Shadow of Mordor, etc.
Without 3+ SLI I would not have been able to play them at 4K at max settings.
Random people bashing SLI on the internet don't understand how it works, nor do they understand scaling. (And yes, it didn't work sometimes... and some companies refuse to use it at all - I'm looking at you, Capcom!)

They CLEARLY said 3/4-way would work at launch with the unlock key.
Quote:


> Enthusiast Key
> While NVIDIA no longer recommends 3 or 4 way systems for SLI, we know that true enthusiasts will not be swayed&#8230;and in fact some games will continue to deliver great scaling beyond two GPUs. For this class of user we have developed an Enthusiast Key that can be downloaded off of NVIDIA's website and loaded into an individual's GPU. This process involves:
> 1. Run an app locally to generate a signature for your GPU 2. Request an Enthusiast Key from an upcoming NVIDIA Enthusiast Key website 3. Download your key 4. Install your key to unlock the 3 and 4-way function
> Full details on the process are available on the NVIDIA Enthusiast Key website, which will be available at the time GeForce GTX 1080 GPUs are available in users' hands.


http://international.download.nvidia.com/geforce-com/international/pdfs/GeForce_GTX_1080_Whitepaper_FINAL.pdf

Also, 2 1080s aren't enough! I can't run The Division with all the settings maxed like I could on my 4-way Titan X setup (as an example).


----------



## VSG

Quote:


> Originally Posted by *Nark96*
> 
> It was said on one of the tech channels on youtube, can't remember if it was Linus/hardwarecanucks etc, but one of them said the GTX 1080 will only support 2 way SLI in the majority of applications... 3/4 way SLI for gaming is just silly regardless of that anyway.


I am talking about this new update: the Enthusiast Key being cancelled, and >2-way SLI being enabled for three benching suites only. This is completely new. People bought more than 2 cards, for whatever reason they may want, because Nvidia and the media told them it would be possible in the future. Clearly not.


----------



## Nark96

Quote:


> Originally Posted by *Zurv*
> 
> hrmm.. i have 8! cards..
> 
> i also have 7 Titan X.. and almost every game that was important to me worked just great (maybe after some time) with 3/4 way SLI. Including, witcher 3, the division, BF4, fallout 4, rise of the tomb raider, shadow of mordor... etc..
> without 3+ SLI i would not have been able to play them at 4k at max settings.
> random people bashing SLI on the internet don't understand how it work nor understand scaling.
> 
> they CLEARLY said 3/4 would work.. at launch with the unlock key.


Do whatever you want with your money... but many would agree that it's just being wasted on overkill components lol.
I still stand by what I said and have always believed: anything past 2-way SLI for gaming is just ridiculous. You'll have more disadvantages than benefits - more heat, more power draw - and you'll be out of pocket by a huge amount just to "show off." But like I said, it's your money, bud


----------



## Zurv

Quote:


> Originally Posted by *Nark96*
> 
> Do whatever you want with your money... but many would agree that it's just being wasted with overkill components lol.
> I still stand by what I said, and have always believed in. Anything past 2 way SLI for gaming is just ridiculous. You'll have more disadvantages than benefits. More heat, more power withdrawal, and you'll be out of pocket by a huge amount to just "show off" but like I said, it's your money bud


Show off? To who? ME? When I want to play a game with all the fancy settings on... and now I can't, because 2 1080s are not powerful enough!

Ridiculous? Really? Sub-60 fps? No, it isn't. What site are you posting on?








A waste is a 6950X... that doesn't really help with much. More cards make games playable at the resolution and settings I want. I want 4K. I want all the settings. I'll pay for it... big deal... just money.
3/4-way did the job in the games that matter to me. If you want to use a 780 and play at 1080p, that is fine for you. Don't tell me what is right for me.


----------



## Nark96

-delete-


----------



## Shiftstealth

Quote:


> Originally Posted by *Zurv*
> 
> sadly my PC looks 100% the same when 4 titan Xs were in there with a 5960x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but here it is with 4 1080s and a 6950x
> 
> 
> 
> 
> 
> 
> 
> (WHERE IS THE UNLOCK NVIDIA!!!!)


Lol, Nvidia just announced they won't be releasing 4 way support. Might want to sell 2 of those puppies.


----------



## Chargeit

The problem with 3/4-way SLI is that most people aren't doing it, so it's kind of a waste of resources to support it for the handful of people that use it. When you're in the minority, sometimes stuff doesn't work out well for you. Same reason that, frustrating as it may be, not every game supports 21:9 properly. I can't complain too much because I'm in the minority using 21:9. Though it still sucks.


----------



## MunneY

I bet we see a hacked BIOS or support. I really find it odd that they will not support someone who wants to give them 2x the extra money.


----------



## AllGamer

Quote:


> Originally Posted by *Zurv*
> 
> ARGH!!!
> http://www.pcper.com/news/Graphics-Cards/GeForce-GTX-1080-and-1070-3-Way-and-4-Way-SLI-will-not-be-enabled-games
> 
> HEY! I ALREADY BOUGHT THESE CARDS. This is pretty F'd. You can't promise to unlock the cards then say FU.. you can't use them for games!!!


Quote:


> Originally Posted by *geggeg*
> 
> Wow.. If that is true, and no reason to think otherwise at this point, that is absolutely terrible for the consumer. This should have been mentioned during reviews, but I bet Nvidia did not inform the media either.


Actually, nVidia did say it loud and clear during the live stream: 3- and 4-way SLI will not be officially supported in games.

You can still do it, but there will be no support.

As in, you are on your own: if you can make it work, yay; if you can't, too bad.

They will only provide support for 2-way SLI. That's nVidia's official stance.


----------



## AllGamer

Quote:


> Originally Posted by *Chargeit*
> 
> The problem with 3/4 way sli is most people aren't doing it. So, it's kind of wasted resources to support it for the handful of people that use it. When you're in the minority sometimes stuff don't work out well for you. Same reason that frustrating as it may be, not every game supports 21:9 properly. Can't complain too much because I'm in the minority using 21:9. Though it still sucks.


Yup, I felt cheated after I got my quad GTX 690 setup and it was never properly supported.


----------



## Zurv

Quote:


> Originally Posted by *Chargeit*
> 
> The problem with 3/4 way sli is most people aren't doing it. So, it's kind of wasted resources to support it for the handful of people that use it. When you're in the minority sometimes stuff don't work out well for you. Same reason that frustrating as it may be, not every game supports 21:9 properly. Can't complain too much because I'm in the minority using 21:9. Though it still sucks.


I'd agree with you... but that isn't what just happened. It would be as if they said 21:9 will not be supported and locked you out of that option, even though they promised you would still have it.
This all would have been fine if they hadn't released info saying it would still be a choice, then waited 2 weeks after the cards were out to change the line.

Games are still going to come out, and those games will have the same level of support for 3- and 4-way SLI as they always have... just not on the 1080.
Quote:


> Originally Posted by *AllGamer*
> 
> Actually nVidia did said it loud and clear during the live stream 3 and 4 way SLI will not be officially supported in games
> 
> You can still do it, but there will be no support.
> 
> As in you are on your own, if you can make it work, yay, if you can't then too bad.
> 
> They only will provide support to 2 way SLI, that's nVidia official stance.


I was fine with that... but that isn't this new news. They are locking 3/4-way SLI out for everything other than benchmarking.

But... it is what it is now. I wouldn't have picked up these cards, but it is too late. They are slower than my 4-way Titan X setup.


----------



## Jpmboy

Quote:


> Originally Posted by *Nark96*
> 
> Do whatever you want with your money... but many would agree that it's just being wasted with overkill components lol.
> I still stand by what I said, and have always believed in. Anything past 2 way SLI for gaming is just ridiculous. You'll have more disadvantages than benefits. More heat, more power withdrawal, and you'll be out of pocket by a huge amount to just "show off" but like I said, it's your money bud


Quote:


> Originally Posted by *Nark96*
> 
> the word 'ignorant' comes to mind when I read your replies
> 
> 
> 
> 
> 
> 
> 
> quite sad really... look by all means do whatever you want with your money. You could set all those GPU's on fire and I couldn't care less because they aren't mine. I can have my opinion just like you can have yours... same with anyone else. Freedom of speech buddy. FYI I'm waiting for the aftermarket/non reference 1080's to then get rid of my 780 Classified. Till then I'll be wise with my money and stick to what I have


Ignorant? LOL, _I_ sense envy.
Some folks buy a car with 650HP







and not for the track. Got a problem with that??








4-way has its benefits: if you game at 4K and above, it is not overkill... if the game and NV can support it. Driver optimization for 4-way (and 3-way) gaming has been lacking since the release of Maxwell. This is driven by market segmentation, which has been declining in HEDT in general and more so at the 4-GPU level. I don't know what business you are in, but chasing the "whales" out has its consequences.
Fercrissakes... if ya can't run with the big dogs, don't get off the porch.


----------



## Chargeit

Yeah, they should have made their intentions clearer.


----------



## Jpmboy

where all those wayward 3 and 4 way casts out can go:


----------



## nexxusty

Quote:


> Originally Posted by *Jpmboy*
> 
> where all those wayward 3 and 4 way casts out can go:


Haha Frank.

"Hi ladies!! I'm Frack! *****". ROFL.


----------



## floodo1

Nvidia just said they won't have SLI profiles for games, but DX12 games can take advantage of >2 GPUs if the developer wants to. Some games will support 3 or 4 GPUs.

Also, we should just ignore the anti-3/4-way SLI trolls... they hate because they are stuck at 1080p or are poor (-8


----------



## gree

Hey, anybody know if Overclockers UK is a good place to order from (I'm in the USA)? They seem to have more 1080 options than places like Newegg.


----------



## MrDerrikk

Quote:


> Originally Posted by *gree*
> 
> Hey anybody know if overclocks uk is a good place to order from (I'm in the USA) they seem to have more 1080 options than places like newegg


From what I've seen the exchange rate is slightly better from B&H than Overclockers (I'm in Aus).


----------



## Seyumi

Quote:


> Originally Posted by *Zurv*
> 
> ARGH!!!
> http://www.pcper.com/news/Graphics-Cards/GeForce-GTX-1080-and-1070-3-Way-and-4-Way-SLI-will-not-be-enabled-games
> 
> HEY! I ALREADY BOUGHT THESE CARDS. This is pretty F'd. You can't promise to unlock the cards then say FU.. you can't use them for games!!!


I actually feel bad for you, coming from a 5960X with 4 Titan Xs myself. I think your biggest waste is your new $1700 10-core processor, since it sounds like you're more of a gamer, like myself. You're sure as hell not going to run into any CPU bottlenecks with only 2 1080s. (What do you even have that thing overclocked to, anyway? Several sites are hitting a wall at 4.1 to 4.3 GHz, which is kind of crappy for gaming.) What do you plan on doing with your CPU and GPUs? Return? Sell? Wait and see what happens? The good news is you might actually be able to scalp a profit selling your cards as-is due to demand. You were probably better off with a higher-clocked 5960X or even a Z170 system, which is what I downgraded to, and I'm now glad I did.

FYI - everyone bashing this guy for his ridiculous computers needs to look at the bigger picture. I've got several guys at my work who spend several thousand dollars lifting their stupid big-ass trucks, or thousands on bling-bling rims for their ricer cars, which not only probably degrade performance but also look absurd. I've seen this for over a decade, and not just at my current workplace.


----------



## Bogga

Any of you guys got your AIB versions yet? Today at 9 (CEST) I'll know if my two STRIX cards will arrive tomorrow. With my luck I'll probably have to wait two more weeks... but who knows, I've gotta be lucky sometime


----------



## Spiriva

Quote:


> Originally Posted by *Nark96*
> 
> Do whatever you want with your money... but many would agree that it's just being wasted with overkill components lol.
> I still stand by what I said, and have always believed in. Anything past 2 way SLI for gaming is just ridiculous. You'll have more disadvantages than benefits. More heat, more power withdrawal, and you'll be out of pocket by a huge amount to just "show off" but like I said, it's your money bud


Are you sure you're not a Swede, or is this a European thing? This crazy envy that other people can and do buy things you can't? And that you feel the need to tell people that they waste money, and that an AMD card from the year 2000 is the way to go.

This is the main reason I stay away from the Swedish forum "sweclockers": it's just a bunch of crying kids there telling everyone who bought something they can't afford how silly, dumb, and wasteful it is.

I really hope this forum doesn't turn into that.


----------



## fat4l

3/4 way sli. NO!


----------



## Bogga

Quote:


> Originally Posted by *Spiriva*
> 
> Are you sure your not a Swede or is this a European thing ? This crazy envy that other people can and do buy things you cant ? And that you feel the need to tell people that they waste money, and that a AMD card from year 2000 is the way to go.
> 
> This is the main reason i stay away from the Swedish forum "sweclockers" because its just a bunch of crying kids there telling everyone who bought something they cant afford how silly,dumb and a waste of money it is.
> 
> I really hope this forum doesnt turn in to that.


Can't say I've seen much of that... but please try Flashback. People there tell you that for 8000 SEK you can buy a monster that will handle everything for 5 years into the future without breaking a sweat.


----------



## grimboso

Quote:


> Originally Posted by *Spiriva*
> 
> Are you sure you're not a Swede, or is this a European thing? This crazy envy that other people can and do buy things you can't? And that you feel the need to tell people that they're wasting money, and that an AMD card from the year 2000 is the way to go.
> 
> This is the main reason I stay away from the Swedish forum "sweclockers", because it's just a bunch of crying kids telling everyone who bought something they can't afford how silly, dumb and wasteful it is.
> 
> I really hope this forum doesn't turn into that.


Quote:


> Originally Posted by *Bogga*
> 
> Can't say I've seen much of that... but please try Flashback. People there will tell you that for 8,000 SEK you can buy a monster that can take everything for 5 years into the future without breaking a sweat.


Have to agree with Bogga, that I haven't seen much about that. Needless to say Sweclockers is not as "evolved" in terms of overclocking as this forum is, and my personal opinion is that the "average" user here is much more tech-savvy than on sweclockers.

The most annoying thing there, though, is the constant AMD vs Nvidia fanboy battle where they just pour hatred over the brands without any coherent arguments.

On topic: my two EVGA SC cards are delayed by 12 days from EVGA, wth guys! I want to install my blocks. Everything else is set up and just waiting!


----------



## Spiriva

Quote:


> Originally Posted by *Bogga*
> 
> Can't say I've seen much of that... but please try Flashback. People there will tell you that for 8,000 SEK you can buy a monster that can take everything for 5 years into the future without breaking a sweat.


Quote:


> Originally Posted by *grimboso*
> 
> Have to agree with Bogga, that I haven't seen much about that. Needless to say Sweclockers is not as "evolved" in terms of overclocking as this forum is, and my personal opinion is that the "average" user here is much more tech-savvy than on sweclockers.
> 
> The most annoying thing there, though, is the constant AMD vs Nvidia fanboy battle where they just pour hatred over the brands without any coherent arguments.
> 
> On topic: my two EVGA SC cards are delayed by 12 days from EVGA, wth guys! I want to install my blocks. Everything else is set up and just waiting!


*Off topic:*
I went against my principles and had a look in the 1080 thread over at Sweclockers... right.
I saw that you, Bogga, had posted today about getting SLI 1080 cards, and, like a letter in the mail, the very next message was:
Quote:


> Originally Posted by *TaunyTiger*
> Behövs verkligen 2 kort? ("Are two cards really needed?")


Nah, to me Sweclockers is a joke of a forum.

*back on topic:*

Bogga, where did you order your 1080 from?


----------



## MCFC

Quote:


> Originally Posted by *Zurv*
> 
> nothing right now! NVidia ungimp tool isn't out yet so only 2 are being used. But this PC has a 4k 32" gsync Acer screen and the other PC is a 65" 4k HDR OLED TV.
> 
> I'm looking forward to DP 1.3 (or 1.4) 32"+ 4k screens with over 60hz. Maybe that Dell 30" 4k OLED will be released at some point. Nvidia has had one for testing for a few months.


New York City rent plus the spare cash to buy four 1080s and big-ass 4K screens... must be nice.


----------



## Nark96

Quote:


> Originally Posted by *Spiriva*
> 
> Are you sure you're not a Swede, or is this a European thing? This crazy envy that other people can and do buy things you can't? And that you feel the need to tell people that they're wasting money, and that an AMD card from the year 2000 is the way to go.
> 
> This is the main reason I stay away from the Swedish forum "sweclockers", because it's just a bunch of crying kids telling everyone who bought something they can't afford how silly, dumb and wasteful it is.
> 
> I really hope this forum doesn't turn into that.


I can have an opinion on the matter just as anyone else can. Some people on here are so pathetic... I was simply saying 3/4 SLI is a big no for gaming. The cons outweigh the benefits. I wasn't telling anyone what they should do with their money so I don't know how you just assumed I was implying that.


----------



## pez

Quote:


> Originally Posted by *Nark96*
> 
> Do whatever you want with your money... but many would agree that it's just being wasted with overkill components lol.
> I still stand by what I said, and have always believed in. Anything past 2 way SLI for gaming is just ridiculous. You'll have more disadvantages than benefits. More heat, more power withdrawal, and you'll be out of pocket by a huge amount to just "show off" but like I said, it's your money bud


Quote:


> Originally Posted by *Nark96*
> 
> the word 'ignorant' comes to mind when I read your replies... quite sad really. Look, by all means do whatever you want with your money. You could set all those GPUs on fire and I couldn't care less because they aren't mine. I can have my opinion just like you can have yours... same with anyone else. Freedom of speech, buddy. FYI I'm waiting for the aftermarket/non-reference 1080s to then get rid of my 780 Classified. Till then I'll be wise with my money and stick to what I have.


Quote:


> Originally Posted by *Jpmboy*
> 
> Ignorant? LOL, _I_ sense envy.
> Some folks buy a car with 650HP and not for the track. Got a problem with that??
> 
> 4-way has its benefits; if you game at 4K and above, it is not overkill... if the game and NV can support it. Driver optimization for 4-way (and 3-way) gaming has been lacking since the release of Maxwell. This is driven by market segmentation, which has been declining in HEDT in general and more so at the 4-GPU level. I don't know what business you are in, but chasing the "whales" out has its consequences.
> fercrissakes... if ya can't run with the big dogs, don't get off the porch.


Quote:


> Originally Posted by *Nark96*
> 
> I can have an opinion on the matter just as anyone else can. Some people on here are so pathetic... I was simply saying 3/4 SLI is a big no for gaming. The cons outweigh the benefits. I wasn't telling anyone what they should do with their money so I don't know how you just assumed I was implying that.


The fact that he's given you examples of games that scale and let him max out (more than 3 games at that) their settings at 4K is enough for it to obviously be worth it to him.

It's ok to have an opinion, but you're speaking it as if it's gospel. He's water cooling them, and besides, these cards use ~200 W apiece. That's only a bit more than some BIOS-modded 2-way SLI 780s I've seen. You're literally at the point of making stuff up just to make him feel like he wasted money. There are plenty of reviews out that show the scaling and benefits.

Since he's willing to drop the cash on the cards, and the time to tweak them to scale, what are you even mad for? Are you even planning to get a GTX 1080, or are you only here to troll and be _green with envy_?


----------



## Nark96

Quote:


> Originally Posted by *pez*
> 
> The fact that he's given you examples of games that scale and let him max out (more than 3 games at that) their settings at 4K is enough for it to obviously be worth it to him.
> 
> It's ok to have an opinion, but you're speaking it as if it's gospel. He's water cooling them, and besides, these cards use ~200 W apiece. That's only a bit more than some BIOS-modded 2-way SLI 780s I've seen. You're literally at the point of making stuff up just to make him feel like he wasted money. There are plenty of reviews out that show the scaling and benefits.
> 
> Since he's willing to drop the cash on the cards, and the time to tweak them to scale, what are you even mad for? Are you even planning to get a GTX 1080, or are you only here to troll and be _green with envy_?


Nope, I was just voicing my opinion on the matter. I don't know why everyone's so mad. I simply said 3/4-way SLI is pointless for gaming and a waste of money... at least for the majority of games out there. I'm not saying what he should do with his money or what to buy. It's his choice; he can do whatever he wants with it. I was just pointing out what I know. I'm not going to waste any more time explaining myself time and time again. If you must know, yes, I'm planning to buy a 1080 as soon as the non-reference designs are released and the prices stabilise, since the FE GPUs are a waste of money imo. Till then I'm sticking with my 780 Classy.


----------



## SweWiking

Quote:


> Originally Posted by *Nark96*
> 
> Nope, I was just voicing my opinion on the matter. I don't know why everyone's so mad. I simply said 3/4-way SLI is pointless for gaming and a waste of money... at least for the majority of games out there. I'm not saying what he should do with his money or what to buy. It's his choice; he can do whatever he wants with it. I was just pointing out what I know. I'm not going to waste any more time explaining myself time and time again. If you must know, yes, I'm planning to buy a 1080 as soon as the non-reference designs are released and the prices stabilise, since the FE GPUs are a waste of money imo. Till then I'm sticking with my 780 Classy.


You should stick with your 780 forever instead, so you don't spend money on new things. You know, waste cash and all.

Go away, you troll!


----------



## pez

But he and others are basing their 'opinions' on facts that are proven by reviews. And the cons you gave are pretty much untrue here given the circumstances. If you're not seeing how your tone is coming across, then I'm not sure what to tell ya.


----------



## pez

So let me raise an argument then:

Are you planning on upgrading your monitor? If not, you're completely wasting your money buying a 1080 over a 1070 for 1080p...completely my opinion, though.

I hope you're able to see where others are coming from, now.


----------



## Nark96

Quote:


> Originally Posted by *pez*
> 
> So let me raise an argument then:
> 
> Are you planning on upgrading your monitor? If not, you're completely wasting your money buying a 1080 over a 1070 for 1080p...completely my opinion, though.
> 
> I hope you're able to see where others are coming from, now.


Not that it concerns you, but yes I am. I'm going for 1440p or an ultrawide (2560x1080). No, it's fine that you have an opinion on that. Again, some would agree with you and some would disagree; a single 1080 will let me max out the majority of games at 1080p. But hey, that's your opinion.


----------



## pez

Quote:


> Originally Posted by *Nark96*
> 
> not that it concerns you, but yes I am. I'm going for a 1440p or an ultra wide (2560x1080).


So now you're acting offended that we're questioning you and criticizing your plans of purchase/upgrade path? I mean, even then, a GTX 1070 is more than enough... IMO.

I hope you understand that the point I'm making here is that this is a forum where people want performance; and while we may not understand everyone's setup, there's no use criticizing it in the way that you, or I (in these past two posts), have been.

Who cares if you want to use a 1080 in 16:9 or 21:9 1080p? I think it's awesome, and with 240Hz monitors on the horizon, this actually makes sense to me. Even if it didn't, who cares?! It's _your_ money and _your_ prerogative.

In the end, we all need to understand that we are a community here and this thread is not here for this kinda stuff. Get your 1080s, overclock them, and bench and/or game with them!


----------



## Nark96

Quote:


> Originally Posted by *pez*
> 
> So now you're acting offended that we're questioning you and criticizing your plans of purchase/upgrade path? I mean, even then, a GTX 1070 is more than enough... IMO.
> 
> I hope you understand that the point I'm making here is that this is a forum where people want performance; and while we may not understand everyone's setup, there's no use criticizing it in the way that you, or I (in these past two posts), have been.
> 
> Who cares if you want to use a 1080 in 16:9 or 21:9 1080p? I think it's awesome, and with 240Hz monitors on the horizon, this actually makes sense to me. Even if it didn't, who cares?! It's _your_ money and _your_ prerogative.
> 
> In the end, we all need to understand that we are a community here and this thread is not here for this kinda stuff. Get your 1080s, overclock them, and bench and/or game with them!


I do understand what you're saying. Fair enough I suppose. But I hope you agree we're all allowed to voice our opinions. Just as you had your say, I had mine. Same with anyone else


----------



## pez

Quote:


> Originally Posted by *Nark96*
> 
> I do understand what you're saying. Fair enough I suppose. But I hope you agree we're all allowed to voice our opinions. Just as you had your say, I had mine. Same with anyone else


I do. It just came off harsh, and that's why everyone jumped down your throat.

EDIT: BTW, I took a look at your rig pictures and it's very clean! Nice job.


----------



## romanlegion13th

Quote:


> Originally Posted by *Noshuru*
> 
> I guess you don't know how quotes work either. I was saying the the guy you quoted has no idea what he's talking about.


How is a cut-down card faster? It's not, so come on; the Titan X was the king until this card came out.


----------



## grimboso

EK just announced waterblocks for the FTW and Classy cards on Instagram!


----------



## MrDerrikk

Quote:


> Originally Posted by *grimboso*
> 
> EK just announced waterblocks for the FTW and Classy cards on Instagram!


Whoa! Got a link?


----------



## sirleeofroy

Quote:


> Originally Posted by *MrDerrikk*
> 
> Whoa! Got a link?


https://www.ekwb.com/news/official-list-of-ek-water-blocks-for-gtx-1080-series/


----------



## grimboso

Quote:


> Originally Posted by *MrDerrikk*
> 
> Whoa! Got a link?


https://www.instagram.com/ekwaterblocks/


http://instagr.am/p/BGbigjKpIhc%2F/


----------



## grimboso

Quote:


> Contrary to initial statements about EVGA water blocks, all disputes have been resolved and we are officially announcing the following lineup of GeForce® GTX 1080 water blocks:


This is very good news.


----------



## MrDerrikk

Quote:


> Originally Posted by *sirleeofroy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrDerrikk*
> 
> Whoa! Got a link?
> 
> 
> 
> https://www.ekwb.com/news/official-list-of-ek-water-blocks-for-gtx-1080-series/

I'm so glad I went with the FTW and not the Superclocked now.


----------



## JackCY

Quote:


> Originally Posted by *Zurv*
> 
> sadly my PC looks 100% the same as when 4 Titan Xs were in there with a 5960X... but here it is with 4 1080s and a 6950X (WHERE IS THE UNLOCK NVIDIA!!!!)


It doesn't exist; there's no support in Nvidia's drivers for 3+ SLI. You have to rely on application developers using low-level APIs to implement support for 3+ GPUs.
On the other hand, I'm open to receiving donations of unused 1080s from 3+ SLI setups.


----------



## grimboso

Quote:


> Originally Posted by *MrDerrikk*
> 
> I'm so glad I went with the FTW and not the Superclocked now.


My SC cards hadn't shipped, so I just called and changed the order to FTW. Now I have two FTWs arriving next week! #hype


----------



## sirleeofroy

Quote:


> Originally Posted by *MrDerrikk*
> 
> I'm so glad I went with the FTW and not the Superclocked now.


I went with the FTW from the off; dual BIOS, 10+2 power phases, and 2x 8-pin for not a lot more just made sense to me.


----------



## MrDerrikk

I would have gone with the Gigabyte G1 originally but it looks very ugly imo. The EVGA ones are really the closest to my tastes visually


----------



## SynchroSCP

Quote:


> Originally Posted by *MrDerrikk*
> 
> I'm so glad I went with the FTW and not the Superclocked now.


Sigh... the exact opposite for me. I ordered the SC just for the block support... I'd rather have the FTW, but then again I have the SC and block in hand today, so there's that.


----------



## grimboso

Quote:


> Originally Posted by *sirleeofroy*
> 
> I went with the FTW from the off; dual BIOS, 10+2 power phases, and 2x 8-pin for not a lot more just made sense to me.


I have never flashed a BIOS, so the FTW cards will be the first ones I do it to.
I expect that one of the two slots contains the original BIOS, and the secondary slot is open for us to put whatever we want there?

I also assume that putting a custom BIOS in the secondary slot does not void the warranty in any way? At least if I use a BIOS that only unlocks the power target and does not go over the limit of 1.25 V?


----------



## sirleeofroy

Quote:


> Originally Posted by *SynchroSCP*
> 
> Sigh...the exact opposite for me, ordered SC just for the block support...would rather have the FTW but then again I have SC and block in hand today so theres that.


I wouldn't be too worried; looking at the overclocking performance of some of the custom boards, it looks like they clock no better than the FE versions anyway, worse in some cases. It seems that a well-binned chip will be more important for now, until the 16nm process matures a bit...


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> So let me raise an argument then:
> 
> Are you planning on upgrading your monitor? If not, you're completely wasting your money buying a 1080 over a 1070 for 1080p...completely my opinion, though.
> 
> I hope you're able to see where others are coming from, now.


1080P.. what's that? oh... my old TV set. I get it.









anyway, let's move on...


----------



## sirleeofroy

Quote:


> Originally Posted by *grimboso*
> 
> I have never flashed a BIOS, so the FTW cards will be the first ones I do it to.
> I expect that one of the two slots contains the original BIOS, and the secondary slot is open for us to put whatever we want there?
> 
> I also assume that putting a custom BIOS in the secondary slot does not void the warranty in any way? At least if I use a BIOS that only unlocks the power target and does not go over the limit of 1.25 V?


I suspect that is the case; there is a switch on board to change which BIOS is used, but I'm unsure of how much the warranty covers in terms of custom BIOSes.


----------



## grimboso

Quote:


> Originally Posted by *sirleeofroy*
> 
> I suspect that is the case; there is a switch on board to change which BIOS is used, but I'm unsure of how much the warranty covers in terms of custom BIOSes.


As I have no previous experience with dual-BIOS cards, and only found an interest in watercooling and overclocking about a year ago (I am still a pleb, but always trying to learn), I have never delved much into custom BIOSes. From a corporate standpoint I can see why you would not honour the warranty for a custom BIOS, as users can do a lot of harm if they pour too much voltage or power into the chip.
But on the other hand, if you don't want your users to use a custom BIOS, why would you supply a card with two of them, and an easy way of switching between them?

I sent an email to them asking. It might be a lost cause, but some official response would be positive. Can any of you well-educated, experienced clockers shed some light on a pleb?


----------



## sirleeofroy

Quote:


> Originally Posted by *grimboso*
> 
> As I have no previous experience with dual-BIOS cards, and only found an interest in watercooling and overclocking about a year ago (I am still a pleb, but always trying to learn), I have never delved much into custom BIOSes. From a corporate standpoint I can see why you would not honour the warranty for a custom BIOS, as users can do a lot of harm if they pour too much voltage or power into the chip.
> But on the other hand, if you don't want your users to use a custom BIOS, why would you supply a card with two of them, and an easy way of switching between them?
> 
> I sent an email to them asking. It might be a lost cause, but some official response would be positive. Can any of you well-educated, experienced clockers shed some light on a pleb?


So I found these on the EVGA FAQ pages...

Question / Issue

Does overclocking void my warranty?

Answer / Solution

Overclocking our products does not void the warranty as long as there is no physical damage to the product or missing components. However EVGA Support will not be able to assist you in overclocking the product.

You may post in our forums, where other EVGA.com community members would be able to give you advise on how to overclock.
Please visit: forums.evga.com/

http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=55

Question / Issue

Does water-cooling or installing a third party cooling solution on my video card void the warranty?

Answer / Solution

Installing third party cooling solutions does not void warranty on our products. Just be sure to keep the original cooling solution as it will have to be on the card if it is ever sent in for RMA. Any physical damage such as burn marks, water residue or damage, or any damage to the PCB will void ALL warranties.

http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=57720

So it looks like, as long as nothing is physically damaged, you're good.

Can anyone else weigh in on this?


----------



## grimboso

Quote:


> Originally Posted by *sirleeofroy*
> 
> So I found these on the EVGA FAQ pages.....
> 
> Question / Issue
> 
> Does overclocking void my warranty?
> 
> Answer / Solution
> 
> Overclocking our products does not void the warranty as long as there is no physical damage to the product or missing components. However EVGA Support will not be able to assist you in overclocking the product.
> 
> You may post in our forums, where other EVGA.com community members would be able to give you advise on how to overclock.
> Please visit: forums.evga.com/
> 
> http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=55
> 
> Question / Issue
> 
> Does water-cooling or installing a third party cooling solution on my video card void the warranty?
> 
> Answer / Solution
> 
> Installing third party cooling solutions does not void warranty on our products. Just be sure to keep the original cooling solution as it will have to be on the card if it is ever sent in for RMA. Any physical damage such as burn marks, water residue or damage, or any damage to the PCB will void ALL warranties.
> 
> http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=57720
> 
> So it looks like that as long as nothing is physically damaged, you're good.
> 
> Can anyone else weigh in on this?


What I also found is one of the reasons I chose to use EVGA: they allow waterblocks and they allow overclocking! The lack of information regarding custom BIOSes, however, is what bothers me.

If the only requirement is that the card is physically undamaged, then that is mighty fine with me. If I were to fry the GPU with too much voltage, it would probably leave marks anyway.


----------



## grimboso

Quote:


> Originally Posted by *sirleeofroy*
> 
> lots of text


Got this answer from [email protected]
Quote:


> The second BIOS on the card is not meant to be used for custom BIOS; as long as you restore the original BIOS for any type of warranty replacement then your warranty will stand, but we strongly do not [recommend] running custom BIOS on the card as it can push components beyond what they are designed to run.


----------



## Zurv

Quote:


> Originally Posted by *sirleeofroy*
> 
> So I found these on the EVGA FAQ pages.....
> 
> Question / Issue
> 
> Does overclocking void my warranty?
> 
> Answer / Solution
> 
> Overclocking our products does not void the warranty as long as there is no physical damage to the product or missing components. However EVGA Support will not be able to assist you in overclocking the product.
> 
> You may post in our forums, where other EVGA.com community members would be able to give you advise on how to overclock.
> Please visit: forums.evga.com/
> 
> http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=55
> 
> Question / Issue
> 
> Does water-cooling or installing a third party cooling solution on my video card void the warranty?
> 
> Answer / Solution
> 
> Installing third party cooling solutions does not void warranty on our products. Just be sure to keep the original cooling solution as it will have to be on the card if it is ever sent in for RMA. Any physical damage such as burn marks, water residue or damage, or any damage to the PCB will void ALL warranties.
> 
> http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=57720
> 
> So it looks like that as long as nothing is physically damaged, you're good.
> 
> Can anyone else weigh in on this?


Oh.. I've gotten funky with many an EVGA card and they will take 'em for RMA. I just put the cooler back on and flash back the BIOS it came with. This is why I always try to buy EVGA.


----------



## sirleeofroy

Quote:


> Originally Posted by *grimboso*
> 
> Got this answer from [email protected]
> 
> Quote:
> The second BIOS on the card is not meant to be used for custom BIOS; as long as you restore the original BIOS for any type of warranty replacement then your warranty will stand, but we strongly do not [recommend] running custom BIOS on the card as it can push components beyond what they are designed to run.


Quote:


> Originally Posted by *Zurv*
> 
> Oh.. I've gotten funky with many an EVGA card and they will take 'em for RMA. I just put the cooler back on and flash back the BIOS it came with. This is why I always try to buy EVGA.


Sounds good to me!

So as long as you don't snap the card in half and it has the original BIOS, you can more or less do what you want!


----------



## grimboso

Quote:


> Originally Posted by *sirleeofroy*
> 
> Sounds good to me!
> 
> So as long as you don't snap the card in half and it has the original BIOS, you can more or less do what you want!


Quote:


> Originally Posted by *Zurv*
> 
> Oh.. I've gotten funky with many an EVGA card and they will take 'em for RMA. I just put the cooler back on and flash back the BIOS it came with. This is why I always try to buy EVGA.


Sounds like pretty darn good terms if you ask me!


----------



## pez

Yep. EVGA treated me very well through my GTX 780 RMA(s). My first replacement lasted all of a week, and after the second RMA they sent me back a GTX 970. A small performance bump and slightly more VRAM for my troubles. And it was Xmas time, too. Great customer service that doesn't BS you around, and that respects and recognizes that you know what you're talking about.

I would honestly have gone with them if the FE cards weren't so much more than the AIB offerings, and if that cooler hadn't been so (definitely IMO) atrocious. Maybe pictures don't do it justice, but who knows. I was also a tad disappointed that the G1's stock boost/OC is equivalent to the FTW's, for SC price, which was the final reason for cancelling my preorder.


----------



## Bogga

Quote:


> Originally Posted by *Spiriva*
> 
> Bogga, where did you order your 1080 from ?


Ordered from Inet... they had a couple of cards coming in today... but of course they got delayed until Sunday.


----------



## TK421

So if the Maxwell trend holds (as demonstrated by Kingpin himself), the OC potential of these cards mostly comes down to luck. Custom PCBs like the Classy, G1 Extreme, Zotac PGF, Kingpin, and Lightning aren't going to improve OC by a significant margin, but will cost substantially more than the $599 MSRP (e.g. the MSI Aero 1080).

What do you guys think?


----------



## Spiriva

From Guru3D, Nvidia GeForce GTX 1080 BIOS:
Quote:


> Originally Posted by *MichaelT*
> There are two of these... They appear to be for MSi
> The second has a newer date than the first.
> 
> http://www.station-drivers.com/index.php?option=com_remository&Itemid=353&func=fileinfo&id=2260&lang=fr
> 
> http://www.station-drivers.com/index.php?option=com_remository&Itemid=352&func=fileinfo&id=2261&lang=fr
> 
> Michael T


----------



## Asus11

Quote:


> Originally Posted by *TK421*
> 
> So if the Maxwell trend holds (as demonstrated by Kingpin himself), the OC potential of these cards mostly comes down to luck. Custom PCBs like the Classy, G1 Extreme, Zotac PGF, Kingpin, and Lightning aren't going to improve OC by a significant margin, but will cost substantially more than the $599 MSRP (e.g. the MSI Aero 1080).
> 
> What do you guys think?


I think grab an Aero and put it on water; that's the first thing that came to my head when I saw it.


----------



## TK421

Quote:


> Originally Posted by *Asus11*
> 
> I think grab an Aero and put it on water; that's the first thing that came to my head when I saw it.


Yea, I'm going to abuse the Micro Center return policy until a $599 card hits there.


----------



## bfedorov11

Quote:


> Originally Posted by *Zurv*
> 
> I just used the latest version (5.287)
> http://www.techpowerup.com/downloads/Utilities/BIOS_Modding/ (which isn't working for me now)
> 
> What was odd was I did the dance: protectionoff.. flash.. blah blah..
> Shutdown.. looked at the cards and they looked the same. But 5 min ago I checked GPU-Z again.. and poof.. all are EVGA SC. Very strange. Everything works fine. Confirmed that base and boost match the SC.


Where did you DL the BIOS? Edit: found it. How did you get around the cert 2.0 error?
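For reference, the "dance" Zurv describes maps to roughly the sequence below. This is a sketch only: the exact nvflash flag names vary between builds (check `nvflash --help` on yours), the ROM filenames are placeholders, and the cert 2.0 error suggests stock nvflash may refuse unsigned Pascal BIOS images entirely. `DRY_RUN` just prints the commands instead of touching any hardware.

```python
import subprocess

# Hypothetical sketch of the flash procedure described in the posts above.
# Flag names are assumptions based on common nvflash builds; filenames
# (stock.rom, evga_sc.rom) are placeholders.
DRY_RUN = True

def run(cmd):
    """Print the command in dry-run mode; otherwise execute it."""
    if DRY_RUN:
        print("would run:", " ".join(cmd))
        return 0
    return subprocess.call(cmd)

steps = [
    ["nvflash", "--save", "stock.rom"],  # back up the BIOS the card shipped with
    ["nvflash", "--protectoff"],         # the "protectionoff" step
    ["nvflash", "-6", "evga_sc.rom"],    # flash, overriding the subsystem ID mismatch
]

for cmd in steps:
    run(cmd)
```

Only flip `DRY_RUN` off if you actually intend to flash, and keep the saved `stock.rom` around for any RMA.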


----------



## r0l4n

Quote:


> Originally Posted by *Zurv*
> 
> if it makes you feel any better, that is around where the card poos out on me too. It makes me feel better that others have the same issues. (I have 8 cards so I'm sure some MUST be better.. but I'm not redoing all the loops again.)
> 
> I even flashed all my cards to EVGA FE SC... it didn't help, but I feel a small victory knowing my base speed is a little faster.


Do you use a custom fan profile? I reckon the SC BIOS is made for the ACX cooler, which is programmed for lower RPM at the same temps.


----------



## Outcasst

Quote:


> Originally Posted by *r0l4n*
> 
> Do you use a custom fan profile? I reckon the SC BIOS is made for the ACX cooler, which is programmed for lower RPM at the same temps.


The SC BIOS at 100% fan speed is equal to the FE BIOS at 75%.
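If that equivalence scales linearly (an assumption: the only data point in the post is SC at 100% matching FE at 75%, and real fan curves are unlikely to be perfectly linear), converting an SC-BIOS fan setting to its rough FE-BIOS equivalent is just a scale:

```python
# Rough SC-BIOS -> FE-BIOS fan percentage conversion, assuming the
# "100% SC == 75% FE" data point from the post scales linearly.
def sc_to_fe_percent(sc_percent):
    return sc_percent * 0.75

print(sc_to_fe_percent(100))  # 75.0
print(sc_to_fe_percent(60))   # 45.0
```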


----------



## Agent Smith1984

Bad news, rich people!
http://arstechnica.com/gadgets/2016/06/bad-news-rich-people-you-wont-be-able-to-use-your-gtx-1080-in-4-way-sli/

They reeled you in with their "special key" garbage only to leave you with two cards you can't use and can't refund (most places are exchange-only on high-end cards).

What a cruel thing to do. I assume they figured these rich people buying 4 cards don't care if they lose the money anyway.


----------



## SsXxX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Bad news rich people!
> http://arstechnica.com/gadgets/2016/06/bad-news-rich-people-you-wont-be-able-to-use-your-gtx-1080-in-4-way-sli/
> 
> They reeled you in with their "special key" garbage only to leave you with two cards you can't use and can't refund (most places are exchange-only on high-end cards).
> 
> What a cruel thing to do. I assume they figured these rich people buying 4 cards don't care if they lose the money anyway.


lol, seriously they f*cked rich people, but I'm not surprised by this kind of thing coming from Nvidia; they've made a habit of screwing their customers, at least lately they do it a lot!

And you know what, I think 3/4-way SLI is useless for gaming anyway; moreover, even 2-way SLI is not that much of a benefit, as it is not supported by many of the new games, to say nothing of 2-way SLI's issues with heat, power draw and, most importantly, STUTTERING/MICRO-STUTTERING!!

I myself have always been a fan of having the absolute best-performing single GPU.

I now have a Gigabyte G1 GTX 1080 overclocked to 2025MHz, fully stable; temps below 67C at max load and, most importantly, ZERO THROTTLING.

I love you, Gigabyte G1


----------



## gree

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Bad news rich people!
> http://arstechnica.com/gadgets/2016/06/bad-news-rich-people-you-wont-be-able-to-use-your-gtx-1080-in-4-way-sli/
> 
> They reeled you in with their "special key" garbage only to leave you with two cards you can't use and can't refund (most places are exchange-only on high-end cards).
> 
> What a cruel thing to do. I assume their attitude was "these rich people buying 4 cards don't care if they lose the money anyway".


Well that's dirty, screwing your most loyal customers.

Back to the 900 series/Titan X for the quad-SLI-ers? Or are you guys just going to settle for 2 cards?


----------



## ChevChelios

anyone have any experience with the Palit 1080 GameRock Premium Edition? http://www.palit.com/palit/vgapro.php?id=2614&lang=en&pn=NEB1080H15P2-1040G&tab=sp

how is that cooler?

how is Palit quality in general?

between the Palit 1080 GameRock and the Gigabyte 1080 G1 Gaming, which would you choose? The GameRock has 8+6 pin power (and dual bios!), the G1 has a single 8-pin... but maybe the G1 has the better cooler?

just reading the specs the Palit GameRock 1080 looks pretty good... 8+6 pins, 8+2-phase power, dual bios, backplate, OK-looking cooler, decent factory OC...

I can get it right now for 699 EUR, which frankly seems a steal in Europe at launch


----------



## Jpmboy

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Bad news rich people!
> http://arstechnica.com/gadgets/2016/06/bad-news-rich-people-you-wont-be-able-to-use-your-gtx-1080-in-4-way-sli/
> 
> They reeled you in with their "special key" garbage only to leave you with two cards you can't use and can't refund (most places are exchange-only on high-end cards).
> 
> What a cruel thing to do. I assume their attitude was "these rich people buying 4 cards don't care if they lose the money anyway".


Quote:


> Originally Posted by *SsXxX*
> 
> lol, seriously they f*cked rich people, but I'm not surprised by this kind of thing coming from NVIDIA; they seem to have made a habit of screwing their customers, at least lately they do it a lot!
> 
> and you know what, I think 3-4 way SLI is useless for gaming anyway. Even 2-way SLI isn't that much of a benefit, since many new games don't support it, to say nothing of the heat, the power draw and most importantly STUTTERING/MICRO-STUTTERING!!
> 
> I myself have always been a fan of having the absolute best performing single GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I now have a Gigabyte G1 GTX 1080 overclocked to 2025MHz, fully stable; temps below 67C at max load and most importantly ZERO THROTTLING
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I love you, Gigabyte G1


yesterday's news, kids. try to keep up.


----------



## TK421

Zero difference between the GameRock and the G1 except visuals

You don't need a beefy VRM to OC, you need luck from the silicon gods


----------



## SsXxX

Quote:


> Originally Posted by *TK421*
> 
> Zero difference between the GameRock and the G1 except visuals
> 
> You don't need a beefy VRM to OC, you need luck from the silicon gods


amen to that


----------



## ChevChelios

and Palit GPUs are reliable/OK quality? Gigabyte is the same/better?


----------



## SsXxX

Quote:


> Originally Posted by *ChevChelios*
> 
> and Palit GPUs are reliable/OK quality? Gigabyte is the same/better?


no experience with Palit to be honest, but to my knowledge the best are these three: EVGA, Asus, Gigabyte


----------



## AllGamer

Quote:


> Originally Posted by *ChevChelios*
> 
> and Palit GPUs are reliable/OK quality? Gigabyte is the same/better?


from personal experience, if you are buying reference (Founders Edition) versions, it's safe to buy from any manufacturer, even the less-known brands.

but when it comes to custom fans or alternative cooling solutions and factory-OC'd versions, I only pick from ASUS, MSI, EVGA and Gigabyte


----------



## SsXxX

Quote:


> Originally Posted by *AllGamer*
> 
> from personal experience, if you are buying reference (Founders Edition) versions, it's safe to buy from any manufacturer, even the less-known brands.
> 
> but when it comes to custom fans or alternative cooling solutions and factory-OC'd versions, I only pick from ASUS, MSI, EVGA and Gigabyte


totally true









and yes, MSI is on par with the other 3 mentioned, how did I forget to mention them lol


----------



## ChevChelios

I see

well I'll probably grab the G1 Gaming 1080 then

cheers


----------



## gree

Galax and Ichill/inno3d have good cards too

They don't sell palit in my country so no clue about them

Here's a list of the 1080s

http://www.techpowerup.com/gpudb/2839/geforce-gtx-1080


----------



## Testier

Has anyone broken the 2.2ghz barrier on air yet?


----------



## SsXxX

Quote:


> Originally Posted by *Testier*
> 
> Has anyone broken the 2.2ghz barrier on air yet?


I doubt anyone will be able to without a heavily modified bios and a seriously beefy custom PCB

in my case 2025MHz was very easily stable with perfect temps (below 67C with 70% max fan speed); my limit was the power limit


----------



## stocksux

Quote:


> Originally Posted by *grimboso*
> 
> Ek just announced waterblocks for the FTW and Classy cards on Instagram!


Super good news!!!!! I hope they somehow figure out a way to manufacture the blocks so that the original RGB on most of these AIB 1080s stays usable. I see everyone going after the EVGA cards, but I like the backplate of the Strix with its RGB design, plus my build is a ROG build. How come people seem to be all about EVGA?


----------



## Bogga

Quote:


> Originally Posted by *stocksux*
> 
> Super good news!!!!! I hope they somehow figure out a way to manufacture the blocks so that the original RGB on most of these AIB 1080s stays usable. I see everyone going after the EVGA cards, but I like the backplate of the Strix with its RGB design, plus my build is a ROG build. How come people seem to be all about EVGA?


Warranties when removing cooler


----------



## TK421

Quote:


> Originally Posted by *Bogga*
> 
> Warranties when removing cooler


Yep, EVGA honors the warranty when you remove the cooler, which is perfect for watercooling folks.

I'm waiting on the MSI Aero 1080 at 599 USD; until then I'll just keep returning this 1080 FE to Micro Center complaining about fan noise and coil whine lol.


----------



## SsXxX

just played around in my case to improve airflow and cooling: sorted the wires here and there, relocated the GPU to a higher slot, and moved the sound card and the Intel 750 PCIe SSD as far as possible from the GPU. Here is what happened: my Gigabyte G1 Gaming runs cooler now. Max temp is 55C at 100% load (tested with ASUS RealBench OpenCL) while max fan speed is 66%. Of course the GPU clock is still the same 2025MHz, and again I confirm it was 2025MHz 99% of the time with extremely minimal drops to 2012MHz (those drops lasted just a fraction of a second; I wouldn't have noticed them if not for GPU-Z, and in gaming it was 2025MHz all the time, ZERO throttle below that; tried Shadow of Mordor maxed out at 4K for about 45 mins)

now I achieved that with the OC profile in Gigabyte Xtreme Engine, with both power and temp target maxed out but at STOCK VOLTAGE, and per GPU-Z the perfcap reason was VRel, which means the GPU voltage . . . . hmm, since my temps are cool, how about I play a bit with the voltage? how risky is it to increase the GPU voltage? I'm nervous as the card is still new, and I have heard horror stories about overvolted GPUs dying quickly, unlike increasing CPU voltage, which seems a lot safer

suggestions?


----------



## Jpmboy

Quote:


> Originally Posted by *SsXxX*
> 
> just played around in my case to improve airflow and cooling: sorted the wires here and there, relocated the GPU to a higher slot, and moved the sound card and the Intel 750 PCIe SSD as far as possible from the GPU. Here is what happened: my Gigabyte G1 Gaming runs cooler now. Max temp is 55C at 100% load (tested with ASUS RealBench OpenCL) while max fan speed is 66%. Of course the GPU clock is still the same 2025MHz, and again I confirm it was 2025MHz 99% of the time with extremely minimal drops to 2012MHz (those drops lasted just a fraction of a second; I wouldn't have noticed them if not for GPU-Z, and in gaming it was 2025MHz all the time, ZERO throttle below that; tried Shadow of Mordor maxed out at 4K for about 45 mins)
> 
> now I achieved that with the OC profile in Gigabyte Xtreme Engine, with both power and temp target maxed out but at STOCK VOLTAGE, and per GPU-Z the perfcap reason was VRel, which means the GPU voltage . . . . hmm, since my temps are cool, how about I play a bit with the voltage? how risky is it to increase the GPU voltage? I'm nervous as the card is still new, and I have heard horror stories about overvolted GPUs dying quickly, unlike increasing CPU voltage, which seems a lot safer
> 
> suggestions?


low risk to raise the voltage using the OEM tools. the core will downclock if it gets too warm.


----------



## ChevChelios

pretty sure increasing voltage to what a program like Gigabyte Xtreme Engine, EVGA Precision or MSI Afterburner allows isn't dangerous, it's part of the OC headroom they offer

but you can also check the 1080 OC guide on Guru3D: http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,1.html


----------



## traxtech

Anyone else finding voltage does absolutely nothing on their card? WTB vbios plx


----------



## bfedorov11

Quote:


> Originally Posted by *Outcasst*
> 
> SC Bios at 100% is equal to the FE Bios at 75%.


Perfect! That is my cutoff for sound on the FE card. Got my card flashed to the SC bios. Stays around 1950MHz without touching anything.
Quote:


> Originally Posted by *traxtech*
> 
> Anyone else finding voltage does absolutely nothing on their card? WTB vbios plx


Yeah, I don't even see an increase in Afterburner. It must be Boost 3.0; it doesn't matter what you set it at, it does its own thing. I even tried the PT mod... CLU on the shunt resistors... and it didn't do a damn thing. The only thing you can do is put the card under water for slightly higher and more stable clocks. But since most cards will run 2050 while gaming, the gains of gaming at 2150 are almost nonexistent.


----------



## r0l4n

Quote:


> Originally Posted by *bfedorov11*
> 
> Perfect! That is my cutoff for sound on the FE card. Got my card flashed to the SC bios. Stays around 1950MHz without touching anything.


What temperature limit have you set? What kind of case cooling and ambient temperatures do you have? If the SC bios at 100% fan is equal to the FE bios at 75%, the temperature limit should be hit earlier with the SC bios, and therefore it should throttle a bit more. This is just me adding two and two; I haven't really tested it, I left for vacation just before bios flashing was tried on the 1080s.
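The "adding two and two" above can be sketched in a few lines. This is just the linear reading of the earlier "SC bios at 100% equals FE bios at 75%" observation; the 0.75 ratio and the assumption that it scales linearly across the whole duty range are assumptions, not measurements:

```python
# Hypothetical sketch: map an SC-bios fan duty cycle to the FE-bios duty
# that should move similar air, assuming the relationship is linear.
SC_TO_FE_RATIO = 0.75  # from "SC bios at 100% == FE bios at 75%"

def sc_duty_as_fe(sc_duty_pct: float) -> float:
    """FE-equivalent duty for a given SC duty (both in percent)."""
    return sc_duty_pct * SC_TO_FE_RATIO

def fe_duty_as_sc(fe_duty_pct: float) -> float:
    """SC duty needed to match a given FE duty; capped at 100%."""
    return min(100.0, fe_duty_pct / SC_TO_FE_RATIO)
```

On this model the SC bios simply cannot match FE duties above 75%, which is exactly why the temperature limit would be hit earlier on the SC bios.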


----------



## bfedorov11

Quote:


> Originally Posted by *r0l4n*
> 
> What temperature limit have you set? What kind of case cooling and ambient temperatures do you have? If the SC bios at 100% fan is equal to the FE bios at 75%, the temperature limit should be hit earlier with the SC bios, and therefore it should throttle a bit more. This is just me adding two and two; I haven't really tested it, I left for vacation just before bios flashing was tried on the 1080s.


Max is 92. It's a Lian Li PC-V359WX with 4 intake fans (AF120) on the sides; two are blowing directly on the card. While running 3DMark, temps rarely go over 60. I need to double-check it though. Maybe I removed too much CLU... not the kind of stuff you want dripping around your mobo. This was also with the non-SC bios and the fan at 4K. After doing the mod, I couldn't even match my best FS Ultra. I thought speeds would be a little less jumpy.


----------



## gagac1971

hello to all here...
I managed to overclock my GTX 1080 Founders Edition to 2126MHz and from there I can't go higher... power target limit...
max temp 66C, fans at 100%
this is my first reference card and let me tell you, from now on I am only buying reference models; I love the look and all the rest...


----------



## nexxusty

Quote:


> Originally Posted by *gagac1971*
> 
> hello to all here...
> I managed to overclock my GTX 1080 Founders Edition to 2126MHz and from there I can't go higher... power target limit...
> max temp 66C, fans at 100%
> this is my first reference card and let me tell you, from now on I am only buying reference models; I love the look and all the rest...


I like the look too.


----------



## Jpmboy

Quote:


> Originally Posted by *traxtech*
> 
> Anyone else finding voltage does absolutely nothing on their card? WTB vbios plx


AFAIK you can't just push the voltage slider and expect the voltage (via GPU-Z or anything else) to respond without the card reaching the corresponding clock bin. Try using the voltage/frequency curve (MSI AB, then Ctrl+F).


----------



## TK421

Quote:


> Originally Posted by *Jpmboy*
> 
> AFAIK you can't just push the voltage slider and expect the voltage (via GPU-Z or anything else) to respond without the card reaching the corresponding clock bin. Try using the voltage/frequency curve (MSI AB, then Ctrl+F).


wouldn't putting an offset work the same?


----------



## gagac1971

Quote:


> Originally Posted by *TK421*
> 
> wouldn't putting an offset work the same?


in my opinion, if I raise the voltage offset the card will boost higher, but then the power target limit lowers the clock...
my best overclock of 2126MHz is at stock voltage...


----------



## Jpmboy

Quote:


> Originally Posted by *TK421*
> 
> wouldn't putting an offset work the same?


The voltage is coupled to a clock bin in the bios and vice versa; syncing them manually is problematic. Just try the Ctrl+F function, grab the far-right point, and with Ctrl pressed move the curve up to the offset or absolute frequency you want to try. I'm running Valley at 2200+ and Heaven at 2184 (it is more of a power limit thing with Heaven; Valley barely hits the PL)
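For anyone lost in the bin talk, here is a toy model of the behavior described above. The values are invented for illustration only; they are not real Pascal V/F points, and this is not how Afterburner is implemented internally:

```python
# Toy GPU Boost 3.0 curve: each frequency bin is coupled to a voltage.
# The card only applies a bin's voltage once boost actually reaches that
# bin, which is why dragging the voltage slider can show no change in
# GPU-Z. All values below are made up.
CURVE = [(0.800, 1607), (0.900, 1800), (1.000, 1950), (1.062, 2025)]

def active_bin(target_mhz):
    """Lowest (voltage, freq) bin that satisfies the requested clock."""
    for volts, mhz in CURVE:
        if mhz >= target_mhz:
            return volts, mhz
    return CURVE[-1]  # clamped at the top bin

def apply_offset(curve, offset_mhz):
    """A flat offset shifts every bin's frequency; voltages stay coupled."""
    return [(v, f + offset_mhz) for v, f in curve]
```

Editing the curve directly (the Ctrl+F editor) amounts to moving individual points instead of shifting the whole list, which is why it gives finer control than the plain offset slider.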


----------



## stocksux

Missed the Strix 1080 on newegg apparently. Instead of coming soon, it now says sold out


----------



## nexxusty

Quote:


> Originally Posted by *stocksux*
> 
> Missed the Strix 1080 on newegg apparently. Instead of coming soon, it now says sold out


Oh yeah, if you aren't pre-paying for one you aren't getting one for a while. I had to wait 12 days for mine.

Bought on May 27th.


----------



## stocksux

Never was a pre order on newegg for them


----------



## Bogga

Quote:


> Originally Posted by *nexxusty*
> 
> Oh yeah, if you aren't pre-paying for one you aren't getting one for a while. I had to wait 12 days for mine.
> 
> Bought on May 27th.


Ordered mine at 15:02 on the 27th... still waiting to see it IRL


----------



## nexxusty

Quote:


> Originally Posted by *Bogga*
> 
> Ordered mine at 15:02 on the 27th... still waiting to see it IRL


Aww sorry to hear that man. I know your pain. Waiting for this card sucked this time around.

Probably the most anticipated launch ever though. I don't see this getting any better in the future. Hehe.


----------



## grimboso

Quote:


> Originally Posted by *nexxusty*
> 
> Aww sorry to hear that man. I know your pain. Waiting for this card sucked this time around.
> 
> Probably the most anticipated launch ever though. I don't see this getting any better in the future. Hehe.


Made an order here in Sweden with confirmation that I was in the June 8th batch. Then it was delayed by EVGA until the 16th, then delayed again until the 22nd -.-
EVGA seems to prioritize the cards for their own preorders.


----------



## ChevChelios

so I actually managed to miss the G1 Gaming by a hair lol... no longer available, these cards literally vanish into thin air

the current option besides Palit is a Gainward Phoenix, http://www.gainward.com/main/edm/gtx1080_phoenix/index_en.php either the GS or the GLH

http://www.gainward.com/main/vgapro.php?id=982&lang=en
http://www.gainward.com/main/product/vga/pro/p00982/p00982_datasheet_87557514eae84ebf.pdf?s=584

the GLH actually looks yummy: a cooler similar to Palit's, 8 phases, 8+6 pins, dual bios (for that dream of a custom bios), backplate, 1900/10500 factory OC, etc.

so, Gainward: good/bad?


----------



## SsXxX

Quote:


> Originally Posted by *ChevChelios*
> 
> so I actually managed to miss the G1 Gaming by a hair lol... no longer available, these cards literally vanish into thin air
> 
> the current option besides Palit is a Gainward Phoenix, http://www.gainward.com/main/edm/gtx1080_phoenix/index_en.php either the GS or the GLH
> 
> http://www.gainward.com/main/vgapro.php?id=982&lang=en
> http://www.gainward.com/main/product/vga/pro/p00982/p00982_datasheet_87557514eae84ebf.pdf?s=584
> 
> the GLH actually looks yummy: a cooler similar to Palit's, 8 phases, 8+6 pins, dual bios (for that dream of a custom bios), backplate, 1900/10500 factory OC, etc.
> 
> so, Gainward: good/bad?


looks pretty cool to me; I haven't had experience with Gainward, but spec-wise it's amazing, even better than the G1


----------



## ChevChelios

daamn and now the G1 Gaming is back again

decisions decisions

.. I'll probably end up going with the G1, Gigabyte is like one of the most trusted and safest options


----------



## SsXxX

Quote:


> Originally Posted by *ChevChelios*
> 
> .. I'll probably end up going with the G1, Gigabyte is like one of the most trusted and safest options


amen to that.

seriously though, go get the Gigabyte before it's out of stock again. like you said, Gigabyte is safe, trusted and reliable; the difference in clocks between it and the Gainward is minimal anyway, and who knows if the Gainward can hold its clocks without throttling? as for Gigabyte, its custom cards are well known to have the most stable clocks with almost zero throttling


----------



## ChevChelios

I kind of thought the dual bios was nice (don't think the G1 has it), but I'm not really the kind of guy who will risk putting a custom overvolted, potentially risky BIOS on a 700 EUR card (even if it has dual bios as a fallback)... and that's assuming such a BIOS will even appear, and assuming Pascal will give actual tangible fps increases going from 2050-2100 to ~2200+

not to mention a BIOS overvolt will surely mess up the thermals & noise on air cooling (and I will not do liquid); it may not be worth it even for a small performance bump


----------



## SsXxX

Quote:


> Originally Posted by *ChevChelios*
> 
> I kind of thought the dual bios was nice (don't think the G1 has it), but I'm not really the kind of guy who will risk putting a custom overvolted, potentially risky BIOS on a 700 EUR card (even if it has dual bios as a fallback)... and that's assuming such a BIOS will even appear, and assuming Pascal will give actual tangible fps increases going from 2050-2100 to ~2200+
> 
> not to mention a BIOS overvolt will surely mess up the thermals & noise on air cooling (and I will not do liquid); it may not be worth it even for a small performance bump


why are u still here . . . go get the G1 lol


----------



## Menthol

Quote:


> Originally Posted by *TK421*
> 
> Yep, EVGA honors the warranty when you remove the cooler, which is perfect for watercooling folks.
> 
> I'm waiting on the MSI Aero 1080 at 599 USD; until then I'll just keep returning this 1080 FE to Micro Center complaining about fan noise and coil whine lol.


Is that ethical?


----------



## ChevChelios

Quote:


> Originally Posted by *SsXxX*
> 
> why are u still here . . . go get the G1 lol


its done









697 EUR for 1080 G1 Gaming









my card for the next 2+ years to drive a 1440p 144-165Hz monitor

SeemsGood


----------



## Jpmboy

F no.


----------



## pez

I'd say the 1070 does fine for 2K if you've got G-Sync, but I've seen minimums on a lot of 1070 reviews that drop below 60FPS quite a bit. For me, the 1080 should hold 60+ at 2K better, and it only gets better if I decide to get a G-Sync monitor this year









----------



## ChevChelios

Quote:


> Originally Posted by *pez*
> 
> I'd say the 1070 does fine for 2K if you've got G-Sync, but I've seen minimums on a lot of 1070 reviews that drop below 60FPS quite a bit. For me, the 1080 should hold 60+ at 2K better, and it only gets better if I decide to get a G-Sync monitor this year
> 
> 
> 
> 
> 
> 
> 


I thought about the 1070, but I want to experience those sweet synced 70-80+ frames in more demanding games at 1440p maxed out (instead of dipping sub-60 on a 1070), plus the fact that it's my next card for a whole 2+ years, since Volta won't be here earlier than mid-2018 for consumers and I will *not* buy a 1000 EUR 1080 Ti

so for 2+ years I figure I'd spend the extra

if I had a 1080p or 1440p@60Hz monitor then I would probably go with the 1070

but for 1440p@144Hz I think the 1080 is worth it if you're really into gaming


----------



## pez

Nearly in line with my thoughts.

I was going to do an RX 480 for my GF's build, but I may actually see if I can snag a 1070 later today... that way she'll be able to game on my 2K 60Hz monitor if I go G-Sync 2K/UW 2K.


----------



## MCFC

Is it even worth it to hold on for the aftermarket cards?
I'm hearing they can't be OC'd much anyway because Nvidia made sure of that through hardware limitations


----------



## ChevChelios

The thermals and noise are much better than the FE


----------



## ChevChelios

also, a question - should I use the Gigabyte tools (Extreme Engine or w/e) to tweak/OC the G1 Gaming or better use the MSI afterburner ?


----------



## SsXxX

Quote:


> Originally Posted by *ChevChelios*
> 
> also, a question - should I use the Gigabyte tools (Extreme Engine or w/e) to tweak/OC the G1 Gaming or better use the MSI afterburner ?


definitely use the Gigabyte Xtreme Engine. while MSI Afterburner was perfectly fine with my Titan X, it started causing BSODs and crashes with the G1 1080!!! so I advise avoiding it like the plague


----------



## ChevChelios

hmm, I've seen all the reviews use Afterburner to OC the FE 1080

but maybe it's less compatible with other brands' custom cards

I always hear Afterburner/EVGA Precision are the best OC tools, but I guess for custom cards you should use that brand's specific software


----------



## SsXxX

Quote:


> Originally Posted by *ChevChelios*
> 
> hmm, I've seen all the reviews use Afterburner to OC the FE 1080
> 
> but maybe it's less compatible with other brands' custom cards
> 
> I always hear Afterburner/EVGA Precision are the best OC tools, but I guess for custom cards you should use that brand's specific software


it's true Afterburner is the best; I always used it, and as long as your card is reference, any OC tool, Afterburner or otherwise, will work with any GPU brand. but when it comes to custom aftermarket cards with a custom PCB and/or a custom bios like the G1, things are different.


----------



## stocksux

Yeah, but what if you watercool??? Then thermals and noise are a moot point. Why else wait?


----------



## TK421

Quote:


> Originally Posted by *stocksux*
> 
> Yeah, but what if you watercool??? Then thermals and noise are a moot point. Why else wait?


you wait for a modded bios to unleash the card's power


----------



## stocksux

Newegg says the Strix is in stock!! However, when I try to check out with it in my cart it says it's out of stock... Sigh


----------



## Digitalwolf

Quote:


> Originally Posted by *stocksux*
> 
> Newegg says the Strix is in stock!! However, when I try to check out with it in my cart it says it's out of stock... Sigh


*edited*

It finally let an order go through... maybe it was just an error... but this is the most 1080s I've seen listed as "in stock" by them at once.


----------



## stocksux

Frustrating.


----------



## fcman

Interesting development here, 1080 Classy for under $725? I wonder if they'll manage to squeeze any extra performance out of the chip.


__ https://twitter.com/i/web/status/741011190025834496


----------



## Digitalwolf

Quote:


> Originally Posted by *stocksux*
> 
> Frustrating.


It must have been some kind of site error, or they just didn't plan to sell the cards to us.

My order came back as payment failed... which was off my PayPal seller's account that currently has far more than this card costs on it as a balance. So I called them, and they tried to give me a line about how it was my fault the charge didn't process due to lack of funds... PayPal, as expected, says no charge attempt was made, and since it's off my cash balance it's not a thing that can fail...

Then again, this is part of why I'm moving away from ever ordering from Newegg.


----------



## SynchroSCP

Quote:


> Originally Posted by *Jpmboy*
> 
> 120V fan (a screamer just for testing). I'm measuring the temps of the ram and power section with an IR thermo. So far, looping Heaven 4.0 one of the chokes (top R22) hits 58C. Honestly, the stock air cooler probably doesn't cool these as well as forced air directly on the parts. Gpu core max temp is 28C.


Interesting, I have a full block coming next week, but I think I'm going to mount a Thermosphere on it and see how that works out. I have the ACX version, so I'll just leave the baseplate on for mem and VRM cooling and mount a 92mm fan to the baseplate with some 1/2" standoffs. Have some heatsinks around as well; maybe throw a couple over the VRM just to add fins and mass.


----------



## Cool Mike

Just grabbed a 1080 Strix at Newegg. The first two attempts I got a "removed from cart, no longer available"; tried a third time and got it.
It's in packaging now. Just cancelled my EVGA FTW preorder at Amazon. Sorry EVGA.









Strix getting some very good reviews.









Seems Newegg is receiving very small batches as they are sold within 5-10 minutes.


----------



## Pascual

I just got my Gigabyte GTX 1080 G1 Gaming and the fans really kick up. Very audible coil whine too. Has anyone else experienced this?


----------



## Dirgeth

Greetings from Czech Rep.


----------



## Spiriva

Someone asked for a 3DMark. This is with the computer in my sig, CPU at 5GHz and GPU at 2200; memory on the 1080 is set to +350.



Just got the tax returns in Sweden this week, so I will prolly order another 1080 and put it in the loop too


----------



## Z0eff

If anyone is waiting for the 1070, it's available now: http://www.geforce.com/hardware/10series/geforce-gtx-1070


----------



## ChevChelios

Quote:


> Originally Posted by *Pascual*
> 
> I just got my Gigabyte GTX 1080 G1 Gaming and the fans really kick up. Very audible coil whine too. Has anyone else experienced this?


about fans, try this -
Quote:


> After installing it, the fans were stuck on (near) full blast. After installing Gigabyte's "EXTREME GAMING" utility - their overclocking tool, I disabled the "3D ACTIVE FAN" setting in the "Fan" tab, which allowed the fans to intelligently scale up or down. You may have to do the same.


coil whine - there should not be any


----------



## pez

Quote:


> Originally Posted by *Pascual*
> 
> I just got my Gigabyte GTX 1080 G1 Gaming and the fans really kick up. Very audible coil whine too. Has anyone else experienced this?


Mine won't be in until Monday. It's 20 minutes away, but I can't have it until Monday.

What drivers are you running? Have you tried out Gigabyte's OC program yet?


----------



## Spikeyjohnson

Just ordered my PNY Founders Edition on Amazon for 699 and the block from EK. Should all be here Thursday and Friday next week. Glad I got to it last night; the scalpers have already hit it as of early this morning, and it is priced up there with the rest of the brands, anywhere from 840 to 900.


----------



## sherlock

Got my step-up GTX 1080 FE from EVGA, plugged in and installed drivers. All looks fine and boost to 1873 stock, can't wait to take it into 4K gaming and see how well it does in Armored Warfare.


----------



## SsXxX

Quote:


> Originally Posted by *Pascual*
> 
> I just got my Gigabyte GTX 1080 G1 Gaming and the fans really kick up. Very audible coil whine too. Has anyone else experienced this?


it's very quiet for me, but my PC is about 1.5 meters away from me so . . .


----------



## Pascual

Quote:


> Originally Posted by *ChevChelios*
> 
> about fans, try this -
> coil whine - there should not be any


Just tried it both on and off and it didn't seem to make any difference. Fans ran at 45% under load.


----------



## ChevChelios

well 45% should be a good % under load and it should be quite quiet


----------



## nexxusty

Quote:


> Originally Posted by *SsXxX*
> 
> definitely use the Gigabyte Xtreme Engine. while MSI Afterburner was perfectly fine with my Titan X, it started causing BSODs and crashes with the G1 1080!!! so I advise avoiding it like the plague


ROFL.

Think before you post.

There is nothing wrong with MSI Afterburner. I've used it since day one. It is THE BEST OC software for GPUs. Period.

100% PEBKAC error.
Quote:


> Originally Posted by *SsXxX*
> 
> it's true Afterburner is the best; I always used it, and as long as your card is reference, any OC tool, Afterburner or otherwise, will work with any GPU brand. but when it comes to custom aftermarket cards with a custom PCB and/or a custom bios like the G1, things are different.


Not true. At all. Reference, AIB Custom boards... Doesn't matter. Please stop spreading misinformation, thanks.


----------



## ChevChelios

Quote:


> Originally Posted by *nexxusty*
> 
> ROFL.
> 
> Think before you post.
> 
> There is nothing wrong with MSI Afterburner. I've used it since day one. It is THE BEST OC software for GPU's. Period.
> 
> 100% PEBKAC error.


so MSI afterburner for G1 Gaming 1080 - all good ?


----------



## nexxusty

Quote:


> Originally Posted by *ChevChelios*
> 
> so MSI afterburner for G1 Gaming 1080 - all good ?


I absolutely guarantee it.

I've never had a card that Afterburner couldn't overclock. It even overclocks Radeons, for god's sake.

MSI doesn't even make Radeon GPU boards...


----------



## pez

I can't imagine AfterBurner not working like the Gigabyte software, if not better.

The only thing the GB software will probably be good for is setting your RGB lighting.


----------



## TK421

Quote:


> Originally Posted by *pez*
> 
> I can't imagine AfterBurner not working like the Gigabyte software, if not better.
> 
> The only thing the GB software will probably be good for is setting your RGB lighting.


because RGB matters more than anything else it seems


----------



## pez

I'm just glad Gigabyte kept the cooler design fairly tame compared to EVGA or even their own Xtreme Gaming cooler. They just remind me why I don't like those flashy CM cases.


----------



## gree

Whoa, how is the G1 better looking? The Xtreme has stacked fans, so it's shorter and cools better


----------



## TK421

Quote:


> Originally Posted by *gree*
> 
> Whoa, how is the G1 better looking? The Xtreme has stacked fans, so it's shorter and cools better


What do you mean stacks?

Highest one yet with the CPU @ 4.4GHz

www.3dmark.com/fs/8746619


----------



## gree

http://www.gigabyte.com/products/product-page.aspx?pid=5920#kf

The middle fan sits behind the other two fans, making a "stack" as Gigabyte calls it.

My problem with their cards was how long they are; this looks to fix that.

I'm trying the MSI Gaming X though, should arrive today. Feels like I paid for the lighting ._. But it's just their mid-tier card so we'll see


----------



## TK421

Quote:


> Originally Posted by *gree*
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5920#kf
> 
> The middle fan sits behind the other two fans, making a "stack" as Gigabyte calls it.
> 
> My problem with their cards was how long they are; this looks to fix that.
> 
> I'm trying the MSI Gaming X though, should arrive today. Feels like I paid for the lighting ._. But it's just their mid-tier card so we'll see


does it really matter? a 120mm CLC from EVGA will outperform even the best air coolers out there by a pretty large margin, plus it's only 65 USD from Amazon.


----------



## Dirgeth

2088/11200MHz Asus Strix

BTW, a G1 980 Ti at 1541/8400MHz has a 10499 GPU score


----------



## Agent Smith1984

Quote:


> Originally Posted by *nexxusty*
> 
> I absolutely guarantee it.
> 
> I've never had a card that Afterburner couldn't overclock. It even overclocks Radeon for god sakes.
> 
> MSI doesn't even make Radeon GPU boards...


MSI makes tons of Radeon cards???


----------



## SynchroSCP

Thermosphere on EVGA SC. Temps below 50c on gpu and vrm.


----------



## Clay333

Quote:


> Originally Posted by *Dirgeth*
> 
> 
> 
> 2088/11200MHz Asus Strix
> 
> BTW G1 980Ti 1541/8400MHz have 10499 GPU score


Slow your Ram down to about a +400 offset and run it again. You will likely see a slight increase in your score. I do with my EVGA Founders edition. I'm guessing that the Ram dynamically loosens timings to keep itself stable which actually negatively impacts performance.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Clay333*
> 
> Slow your Ram down to about a +400 offset and run it again. You will likely see a slight increase in your score. I do with my EVGA Founders edition. I'm guessing that the Ram dynamically loosens timings to keep itself stable which actually negatively impacts performance.


Well, it's more likely a set of straps with different timings at different clock levels, but more important than that is the error correction taking place when overclocking VRAM too high. HWiNFO actually monitors memory errors now, so you can see what's going on.


----------



## nexxusty

Quote:


> Originally Posted by *Agent Smith1984*
> 
> MSI makes tons of Radeon cards???


Yeah I'm a moron for saying that. I have no idea why I said this.

My bad.


----------



## fat4l

Quote:


> Originally Posted by *Clay333*
> 
> Slow your Ram down to about a +400 offset and run it again. You will likely see a slight increase in your score. I do with my EVGA Founders edition. I'm guessing that the Ram dynamically loosens timings to keep itself stable which actually negatively impacts performance.


its prolly error correction kicking in....


----------



## TK421

Quote:


> Originally Posted by *SynchroSCP*
> 
> 
> 
> Thermosphere on EVGA SC. Temps below 50c on gpu and vrm.


Do you think the thermosphere would work with the FE cards (shroud removed)?


----------



## zGunBLADEz

Quote:


> Originally Posted by *nexxusty*
> 
> I absolutely guarantee it.
> 
> I've never had a card that Afterburner couldn't overclock. It even overclocks Radeon for god sakes.
> 
> MSI doesn't even make Radeon GPU boards...


Huh, what do you mean MSI don't make GPU cards?? Yes they do


----------



## nexxusty

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Huh what you mean with MSI dont make gpu cards?? Yes they do


Oh I know... I made a dumb comment. Lol.


----------



## Dirgeth

Quote:


> Originally Posted by *Clay333*
> 
> Slow your Ram down to about a +400 offset and run it again. You will likely see a slight increase in your score. I do with my EVGA Founders edition. I'm guessing that the Ram dynamically loosens timings to keep itself stable which actually negatively impacts performance.


Here is 2088/11480MHz OC (+725MHz offset)


and here is your +400MHz offset

2088/10800MHz


So the +400 offset is better in GPU score







LOL


----------



## Clay333

Quote:


> Originally Posted by *Dirgeth*
> 
> Here is 2088/11480MHz OC (+725MHz offset)
> 
> 
> and here is your +400MHz offset
> 
> 2088/10800MHz
> 
> 
> so +400offset is better in GPU score
> 
> 
> 
> 
> 
> 
> 
> LOL


That is exactly what I am seeing. What we need to do now is run Firestrike over and over with a 10MHz increase each time to find the exact point where the timings change, and set it a few MHz short of that. I'm not sure if every card is different or if they will all use a similar frequency table, but there has to be one or more sweet spots, depending on how many points within the range of stable clocks the timings loosen at.

Before anyone says anything, I do know that these differences are very marginal and will not make a difference in real-world gaming, but those couple of extra points are sometimes all you need to set a record if you are going for that sort of thing. And it's basically a reason for us to tinker with our new cards
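The sweep described above can be sketched out as a tiny script. This is purely illustrative: the offset/score pairs below are made-up numbers (not real measurements from any card in this thread), and in practice each score would come from an actual Fire Strike run at that memory offset.

```python
# Step the memory offset in small increments, record the benchmark score at
# each step, then flag the offsets where the score *drops* despite a higher
# clock -- those are the likely points where the memory timings loosen.

def find_timing_steps(results):
    """Return offsets at which the score fell versus the previous step."""
    steps = []
    for (prev_off, prev_score), (off, score) in zip(results, results[1:]):
        if score < prev_score:  # higher clock, lower score -> timings loosened
            steps.append(off)
    return steps

if __name__ == "__main__":
    # (offset MHz, graphics score) -- illustrative numbers only
    sweep = [(380, 22950), (390, 22980), (400, 23010),
             (410, 22870), (420, 22900), (430, 22940)]
    print(find_timing_steps(sweep))  # → [410], i.e. the drop lands just past +400
```

With made-up data like this the script would point at +410 as the step where timings loosened, matching the "+400 is the sweet spot" observation above.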


----------



## AllGamer

Quote:


> Originally Posted by *TK421*
> 
> Do you think the thermosphere would work with the FE cards (shroud removed)?


I was just about to ask the same question too.

I'd like to keep the side panel LED, and I like the idea of a Hybrid.

Going full EK block will lose the LED on the side.


----------



## TK421

Quote:


> Originally Posted by *AllGamer*
> 
> I was just about to ask the same question too.
> 
> I'd like to keep the side panel LED, and I like the idea of a Hybrid.
> 
> Going full EK block will lose the LED on the side.


Maybe ek rep can chime in?

@EK_tiborrr

@EK-CEO


----------



## looniam

Quote:


> Originally Posted by *sherlock*
> 
> Got my step-up GTX 1080 FE from EVGA, plugged in and installed drivers. All looks fine and boost to 1873 stock, can't wait to take it into 4K gaming and see how well it does in Armored Warfare.


GG.









gonna miss a nice guy like you on the 980ti owners thread. don't be afraid to come back and post a few benches for fun.


----------



## pez

Quote:


> Originally Posted by *gree*
> 
> Whoa how is the g1 better looking? The extreme has stacks so it's shorter and cools better


I've got plenty of space for a long GPU, and I can't say that has ever bothered me. Even the Mini ITX build we have has ample GPU room.

Obviously the cooler design is purely opinion, but the 'Xtreme Gaming' card looks too flashy. It does look to be packed with features and great functionality, but it's just not for me.


----------



## TK421

Quote:


> Originally Posted by *pez*
> 
> I've got plenty of space for a long GPU, and I can't say that has ever bothered me. Even the Mini ITX build we have has ample GPU room.
> 
> Obviously the cooler design is purely opinion, but the 'Xtreme Gaming' card looks too flashy. It does look to be packed with features and great functionality, but it's just not for me.


in the hierarchy, which one is more "premium"?

the G1 or the 'extreme gaming'?


----------



## zGunBLADEz

Quote:


> Originally Posted by *Clay333*
> 
> That is exactly what I am seeing. What we need to do now is run firestike over and over with a 10mhz increase each time to find the exact point when the timings change and set it a few MHz short of that. I'm not sure if every card is different or if they will all use a similar frequently table, but there has to be one or more sweet spots depending how many different points there are that the timings loosen with the range of stable clocks.
> 
> Before anyone says anything I do know that these differences are very marginal and will not make a difference in real world gaming, but those extra couple extra points are sometimes all you need to set a record if you are going for that sort of thing. And it's basically a reason for us to tinker with our new cards


You don't need to, you will notice it while in game. For example, the best game I have found for a quick overclock test is ROTR, for both core and mem.

Lara's hair starts flickering or artifacting (like little white squares/snow) with too much core. You need a good angle with the camera close to her hair and you will see the flickering snow right away. For memory, the frame rate starts going up and down with a mem overclock; just make sure you have a steady view with lots of fog, then make memory presets in Afterburner and switch through them.

I will set 500/600/700 on hotkeys in Afterburner and cycle through them.

This is the angle you need on maxed-out ROTR to see artifacts at a near-top stable OC. It's all about that hair lol; move the camera in and out and you will notice the artifacts on the hair.

easy XD

Then after that I do a 2-hour Heaven run for stability.


----------



## -terabyte-

Quote:


> Originally Posted by *TK421*
> 
> in the hierarchy, which one is more "premium"?
> 
> the G1 or the 'extreme gaming'?


Xtreme Gaming. The G1 this time around (I think) is the reference card with just a different/custom cooler; it even has only the original single 8-pin power connector.


----------



## gerbil80

Quote:


> Originally Posted by *-terabyte-*
> 
> Extreme gaming, the G1 this time around (I think) is the reference card with just a different/custom cooler. It even has the original 8 pin power connector only.


fyi

According to the Gigabyte Guy on OCUK the G1 is a custom PCB with 8+2 Power Phases, it does still use a single 8 pin as you say


----------



## uggy

any custom bios for 1080 on water yet?


----------



## TK421

Quote:


> Originally Posted by *-terabyte-*
> 
> Extreme gaming, the G1 this time around (I think) is the reference card with just a different/custom cooler. It even has the original 8 pin power connector only.


Quote:


> Originally Posted by *gerbil80*
> 
> fyi
> 
> According to the Gigabyte Guy on OCUK the G1 is a custom PCB with 8+2 Power Phases, it does still use a single 8 pin as you say


Ok, thanks for the answer.

Quote:


> Originally Posted by *uggy*
> 
> any custom bios for 1080 on water yet?


Not yet sadly, but nvflash for pascal is working.


----------



## fewness

Valley always reports the core frequency absolutely correct, right? Please tell me it's right!


----------



## Joneszilla

Received my MSI Gaming 1080 today. Getting more than double the FPS I was getting on my Titan OG.


----------



## Joneszilla

Played some Doom and Witcher 3 in 1440p and loving it so far. Going to mess with the overclocks tonite.


----------



## SsXxX

Quote:


> Originally Posted by *nexxusty*
> 
> ROFL.
> 
> Think before you post.
> 
> There is nothing wrong with MSI Afterburner. I've used it since day one. It is THE BEST OC software for GPU's. Period.
> 
> 100% PEBKAC error.
> Not true. At all. Reference, AIB Custom boards... Doesn't matter. Please stop spreading misinformation, thanks.


Dear sir, I'm not spreading misinformation. I was getting crashes, BSODs and random restarts like there is no tomorrow when I was using Afterburner 4.3 with my G1 Gaming 1080, to the degree that I suspected something was wrong with my overclock. I reset the BIOS to default with no benefit, tried everything possible, and almost gave up thinking something was wrong with my CPU and/or mobo, to the point that I was about to buy a new build!!!!

Suddenly I decided to uninstall Afterburner and try Gigabyte Xtreme Engine, and guess what?!! No crashes! No BSODs! No random restarts! My PC is back to rock-solid stability. I even did the usual stress tests again to be sure (LinX, IBT, AIDA64, RealBench) and hell yeah, it's STABLE!!

I'm not saying Afterburner is bad, I know it's the best and I used to use it all the time btw. It's just that I think it's not compatible with the G1 Gaming 1080, based on my own personal experience, or maybe because it's beta. I don't know, nor do I care. My advice: if you have the G1 Gaming, use Xtreme Engine.


----------



## guyinthecorner1

Quote:


> Originally Posted by *SsXxX*
> 
> dear sir, I'm not spreading misinformation, I was getting crashes, bsods. random restarts like there is no tomorrow when I was using afterburner 4.3 with my g1 gaming 1080, to the degree that I suspected something is wrong with my overclock, reset bios to default with no benefit, i tried everything possible, i almost gave up and thought something wrong with my cpu and/or mobo to the degree that i was abt to buy new build!!!!
> 
> suddenly i decided to uninstall afterburner and try gigabyte xtreme engine, and guess what?!! no crashes! no bsod! no random restarts! my pc is back to rock solid stability, i even did the usual stress tests again to be sure (linx, ibt, aida64, realbench) and hell yea its STABLE!!
> 
> I'm not saying afterburner is bad, i know its the best and i used to use it all the time btw, its just that i think its not compatible with the g1 gaming 1080 and that's based on my own personal experience or maybe because its beta, i don't know neither do i care, my advice is if u have the g1 gaming then use the xtreme engine.


I agree. Afterburner 4.3 Beta is messed up with the G1 Gaming. The custom fan profile does not work; with the custom profile on, the fans always run at 25%.

I have been using the Xtreme utility, but GPU Boost 3.0 is very strange. I only got 40MHz with a +100 core offset, and I can't get the card past 2000MHz. Have you had any success?


----------



## SynchroSCP

Quote:


> Originally Posted by *TK421*
> 
> Do you think the thermosphere would work with the FE cards (shroud removed)?


I think so, but check EK's configurator to be safe. You'd need to do something to cool the VRMs though; the benefit of the ACX cards is the baseplate works really well for the VRMs. Just get a fan on it and all good. I put one of the temp sensors from my Aquaero right over the VRMs and it hasn't gotten over 45C yet.

I have a full block coming in a couple of days, but I may stay with this for now; tired of shelling out for a new block and backplate every time I buy a new GPU.

To make the Thermosphere work with the EVGA baseplate it has to be modded like this:


----------



## gree

Hey, I tried a Heaven benchmark (guessing an older one since I downloaded it in Sept 2015)

MSI 1080 Gaming X
I got a max of 80C / 74% fan at 1440p,
1900s for the core clock

That's within the card's max temp, right?


----------



## skline00

Just received my EK GTX 1080 copper/acrylic fullblock from Performance PC today. Was trying to snag a GTX 1080 from Newegg and I bought a Zotac GTX 1080 FE. I intend to put this new combo in my 5960x rig and move the GTX 980 TI SC with an EK fullblock over to my 4790k rig to replace 2 R9 290s.


----------



## TK421

Quote:


> Originally Posted by *SynchroSCP*
> 
> I think so but check EK's configurator to be safe. You'd need to do something to cool the vrms tho, the benefit of the ACX cards is the baseplate works really well for vrms...just get a fan on it and all good. I put one of the temp sensors from my aquaero right over the vrms and it hasnt gotten over 45C yet.
> 
> I have a full block coming in a couple of days but I may stay with this for now, tired of shelling out for a new block and backplate each time I buy a new gpu.
> 
> To make the Thermosphere work with the EVGA baseplate it has to be modded like this:


what modification is that? I would doubt that the FE baseplate is same as ACX baseplate

I currently have mine like this, not sure if the vrm is properly cool or not


----------



## stocksux

Quote:


> Originally Posted by *gree*
> 
> Hey I tried a heavens word benchmarkn (guessing an older once since I downloaded in sept/2015)
> 
> Msi 1080 gaming x
> But I got a max of 80c/74% fan at 1440p.
> 1900s for the core clock
> 
> That's within the cards max temp right?


Yes. Throttling occurs at 82C. You're just inside the window


----------



## pez

Quote:


> Originally Posted by *TK421*
> 
> in the hierarchy, which one is more "premium"?
> 
> the G1 or the 'extreme gaming'?


As previously said, the Xtreme Gaming will be the premium version this time. It does look like the G1 is still a nice 'upgrade' over the reference/Founders option due to the extra power phases and supposedly quieter and more effective cooling solution.


----------



## THEROTHERHAMKID

Just got a G1 1080 yesterday


----------



## TK421

Quote:


> Originally Posted by *pez*
> 
> As previously said, the Xtreme Gaming will be the premium version this time. It does look like the G1 is still a nice 'upgrade' over the reference/Founders option due to the extra power phases and supposedly quieter and more effective cooling solution.


thanks for the info

in other news, I made it to the 65th place in top 100 firestrike leaderboard!

http://www.3dmark.com/fs/8761646


----------



## twelvie

I bought a G1 1080 which I'm very happy with - having an issue with one of the fans, though. Keep in mind this only seems to occur when it's on automatic and is stopping and starting with the "fan stop."

Basically the far right fan when spinning up is making quite a loud clicking or grinding sound, even in idle every now and again it will do it, the far left fan also spins (without noise) and the middle fan remains idle. It seems like it only makes the noise when running at a very low speed.

I've made a custom fan curve which seems to have remedied the problem, although I'd like to know whether this is common with cards that have this "fan stop" feature? Still unsure whether I will need to RMA the card.


----------



## gerbil80

What sort of clocks are you guys seeing with your g1 gaming? Any coil whine at all?


----------



## MRCOCOset




----------



## twelvie

Quote:


> Originally Posted by *gerbil80*
> 
> What sort of clocks are you guys seeing with your g1 gaming? Any coil whine at all?


I haven't had much of a chance to test it properly yet in terms of overclocking - no coil whine that I have noticed so far.


----------



## SweWiking

If you run two of these cards right now, what type of SLI bridge do you use? Since the NVIDIA one is nowhere to be seen, do you just use two of the old ones that ship with motherboards that support SLI?

Especially if you have water blocks on your two 1080s, what type of SLI bridge do you use then?


----------



## uggy

Sorry for the language! But how the hell did you manage that?

Water I presume.. I have water myself, on two 1080s.

Our memory is around the same, but you are kicking my ass on the core clock; I have around 2072MHz with MSI Afterburner at 120% power limit.


----------



## SsXxX

Quote:


> Originally Posted by *guyinthecorner1*
> 
> I agree. Afterburner 4.3 Beta is messed up with the G1 gaming. The custom fan profile does not work. With the custom profile on, the fans always run at 25%.
> 
> I have been using the xtreme utility but GPU boost 3.0 is very strange. I only got 40 MHz with a +100 core offset. I can't get the card past 2000 MHz. How you had any success?


I got up to 2037MHz myself, but it throttles down to 2012 under long full load and stays there. Haven't risked playing with voltages yet; a bit worried to play with an expensive new GPU


----------



## r0l4n

My three Firestrike scores are valid and high enough to make it to the Hall of Fame for 1 card, yet they don't appear, do you guys know why?

http://www.3dmark.com/fs/8632088
http://www.3dmark.com/fs/8632244
http://www.3dmark.com/fs/8632279


----------



## PasK1234Xw

Quote:


> Originally Posted by *SsXxX*
> 
> I got up to 2037mhz myself, but it throttles down to 2012 under long time full load and stays there, havnt risked playing with voltages yet, a bit worried to play with a new expensive gpu


I don't think you can adjust voltage yet. The voltage slider doesn't seem to have any effect; whether at 0 or 100, my voltage doesn't seem to change, and it doesn't make clocks any more stable.

Even if it does work, there is no risk of damage from overvolting. The only thing you really have to keep an eye on is temps.


----------



## ChevChelios

Quote:


> Originally Posted by *PasK1234Xw*
> 
> I don't think you can adjust voltage yet. Voltage slider doesn't seem to have any effect regardless at 0 or 100 my voltage doesn't seem to change and doesn't make clocks any more stable.
> 
> Even if does work there is no risk in damage due to over volt. Only thing you will only have to keep eye on really are temps.


are you talking about the G1 ? are you using extreme engine or afterburner ?

you can *definitely* adjust voltage on 1080, but it will only give you a max of ~2100 or so

http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,3.html


----------



## PasK1234Xw

Yea, I just tested; I notice my boost doesn't downclock as much if I increase the slider. I thought the voltage increase would be more substantial compared to the previous gen though, or does boost have control?

I really hope we get a custom BIOS that unlocks these limits and lets us adjust the fan curve without software


----------



## Dirgeth

Just hit 25k graphic score











2125MHz/11200MHz on my Asus Strix 1080GTX


----------



## DjDaffyDuck

Can someone do this for me please and tell me what clocks they are getting with their 1080 G1 Gaming? If you set it to OC mode in the Xtreme utility, what clocks does it get to in a game?
Trying to make some things clear, as I will be buying one next week. On their site it says 1860MHz in OC mode, but I think that's before GPU Boost 3.0 kicks in.

and on the 1080 strix they are saying 1936mhz, but i think that's AFTER gpu boost 3.0.


----------



## Dirgeth

Quote:


> Originally Posted by *DjDaffyDuck*
> 
> and on the 1080 strix they are saying 1936mhz, but i think that's AFTER gpu boost 3.0.


First time run Valley bench on my Strix was 2050MHz boost .. all default settings.. after 20min boost drops to 1976MHz...


----------



## ChevChelios

From what I've seen you will get an actual sustained boost clock of ~1950+ on the G1, maybe even close to/at 2000 .. in OC mode out of the box

Manual overclocking will get you +50/100 MHz on top of that

Also, I actually thought of getting the MSI 1080 Gaming X instead of the G1 (the G1 will still need 2+ weeks to get here), but so far the only Gaming X cards available to me are +*140/150 EUR* more than the G1 ... insane, either MSI or my local shops have gone nuts

The 700 EUR for the G1 is steep but manageable; 840/850 EUR for the Gaming X is just nuts


----------



## traxtech

I'm still confused on how to set up voltage per clock or whatever this new system is


----------



## ChevChelios

Quote:


> Originally Posted by *traxtech*
> 
> I'm still confused on how to set up voltage per clock or whatever this new system is


http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,1.html
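The gist of that guide: with GPU Boost 3.0, instead of one flat core offset you can give each voltage point on the V/F curve its own frequency offset (the Ctrl+F curve editor in Afterburner 4.3). A toy sketch of the idea — all the voltage and clock numbers here are made up for illustration, not taken from any real card's curve:

```python
# Illustrative model of per-point offsets on a GPU Boost 3.0 V/F curve.
# voltage (V) -> stock boost clock (MHz); numbers are invented examples.
base_curve = {
    0.800: 1620, 0.900: 1750, 1.000: 1880, 1.062: 1940, 1.093: 1974,
}

def apply_offsets(curve, per_point, flat=0):
    """Per-point offsets override the flat offset where given."""
    return {v: mhz + per_point.get(v, flat) for v, mhz in curve.items()}

# e.g. push the top of the curve harder than the bottom
oc = apply_offsets(base_curve, {1.062: 120, 1.093: 130}, flat=50)
print(oc[1.093])  # 1974 + 130 = 2104
print(oc[0.800])  # 1620 + 50 = 1670
```

That is why a curve OC can hold a higher clock at the max voltage point while staying modest at lower voltages, instead of shifting the whole curve up by one number.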


----------



## pez

Quote:


> Originally Posted by *twelvie*
> 
> I bought a G1 1080 which I'm very happy with - having an issue with one of the fans, though. Keep in mind this only seems to occur when it's on automatic and is stopping and starting with the "fan stop."
> 
> Basically the far right fan when spinning up is making quite a loud clicking or grinding sound, even in idle every now and again it will do it, the far left fan also spins (without noise) and the middle fan remains idle. It seems like it only makes the noise when running at a very low speed.
> 
> I've made a custom fan curve which seems to have remedied the problem, although would like to know whether this is common with cards with this "fan stop" feature? Still unsure whether I will need to RMA the card.


Maybe uninstall the card and use some canned air to blow the card out a bit. Try to get close enough to the motor/bearing area of the fan, but not so much that you're going to dry out the oil. Try that, and then a couple of runs at variable speeds that allow it to get to 100%.


----------



## xTesla1856

Tempted to ditch my Furys for a 1080 FTW. But I'd lose FreeSync....


----------



## Dirgeth

I changed the thermal paste on my Asus Strix to MX-4.

Went from 80C to 74C in Overwatch at Epic settings.
Valley bench: from 74C to 70C.


----------



## TK421

Quote:


> Originally Posted by *Dirgeth*
> 
> i was change thermal phaste on my Asus Strix for MX-4.
> 
> And from 80cels. to 74cels. in Overwatch Epic settings
> Valley bench from 74cels. to 70cels.


Bad heatpipe design again; only two make proper contact while the rest don't, or only very slightly


----------



## fat4l

Quote:


> Originally Posted by *TK421*
> 
> bad heatpipe design again, only 2 makes proper contact whilst the rest doesn't or only very slightly


People tell them over and over and they still do the same thing. Asus doesn't listen..... They could use a heatspreader like MSI so all the heatpipes would transfer heat.
But I guess as chips get bigger, more heatpipes will be touching the core..


----------



## TK421

Quote:


> Originally Posted by *fat4l*
> 
> ppl tell them over and over and they still do the same thing. Asus doenst listen to ppl ..... They could use heatspreader like MSI so all heatlipes would transfer the heat...
> But I guess as chips get bigger, more heatpipes wil lbe touching the core..


save money I guess

Now is 57 in top 100

http://www.3dmark.com/fs/8768084


----------



## zGunBLADEz

Paid 39 bucks for a 980 Ti EVGA hybrid kit and did the mod; it doesn't even look like I have it, in my case lol

90F ambient (32C), yeah it's hot and humid today in Chicago...

It's topping out around 49C; give or take 3C less once the thermal paste sets


----------



## Jquala

Quote:


> Originally Posted by *Jpmboy*
> 
> The voltage is coupled to a clock bin in bios and vis-versa... synching them manually is problematic. Just try the cntrl_f function then grab the far right point, with the cntrl pressed move the curve up to the offset freq or abs freq you want to try. I'm running valley at 2200+ and Heaven at 2184 (it is more of a power limit thing with Heaven. valley barely hits the PL)


How are you keeping it at 10-17c? Are you benching in an igloo? Even with a waterblock and setting my rig up in a fridge i don't think I can get my cards that cool outside of LN2


----------



## TK421

Quote:


> Originally Posted by *Jquala*
> 
> How are you keeping it at 10-17c? Are you benching in an igloo? Even with a waterblock and setting my rig up in a fridge i don't think I can get my cards that cool outside of LN2


water chiller and big reservoir maybe


----------



## Visceral

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Paid 39 bucks for a 980ti evga hybrid kit did the mod it dont even look it have it, on my case lol
> 
> 
> 
> 90F ambient (32C) yeah its hot and humid today in chicago...
> 
> its topping out around 49C i give it or take like 3c less once the thermal paste sets


Been considering this. Any difference in OC? Able to reach higher speeds or more consistent boost?


----------



## downforce

http://www.3dmark.com/3dm/12424923

Got a good GTX1080.


----------



## Jquala

Quote:


> Originally Posted by *TK421*
> 
> water chiller and big reservoir maybe


Damn it I forgot koolance makes those lol. I thought it was just a regular open bench watercool set up from his previous pics.


----------



## Neon Lights

Quote:


> Originally Posted by *Jpmboy*
> 
> Ghetto water cooling.


I'm doing the same thing, haha!

I have the shunt mod on mine, do you too?


----------



## Outcasst

Has anybody replaced the stock TIM on the Founder's Edition yet? Does it yield good results?


----------



## zGunBLADEz

Quote:


> Originally Posted by *Visceral*
> 
> Been considering this. Any difference in OC? Able to reach higher speeds or more consistent boost?


2088-2101 stable, except when changing scenes in some parts of Heaven it drops to 2050ish. I kind of hate boost and want it disabled... But pretty stable so far.

Don't mind the GPU clock scale: where it shows 1000 it's actually 2000 on the GPU clocks (in this case 2100 max), as the program doesn't read the GPU speed correctly for the scale, but the graph itself is accurate.

95F ambient in the house today lol, making the apt hot today in Chicago lol


----------



## Arizonian

Ok looks like Stayoshi, our GPU editor came by and gave the club [Official] status.









Thank you Overk1ll for starting it, OP looks great.

Enjoy your 1080's!!!!!


----------



## Asus11

Hey guys, just wondered if you can set 1.25v on the card, because mine seems to be maxing out at 1.081v


----------



## zGunBLADEz

Quote:


> Originally Posted by *Asus11*
> 
> hey guys just wondered if you can set 1.25v on the card because mine seems to be only maxin out on 1.081v


nope


----------



## Asus11

Quote:


> Originally Posted by *zGunBLADEz*
> 
> nope


is 1.081v the max then?


----------



## zGunBLADEz

Quote:


> Originally Posted by *Asus11*
> 
> is 1.081v the max then?


i have seen 1.094 on mine


----------



## Dirgeth

Hey! Just used CLU on my Strix.

The MX-4 around the die is for protection.

And temps are awesome.. from 74C in Valley with the stock Asus thermal paste

TO THIS

Also my card now holds 1.093v for a long time.. and doesn't drop it


----------



## looniam

congrats to you guys!















Quote:


> Originally Posted by *Arizonian*
> 
> Ok looks like Stayoshi, our GPU editor came by and gave the club [Official] status.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you Overk1ll for starting it, OP looks great.
> 
> Enjoy your 1080's!!!!!


i look forward to more lurking and enjoying what you folks do to your cards.

go at it!


----------



## TK421

Quote:


> Originally Posted by *Dirgeth*
> 
> Hey! just used CLU on my Strix
> 
> The MX-4 phase around die is for protection
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And temps are awesome.. from stock Asus thermal phase in valley 74cels.
> 
> TO THIS
> 
> 
> Also my card now hold 1.093v for long time.. and didnt drop it


You can always use electrical tape instead of paste like that...

What's funny is that you can use an AIO cooler and get an even better result.

Mine maxes out at 52C when playing Witcher 3, 50C with the Heaven extreme preset


----------



## Scrimstar

is EVGA hybrid and Classified expected to be out July?


----------



## floodo1

Quote:


> Originally Posted by *Jquala*
> 
> How are you keeping it at 10-17c? Are you benching in an igloo? Even with a waterblock and setting my rig up in a fridge i don't think I can get my cards that cool outside of LN2


Aquarium chiller. Pretty sure there are plenty of photos of his rig if you look around.


----------



## Bishop07764

Quote:


> Originally Posted by *Scrimstar*
> 
> is EVGA hybrid and Classified expected to be out July?


Wondering the same about the Classified. The announcement of EK blocks for it has me rather interested. My 780 Lightning is beginning to show its limitations with just 3GB of VRAM in some of the latest games. Would love to see a custom BIOS for the Classified as well; my Lightning has a 287% power limit.

I would love to see what could be done with upping the voltage and power limit.


----------



## blahtibla

Just ordered the MSI Aero with an EK block and backplate. It was the cheapest card I could get with a 3-year warranty..

Do people see higher clocks on water, or is it like Maxwell, minimal gains?


----------



## VSG

Where did you find the MSI Aero for sale?


----------



## Neon Lights

Quote:


> Originally Posted by *blahtibla*
> 
> Just ordered the MSI aero with EK block and backplate. It was the cheapest card I could get with a 3 year warranty..
> 
> Does people see higher clocks on water, or is it like maxwell, minimal gains?


What do you do if the MSI Aero has a "Warranty Void If Removed" sticker on it, which is the case for custom MSI cards? Then again I had a 7970 OC (kind of an equivalent of a 1080 Aero) which did not have one. If it has one, will you try to carefully remove it? I have managed to do so a few times with a sharp knife.


----------



## sok0

MSI does not void the warranty if you remove the sticker.


----------



## Benjiw

Quote:


> Originally Posted by *blahtibla*
> 
> Just ordered the MSI aero with EK block and backplate. It was the cheapest card I could get with a 3 year warranty..
> 
> Does people see higher clocks on water, or is it like maxwell, minimal gains?


Wouldn't say maxwell saw minimal gains going from air to water from experience. The cooler the core and the vram the better.


----------



## Jpmboy

Quote:


> Originally Posted by *Neon Lights*
> 
> I'm doing the same thing, haha!
> 
> I have the shunt mod on mine, do you too?


Shunt mod?

But yeah, setting up more than one card with uniblocks can get a little "loopy" :


4x420 rad before aquarium chiller. I recently managed to come across a Koolance EXC-800 chiller... should be running that in a few days on a different rig.











Quote:


> Originally Posted by *Benjiw*
> 
> From experience, I wouldn't say Maxwell saw minimal gains going from air to water. The cooler the core and the VRAM, the better.


This is absolutely true!


----------



## blahtibla

The Aero is up for preorder on overclockers.co.uk.

MSI warranty shouldn't be a problem https://forum-en.msi.com/index.php?topic=174490.0


----------



## sherlock

Day 2 with my FE is all smooth sailing.

Got EVGA Precision XOC 6.01 installed to mod the fan curve; temps have yet to go above 74°C with the new curve.
Noticed some power/perf capping; setting the power slider to 120% in Precision XOC fixed that.
Enabled Fast Sync in the control panel; works great in Armored Warfare at 4K, holds a comfortable 60 FPS at Ultra and only uses 1823MHz max.
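For anyone new to modding the curve: it's just a set of (temp, fan %) points that the software interpolates between. A minimal sketch of that interpolation (the points below are made up for illustration, not my actual curve):

```python
# Piecewise-linear fan curve of the kind set in Precision XOC / Afterburner.
# CURVE points are illustrative assumptions.

CURVE = [(30, 30), (50, 45), (65, 70), (74, 100)]  # (°C, fan %)

def fan_percent(temp_c):
    """Linearly interpolate fan speed between curve points, clamped at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # somewhere between 70% and 100%
```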


----------



## Menta

Can anyone with a Strix comment on how loud the card is and its temperatures at 70% fan?

My MSI X 1080 is silent with those settings, gaming 8 hours [email protected] around 69°C, max 71°C sometimes; seems pretty good to me.


----------



## Shaded War

Been waiting forever for a custom 1080 to come in stock. Got my order in for the Gigabyte G1 at $650, then Newegg cancelled it on me. Looks like I'll be waiting another month to try and catch it in stock again.


----------



## Benjiw

I won't be rocking a 1080 any time soon, sadly. I need to pay rent and get a job at the same time or I'll be homeless lol, so I'll have to wait; the custom PCBs or even the Ti models will probably be out by the time I can grab one.


----------



## TK421

Quote:


> Originally Posted by *Benjiw*
> 
> I won't be rocking a 1080 any time soon, sadly. I need to pay rent and get a job at the same time or I'll be homeless lol, so I'll have to wait; the custom PCBs or even the Ti models will probably be out by the time I can grab one.


how much does a wife and kid sell for in the black market?


----------



## SynchroSCP

Quote:


> Originally Posted by *Jpmboy*
> 
> Shunt mod?


I think that's where CL Ultra is used to cover the 2-milliohm shunt resistors to lower their effective resistance, raising the power limit.




Bad picture, but the Thermosphere is working great on this card as well; GPU and VRM temps are in the 40s. Have a full block coming, but this works really well. May send it back and just use this until the Ti Classy comes out.


----------



## Shiftstealth

Quote:


> Originally Posted by *TK421*
> 
> how much does a wife and kid sell for in the black market?


Depends on how useful the wife is


----------



## TK421

Quote:


> Originally Posted by *SynchroSCP*
> 
> I think that's where CL Ultra is used to cover the 2-milliohm shunt resistors to lower their effective resistance, raising the power limit.
> 
> 
> 
> 
> Bad picture, but the Thermosphere is working great on this card as well; GPU and VRM temps are in the 40s. Have a full block coming, but this works really well. May send it back and just use this until the Ti Classy comes out.


what mod is required to fit the thermosphere to the EVGA SC card?


----------



## SynchroSCP

Quote:


> Originally Posted by *Shiftstealth*
> 
> Depends on how useful the wife is


Or if you're willing to sell organs rather than a whole person








Quote:


> Originally Posted by *TK421*
> 
> what mod is required to fit the thermosphere to the EVGA SC card?


Just have to cut the mounting bracket so it fits over the tabs of the baseplate. Back when the 980 ACX came out, people were cutting the tabs off, so I came up with this as a workaround, and I'm surprised at how well it works as long as there's enough airflow to remove heat from the baseplate.


----------



## TK421

Quote:


> Originally Posted by *SynchroSCP*
> 
> Or if you're willing to sell organs rather than a whole person
> 
> 
> 
> 
> 
> 
> 
> 
> Just have to cut the mounting bracket so it fits over the tabs of the baseplate. Back when the 980 ACX came out, people were cutting the tabs off, so I came up with this as a workaround, and I'm surprised at how well it works as long as there's enough airflow to remove heat from the baseplate.


How large is the cutout on the SC version's baseplate?

I'd imagine that on the FE reference edition you'd have to cut the baseplate to mount a Thermosphere.



Regarding the cutting mod, I researched the original picture of a thermosphere



Wouldn't the "mod" take away the mounting pressure and some structural integrity of the waterblock?


----------



## Neon Lights

Quote:


> Originally Posted by *Jpmboy*
> 
> Shunt mod?


It's where you put liquid metal thermal paste on the three shunt resistors; that raises the available power by 20%-30%.
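For anyone wondering why smearing liquid metal over the shunts raises the limit, here's a rough back-of-the-envelope sketch. The controller estimates current from the voltage drop across the shunt assuming stock resistance, so a parallel conductive path makes it under-read power by the resistance ratio. All values below are illustrative assumptions, not measurements from a real 1080:

```python
# Why lowering shunt resistance raises the effective power limit (sketch).

R_SHUNT = 0.002        # stock shunt resistance, ohms (2 milliohms, per the post)
POWER_LIMIT_W = 216.0  # assumed enforced limit: 180 W reference TDP * 120%

def effective_limit(r_parallel):
    """Real power the card can draw before the controller *thinks* it hit the limit.

    The controller computes I = V_shunt / R_SHUNT. A parallel path r_parallel
    lowers the true resistance, so V_shunt shrinks for the same current and
    the controller under-reads power by R_SHUNT / r_true.
    """
    r_true = (R_SHUNT * r_parallel) / (R_SHUNT + r_parallel)
    return POWER_LIMIT_W * (R_SHUNT / r_true)

# A parallel path of ~8 milliohms gives roughly a 25% higher real limit:
print(round(effective_limit(0.008), 1))  # prints 270.0
```

The 20-30% figures quoted in the thread would correspond to a fairly high-resistance parallel path; a thick liquid-metal bridge lowers resistance much more, which is why results vary so widely.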


----------



## SynchroSCP

Quote:


> Originally Posted by *TK421*
> 
> How large is the cutout on the SC version's baseplate?
> 
> I'd imagine that the FE Reference edition would have to cut the baseplate to mount a thermosphere.
> 
> 
> 
> Regarding the cutting mod, I researched the original picture of a thermosphere
> 
> 
> 
> Wouldn't the "mod" take away the mounting pressure and some structural integrity of the waterblock?


Yeah, that's not going to work without taking off the baseplate... maybe the Supremacy GPU block, but even that's a stretch. Problem is the mounting holes with the baseplate are going to keep it from making contact. I don't see a way to make it work without taking the baseplate off; maybe someone else does. Your best option is probably a full-cover block. Don't cut the baseplate; I'd imagine that would void your warranty.

As far as mounting pressure and structural integrity with modding the Thermosphere, no, it still tightens up just the same... it's a metal bracket held down with metal screws... never had a problem securing it as designed.


----------



## nexxusty

Quote:


> Originally Posted by *TK421*
> 
> you can always use electrical tape instead of using paste like that...
> 
> and what's funny is that you can use AIO cooler and get even better result
> 
> mine maxes out at 52c when playing witcher 3, 50c with heaven extreme preset


Curious which bracket you are using for your AIO? Thinking of using my H110i for my 1080.


----------



## Neb9

Any idea when custom BIOSes will be available?


----------



## Hilpi234

http://www.3dmark.com/fs/8775139

http://www.3dmark.com/fs/8768338

http://www.3dmark.com/fs/8774921

Greetz from Germany


----------



## r0l4n

Quote:


> Originally Posted by *Hilpi234*
> 
> http://www.3dmark.com/fs/8775139
> 
> http://www.3dmark.com/fs/8768338
> 
> http://www.3dmark.com/fs/8774921
> 
> Greetz from Germany


Amazing scores, can you post details about your overclock, cooling, etc.?


----------



## Hilpi234

Only water, and no power target limit via hard mod...


----------



## Outcasst

So I went ahead and replaced the stock thermal paste with NT-H1 on the Founders Edition card. Amazing results: went from roughly 75°C to 69°C (100% fan speed) during a 3-hour Heaven run. Clocks are much more consistent now.


----------



## emett

^interesting. Massive drop.


----------



## r0l4n

Quote:


> Originally Posted by *Outcasst*
> 
> So I went ahead and replaced the stock thermal paste with NT-H1 on the Founders Edition card. Amazing results: went from roughly 75°C to 69°C (100% fan speed) during a 3-hour Heaven run. Clocks are much more consistent now.


What about using the default fan profile? Care to measure the difference? I was thinking of using CLU in my FE.


----------



## Outcasst

Quote:


> Originally Posted by *r0l4n*
> 
> What about using the default fan profile? Care to measure the difference? I was thinking of using CLU in my FE.


Unfortunately I've never run the default fan profile so I don't have a baseline reading of what it was before.


----------



## Jpmboy

Quote:


> Originally Posted by *TK421*
> 
> how much does a wife and kid sell for in the black market?


Sell? Only experience I have is that it COSTS A HELLUVA lot of money to get rid of 'em.
Quote:


> Originally Posted by *SynchroSCP*
> 
> I think that's where CL Ultra is used to cover the 2-milliohm shunt resistors to lower their effective resistance, raising the power limit.
> 
> 
> 
> 
> Bad picture, but the Thermosphere is working great on this card as well; GPU and VRM temps are in the 40s. Have a full block coming, but this works really well. May send it back and just use this until the Ti Classy comes out.


ah.. I thought it was about a water loop shunt.








Quote:


> Originally Posted by *Neon Lights*
> 
> It's where you put liquid metal thermal paste on the three shunt resistors; that raises the available power by 20%-30%.


yeah... eg, a pencil mod on a 980.

_______________________________________________
*guys - if you run any of these benchmarks, don't forget to post it in the OCN threads* - more data, more better







:
http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30


----------



## r0l4n

Quote:


> Originally Posted by *r0l4n*
> 
> What about using the default fan profile? Care to measure the difference? I was thinking of using CLU in my FE.


It seems the blower is aluminium where it contacts the GPU die? If that's the case, no CLU this time around.


----------



## TK421

Quote:


> Originally Posted by *SynchroSCP*
> 
> Yeah, that's not going to work without taking off the baseplate... maybe the Supremacy GPU block, but even that's a stretch. Problem is the mounting holes with the baseplate are going to keep it from making contact. I don't see a way to make it work without taking the baseplate off; maybe someone else does. Your best option is probably a full-cover block. Don't cut the baseplate; I'd imagine that would void your warranty.
> 
> As far as mounting pressure and structural integrity with modding the Thermosphere, no, it still tightens up just the same... it's a metal bracket held down with metal screws... never had a problem securing it as designed.


That's a shame :|
I'll return the FE and hope that MC has a 1080 SC or something with an ACX cooler soon.

Quote:


> Originally Posted by *nexxusty*
> 
> Curious which bracket you are using for your AIO? Thinking of using my H110i for my 1080.


Not a bracket; it's the EVGA Hybrid cooler (see link in sig)


----------



## OverK1LL

Quote:


> Originally Posted by *Arizonian*
> 
> Ok looks like Stayoshi, our GPU editor came by and gave the club [Official] status.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you Overk1ll for starting it, OP looks great.
> 
> Enjoy your 1080's!!!!!


Sweet! Thank you for the compliment Arizonian


----------



## ondoy

My Ultra score; should I upgrade?


----------



## HMoneyGrip

Curious...

For those with the Asus Strix, do you think the RGB backplate will still work with a third party water block installed?


----------



## PasK1234Xw

Quote:


> Originally Posted by *nexxusty*
> 
> Curious which bracket you are using for your AIO? Thinking of using my H110i for my 1080.


Why not just wait for AIO kit for FE?


----------



## Asus11

Anyone been experiencing the card hitting the power limit way too easily?


----------



## PasK1234Xw

yup


----------



## Neon Lights

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah... eg, a pencil mod on a 980.


Sorry this is my very first Nvidia card, therefore I never followed anything about Nvidia cards before.


----------



## nexxusty

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Why not just wait for AIO kit for FE?


Waiting!?!?!?! Waiting!?!?!? No, that's a good idea actually.

Anyone know if there's one in the works? Seems pointless to do a full waterblock this time around.


----------



## skline00

Quote:


> Originally Posted by *nexxusty*
> 
> Waiting!?!?!?! Waiting!?!?!? No that's a good idea actually.
> 
> Anyone know if there is one in the works? Seems pointless to do full waterblock this time around.


Add me to the world of pointless, as I just received my EK GTX 1080 fullblock from PC Performance and will have a Zotac GTX 1080 FE shipping tomorrow from Newegg.

I already have a custom waterloop on both my 5960x rig and 4790k rig so my graphics card was going under water. I will watch to see if I have any "thermal throttling".

I intend to run the GTX 1080 in my 5960x rig and replace the 2 R9 290s with the single GTX 980TI SC.


----------



## nexxusty

Quote:


> Originally Posted by *skline00*
> 
> Add me to the world of pointless, as I just received my EK GTX 1080 fullblock from PC Performance and will have a Zotac GTX 1080 FE shipping tomorrow from Newegg.
> 
> I already have a custom waterloop on both my 5960x rig and 4790k rig so my graphics card was going under water. I will watch to see if I have any "thermal throttling".


You won't. AIOs do 2100MHz+ without throttling, so a full-cover block will do the same easily.


----------



## TK421

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Why not just wait for AIO kit for FE?


see my sig


----------



## Menta

Not very good quality, but someone asked me to make a video of the fan noise at 70%, and it's very silent. Maybe I'll try to turn off the case fans and make a better video later.


----------



## PasK1234Xw

Quote:


> Originally Posted by *nexxusty*
> 
> Waiting!?!?!?! Waiting!?!?!? No that's a good idea actually.
> 
> Anyone know if there is one in the works? Seems pointless to do full waterblock this time around.


http://forums.evga.com/Founders-edition-hybrid-cooler-kit-will-it-happen-m2485313-p2.aspx#2488947
Quote:


> Originally Posted by *TK421*
> 
> see my sig


I know but they are making one for FE


----------



## zGunBLADEz

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 2088-2101 stable, except when changing scenes in some parts of Heaven it drops to 2050ish. Kind of hate Boost and I want it disabled... But pretty stable so far.
> 
> Don't mind the GPU clock scale: where it reads 1000 the actual GPU clock is 2000 (in this case 2100 max), as the program doesn't read the GPU speed correctly on the scale, but the graph is accurate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 95°F AMBIENT in DA HOUSE today lol, making the apt hot today in Chicago lol
> 
> 


Ok, now it's more reasonably cool than yesterday. Yesterday was HORRIBLE for testing a new cooler, no? lol

Here with 80ish ambient temps, same Heaven 4K test at the same settings




Hybrid mod btw


----------



## Asus11

Hate the power limits these cards have; think I might sell and go back to Maxwell.


----------



## nexxusty

Quote:


> Originally Posted by *Asus11*
> 
> Hate the power limits these cards have; think I might sell and go back to Maxwell.


Shunt resistor mod...


----------



## Dirgeth

2202/11200MHz on the Strix cooler with CLU thermal paste + 100% fan speed

25175 GPU score
http://www.3dmark.com/3dm/12448763?


----------



## Jpmboy

Quote:


> Originally Posted by *ondoy*
> 
> My Ultra score; should I upgrade?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


see for yourself: http://www.overclock.net/t/1518806/fire-strike-ultra-top-30/0_20


----------



## TK421

Quote:


> Originally Posted by *nexxusty*
> 
> Shunt resistor mod...


I'd say just wait until the BIOS mod tool comes out.


----------



## USlatin

I am very curious to see how it will turn out when people try to make a Hybrid out of one of these:

http://www.evga.com/Products/Product.aspx?pn=08G-P4-5180-KR


https://us.msi.com/Graphics-card/GeForce-GTX-1080-AERO-8G-OC.html#hero-overview









It would be the best of all worlds: temps, exhausting hot air, quiet, overclocking stability, especially considering the ~2.1GHz limit, and you could get the whole thing for $650.
Quote:


> Originally Posted by *zGunBLADEz*
> 
> Ok, now it's more reasonably cool than yesterday. Yesterday was HORRIBLE for testing a new cooler, no? lol
> 
> Here with 80ish ambient temps, same Heaven 4K test at the same settings
> 
> Hybrid mod btw


Thanks for posting the extreme ambient charts. Rep+


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> see for yourself: http://www.overclock.net/t/1518806/fire-strike-ultra-top-30/0_20


I can see you are one of the editors of the list you've linked to, and therefore have a vested interest in promoting it, so I understand why you (as well as some others in this thread) continually reference 3DMark performance numbers for both Maxwell and Pascal whilst ignoring game benchmarks. If all you care about is 3DMark, fine; that's your business and you can do as you please. Most people buy video cards to play video games though, so a far more relevant performance metric for these people would be actual video games.

I've mentioned this already, but it seems to be necessary still: GP104 in the form of the GTX 1080 is on average 22.6% faster than GM200 in the form of the GTX 980 Ti when both cards are overclocked to the limits of their stock BIOS. BIOS mods, hard mods, and sub-ambient cooling results are not yet available for Pascal, so it's an unequal comparison when, for example, you post a link to a thread with 3DMark figures for many BIOS-modded (or otherwise) GM200-based cards and use this to suggest that GP104 is not as fast as GM200. The cards at the top of your list are all running around 1.7GHz core clock, which is much higher than can be achieved with a stock BIOS on GM200. I know; I had a 980 Ti under water with a modded BIOS and only managed a graphics score of 4858 in Fire Strike Ultra at a 1.55GHz core clock. In other words, you're presenting cherry-picked data.

For reference (once again) here is a summary of the performance of several GM204, GM200, and GP104 based products across 9 popular games as tested by Overclockers Club: https://docs.google.com/spreadsheets/d/10mwNtNsQXNJCjHQPtE5K636f-rXjZTRrmHszOUWyI6Y/edit?usp=sharing
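To be explicit about how a figure like that is computed: it's just the mean of the per-game speedups. A quick sketch with placeholder numbers (made up for illustration, not the actual Overclockers Club data):

```python
# Deriving an "on average X% faster" figure from per-game FPS results.
# FPS values below are illustrative placeholders.

fps_980ti = {"Game A": 80.0, "Game B": 55.0, "Game C": 100.0}
fps_1080  = {"Game A": 98.0, "Game B": 68.0, "Game C": 122.0}

def mean_speedup(base, new):
    """Arithmetic mean of per-game speedup ratios, expressed as a percentage."""
    ratios = [new[g] / base[g] for g in base]
    return (sum(ratios) / len(ratios) - 1) * 100

print(f"{mean_speedup(fps_980ti, fps_1080):.1f}% faster on average")
```

Averaging per-game ratios (rather than averaging raw FPS) keeps a single high-FPS title from dominating the result.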


----------



## sherlock

Day 3 with my 1080 FE



Finally put my big boy pants on and dialed up Afterburner 4.3 beta 4
+100% Voltage / +120% Power / +125 Core / +400MHz

Heaven 4.0: 69.7 FPS at 2560x1440

Peak boost: 2025MHz
Average: 1974MHz
Min: 1948-1962MHz

Temp is 73°C with a custom fan curve.

Pretty happy with what I got, and I'll run this in games for a week before going further (based on results here, I might have another +95 left on the core).
Quote:


> Originally Posted by *PasK1234Xw*
> 
> Why not just wait for AIO kit for FE?


If people liked waiting, they wouldn't be owning an FE right now, would they?


----------



## Benjiw

Quote:


> Originally Posted by *Asus11*
> 
> Hate the power limits these cards have; think I might sell and go back to Maxwell.


Shunt mod until BIOS modding has been explored more. Not sure why you're willing to downgrade when you can remove your power issues with some liquid metal thermal compound or a silver pencil, or give it a few months and there will be BIOS modding.


----------



## FattysGoneWild

Moving the voltage to 100%, or even messing with it at all, is a complete waste, yes? I have read that the card will do what it wants regardless of the set voltage, even cranked to 100%.


----------



## TK421

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Moving the voltage to 100% or even messing with it at all is a complete waste yes? I have read the card will do what it wants to regardless of set voltage. Even cranked to 100%.


yep

most likely you're going to run out of TDP before you run out of the (very small) voltage-scaling benefit
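Rough math behind that: dynamic power scales roughly with f·V², so even a small voltage bump multiplies power draw and eats the TDP headroom before the voltage can help clocks. A sketch with assumed reference values (the clocks, voltages, and wattages below are illustrative, not measured):

```python
# Why extra voltage eats TDP headroom: dynamic power ~ f * V^2 rule of thumb.
# All constants are illustrative assumptions for a GTX 1080-class card.

BASE_CLOCK_MHZ = 1733   # reference boost clock
BASE_VOLTAGE_V = 1.062  # assumed typical boost voltage
BASE_POWER_W = 180.0    # reference board power

def est_power(clock_mhz, voltage_v):
    """Scale board power by the f*V^2 rule of thumb (ignores static leakage)."""
    return BASE_POWER_W * (clock_mhz / BASE_CLOCK_MHZ) * (voltage_v / BASE_VOLTAGE_V) ** 2

# A ~17% clock bump plus a small voltage bump already exceeds a 216 W (120%) limit:
print(round(est_power(2025, 1.093), 1))  # prints 222.8
```

So the card ends up clock-limited by the power cap, not by how much voltage the slider allows.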


----------



## Bogga

Today I'll get my two Strix cards. At home I've got one of those old wiggly SLI bridges that came with the mobo. Would it make any difference if I were to get ahold of another one while I wait for the HB bridge?


----------



## TK421

Quote:


> Originally Posted by *Bogga*
> 
> Today I'll get my two strix. At home I've got one of those old wiggly sli-bridges that came with the mobo. Would it make any difference if I was to get ahold of another one while I wait for the HB-bridge?


Using two SLI bridges will supposedly have the same effect as an HB SLI bridge


----------



## Bogga

Quote:


> Originally Posted by *TK421*
> 
> Using two SLI bridges will supposedly have the same effect as an HB SLI bridge


So buying another one and going with two will make a difference over just one?


----------



## TK421

Quote:


> Originally Posted by *Bogga*
> 
> Som buying another one and going with two will make a difference over just one?


It's speculation; I read it somewhere in this thread but can't confirm whether it's true or not


----------



## MrDerrikk

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bogga*
> 
> Today I'll get my two strix. At home I've got one of those old wiggly sli-bridges that came with the mobo. Would it make any difference if I was to get ahold of another one while I wait for the HB-bridge?
> 
> 
> 
> Using two SLI bridges will supposedly have the same effect as an HB SLI bridge

I'm pretty sure this is incorrect, Gamers Nexus did a test with this and apparently it just caused artifacting with no gain. Their SLI 1070 video recently mentions the reasons behind it, I forget exactly what.


----------



## TK421

Quote:


> Originally Posted by *MrDerrikk*
> 
> I'm pretty sure this is incorrect, Gamers Nexus did a test with this and apparently it just caused artifacting with no gain. Their SLI 1070 video recently mentions the reasons behind it, I forget exactly what.


welp

time to fork out more money to get the HB bridge then


----------



## ChevChelios

Anyone tried Afterburner 4.3.0 beta 4 with the G1 Gaming 1080? Any issues? We had conflicting reports in here earlier: some said it should work fine, some said the G1 has issues with the Afterburner beta.


----------



## sebna

Quote:


> Originally Posted by *Menta*
> 
> 
> 
> 
> 
> Not very good quality, but someone asked me to make a video of the fan noise at 70%, and it's very silent. Maybe I'll try to turn off the case fans and make a better video later.


Many thanks for posting this. Much appreciated. Gives us undecided folks an idea of what to expect


----------



## Jpmboy

Quote:


> Originally Posted by *techguymaxc*
> 
> I can see you are one of the editors of the list you've linked to and therefore have a vested interest in promoting it, so I understand why you (as well as some others in this thread) continually reference 3dmark performance numbers of both Maxwell and Pascal whilst ignoring game benchmarks. If all you care about is 3dmark, fine, that's your business and you can do as you please. Most people buy video cards to play video games though, so a far more relevant performance metric for these people would be actual video games. I've mentioned this already, but it seems to be necessary still. GP104 in the form of GTX 1080 is on average 22.6% faster than GM200 in the form of GTX 980 Ti when both cards are overclocked to the limits of their stock BIOS. BIOS mods, hard mods, and sub-ambient cooling results are not yet available for Pascal so it's an unequal comparison for example when you post a link to a thread with 3dmark performance figures for many BIOS-modded (or otherwise) GM200-based cards and use this to suggest that GP104 is not as fast as GM200. The cards at the top of your list are all running around 1.7GHz core clock which is much higher than can be achieved with a stock BIOS on GM200. I know, I had a 980 Ti under water with a modded BIOS and only managed a graphics score of 4858 in Fire Strike Ultra with a 1.55GHz core clock. In other words, you're presenting cherry-picked data.
> 
> For reference (once again) here is a summary of the performance of several GM204, GM200, and GP104 based products across 9 popular games as tested by Overclockers Club: https://docs.google.com/spreadsheets/d/10mwNtNsQXNJCjHQPtE5K636f-rXjZTRrmHszOUWyI6Y/edit?usp=sharing


lol - "vested interest"? In what... updating a Google spreadsheet? No.
I am interested in looking at the aggregate data across GPU generations; this is very predictive of overall performance in an "upgrade", whether gaming, benching, or whatever (and IMO is more reliable than the majority of poorly coded "built-in" gaming benchmarks, which are much more inconsistent over time). It is always up to the user of any data to discard the outliers (highs and lows) if appropriate to the analysis, as with any use of open-field datasets.
For the FM threads, drivers, BIOS, and clocks are in the screenshots.

Nevertheless, one can always use the hardware review sites to find the best BF4 cards for example.

No.. no _vested interest_ in a particular OCN thread.


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - "vested interest"? In what... updating a google spread sheet? - No.
> I am interested in looking at the aggregate data across gpu generations - this is very predictive of overall performance in an "upgrade" whether gaming, benching or what ever (and IMO are more reliable than the majority of poorly coded "built-in" gaming benchmarks, which are much more inconsistent over time). It is always up to the user of any data to discard the outliers (highs and lows) if appropriate to the analysis - for any use of open field datasets.
> For the FM threads, drivers, bios and clocks are in the screenshots.
> 
> Nevertheless, one can always use the hardware review sites to find the best BF4 cards for example.
> 
> No.. no _vested interest_ in a particular OCN thread.


If you invest time into creating, maintaining, and promoting a project, you have some amount of ego wrapped up in it. That is your vested interest. The fact that you're attempting to discredit actual game benchmarks in favor of a single synthetic benchmark is beyond absurd. What does "poorly coded" even mean in this context? Did you even bother reading either of the links I posted? 9 games tested across 3 resolutions, with data for the 970, 980, 980 Ti, Titan X, 1070, and 1080, all overclocked.

I've presented you with lots of information; it's up to you to decide what to do with it. If you only care about 3DMark, that's fine; you can use your time how you wish. But if you're going to ignore all the other data on these cards, base your opinion of them off of a single data point, then go around spreading your poorly developed, misinformed opinion, I'm going to call you out on it.


----------



## ilgello

Did anyone do any testing before and after Shunt Mod + Overclock ?


----------



## Jpmboy

Quote:


> Originally Posted by *techguymaxc*
> 
> If you invest time into creating, maintaining, and promoting a project you have some amount of ego wrapped up in it. That is your vested interest. The fact that you're attempting to discredit actual game benchmarks in favor of a single synthetic benchmark is beyond absurd. What does "poorly coded" even mean in this context? Did you even bother reading either of the links I posted? 9 games tested across 3 resolutions with data for 970, 980, 980 Ti, Titan X, 1070, and 1080 - all overclocked.
> 
> I've presented you with lots of information, its up to you to decide what to do with it. If you only care about 3dmark - that's fine, you can use your time how you wish. If you're going to ignore all the other data on these cards and base your opinion of them off of a single data point then go around spreading your poorly developed, misinformed opinion, I'm going to call you out on it.


dude - it's not a discussion point for this thread, and your "know-it-all" tone, presuming to understand why someone would make or take the responsibility to maintain several OCN benchmark threads, is obviously beyond your thinking. EOD.


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> dude - It's not a discussion point for this thread. and your "know it all" tone presuming to understand why someone would make or take the responsibility to maintain several OCN Benchmark threads is obviously beyond your thinking. EOD.


I'm sorry if I've personally offended you by my choice of words. The underlying facts I am conveying are 100% accurate. I have shared information and presented an analysis. If you don't like my presentation then disregard it but don't throw out the data because you don't like the way the presenter is presenting it.


----------



## pez

Quote:


> Originally Posted by *techguymaxc*
> 
> I'm sorry if I've personally offended you by my choice of words. The underlying facts I am conveying are 100% accurate. I have shared information and presented an analysis. If you don't like my presentation then disregard it but don't throw out the data because you don't like the way the presenter is presenting it.


Ummmmm....

The guy he posted the screenshot for was asking if they should upgrade based on a 3DMark Firestrike Ultra score; therefore validating his reasoning for posting that screenshot. In fact, I'm not sure how much more relevant he could have made that post.

Secondly, if you want to promote that someone not upgrade based on those results alone, just say so. No need to try and analyze people's 'vested interests' to do so. Say your piece and be done with it....


----------



## techguymaxc

Quote:


> Originally Posted by *pez*
> 
> Ummmmm....
> 
> The guy he posted the screenshot for was asking if they should upgrade based on a 3DMark Firestrike Ultra score; therefore validating his reasoning for posting that screenshot. In fact, I'm not sure how much more relevant he could have made that post.
> 
> Secondly, if you want to promote that someone not upgrade based on those results alone, just say so. No need to try and analyze people's 'vested interests' to do so. Say your piece and be done with it....


I apologized, no need to beat a dead horse. I see what you are saying, that his post was a relevant response. Allow me to explain my motivation for posting what I posted. I've seen a number of posts in this thread from potential upgraders, or people who've already upgraded, lamenting seemingly poor 3DMark scores and claiming something along the lines of "the 1080 is only 10% faster" based on 3DMark scores alone. My purpose was to show these people that the 1080 is a good deal faster than that. If some poor noob comes along, keeps seeing "1080 only 10% faster" over and over, and draws the conclusion that it's not an upgrade, when in fact the 1080 offers on average 22.6% more performance than the 980 Ti, they could be missing out. Maybe that 22.6% is all they need to play their favorite game at higher settings, or to finally get X FPS in a certain title and make their game-playing experience that much more enjoyable. I'm not here to attack anyone. I realize that my choice of words was poor and have apologized for it; not sure what else I could do at this point.


----------



## pez

Quote:


> Originally Posted by *techguymaxc*
> 
> I apologized, no need to beat a dead horse. I see what you are saying that his post was a relevant response. Allow me to explain my motivation for posting what I posted. I've seen a number of posts in this thread with potential upgraders/people who've already upgraded lamenting seemingly poor 3dmark scores and claiming something along the lines of "1080 is only 10% faster" based on 3dmark scores alone. My purpose then was to show these people that 1080 is a good deal faster than that. If some poor noob comes along and keeps seeing "1080 only 10% faster" over and over then draws the conclusion that it's not an upgrade, when in fact 1080 offers on average 22.6% more performance than 980 Ti they could be missing out. Maybe that 22.6% is all they need to play their favorite game with higher settings or to finally get X FPS in a certain title and make their game playing experience that much more enjoyable. I'm not here to attack anyone, I realize that my choice of words was poor and have apologized for it, not sure what else I could do at this point.


Well, that would have been better to start out with.

However, this is OCN and also the owner's thread for this card. Generally, you're going to see a mix of people who are benchmark aficionados, and some who are gamers. I benchmark synthetically for useful comparisons and reviews (if and when), but in general I game. 'Tis the reason I went with the 1080. We're all here for one reason or another, but the main point is that we are all here (most of us) to enjoy our GTX 1080s.


----------



## alucardis666

OK, serious question... Where can I find a third-party vendor's card for the GTX 1080? Very interested in the MSI Sea Hawk especially, that or the Asus Strix. Someone please assist.

Thanks!


----------



## AllGamer

Quote:


> Originally Posted by *alucardis666*
> 
> Ok Serious question... Where can I find a 3rd party vendors card for the GTX 1080? Very interested in the MSI Sea Hawx Especially, that or the Asus Strix. Someone please assist.
> 
> Thanks!


Not available yet; I'm also looking forward to a Sea Hawk version.

EVGA mentioned on their forum that they will make an AIO kit for all Founders Editions.

So now we have confirmed that MSI and EVGA will make factory-OC cards with AIO cooling.


----------



## pez

I'm impressed with the G1 so far to say the least. Time to test it with a real game, though.

This is after a competitive match of CS:GO on Dust2. TL;DR 41% max fan and 64C. Guessing that clock reading on HWMonitor isn't too accurate, though.


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> I'm impressed with the G1 so far to say the least. Time to test it with a real game, though.
> 
> This is after a competitive match of CS:GO on Dust2. TL;DR 41% max fan and 64C. Guessing that clock reading on HWMonitor isn't too accurate, though.


Impressive!


----------



## DADDYDC650

Quote:


> Originally Posted by *alucardis666*
> 
> Impressive!


64C running CS:GO? Not that impressive, considering that game can be run on my watch.


----------



## pez

Quote:


> Originally Posted by *DADDYDC650*
> 
> 64c running CS GO? Not that impressive considering that game can be run on my watch.


It can also manage to push most GPUs to 99% usage, but what do I know, since you have some crazy watch.

Crysis (1):
Max temps 70C, fans get up to 52%, and the GPU clock remains steady at a minimum of 1835MHz. I have it in 'Gaming Mode' as I'm just curious to see how the card performs at 'stock'. HWMonitor's GPU clocks are def. not to be trusted, lol.


----------



## DADDYDC650

Quote:


> Originally Posted by *pez*
> 
> It can also manage to push most GPUs to 99% usage, but what do I know, since you have some crazy watch.
> 
> Crysis (1):
> Max temps 70C, fans get up to 52% and GPU clock remains steady at a minimum of 1835 MHz. I have it in 'Gaming Mode' as I'm just curious to see the card perform at 'stock'. HWMonitor's GPU clocks are def. not to be trusted lol.


Nice card, and it cools similarly to other AIB 1080s. CS:GO doesn't push the 1080 at all, but whatevs. Congrats!


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> It can also manage to push most GPUs to 99% usage, but what do I know, since you have some crazy watch.
> 
> Crysis (1):
> Max temps 70C, fans get up to 52% and GPU clock remains steady at a minimum of 1835 MHz. I have it in 'Gaming Mode' as I'm just curious to see the card perform at 'stock'. HWMonitor's GPU clocks are def. not to be trusted lol.


Right, that was my point. At 100% load, 64C ain't bad. My OG 980, with a pretty decent overclock on her, gets to 83C while playing Overwatch maxed out @ 1080p.


----------



## pez

Well, for those who actually care about game temps or whatever, I post them. Sue me for posting stuff that's on topic.

Rocket League (approximately 10 minutes of gameplay/two competitive matches):
64-65C, 45% fan, 1911MHz GPU clock.


EDIT:
Quote:


> Originally Posted by *alucardis666*
> 
> Right that was my point. With 100% load 64C ain't bad. My OG 980 with a pretty decent overclock on her gets to 83c while playing overwatch maxed out @ 1080p.


I agree... some people will never be happy, however. Doom is next.


----------



## ChevChelios

Quote:


> Originally Posted by *pez*
> 
> It can also manage to push most GPUs to 99% usage, but what do I know, since you have some crazy watch.
> 
> Crysis (1):
> Max temps 70C, fans get up to 52% and GPU clock remains steady at a minimum of 1835 MHz. I have it in 'Gaming Mode' as I'm just curious to see the card perform at 'stock'. HWMonitor's GPU clocks are def. not to be trusted lol.


Hmm, but that's 2175 RPM on the fans, and that's in stock Gaming mode...

In a manual OC it would get close to 2400-2500 RPM, no? Isn't that getting a bit loud?


----------



## nexxusty

Quote:


> Originally Posted by *DADDYDC650*
> 
> Nice card and cools similar to other AIB 1080's. CS GO doesn't push the 1080 at all but whatevs. Congrats!


The Source engine literally requires SGSSAA to put any kind of load on even a 980.


----------



## pez

Quote:


> Originally Posted by *ChevChelios*
> 
> Hmm, but that's 2175 RPM on the fans, and that's in stock Gaming mode...
> 
> In a manual OC it would get close to 2400-2500 RPM, no? Isn't that getting a bit loud?


No idea. I'm doing all this to ensure the card is stable at stock before I do any OC'ing. I game with headphones (open-back most of the time), so it takes a lot for me to notice. 55% matches my other fans at full tilt (maybe it's a tad louder), which are fairly decent in the noise department. However, I have a case with an acrylic window and the rear exhaust facing somewhat towards me. Sound affects people in different ways. However, the fans have not exceeded 52% just yet. I'm not claiming it's the best AIB, but I'm happy with it.

Also, interesting results. Because I chose 'Adaptive' sync, which works very well in this game, it pretty much didn't challenge the card. Settings are absolutely maxed (including Nightmare where available), and AA is at TXAA x8. I'm going to give this another go with Adaptive sync turned off.
Doom (2016):
60C, 37% fan, 1924MHz core clock.


----------



## dmasteR

http://www.babeltechreviews.com/geforce-368-39-brings-performance-gtx-1080/view-all/

368.39 seems to bring a tad more performance for your 1080/1070, and even the 980 Ti!


----------



## sebna

Pez is there any chance you could record and upload to YT noise it generates at different fan settings like 30, 40, 50, 70,80% for example?


----------



## pez

Quote:


> Originally Posted by *dmasteR*
> 
> http://www.babeltechreviews.com/geforce-368-39-brings-performance-gtx-1080/view-all/
> 
> 368.39 seems to bring a tad more performance for your 1080/1070, and even the 980 Ti!


Nice! Gonna check this out more in-depth in a minute.
Quote:


> Originally Posted by *sebna*
> 
> Any chance you could record noise it generates at different fan settings like 30, 40, 50, 70,80% for example?


It will be tomorrow before I can do it, as I'm just about to lie down for bed (3 hours late).

50% IMO is going to be most people's 'bearable' maximum. I don't feel 52% is that much different, however. Since I game with headphones, I'm not as picky about the noise, either, but I did take off my headphones during Doom just now and had no qualms. I will try to do some testing for ya tomorrow, but I'm limited to an iPhone microphone/camera (i.e. potential to be a potato).

Also, as said, Doom (2016) without Adaptive sync on.
Looks like about 68-69C, 1911MHz GPU clock, 48% fan speed, and definitely a lot more utilization. Less than ideal experience, as Adaptive sync makes the game feel super smooth for me (I don't have a G-Sync monitor). Nonetheless:


----------



## sebna

Hey, that would be perfect if you could do it. Thanks for that. I also have to go to sleep now, but I just unpacked the Phanteks Evolv ATX TG and cannot stop staring at it. It will be a good replacement for my Antec P180 V1. It will certainly be much easier to build in the Phanteks.

Have you tried silent mode? Does it keep OC clocks in silent mode, or does it drop them to lower values?

Cheers


----------



## pez

Quote:


> Originally Posted by *sebna*
> 
> Hey, that is perfect if you could do it. Thanks for that. I also have to go to sleep now, but just unpacked the Phanteks Evolv ATX TG and cannot stop staring at it. It will be a good replacement for my Antec P180 V1. Certainly will be much easier to build in the Phanteks.
> 
> Have you tried silent mode? Does it keep OC clocks in silent mode or drops them to lower values?
> 
> Cheers


Sound test went quicker than I thought, so it's currently uploading. I'll come back and post the link once it's fully published/uploaded.

Microphone on the iPhone actually picked up the fan noise well... more so than is actually perceived. However, it should be a decent enough result to show the differences in each increment. I'll have to give silent mode a try tomorrow to check it out.

Also, that case is going to be a HUGE upgrade. One of my first 'nice' cases was a P180 v1. Loved that thing despite its flaws. It's gonna be hard to sleep knowing you get to build in it tomorrow.

Edit:
Not quite up, but once it's processed, the link should remain the same:


----------



## gagac1971

Hey there, I have a GTX 1080 Founders Edition... I hope that EVGA will release a hydro kit for the 1080...


----------



## bfedorov11

Quote:


> Originally Posted by *dmasteR*
> 
> http://www.babeltechreviews.com/geforce-368-39-brings-performance-gtx-1080/view-all/
> 
> 368.39 seems to bring a tad more performance for your 1080/1070, and even the 980 Ti!


W10 only? Is that normal? Never noticed, since I'm on 8.

But, but... all the haters said there wouldn't be any increases with drivers...


----------



## bp7178

Quote:


> Originally Posted by *gagac1971*
> 
> hey there i have gtx 1080 founders edition...i hope that evga will relase hydro kit for 1080...


EK has full waterblocks for the GTX 1080 cards.


----------



## gagac1971

Quote:


> Originally Posted by *bp7178*
> 
> EK has full waterblocks for the GTX 1080 cards.


Nice, but I prefer the all-in-one hybrid cooler kit from EVGA... no need to drain or...


----------



## pez

Quote:


> Originally Posted by *bfedorov11*
> 
> w10 only? Is that normal? Never noticed since I'm on 8.
> 
> but but all the haters said there won't be any increases with drivers...


Unfortunately, there are some decreases in there. Shadows of Mordor was one that was noticeable and definitely fell outside of the 'margin of error'. However, there were some great improvements in there as well, Dying Light being one of them. One thing that I noticed in a couple of games was that while SLI 980s perform around the same and have some back-and-forth battles, the 1080 tends to see better minimums, which is a win in my book.


----------



## ChevChelios

Quote:


> Originally Posted by *pez*
> 
> Shadows of Mordor was one that was noticeable and definitely fell outside of the 'margin of error'.


from the article
Quote:


> The only real outlier we noted is with Shadows of Mordor which shows a significant decrease in performance for the GTX 1080, and we suspect that default Ultra was originally used instead of our current maximum settings.


----------



## pez

Quote:


> Originally Posted by *ChevChelios*
> 
> from the article


Ah, good catch. I didn't read too in-depth past the intro and then, of course, the graphs.


----------



## sebna

Quote:


> Originally Posted by *pez*
> 
> Sound test went quicker than I thought, so it's currently uploading. I'll come back and post the link once it's fully published/uploaded.
> 
> Microphone on the iPhone actually picked up the fan noise well... more so than is actually perceived. However, it should be a decent enough result to show the differences in each increment. I'll have to give silent mode a try tomorrow to check it out.
> 
> Also, that case is going to be a HUGE upgrade. One of my first 'nice' cases was a P180 v1. Loved that thing despite its flaws. It's gonna be hard to sleep knowing you get to build in it tomorrow.
> 
> Edit:
> Not quite up, but once it's processed, the link should remain the same:


I was off to bed by the time you posted, so I have just seen it today.

Great vid. Really comprehensive. Many thanks for that. It sounds like it is very quiet, with no annoying tone to it.

Yeah, the P180 is a love-hate relationship. I loved it once it was built, but working inside it is a pure nightmare IMHO: no cable management, no easy access to HDDs... Once built, though, it's a solid case, quiet enough and efficient in cooling. Still, I'm glad I have moved on from it. It was high time after a decade with it.

Cheers


----------



## rv8000

Does anyone have an EVGA card with the new ACX cooler yet? I still can't seem to find any decent reviews mentioning both noise levels and temps.


----------



## outofmyheadyo

So I haven't really kept up to date with the 1080s. Which would be the best AIB card to go for? Staying on air, looking for great overclocks and mainly a silent cooler. I was thinking about the EVGA FTW or Palit JetStream; not interested in Kingpins, Classifieds, or Lightnings, as I want to keep the price reasonable.


----------



## ChevChelios

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So I havent really kept up to date with 1080s wich would be the best AIB card to go for, staying on air lookin for great overclocks and mainly silent cooler, was thinking about evga ftw or palit jetstream


The MSI Gaming X 1080, but it's a hefty price.

Otherwise the G1 Gaming is fine too.

I haven't seen the FTW in reviews/performance yet.

I would not do Palit unless there was no other choice, but if you do go Palit/Gainward, then go for the GameRock Premium/Phoenix GS-GLH with the big double fans and not the JetStream, IMO.


----------



## SynchroSCP

Quote:


> Originally Posted by *rv8000*
> 
> Does anyone have an EVGA card with the new ACX cooler yet? I still can't seem to find any decent reviews mentioning both noise levels and temps.


Yes, the ACX version does a good job... a nice balance of noise and temps. At stock it is noticeably quieter than my Titan X was, and it maintained temps in the 70s even in extended gaming sessions. It only lasted a couple of days before I put it under water, though. I think JayzTwoCents has a review of that card on YT.

The full-cover EK block arrived yesterday, so I removed the Thermosphere and installed that. I can confirm that the EVGA backplate that came on my SC ACX version works very well with the EK block; no additional hardware is needed to install it. The M2.5x7 screws that come with the EK block are all that's needed, and there are 6 screw locations that allow mounting of the EVGA backplate when the EK full-cover block is installed (one on the top corner required a nut and washer that were also included with the EK block).

For me that's a big win, as I much prefer the EVGA backplate, which is vented and has a nice brushed finish; it also saves $35 or so.


----------



## outofmyheadyo

The JetStream is supposed to be the quietest of the AIB cards, and it does have two large fans.


----------



## sebna

I had the Strix on pre-order, but after its disappointing (for my needs) acoustic performance I switched to the MSI Gaming X, which is 4dB quieter than the Strix. I am not a fan of the MSI's looks, but it will have to do. The G1 is also very quiet, as is the Inno3D X3 (though in the Inno's case, cooling performance suffers for it).

Cheers


----------



## ChevChelios

The Gaming X and G1 seem to be two of the best choices right now.


----------



## danamaniac

What do you guys think of my SLI 1080s? I spent too much money, but I don't care when I'm gaming. Buying waterblocks next.


----------



## alucardis666

Wow, that looks like a beautiful and VERY clean system! Nice work! :-D


----------



## danamaniac

Grrrrr, wish I had enough money left for the EK waterblocks. I really need them for the cards, and then I can add the bottom radiator into the loop and it will be finished.


----------



## pez

Quote:


> Originally Posted by *sebna*
> 
> I was off to bed by the time you posted
> 
> 
> 
> 
> 
> 
> 
> So have just seen it today.
> 
> Great Vid. Really comprehensive. Many thanks for that. It sounds like it is very quite and not at all annoying tone to it.
> 
> Yeah P180 is love hate relation. I loved it while it was built but working with it inside is a pure nightmare IMHO, no cable management, no easy access to HDDs... but once built a solid case, quite enough and efficient in cooling. Still glad I have moved on from it. It was a high time after a decade with it.
> 
> Cheers


Indeed. Later revisions of the case fixed quite a bit, but oh well. I still miss that case from time to time. And very glad you got some use out of it. Going to try a few more games today to see what kind of numbers I'm seeing. Once I get a baseline, I'm going to turn the power limit to max and see what it does on its own with the Boost clocks.
Quote:


> Originally Posted by *danamaniac*
> 
> Grrrrr, wish I had enough money left for the EK waterblocks. I really need them for the cards, and then I can add the bottom radiator into the loop and it will be finished.


Nice! Is that carbon fibre look a part of the case, or is that a 3M Vinyl type of thing going on?


----------



## danamaniac

Quote:


> Originally Posted by *pez*
> 
> Indeed. Later revisions of the case fixed quite a bit, but oh well. I still miss that case from time to time. And very glad you got some use out of it. Going to try a few more games today to see what kind of numbers I'm seeing. Once I get a baseline, I'm going to turn the power limit to max and see what it does on its own with the Boost clocks.
> Nice! Is that carbon fibre look a part of the case, or is that a 3M vinyl type of thing going on?


It's just cheap vinyl from my local auto store; 3M costs a fortune here, over $100. Worked great.


----------



## poinguan

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> Originally Posted by *outofmyheadyo*
> 
> So I havent really kept up to date with 1080s wich would be the best AIB card to go for, staying on air lookin for great overclocks and mainly silent cooler, was thinking about evga ftw or palit jetstream
> 
> 
> 
> MSI Gaming X 1080, but its a hefty price
> 
> otherwise G1 Gaming is fine too
> 
> I havent seen FTW yet in reviews/peformance
> 
> would not do Palit unless there was no other choice
> 
> but if you do go Palit/Gainward then go for the GameRock Premium/Phoenix GS-GLH with the big double fans and not for the jetstream IMO
What's wrong with Palit? Why last choice? It seems Palit doesn't get any love. (Reminds me of ASRock: great quality at a lower price.)

Are you sure the JetStream has smaller fans? AFAIK, both the JetStream & GameRock have 100mm fans. Based on Palit's ad, the GameRock probably has a few more fins, but that's hard to verify until someone removes the cooler.


----------



## alucardis666

Ok...

I decided to say F-it. If anything, I'll dump the FE later and get an AIB card or a Ti when those hit...

You win this round, Nvidia...


----------



## pez

Quote:


> Originally Posted by *danamaniac*
> 
> Its just cheap vinyl from my local auto store, 3m costs a fortune here, over $100 here. Worked great.


Regardless, it looks nice. Once I saw it, I thought about it as a solution for modding without paint. I did the same with some interior bits of my car, with a color I wasn't so sure I'd like if I had gone with a more permanent solution.


----------



## outofmyheadyo

Going to grab an AIB card today or tomorrow, so sway me in the right direction, guys. Right now it's the MSI Gaming X, Gigabyte G1, Palit Super JetStream, or EVGA FTW. Anything else spectacular I missed? Not interested in hybrids; fan noise is fine for me.


----------



## xTesla1856

Could anyone link me to an EVGA ACX 3.0 cooler noise video? Thank you.


----------



## pez

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Going to grab an AIB today or tomorrow, sway me in the right direction guys, right now it`s MSI GAMING X, GIGABYTE G1, PALIT SUPER JETSTREAM, EVGA FTW anything else spectacular I missed ? Not interested in hybrids, since fan noise enough for me.


Just based on reviews I've seen, it doesn't look like you can go wrong with any of the offerings from MSI and Gigabyte. EVGA should be a no-brainer and be good, but stranger things have happened.

I don't personally care for the cooler design of the EVGA ACX 3.0 cards, but that's me. What about the Asus Strix?


----------



## sebna

What are your priorities?

TBH, the MSI Gaming X has it all.

The FTW probably will not be as quiet (historically they never were), but there are no reviews of it that I know of so far.

For me it was between the G1 and the MSI, but the MSI is marginally quieter, so problem solved. I was only choosing between reviewed cards, with priority on quiet and efficient operation.

Also, the MSI has one more big advantage for me. Ironically, it is its high price, because of which the queues for it are among the smallest, so the wait time is shorter.

I was 122nd in the queue for the Strix (going all the way to 700...) and I am 16th for the MSI! I ordered the MSI today and the Strix on the 27th of May...

Cheers


----------



## Hackslash

Quote:


> Originally Posted by *poinguan*
> 
> What's wrong with Palit? Why last choice? It seems Palit not getting any love. (Reminds me of ASRock, great quality at lower price).
> 
> Are you sure the Jetstream has smaller fans? AFAIK, both Jetstream & GameRock have 100mm fans. Based on Palit ad, the Gamerock probably has a little more fins, but that's hard to believe until one removes the cooler.


Got a Palit GameRock Premium here.

It's very nice: it's quiet, can easily hold 2GHz+ stable, and has no coil whine...

Just try it! I recommend it!


----------



## outofmyheadyo

It needs to be as quiet as possible. I can't really hear my CPU fan, so it would be lovely if the GPU were near silent as well. Of course, overclocking well is a bonus, and I'd like to spend as little as possible.


----------



## sebna

Quote:


> Originally Posted by *Hackslash*
> 
> got a palit gamerock premium here.
> 
> its very nice, its quiet can easily hold 2ghz+ stable and has no coil whine...
> 
> just try it!
> 
> i recommend it!


Any chance you could record its acoustic performance, in a similar fashion to the great vid Pez posted recently (yesterday evening)?

Thanks


----------



## pez

So, Fallout 4 runs like butter on the GTX 1080. I knew it would in the normal environment, but I was worried about somewhere like Diamond City. The lowest FPS I saw was 50, momentarily, as I finished loading into the city, and it never really got below 64 frames after that. I was seeing anywhere from 90-180 outside of the city. This is everything at full tilt (AA, godrays, bokeh, etc.). Very pleased. Results below are from about 30 or so minutes of gameplay.

Fallout 4:
68C max, core clock sitting at 1848MHz, and fan percentage looks to be... bugged. I'll have to restart Afterburner for the next test.


----------



## TK421

FO4 is borked in some places; don't use it as a benchmark, and don't expect frames to be consistent throughout your gameplay.


----------



## pez

Quote:


> Originally Posted by *TK421*
> 
> FO4 is borked in some places; don't use it as a benchmark, and don't expect frames to be consistent throughout your gameplay.


Haha, I never do with any Bethesda title. I just specifically remember it running pretty poorly in Diamond City before. Hell, it even brings the PS4 and Xbox One to their knees. Not that that's anything short of easy to do.


----------



## chronicfx

Quote:


> Originally Posted by *TK421*
> 
> using two sli bridge will supposedly have the same effect as a HB sli bridge


We will have to see benchmarks on that. They claim that the HB bridge runs at 650MHz and the older bridges at 450MHz; both use dual fingers. So we will see.
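As a rough back-of-the-envelope: if link bandwidth scales linearly with those quoted interface clocks (an assumption, since Nvidia hasn't published exact throughput figures), the HB bridge would carry about 44% more data than a legacy bridge.

```python
# Quoted SLI interface clocks in MHz, from the discussion above.
legacy_mhz = 450  # standard/LED bridge
hb_mhz = 650      # high-bandwidth (HB) bridge

# Assuming the same dual-finger link width, bandwidth scales with clock.
ratio = hb_mhz / legacy_mhz
print(f"HB bridge carries roughly {ratio:.2f}x the data of a legacy bridge")
```

Whether that headroom actually shows up in FPS is exactly what the benchmarks will have to settle.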


----------



## TK421

Quote:


> Originally Posted by *chronicfx*
> 
> We will have to see benchmarks on that. They claim that the HD bridge runs 650MHZ and the older bridges 450MHZ, both use dual fingers. So we will see.


The difference will probably be more apparent in DOOM (2016); Nightmare shadows currently have problems with SLI because there's so much bandwidth to transfer.

Not sure how it will work with AMD's XDMA CF, though.


----------



## rv8000

Finally managed to pick up a card off Newegg; gosh, I've never checked out an item so fast. Had everything logged in, F5 F5 F5 F5 F5, click click, boom boom. Definitely lost sleep over this.


----------



## nabarun

Max temp of my GTX 1080 with custom watercooling after a few hours of stressing the card: around 37°C with a room temp of 21°C. I think it's time to overclock and reach that 2GHz+ clock.


----------



## pez

My benchmarks are getting silly at this point. I've got Crysis 3, but I haven't played a minute of it and don't want to sit through any cutscenes just yet.

So... I present to you Borderlands: The Pre-Sequel:
64C max temp, 1848MHz GPU clock, 40% fan speed. Turned everything up and even put PhysX on Ultra in an attempt to get the card to work up as much of a sweat as possible. Without AA, this could potentially be an easy 4K title for the card.


----------



## TK421

30C ambient and 52C load is respectable for a lowly AIO cooler, no?


----------



## bfedorov11

I told myself I was going to keep it simple this build... stay on air... and then I just ordered an EK block. Thoughts on whether a single 240mm XT45 or 240 GTS can cool a 1080 and a 6700K with AP15 fans?


----------



## TK421

Quote:


> Originally Posted by *bfedorov11*
> 
> I told myself I was going to keep it simple this build.. stay on air..... and I just ordered an EK block. Thoughts on if a single 240mm xt45 or 240gts can cool a 1080 and 6700k with ap15 fans?


I recommend one 120mm x 40mm radiator section for every component, plus one extra 120mm x 40mm section for leftover heat.
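That rule of thumb (one 120mm radiator section per cooled component, plus one spare) can be written as a trivial sketch — the function name and example loadout here are just illustrative, not an established sizing formula:

```python
def radiator_sections_needed(components: int) -> int:
    """Rule of thumb: one 120mm (40mm-thick) section per cooled
    component, plus one spare section for leftover heat."""
    return components + 1

# Example: a CPU + single-GPU loop, like the 1080 + 6700K asked about.
sections = radiator_sections_needed(2)
print(f"{sections} x 120mm of radiator suggested (e.g. one 360mm rad)")
```

By that yardstick, a single 240mm rad for a 1080 and a 6700K is one section short of the recommendation, though it may still be workable with good fans and modest clocks.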


----------



## Spiriva

Quote:


> Originally Posted by *nabarun*


I love the color on your liquid, kinda got that beer look to it!


----------



## nabarun

Quote:


> Originally Posted by *Spiriva*
> 
> I love the color on your liquid, kinda got that beer look to it!


Hehe, thanks. I used to have black & red, but everyone has that these days and I was sick of it. Orange is underrated IMO.


----------



## Nizzen

Got an MSI 1080 Gaming X here in Norway.

~2140MHz max stable in games before the artifacts start dancing.


----------



## TK421

Any of you guys play at 1440p? Any noticeable performance drop coming from 1080p?


----------



## papashimbers

Got my 1080 delivered today. I was planning to try it out, but I happened to be stuck in bed sick; the most I could muster was opening it briefly to take a couple of pics and then putting it back in the box to sit next to me on the floor...


----------



## Bogga

Got these the other day... got some stuttering issues in all games tested despite DDU, reinstalling drivers, lowering settings, and so on. Bought W10 and got it installed... so far I haven't experienced any stuttering.

Gotta try some more though...


----------



## pez

Very nice cards fellas!

Quote:


> Originally Posted by *TK421*
> 
> Any of you guys play on 1440p? Any noticeable performance drop from 1080?


I'm at 1440p, but I mean, jumping from 1080p to 1440p is a pretty decent leap. Nearly any game with a proper engine becomes more GPU-dependent at that res and higher. Of all the games that I have tested, none have dipped below 60 with the exception of Fallout 4 (momentarily) and Crysis (1). Also, given the way the frames dropped in Crysis, I'm pretty sure something was wrong in general, as it acted as if it was running out of VRAM when it was only trying to use like 2GB at the time.


----------



## drop24

Those Strix are sexy. Has anyone used the on-board fan headers yet to control some PWM case fans? I'm curious to hear how well that works.


----------



## pez

I remember thinking that was definitely an awesome feature for those who like having extra fans on the side panel of their case.


----------



## Bogga

Nope, haven't tried that out yet...

Won't be doing that until I go dual loop at the end of the year.

Or perhaps I'll attach the case fans to it and try it out...


----------



## JonnyBigBoss

I received my GTX 1080 last week. I absolutely love it so far.

I have one issue, though. For some reason, whenever my card hits 55C the fan begins spinning up quickly, jumping from 55% to 80% (2100 to 2900 RPM) instantly before temps go back down.

My fan curve is currently set to a near 1:1 ratio. I really have no idea what could be causing this. Any ideas?
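For what it's worth, that kind of oscillation is typical of a steep curve with no hysteresis: the fan ramps at the threshold, the extra airflow drops the temperature, the controller ramps back down, and the cycle repeats. A minimal sketch of the difference a dead band makes — a hypothetical controller for illustration, not Nvidia's actual fan logic:

```python
def fan_speed_no_hysteresis(temp_c):
    """Near 1:1 curve: fan % tracks temperature directly (clamped 30-100)."""
    return min(100.0, max(30.0, float(temp_c)))

def fan_speed_with_hysteresis(temp_c, current_pct, band=3.0):
    """Only move the fan when the target leaves a small dead band,
    which stops rapid ramp-up/ramp-down around a single threshold."""
    target = fan_speed_no_hysteresis(temp_c)
    return current_pct if abs(target - current_pct) <= band else target

# A temperature wobbling around 55C makes the naive curve bounce...
speeds = [fan_speed_no_hysteresis(t) for t in (54, 56, 54, 56)]

# ...while the hysteresis version holds steady.
pct = 55.0
held = []
for t in (54, 56, 54, 56):
    pct = fan_speed_with_hysteresis(t, pct)
    held.append(pct)

print(speeds)  # naive: bounces with every temperature wobble
print(held)    # dead band: holds one speed
```

That said, the reported 55%-to-80% jump looks more like a driver fan-management bug than curve tuning, which fits the DDU/clean-driver advice in the replies.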


----------



## TK421

Quote:


> Originally Posted by *pez*
> 
> Very nice cards fellas!
> I'm at 1440p, but I mean, jumping from 1080p to 1440p is pretty decent leap. Nearly any game that has a proper engine becomes more GPU dependent at that res and higher. Of all the games that I have tested, none have dipped below 60 with the exception of Fallout 4 momentarily and Crysis (1). Also, the way the frames dropped in Crysis, I'm pretty sure something was wrong in general as it acted as if it was running out of VRAM when it was only trying to user like 2GB at the time.


Ah, OK, so most modern titles won't have difficulty running at 1440p with the 1080, right?

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I received my GTX 1080 last week. I absolutely love it so far.
> 
> I have one issue, though. For some reason whenever my card hits 55 degrees C the fan begins spinning quickly, jumping from 55% to 80% (2100 to 2900 RPM) instantly before temps go back down.
> 
> 
> 
> My fan curve is currently set to a near 1:1 ratio. I really have no idea what could be causing this. Any ideas?


Clean install newest driver with DDU, seems to be a fan management issue on the old (first) driver.


----------



## pez

Quote:


> Originally Posted by *Bogga*
> 
> Nope, haven't tried that out yet...
> 
> Wont be doing that until I go dual loop in the end of the year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or perhaps attach the case fans to it and try it out...


Yeah, even in open air we could get some useful info. It'd be interesting to see how well it scales with the fans or to see if there's an option to preemptively curve the two fans to ramp up before GPU fans kick in.
Quote:


> Originally Posted by *JonnyBigBoss*
> 
> I received my GTX 1080 last week. I absolutely love it so far.
> 
> I have one issue, though. For some reason whenever my card hits 55C the fan begins spinning quickly, jumping from 55% to 80% (2100 to 2900 RPM) instantly before temps go back down.
> 
> My fan curve is currently set to a near 1:1 ratio. I really have no idea what could be causing this. Any ideas?


I believe this was addressed in the newest driver; it was reported by quite a few people with FEs. As mentioned by TK421 below, I'd recommend removing the current driver via DDU and then installing the latest drivers.
Quote:


> Originally Posted by *TK421*
> 
> Ah ok, so most modern title won't have a difficulty running on 1440p with the 1080 right?
> Clean install newest driver with DDU, seems to be a fan management issue on the old (first) driver.


I'd like to say yes, but what all do you consider modern? I haven't had the chance to test out Crysis 3 or something like Metro LL Redux, but I remember those two specifically hitting below the 60 FPS margin and being slightly above/under that margin. I'll have to take a look again.


----------



## spyui

Has anyone tried overvolting an AIB GTX 1080 under water? What is your max OC?


----------



## MCFC

Thinking of grabbing an Inno3D 1080 card because it's relatively cheap here compared to some of these other brands.
Any experiences with this brand or card?
The turbo boost is 1898 MHz, which sounds good, right?


----------



## pez

Quote:


> Originally Posted by *MCFC*
> 
> Thinking of grabbing an Inno3D 1080 card because it's relatively cheap here compared to some of these other brands.
> Any experiences with this brand or card?
> The turboboost is 1898 MHz which sounds good, right?


From what I've read, it's not as popular stateside since it isn't sold as openly as other brands, so there may be few of us here with personal experience of it.

If that's the card's factory OC/boost, that should be pretty awesome. At this point, finding out temps and noise levels is your best bet, and may shed some light on how the GPU will boost. My card maintains its 1835 MHz clock or higher in 'gaming' mode, so I have no doubt you will see those numbers. If it stays cool and you can provide the necessary airflow, I don't see why it won't boost even higher. The best I've seen it boost without any tweaks or manual OC is 1945 MHz.


----------



## ChevChelios

anyone here with the *Zotac 1080 AMP* (regular AMP, not Extreme) ? Impressions ?

https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-edition


----------



## JaBR23KiX

Hi there,

So glad to join this club with my brand new ROG STRIX-GTX1080-O8G-GAMING.

Not so great for further OC, but 2136 MHz stable in every benchmark I tried.

Pre-ordered 31.5.2016, waited 14 days, and finally got it yesterday.




Cheers and enjoy this great gpu:thumb:


----------



## JonnyBigBoss

Quote:


> Originally Posted by *TK421*
> 
> Clean install newest driver with DDU, seems to be a fan management issue on the old (first) driver.


Roger that. I just did a driver wipe and will try out the new install. Thanks!


----------



## dpoverlord

Finally!

Did step up with my EVGA Classified 980ti and got myself the 1080 founders edition.

Life cycle: 690-->Titan x3 --> Titan X x3 --> 980ti x1 ---> 1080 founders

Thoughts on things I should do with this to maximize my system?


----------



## Radox-0

Quote:


> Originally Posted by *dpoverlord*
> 
> Finally!
> 
> Thoughts on things I should do with this to maximize my system
> ?


To maximise the value, I recommend you use it as a GPU rather than an expensive coffee / tea coaster.









On a serious note, just set up a custom fan profile to eliminate throttling with the FE card. It runs reasonably quiet for a reference cooler, so it can be ramped up to reasonable levels without becoming annoying.


----------



## BigBeard86

what are people getting on their watercooled oc?

I'm getting 2088 MHz on the FE 1080 with stock volts and the reference blower, and I have two AIOs and G10s sitting around from my sold 290 Crossfire setup; wondering if I should watercool this.


----------



## Benjiw

Quote:


> Originally Posted by *dpoverlord*
> 
> Finally!
> 
> Did step up with my EVGA Classified 980ti and got myself the 1080 founders edition.
> 
> Life cycle: 690-->Titan x3 --> Titan X x3 --> 980ti x1 ---> 1080 founders
> 
> Thoughts on things I should do with this to maximize my system
> 
> ?


Putting it in the system first should help a lot.


----------



## Naked Snake

I've cancelled my FTW preorder because I'm so damn tired of waiting. I will join the club soon


----------



## Spiriva

Quote:


> Originally Posted by *BigBeard86*
> 
> what are people getting on their watercooled oc?
> 
> I am getting 2,088mhz on the FE 1080, stock volts and reference blower, and have two aio and g10s sitting around from my sold 290 crossfire setup; wondering if I should water cool this.


2204mhz stable, EK waterblock.


----------



## Cial00

Quote:


> Originally Posted by *Spiriva*
> 
> 2204mhz stable, EK waterblock.


Which model?


----------



## Benjiw

Quote:


> Originally Posted by *Spiriva*
> 
> 2204mhz stable, EK waterblock.


Quote:


> Originally Posted by *Cial00*
> 
> Which model?


Yes, please provide data so we can buy ourselves (or our other halves) good overclockers and not end up with volt-limited Strix cards.


----------



## datwitch

Playing around with the voltage/frequency curve:

Gigabyte 1080 FE + EK Block/Backplate, CPU+GPU loop with 1x PE240 and 1x PE120







http://www.3dmark.com/fs/8817315


----------



## bonami2

Hi all









If I order GTX 1070s... do SLI bridges come with them?

Not sure if the EVGA Pro SLI Bridge V2 on newegg.ca is the best one.

I'm gonna run 4K and 5760x1080, so I want the most bandwidth I can get.


----------



## uggy

It's 2070 MHz here.

MSI Founders with EK waterblocks, acetal + nickel.

Power limit 120.
Core voltage default; not possible to change at the moment(?)


----------



## Asus11

Quote:


> Originally Posted by *Naked Snake*
> 
> I've cancelled my FTW preorder because I'm so damn tired of waiting, I will join the club soon


how are you paying 515?


----------



## dpoverlord

Quote:


> Originally Posted by *Benjiw*
> 
> Putting it in the system first should help a lot.


Quote:


> Originally Posted by *Radox-0*
> 
> I recommend to maximise the value you use it as a gpu rather then a expensive coffee / tea coster
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a serious note just set up a custom fan profile to eliminate throttling with the fe card. It runs reasonably quiet for a refrence cooler so can be ramped up to reasonable levels without becoming annoying.


Wait.... I cannot use the card in the box? I thought it just "magically works".









I'll make a fan profile since I hate throttling. The key thing I need to decide is whether I should keep the 5930K / Rampage V Extreme, or sell and upgrade to a newer model and then OC.

Super pumped.


----------



## Spiriva

Quote:


> Originally Posted by *Cial00*
> 
> Which model?


Evga FE
Quote:


> Originally Posted by *bonami2*
> 
> Hi all
> 
> 
> 
> 
> 
> 
> 
> 
> If i order gtx 1070... Do the sli bridge come with them?
> Not sure if the evga pro sli bridge v2 on newegg.ca is the best one..
> Im gonna run 4k and 5760x1080 so i want the most bandwith i can..


No, you won't get an SLI bridge with either the 1070 or the 1080.

The EVGA SLI bridge v2 works fine.


----------



## Benjiw

Quote:


> Originally Posted by *dpoverlord*
> 
> Wait.... I cannot use the card in the box? I thought it just " magically works".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll make a fan profile since I hate throttling. The key thing I need to decide is if I should keep the 5930k / rampage V extreme or sell and upgrade it to a newer model and then OC.
> 
> Super pumped.


Hahaha! You could use a PCIe extender, to be fair!








Quote:


> Originally Posted by *Spiriva*
> 
> Evga FE


If more people have similar results I'll stick to EVGA.


----------



## Spiriva

Quote:


> Originally Posted by *Benjiw*
> 
> If more people have similar results I'll stick to EVGA.


All about luck I guess; my card does 2204 MHz under water, my friend's card does 2034 MHz, also under water, both EVGA FE.
Then again, the difference in FPS between 2204 and 2034 in games is pretty much non-existent.


----------



## Radox-0

Quote:


> Originally Posted by *Benjiw*
> 
> Hahaha! you could use a PCI-e Extender to be fair!
> 
> 
> 
> 
> 
> 
> 
> 
> If more people have similar results I'll stick to EVGA.


It's not really the fact it's EVGA, rather the silicon lottery, as the Founders cards are all broadly identical. The only real difference is the box it ships in and the company that manages the warranty, and for that reason I would still go EVGA.

Spiriva sounds like he hit the jackpot twice. Of the 3 EVGA FE cards I have, only one under water does 2200 MHz, one falls over at about 2100, and the third (which has now been sent back) was managing 2050 stable. Not a bad thing per se, as 2050 seems reasonable for most.


----------



## Bogga

2012/2038 and 10600


----------



## Benjiw

Quote:


> Originally Posted by *Radox-0*
> 
> Its not really the fact its EVGA, rather Silicon lottery as the founder cards are all identical broadly speaking. Only difference is the box its shipped in will be the company that manages the warrenty and for that reason I would still go EVGA.
> 
> Spiriva sounds like he hit the jackpot twice. Of the 3 EVGA FE cards I have, only one under water does 2200 mhz, one falls over at about 2100 and the third (which has now been sent back) was managing 2050 stable. Not a bad thing per say as 2050 seems reasonable for most.


OOOOO Nice! I need me a 2200 MHz card! That said, I haven't the money right now, so I'll have to put up with my 1660 MHz 970.


----------



## Naked Snake

Quote:


> Originally Posted by *Asus11*
> 
> how are you paying 515?


No VAT I guess because I'm from Argentina.


----------



## JaBR23KiX

Tried some more overclocking and it looks like 2088 MHz stable. Nothing extreme


----------



## bonami2

Quote:


> Originally Posted by *Spiriva*
> 
> Evga FE
> No you wount get a sli bridge with either 1070 or 1080.
> 
> Evga sli bridge v2 works fine.


Thank you









I've seen NVIDIA short, medium, and long bridges, but EVGA only has long or short.

I have a Z97 Gaming 7 and used the long Crossfire bridge that came with my 7950, so I imagine the long EVGA one is what I need.


----------



## outofmyheadyo

So what the hell is going on with NVIDIA and the availability of these cards? What is this, the Great Depression again? People are waving bundles of cash shouting "shut up and take my money" and vendors are like "we don't have anything to sell". This is ridiculous.


----------



## rv8000

Bad yields. Restricting the majority of chips to FE to make extra $$$. Low availability of GDDR5X. Released earlier than originally planned. Your guess is probably as good as any out there, and we will probably never know.


----------



## sherlock

Wonder if anyone else is running into this issue:

When I give my FE +500 on the memory (AB 4.3 beta 4), the clock shown is 5508 (1377x4) instead of the 5500 it should be. Wondering if this is some sort of glitch and whether I should back down to +492.
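If the hardware only supports discrete memory-clock steps, the driver would snap a requested offset to the nearest step, which would explain seeing 5508 instead of 5500. A sketch of that idea; the 54 MHz step below is only a guess that happens to fit the 5500 -> 5508 reading (27 MHz would fit too), not a documented value:

```python
def snap_to_step(requested_mhz, step_mhz=54):
    """Round a requested clock to the nearest multiple of the
    hardware step size (the default step is an assumption)."""
    return round(requested_mhz / step_mhz) * step_mhz

shown = snap_to_step(5500)  # 5508: the nearest step to the request
```

On that reading, +492 wouldn't change anything; the card would land on the same step either way.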


----------



## emett

Ok, canceled my FTW from Amazon as they didn't even have an ETA. Managed to snap up a Gigabyte G1 a state away. Should be here Monday. W00t!


----------



## saeedkunna

EVGA FE on air @ 2050


----------



## rv8000

Gahhh, gonna be like a kid on Christmas morning when I get back from work tomorrow. So excited for some new tech to play with!


----------



## Setzer

So I jumped the gun and bought an EVGA GTX 1080 FTW. It will be in stock in a week, and then it'll be shipped.

But I already have my doubts about whether it's the right card








What do you guys think is the best card in terms of noise/temperature and overclocking?
The prices are weird here in Denmark, so ALL of the 1080 cards practically cost the same.
Thoughts?


----------



## MrDerrikk

Quote:


> Originally Posted by *Setzer*
> 
> So I jumped the gun and bought an EVGA GTX 1080 FTW. It will be in stock in a week, and then it'll be shipped.
> 
> But I already have my doubts if it's the right card
> 
> 
> 
> 
> 
> 
> 
> 
> What do you guys think is the best card in terms of noise/temperature and overclocking?
> The prices are weird here in Denmark, so ALL of the 1080 cards practically cost the same.
> Thoughts?


In Australia they're all at the 1.2k mark, so I preordered the FTW from B&H as I like its looks the best. EVGA is normally a safe bet for quality, and with the hard wall on overclocking, all cards seem to end up in the same performance area anyway.


----------



## BrightCandle

I have no idea when my EVGA FTW is coming. Up until the end of today the delivery was set for 16/06 (today), but now I have no ETA at all and it's just "ordered on request". Seeing as I preordered at the beginning of the month for an original delivery estimate of 10/06, it's all looking like a royal mess. All of this screams yield issues: these are not cheap graphics cards, and the volumes for sale are pretty low.


----------



## MrDerrikk

Quote:


> Originally Posted by *BrightCandle*
> 
> I have no idea when my EVGA FTW is coming. Up until the end of today the delivery was set for 16/06 (today) but now I have no ETA at all and its just "ordered on request". Seeing as how I preordered at the beginning of the month for an original delivery estimate of 10/06 its all looking like a royal mess. All of this screams yield issues as these are not cheap graphics cards the volumes for sale are pretty low.


B&H originally said the ETA for stock to arrive was today too so I'm really hoping I get an email saying it's being sent off tonight. I don't know if I was put in the first wave of stock though so I'm doubting it'll happen.

Overall I'm just hoping it shows up before I move to the city in two weeks


----------



## bfedorov11

Quote:


> Originally Posted by *Benjiw*
> 
> Hahaha! you could use a PCI-e Extender to be fair!
> 
> 
> 
> 
> 
> 
> 
> 
> If more people have similar results I'll stick to EVGA.


If I were to roll the dice right now, I would go with an EVGA FE too. I saw my card hit 216x a few times benching, stock heatsink with Kryonaut; 3DMark read 2152. http://www.3dmark.com/fs/8807208 EK block coming in a day or two.


----------



## crazysoccerman

Are there any videos or tutorials about how to set up simultaneous multi-projection?

All a Google search yields are articles from NVIDIA's announcement a month ago.

I'm referring specifically to SMP's ability to make surround displays not appear distorted.


----------



## bfedorov11

Quote:


> Originally Posted by *crazysoccerman*
> 
> Are there any videos or tutorials about how to set up simultaneous multi-projection?
> 
> All a google search yields is articles dated from nvidia's announcement a month ago.
> 
> I'm referring specifically to SMP's ability to make surround displays to not appear distorted.


I'm pretty sure others have said it's a game-dependent feature which has to be implemented by the developer.


----------



## chronicfx

Quote:


> Originally Posted by *Nizzen*
> 
> Got a MSI 1080 Gaming X here in Norway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~2140mhz max stable in games before artifacts dancing


Can you show me your score in the "free version"? I don't have Extreme, but I do have twin 980 Tis. Just wondering what an overclock like that brings.









Quote:


> Originally Posted by *Bogga*
> 
> 
> 
> 
> 
> Got these the other day... got som stuttering issues in all games tested despite DDU, reinstalling drivers, lowering settings and so on. Bought W10 and got it installed... so far I haven't experienced any stuttering
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gotta try some more though...


Two syllables.... Geeeee-Sync


----------



## crazysoccerman

Quote:


> Originally Posted by *bfedorov11*
> 
> I'm pretty sure others have said it is a feature that is game dependent which has to be implemented by the developer.


I'm curious whether you need the angle of your surround monitors to match the software settings. I would assume you do. Apparently LinusTechTips has a video coming out next week about it (it's already on Vessel, which I don't have a subscription to).


----------



## Agoniizing

Why does my core clock throttle when my temps and power limit are good? I tried manually OCing, and my core clock starts high, then slowly throttles down.


----------



## skline00

Just joined the GTX 1080 club this afternoon. Received a Zotac GTX 1080 FE and spent the evening adding A EK GTX 1080 Acrylic/copper waterblock to it.

I'll be playing with it in the next few days.


----------



## AllGamer

So the MSI finally arrived and is installed.

However, I can't get my Surround setup to run at 120 Hz; it only goes to 60 Hz. Not sure what is going on.

I've got 3x 3D monitors (ASUS VS247H-P).

Never had this problem with previous generations of GTX cards, but this GTX 1080 is not detecting them as 120 Hz...

Using the latest drivers, 368.39.


----------



## trippinonprozac

What is the best way to play with voltage on these? Should I be letting PX do its own testing, or creating my own voltage curve?

I have an EVGA FE with an EK block. The card maxes out at 29C, so I'm keen to see if there is any headroom with some voltage tweaks.


----------



## bfedorov11

Quote:


> Originally Posted by *trippinonprozac*
> 
> What is the best way to play with voltage on these? Should I be letting PX do its own testing or creating my own voltage curve?
> 
> I have an EVGA FE with an EK block. Card maxs out at 29c so Im keen to see if there is any headroom with some voltage tweaks.


Use afterburner beta and press ctrl f.

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/760_40#post_25243723


----------



## stanielz

2x watercooled 1080 SLI here OC'd to 2125 mhz

Fire Strike Extreme Score:

http://www.3dmark.com/fs/8818307

The cards don't seem to want to downclock at idle with SLI enabled; toggling SLI fixes this. Anyone else seeing this?


----------



## pez

Quote:


> Originally Posted by *Agoniizing*
> 
> Why does my core clock throttle when my temps and power limit are good? I tried manually OC'ing, and my core clock will start high then it will slowly throttle down.


What temps are you seeing when it starts throttling? What is your temp target set to? Power limit?


----------



## ChevChelios

Quote:


> Originally Posted by *Agoniizing*
> 
> Why does my core clock throttle when my temps and power limit are good? I tried manually OC'ing, and my core clock will start high then it will slowly throttle down.


um, what 1080 model ? what fan setting ? what temps ? what OC settings ?


----------



## Hilpi234

The new Firestrike stress test is quite nice for testing stability.

The card will throttle or add more voltage every ±10°C; you cannot prevent that.
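The ±10°C behaviour described above matches how people describe GPU Boost 3.0: the boost clock sheds roughly one ~13 MHz bin as each temperature breakpoint is crossed. A toy model of that; only the ~13 MHz bin size is widely reported, the 37°C start point and 10°C spacing here are illustrative guesses:

```python
BIN_MHZ = 13  # Pascal boost bins are roughly 13 MHz apart

def boost_clock(max_boost_mhz, temp_c, base_temp=37, step_c=10):
    """Toy GPU Boost 3.0 model: shed one bin per `step_c` degrees
    above `base_temp`. Breakpoints are illustrative, not NVIDIA's."""
    if temp_c <= base_temp:
        return max_boost_mhz
    bins_lost = (temp_c - base_temp + step_c - 1) // step_c  # ceiling division
    return max_boost_mhz - bins_lost * BIN_MHZ

cool = boost_clock(2100, 35)  # full boost when cold
warm = boost_clock(2100, 47)  # one bin down
```

This is why water-cooled cards hold their peak clock: they simply never cross the first few breakpoints.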


----------



## NBAasDOGG

OMG gentlemen, I found a way to bypass the power limit and voltage caps in MSI Afterburner. The PL and voltage sliders can be increased up to 150%. The program I'm using to bypass the slider limit is called ArtMoney.
I will post a tutorial very soon.

Meanwhile you can take a look at this: http://hwbot.org/news/9088_guide_how_to_bypass_geforce_gtx_680_lightning_afterburner_limitation_by_nicklas0912

Please start reading from: Bypassing the voltage limitation - the actual overclocking

And please use the standard MSI afterburner skin.

If you blow up your card, I AM NOT RESPONSIBLE!!!


----------



## ChevChelios

thx, but I am not putting that kind of hurt on a 700 EUR card









2050-2100+ OC is good enough for me


----------



## JaBR23KiX

Quote:


> Originally Posted by *Agoniizing*
> 
> Why does my core clock throttle when my temps and power limit are good? I tried manually OC'ing, and my core clock will start high then it will slowly throttle down.


Same problem here. Around a minute at 2106 MHz, and after that it goes down to 2088 MHz. Temps 53. Fan manually at 80 percent. Asus Strix Gaming OC.


----------



## ChevChelios

Quote:


> Around minute on 2106 mhz and after goes down to *2088*mhz.


oh, that's fine by all accounts from everything I've read... 18 MHz is a very small fluctuation.

2088 sustained (assuming it is sustained) is a very fine result.

I thought you were talking about losing 100 MHz or more despite the temps.


----------



## outofmyheadyo

Quote:


> Originally Posted by *Hilpi234*
> 
> The new FS stresstest is quite n1 for testing stability
> 
> 
> 
> The card will throttle or add more Voltage every +-10°C you cannot prevent that.


Strange, the Firestrike Extreme stress test gave my stock 980 Ti (stock clocks) 90%. What's up with that?


----------



## Hilpi234

Simply open Afterburner, save your OC in a profile, open Valley in a window and let it run...

Open the curve editor, and every 10 degrees reset your OC for a moment; you will see the voltage of the max boost rise higher and higher. Once there is no voltage left, it starts to throttle...

If you fix your voltage at a certain point, it will run as long as the core can stay stable at that point (depends on the workload); then your OC will drop or your driver resets.

That's why your stock Ti cannot keep its framerate stable; same issue with the 1080 without a shunt mod and proper cooling.


----------



## livejamie

Is the EVGA FTW the best version of this at the moment? If I'm looking to upgrade should I wait? Or is now the time to pull the trigger?

Thanks!


----------



## Xeq54

I have shunt modded my MSI 1080 Gaming X and it still drops clocks under load, even though the reported power usage under load is now 70%. It runs at 2114 for about a minute and then drops to 2104 and 2088 consecutively, as it did before the shunt mod. Temps are around 65C with the fan at 100%.

It seems that something else is causing the downclocking on Pascal, because on my GTX 970 I did the same shunt mod and it ran at 1600 MHz constantly with no change in clock speed.

We will need a BIOS editor, but given the changes in Pascal, I do think it will take some time until a convenient BIOS editor such as MBE is available.
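For anyone wondering why the card reads ~70% power after the mod: paralleling a resistor across the current-sense shunt lowers the resistance the controller measures across, so it under-reads current (and therefore power) by a fixed ratio. The arithmetic, with example resistor values (the 5 mOhm shunt and 10 mOhm add-on are hypothetical, not the Gaming X's actual parts):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power_fraction(shunt_mohm, added_mohm):
    """Fraction of true power the controller sees after soldering
    `added_mohm` in parallel with the stock shunt."""
    return parallel(shunt_mohm, added_mohm) / shunt_mohm

frac = reported_power_fraction(5.0, 10.0)  # ~0.67 of true power reported
```

So a 70% reading after the mod means the card is actually pulling well past its old power target, which supports the idea that something other than the power limit is still capping the clocks.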


----------



## Shiftstealth

Anyone notice that the 1080 Seahawk has an ALUMINUM radiator?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127942

Lawlerskates. That's bad.


----------



## Hilpi234

Try this with Afterburner; you can simply use an offset overclock, because the core takes what it needs...


----------



## Frutek

Quote:


> Originally Posted by *Xeq54*
> 
> I have shunt modded my MSI 1080 Gaming X and it still drops in load, even though the reported power usage in load is 70% now. It runs at 2114 for about a minute and then drops to 2104 and 2088 consecutively as it did before the shunt mod. Temps are around 65C with fan at 100%
> 
> It seems that there is something else causing the downclocking on pascal. Because on my GTX 970, I did the same shunt mod and it was running at 1600mhz constantly with no change in clockspeed.
> 
> We will need a bios editor, but given the changes in Pascal, I do think that it will take some time until a convenient bios editor such as MBE is available.


Check out the new per-voltage overclocking in MSI Afterburner. Press Ctrl+F, then lock the clock at the voltage you are using by pressing L on the selected dot. It shouldn't drop more than one step (13 MHz).
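To verify whether a locked curve point actually holds under load, `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader -l 1` logs the core clock once a second; a tiny parser for that output (the sample log lines below are illustrative, not from a real run):

```python
def parse_sample(line):
    """Parse one CSV line such as '2114 MHz, 65' into (clock_mhz, temp_c)."""
    clock_s, temp_s = line.split(",")
    return int(clock_s.strip().split()[0]), int(temp_s.strip())

def max_drop(samples):
    """Largest clock drop seen across a log of samples, in MHz."""
    clocks = [parse_sample(s)[0] for s in samples]
    return max(clocks) - min(clocks)

log = ["2114 MHz, 58", "2114 MHz, 62", "2101 MHz, 65", "2088 MHz, 66"]
drop = max_drop(log)  # 26 MHz, i.e. two 13 MHz bins
```

If the drop stays within one 13 MHz bin, the lock is doing its job.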


----------



## Benjiw

Quote:


> Originally Posted by *Shiftstealth*
> 
> Anyone notice that the 1080 seahawk has an ALUMINUM radiator?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127942&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-Veeralava%20LLC-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6202798&SID=
> 
> Lawlerskates. Thats bad.


It's an AIO, so it will have anti-corrosive additive in the fluid, or the block has some kind of magic done to it.


----------



## Hilpi234

below 25° [email protected]
 above 25° [email protected]
 above 30° [email protected]
 above 40° [email protected]

If it spikes over a certain point for even 1 second, your clock drops, or it adds more voltage until the temperature falls back to its lower breakpoint.

If you see those stripes in GPU-Z, it is throttling or about to. They do not appear if you use an offset overclock, because the GPU has complete control over its voltage.

It seems to be some kind of stability mode...

... because the average frames are much lower, 64 vs 61.

Also, the card gets hotter. This could be the little extra voltage in offset mode, but there is still the 3-frame difference in consecutive runs.


----------



## wsarahan

Guys, how are you?

Bought 2 of these:

The cards should arrive around June 20-26.










Is this a good card? With SLI, is overclocking necessary? If yes, how much do you think these cards can handle? Can you post an example tweak to start with? My goal is 2000; if I get 2000 I'm happy and will stop.

Another question: will I lose anything by not using the HB SLI bridges? I can't find one anywhere yet; today I use this bridge:










Thanks


----------



## ChevChelios

Besides the Palit GameRock Premium and Gainward Phoenix GS/GLH, I also saw an EVGA ACX at local shops, but I held off on all those and ordered a G1 and a Zotac AMP. I'll take whichever gets here first and cancel the other one.

There is also the Gaming X, but MSI can go to hell with those prices.


----------



## sherlock

Quote:


> Originally Posted by *Shiftstealth*
> 
> Anyone notice that the 1080 seahawk has an ALUMINUM radiator?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127942&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-Veeralava%20LLC-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6202798&SID=
> 
> Lawlerskates. Thats bad.


Aluminium radiators have never stopped AIO GPUs from keeping a 50C core temp, so that's not really going to be relevant to the buyer.


----------



## pez

Quote:


> Originally Posted by *wsarahan*
> 
> Guys how are you?
> 
> Bought 2 of this:
> 
> The cards will arrive for me something about June 20/26
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is this a good card? With the sli Overclock is necessary? If yes how much do you thing this cards can handle? Can you post an tweak example to start? My goal is 2000 if i get 2000 i`m happy and stop
> 
> Another question, will i loose something not using the HB SLI Bridges? I can`t find anywhere yet, today i use this bridge:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


It was mentioned by someone that the EVGA v2 bridges meet the standards of the HB bridge, but I'm not sure that has been confirmed yet. I know cards like the Gigabyte Xtreme Gaming bundle include a compatible bridge, but between the EVGA v2 SLI bridge, the Gigabyte Xtreme Gaming bridge, and NVIDIA's own bridges, that's all I know of.


----------



## Radox-0

The bridges should not be an issue as such. You do get a warning in the NVIDIA Control Panel if you plug in a bridge that does not support sufficient resolution. For 3440x1440 @ 100 Hz, for example, the EVGA V2 bridge or the NVIDIA 3-way bridge shows no message. Plugging in a flexible bridge packaged with my motherboard did bring up the error message, though when I tried it, I didn't notice any difference in games.

Separately, the NVIDIA graph suggests that for most current configs the LED bridges such as the EVGA V2 are fine.


----------



## BrainSplatter

Does anyone know whether the HB bridge recommendations also apply to DSR resolutions? My guess is yes, but maybe they only relate to the monitor's native resolution?


----------



## Jpmboy

No Pascal BIOS tweaker yet...?


----------



## Spiriva

Quote:


> Originally Posted by *wsarahan*
> 
> Another question, will i loose something not using the HB SLI Bridges? I can`t find anywhere yet, today i use this bridge:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks




I got the EVGA v2 bridge, and as you can see, the SLI menu does not complain about a too-slow SLI bridge, as it does if you use one of those old ugly-looking SLI bridges that ship with SLI-capable motherboards.


----------



## wsarahan

Quote:


> Originally Posted by *Spiriva*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> Another question, will i loose something not using the HB SLI Bridges? I can`t find anywhere yet, today i use this bridge:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks
> 
> 
> 
> 
> 
> I got the evga v2 bridge and as you can see the sli menu does not complain about a to slow sli bridge, as it does if you use one of them old ugly looking sli bridges that ships with motherboards that support sli.

Thanks

And about the OC with SLI, how much do you think I can get on core and memory? I want 2000 on the core and a bit more on memory; can you guys send me a template to start with for MSI Afterburner 4.3 beta?

Thanks

Sent from my iPhone using Tapatalk


----------



## wsarahan

And are these ACX 3.0 SC cards any good? Thanks

Sent from my iPhone using Tapatalk


----------



## S4ch4Z

You should be able to run past 2000 MHz stable with +100 or less on the core on the Superclocked version.
+125 is what it takes for my FE to hold 2050 MHz stable with the fan ramped up to 100%.
+400 on the memory is also a good baseline any 1080 should be able to achieve.

Don't forget to max out the power limit to 120%, as it helps a lot with core clock stability!


----------



## wsarahan

Quote:


> Originally Posted by *S4ch4Z*
> 
> You should be able to run past 2000 MHz stable with +100 or less on the core on the superclocked version.
> +125 is what it takes for my FE to hold 2050 MHz stable with the fan ramped up to a 100%.
> +400 on the memory is also a good baseline any 1080 should be able to achieve.
> 
> Don't bother maxing out the power limit to 120% as it helps a lot with core clock stability!


And the voltage % slider that was added in the new Afterburner? What value should I use?

Another question, guys, and this one is important:

I have an OC'd 4770K; will this CPU with its OC bottleneck 1080 SLI?

Thanks again

Sent from my iPhone using Tapatalk


----------



## S4ch4Z

Quote:


> Originally Posted by *wsarahan*
> 
> And the voltage % that was added at the new Afterburner? What value should I use?
> 
> Another question guys and this one is important
> 
> I have a [email protected] will this cpu with this oc will bottleneck with the 1080 SLI?
> 
> Thanks again
> 
> Sent from my iPhone using Tapatalk


The voltage slider didn't appear to have any effect on the voltage in my case;
I didn't see it go higher than 1.093V even when cranked to the max.

An OC'd 4770K won't really bottleneck per se, but you won't get the most out of your SLI with its 16 PCIe lanes. Don't worry, it won't be worse than a marginal loss from running 8x on each card.
Unless you're going for the win in benchmarks, you'll have plenty of power for any game (a single 1080 is already good enough for a smooth 60 FPS experience in the most demanding titles at 1440p, IMHO).


----------



## BrainSplatter

Quote:


> Originally Posted by *wsarahan*
> 
> And the voltage % that was added at the new Afterburner? What value should I use?


Just start with no voltage increase; often it's not necessary. And if the frequency is limited by the power target (not sure about your EVGA card), then higher voltage might actually result in a lower average frequency.
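The "more voltage can mean lower average clocks" point follows from dynamic power scaling roughly as P = C * V^2 * f: under a fixed power cap, the sustainable frequency falls as voltage rises. A back-of-the-envelope sketch (the 180 W budget, voltages, and effective-capacitance constant are all illustrative):

```python
def power_limited_freq(power_budget_w, voltage_v, c_eff=0.5):
    """Sustainable frequency under a power cap, from P = C * V^2 * f
    (arbitrary units; c_eff is a made-up effective capacitance)."""
    return power_budget_w / (c_eff * voltage_v ** 2)

f_stock  = power_limited_freq(180, 1.043)  # stock-ish voltage
f_bumped = power_limited_freq(180, 1.093)  # ~9% less frequency headroom
```

That's why bumping voltage only pays off when the chip needs it for stability at a clock it could otherwise sustain within the power budget.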
Quote:


> I have a [email protected] will this cpu with this oc will bottleneck with the 1080 SLI?


It depends mostly on the game. Some games like Total War: Warhammer can easily be CPU limited (GTA V also, sometimes), especially at 1920x1080 in the case of 1080s. But most popular games are not easily CPU limited. Also, at best you can expect 10% more performance from the fastest 4-core CPU available (and more cores are often not really used in most games).

One thing which can be an issue with recent games is the limit on the PCIe bus. Your mobo/chipset will only support x8 bus speed, and while this hasn't been much of a problem so far, it seems that with higher resolutions (e.g. DSR) and higher texture resolutions, the number of PCIe lanes from GPU to CPU can now be a limit. Doom seems to be the prime example for this:
http://www.forum-3dcenter.org/vbulletin/showthread.php?p=11035280#post11035280
http://forums.guru3d.com/showthread.php?t=408168
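For a sense of the numbers being discussed: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so x8 gives exactly half of x16's roughly 15.8 GB/s each way. The arithmetic:

```python
def pcie3_bandwidth_gbs(lanes):
    """Theoretical one-way PCIe 3.0 bandwidth in GB/s:
    8 GT/s per lane, 128b/130b encoding, 8 bits per byte."""
    return lanes * 8 * (128 / 130) / 8

x8  = pcie3_bandwidth_gbs(8)   # ~7.9 GB/s
x16 = pcie3_bandwidth_gbs(16)  # ~15.8 GB/s
```

Whether ~7.9 GB/s is actually a bottleneck depends on how aggressively a game streams textures over the bus, which is why Doom shows it while most titles don't.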


----------



## bonami2

Quote:


> Originally Posted by *BrainSplatter*
> 
> Just start with no voltage increase. Often it's not necessary. And if frequency should be limited by power target (not sure about your EVGA card) then higher voltage might actually result in lower average frequency.
> It depends mostly on the game. Some games like Total War-Warhammer can easily be CPU limited (GTA-V also sometimes), especially @ 1920x1080 in the case of 1080s. But most popular games are not easily CPU limited. Also at best u can expect 10% more performance with the fastest 4 core CPU available (and more cores are often not really used in most games).
> 
> One thing which can be an issue with recent games is the limit on the PCIE bus. Your mobo/chipset will only support x8 bus speed. And while this hasn't been much of a problem so far. It seems that with higher and higher resolutions (e.g. DSR) and higher texture resolutions, the number of PCIE lanes from GPU to CPU can now be a limit. Doom seems to be the prime example for this:
> http://www.forum-3dcenter.org/vbulletin/showthread.php?p=11035280#post11035280
> http://forums.guru3d.com/showthread.php?t=408168


Thanks for sharing that Doom info about PCIe.

I'm not sure if it's real or just a badly coded game, since lots of games have far better graphics.

If it's true... my x8/x8 1070 setup may be bottlenecked...


----------



## wsarahan

Quote:


> Originally Posted by *BrainSplatter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> And the voltage % that was added at the new Afterburner? What value should I use?
> 
> 
> 
> Just start with no voltage increase. Often it's not necessary. And if frequency should be limited by power target (not sure about your EVGA card) then higher voltage might actually result in lower average frequency.
> Quote:
> 
> 
> 
> I have a [email protected] will this cpu with this oc will bottleneck with the 1080 SLI?
> 
> Click to expand...
> 
> It depends mostly on the game. Some games like Total War-Warhammer can easily be CPU limited (GTA-V also sometimes), especially @ 1920x1080 in the case of 1080s. But most popular games are not easily CPU limited. Also at best u can expect 10% more performance with the fastest 4 core CPU available (and more cores are often not really used in most games).
> 
> One thing which can be an issue with recent games is the limit on the PCIE bus. Your mobo/chipset will only support x8 bus speed. And while this hasn't been much of a problem so far. It seems that with higher and higher resolutions (e.g. DSR) and higher texture resolutions, the number of PCIE lanes from GPU to CPU can now be a limit. Doom seems to be the prime example for this:
> http://www.forum-3dcenter.org/vbulletin/showthread.php?p=11035280#post11035280
> http://forums.guru3d.com/showthread.php?t=408168
Click to expand...

Thanks

My motherboard runs SLI at x16/x16. It's a Gigabyte Z87X-OC Force; I can see in GPU-Z that both cards are at x16 with my current 980 Ti SLI. I don't know what this mobo has, but it runs 16/16 when using 2 cards. I read about this mobo before buying it years ago.

So is running x16/x16 instead of x8/x8 a big difference?

Sent from my iPhone using Tapatalk


----------



## Noshuru

What clocks do you guys get on your 1080s? Mine crashes in OW if I set anything more than 2055MHz.


----------



## bonami2

Quote:


> Originally Posted by *wsarahan*
> 
> Thanks
> 
> My motherboard makes sli at 16/16x it's a Gigabyte Z87X OC FORCE, I can see at gpuz that both cards are at 16/16 with my actual 980Ti SLI, I don't know what this mobo have but it makes 16/16 when using 2 cards, I read about this mobo before buying it years ago
> 
> So making 16/16x and not 8/8x is a great diference?
> 
> Enviado do meu iPhone usando Tapatalk


I'm not sure if those PLX mobos are really x16/x16,

or if they just emulate it to make the NVIDIA driver think you can do 3/4-way SLI.

It's probably x8/x8 like mine, since Intel doesn't have the bandwidth allocation for more than that.

But I don't think it's an issue.


----------



## MissHaswellE

Hey guys, I'm planning on picking up a GTX 1080 non-Founders here in a little bit, and I had a question about NVIDIA Surround.
I was browsing around and found a set of three 1280x1024 monitors for less than $100 together as a bundle. I was thinking about picking them up before grabbing my 1080. I'm trying to remember, though: doesn't NVIDIA Surround require all monitors to be the same model? Or can I use a mixed set of monitors as long as they're all the same resolution?


----------



## S4ch4Z

Quote:


> Originally Posted by *wsarahan*
> 
> Thanks
> 
> My motherboard makes sli at 16/16x it's a Gigabyte Z87X OC FORCE, I can see at gpuz that both cards are at 16/16 with my actual 980Ti SLI, I don't know what this mobo have but it makes 16/16 when using 2 cards, I read about this mobo before buying it years ago
> 
> So making 16/16x and not 8/8x is a great diference?
> 
> Enviado do meu iPhone usando Tapatalk


OK, so your motherboard has a PLX chip for added PCIe lanes, just like dual-GPU graphics cards have.
It's a plus then, but an x8/x8 SLI setup wouldn't have been much of a downside vs. x16/x16 anyway.


----------



## BrainSplatter

Quote:


> Originally Posted by *wsarahan*
> 
> My motherboard makes sli at 16/16x it's a Gigabyte Z87X OC FORCE,
> So making 16/16x and not 8/8x is a great diference?


As mentioned before, that's more of an emulation for being able to put in 3 or 4 NVIDIA cards. It doesn't really have the same bandwidth as a real x16/x16 chipset, and the non-X79/X99 CPUs also have fewer PCIe lanes than their bigger brothers.

That said, until now this hasn't really been a big issue. In almost all games so far you might lose 1-5% performance compared to an X79/X99-based solution. It's more something to keep an eye on for the future.


----------



## wsarahan

Quote:


> Originally Posted by *S4ch4Z*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> Thanks
> 
> My motherboard makes sli at 16/16x it's a Gigabyte Z87X OC FORCE, I can see at gpuz that both cards are at 16/16 with my actual 980Ti SLI, I don't know what this mobo have but it makes 16/16 when using 2 cards, I read about this mobo before buying it years ago
> 
> So making 16/16x and not 8/8x is a great diference?
> 
> Enviado do meu iPhone usando Tapatalk
> 
> 
> 
> OK, so your motherboard has a PLX chip for added PCIE lanes, just like dual GPU graphics cards have.
> It's a plus then but a 8X +8X SLI wouldn't have been much of a downside VS 16X + 16X anyway.
Click to expand...

So does this mean my MB runs a real x16/x16, or is it something different?

Sent from my iPhone using Tapatalk


----------



## bonami2

Quote:


> Originally Posted by *wsarahan*
> 
> So this mean that my MB makes a real 16/16 or its something different?
> 
> Enviado do meu iPhone usando Tapatalk


It shows x16/x16 in Windows and all settings,

but bandwidth- and performance-wise it's x8/x8.

Still, that's more than enough.


----------



## wsarahan

Quote:


> Originally Posted by *bonami2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> So this mean that my MB makes a real 16/16 or its something different?
> 
> Enviado do meu iPhone usando Tapatalk
> 
> 
> 
> It show x16 x16 in windows and all setting
> 
> But bandwith and performance wise it x8 x8.
> 
> But it more than enough.
Click to expand...

If it's really x8/x8 and not x16/x16, why do they add these extra lanes and show a misleading x16/x16 in Windows? The consumer will always be wrong about what he actually bought.

I thought I had a real x16/x16.

Sent from my iPhone using Tapatalk


----------



## axiumone

Quote:


> Originally Posted by *bonami2*
> 
> It show x16 x16 in windows and all setting
> 
> But bandwith and performance wise it x8 x8.
> 
> But it more than enough.


That's wrong.

http://www.avagotech.com/products/pcie-switches-bridges/pcie-switches/pex8747

Full 16x bandwidth at the cost of additional latency. It's a switch that routes packets through a buffer.


----------



## wsarahan

I found this; it doesn't say much, but...

The main difference is that the Z87X-OC Force is equipped with a PLX PEX8747 chip that can double the 16 PCI-Express 3.0 lanes from the CPU. Because of that, the four orange PCI-Express slots can be configured 16/0/16/0, 16/0/8/8 and 8/8/8/8, which supports 4-way SLI and 4-way CrossFire. The black PCI-Express x16 slot is directly connected to the CPU, so if you have one video card you want to overclock you have 0 extra latency. It's a clever solution we've seen before with MSI and ASRock.

Sent from my iPhone using Tapatalk


----------



## S4ch4Z

The implementation of a PLX chip on your motherboard seems more like a tweak from the manufacturers
to enable 4-way SLI support on chipsets that can't normally support it than anything else.


----------



## wsarahan

Quote:


> Originally Posted by *S4ch4Z*
> 
> Implementation of a PLX chip on your motherboard seems more like a tweak from the manufacturers
> to enable 4 Way SLI support on chipsets which cannot normally support it than anything else.


I'll never run 4-way SLI.

I just want the SLI to work at x16/x16; I don't know much about these PLX chips.

Sent from my iPhone using Tapatalk


----------



## shilka

Has there been anything new about the Gigabyte Extreme Gaming card?


----------



## ottoore

http://www.anandtech.com/show/6170/four-multigpu-z77-boards-from-280350-plx-pex-8747-featuring-gigabyte-asrock-ecs-and-evga

Only the extreme CPUs provide more than 16 PCIe lanes; PLX chips just help with management.


----------



## bonami2

Quote:


> Originally Posted by *axiumone*
> 
> That's wrong.
> 
> http://www.avagotech.com/products/pcie-switches-bridges/pcie-switches/pex8747
> 
> Full 16x bandwidth at the cost of additional latency. It's a switch that routes packets through a buffer.


They can't give more bandwidth than what the CPU provides, I'm sure.

No one would buy a 5820K or 6800K if the x16 claim were true.

Show proof.


----------



## bonami2

Well, I googled a bit.

The PLX chip acts as a switch.

It allows traffic to go from GPU 1 to GPU 2 and lets them run at x16 individually,

but with both at full speed the PCIe will be at x8.

So for gaming it's worthless.


----------



## AllGamer

Quote:


> Originally Posted by *S4ch4Z*
> 
> (a single 1080 is already good enough for a 60 FPS smooth experience in the most demanding titles in 1440P IMHO)


Not quite true;
yesterday I ran Star Citizen with only one GTX 1080 and it was barely making 33 FPS in 2D Surround mode,
and if 3D Surround is enabled, the FPS drops in half.

I placed an order for a second GTX 1080 to go SLI, and hopefully it can pull 60+ FPS in 2D and at least 30 FPS in 3D Surround mode.


----------



## CallsignVega

Quote:


> Originally Posted by *pez*
> 
> It was mentioned by someone that the EVGA v2 bridges will achieve/meet the standards of the HB bridge, but I'm not sure if that has been confirmed as of yet. I know cards like the Gigabyte Xtreme Gaming bundle will include a compatible bridge, but between the EVGA v2 SLI bridge, Gigabyte Xtreme Gaming bridge,and NVIDIA's own bridges, that's all I know of.


The new HB bridge is physically completely different from the old bridges. The HB uses both SLI fingers of both GPUs at the same time, something that does not happen with any older SLI bridge. The only workaround right now may be to use two floppy single SLI ribbon cables until NVIDIA gets off their butts and makes the bridges they've been dragging ass on.


----------



## S4ch4Z

It all comes down to the games.
Quote:


> Originally Posted by *AllGamer*
> 
> Not quite true,
> yesterday I ran StarCitizen with only 1 GTX1080 and it was barely making 33 FPS in 2D Surround Mode,
> if 3D Surround is enabled, the FPS drops in half.
> 
> I placed an order for the 2nd GTX1080 to go SLI, and hopefully it can pull 60+ FPS on 2D and at least 30 FPS on 3D Surround Mode.


Well, there must be some exceptions, of course. Not being a Star Citizen or The Witcher III player, for instance,
I fully realize that one card can't cut it for some.
But from what I've experienced using this card so far, I feel like I have absolutely no use for 1080 SLI as of now in Crysis 3, GTA V, Fallout 4 or Project CARS at 1440p (ditching just a few useless Ultra settings for High instead).


----------



## Cial00

Quote:


> Originally Posted by *AllGamer*
> 
> Not quite true,
> yesterday I ran StarCitizen with only 1 GTX1080 and it was barely making 33 FPS in 2D Surround Mode,
> if 3D Surround is enabled, the FPS drops in half.
> 
> I placed an order for the 2nd GTX1080 to go SLI, and hopefully it can pull 60+ FPS on 2D and at least 30 FPS on 3D Surround Mode.


That has more to do with the game being in an unoptimized alpha state. I don't follow Star Citizen, but the demo of the SP campaign they showed last year had horrible framerate and optimization. It'd be fairer to judge when it releases and has Game Ready drivers available.


----------



## Setzer

Quote:


> Originally Posted by *AllGamer*
> 
> Not quite true,
> yesterday I ran StarCitizen with only 1 GTX1080 and it was barely making 33 FPS in 2D Surround Mode,
> if 3D Surround is enabled, the FPS drops in half.
> 
> I placed an order for the 2nd GTX1080 to go SLI, and hopefully it can pull 60+ FPS on 2D and at least 30 FPS on 3D Surround Mode.


Star Citizen is still in alpha and runs like horses**t. It really is a very bad benchmark.


----------



## axiumone

Quote:


> Originally Posted by *bonami2*
> 
> Well i googled a bit
> 
> The plx chip act as a switch
> 
> It allow to go from gpu 1 to 2 and allow them to run at x16 individualy
> 
> But both at full speed the pcie will be at x8
> 
> So in gaming it wortless.


Bud, everything is in the link I posted. It's a switch. Meaning if you have 2 GPUs with the switch on a Z170 chipset, each one communicates with the CPU at x16. The switch holds the data packets in a buffer and sends the full 8GB/s worth of data every time it switches. Clearly this happens at the cost of additional latency; Avago quotes the latency as 100ns.

Now, whether this has any effect on gaming is still a very gray area. Very few reviewers have tested this. However, what we do know is that there is a performance difference between x16 and x8 PCIe (see Doom 2016 in SLI). That performance difference will only grow larger as more 1440p and 4K high-refresh-rate displays see adoption.

Hopefully that clears it up for you.
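For anyone following along, here's a rough sketch (in Python, with assumed round numbers) of the sustained-bandwidth argument being debated: each GPU gets a full x16 link to the switch, but the switch's single x16 uplink to the CPU is shared. Bursty GPU traffic rarely saturates both downstream links at once, which is why the switch can still help in practice.

```python
# Hypothetical sketch of the PLX switch topology discussed above.
# Assumed numbers: PCIe 3.0 x16 is roughly 15.75 GB/s per direction.
UPLINK_GBPS = 15.75        # single x16 uplink from the switch to the CPU
PER_GPU_LINK_GBPS = 15.75  # each downstream GPU link is also x16

def sustained_per_gpu(active_gpus):
    """Upper bound per GPU when all active GPUs stream to the CPU at once."""
    return min(PER_GPU_LINK_GBPS, UPLINK_GBPS / active_gpus)

print(sustained_per_gpu(1))  # one GPU bursting: full x16 (15.75 GB/s)
print(sustained_per_gpu(2))  # both saturating at once: x8-equivalent each
```

So both camps are partly right: each link negotiates x16, but simultaneous full-rate transfers split the shared uplink.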


----------



## bonami2

Quote:


> Originally Posted by *axiumone*
> 
> Bud, everything is in the link I posted. It's a switch. Meaning if you have 2 gpus with the switch on a z170 chipset, each one communicates with the cpu at 16x. The switch hold's the data packets as a buffer and sends the full 8GB/s worth of data every time it switches. Now clearly this happens at the cost of additional latency, Avagotech quotes that the latency is 100ns.
> 
> Now, whether this has any affect on gaming is still a very gray area. Very few reviewers have tested this. However, what we do know is that there is a performance difference between 16x and 8x pcie (See Doom 2016 in sli). That performance difference will only grow larger as more 1440 and 4k high refresh rate displays see adoption.
> 
> Hopefully that clears it up for you.


If you see 2 GPUs maxed, it's x8/x8, since both GPUs work equally hard; if one saturates, both do.

So it's worthless except if you run 3- or 4-way SLI, since NVIDIA doesn't do SLI on x4.


----------



## VSG

This was supposed to be announced tomorrow but a leak resulted in the press release coming sooner:

http://www.overclock3d.net/articles/gpu_displays/msi_to_release_water_cooled_gtx_1080_sea_hawk_ek_gpu/1


----------



## bfedorov11

http://www.avagotech.com/docs/12351854

http://www.overclock.net/t/1497055/z97-and-plx-latency/0_40#post_22448316

It seems the PLX acts as a middleman/switch: 16 lanes to the CPU and 32 to the 4 slots for x8/x8/x8/x8. It probably works well, since no GPU can saturate x16 3.0 or even x8 3.0.


----------



## AllGamer

Quote:


> Originally Posted by *geggeg*
> 
> This was supposed to be announced tomorrow but a leak resulted in the press release coming sooner:
> 
> http://www.overclock3d.net/articles/gpu_displays/msi_to_release_water_cooled_gtx_1080_sea_hawk_ek_gpu/1


Not too surprising; it was long expected.

I actually saw those listed on a few distributor/warehouse sites, without any ETA or price info, but they weren't accepting back orders for them yet.

I was actually planning to pick up a couple of those before I ordered the 1080 FE, which I plan to run with EK blocks.

But now that the official Sea Hawks are listed to use EK blocks with custom logos on them... I might just ditch the current 1080 FE and get two of these nice sexy MSIs with factory EK blocks built in.


----------



## Baasha

Well, here it is folks:


----------



## Naked Snake

So what temps are you guys getting with 100% fan speed and 2.0 ghz OC on the FE?


----------



## pez

Quote:


> Originally Posted by *CallsignVega*
> 
> The new HB bridge is completely different physically than the old bridges. The HB uses both SLI fingers of both GPU's at the same time, something that does not happen with any older SLI bridge. The only work around right now may be to use two floppy single SLI ribbon cables until NVIDIA gets off their butts and makes the bridges which they've been dragging ass on.


Well, in reference to the picture below, that's the reason for my saying that. Basically, anything 5K and up is what's limited by having an LED bridge instead of the HB one.


----------



## auraofjason

Quote:


> Originally Posted by *Naked Snake*
> 
> So what temps are you guys getting with 100% fan speed and 2.0 ghz OC on the FE?


68-70C at 100% fan and 2088MHz on mine. Way too loud for me, though.


----------



## wsarahan

2088 on the core; and how much on the memory?

Thanks


----------



## auraofjason

Quote:


> Originally Posted by *wsarahan*
> 
> 2088 on cores an memos how much?
> 
> Thanks


+450, so 10908MHz effective (5454MHz in Afterburner).
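For anyone puzzled by the figures: Afterburner reports half the effective GDDR5X data rate, so the offset math works out like this (a sketch; the stock reading of ~5004 MHz is an assumption based on the numbers quoted here).

```python
# GDDR5X clock arithmetic behind the figures quoted above.
# Assumption: Afterburner's reading is half the effective data rate.
stock_reading_mhz = 5004  # approximate stock Afterburner reading on a GTX 1080
offset_mhz = 450          # the +450 offset applied
afterburner_mhz = stock_reading_mhz + offset_mhz
effective_mhz = afterburner_mhz * 2
print(afterburner_mhz, effective_mhz)  # 5454 10908
```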


----------



## Naked Snake

Quote:


> Originally Posted by *auraofjason*
> 
> 68-70c at 100% fan and 2088mhz on mine. Way too loud for me though.


Thanks!


----------



## rv8000

So does NVIDIA still have an issue with idle clocks @ 144Hz?

It's idling @ 1257 on the core @ 62C; the second I drop to 120Hz, the clocks drop to low-power states...


----------



## auraofjason

Quote:


> Originally Posted by *rv8000*
> 
> So does Nvidia still have an issue with idle clocks @ 144hz?
> 
> Idling @ 1257 on the core @ 62c, second I drop to 120hz the clocks drop to lpstates...


Seems like it. Weird thing is though, if I enable g-sync it does downclock even at 165hz. If I turn off g-sync it will not downclock at 165hz or 144hz.


----------



## rv8000

Quote:


> Originally Posted by *auraofjason*
> 
> Seems like it. Weird thing is though, if I enable g-sync it does downclock even at 165hz. If I turn off g-sync it will not downclock at 165hz or 144hz.


Really disappointing.

On a side note, there's a slight amount of coil whine with my EVGA SC, but in terms of coil whine alone it's probably the quietest I've had since my 7950/7970 cards.


----------



## wsarahan

Quote:


> Originally Posted by *rv8000*
> 
> Really disappointing.
> 
> On a side note, there's a slight amount of coil whine with my EVGA SC, but in terms of coil whine alone it's probably the quietest I've had since my 7950/7970 cards.


How much did you get at your 1080 OC?

I bought 2 SC as well and waiting to arrive

Thanks


----------



## rv8000

Quote:


> Originally Posted by *wsarahan*
> 
> How much did you get at your 1080 OC?
> 
> I bought 2 SC as well and waiting to arrive
> 
> Thanks


Installing AB now, will check in a few minutes. Actually wanted to play games after not being able to touch any after 2 months of no GPU.


----------



## stanielz

The MSI Afterburner option "force constant voltage" doesn't do anything, does it?


----------



## Benjiw

Quote:


> Originally Posted by *stanielz*
> 
> MSI afterburner option "force constant voltage" doesnt do anything does it?


It should keep the voltage pinned even at idle just like having your CPU voltage set to manual rather than adaptive or offset.


----------



## rv8000

So my EVGA SC is boosting to ~1940 at stock. Managed to get +120/550 (the core hits around 2075).

Definitely a few more MHz left in it once we can BIOS mod; it's spiking above the 120% power limit quite often in GT2.


----------



## Dayaks

Quote:


> Originally Posted by *S4ch4Z*
> 
> OK, so your motherboard has a PLX chip for added PCIE lanes, just like dual GPU graphics cards have.
> It's a plus then but a 8X +8X SLI wouldn't have been much of a downside VS 16X + 16X anyway.


Vega has shown in SLI that it is important to have 16x 3.0 PCIe lanes. He did some really good testing a while back and I am sure it's only more important as the cards have gotten faster.
https://hardforum.com/threads/pci-e-speed-tests-ramblings-for-video-cards.1878382/

TL;DR: In SLI, PCIe 3.0 x16 is on average 14% faster than PCIe 3.0 x8 (equivalent to 2.0 x16).


----------



## bonami2

Quote:


> Originally Posted by *Dayaks*
> 
> Vega has shown in SLI that it is important to have 16x 3.0 PCIe lanes. He did some really good testing a while back and I am sure it's only more important as the cards have gotten faster.
> https://hardforum.com/threads/pci-e-speed-tests-ramblings-for-video-cards.1878382/
> 
> TLDR: In SLI PCIe 3.0 16x is on average 14% faster than PCIe 3.0 8x (2.0 16x)


Well, it does show an increase, but nothing magical:

100 fps vs 114 fps if it's 14%.

And the 1070/1080 have that new compression scheme, so maybe they'll use less bandwidth.


----------



## marc0053

Got a gigabyte G1 GTX 1080 today and it's a lot of fun to play with.
Going from air to water gave me an additional 10mhz...
http://www.3dmark.com/fs/8833401


----------



## BigBeard86

Wow...only 10 MHz from air to water? I was thinking about mounting my kraken g10 on the 1080, but now it seems like it may not be worth the hassle.


----------



## Dayaks

Power limited for both air and water I take it?


----------



## rv8000

My god, 82C at 70% fan speed, this cooler is either awful or there's a mountain of thermal paste on there. My Tri-X Fury hardly hit 68c @ 40% fan speed.


----------



## ucode

Has anyone run a Furmark P1080 bench on their GTX 1080?


----------



## TK421

Quote:


> Originally Posted by *ucode*
> 
> Has anyone run a Furmark P1080 bench on their GTX 1080?


Never run FurMark on any video card unless you want to burn it.


----------



## MrDerrikk

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ucode*
> 
> Has anyone run a Furmark P1080 bench on their GTX 1080?
> 
> 
> 
> never run furmark on any video card, unless you want to burn it
Click to expand...

Shouldn't it be fine if you keep it at lower temps? I've done a few runs when testing my old 970 G1 and it kept below 60C, but now you've got me worried.


----------



## GOLDDUBBY

Who has the "reviewers" bios (86.04.17.00.27) ?? Please share it!!


----------



## BURGER4life

Quote:


> Originally Posted by *rv8000*
> 
> My god, 82C at 70% fan speed, this cooler is either awful or there's a mountain of thermal paste on there. .


----------



## Xeq54

Quote:


> Originally Posted by *MrDerrikk*
> 
> Shouldn't it be fine if you keep it at lower temps? I've done a few runs when testing my old 970 G1 and it kept below 60C, but now you've got me worried.


That depends. FurMark puts an extremely high load on the card, especially the VRM. For example, my MSI GTX 1080 doesn't even run at boost clock with FurMark; it stays at the base clock. There have been cases where the VRM simply burned with FurMark. Custom boards with large VRMs should be fine, but I wouldn't run it on a reference board. That's why it's not used much nowadays.


----------



## ChevChelios

Quote:


> Originally Posted by *TK421*
> 
> never run furmark on any video card, unless you want to burn it


So what should I run to test an OC then? 3DMark?


----------



## marc0053

Quote:


> Originally Posted by *BigBeard86*
> 
> Wow...only 10 MHz from air to water? I was thinking about mounting my kraken g10 on the 1080, but now it seems like it may not be worth the hassle.


Maybe it's due to the G1 cooler being pretty efficient as it is. On air it idles around 35C and hits 62C under Fire Strike load; with the waterblock it's 25C idle and 32C under load.
Quote:


> Originally Posted by *Dayaks*
> 
> Power limited for both air and water I take it?


The card does reach the power limit quite fast, and the core clocks seem to throttle accordingly. Going from stock voltage to +100mV did not help.


----------



## drop24

Quote:


> Originally Posted by *ChevChelios*
> 
> so what should run to test OC then ? 3Dmark ?


FFXIV Heavensward benchmark is the best. Run it maxed out settings at 4K (use DSR if you don't have a 4K monitor).


----------



## Jpmboy

Quote:


> Originally Posted by *BrainSplatter*
> 
> As mentioned before, that's more of an emulation for being able to put in 3 or 4 NVIDIA cards. It doesn't really has the same bandwidth as a real x16/x16 chipset and the non X79/X99 compatible CPUs also have fewer PCIE lanes than their bigger brothers.
> 
> Said that, until now this hasn't really been a big issue yet. In almost all games so far u might loose 1-5% performance compared to a X79/X99 based solution. It's more something to keep an eye on for the future.


Well, not quite accurate. With a PLX multiplexer on board, the PCIe lanes run at x16. In a case where a 28-lane CPU would preclude 3-way, the PLX paces the lane load so the CPU does not exceed its lane count while communicating with the CPU bus at x16. The issue with a PLX in the lane path is that it must lead to a slight increase in latency. It's easy to verify the actual lane speed using "concbandwidthtest" (google it). A PLX mobo will run at x16 bandwidth.
Quote:


> Originally Posted by *wsarahan*
> 
> I found this, don't say much but....
> 
> The main difference is that the Z87X-OC Force is equipped with a PLX PEX8747 chip that can double the 16 PCI-Express 3.0 lanes from the CPU. Because of that, the four orange PCI-Express slots can be configured 16/0/16/0, 16/0/8/8 and 8/8/8/8 which supports 4-way SLI and 4-way Crossfire. The black PCI-Express x16 slot is directly connected to the CPU, so if you have one video card you want to overclock you have 0 extra latency. It's a clever solution we've seen before with MSI and ASRock.
> Enviado do meu iPhone usando Tapatalk


and ASUS E-WS motherboards.

Quote:


> Originally Posted by *marc0053*
> 
> Maybe it is due to the G1 cooler being pretty efficient as it is. On air it idle around 35C and 62C on firestrike load and the waterblock is 25C idle and 32C load.
> The card does reach the power limit quite fast and the core clocks seem to throttle accordingly. Going from stock voltage to increasing it by +100mv did not help.


yeah - we really need to get rid of the power limit in bios. I think these cards can do a lot more but for slamming into the power ceiling.


----------



## bonami2

So you agree with me that in a PCIe-bottlenecked case the PLX chip is worthless?

Since both GPUs do the same work in SLI, they will each want their x16... and the PLX will give them x8.

Kinda happy to have saved $100 when I got my Z97 Gaming 7.


----------



## reset1101

Hi, I've been reading part of the thread, but I can't read it all and I don't find a clear answer to my question.

I currently have a 980 Ti that boosts to 1350MHz. For playing at 1440p, what gain would I get by changing to a 1080 that clocks around 2GHz?

Thanks a lot for your help.


----------



## blurp

Quote:


> Originally Posted by *reset1101*
> 
> Hi, Ive been reading part of the thread but cant read it all and I dont find a clear answer to my question.
> 
> I currently have a 980Ti that boosts to 1350mhz. For playing at 1440p resolution, what gain would I get by changing to a 1080 that clocks around 2ghz?
> 
> Thanks a lot for your help


I boost @ 1500 fixed (modded BIOS), and my difference vs. a moderately overclocked 1080 is around 15%. I guess for you the difference would be around 20-25%. Not that much in real-life experience, I would say. The 1080 Ti is around the corner...
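A back-of-the-envelope way to sanity-check that estimate, using shader count × clock only (this ignores Pascal's architectural changes and memory bandwidth, so treat it as a rough bound, not a benchmark):

```python
# Raw shader throughput comparison: 980 Ti @ 1500 MHz vs 1080 @ 2000 MHz.
def raw_throughput(shader_cores, clock_mhz):
    return shader_cores * clock_mhz

gtx_980ti = raw_throughput(2816, 1500)  # the modded-BIOS boost quoted above
gtx_1080 = raw_throughput(2560, 2000)   # a ~2 GHz overclocked 1080

gain = gtx_1080 / gtx_980ti - 1
print(f"raw throughput gain: {gain:.0%}")  # ~21%
```

Which lands in the same ballpark as the 15-25% figures people report; a 980 Ti boosting at only 1350 would see a correspondingly larger gap.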


----------



## ChevChelios

up to 9 months =/= around the corner =/


----------



## pez

From what I saw of the review, the guy used different chipsets and motherboards. A better test would have been to take an x16/x16 PCIe 3.0 board and actually limit the slots to x8/x8.

If that's what the guy had done, I'm not sure why he would have labeled it as PCIe 2.0 x16.


----------



## Dayaks

Quote:


> Originally Posted by *pez*
> 
> From what u saw of the review, the guy used different chipsets and motherboards. A better test would have been to actually take a x16/x16 PCIe 3.0 board and actually limit the slots to x8/x8.
> 
> If that's what the guy had done, I'm not sure why he would have labeled if as PCIe 2.0 x16.


He tells you right in the post that it was all kept equal but the lanes. PCIe 3.0 x8 = PCIe 2.0 x16 in bandwidth.
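That equivalence follows from the per-lane numbers in the PCIe specs (a sketch; the transfer rates and encoding schemes are the standard spec values):

```python
# Why PCIe 3.0 x8 roughly equals PCIe 2.0 x16 in one-direction bandwidth:
# gen 3 doubles the transfer rate and swaps 8b/10b for leaner 128b/130b encoding.
def link_gbytes_per_s(gt_per_s, encoding_efficiency, lanes):
    return gt_per_s * encoding_efficiency * lanes / 8  # bits -> bytes

gen2_x16 = link_gbytes_per_s(5.0, 8 / 10, 16)    # 8.00 GB/s
gen3_x8 = link_gbytes_per_s(8.0, 128 / 130, 8)   # ~7.88 GB/s
print(f"2.0 x16: {gen2_x16:.2f} GB/s, 3.0 x8: {gen3_x8:.2f} GB/s")
```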


----------



## pez

Quote:


> Originally Posted by *Dayaks*
> 
> He tells you right in the post it was all kept equal but the lanes. PCIe 3.0 8x = PCIe 2.0 16x in bandwidth.


Ah, I read quickly and didn't go through it. Went back and read it.

This was a genuine concern of mine when I was looking at the new cards and thinking about SLI. However, at the same time, a couple of his results show no difference, or are within the margin of error. So it makes me ask 'what part of the game are you benching if not using a built-in tool?', 'how many runs for each game did you try?', etc.

Lots of variables here still that aren't quite addressed for those who aren't familiar with said individual.


----------



## CallsignVega

Quote:


> Originally Posted by *marc0053*
> 
> Got a gigabyte G1 GTX 1080 today and it's a lot of fun to play with.
> Going from air to water gave me an additional 10mhz...


Ouch. Basically the reason I don't even bother with water cooling anymore. With the die shrinks and air coolers getting really good these days, I've found that I can get within like 99% of the performance on air cooling and still keep a quiet system. Most things are voltage limited these days, not heat-dissipation limited.

My first FTW has shipped and should be here on Monday. Who knows when the second will ship. I'd like to test the new SLI HB bridge vs. the old one at different PCI-E bus speeds and see how she turns out.


----------



## SynchroSCP

Water cooling still has its place, but you're right: the lower TDP of new components and the improvements in air coolers make air a much more viable solution. Under water I get less throttling; my card holds 2132MHz much more consistently, which is about as high as it will go stable with the stock BIOS no matter how much I fiddle with the curve and voltage. Plus, going from the Titan X to the 1080, the delta T of my loop never goes above 10C, so I can maintain efficient cooling with minimal fan speed: a noticeable improvement.


----------



## davidelite10

I will be a part of this club when my 1080 comes in Saturday!

I can't wait!


----------



## MrDerrikk

Quote:


> Originally Posted by *CallsignVega*
> 
> My 1st FTW has shipped, should be here on Monday. Who knows when the second will ship. I'd like to do the new SLI HB bridge vs old testing on different PCI-E bus speeds and see how she turns out.


My FTW wasn't shipped yesterday, so it's probably on backorder.

Looking forward to seeing what temps it gets, though, and whether I made the right purchase.


----------



## gulvastr

Got my MSI Gaming X 1080 yesterday. What a beast. It replaced my two MSI Gaming R9 290s equipped with Kraken G10s, and I am fully satisfied. Just waiting on EK to get the blocks made; hoping to have it under water before the end of the month.


----------



## Benjiw

Quote:


> Originally Posted by *ChevChelios*
> 
> up to 9 months =/= around the corner =/


Lol indeed, people told me pascal was round the corner a year ago...


----------



## reset1101

I don't think NVIDIA will release big Pascal anytime soon. They first have to make a profit from the 1080/1070, and that will take several months, as there isn't even real stock (in Spain at least, and I don't think it's too different in the rest of Europe).

Then they may release a Titan-class GPU; I wouldn't expect it until late this year/beginning of next year. And by end of Q1 / Q2, the 1080 Ti.

But I might be totally wrong.


----------



## Menno

Hmm, I run 1080 FE SLI @ 5K, currently with 2 ribbons at x8/x8 PCIe 3.0. Wondering if the extra lanes and the HB bridge are going to make a difference. I always thought that x8 was enough.


----------



## CallsignVega

Just placed my 4-spacing HB bridge order:

http://www.geforce.com/hardware/10series/geforce-gtx-1080?ClickID=cfzawp4p4se7ksv7iw7ziefeafvvaiezensn


----------



## axiumone

Quote:


> Originally Posted by *CallsignVega*
> 
> Just placed my 4-spacing HB bridge order:
> 
> http://www.geforce.com/hardware/10series/geforce-gtx-1080?ClickID=cfzawp4p4se7ksv7iw7ziefeafvvaiezensn


Awesome, thank you. Saw your post before I saw the email notification.

Ordered overnight, I hope they go out tonight.


----------



## wsarahan

Guys i need your help

I just bought the HB bridge on the Nvidia site, but as I live in Brazil I asked a friend to buy it and ship it to me. In Brazil we use cm and not mm to measure things.

Today i have this bridge:



And I selected this size on the Nvidia site, the 4-space one. Are these both the same size? I need to be 100% sure so I can confirm the order:



Thanks


----------



## CallsignVega

Quote:


> Originally Posted by *wsarahan*
> 
> Guys i need your help
> 
> I just bought the HB bridge at Nvidia site but as i live in Brazil i asked a friend to buy and ship to me, but in Brazil we use cm and not mm to measure things
> 
> Today i have this bridge:
> 
> 
> 
> And i selected this size at Nvidia site, 4 Space one, is this both the same size ? I need to be 100% sure so i can confirm the order:
> 
> 
> 
> Thanks


All you need to know is, from the two slots that both your GPU's *plug into* (not overhang), are there three empty slots in between? Then you need the 4-slot spacer.
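For anyone else counting slots, the rule above can be written as a tiny lookup. This is my own sketch, not anything official; the gap-to-spacing convention is taken straight from the post above (three empty slots between the two cards means the 4-slot bridge):

```python
# Sketch: map the number of empty slots between the two SLI cards to the
# NVIDIA HB bridge size. Convention assumed here (from the post above):
# three empty slots between the cards -> the 4-slot spacing bridge.
def hb_bridge_spacing(empty_slots_between: int) -> str:
    sizes = {1: "2-slot", 2: "3-slot", 3: "4-slot"}
    if empty_slots_between not in sizes:
        raise ValueError("no standard HB bridge for this slot gap")
    return sizes[empty_slots_between]

print(hb_bridge_spacing(3))  # three empty slots -> 4-slot
```

Only count the slots the cards actually plug into, not the ones their coolers hang over.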


----------



## wsarahan

Quote:


> Originally Posted by *CallsignVega*
> 
> All you need to know is, from the two slots that both your GPU's *plug into* (not overhang), are there three empty slots in between? Then you need the 4-slot spacer.


I think I use the slots with the arrow.

But the pictures I sent before are the same length, right?


----------



## Setzer

Quote:


> Originally Posted by *wsarahan*
> 
> but in Brazil we use cm and not mm to measure things


You do realize 10mm = 1cm?


----------



## axiumone

Quote:


> Originally Posted by *wsarahan*
> 
> 
> 
> 
> 
> 
> I think i use this slots with the arrow
> 
> But the pictures I sent before are the same length, right?


You need the 4-slot bridge.


----------



## wsarahan

Quote:


> Originally Posted by *Setzer*
> 
> You do realize 10mm = 1cm?


Yes kkkk

but think with me
1 dollar = 3.6 Reais (Brazilian currency)

So I'm paying almost 4 times more than people in the USA and Canada. I wanted to make 100000% sure about what I was doing kkkk, that's why I asked even after looking twice at the bridges. It's a lot of money for us here in Brazil.


----------



## CallsignVega

Quote:


> Originally Posted by *Gerard*
> Another douchenozzle reviewing a ftw edition
> 
> 
> 
> 
> 
> 50-60mhz more on the core he got, ouchie,


Yikes, that is the second FTW that can't even hit the core clocks of some FE cards.


----------



## bonami2

Which hb bridge would fit for a msi z97 gaming 7 in sli?

Cant they just make short and long version? Damn


----------



## CallsignVega

Quote:


> Originally Posted by *bonami2*
> 
> Which hb bridge would fit for a msi z97 gaming 7 in sli?
> 
> Cant they just make short and long version? Damn


Why would they do that? There are three SLI spacing configs on motherboards, hence three SLI bridges. Your MB needs a 3-slot spacer.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *wsarahan*
> 
> Yes kkkk
> 
> but think with me
> 1 dollar = 3.6 Reais (Brazilian currency)
> 
> So i`m paying almost 4 times more than people at USA and Canada, i wanted to make 100000% sure about what i was doing kkkk that`s why i asked even looking twice at the bridges, it`s a lot of money for us here in Brazil


Well, the value of the currency depends on what the money is worth in the country. How much does a Big Mac cost in Brazilian money in Brazil?


----------



## axiumone

Oh FFS, Nvidia. Ordered the HB bridge overnight to have it delivered tomorrow. The order has already been processed and I have a tracking number. Scheduled for delivery Monday... Nowhere in the web store does it say no weekend deliveries, especially for overnight.


----------



## Cial00

Finally had time to install my 1080 into my loop today. Running the EVGA SC with the EK block & backplate. Temps are 28-31C idle, with load topping out at 38-39C at 22-23C ambient. I had a bit of coil whine when I was on the stock air cooler, but it has gone down dramatically under water. Really loving how cool this thing runs. This is with a 2100 core, 10,500 mem overclock. Can't wait for a custom BIOS!


----------



## SynchroSCP

Quote:


> Originally Posted by *Cial00*
> 
> Finally had time to install my 1080 into my loop today. Running the EVGA SC with the EK block & backplate. Temps are idle 28-31c with load topping out at 38-39c.... 22-23c ambient. I had a bit of coil whine when I was on the stock air cooler, but it has gone down dramatically under water. Really loving how cool this thing runs. This is with a 2100core, 10,500mem overclock. Can't wait for a custom bios!


Pretty much my experience as well; it will be interesting to see if the SC version turns out to clock better than the FTW overall.


----------



## ChevChelios

Quote:


> Running the *EVGA* SC with the EK block & backplate. Temps are idle 28-31c with load topping out at 38-39c.... 22-23c ambient. *I had a bit of coil whine* when I was on the stock air cooler, but it has gone down dramatically under water. Really loving how cool this thing runs. This is with a 2100core, 10,500mem overclock. Can't wait for a custom bios!


hmm

so so far we have

EVGA SC with a bit of coil whine

That's the second EVGA FTW now that underperforms when overclocking...


----------



## Cial00

Quote:


> Originally Posted by *SynchroSCP*
> 
> Pretty much my experience as well, will be interesting if the SC version turns out to clock better than the FTW overall.


Ya, I'm curious too. I think they will all perform similarly until we have a custom BIOS, and then maybe the custom PCBs will shine. But even if they are able to hit 2300-2400 core, I don't think the added cost is worth it for 250 MHz.

I personally feel the cheapest 1080 reference with the EK block is going to be the best price/performance for this generation of x80s on water.


----------



## CallsignVega

Quote:


> Originally Posted by *axiumone*
> 
> Oh FFS nvidia. Ordered the HB bridge overnight to have it delivered tomorrow. Order is already processed and have a tracking number. Scheduled for delivery Monday... No where in the web store does it say no weekend deliveries. Especially for overnight.


Ya I hate that. Usually it would have to say weekend delivery specifically.


----------



## wsarahan

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Well the value of the currency depends on what the money is worth in the country. How much does a Big Mac Co cost in Brazil money in Brazil?


18,50 Brazilian real


----------



## wsarahan

Guys i have a question

This board is now available here in Brazil:



I already bought the EVGA ACX 3.0 SC one. Should I ask the store to change it? It's the same price.

What do you guys think?

Thanks


----------



## techguymaxc

Quote:


> Originally Posted by *wsarahan*
> 
> Guys i have a question
> 
> It`s now avaiable this board here in Brazil
> 
> 
> 
> I already bought the Evga ACX 3.0 SC one, should i ask the store to change? It`s the same value
> 
> What you guys think?
> 
> Thanks


If you're going to keep the stock cooler go with the MSI, it ought to be cooler and quieter than the EVGA. If you intend to water cool, go with whichever one you can find a block for.


----------



## wsarahan

Quote:


> Originally Posted by *techguymaxc*
> 
> If you're going to keep the stock cooler go with the MSI, it ought to be cooler and quieter than the EVGA. If you intend to water cool, go with whichever one you can find a block for.


I'll stay with the stock cooler, no water.


----------



## stanielz

Quote:


> Originally Posted by *Cial00*
> 
> Finally had time to install my 1080 into my loop today. Running the EVGA SC with the EK block & backplate. Temps are idle 28-31c with load topping out at 38-39c.... 22-23c ambient. I had a bit of coil whine when I was on the stock air cooler, but it has gone down dramatically under water. Really loving how cool this thing runs. This is with a 2100core, 10,500mem overclock. Can't wait for a custom bios!


So I'm running SLI 1080s and we have the same idle temps, but my load temps are like 10C higher than yours. How much rad space do you have? Wonder if my loop is ****ed :/


----------



## rv8000

Quote:


> Originally Posted by *wsarahan*
> 
> Guys i have a question
> 
> It`s now avaiable this board here in Brazil
> 
> 
> 
> I already bought the Evga ACX 3.0 SC one, should i ask the store to change? It`s the same value
> 
> What you guys think?
> 
> Thanks


Get the MSI card if it is truly the same price, period!

Side note: repasted my ACX 3.0 SC and max temps went from 83C down to about 75C; idle temps are pretty much a crapshoot. This is at 70% fan speed, or ~2000 RPM.


----------



## Cial00

Quote:


> Originally Posted by *stanielz*
> 
> So I'm running SLI 1080s and we have the same idle temps but my load temps are like 10c higher than yours, how much rad space you have? Wonder if my loop is ****ed :/


Just a 360 rad with push/pull. I used kryonaut on the gpu instead of EK's tim.


----------



## ChevChelios

Quote:


> Originally Posted by *wsarahan*
> 
> Guys i have a question
> 
> It`s now avaiable this board here in Brazil
> 
> 
> 
> I already bought the Evga ACX 3.0 SC one, should i ask the store to change? It`s the same value
> 
> What you guys think?
> 
> Thanks


yes absolutely change for the Gaming X if they cost the same

if I could freely choose any custom 1080 outside of the highest-end stuff like Classy, Kingpin, Lightning etc. .. then I'd choose Gaming X


----------



## Shaded War

Anyone have any experience with the Zotac 1080 AMP? I just caught them in stock on Newegg and ordered one. I can't really find any reviews or videos on it, since they all focus on the Extreme version. I've never bought a Zotac product before and have no idea what to expect from them. Since all 1080s overclock about the same, I just want something cool and quiet, and this looks like it should do.

Official website LINK

Also, still in stock on Newegg for $640 if you want one! LINK


----------



## Setzer

Quote:


> Originally Posted by *ChevChelios*
> 
> yes absolutely change for the Gaming X if they cost the same
> 
> if I could freely choose any custom 1080 outside of the highest-end stuff like Classy, Kingpin, Lightning etc. .. then I'd choose Gaming X


Is it really that much better than the EVGA FTW card?


----------



## rv8000

Quote:


> Originally Posted by *Setzer*
> 
> Is it really that much better than the EVGA FTW card?


Cooler-wise, AFAIK, yes. Component-wise the FTW may be superior, though there's no easy way to find out the quality of the caps/chokes used. The OP was asking about the SC vs the Gaming X; the FTW has a larger cooler and should perform better than the standard ACX 3.0.


----------



## ChevChelios

Quote:


> Originally Posted by *Setzer*
> 
> Is it really that much better than the EVGA FTW card?


I haven't seen extensive reviews of the FTW's cooler, tbh, but I can't imagine it being better than the Gaming X, which already beats the Strix and G1 by a bit...

and on the other hand, there have been 2 (I think?) reports now of the FTW failing to achieve a 2050-2100 OC, which virtually all other cards can do...

and between the ACX 3.0 and the Gaming X it's an absolute no-brainer


----------



## stanielz

Quote:


> Originally Posted by *Cial00*
> 
> Just a 360 rad with push/pull. I used kryonaut on the gpu instead of EK's tim.


Yeah, I used EK's TIM. I have a 480 rad and a 360 rad with just pull. Is your CPU also in your loop? What kind of pump are you running? Feels like I did something wrong; TIM shouldn't make that big of a difference considering my rad space.


----------



## Benjiw

Quote:


> Originally Posted by *stanielz*
> 
> Yeah I used EKs tim. I have 480 rad and 360 rad with just pull. Is your cpu also in your loop? What kind of pump are you running? Feels like I did something wrong and TIM shouldn't make that big of a dif considering my rad space.


Hmm, what rads are they, what pump are you using, and how fast is it running? Obviously pump speed makes a difference if the coolant is flowing too slowly and your fans are too low.


----------



## stanielz

Quote:


> Originally Posted by *Benjiw*
> 
> Hmm, what rads are they, what pump are you using and how fast is the pump going? obviously pump speed makes a difference if the coolant is flowing too slowly and you fans are too low.


The 360 rad is an EK XE 60mm, and the 480 (2x 240s) are EK PE 40mm. The pump is an EK DDC 3.1 @ 3k RPM, which is not adjustable. And I'm just running EK's clear coolant. Soft hoses, no 90s. I live in England and there's no AC; idk how much that plays a role.


----------



## wsarahan

Quote:


> Originally Posted by *ChevChelios*
> 
> yes absolutely change for the Gaming X if they cost the same
> 
> if I could freely choose any custom 1080 outside of the highest-end stuff like Classy, Kingpin, Lightning etc. .. then I'd choose Gaming X


Looked at the website; there is only one piece available, so no deal. I'll probably have to stay with the SC model.

I contacted the store owner, let's see.


----------



## Maintenance Bot

Quote:


> Originally Posted by *Shaded War*
> 
> Anyone have any experience with the Zotac 1080 Amp? I just caught them in stock on newegg, and ordered it. I can't really find any reviews or videos on it since they all focus on the Extreme version. I'v never bought a Zotac product before and have no idea what to expect from them. Since all 1080's overclock about the same, I just want something cool and quiet and this looks like it should do.
> 
> Official website LINK
> 
> Also, still in stock on Newegg for $640 if you want one! LINK


Let us all know how that card clocks, looks nice. I had a 980ti amp extreme that was a good card.

I was just looking at the 1080 AMP Extreme specs, and their website says 10.8 GHz for the default memory clock. I wonder if they are binning mem chips or something.

https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-extreme#spec


----------



## rv8000

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Let us all know how that card clocks, looks nice. I had a 980ti amp extreme that was a good card.
> 
> I was just looking at the 1080 AMP Extreme specs, and their website says 10.8 GHz for the default memory clock. I wonder if they are binning mem chips or something.
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-extreme#spec


Almost every card I've seen in reviews is hitting ~11 GHz effective on the memory; I'd say Zotac is just pushing the clocks as far as they can to get better stock performance.
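For anyone puzzled by the "10.8 GHz" figure: GDDR5X moves 8 bits per pin per memory-clock cycle, so the marketed "effective" MHz is 8x the real memory clock, and bandwidth falls out of the bus width. A hedged sketch (the 256-bit bus is the stock GTX 1080 spec; treat the helper name as mine):

```python
# Bandwidth from a GDDR5X "effective" clock: effective MHz ~= MT/s per pin,
# so GB/s = MT/s * (bus width in bytes) / 1000.
def gddr5x_bandwidth_gb_s(effective_mhz: float, bus_width_bits: int = 256) -> float:
    return effective_mhz * (bus_width_bits / 8) / 1000

print(gddr5x_bandwidth_gb_s(10000))  # stock GTX 1080 (~10 GHz effective): 320.0 GB/s
print(gddr5x_bandwidth_gb_s(10800))  # Zotac's 10.8 GHz spec: 345.6 GB/s
```

So Zotac's factory bump is worth roughly 25 GB/s of extra bandwidth on paper, well within what reviewers are reaching manually.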


----------



## Setzer

Quote:


> Originally Posted by *ChevChelios*
> 
> I havent seen extensive reviews on FTWs cooler tbh, but I cant imagine it being better then Gaming X which already beats Strix and G1 by a bit ..
> 
> and *on the other hand there have been 2 (I think ?) reports now of FTW failing to achieve 2050-2100 OC which virtually all other cards can do ...*
> 
> and between ACX 3.0 vs Gaming X its an absolute no brainer


Yeah the 2 most recent YouTube reviews of the FTW card shows some pretty bad overclocking results considering the tag of the card.
I've already cancelled my FTW order and replaced it with the MSI card (which should be in stock 1 week earlier).
Thanks for the help


----------



## Plex

Well, I lost the silicon lottery.

I can barely hit 1900 with my EVGA SC 1080.

Rough life.


----------



## Setzer

Quote:


> Originally Posted by *Plex*
> 
> Well, I lost the silicon lottery.
> 
> I can barely hit 1900 with my EVGA SC 1080.
> 
> Rough life.


That sucks!
But it turns out more and more EVGA cards are struggling to overclock well. SC and FTW, hmmm.


----------



## stanielz

Quote:


> Originally Posted by *Setzer*
> 
> That sucks!
> But turns out more and more EVGA cards are struggling to overclock well. SC and FTW hmmm


Yeah, both of my FEs hit 2139... pretty odd.


----------



## rv8000

Quote:


> Originally Posted by *Setzer*
> 
> That sucks!
> But turns out more and more EVGA cards are struggling to overclock well. SC and FTW hmmm


At most I'm hovering between 2075 and 2100 on the core with my SC, and other AIB cards have been all over the place as well.

We need the ability to mod the BIOS before we can really put any board differences to rest, or truly find out whether this is just how Pascal is going to be.


----------



## Setzer

These two cards were only able to overclock an extra 60 and 75 MHz respectively, on top of the 1860 MHz boost clock that the FTW has as standard. That's 1920 / 1935 MHz respectively. Pretty bad for FTW cards, if you ask me.


Spoiler: YouTube vids


----------



## stanielz

yo ***, Awesomesauce Network complains about LEDs, claims it's faster than an FE which can probably get a higher OC, and no mention of the ****ty OC? Lost all respect for that dude just now.


----------



## Cial00

Quote:


> Originally Posted by *stanielz*
> 
> Yeah I used EKs tim. I have 480 rad and 360 rad with just pull. Is your cpu also in your loop? What kind of pump are you running? Feels like I did something wrong and TIM shouldn't make that big of a dif considering my rad space.


CPU is in the loop, a 2600K @ 4.8 GHz. Using a Swiftech MCP D5 variant running at 2400 RPM, about 46% of max speed. Could be other factors at play: what's your ambient temp? Good airflow in the case, etc.? Do you know your water temp? My loop is usually around 30C during a gaming session, so right in the 10-degree-delta range.
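The delta being compared here is just coolant temperature minus room ambient, which is why two loops with the same GPU temps can tell very different stories. A minimal sketch of the check (my own helper, using the numbers from this exchange):

```python
# Loop "delta T": coolant temp minus room ambient. A loop with enough
# radiator capacity typically holds this to around 10C under load.
def loop_delta_t(coolant_c: float, ambient_c: float) -> float:
    return coolant_c - ambient_c

print(loop_delta_t(30.0, 20.0))  # ~10C delta: healthy
# GPU block temps track water temp, so a hotter room raises absolute GPU
# temps even when the loop itself is performing fine.
```

In other words, before blaming the TIM or the blocks, it's worth measuring water temp against ambient.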


----------



## Setzer

Quote:


> Originally Posted by *stanielz*
> 
> yo ***, Awesomesauce Network complains about LEDs, claims it faster than an FE which can prob get a higher OC. no mention of the ****ty oc? lost all respect for that dude just now.


Not to mention there were only 3 games benchmarked. A bit sloppy IMO


----------



## mercs213

Does anyone know what metal the MSI Gaming X is on the heatsink touching the die? http://www.overclock3d.net/gfx/articles/2016/06/08132546279l.JPG

http://www.overclock3d.net/reviews/gpu_displays/msi_gtx1080_gaming_x_review/2

I preordered a 1070 Gaming X from Amazon and would like to apply Coollaboratory Liquid Ultra to it, but I'm unsure if the plate is aluminium or nickel-plated. Assuming the 1080 variant is the same.


----------



## stanielz

Quote:


> Originally Posted by *Cial00*
> 
> CPU is in loop, 2600k @ 4.8ghz. Using Swifttech MCP D5 variant running at 2400rpm, about 46% max speed. Could be other factors at play: what's your ambient temp? Good airflow in case etc? Do you know what the temp of your water is? My loop is usually around 30c during a gaming session, so right in the 10 degree delta range.


Yeah, I have pretty good airflow. I have a 4790K @ 4.8 GHz too. My ambient is **** because I live in England, they don't believe in AC, and it's summer. Mm, I don't know what my water temp is; I guess I could measure the res temp.

edit: just measured my res temp at 31.5C. I let the cards get as hot as I could, and the water temp went up to 33.5C.


----------



## Shaded War

Quote:


> Originally Posted by *mercs213*
> 
> Does anyone know what metal the MSI Gaming X is on the heatsink touching the die? http://www.overclock3d.net/gfx/articles/2016/06/08132546279l.JPG
> 
> http://www.overclock3d.net/reviews/gpu_displays/msi_gtx1080_gaming_x_review/2
> 
> I preorded a 1070 Gaming X from Amazon and would like to apply Coollaboratory Liquid Ultra to it, but I'm unsure if it's aluminium or nickel plated. Assuming the 1080 variant is the same.


That picture looks like nickel plating.


----------



## mercs213

Thanks, I thought the same. Also from their site (haha, should have looked there first!):
Quote:


> SOLID BASEPLATE
> To temper the heat generated by a powerful GPU, MSI GAMING series graphics cards use a solid nickel-plated copper base plate. The base plate catches the heat from the GPU and transfers it to the heat pipes for dissipation so you can keep cool.


https://www.msi.com/Graphics-card/GEFORCE-GTX-1070-GAMING-X-8G.html


----------



## VSG

Quote:


> Originally Posted by *Setzer*
> 
> That sucks!
> But turns out more and more EVGA cards are struggling to overclock well. SC and FTW hmmm


The SC is just an FE with a different cooler; it's subject to the same silicon lottery. The non-reference PCBs are where the AIBs add in just the GPU cores provided by Nvidia, and those are subject to binning. This is why I generally say go with the ACX/ACX SC, or go Classified/Hydro Copper etc., as far as EVGA goes. The FTW/Classified Reference Edition/Hybrid/Classified Kingpin Reference Editions are all a gamble, especially the reference editions, which feature overkill PCBs but cores that did not pass the advertised clocks for the normal editions.


----------



## axiumone

Quote:


> Originally Posted by *Setzer*
> 
> These two cards were only able to overclock an extra 60 and 75 MHz respectively - on top of the 1860 MHz boost clock that the FTW has as standard. That's 1920 / 1935 MHz respectively. Pretty bad for FTW cards if you ask me
> 
> 
> Spoiler: YouTube vids


That's really unfortunate to hear. I'm getting about 2030 out of my ACX SC cards.









I sincerely hope that a BIOS editor will be available soon. Otherwise I'm extremely tempted to go back to my Titans and pass these 1080s to friends who need them more.


----------



## Cial00

Quote:


> Originally Posted by *stanielz*
> 
> yeah i have pretty good airflow. i have a 4790k @ 4.8 ghz too. my ambient is **** because i live in england and they don't believe in ac and its summer. mm I dont know what my water temp is, i guess i could measure the res temp.
> 
> edit: just measured my res temp at 31.5 c, i let them get as hot as i know and water temp went up to 33.5 c


Weird. And your 1080s are running 50C at load?


----------



## CallsignVega

Quote:


> Originally Posted by *Plex*
> 
> Well, I lost the silicon lottery.
> 
> I can barely hit 1900 with my EVGA SC 1080.
> 
> Rough life.


Damn that's horrible. Wouldn't even be faster than my 980Ti.


----------



## stanielz

Quote:


> Originally Posted by *Cial00*
> 
> weird, and your 1080s are running 50c at load?


They're running in series, so at 99 percent load in a benchmark for like 30 mins they are 46C and 49C.

In real games both cards rarely ever get used that much; for example, in The Witcher 3 at 1440p both cards stay at mid-low 40s.


----------



## bastian

Quote:


> Originally Posted by *CallsignVega*
> 
> Damn that's horrible. Wouldn't even be faster than my 980Ti.


A 1080 would still be faster. Unless you have a magic 980 Ti that clocks well over 1500 on the core.


----------



## axiumone

Quote:


> Originally Posted by *bastian*
> 
> A 1080 would still be faster. Unless you have a magic 980 Ti that clocks well over 1500 on the core.


That's exactly why he posted...

Damn, even my Titans get higher than 1500 in SLI.


----------



## goodnightworld

Hey guys, my 1080's GPU temp is hovering at 86 degrees when I play Overwatch on ultra settings... and I am a noob at fan profile settings. Can any kind soul help me with fan profiles? I have downloaded MSI Afterburner; I need help with how to set it up accordingly.


----------



## Shaded War

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Let us all know how that card clocks, looks nice. I had a 980ti amp extreme that was a good card.
> 
> I was just looking at the 1080 Amp Extreme specs and there website says 10.8ghz for default memory clock. I wonder if they are binning mem chips or something.
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-extreme#spec


Will report back on overclocks and cooling once I get it. I got 3 day shipping so hopefully next Wednesday or Thursday.

Seems like I got in on the first batch of the AMP edition cards, as there is literally nothing out there about them. Zotac was always at the very bottom of the list when it came to video card purchases, but their 900-series and 1080 cards seem to have really upped their game. Something about those cheap orange fans on their past cards pushed me away from the brand entirely without ever trying them, so I'm hoping they've made a solid product now that I'm giving them a fair chance.


----------



## BelowAverageIQ

Quote:


> Originally Posted by *geggeg*
> 
> The SC is just an FE with different cooler, it's subjected to the same silicon lottery. The non-reference PCBs are where the AIBs add in just the GPU cores provided by Nvidia and are subject to binning. This is why I generally say go with ACX/ACX SC or go Classified/Hydrocopper etc as far as EVGA goes. The FTW/Classified Reference edition/Hybrid/Classified Kingpin Reference editions are all a gamble, especially the reference editions which feature overkill PCBs but cores that did not pass the advertised clocks for the normal editions.


Sorry Geggeg, I am a bit confused with your post.

I will be purchasing an EVGA card and was considering the Classified edition. Are you saying that it is not worth it? I thought the Classified edition used hand-picked or binned GPUs.

Currently SLI 980's. Will probably go single card this time around, so may wait for the Ti's to be released.

Cheers


----------



## JedixJarf

Here's my new 1080 G1 with my kraken G10 and H75 slapped onto it


----------



## ChevChelios

yeah Im also very interested in how Zotac 1080 AMP performs


----------



## JedixJarf

Quote:


> Originally Posted by *ChevChelios*
> 
> yeah Im also very interested in how Zotac 1080 AMP performs


Probably exactly the same as every other 1080 on the market.


----------



## ChevChelios

Quote:


> Originally Posted by *JedixJarf*
> 
> Probably exactly the same as every other 1080 on the market.


... cooler thermals/noise, coil whine ...


----------



## gerbil80

Quote:


> Originally Posted by *JedixJarf*
> 
> Here's my new 1080 G1 with my kraken G10 and H75 slapped onto it


Very nice. Would be interested to hear how well that performs!


----------



## CallsignVega

Looks like BLT has the Gigabyte Xtreme regular and Premium Pack up for pre-order. I got two of the Premium Packs. I think it has the best cooler of them all, and it has up to 3x HDMI 2.0 ports for those of us with multiple VR setups.









http://www.shopblt.com/search/order_id=%2521ORDERID%2521&s_max=25&t_all=1&s_all=gtx+1080


----------



## Ragnarook

My first message here, so hello guys! Thought I would start off with posting my scores in 3DMark and Valley.





[email protected] - Evga Gtx1080 FE [email protected] Watercooled cpu/gpu´s with EK water blocks.


----------



## bonami2

Quote:


> Originally Posted by *CallsignVega*
> 
> Why would they do that? There are three SLI spacing configs on motherboards, hence three SLI bridges. Your MB needs a 3-slot spacer.


Ok, yeah, I think it's the one EVGA calls the long version.

I mean, the HB bridges all seem to be only 2-slot?


----------



## Benjiw

Can someone point me to some information? Why do Nvidia GPUs use only one CPU core? I remember reading it somewhere, but I can't find it now.


----------



## VSG

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Sorry Geggeg, I am a bit confused with your post.
> 
> I will be purchasing an EVGA card and was considering the Classified edition. Are you saying that it is not worth it? I thought the classified edition was hand picked or binned GPU's.
> 
> Currently SLI 980's. Will probably go single card this time around, so may wait for the Ti's to be released.
> 
> Cheers


Go for it, provided it is not ridiculously priced.


----------



## drop24

Quote:


> Originally Posted by *wsarahan*
> 
> Guys i need your help
> 
> but in Brazil we use cm and not mm to measure things


This really cracked me up.


----------



## Bogga

Quote:


> Originally Posted by *Ragnarook*
> 
> My first message here, so hello guys! Thought I would start off with posting my scores in 3DMark and Valley.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [email protected] - Evga Gtx1080 FE [email protected] Watercooled cpu/gpu´s with EK water blocks.


Hey, stop stalking me


----------



## gagac1971




----------



## goodnightworld

anyone can help me with fan profile settings?


----------



## bonami2

Quote:


> Originally Posted by *Benjiw*
> 
> Can someone point me to some information? Why nvidia gpus use one core to run? I remember reading it somewhere but I can't find it now.


All GPUs are driven from one core... until DirectX 12, if I understand the tech correctly.


----------



## Naennon

Quote:


> Originally Posted by *ChevChelios*
> 
> yeah Im also very interested in how Zotac 1080 AMP performs


like this (watercooled)


----------



## Ragnarook

Quote:


> Originally Posted by *Bogga*
> 
> Hey, stop stalking me


haha /haunt!


----------



## danamaniac

Quote:


> Originally Posted by *Ragnarook*
> 
> My first message here, so hello guys! Thought I would start off with posting my scores in 3DMark and Valley.
> 
> 
> 
> 
> 
> [email protected] - Evga Gtx1080 FE [email protected] Watercooled cpu/gpu´s with EK water blocks.


Hey man, I have 2 FE 1080s as well, and now I'm saving for waterblocks. Here's my rig. Can you share some pics of your watercooled 1080s? I want to see what they look like in a build with blocks on; I'm trying to imagine my PC with EK waterblocks on my 1080s.


----------



## Asus11

Just got mine today, guys.

It boosts to 1898 on its own; is this good or average?


----------



## TK421

Quote:


> Originally Posted by *Asus11*
> 
> just got mine today guys
> 
> it boosts to 1898 on its own is this good or average?


try oc a bit more, it can do higher


----------



## Asus11

Quote:


> Originally Posted by *TK421*
> 
> try oc a bit more, it can do higher


Yeh, I haven't OC'ed it yet.

Is there any point in increasing the voltage?

With my 1070 it didn't care about voltage; with or without, it was the same.

Is this the same for the 1080? Just slide the power limit to max?


----------



## ChevChelios

try first without touching the voltage slider

then with to see if it allows a higher core clock

it may not make a difference


----------



## Asus11

Quote:


> Originally Posted by *ChevChelios*
> 
> try first without touching the voltage slider
> 
> then with to see if it allows a higher core clock
> 
> it may not make a difference


Yeh, because I noticed with the 1070 that if I put the voltage to max, the power limit was easily hit.


----------



## THEROTHERHAMKID

Anyone got some easy instructions for overclocking the 1080?
Can I use the same method as older cards?
MSI Afterburner, Heaven, and monitoring temps with GPU-Z and RealTemp?
I usually leave voltage as is. Or is it worth upping the voltage?


----------



## Asus11

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Anyone got some easy instructions for overclocking the 1080?
> Can I use same as older cards?
> Msi afterburner,heaven and monitoring temps with gpuz and real temp?
> I usually leave voltage as is? Or is it worth upping voltage?


Download EVGA Precision XOC; it's made just for the 1070/1080.

Also grab the latest MSI Afterburner.

The newest HWiNFO shows these GPUs; old versions don't.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *Asus11*
> 
> download evga OC precision its just for the 1070/1080
> 
> also the latest msi afterburner too
> 
> newest HWinfo shows these gpus old ones dont


Thanks for the help cheers


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *Asus11*
> 
> download evga OC precision its just for the 1070/1080
> 
> also the latest msi afterburner too
> 
> newest HWinfo shows these gpus old ones dont


Its a g1 1080
Will evga OC still work?


----------



## Asus11

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Its a g1 1080
> Will evga OC still work?


yup any gpu will work


----------



## THEROTHERHAMKID

Nice, thanks!


----------



## Spikeyjohnson

Just finished water cooling my Nvidia GTX 1080 (Founders Edition). I'm up to 2025 MHz on the core and 10800 MHz on the VRAM, and that's where I'll leave it.







I had it at 2075 or so, but it stuttered a little every once in a while, so I dropped it down. I have seen it get to 49 degrees max, and it shares the water loop with my 5930K at 4.99 GHz. I'm pretty happy with those temps and clocks, seeing as before the EK block it sat at 80 degrees constantly while sticking to about 1750 MHz.
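For reference, that "10800 MHz" VRAM figure is the effective transfer rate: GDDR5X moves eight bits per pin per clock, so the clock GPU-Z shows is one eighth of it (stock GTX 1080 is 1251 MHz, i.e. the advertised 10008 MHz / "10 Gbps"). A trivial converter:

```python
# GDDR5X transfers 8 bits per pin per clock, so the "effective"
# memory speed quoted in the thread is 8x the clock GPU-Z reports.
def gddr5x_effective_mhz(gpuz_clock_mhz):
    return gpuz_clock_mhz * 8

print(gddr5x_effective_mhz(1251))  # stock GTX 1080 -> 10008
print(gddr5x_effective_mhz(1350))  # the "10800 MHz" OC above
```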


----------



## Asus11

Quote:


> Originally Posted by *Spikeyjohnson*
> 
> Just finished water cooling my Nvidia GTX 1080 (Founders Edition). I'm up to 2025 mhz on the core and 10800 mhz on the Vram and that's where I'll leave it
> 
> 
> 
> 
> 
> 
> 
> I had it at 2075 or so but it stuttered a little bit every once in a while so I dropped it down. I have seen it get to 49 degrees max and it shares the water loop with my 5930K which is at 4.99 GHZ. I'm pretty happy with those temps and clocks seeing as before the EK block, it was at 80 degrees constantly while sticking to about 1750 mhz.


mine is watercooled too!

I'm sure there's a lot more life left in that OC, my friend

what is your watercooling setup like? rad/pump etc


----------



## Spikeyjohnson

Quote:


> Originally Posted by *Asus11*
> 
> also mine it watercooled!
> 
> im sure theres alot more life left in that OC my friend
> 
> what is your watercooling setup like? rad/pump etc


I'm running my system in this order: Pump/Res (EK XRES 140 D5) >> GPU (EK block and backplate) >> 360 Rad (EK CoolStream PE) >> CPU (EK Supremacy X99) >> 240 Rad (EK CoolStream PE) >> ... Each of the radiators has 2000 RPM Noctua NF-F12 iPPC fans.

I'll probably tweak it quite a bit more, but from just about half an hour's worth, it is doing really well.

It was super impressive to go from ~80C on the reference cooler to topping out around ~49C with the new block!


----------



## Asus11

Quote:


> Originally Posted by *Spikeyjohnson*
> 
> I'm running my system in this order Pump/Res (EK Xres 140 D5) >> GPU (EK block and back) >> 360 Rad (EK Coolstream PE) >> CPU (EK Supremacy x99) >> 240 Rad (EK Coolstream PE) >> ... Each of the radiators have 2000 RPM Noctua NFF-12 IPPC fans
> 
> I'll probably tweak with it quite a bit more but just from about half an hours worth, it is doing really well.
> 
> It was super impressive to go from ~80C on the reference to toping out around ~49C with the new block!


wow, I'm running a 240mm off GPU/CPU and only hitting 51C max on the GPU with 26C ambient; your CPU must get toasty


----------



## Spikeyjohnson

Quote:


> Originally Posted by *Asus11*
> 
> wow im running a 240mm off GPU/CPU and only hittin 51c max on GPU with 26c ambient, your CPU must get toasty


It definitely does. I don't know what you're running, but I was surprised by how hot this 5930K gets compared to my old 4770K. The 4770K idled around 30. This 5930K averages around 36 and hits up to 65 C when working hard.


----------



## Asus11

Quote:


> Originally Posted by *Spikeyjohnson*
> 
> It definitely does. I don't know what your running but I was surprised by how hot this 5930K gets compared to my old 4770k. The 4770 was idling around 30. This 5930K averages around 36-36 and hits up to 65 C when working hard.


i7 6700k, my case is an open air one so that does help I guess


----------



## BelowAverageIQ

Quote:


> Originally Posted by *geggeg*
> 
> Go for it, provided it is not ridiculously priced.


Thank you. Will be a while yet (obviously, as they haven't been released yet).

Are you saying that, given the actual design and production of the GP104, basically ALL cards will be the same? Less of a lottery than previous GPUs?

On a separate note, I LOVE Houston. Only got to visit for a few weeks, but fell in love with the place.

Wish I could pack up and come over to live there.

Cheers

Rob


----------



## bfedorov11

Quote:


> Originally Posted by *Asus11*
> 
> wow im running a 240mm off GPU/CPU and only hittin 51c max on GPU with 26c ambient, your CPU must get toasty


What kind of load? Fans?

I'm about to fire up my 6700k and 1080 under water. I went with 2x 240mm but thought about going with a single 240mm. Guess I won't have to worry about noise.


----------



## VSG

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Thank you. Will be a while yet (obviously, as they haven't been released yet).
> 
> Are you saying that given the actual design and production of the GP104, that basically ALL cards will be the same? Less of a lottery than previous GPU's?
> 
> On a separate note, I LOVE Houston. Only got to visit for a few weeks, but fell in love with the place.
> 
> Wish I could pack up and come over to live there.
> 
> Cheers
> 
> Rob


At this point, yes. Unless some breakthrough happens, it looks like an even more locked-down Maxwell, where most GPUs were able to hit the same core/memory clocks.

Hopefully you get to come back to Houston sooner than later


----------



## goodnightworld

guys, I am playing at ultra settings with a 1080 and a 5960X, and the GPU temp is 82-86 maximum... is that normal? I am using a Zotac 1080 FE...


----------



## grimboso

Quote:


> Originally Posted by *JaBR23KiX*
> 
> Same problem here. Around minute on 2106 mhz and after goes down to 2088mhz. Temps 53. Fan manualy to 80 procent. Asus strix gaming oc


What is your memory OC? I think it might be error correction kicking in
Quote:


> Originally Posted by *bfedorov11*
> 
> What kind of load? Fans?
> 
> I'm about to fire up my 6700k and 1080 under water. I went with 2x 240mm but thought about going with a single 240mm. Guess I won't have to worry about noise.


Less noise is best noise









Running an i7-6700K on 2x240 with fans at 600 RPM, and soon I'll be running two FTWs on two 360s. Highly overkill, but I like a silent loop. The water temp on the CPU loop after 1 hour of P95 small was 27 (24 ambient) and the highest recorded CPU temp at 4.7/1.34 was 47.

On topic: it seems that with the standard BIOS, water is doing little for overclocking; is this what you guys are seeing at the moment as well?


----------



## Cial00

Quote:


> Originally Posted by *grimboso*
> 
> What is your memory OC? I think it might be error correction kicking in
> Less noise is best noise
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running an i7-6700k on 2x240 with fans running at 600 rpm and soon I'll be running two FTWs on two 360. Highly overkill, but I like a silent loop. The watertemp on the cpu loop after 1 hour of P95 small was 27 (24 ambient) and highest recorded cpu Temp at 4.7/1.34 was 47.
> 
> Ontopic: seems that with standard bios water is doing little for overclocking, is this what you guys see at the moment as well?


Yes, no crazy overclocks on the stock BIOS, but water keeps it more consistent due to the low temps. The good news is there is a lot of headroom when these cards are only hitting 40-ish under load on water.


----------



## Asus11

Quote:


> Originally Posted by *bfedorov11*
> 
> What kind of load? Fans?
> 
> I'm about to fire up my 6700k and 1080 under water. I went with 2x 240mm but thought about going with a single 240mm. Guess I won't have to worry about noise.


I'm using some Swiftech fans that came off a Swiftech H220-X; the rig is silent even when gaming. My CPU is also delidded and overclocked to 4.8.

it's not watercooled by the Swiftech H220-X though, EK D5


----------



## Radox-0

Quote:


> Originally Posted by *goodnightworld*
> 
> guys i am playing ultra settings with 1080, cpu 5960X and with a temp of 82-86 maximum... is it normal? I am using zotac 1080 fe...


Those temps are normal with the stock FE fan profile; the target temp is 83 degrees, so by default the card will settle a few degrees around that figure and manage the boost accordingly. If you're worried, or want to sustain a higher boost, manually set up a fan profile. IMO these cards are slightly quieter than prior reference blowers, so a higher fan % is more tolerable.
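A custom fan profile is just a piecewise-linear temperature-to-fan-% map, like the curve you draw in Afterburner. A sketch with made-up breakpoints (not Nvidia's defaults):

```python
def fan_percent(temp_c, curve=((30, 30), (60, 50), (83, 80))):
    """Piecewise-linear fan curve. `curve` is a sequence of
    (temp C, fan %) breakpoints; values below the first point
    and above the last are clamped. Breakpoints here are
    illustrative examples only."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_percent(45))  # -> 40.0, halfway between 30C/30% and 60C/50%
```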


----------



## grimboso

Quote:


> Originally Posted by *Cial00*
> 
> Yes, no crazy overclocks on stock bios but water keeps it more consistent due to low temps. Good news is there is a lot of headroom when these cards are only hitting 40ish under load on water.


Hopefully a custom BIOS with a higher PT/voltage will then give us maybe 2300-2400?


----------



## Cial00

Quote:


> Originally Posted by *grimboso*
> 
> Hopefylly then with custom bios and a higher PT / volt Will give us maybe 23-2400?


Yeah based on what I'm reading from LN2 results I think we can expect water to top out around 2300, maybe 2400 with custom vbios.


----------



## Asus11

Quote:


> Originally Posted by *Asus11*
> 
> im using some swiftech fans that came off a swiftech h220x , the rig is silent even when gaming my CPU is also delidded and overclocked to 4.8
> also its an open air case so it is silent even more so I have no sound proofing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> its not watercooled by the swiftech h220x though, EK D5


----------



## GiveMeHope

I played around with overclocking on my Zotac 1080 Founders while running FurMark, and the clock reached 2126 at points.



How is this generally considered among all 1080s? Do I have a good/bad/average chip or what?

Thanks


----------



## zetoor85

anyone pushing 2.5ghz oc? or did nvidia lie again


----------



## Asus11

Quote:


> Originally Posted by *zetoor85*
> 
> anyone pushing 2.5ghz oc? or did nvidia lie again


when did Nvidia say 2.5?


----------



## MCFC

Quote:


> Originally Posted by *AllGamer*
> 
> Not quite true,
> yesterday I ran StarCitizen with only 1 GTX1080 and it was barely making 33 FPS in 2D Surround Mode,
> if 3D Surround is enabled, the FPS drops in half.
> 
> I placed an order for the 2nd GTX1080 to go SLI, and hopefully it can pull 60+ FPS on 2D and at least 30 FPS on 3D Surround Mode.


that shouldn't even be touched as a benchmark for videocards until it releases in 10-20 years


----------



## Nizzen

Quote:


> Originally Posted by *zetoor85*
> 
> anyone pushing 2.5ghz oc? or did nvidia lie again


Gigabyte Unleashes GeForce GTX 1080 Xtreme Gaming Water Cooling Graphics Card

Read more: http://wccftech.com/gigabyte-geforce-gtx-1080-xtreme-gaming/#ixzz4C2z4m1VO

Closer?


----------



## Shaded War

Quote:


> Originally Posted by *zetoor85*
> 
> anyone pushing 2.5ghz oc? or did nvidia lie again


I saw that an LN2 run hit 2.5GHz. I doubt we will ever see water or air cooling hit this. It seems Nvidia may have done something to cause high voltage to reset drivers.

LINK
Quote:


> However it is uncertain if NVIDIA approved this card design and its unlocked BIOS, so we don't know if retail cards will be able to achieve similar frequencies.


----------



## zGunBLADEz

So I have been playing with this card for a while and I'm getting the same result at 1.000 V vs 1.093 V... Now that I've managed to cap and feed a constant voltage to the card non-stop, no matter the load, all the way up to 1.093 V, I see no use for the extra voltage. That nets me 2088, completely stable.
Quote:


> Originally Posted by *Shaded War*
> 
> I saw that a LN2 run hit 2.5Ghz. I doubt we will ever see water or air cooled hit this. *It seems Nvidia may have done something to cause high voltage to reset drivers.*
> 
> LINK


Probably, as I'm playing with the curve in Afterburner and I can hold the card at 1.000 V the entire time, or any voltage I desire, even a constant 1.093 V.
She will follow it religiously unless she runs into the power limit..
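What's being described is Afterburner's curve-flattening trick: raise your chosen point to the target clock and clamp every point above it to the same clock, so boost never has a reason to request a higher voltage state. A sketch over hypothetical (mV, MHz) points:

```python
def lock_curve(points, lock_mv, lock_mhz):
    """Flatten an Afterburner-style V/F curve: every point at or
    above `lock_mv` gets clamped to `lock_mhz`, so the card sits
    at one voltage/frequency pair under load. `points` is a list
    of (millivolts, mhz) tuples; the values are made up."""
    return [(mv, lock_mhz if mv >= lock_mv else mhz)
            for mv, mhz in points]

curve = [(950, 1911), (1000, 1999), (1050, 2050), (1093, 2088)]
print(lock_curve(curve, 1000, 2088))
# -> [(950, 1911), (1000, 2088), (1050, 2088), (1093, 2088)]
```

As noted above, the card still drops below the locked point if it runs into the power limit; the curve only controls what boost asks for, not what the limiters allow.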


----------



## skline00

I was playing around with my Zotac GTX 1080 FE under water with an EK block. I was able to raise the core to 1973 (1607 + 366), which showed an exact boost to 2100. I didn't even change the power target; I suspect my temps are so low from the water cooling that it's stable. Ran Firestrike AND Heaven without problems.


----------



## BigBeard86

Quote:


> Originally Posted by *skline00*
> 
> I was playing around with my Zotac GTX 1080 FE under water with an EK block. I was able to raise the core to 1973 (1607 +366) which showed an exact boost to 2100. I didn't even change the power target, I suspect my temps are so low due to the water cooling that it is stable. Ran Firestrike AND Heaven without problems.


What was your max oc prior to watercooling?


----------



## MCFC

So now that the ftw edition is more like ftl, should I wait for the classified instead? (kingpin is probably going to come out in 6 months if not more so **** waiting for that)


----------



## skline00

Quote:


> Originally Posted by *BigBeard86*
> 
> What was your max oc prior to watercooling?


Sorry, but I never ran it with the air cooler. I only ran it with the water block.


----------



## Shaded War

Quote:


> Originally Posted by *MCFC*
> 
> So now that the ftw edition is more like ftl, should I wait for the classified instead? (kingpin is probably going to come out in 6 months if not more so **** waiting for that)


I haven't been reading of any OC advantages from any of the custom boards, whether it has a 10+2 phase, dual 8-pin power, or a custom PCB. So my decision was just to go with a GPU that has a decent cooler in the price range you want.

I really wanted the MSI Gaming X, but I'm not bothering with paying $720 when I was able to get the Zotac AMP for $640 shipped. Saved $80 and it will likely OC all the same.


----------



## BigBeard86

Hey guys; just finished installing the g10 on my 1080.

First thing: you cannot keep the oem backplate on the card, as the backplate must screw into the stock cooler portion. I don't know why people told me that I can.

Secondly... I am getting 32C at full load in the Heaven benchmark. Unbelievable! Idle temps are at 25C.


I also gained 50MHz on my max stable OC. Can't wait for BIOS mods for this card... there is so much headroom in terms of temps.


----------



## keem21

Quote:


> Originally Posted by *BigBeard86*
> 
> Hey guys; just finished installing the g10 on my 1080.
> 
> First thing: you cannot keep the oem backplate on the card, as the backplate must screw into the stock cooler portion. I don't know why people told me that I can.
> 
> Secondly...I am getting 32c full load, in heaven benchmark. Unbelievable! Idle temps at 25C.
> 
> 
> I also gained 50mhz for my max stable OC. Cant wait for bios mods for this card...there is so much head room, in terms of temps.


Nice temps! what cooler are you using with the G10?


----------



## BigBeard86

Quote:


> Originally Posted by *keem21*
> 
> Nice temps! what cooler are you using with the G10?


Thanks. I am using the NZXT Kraken X41. Very pleased with it... I've had it since it first came out and used it on my CrossFire 290 setup. I knew if it kept those cards cool, cooling a 1080 would be a piece of cake.


----------



## jeanjean15

Hi .

Has anybody tested the new HB SLI bridge?

Does it increase performance or not?

Thanks in advance .


----------



## Dante007

Anyone gone beyond 2113 core? I tested:
ASUS Strix
Founders Edition
Gigabyte G1
Gigabyte Xtreme
MSI Gaming X

all the same, and the max was 2113 on the Xtreme, with low memory at 1375
and the highest memory was the MSI Gaming X at 1430


----------



## Jpmboy

Quote:


> Originally Posted by *Dante007*
> 
> Any one go beyond 2113 core ? i test
> ASUS STRIX
> FOUNDER EDTITION
> Gigabyte G1
> Gigabyte Xtreme
> MSI Gaming X
> 
> all the same and max was 2113 on Xtreme with low memory 1375
> and highest memory was MSI Gaming X 1430




don't have a pic with the GPU-Z sensor tab open for FSE.

3DMark11 Extreme is much tougher on the GPU:



EVGA FE


----------



## S4ch4Z

Quote:


> Originally Posted by *Dante007*
> 
> Any one go beyond 2113 core ? i test
> ASUS STRIX
> FOUNDER EDTITION
> Gigabyte G1
> Gigabyte Xtreme
> MSI Gaming X
> 
> all the same and max was 2113 on Xtreme with low memory 1375
> and highest memory was MSI Gaming X 1430


My FE 1080 also kind of walled just a tad over 2.1GHz on the stock cooler, but it couldn't hold it for long anyway.
Watercooling seems to help reach a few more MHz, but it's not really a game changer.
It just helps with frequency stability, as long as your card isn't hitting too far over the power limit.

Got it to pass Heaven Benchmark at 2176 MHz flat playing with the custom curve tool in Precision X
(with stock MEM though)


----------



## THEROTHERHAMKID

What settings do i use for heaven to oc my 1080? 1080p or 4k?
Whats best? Evga or afterburner?


----------



## CallsignVega

Quote:


> Originally Posted by *jeanjean15*
> 
> Hi .
> 
> Is anybody did a test with the new HB SLI bridge ?
> 
> It increases performances or not ?
> 
> Thanks in advance .


I don't think any HB SLI bridges are out in public yet. I ordered mine as soon as they became available on the NVIDIA site on Friday, but it hasn't shipped yet.


----------



## axiumone

Aw, that's a bummer dude. Mine shipped on friday, got it this morning.

Had about 20 min to test it. Eyeballing it, I don't see any perceptible difference in frametimes. Games that had good sli support, felt as smooth as using two ribbon cables. Games that had bad sli support (the division), still had a slight stutter.

Edit - Forgot to mention, this is at 7200x2560 resolution, so it should be the exact intended use of HB bridges.


----------



## gulvastr

After OC'ing my MSI Gaming X over the weekend I was able to hit about 2075 on the core, but once gaming it settled around 2000. Memory tests stable at 11000+, but once gaming I have to lower it to 10900 to get rid of artifacts. This is with the stock cooler... still waiting on that EK block to be released; they have a part number for the block listed but you can't buy it yet.
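This is why the common advice is to pick the memory offset that scores best in a benchmark, not the highest one that survives: memory error correction/retry can silently eat performance before you ever see artifacts or crashes. A sketch with a stand-in score function:

```python
def best_mem_offset(score_of, offsets):
    """Pick the memory offset with the highest benchmark score.
    `score_of(offset)` is a stand-in for a real Heaven/Firestrike
    run at that offset; past the error-correction knee, higher
    offsets score *lower* without crashing."""
    return max(offsets, key=score_of)

# Fake score curve that peaks at +450 and then decays:
print(best_mem_offset(lambda o: -(o - 450) ** 2, range(0, 701, 50)))
# -> 450
```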


----------



## CallsignVega

Got my first FTW in. Time to test if it's actually FTW or FTL!


----------



## dagget3450

Quote:


> Originally Posted by *CallsignVega*
> 
> Got my first FTW in. Time to test if it's actually FTW or FTL!


FOR THE WIN OR FASTER THAN LIGHT!!


----------



## NASzi

Got minz in yesterday

http://s384.photobucket.com/user/no...06326232713902_7470269693548481165_o.jpg.html


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What settings do i use for heaven to oc my 1080? 1080p or 4k?
> Whats best? Evga or afterburner?


Anyone help please?


----------



## gamingarena

Quote:


> Originally Posted by *CallsignVega*
> 
> I don't think any HB SLI bridges are out in the public yet. I ordered mine as soon as they became available on NVIDIA site on Friday, but hasn't shipped yet.


Actually, people have started receiving them on the GeForce/EVGA forums, and as we all expected it's all Nvidia hype, nothing else: zero performance increase, even compared to a single ribbon cable. Zero increase!


----------



## Twau

My old R9 390 broke a few months ago; the store gave me 400€ in credit since it was out of stock by that time. I ordered an EVGA GeForce GTX 1080 FTW today for 390€.
Feels good to only pay 390€ for a new card this year that is way faster than my old one. Now I am just waiting for it to come back in stock!


----------



## PasK1234Xw

Quote:


> Originally Posted by *gamingarena*
> 
> Actually people start receiving them on geforce/EVGA forums and as we all expected its all Nvidia BS Hype nothing else zero performance increase even compared to single ribbon cable Zero increase!


That's the same one person posting on the EVGA and nvidia forums; I wouldn't be so quick to judge. If there are no scaling issues, I wouldn't expect to see improvements.

There are times usage drops in particular games that could be due to bandwidth limits, and this could help, IDK. Also, SLI with the 1080 in Doom is a joke but seems fine with older-gen cards; supposedly it's a bandwidth issue.

I'm not holding my breath TBH, but still hoping. I'll judge when I get it installed on my own system. **** people still say SLI micro-stutters and I never have this problem, so I would take what people say with a grain of salt when it comes to SLI.

edit:
one vs 2 single SLI bridges... yes, it helps scaling


----------



## BehindTimes

Posted on [H]ardForum already: for the most part I don't notice much difference in average framerate, but I am noticing a higher max framerate @ 4K. Too many variables could be involved, though, which could account for that. Maybe there might be a difference in VR or DX12, but I don't have anything to test those two with.


----------



## axiumone

Also, few posts up. My HB bridge is here as well.


----------



## stanielz

Quote:


> Originally Posted by *PasK1234Xw*
> 
> That same one person posted on EVGA and nvidia i wouldn't be so quick to judge. If there are no scaling issues i wouldn't expect to see improvements.
> 
> There are times usage drops in particular games that could be due to bandwidth limits and this could help IDK Also SLI with 1080 in doom in a joke but seems fine with older gen cards supposedly its bandwidth issue.
> 
> Im not holding breath though TBH but still hope. Ill judge when i get it installed on my own system. **** people still say SLI micro stutter and i never have this problem so i would take what people say with grain of salt when it comes to SLI
> 
> edit
> one vs 2 single sli bridge...yes it help scaling


according to gamer nexus:




2x floppy bridges don't equal an LED or HB bridge. As far as micro stutter, it used to be an issue when I didn't have G-Sync and had SLI'd 770s, but now with G-Sync and SLI'd 1080s it's pretty much eliminated, at least for me.
Quote:


> Originally Posted by *axiumone*
> 
> Also, few posts up. My HB bridge is here as well.


so GPU usage didn't change for you in any game?


----------



## kx11

The Division benchmark @ 4k , graphics settings @ start of the video





using 368.39 driver


----------



## JedixJarf

Quote:


> Originally Posted by *BigBeard86*
> 
> Hey guys; just finished installing the g10 on my 1080.
> 
> First thing: you cannot keep the oem backplate on the card, as the backplate must screw into the stock cooler portion. I don't know why people told me that I can.
> 
> Secondly...I am getting 32c full load, in heaven benchmark. Unbelievable! Idle temps at 25C.
> 
> 
> I also gained 50mhz for my max stable OC. Cant wait for bios mods for this card...there is so much head room, in terms of temps.


Probably depends on the backplate, fits perfectly fine on my g1 1080


----------



## PasK1234Xw

Quote:


> Originally Posted by *stanielz*
> 
> 2x floppy bridges dont equal an LED or HB bridge. as far as micro stutter, it used to be an issue when i didnt have g sync and had sli'd 770s, but now with gsync sli'd 1080s its pretty much eliminated. at least for me.


I never said it was equal to anything, but it improved his usage, so it's pretty obvious HB isn't a gimmick. Pascal requires both sides to be populated for high bandwidth; previous tech only used a single channel and PCIe did the rest.

Also, G-Sync fixing your stutter proves your stutter wasn't SLI related. It was vsync stutter from frames not matching your static refresh rate.

edit


----------



## stanielz

Quote:


> Originally Posted by *PasK1234Xw*
> 
> I never said it was equal to anything but it improved his usage so its pretty obvious HB isn't a gimmick. Pascal requires both sides to be populated for high bandwidth. Pref tech only used single channel and PCIE did the rest.
> 
> Also gsync fixing your stutter prove your stutter wasn't SLI related. It was vsync stutter from not keeping frames matched with your static refresh rate.
> 
> Like i said take ^peoples^ comments when it comes to SLI with grain of salt.


yeah, I never said you said or didn't say anything, I was only contributing to the discussion. I didn't say G-Sync fixed micro stutter, I just said that now with G-Sync and 1080s I don't notice any micro stutter like I did before. At least for me — yes, take it with a grain of salt, because it's my opinion based on my build and my experiences, duh.


----------



## PasK1234Xw

Gotcha, sorry for that. It's hard to tell sometimes where people are going with a post.
Anytime I mention SLI and no stutter, everyone jumps down my throat.


----------



## axiumone

Quote:


> Originally Posted by *stanielz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> according to gamer nexus:
> 
> 
> 
> 
> 2x floppy bridges dont equal an LED or HB bridge. as far as micro stutter, it used to be an issue when i didnt have g sync and had sli'd 770s, but now with gsync sli'd 1080s its pretty much eliminated. at least for me.
> so gpu usage didnt change for you in any game?


Not as far as I can tell yet.

Also, the nvidia bridge does have an LED after all. I like this strip better than the logo.


Spoiler: Warning: Spoiler!







GPU usage seems the same in some quick A/B testing of Witcher 3. The new HB bridge and 2x ribbons produce nearly identical GPU usage. Will try to post results from more games if I have the time.

Edit 2 - Tested Shadow of Mordor, which nvidia was showing off the new HB bridge with. Results are exactly the same as the Witcher above: same exact GPU usage. Both felt very smooth. So if there is any difference, it could only be in frame time variance, but unfortunately I don't have the equipment to test that.


Spoiler: Warning: Spoiler!


----------



## orlfman

my strix 1080 just came today.



put me on the list!


----------



## immortalkings

Got my Zotac AMP! Edition yesterday... got a problem here, hope you guys can help me out. I can't install the latest driver from nvidia; it says "this nvidia graphics driver is not compatible with this version of windows", but I downloaded the correct driver. I can't update my Windows 10 Enterprise to version 1511, so do I need a fresh Windows 10 install? Downloading the installer now... I hope it works later after work. Are there any suggestions besides formatting my whole PC?


----------



## skline00

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Anyone help please?


I've used both, and actually use the Firestorm software that came with my Zotac GTX 1080 FE. It allows me to raise my power target to 120%. The silicon in each of these cards is probably close. Note: my card is watercooled with an EK block, so temps aren't the problem.

What card do you have?


----------



## skline00

Quote:


> Originally Posted by *immortalkings*
> 
> 
> 
> Got my Zotac AMP! Edition yesterday... got a problem here.. hope you guys could help me out.. i can't install the latest driver from nvidia.. it says "this nvidia graphics driver is not compatible with this version of windows" but i downloaded the correct driver.. can't update my windows 10 enterprise to version 1511.. so i need a new format windows 10? downloading the installer now.. i hope it would work later after work.. is there any suggestion besides on formating my whole PC


Do a system restore.


----------



## skline00

Quote:


> Originally Posted by *orlfman*
> 
> my strix 1080 just came today.
> 
> 
> 
> put me on the list!


Congrats and welcome aboard. That looks like a beautiful card. What are the stock core and memory settings?


----------



## orlfman

Quote:


> Originally Posted by *skline00*
> 
> Congrats and welcome aboard. That looks like a beautiful card. What are the stock core and memory settings?


thanks! I was going to wait for the EVGA FTW since it would match my new Sabertooth S better, but with no ETA, and seeing it in stock on Newegg, I decided to snag it. It looks really cool. For its size it's pretty light, but extremely sturdy.

stock speeds are: 1759 core, 1898 boost and 10000 memory.


----------



## skline00

Quote:


> Originally Posted by *orlfman*
> 
> thanks! was going to wait for the evga ftw since it would match my new sabertooth s better, but with no eta, and saw it in stock on newegg i decided to snag it up.
> 
> stock speeds are: 1759 core 1898 boost and 10000 memory.


WOW that's impressive.

You are going to love this card.


----------



## CallsignVega

Update on my FTW: it's pretty good. Very stable at 1.08v, the clock stays locked on. None of the spiky boost frequencies that you see with the FE.

It does 2088 MHz core and 11016 MHz memory. At those specs, the card settles down at 60C at 50% fan. The fan is a bit louder than I was expecting, but quieter than the FE.

I'd like to see what it could do at 1.25v. Need a new BIOS!


----------



## Asus11

anyone done the shunt power mod? I'm thinking of doing it

I'm also thinking of using CLU on the die.. anything to be worried about if I make sure the surroundings are insulated?


----------



## VSG

Quote:


> Originally Posted by *Asus11*
> 
> anyone done the shunt power mod? im thinking of doing it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also im thinking of also using CLU on the die.. anything to be worried about if I make sure the surrounding is insulated


Won't do anything if the voltage limit doesn't ask for more power than what a single 8 pin is capable of providing. Power and temps aren't the issue, voltage is. GPU Boost 3.0 combining voltage along with those two is killing overclocking potential.
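GPU Boost behaving the way described above can be modeled as a min over the active limiter caps: the card runs the highest clock that no limiter (power, thermal, voltage) objects to. The cap values here are illustrative only, not from any real BIOS:

```python
def boost_clock(requested_mhz, caps):
    """GPU Boost picks the highest clock no limiter objects to,
    i.e. the minimum of the requested clock and every active cap.
    `caps` maps limiter name -> max clock it allows (made-up
    numbers for illustration)."""
    return min(requested_mhz, *caps.values())

caps = {"power": 2114, "thermal": 2126, "voltage": 2088}
print(boost_clock(2200, caps))  # -> 2088: voltage-limited
```

This is why the shunt mod (which only raises the power cap) does nothing when the voltage cap is the binding one.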


----------



## pez

Quote:


> Originally Posted by *stanielz*
> 
> yeah, i never said u said or didnt say anything, i was only contributing to the discussion. i didnt say gsync fixed micro stutter i just said that now with gsync and 1080s i dont notice any micro stutter like i did before, at least for me, yes, take with a grain of sailt, because its my opinion which is based on my build and my experiences, duh.


Quote:


> Originally Posted by *PasK1234Xw*
> 
> Gotcha sorry for that. Hard to tell sometimes where people are going with post.
> Anytime i mention SLI and no stutter everyone jumps down my throat.


At this point when I see people complain about micro stutters with SLI (and maybe even Crossfire) I just assume they are doing something wrong or have no idea what they are talking about. Over the years I've SLI/Crossfire'd 8800/9800GTs, HD4870/4890s, and more recently GTX 970s and did not have any issues whatsoever.

We are just 3 people with anecdotal evidence, but I mean, 5 million could never be wrong, right?


----------



## Shad0w59

I got my GTX 1080, here's an unboxing:


----------



## BigBeard86

Quote:


> Originally Posted by *JedixJarf*
> 
> Probably depends on the backplate, fits perfectly fine on my g1 1080


Cool. The backplate on the 1080 FE is plastic trash anyway.

However, there were 3 square thermal pads on the backplate, cooling some little black components. Did you notice them on your card too?


----------



## Asus11

Quote:


> Originally Posted by *geggeg*
> 
> Won't do anything if the voltage limit doesn't ask for more power than what a single 8 pin is capable of providing. Power and temps aren't the issue, voltage is. GPU Boost 3.0 combining voltage along with those two is killing overclocking potential.


I have seen some people claim 50-100MHz more overclock when doing it

which isn't bad, and could mean hitting 2200


----------



## Bloodymight

Got my Palit 1080 Super JetStream

Stock boost clock is around 1938MHz, 72°C, at up to 50% fan speed


The best way to test a GPU OC is to run Unreal Tournament; somehow that game destroyed the "stable" OC I had, lol
After OC, the boost clock lingers at 2038MHz, 75°C, at up to 52% fan speed



Couldn't go any higher than +80MHz Offset, Unreal tournament is the only game that crashed at +85MHz Offset, could run FireStrike, Heavens, Witcher 3, etc. for hours with +95MHz offset

Fan gets noticable at >47% but it's still VERY quiet, I'm pretty impressed by it.

Palit JetStream/GameRock and Gainward Phoenix use the same aftermarket cooler, they are all very quiet/considered the quietest currently available.

here are some pics


----------



## bfedorov11

Quote:


> Originally Posted by *Asus11*
> 
> anyone done the shunt power mod? im thinking of doing it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also im thinking of also using CLU on the die.. anything to be worried about if I make sure the surrounding is insulated


I literally just fired my rig back up under water now. Fans were rubbing on the aluminum.. took two days to fix.









I reapplied CLU on the shunt resistors while installing my EK block and it still doesn't seem to help. Max temp is 36°C in FS. Core speed still fluctuates, mostly in scene 1. Power limit still pings everywhere.



The BIOS/PT is really holding these cards back. Anyone know what's up with BIOS tools? Seems pretty strange it's taking this long.


----------



## Shadowdane

Quote:


> Originally Posted by *bfedorov11*
> 
> Bios/PT is really holding these cards back. Anyone know whats up with bios tools? Seems pretty strange it's taking this long.


Considering GPU-Z just added support to save the BIOS on the 1070/1080.. I bet we see tools updated for the Pascal BIOS soon!

http://www.techpowerup.com/downloads/2718/techpowerup-gpu-z-v0-8-9

Also, with the GPU Boost 3.0 stuff I bet a lot changed in the BIOS; it will likely take time to figure out what new features are in these BIOS dumps.


----------



## keem21

2050MHz/5350MHz overclock
GTX 1080 FE
stock air cooling, stable at 80°C



This card badly needs more voltage to OC higher.


----------



## dmasteR

Anyone know why a VRM is missing on the GTX 1080 FE?


----------



## bfedorov11

Quote:


> Originally Posted by *dmasteR*
> 
> Anyone know why a VRM is missing on the GTX 1080 FE?
> 
> 
> Spoiler: Warning: Spoiler!


Didn't need it. They'll probably reuse the PCB for the next cards.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *skline00*
> 
> I've used both and actually use the Firestorm software that came with my ZOTAC GTX1080 FE. It allows me to raise my power target to 120%. The silicon in each of these cards is probably close. Note: my card is watercooled with an EK block so temps aren't the problem.
> 
> What card do you have?


Thanks for the reply it's a g1 1080


----------



## ChevChelios

I feel like even if they mod the BIOS and increase the voltage to push 2200-2300+, it's going to require a waterblock, since with the increased voltage & core current the AIB fans will either get too loud or the card will get too hot if the fans stay the same

so it's for enthusiasts/water-cooling only


----------



## stanielz

anyone else notice with a 1080 and VR headset that things look much clearer/less pixelated? i'm coming from a Titan X, wonder what gives; i thought the new tech still had to be implemented into future games?


----------



## max883

Got my MSI GTX 1080 GAMING X!

volt: auto
power: 120%
core clock: +200 = 2.2GHz
mem: +500 = 11000
Fan: auto

3dmark firestrike: GPU score: 25.478P


----------



## MestasDeejay

got an EVGA GTX 1080 SC.

can't overclock it above:

volt: auto
power: 120%
core clock: +120
mem: +275
Fan: custom curve

is that normal? i saw a lot of people going +200 core / +500 memory....


----------



## Bloodymight

Quote:


> Originally Posted by *Shadowdane*
> 
> Considering Gpu-z just added support to save the BIOS on the 1070/1080.. I bet we see tools updated for the Pascal BIOS soon!
> 
> http://www.techpowerup.com/downloads/2718/techpowerup-gpu-z-v0-8-9
> 
> Also with the GPU Boost 3.0 stuff I bet a lot of stuff changed in the BIOS, it will likely take time to figure out what's new features are in these BIOS dumps.


I'm pretty new when it comes to graphics BIOSes and I just noticed an option in ThunderMaster which saves the BIOS as a .rom file; do you guys need that?


----------



## Associated

Quote:


> Originally Posted by *Shad0w59*
> 
> I got my GTX 1080, here's an unboxing:


Now let's see what it can do!


----------



## ChevChelios

they told me I should get my G1 1080 next week

praise the sun !

now I gotta snag a good deal on the XB271HU and Im all set until Volta and 4K 144Hz HDR/OLED gaming monitors arrive

.. still not sure if I should be using Gigabyte Extreme Engine or the AfterBurner to OC the G1 .. guess I'll try AfterBurner first


----------



## looniam

i'll just leave this here.
Is the SLI HB Bridge essential?

but read the whole article.


----------



## PasK1234Xw

It's hard to take a "review" seriously when it tests old games and ones that don't even support SLI


----------



## ChevChelios

WoW surely needs those 1080 SLIs


----------



## BrainSplatter

Quote:


> Originally Posted by *looniam*
> 
> i'll just leave this here.
> Is the SLI HB Bridge essential?


Not sure these were good tests. They should test games which run slow @4K like Witcher 3 or bandwidth hogs like Doom. Also, why check whether non-SLI enabled games would magically run with HB-bridges ?!?


----------



## PasK1234Xw

I'll have mine Thursday. I'm not expecting huge gains, but I'd expect the random 80% usage drops I see at times to even out.


----------



## axiumone

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Ill have mine Thursday im not expecting huge gains but id expect that the random 80% usage drops i see at times even out.


Lower your expectations, as that won't happen. The bridge has zero effect on gpu usage or frame rates.


----------



## pez

Quote:


> Originally Posted by *ChevChelios*
> 
> they told me I should get my G1 1080 next week
> 
> praise the sun !
> 
> now I gotta snag a good deal on the XB271HU and Im all set until Volta and 4K 144Hz HDR/OLED gaming monitors arrive
> 
> .. still not sure if I should be using Gigabyte Extreme Engine or the AfterBurner to OC the G1 .. guess I'll try AfterBurner first


From what I can tell, you can essentially set an LED color (and apply an effect as needed/wanted) and then disable/uninstall the program. You can then use something that isn't as atrocious, clunky or dysfunctional to do the actual important stuff.


----------



## PasK1234Xw

Quote:


> Originally Posted by *axiumone*
> 
> Lower your expectations, as that won't happen. The bridge has zero effect on gpu usage or frame rates.


You had 2 bridges hooked up; I'm still using one, so I'm sure I will see a difference regardless of whether it's any different from two normal bridges, since I'm starved for bandwidth with only one. Granted, it has almost no effect on performance; I just want my usage back to normal.


----------



## GTANY

The TechPowerUp SLI review is bad: no frametimes and no subjective impression of smoothness. With such a review, I can't tell whether the SLI-HB bridge improves the gaming experience.


----------



## ralphi59

Hi all
What about this one ?


----------



## ralphi59

It runs like a champ!!!
But Precision is bugged like hell.


----------



## CallsignVega

Quote:


> Originally Posted by *ChevChelios*
> 
> I feel like even if they mod the BIOS and increase the voltage to push 2200-2300+ its going to require a waterblock, since on the increased voltage & core current AIB fans will either get too loud or the card will get too hot if the fans stays the same
> 
> so its for enthuasiasts/water-cooling only


My 1520 MHz 980Ti Lightnings put out more heat than a 1080 and the fans were virtually silent doing so.

Quote:


> Originally Posted by *axiumone*
> 
> Lower your expectations, as that won't happen. The bridge has zero effect on gpu usage or frame rates.


Correct, in theory really only stutters/spikes should be minimized at higher resolutions and FPS. This is due to all the frames being sent over the HB SLI bridge and not spliced together via the SLI bridge and PCI-E bridge which have slightly different traffic/speeds/delays.

Heck, 4K @ 60 Hz may not see much of a benefit. But I would surely expect that 4K @ 120 Hz and 5K at 60 Hz would, in addition to surround. That's where I ran into my 8x vs 16x PCI-E 3.0 limitations.


----------



## axiumone

EVGA HB bridges available.

http://www.evga.com/articles/01020/evga-pro-sli-bridge-hb/


----------



## max883

See if I can get a GPU score of 26.000P in Fire Strike









EDIT: volt at 100% and GPU at 2.25GHz I get 26.688P in Fire Strike GPU score


----------



## Jpmboy

Quote:


> Originally Posted by *immortalkings*
> 
> 
> 
> Got my Zotac AMP! Edition yesterday... got a problem here.. hope you guys could help me out.. i can't install the latest driver from nvidia.. it says "this nvidia graphics driver is not compatible with this version of windows" but i downloaded the correct driver.. can't update my windows 10 enterprise to version 1511.. so i need a new format windows 10? downloading the installer now.. i hope it would work later after work.. is there any suggestion besides on formating my whole PC


No need to re-install or zero-base the rig. This error is caused by the need to update Windows 10 to version 1511 (OS Build 10586.420). It is required for the loader to work (and for the driver if using W10.)
Quote:


> Originally Posted by *dmasteR*
> 
> Anyone know why a VRM is missing on the GTX 1080 FE?


it's not missing... it is for future use.


----------



## Asus11

Quote:


> Originally Posted by *max883*
> 
> 
> 
> 
> 
> se if i can get GPU score of 26.000P in Firestrike
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: volt at 100% and gpu at 2.25GHz i get 26.688P in firestrike GPU score


the pic says 25k though?

what kind of 1080 do you have hitting 2.25? u lucky.....









my 1080 can do 25,300 on the second try; I'm sure I can hit 26k though


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> No need to re-install or zero-base the rig. This error is caused by the need to update Windows 10 to version 1511 (OS Build 10586.420). It is required for the loader to work (and for the driver if using W10.)
> it's not missing... it is for future use.


when I saw it missing I thought to myself... hmmm, I can defo see the 1080 Ti coming now

then cried cause I just blew a load of $ on the 1080 haha


----------



## RGSPro

Quote:


> Originally Posted by *axiumone*
> 
> EVGA HB bridges available.
> 
> http://www.evga.com/articles/01020/evga-pro-sli-bridge-hb/


Thanks for the update! In for one, even though shipping prices are pretty high... These also look 99% like their old pro bridge minus the color. Shame really because they had the opportunity to make something new and cool looking.


----------



## Jpmboy

Quote:


> Originally Posted by *Bloodymight*
> 
> I'm pretty new when it comes to graphics bios and I just noticed an option from ThunderMaster which saves the Bios as a .rom file, do you guys need that?


the most recent version of NVflash can save and flash the 1080. We just need a way to tweak the BIOS...
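In the meantime, a dump saved by GPU-Z or nvflash can at least be sanity-checked in a few lines. This is a minimal sketch that only validates the generic PCI expansion-ROM header (the 0x55AA signature and the "PCIR" structure it points to), not any NVIDIA/Pascal-specific tables; the fabricated blob is purely for illustration.

```python
def looks_like_vbios(data: bytes) -> bool:
    """Return True if the blob starts with a PCI expansion ROM header."""
    if len(data) < 0x1A or data[0:2] != b"\x55\xAA":
        return False
    # Offset 0x18 holds a little-endian pointer to the 'PCIR' data structure.
    pcir_off = int.from_bytes(data[0x18:0x1A], "little")
    return data[pcir_off:pcir_off + 4] == b"PCIR"

# Minimal fabricated header for illustration (not a real BIOS image):
blob = bytearray(0x40)
blob[0:2] = b"\x55\xAA"
blob[0x18:0x1A] = (0x20).to_bytes(2, "little")
blob[0x20:0x24] = b"PCIR"
print(looks_like_vbios(bytes(blob)))  # True
```

An actual tweaker would still have to parse and re-checksum the vendor-specific tables beyond this header, which is exactly the part that doesn't exist for Pascal yet.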


----------



## Baasha

So these just came in:


----------



## RGSPro

Quote:


> Originally Posted by *Baasha*
> 
> So these just came in:


Did you buy a 3-way LED bridge for benchmarking/testing?


----------



## Baasha

Quote:


> Originally Posted by *RGSPro*
> 
> Did you buy a 3-way LED bridge for benchmarking/testing?


I've had the 3-way LED bridge for a while - just took a pic with it

Initial impressions (just ran 3DMark Fire Strike Ultra): scaling seems better (?) but actual results are pretty much the same as with the 3-way LED bridge.

Will do more benchmarks soon.


----------



## looniam

Quote:


> Originally Posted by *BrainSplatter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> i'll just leave this here.
> Is the SLI HB Bridge essential?
> 
> 
> 
> Not sure these were good tests. They should test games which run slow @4K like Witcher 3 or bandwidth hogs like Doom. Also, why check whether non-SLI enabled games would magically run with HB-bridges ?!?
Click to expand...

not sure myself, nor had i thought it would magically make a difference; it's just the only review i've seen comparing bridges.

which led me to:
Quote:


> Originally Posted by *Baasha*
> 
> I've had the 3-way LED bridge for a while - just took a pic with it
> 
> initial impressions (just ran 3DMark Fire Strike Ultra - scaling seems better (?) but actual results are pretty much the same as the 3-way LED bridge.
> 
> *Will do more benchmarks soon.*


please do. and if you take requests, there are some already asked for ie. TW3 and whatnot.

i'm sure folks would appreciate it.


----------



## axiumone

Quote:


> Originally Posted by *looniam*
> 
> not sure myself nor had i thought it would magically make a difference; just the only review i've seen comparing bridges.
> 
> which lead me to:
> please do. and if you take requests, there are some already asked for ie. TW3 and whatnot.
> 
> i'm sure folks would appreciate it.


Do you mean 1080 in sli using different bridges?

Take a look - http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/1300_100#post_25275188

No difference whatsoever.


----------



## looniam

Quote:


> Originally Posted by *axiumone*
> 
> Do you mean 1080 in sli using different bridges?
> 
> Take a look - http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/1300_100#post_25275188
> 
> No difference what so ever.


missed that.

have some (+REP)


----------



## Zaxbys

So from everything I am reading there does not seem to be much reason to choose an AIB card when you are going to water cool them?


----------



## Spiriva

Quote:


> Originally Posted by *axiumone*
> 
> EVGA HB bridges available.
> 
> http://www.evga.com/articles/01020/evga-pro-sli-bridge-hb/


Of course EVGA won't ship to Europe, and on the EU site there are no SLI bridges to be found, just "coming soon".


----------



## BigBeard86

Quote:


> Originally Posted by *Zaxbys*
> 
> So from everything I am reading there does not seem to be much reason to choose an AIB card when you are going to water cool them?


My EVGA FE boosts to 2164 under water. 2125 with 100% consistency.


----------



## Vellinious

Quote:


> Originally Posted by *BigBeard86*
> 
> My evga FE boosts to 2164 under water. 2125 with 100% consistency.


How much power limit throttling are you seeing at those clocks?


----------



## BigBeard86

Quote:


> Originally Posted by *Vellinious*
> 
> How much power limit throttling are you seeing at those clocks?


MSI Afterburner does show spikes hitting the 120% power target, but no core throttling occurs. It is probably the main reason why it won't go above 2126MHz.


----------



## Juub

Quote:


> Originally Posted by *Asus11*
> 
> the pic says 25k though?
> 
> what kind of 1080 do you have hittin 2.25 u lucky.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my 1080 can do 25,300 second try im sure I can hit 26k though


25K... Seems even a pair of them wouldn't be such an upgrade over my 980's. I reach 28K with my 980's in FireStrike. Here's hoping 1080 Ti will score 32K+. Two of them would be dreamy assuming the scaling isn't complete trash.


----------



## Dayaks

I'd like to do the shunt mod on the 1080 tomorrow. Any idea why step 3 has you add resistors on top of capacitors? I think the first half of the step, just soldering over the shunt resistors, would suffice. Placing low-ohm resistors over capacitors doesn't sound quite right. Soldering over the shunts is all I did with my 980.

https://xdevs.com/doc/xDevs.com/ocg_1080/senseB.jpg
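For anyone unfamiliar with why the mod works at all: the controller infers current from the voltage drop across the shunt, so lowering the effective shunt resistance makes it under-read power by the same ratio. A rough sketch of the arithmetic; the 5 mOhm values are assumed, typical figures, not taken from the xDevs guide:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power_fraction(r_shunt: float, r_added: float) -> float:
    """Fraction of the true power the card reports after paralleling the shunt.

    Current is sensed as V_drop / R_shunt, so dropping the effective
    resistance scales the reading down by the same factor.
    """
    return parallel(r_shunt, r_added) / r_shunt

# Assumed values: 5 mOhm stock shunt, another 5 mOhm soldered on top.
print(reported_power_fraction(0.005, 0.005))  # halves the reported power
```

That halved reading is why a modded card can draw well past its nominal power target before the limiter reacts.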


----------



## Scrimstar

Really hope the Hybrid/Classified/Lightning come out in July; every NVIDIA release is so bad with supply.

remember when 970s were reselling for $450


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> My 1520 MHz 980Ti Lightnings put out more heat than a 1080 and the fans were virtually silent doing so.
> Correct, in theory really only stutters/spikes should be minimized at higher resolutions and FPS. This is due to all the frames being sent over the HB SLI bridge and not spliced together via the SLI bridge and PCI-E bridge which have slightly different traffic/speeds/delays.
> 
> Heck, 4K @ 60 Hz may not see much of a benefit. But I would surely expect that 4K @ 120 Hz and 5K at 60 Hz would, in addition to surround. That's where I ran into my 8x vs 16x PCI-E 3.0 limitations.


hey vega... think I could get you to pm me or post the bios from that 1080 FTW you have?


----------



## steponz

Quote:


> Originally Posted by *Dayaks*
> 
> I'd like to do the shunt mod on the 1080 tomorrow. Any idea why they have in step 3 to add resistors on top of capacitors? I think the first half of the step to just solder over the shunt resistors would suffice. Placing low ohm resistors over capacitor's doesn't sound quite right. Soldering over the shunts is all I did with my 980.
> 
> https://xdevs.com/doc/xDevs.com/ocg_1080/senseB.jpg


That's the real mod you have to do for power.. doing the shunts doesn't help.... Trust me... I've already tested it.









PM me if ya want to do it.


----------



## steponz

Quote:


> Originally Posted by *Asus11*
> 
> the pic says 25k though?
> 
> what kind of 1080 do you have hittin 2.25 u lucky.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my 1080 can do 25,300 second try im sure I can hit 26k though


More like throttling.... 2.25 without the power mod.. don't think so...........


----------



## chronicfx

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Ill have mine Thursday im not expecting huge gains but id expect that the random 80% usage drops i see at times even out.


Why is your XB270HU at 165Hz? Is there a way to overclock it? Not that I would need it, just wondering. Mine is 144Hz

yes... off topic to say the least.. but so is arguing midrange vs, high end







I have two 980 Tis. Do you think I should go for the 1080s? At least 5 times I have added a pair to my cart then deleted it at checkout... I am obsessed, but know in the back of my mind $1500 is a lot to blow for an extra 30FPS when you are hitting 100 already....


----------



## Lays

Quote:


> Originally Posted by *geggeg*
> 
> Won't do anything if the voltage limit doesn't ask for more power than what a single 8 pin is capable of providing. Power and temps aren't the issue, voltage is. GPU Boost 3.0 combining voltage along with those two is killing overclocking potential.


Since when did we care about what a single 8-pin was "capable" of providing on *overclock.net*? Lol

Surely you and I both know that measly 150W rating is a bit misleading; that spec is really quite old from what I understand. Hell, we have PSUs now with double 8-pin cables coming from 1 connector on the PSU, and people who do XOC are pulling 500W+ through 8-pin connectors without problems.
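Rough numbers behind that claim, for the curious: an 8-pin PCIe plug carries 12 V on three pins, and the PCI-SIG 150 W figure sits well below what decent terminals can actually pass. The per-pin current rating below is an assumed ballpark (it depends on the terminal series and wire gauge), so treat this as an illustration, not a spec:

```python
V_RAIL = 12.0       # PCIe auxiliary power rail voltage
PINS_12V = 3        # an 8-pin PCIe connector carries 12 V on three pins
SPEC_WATTS = 150.0  # PCI-SIG rating for a single 8-pin connector

def connector_capacity(amps_per_pin: float) -> float:
    """Rough electrical ceiling of one 8-pin plug at a given pin rating."""
    return V_RAIL * PINS_12V * amps_per_pin

# ~8 A/pin is an assumed ballpark for decent Mini-Fit-style terminals.
print(connector_capacity(8.0), "W possible vs", SPEC_WATTS, "W spec")
```

So the connector itself usually has plenty of headroom; the card's BIOS power limit, not the plug, is what stops the fun.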


----------



## bfedorov11

Best I can do since going under water.. 5933 Ultra graphics score.. 2177MHz/5556MHz. Still pings the PL a few times with the shunt resistor mod. Game stable at 2164MHz with very few drops below 2150 in Witcher 3. 55-60 fps in 4K with everything maxed but Hairworks AA off.

http://www.3dmark.com/fs/8897015


----------



## steponz

Quote:


> Originally Posted by *Lays*
> 
> Since when did we care about what a single 8 pin was "capable" of providing on *overclock.net* ? Lol
> 
> Surely you and I both know that measly 150w rating is a bit misleading, that spec is really quite old from what I understand. Hell, we have PSU's now with double 8 pin cables coming from 1 connector on the PSU, and people that do XOC are pulling 500w+ through 8 pin connectors without problems.


Reading this nonsense kills me.....


----------



## steponz

Quote:


> Originally Posted by *bfedorov11*
> 
> Best I can do since going under water.. 5933 ultra graphics score.. 2177mhz/5556mhz. Still pings PL a few times with shunt resistor mod. Game stable at 2164mhz with very few drops below 2150 in witcher 3. 55-60 fps in 4k with everything maxed but hairworks AA off.
> 
> http://www.3dmark.com/fs/8897015


You sure you're not throttling with the shunt mod? Show some GPU-Z results... the shunt mod should reduce the reading so you don't hit the limit at all.. In other words... it should be OK for 2500MHz.


----------



## Lays

Quote:


> Originally Posted by *steponz*
> 
> Reading this nonsense kills me.....


My post or his post? Lol









Also hey Joe, how ya doin?


----------



## immortalkings

done reformatting my PC and installing all the apps that are needed.. now i'm playing games with no OC.. hitting 1873+ boost clock but the temperature is hitting 82°C as well.. i'm using a Zotac 1080 AMP Edition.. i thought AIB partner cards only hit 70°C+ temps.. the idle temp is 40°C.. when the room is hot it's 45°C.. is that normal?


----------



## stanielz

Quote:


> Originally Posted by *Lays*
> 
> Since when did we care about what a single 8 pin was "capable" of providing on *overclock.net* ? Lol
> 
> Surely you and I both know that measly 150w rating is a bit misleading, that spec is really quite old from what I understand. Hell, we have PSU's now with double 8 pin cables coming from 1 connector on the PSU, and people that do XOC are pulling 500w+ through 8 pin connectors without problems.


Quote:


> Originally Posted by *steponz*
> 
> Reading this nonsense kills me.....


he's right, i'm running my SLI 1080s off a single connector with the double 8-pins he's referring to with no issues. cards are OC'd and stable at 2125MHz.


----------



## steponz

Hey man.. not yours of course...

People really need to start looking at the power cap and not clocks.

Quote:


> Originally Posted by *Lays*
> 
> My post or his post? Lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also hey Joe, how ya doin?


----------



## pez

Quote:


> Originally Posted by *immortalkings*
> 
> done reformatting my PC and installing all the Apps that are needed.. now i'm playing games no OC.. hitting 1873+ boost clock but the temperature is hitting 82c aswell.. im using a Zotac 1080 AMP edition.. i though AIB partners only hit 70c+ temp.. the idle temp is 40c.. when the room temp is hot its 45c.. is it normal?


It sounds like you're not getting adequate airflow. I haven't seen any testing done on the Zotac cards yet, but these AIB coolers are putting most of that hot air right into your case. Higher idle temps are normal as I believe most if not all of the GTX 10X0 series does not run the fan at anything below 60C.


----------



## immortalkings

Quote:


> It sounds like you're not getting adequate airflow. I haven't seen any testing done on the Zotac cards yet, but these AIB coolers are putting most of that hot air right into your case. Higher idle temps are normal as I believe most if not all of the GTX 10X0 series does not run the fan at anything below 60C.


i'll try opening the case and turning on my air conditioning to see if it changes anything.. if that's the case.. i need to buy a full set of case fans to get better airflow.. i only use the stock fans of my S340, by the way


----------



## emett

Sorry if this is a silly question, but with my G1 do we have voltage control? I've unlocked it in MSI Afterburner and it can go up to 100. But does it affect anything on these Pascal cards?


----------



## pez

Quote:


> Originally Posted by *immortalkings*
> 
> il try to open the case then and turn on my air-conditioning if it change something.. if thats the case.. i need to buy a full set of case fans to get a better airflow.. i only use stock fans of my s340 by the way


How are your fans set up exactly? From the stock photos I'm seeing, there's a fan at the top and the rear. What about fans in the front?

If you've got a tower cooler on your CPU, you could go the route of using two front intake fans and a single rear exhaust. Your front airflow is a bit limited between the front panel and the resistance from the filters, but better fans can really do some work in that case. You should put your PC specs in your signature







. (see below)

Click me!


----------



## kiario

MSI 1080 installed

volt: auto
power: 120%
core clock: +150 = 2050 mhz Stabilized after 30 min heaven
mem: +500 = 11010
Fan: auto(stock bios)

123.3 in heaven benchmark
ultra
extreme
8xaa
1920x1080

Temps: 69°C
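Side note on why the same +150 offset lands on different final clocks for different people: GPU Boost quantizes frequency into discrete bins (roughly 13 MHz is the commonly reported Pascal granularity, assumed here) and then picks whatever bin power and temperature allow. A toy sketch:

```python
BIN_MHZ = 13.0  # commonly reported Pascal clock-bin granularity (assumed)

def snap_to_bin(stock_boost_mhz: float, offset_mhz: float) -> float:
    """Quantize stock boost + offset down to the nearest clock bin."""
    return BIN_MHZ * ((stock_boost_mhz + offset_mhz) // BIN_MHZ)

# Two cards with different stock boosts end up on different bins at +150:
print(snap_to_bin(1898, 150))
print(snap_to_bin(1936, 150))
```

The real boost clock then steps among nearby bins with power and temperature, which is why monitoring tools show it moving in small jumps rather than sliding smoothly.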


----------



## immortalkings

Quote:


> How are your fans set up exactly? From the stock photos I'm seeing, there's a fan at the top and the rear. What about fans in the front?
> 
> If you've got a tower cooler on your CPU, you could go the route of using two front intake fans and a single rear exhaust. Your front airflow is a bit limited between the front panel and the resistance form the filters, but better fans can really do some work in that case.


i got one ID-Cooling 140mm in the top front.. the bottom one is stock, the top is stock and the rear is my H80i GT.. the fronts are intake and the top is exhaust.. i think i need to swap the top exhaust and the bottom front ID-Cooling 140mm.. the airflow of the ID-Cooling 140mm is strong; i can't even feel the air from my 120mm stock fans..


----------



## max883

KIARIO: i have the same clocks as you, but +150 core clock gives me 2125MHz GPU on my MSI GTX 1080 GAMING X
volt: auto
power: 120%
core clock: +150 = 2088 Stabilized after 30 min heaven
mem: +500 = 11010
Fan: auto(stock bios)

I wonder what ASIC quality my card has??









got 70°C max temp in Doom with all settings maxed at 4K


----------



## pez

Quote:


> Originally Posted by *immortalkings*
> 
> i got one ID cooling 140mm on the top front.. the buttom is stock, the top is stock and the rear is my H80i GT.. the fronts are intake and the top is exhaust.. i think i need to change the top exhaust and the buttom front ID-cooling 140mm.. the wind of the ID-cooling 140mm is strong, i can't even feel the air on my 120mm stocks..


You might try changing the top fan to intake, but it may be too close to the H80i to be effective. However, it's easy enough to change around and try. My guess however is that ultimately getting better intakes are going to go a long way for you.

2 x 140mm fans in the front for intake will pull quite a bit of air and can be fairly silent in doing so. Another possibility or option could be to change the H80i to exhaust out of the top and the rear as an intake, but it might run into the same issue as I mentioned before.

Noctua, Phanteks and others make good 140mm fans that would be great for intakes







.


----------



## KickAssCop

Which is the best 1080 out?
People who went 1080 after 980 Ti recommend the upgrade or not?


----------



## max883

MSI GTX 1080 GAMING X, EVGA GTX 1080 ACX 3.0, or Palit GeForce GTX 1080 GameRock.

If you have a GTX 980 Ti, I recommend not upgrading! Not worth it!


----------



## BrainSplatter

Quote:


> Originally Posted by *KickAssCop*
> 
> Which is the best 1080 out?


'Best' in which sense? Most overclocking potential? Quietest operation (fan noise/coil whine)? Best price/performance ratio? ...

Upgrading from a 980 Ti will give about 15-30% more performance (both overclocked to the max), usually quieter operation (due to lower power consumption), 2GB more VRAM and potentially new features. Worth it? U decide.


----------



## Jpmboy

Quote:


> Originally Posted by *steponz*
> 
> Hey man.. not yours of course...
> 
> *People really need to start looking at the power cap and not clocks.*


^^ *this*. The power limit is killing these cards!!
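The arithmetic behind that complaint is simple. Taking NVIDIA's 180 W board-power spec for the 1080 FE and the slider caps people report in this thread (108-120% depending on the BIOS), the absolute ceiling works out as:

```python
TDP_WATTS = 180.0  # NVIDIA board-power spec for the GTX 1080 FE

def power_target(percent: float) -> float:
    """Board power cap in watts for a given power-target slider setting."""
    return TDP_WATTS * percent / 100.0

for pct in (100, 108, 120):
    # e.g. a 120% slider caps the whole card around 216 W, however good the cooling
    print(f"{pct}% slider -> {power_target(pct):.0f} W cap")
```

Even the most generous slider still leaves the card bumping into the cap during heavy scenes, which matches the "power limit pinging" people describe above.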


----------



## Uing07

3DMark11 X @ default clocks and with the EVGA Pro V2 bridge;


I'll benchmark later with NVIDIA's HB bridge and the EVGA HB bridge;


----------



## gerbil80

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ *this*. The power limit is killing these cards!!


Silly question, but is that related to power delivery - as in connectors/phases, or the hardware itself do you think?


----------



## Vellinious

Quote:


> Originally Posted by *gerbil80*
> 
> Silly question, but is that related to power delivery - as in connectors/phases, or the hardware itself do you think?


No. The power limits set in the bios are too low. The Pascal bios editor will be out soon, if it isn't out there already, that will allow for those numbers to be customized, and then....no more problems.


----------



## Jpmboy

Quote:


> Originally Posted by *Vellinious*
> 
> No. The power limits set in the bios are too low. The Pascal bios editor will be out soon, if it isn't out there already, that will allow for those numbers to be customized, and then....no more problems.


^ This.

(no bios tweaker yet AFAIK)


----------



## escalibur

https://www.overclockers.co.uk/gigabyte-geforce-gtx-1080-xtreme-waterforce-8192mb-gddr5x-pci-express-graphics-card-gv-n1080xtreme-gx-185-gi.html



Can't wait for the local shop to have them in stock.


----------



## khemist

[/URL]

Just got my EVGA.


----------



## superkyle1721

Quote:


> Originally Posted by *escalibur*
> 
> https://www.overclockers.co.uk/gigabyte-geforce-gtx-1080-xtreme-waterforce-8192mb-gddr5x-pci-express-graphics-card-gv-n1080xtreme-gx-185-gi.html
> 
> 
> 
> Can't wait for the local shop to have them in stock.


1936 boost clock, full VRM and memory cooling instead of just the core. If I was in the market for a 1080 this would be the one to get. With the PBT release I feel this card will dominate.

Always destroying exergy


----------



## escalibur

Quote:


> Originally Posted by *superkyle1721*
> 
> 1936 boost clocks full VRM and memory cooling instead of just core. If I was in the market for a 1080 this would be the one to get. With PBT release I feel this card will dominate.
> 
> Always destroying exergy


Indeed. At least it shouldn't be worse than this: https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_Waterforce/27.html


----------



## superkyle1721

Quote:


> Originally Posted by *escalibur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *superkyle1721*
> 
> 1936 boost clocks full VRM and memory cooling instead of just core. If I was in the market for a 1080 this would be the one to get. With PBT release I feel this card will dominate.
> 
> Always destroying exergy
> 
> 
> 
> Indeed. At least it shouldn't be worst than this: https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_Waterforce/27.html
Click to expand...

I own two of those running at 1569 24/7 stable. I am well aware how good they are








http://www.3dmark.com/fs/8366229

Always destroying exergy


----------



## Asus11

Quote:


> Originally Posted by *steponz*
> 
> Thats the real mod you have to do for power.. doing the shunts doesn't help.... Trust me ... Ive already tested it..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PM me if ya want to do it.


you sure the shunt mod doesn't help, not even a little? i'm tempted to do it to get even 50MHz more out of the card


----------



## shalafi

Quote:


> Originally Posted by *escalibur*
> 
> https://www.overclockers.co.uk/gigabyte-geforce-gtx-1080-xtreme-waterforce-8192mb-gddr5x-pci-express-graphics-card-gv-n1080xtreme-gx-185-gi.html
> 
> Can't wait for the local shop to have them in stock.


~970€


----------



## marc0053

I can confirm the shunt mod on the Gigabyte GTX 1080 G1 was enough to eliminate core clock throttling for me in Fire Strike, 3DMark11, Catzilla and Vantage so far. Instead of hitting the power limit at 108%, it now maxes out around 101%. This did not give me additional OC room though; it simply stopped the throttling. For me it was 3x resistors, and I put CLU over them. Using the pencil method did not work for me this time. For reference, I am using an EK Thermosphere waterblock with a small 120mm fan on the VRM. The unloaded temps read 25°C and loaded up to 32°C-ish. I would assume any increase in voltage from the current max of 1.08v would hit this power limit very quickly.


----------



## steponz

Quote:


> Originally Posted by *Asus11*
> 
> you sure shunt mod doesn't help not even a little? im tempted to do it to get even 50mhz more out of the card


Have the people with 2250 and other high clocks show you their TDP... it doesn't work. Nvidia put extra checks in to combat this

You need 3 x 10 ohm resistors on the caps.. not the shunts.

Otherwise, if you do the shunts properly, it will lock to 239.. or something like that.. can't remember the exact number.

Then you will get a proper low TDP.... I was with Tin benching the card to 2500 plus


----------



## steponz

Quote:


> Originally Posted by *marc0053*
> 
> I can confirm the shunt mod on the Gigabyte GTX 1080 G1 was enough to eliminate core clock throttling for me on all the firestrike, 3Dmark11, Catzilla and vantage so far. Instead of hitting the power limit at 108% is now maxes out around 101%. This did not help getting additional OC room though, simply stop throttle. For me it was 3x resistors and I put CLU over. Using the pencil method did not work for me this time.


Shunt mod should get you much much lower TDP.

The Gigabyte G1 is a non-reference card.. so it could be different. I'm specifically talking about the reference card, aka the Founder's Edition.


----------



## steponz

Also.. show power in GPU-Z.. much better for calculating than using AB.
Quote:


> Originally Posted by *marc0053*
> 
> I can confirm the shunt mod on the Gigabyte GTX 1080 G1 was enough to eliminate core clock throttling for me on all the firestrike, 3Dmark11, Catzilla and vantage so far. Instead of hitting the power limit at 108% is now maxes out around 101%. This did not help getting additional OC room though, simply stop throttle. For me it was 3x resistors and I put CLU over. Using the pencil method did not work for me this time.




----------



## marc0053

Quote:


> Originally Posted by *steponz*
> 
> Shunt mod should get you much much lower TDP.


Ok will try that next time, thanks Joe!!


----------



## steponz

Quote:


> Originally Posted by *marc0053*
> 
> Quote:
> 
> 
> 
> Originally Posted by *steponz*
> 
> Shunt mod should get you much much lower TDP.
> 
> Ok will try that next time, thanks Joe!!
Click to expand...


This should get you a more accurate power setting and should give you a proper OC clock.. even with the TDP changed a bit.. your clocks are really not what ya think...

I did a bunch of testing at Computex and had to play with it quite a bit... Not until Tin and I talked about the shunts did we realize that they weren't the only thing ya needed... the 10 ohm resistors are the way to go.

Gigabyte changes the G1 quite a bit usually.. so it could be quite different for that card. So if it's not similar.. don't do it.


----------



## Asus11

Quote:


> Originally Posted by *steponz*
> 
> Have the people with 2250 and other high clocks show you there tdp... it doesn't work. Nvidia put extra checks to combat this
> 
> You need 3 x 10 ohm resistors on the caps.. not the shunts.
> 
> Otherwise if you do the shunts properly it will lock to 239.. or something like that.. can't remember the exact number.
> 
> Then you will get proper low tdp.... I was with Tin benching the card to 2500 plus


Isn't the 1080 FE 180W TDP? Having a 239W TDP would be better, no?


----------



## steponz

Quote:


> Originally Posted by *Asus11*
> 
> isnt the 1080 fe 180w tdp? having 239 tdp would be better no?


When I modded the shunts.. the GPU clock was locked at 239 MHz.. not the TDP


----------



## Asus11

Quote:


> Originally Posted by *steponz*
> 
> When I modded the shunts.. the gpu clocks was locked at 239 mhz.. not tdp







check comments by ''TrueRegulators''

this is the reason I wanted to do it


----------



## steponz

Quote:


> Originally Posted by *Asus11*
> 
> 
> 
> 
> 
> 
> check comments by ''TrueRegulators''
> 
> this is the reason I wanted to do it


Just because he says he got 100 MHz... doesn't mean he actually did.

A lot of people don't understand that when they are setting clocks... the TDP will throttle them down.

The shunts don't work.. do the 3x 10 ohm resistors that Tin says to do.. That's what you want.....


----------



## steponz

Quote:


> Originally Posted by *steponz*
> 
> Just because he says he got 100 mhz... doesn't mean he actually did.
> 
> Alot of people don't understand that when they are setting clocks... the tdp with throttle them down.
> 
> The shunts don't work.. do the 3x 10 ohm resistors that Tin says to do.. Thats what you want.....


Also, that guy has no idea what to do, as he obviously hasn't modded a card.

This is what you need to look at: http://forum.kingpincooling.com/showthread.php?t=3879


----------



## bfedorov11

Quote:


> Originally Posted by *steponz*
> 
> The shunts don't work.. do the 3x 10 ohm resistors that Tin says to do.. Thats what you want.....


Gonna order some and try it next week if I have time. 2500 plus with what cooling?


----------



## Asus11

Quote:


> Originally Posted by *bfedorov11*
> 
> Gonna order some and try it next week if I have time. 2500 plus with what cooling?


most likely LN2 lol

not sure if it's worth actually soldering stuff on your new 1080 unless that's what you're into


----------



## bfedorov11

Yeah, I'm wondering if it's worth voiding the warranty for ambient water.


----------



## Asus11

Quote:


> Originally Posted by *bfedorov11*
> 
> Yeah, I'm wondering if it's worth voiding the warranty for ambient water.


tough one.. id wait it out for bios tweakers tbh


----------



## fat4l

Quote:


> Originally Posted by *BrainSplatter*
> 
> 'Best' in which sense? Most overclocking potential? Quietest operation (fan noise/coil whine)? Best price/performance ratio? ...
> 
> Upgrading from 980TI will give about 15-30% more performance (both overclocked to the max), usually more quiet operation (due to less power consumption), 2GB more of VRAM and potentially new features. Worth it? U decide.


Rather 10-15% when both oced...


----------



## Asus11

Quote:


> Originally Posted by *fat4l*
> 
> Rather 10-15% when both oced...


apparently, vs a Titan X overclocked to 1500MHz, an overclocked 1080 is 12% faster

so I say a 1080 could be 17-20% faster than an overclocked 980 Ti


----------



## ChevChelios

2050-2100Mhz 1080 is about 20% faster

less so if you are talking about a 1550-1600+ 980Ti


----------



## bfedorov11

My old TX, 1500/8000, 5163 graphics, http://www.3dmark.com/fs/8457511

1080, 5933 graphics, http://www.3dmark.com/fs/8897015

12.97%

1080, new drivers... no bios tweaks.....


----------



## escalibur

Quote:


> Originally Posted by *superkyle1721*
> 
> I own two of those running at 1569 24/7 stable. I am well aware how good the are
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/8366229
> 
> Always destroying exergy


Waterforce was released way too late. I went through 4 different 980 Tis and was too lazy to move to the Waterforce. One of many reasons why I'm going this time with the Xtreme Water is with SLI in mind. Probably the best and coolest solution right after custom loop builds.

Quote:


> Originally Posted by *shalafi*
> 
> ~970€


Overclockers.co.uk has never been the cheapest shop.







That's why we have Mindfactory.de and others like them.


----------



## Hilpi234

I also modded the shunts. It does prevent throttling, but you cannot get even 1 MHz more... and my clock is not locked down...

But there is still some kind of limit in place...

This would not be possible without it (you can apply any clock you want, you only get <5950)... but it is not stable at all... max is still 2126 like before...

http://www.3dmark.com/fs/8774921

If you use the curve, you get Perf-Caps in GPU-Z; if you use Offset, no Perf-Caps. But you cannot apply more voltage without the curve...


----------



## superkyle1721

Quote:


> Originally Posted by *escalibur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *superkyle1721*
> 
> I own two of those running at 1569 24/7 stable. I am well aware how good the are
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/8366229
> 
> Always destroying exergy
> 
> 
> 
> Waterforce was released way too late. I managed to have 4 different 980 Ti and I was too lazy to move to Waterforce. One of many reasons why im going this time with Xtreme Water is SLI in mind. Probably the best and coolest solution right after custom loop
Click to expand...

Very true. If I hadn't been in the process of building a new rig right at the time, I wouldn't have considered it. Great SLI temps though, and for just a few extra hundred it was much easier to stomach than the $1000 custom loop I priced out. The money I saved by not going custom loop let me buy another card. Worth it IMO, but I really love the custom loop look. One day I'll save the money and build it.

Always destroying exergy


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> hey vega... think I could get you to pm me or post the bios from that 1080 FTW you have?


Sent. Let me know if you can work any magic on it.


----------



## escalibur

Quote:


> Originally Posted by *superkyle1721*
> 
> Very true. If I wasn't in the process of building a new rig right at the time I wouldn't have considered it. Great sli temps though and for just a few extra hundred it was much easier to stomach then buying the $1000 custom loop I priced out. The money I saved from not going custom loop allowed me to buy another card. Worth it IMO but I really love the custom loop look. One day I'll save the money and build it.
> 
> Always destroying exergy


Custom loop is amazing if you have time, money and will to build and also maintain it. I don't.


----------



## jommy999

Any Gigabyte 1080 Xtreme owners here? I've heard about a QC issue where the fan hits the X-bar frame in front of it, and about coil whine. Do you have those problems, or is it just a minority?

Thank you in advance


----------



## escalibur

Quote:


> Originally Posted by *jommy999*
> 
> any Gigabyte 1080 Xtreme owner here ? I heard about issue of QC that fan hitting the X-bar frame in front of it and coil whine. Do you have that problems ? or just minority ?
> 
> Thank you in advance


According to Newegg's reviews (http://www.newegg.com/Product/Product.aspx?Item=N82E16814125873&cm_re=xtreme_1080-_-14-125-873-_-Product) there seem to be some.


----------



## toncij

Quote:


> Originally Posted by *bfedorov11*
> 
> My old TX, 1500/8000, 5163 graphics, http://www.3dmark.com/fs/8457511
> 
> 1080, 5933 graphics, http://www.3dmark.com/fs/8897015
> 
> 12.97%
> 
> 1080, new drivers... no bios tweaks.....


It's actually 14.91%.







Comparing graphics only. Which brings us to questions:
- how noisy and hot is the 1080, and which model/cooling is it?
- how much change do you actually feel in real use? (I presume little to none, but YMMV.)
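The two percentages in this exchange come from picking different baselines for the same score delta; a quick check of the arithmetic, using the graphics scores from bfedorov11's links above:

```python
# Graphics scores quoted in the thread: Titan X @ 1500/8000 vs. GTX 1080.
old, new = 5163, 5933

uplift = (new - old) / old * 100   # gain relative to the Titan X (usual convention)
shrink = (new - old) / new * 100   # same delta, but relative to the 1080

print(f"{uplift:.2f}%")  # 14.91% -- toncij's figure
print(f"{shrink:.2f}%")  # 12.98% -- essentially bfedorov11's 12.97%
```

So both numbers are "right"; dividing by the old score is the conventional way to state an upgrade's gain.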


----------



## steponz

Again, to get around power you need to solder 3 x 10 ohm resistors on the caps, and it will remove the power cap.

I've already proven it. Doing the shunts doesn't do anything and can actually hurt performance.
Quote:


> Originally Posted by *Hilpi234*
> 
> I also modded the shunts, it does prevent throttling, but you cannot even get 1 Mhz more... and my clock, is not locked down...
> 
> But there is still somekind of Limit in place...
> 
> This would not be possible without it (you can apply any clock you want, you only get <5950) ... but it is not stable at all... max is still 2126 like before...
> 
> http://www.3dmark.com/fs/8774921
> 
> If you use the curve, you get Perf-Caps, if you use Offset no Perf caps... In GPUz. But you cannot apply more voltage, without the curve...


----------



## Hilpi234

I do not lose performance, only if i modify the curve... and this happens, with and without shunt mod...


----------



## steponz

Try a kingpin at 1500 with Samsung mem and the performance will be even higher.

Why 1080 can't beat Kingpin .. it needs 2800mhz to try to beat our LN2 clocks.
Quote:


> Originally Posted by *toncij*
> 
> It's actually 14,91%.
> 
> 
> 
> 
> 
> 
> 
> 
> Comparing graphics only. Which brings us to questions:
> - how noisy and hot is 1080 and what model cooling it is?
> - how much change do you actually feel in real use? (I presume little to none, but ymmv.


----------



## Asus11

Quote:


> Originally Posted by *steponz*
> 
> Try a kingpin at 1500 with Samsung mem and the performance will be even higher.
> 
> Why 1080 can't beat Kingpin .. it needs 2800mhz to try to beat our LN2 clocks.


not that much higher


----------



## Hilpi234

In fact, if you shunt your card, it will increase its voltage to keep the clock stable at higher temps/load... this starts at 1.043 and goes up to 1.075 if you open the voltage slider.


----------



## CallsignVega

Quote:


> Originally Posted by *jommy999*
> 
> any Gigabyte 1080 Xtreme owner here ? I heard about issue of QC that fan hitting the X-bar frame in front of it and coil whine. Do you have that problems ? or just minority ?
> 
> Thank you in advance


Ya, I've heard some owners reporting issues. It probably isn't hard to slightly bend it so it doesn't touch, though. Close tolerances between fan blades and shroud increase efficiency.


----------



## steponz

If you're gonna claim that it works.. people need to start posting GPU-Z and some scores with screenshots.. voltage control is locked.
So don't believe anything unless you attach a multimeter.
Quote:


> Originally Posted by *Hilpi234*
> 
> In fact, if you shunt your card, it will increase its Voltage, to keep, the clock stable, at higher Temps/Load... this starts at 1.043 and goes up to 1.075 if you open the Voltage-Slider.


----------



## steponz

Again, the shunt mod doesn't do anything, as I have said multiple times.. did you actually solder or use CLU?

Even if you did, it's not gonna help any.

Adding the resistors to the caps is what helps power.. already proved it by benching 2500 plus on a 1080 on LN2.

Shunting will just limit you and does nothing to help power all the way.. if you do 1 or 2 of the shunts it can lower it a bit.. but not enough to get the best possible clocks even on air.

Post some GPU-Z screenshots with scores.. use a benchmark.. that way you can actually gauge whether you're getting more performance out of the clocks.. I see countless guys on these forums saying they clock so high.. but really their performance is garbage.
Quote:


> Originally Posted by *Hilpi234*
> 
> I do not lose performance, only if i modify the curve... and this happens, with and without shunt mod...


----------



## Jpmboy

Quote:


> Originally Posted by *marc0053*
> 
> I can confirm the shunt mod on the Gigabyte GTX 1080 G1 was enough to eliminate core clock throttling for me on all the firestrike, 3Dmark11, Catzilla and vantage so far. Instead of hitting the power limit at 108% is now maxes out around 101%. This did not help getting additional OC room though, simply stop throttle. For me it was 3x resistors and I put CLU over. Using the pencil method did not work for me this time. For reference I am using a EK thermosphere waterblock with a small 120mm fan on the vrm. The unloaded temps are reading 25C and loaded up to 32C ish. I would assume any increase in voltage from the current max of 1.08v would hit this power limit very quickly.


hey Marc, thanks for posting your results... +1


----------



## Hilpi234

Ultra @ 2113/5599 (+203/+595), I did it one time before I started the stress test... http://www.3dmark.com/3dm/12372223

10 minutes...

The frames drop slightly over time, but this is related to the heat; you can also see it starts @ 1.05 and goes up to 1.075...



and here is the file...

Ultra-Stress.zip 408k .zip file


And if you truly want it, I can run it again for your GPU-Z screen

Try it, a Founders would drop to 2000 or lower in a minute... and the frames get worse...

If I run this @ Extreme the frames are the same and the temp stays @ 42°C


----------



## steponz

Quote:


> Originally Posted by *Hilpi234*
> 
> Ultra @ 2113/5599 +203/+595 i did it 1 time before i started the Stresstest... http://www.3dmark.com/3dm/12372223
> 
> 10 minutes...
> 
> the Frames drop slightly over time but this is related to the heat, you can also see, it starts @ 1.05 and goes up to 1.075...
> 
> 
> 
> and here is the File...
> 
> Ultra-Stress.zip 408k .zip file
> 
> 
> And if you truly want i can run it again for your GPUz screen
> 
> Try it, a founders would drop to 2000 or lower in a Minute... and the Frames get worse...


Can't really see your stuff.. I want to see your power in GPU-Z.

By doing the 10 ohm on the caps you remove it all.. which could give you higher results. Because with just shunts you will be limited.

Which mod did you do?

Did you solder.. or use CLU? Using solder, when I did the 3 resistors it would drop to the locked clocks.
If you only short 2 of the shunts it will drop power a little bit.. but not much, and depending on how you clocked memory it would also affect the score.

That's why I want to see a GPU-Z screenshot.


----------



## Hilpi234

I had 3 shunts with CLU, but i removed it from 2 of them today... now only the one on the 8 Pin is modded...

brb...


----------



## steponz

Ahh ok.. makes sense.. that's why your power is only going down a little bit. But you should be pretty close to the cap.
Quote:


> Originally Posted by *Hilpi234*
> 
> I had 3 shunts with CLU, but i removed it from 2 of them today... now only the one on the 8 Pin is modded...
> 
> brb...


----------



## steponz

Quote:


> Originally Posted by *Hilpi234*
> 
> I had 3 shunts with CLU, but i removed it from 2 of them today... now only the one on the 8 Pin is modded...
> 
> brb...


Also, can you run FSE instead of FSU?

I can tell exactly how your performance is with that.


----------



## toncij

Quote:


> Originally Posted by *steponz*
> 
> If your gonna claim that it eorks.. people need to start posting GPUz and some scores with screenshots.. voltage control is locked
> So don't believe anything unless you attach a multimeter.


Hasn't it been said that voltage increases, like power delivery improvements, don't help because the chips simply can't take more? Kingpin's Tin said so, if I recall correctly?


----------



## Hilpi234

Here you go... I ran Extreme this time... because I hate to hear the coils squeal like hell


----------



## rv8000

I'm definitely enjoying my card from a raw power perspective, but Nvidia really seems to have sucked all the fun out of playing with the tech with such a restrictive BIOS and Boost 3.0. I've never been so bored with a GPU before


----------



## steponz

Voltage is only really needed on LN2.. haven't really tested on water.. tbh it didn't seem to help all that much. Mostly cold scaling with extremely cold temps
Quote:


> Originally Posted by *toncij*
> 
> Hasn't it been said that voltage increase, like power delivery improvements don't help due to the fact that chips simply can't take more? Kingpin Tin said so if I recall correctly?


----------



## steponz

Yeah hopefully this changes..


----------



## Hilpi234

and? ^^


----------



## steponz

What brand of card is this?

Quote:


> Originally Posted by *Hilpi234*
> 
> 
> 
> Here you go... I ran Extreme this time... because i hate to hear the coils squeal like hell




----------



## steponz

Quote:


> Originally Posted by *Hilpi234*
> 
> 
> 
> Here you go... I ran Extreme this time... because i hate to hear the coils squeal like hell


What score did you get in FSE? Really just want to see GPU score.


----------



## Hilpi234

An EVGA Founders...

http://www.3dmark.com/fs/8905732

Over 12k


----------



## Asus11

Quote:


> Originally Posted by *Hilpi234*
> 
> A EVGA Founders...


what are your mods? just the 3 shunt resistors with CLU?


----------



## Hilpi234

Just one, close to the 8 Pin, with lots of CLU







but it also worked with 3 before...

is the score correct for the clock, or not?


----------



## Jpmboy

Quote:


> Originally Posted by *Hilpi234*
> 
> 
> 
> Here you go... I ran Extreme this time... because i hate to hear the coils squeal like hell


nicely done. no slamming into the power limit. The question is whether this results in better performance...


----------



## steponz

Quote:


> Originally Posted by *Hilpi234*
> 
> Just one, close to the 8 Pin, with lots of CLU
> 
> 
> 
> 
> 
> 
> 
> but it also worked with 3 before...
> 
> is it correct, compared to the clock, or not ?


Seems ok.. low on memory though.

I don't think you actually shorted all 3 when you tried. You would get a lock on the GPU.


----------



## Jpmboy

all this sheet is simply to overcome the power limit set in the BIOS (for ambient users). Maybe these sponsored/extreme overclockers can push to get a BIOS tweaker out, rather than putzing with these board hacks. What next, baling wire and bubble gum?


----------



## Hilpi234

I know the perf drop you are talking about; it will happen if you modify the curve and lock the voltage of the GPU... in Offset it switches its voltage every 10°C to keep the clock stable...

You wanted to see this











Only happens, in the curve


----------



## steponz

Here are some scores from my testing early on.. this was before the official driver was released.. you seem to be scoring a bit low.

Could be the diff between Win8 and Win10.. I'll have to mod this card I have and retest to see.

Here are the scores... only the GPU score.. because that's all that matters.. FSE.





With a newer driver.. the performance should be much better.


----------



## Hilpi234

it varies... but it is always 12000-12100... I do lose performance if I go higher with the RAM...


----------



## Jpmboy

I find it hard to tell when the PL kicks in with gpuZ. AB 4.3 beta is pretty clear:











That's from Unigine Heaven 4.0 (130 fps)


----------



## Hilpi234

As I said, it works to keep the clock stable... and it is only one shunt









But you can say "Goodbye" to the Curve-Editor







it is simply bugged crap right now...


----------



## steponz

Quote:


> Originally Posted by *Hilpi234*
> 
> it varies... but it is always 12-12100... I do lose performance... if I go higher with the ram...


Yeah, it seems like mem is interesting on these new cards.

I remember testing the mem clocks.. and every time I tried to bump mem.. it would lower the GPU clocks. So since you have only 1 shunt done, I'm wondering if you're getting held back by this.
Maybe try to put some CLU on the others and see if that helps.

You notice on my card.. pretty much no TDP limit at all....


----------



## steponz

I'll hard-mod this card over the next day or two and I'll check the clocks and performance after each step.. easier to compare what's doing what on the same system.

I'll also update to the newest driver and OS, as that was all prerelease stuff.

I have a feeling you can score higher if you play around a bit.


----------



## Hilpi234

Sure... but this is simply 24/7 water... Played with these settings in the new Witcher addon for hours... only the voltage moves, nothing else...

It is ~20% faster than my old crap TX... I am happy right now









This was the highest ... http://www.3dmark.com/fs/8768338 but it was not stable... because of the temp...

But, back to my question: the last picture I posted, did you expect this result and GPU-Z readout?


----------



## XCalinX

Thanks for adding me!


----------



## immortalkings

my GPU, with an open side panel while running Heaven Benchmark, still hits 75-79C... that's still high for a GTX 1080 AMP! Edition AIB card... does my card have an issue or is it faulty? Or could this just be normal on a Zotac card?

but I'll try to change my fans and get new ones to replace the stock fans.. I don't want the top fan as intake because it will pull in all the dirt/dust.. the front panel will do the job with a filter. I'll change the H80i 120mm fans as well.. I'll try to find better fans that fit my color theme


----------



## steponz

It's a little low.. and I'm only on air.

Quote:


> Originally Posted by *Hilpi234*
> 
> Sure... but this is simply 24/7 water... Played with these settings, the new Witcher Addon, for hours... only the Voltage moves, nothing else...
> 
> It is ~ 20% faster, than my old Crap TX ... I am happy right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This was the highest ... http://www.3dmark.com/fs/8768338 but it was not stable... because of the temp...
> 
> But, back to my question, the last picture i posted, you did expect this Result and GPUz readout?


----------



## steponz

Have you tried changing the paste? Factory paste is usually garbage.
Quote:


> Originally Posted by *immortalkings*
> 
> my GPU on Open side panel with running Heaven Benchmark.. it still hits above 75-79c... its still high to a GTX 1080 AMP! Edition AIB card... is my card got an issue or faulty? or it could just be normal on a Zotac card?
> 
> but il try to change my fans and get a new one to change the stock fans.. i don't want to intake the top fan cause it will get all the dirt/dust.. front panel will do the job with a filter on front. il change the H80i 120mm fans as well.. il try to find a better fans that will fit my color theme


----------



## Hilpi234

Do not get me wrong, Steponz, but I am pretty sure all 3 were shunted. I had ~1.7-8 of the normal PT; the RAM did not care









+600-700 perf drop, then a very slow rise, but artifacts...

The core cannot do more... bench yes... stable no.

It could maybe do more... but only with more voltage... and if I use the curve, you see what happens


----------



## immortalkings

Quote:


> You try changing paste.. factory paste is usually garbage.


sorry for the noob question.. will it void the warranty if I open the card? Or does it depend on the country and how they offer warranty? But I'll try it.. I'll buy the best thermal paste I can get near my place.


----------



## Asus11

Quote:


> Originally Posted by *immortalkings*
> 
> sorry for the noob question.. will it void the warranty if open the card? or it depends on what country on how they offer warranty? but will try it.. il buy the best thermal paste i could get near on my place.


I think it voids the warranty on Zotac cards

if it has a sticker, don't open it; if it doesn't, I guess you could and no one would know









also hilpi

I'm about to mod my card. Do I CLU just the 1 shunt or all 3? I'm confused how you have only one done


----------



## Shaded War

Quote:


> Originally Posted by *immortalkings*
> 
> my GPU on Open side panel with running Heaven Benchmark.. it still hits above 75-79c... its still high to a GTX 1080 AMP! Edition AIB card... is my card got an issue or faulty? or it could just be normal on a Zotac card?
> 
> but il try to change my fans and get a new one to change the stock fans.. i don't want to intake the top fan cause it will get all the dirt/dust.. front panel will do the job with a filter on front. il change the H80i 120mm fans as well.. il try to find a better fans that will fit my color theme


I have a 1080 AMP coming in the mail tomorrow so I'll find out. Hopefully it's just the thermal paste and not the way the cooler performs, or I'll be a little bit disappointed. I've seen other GPUs of mine benefit 5°C or better from new paste, so post some results when you get done.


----------



## Hilpi234

This was my 1st try... it worked, but it was not enough... then I added more to all 3 of them, and it worked well, 110% PT without the 55% afterwards (Valley)... yesterday I thought I could try only one... removed it from the middle and the bottom and added more to the top...

Only RS3 is modded... right now.

Still works


----------



## Naked Snake

The shipping on my GTX 1080 is being delayed until Monday *cry* I can't take it anymore









Now back on topic: is it safe to do the shunt mod on the FE?


----------



## Asus11

Quote:


> Originally Posted by *Hilpi234*
> 
> 
> 
> This was my 1st try... worked, but it was not enough... then i added more to all 3 of them, worked well 110% PT without 55% after(Valley)... yesterday i thought, i could only try one... removed it from the middle and the bottom and added more to the top...
> 
> Only RS3 is modded...
> 
> Still works


are you on water? or just air? I think I'm going to do it now then







you did not notice any increase in overclock though, did you? basically the card is just not hitting the power limit? I think this needs to be done to the 1070 more than the 1080 imo. When I had the 1070 it was hitting the 112% PT soooo easily lol. I think it's because the old GDDR5 takes more power

the mod is mainly for the FE model, but people have tried it on custom cards also @snake


----------



## Naked Snake

Quote:


> Originally Posted by *Asus11*
> 
> are you on water? or just air? I think im going to do it now then
> 
> 
> 
> 
> 
> 
> 
> you did not notice any increase in overclock did you? basically card just not hitting the power limit? I think this needs to be done to the 1070 more than 1080 imo when I had the 1070 it was hittin the 112% PT soooo easy lol I think its because old GDDR5 takes more power
> 
> the mod is mainly for the FE model but people have tried on custom also @snake


Ohh I see, I might try it then, since I've switched my FE Zotac for an FE EVGA and I'm pretty sure I've read you don't void the warranty on the EVGA cards. Thanks for the reply


----------



## Hilpi234

I am on water... but you still have the temp throttle, every 10°C... @~50°C the core will switch to 1.081 and @~60°C to 1.093; after that, it will throttle...

 below 25° [email protected]
 above 25° [email protected]
 above 30° [email protected]
 above 40° [email protected]

@~45°C it jumps to 1.075

I posted this 6 days ago: the card switches its voltage at certain temp points/load... to keep the clock stable...

This also seems to be the reason why the curve OC is slower than the Offset...
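The stepped behaviour described above is GPU Boost bumping core voltage at roughly 10°C intervals to hold the same clock. A toy model of the steps, using only the voltages and thresholds reported in this thread (one card's observed behaviour, not a spec):

```python
# Illustrative sketch of the stepped temp->voltage behaviour reported in the
# thread: 1.043 V baseline, jumping to 1.075 @ ~45 C, 1.081 @ ~50 C,
# 1.093 @ ~60 C. These are one poster's observations, not official figures.
import bisect

TEMP_STEPS = [45, 50, 60]                 # degC thresholds observed
VOLTAGES = [1.043, 1.075, 1.081, 1.093]   # V applied below/above each threshold

def observed_voltage(temp_c):
    """Voltage the card was seen applying at a given core temperature."""
    return VOLTAGES[bisect.bisect_right(TEMP_STEPS, temp_c)]

print(observed_voltage(40))  # 1.043
print(observed_voltage(55))  # 1.081
print(observed_voltage(65))  # 1.093
```

This is also why locking a single voltage point in the curve editor can lose performance: the card can no longer step voltage up as it warms, so the clock drops instead.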


----------



## Burke888

Just received one of the new SLI bridges in the mail.
I'm sure others have posted but I have been extremely disappointed with SLI support this time around. I'm not sure if it is a driver issues but many games do not support SLI properly. My fps is so inconsistent and low in Doom that I have to disable SLI. Other game such as ARMA 3 have a similar issue.

I've been an SLI user since the GTX 680 days and have never seen such poor support. I would not encourage anyone to buy two of these, unless you do so knowing that many games will not utilize both cards.
I had hoped that since Nvidia is dropping support for 3-way and 4-way SLI, 2-way SLI support would improve, but that does not seem to be the case. If I had to predict, it seems more likely that Nvidia is slowly working to kill SLI. I can't imagine them wanting to do that, as properly supported SLI would mean a huge increase in GPU sales. Very sad for the hardcore enthusiasts out there.


----------



## ChevChelios

1080Ti will come to save the enthusiasts in about 8-9 months


----------



## kx11

Quote:


> Originally Posted by *Burke888*
> 
> Just received one of the new SLI bridges in the mail.
> I'm sure others have posted but I have been extremely disappointed with SLI support this time around. I'm not sure if it is a driver issues but many games do not support SLI properly. My fps is so inconsistent and low in Doom that I have to disable SLI. Other game such as ARMA 3 have a similar issue.
> 
> I've been an SLI user since the GTX 680 days and have never seen such poor support. I would not encourage anyone to buy two of these, unless you do so knowing that many games will not utilize both cards.
> I had hoped that since Nvidia is dropping support for 3 Way and 4 Way SLI that 2 Way SLI support would improve but that does not seem to be the case. If I had to predict it seems more likely that Nvidia is slowly working to kill SLI. I can't imagine them wanting to do that as it would be a huge increase in GPU sales if SLI was properly supported. Very sad for the hardcore enthusiasts out there.


try this driver
http://forums.guru3d.com/showthread.php?t=408220

and test SLI scaling on a game that supports SLI, like The Witcher 3, The Division, AC Syndicate, etc.


----------



## MrDerrikk

Quote:


> Originally Posted by *immortalkings*
> 
> my GPU on Open side panel with running Heaven Benchmark.. it still hits above 75-79c... its still high to a GTX 1080 AMP! Edition AIB card... is my card got an issue or faulty? or it could just be normal on a Zotac card?
> 
> but il try to change my fans and get a new one to change the stock fans.. i don't want to intake the top fan cause it will get all the dirt/dust.. front panel will do the job with a filter on front. il change the H80i 120mm fans as well.. il try to find a better fans that will fit my color theme


Definitely change the paste, I remember shying away from the Zotac cards because there were so many reviews saying how they'd get rubbish thermals until replacing the paste. If I felt like putting them on water I'd go with one but I want an air cooler to work out of the box myself.


----------



## emett

Thinking about getting a second card. Will 2 way 1080 SLI be fine on PCI-E 2.0 with a 3930k @4.7?


----------



## Jpmboy

Quote:


> Originally Posted by *Hilpi234*
> 
> As I said, it works, to keep the clock stable... and it is only one shunt
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But you can say "Goodbye" to the Curve-Editor
> 
> 
> 
> 
> 
> 
> 
> it is simply bugged crap right now...


the curve editor is working fine here... as you can see in my post above, the voltage is constant @ 1.09V throughout the run, but the power limit is basically always on, which holds the FSE graphics score in the 11.4K range with this card. No temp throttling at all; whether the water is at 25C or 5C, the core never exceeds 32C.









Spoiler: Warning: Spoiler!


----------



## tistou77

Hello

Is anyone getting the Koolance waterblock for the 1080?
I would need the measurements for the IN/OUT ports (different from the 980 (Ti)).

Thanks


----------



## emett

Why does no one answer questions around here any more?


----------



## RGSPro

Finished putting in my EVGA SC 1080s both fitted with EK waterblocks. Strangely they make a "flutter" type noise when they are under full load. Sounds almost like they have a fan going but they are on water.

Anyone else getting this light flutter noise when the card is on full load? I am pretty sure both of mine are doing it.

Coil whine?


----------



## skline00

Quote:


> Originally Posted by *RGSPro*
> 
> Finished putting in my EVGA SC 1080s both fitted with EK waterblocks. Strangely they make a "flutter" type noise when they are under full load. Sounds almost like they have a fan going but they are on water.
> 
> Anyone else getting this light flutter noise when the card is on full load? I am pretty sure both of mine are doing it.
> 
> Coil whine?


RGSPro have you purged all the air bubbles out of your system?

Took me a day or so and then the system really quieted down.

If I was to guess, what you hear are tiny air bubbles still working their way out of the system into the reservoir.


----------



## RGSPro

Quote:


> Originally Posted by *skline00*
> 
> RGSPro have you purged all the air bubbles out of your system?
> 
> Took me a day or so and then the system really quieted down.
> 
> If I was to guess, what you hear are tiny air bubbles still working their way out of the system into the reservoir.


No it's not the air bubbles. It's only when the GPUs are under load. I can turn all my fans/pumps off for a second on my fan controller and it's still making the flutter sound.

I guess it can happen with most new cards...
http://forums.evga.com/RMA-worth-it-for-little-Coil-Whine-m2231622.aspx


----------



## skline00

RGSPro, what kind of pump are you running?


----------



## RGSPro

Quote:


> Originally Posted by *skline00*
> 
> RGSPro, what kind of pump are you running?


I have dual Swiftech MCP655-B pumps on a Lamptron FC5 fan controller. If I turn off the pumps and fans while it's making that flutter, you can still hear it coming right from the cards. Definitely coil whine under load.

I'm going to try it on an EVGA 1300 SuperNOVA PSU tomorrow; I'm currently running it on an AX1200. I guess it's a pretty old PSU, but it's solid.


----------



## skline00

Keep me posted.


----------



## Hilpi234

@Jpmboy

Do an Extreme run or a Normal one, once with offset and once by editing the curve... you will always lose points, but the clock is the same...

Naennon also confirmed this


----------



## CallsignVega

Sister arrived!



HB bridge arrives tomorrow.

This is interesting:


----------



## stanielz

So if I'm gaming at 1440p 144Hz, would I gain something going from the LED bridge to the HB bridge?


----------



## CallsignVega

Quote:


> Originally Posted by *stanielz*
> 
> so if im gaming at 1440p 144 hz, would i gain something going from LED bridge to HB bridge?


----------



## stanielz

Quote:


> Originally Posted by *CallsignVega*


I've seen that chart, but from the early benchmarks I'm seeing of the HB bridge vs running 4K DSR on my LED bridge, I'm losing performance.

With that being said:

1440p @ 144Hz pushes more bandwidth than 4K @ 60Hz, and if I'm losing performance at 4K 60Hz, it stands to reason the same is going on for 1440p @ 144Hz.
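The pixel-rate arithmetic behind that claim does check out, at least as a rough proxy (this ignores AFR transfer overhead and compression, so it's only a ballpark for what the bridge actually carries):

```python
# Rough pixels-per-second comparison: 1440p @ 144Hz vs 4K (UHD) @ 60Hz.
def pixel_rate(width: int, height: int, refresh_hz: int) -> int:
    """Raw pixels per second for a given mode."""
    return width * height * refresh_hz

qhd_144 = pixel_rate(2560, 1440, 144)  # 530,841,600 px/s
uhd_60 = pixel_rate(3840, 2160, 60)    # 497,664,000 px/s

# 1440p144 pushes roughly 7% more pixels per second than 4K60.
print(qhd_144 > uhd_60)  # True
```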


----------



## steponz

Quote:


> Originally Posted by *emett*
> 
> Why does no one answer questions around here any more?


Should run fine, but you will lose some performance.


----------



## CallsignVega

Quote:


> Originally Posted by *stanielz*
> 
> i've seen that chart but from the early benchmarks im seeing of the HB bridge vs running 4k DSR on my LED bridge, im losing performance.
> 
> with that being said
> 
> 1440p @ 144hz pushes more bandwidth than 4k @ 60 hertz, and if im losing performance at 4k 60 hz, it stands to reason the same is going on for 1440p @ 144 hz.


Are you trying to avoid getting the HB bridge? I'd just get it and be done with it no matter the display resolution. Future proof, to an extent.


----------



## stanielz

Quote:


> Originally Posted by *CallsignVega*
> 
> Are you trying to avoid getting the HB bridge? I'd just get it and be done with it no matter the display resolution. Future proof, to an extent.


No, I'm running EK waterblocks and the NVIDIA ones don't physically fit on my setup, so I'm waiting and trying to get informed. I just used my old floppy bridge to benchmark vs my LED bridge.

LED bridge: 55 fps
Old floppy bridge: 71 fps

This is in The Division, ultra, set to 4K DSR.

edit: nvm, I'm getting 71 fps with both now; I'm starting to think maybe my LED bridge just wasn't seated right to begin with.


----------



## stanielz

double post


----------



## pez

Quote:


> Originally Posted by *immortalkings*
> 
> my GPU on Open side panel with running Heaven Benchmark.. it still hits above 75-79c... its still high to a GTX 1080 AMP! Edition AIB card... is my card got an issue or faulty? or it could just be normal on a Zotac card?
> 
> but il try to change my fans and get a new one to change the stock fans.. i don't want to intake the top fan cause it will get all the dirt/dust.. front panel will do the job with a filter on front. il change the H80i 120mm fans as well.. il try to find a better fans that will fit my color theme


Quote:


> Originally Posted by *steponz*
> 
> You try changing paste.. factory paste is usually garbage.


Quote:


> Originally Posted by *Shaded War*
> 
> I have a 1080 amp coming in the mail tomorrow so I'll find out. Hopefully that is just the thermal paste and not the way the cooler performs, or I'll be a little bit disappointed. I'v seen other GPUs of mine benefit 5°C or better from new paste, so post some results when you get done.


This is actually a good idea. I dropped 10C in BF4 from doing this on my old PCS+ 7870 MYST. I've done it to a couple cards before, but this was by far the most extreme variance I'd ever seen from swapping paste.

Quote:


> Originally Posted by *CallsignVega*
> 
> Sister arrived!
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> HB bridge arrives tomorrow.
> 
> This is interesting:


Very nice!

What fans are on your Noctua cooler? Thermalright?


----------



## xKrNMBoYx

I'm waiting for my turn in the queue for EVGA Step-Up. Got an email yesterday saying my Step-Up got a free upgrade from the reference ACX 3.0 to the Superclocked model that is 100MHz faster. Can't wait for my turn.


----------



## emett

Will SLI 1080s be ok on PCI-E 2 8x with a 3930k 4.7???


----------



## BrainSplatter

Quote:


> Originally Posted by *emett*
> 
> Will SLI 1080s be ok on PCI-E 2 8x with a 3930k 4.7???


Depends a lot on the game, but PCI-E 2.0 x8 will certainly impact a number of games. In some cases the performance loss might actually be pretty significant. For example, the new Doom shows a pretty big performance difference between PCI-E 3.0 x8 and x16.
http://forums.guru3d.com/showpost.php?p=5292306&postcount=16
Btw, that poster knows his stuff; he is a major contributor to a big custom SLI bits thread on http://www.forum-3dcenter.org/
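For context, the theoretical per-direction link bandwidths can be worked out from the line rates and encoding overhead (these are effective figures; real-world throughput is lower still):

```python
# Effective PCIe bandwidth per direction.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Effective bandwidth in GB/s (per direction) for a given link."""
    per_lane = {
        2: 5.0 * 8 / 10 / 8,    # 0.5 GB/s per lane
        3: 8.0 * 128 / 130 / 8, # ~0.985 GB/s per lane
    }
    return per_lane[gen] * lanes

print(round(pcie_bandwidth_gbps(2, 8), 2))   # 4.0 GB/s  (PCIe 2.0 x8)
print(round(pcie_bandwidth_gbps(3, 16), 2))  # 15.75 GB/s (PCIe 3.0 x16)
```

So a 2.0 x8 slot has roughly a quarter of the bandwidth of 3.0 x16, which is why the hit shows up only in games that actually stress the bus.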


----------



## emett

Good info thx. I might just run a single card till the 1080ti is out then


----------



## BrainSplatter

I am also trying to wait for the 1080 Ti, lol. Another bad aspect of SLI is that in CPU-bound situations, SLI will lower performance by about 10% (probably due to the additional driver/game management overhead for SLI). That's at least what I found when doing extensive tests in Total War games (those are pretty easily CPU-bottlenecked, even with something like a 5GHz i7).

CPU-bottlenecked games are not very common (Arma 3 is another example, or Project CARS in the rain with a lot of opponents, I think), but it is something else to keep in mind for SLI.

I also hope that DX12 will help with that problem due to the better multi-core usage and massively increased draw-call performance between CPU and GPU. But at the moment most DX12 games actually have really bad SLI scaling, if any at all (e.g. Rise of the Tomb Raider, Hitman).


----------



## Eorzean

Does anyone know the difference between the MSI Gaming and Gaming X? Is the Gaming X just factory OCed?


----------



## toncij

Just got my MSI 1080 Gaming X - at 2109MHz, compared to a Titan X at 1465MHz, the gains go from 0% (BF4 5K) to 9%.

I think I might just return it for a refund...


----------



## pez

Quote:


> Originally Posted by *BrainSplatter*
> 
> Depends a lot on the game but PCI-E 2.0 x8 will certainly impact a number of games. In some cases the performance loss might be pretty strong actually. For example the new Doom shows pretty big performance difference between PCI-E 3.0 x8 and x16.
> http://forums.guru3d.com/showpost.php?p=5292306&postcount=16
> Btw, that poster knows his stuff, he is a major contributor for a big custom SLI bit thread on http://www.forum-3dcenter.org/


Quote:


> Originally Posted by *BrainSplatter*
> 
> I am also trying to wait for the 1080TI, lol. Another bad aspect of SLI is that in CPU bound situations, SLI will lower performance by about 10% (probably due to the additional driver/game management overhead for SLI). That's at least what I found when doing extensive tests in Total War games (those are pretty easily CPU bottlenecked, even with sth like a 5Ghz I7 CPU).
> 
> CPU bottlenecked games are not very common (Arma 3 is another example or Project Cars in the rain with a lot of opponents I think) but just sth. else to keep in mind for SLI.
> 
> I also hope that DX12 will help with that problem due to the better multi-core usage and massively increased draw-call performance between CPU and GPU. But at the moment most DX12 games actually have really bad SLI scaling if at all (e.g. Rise of the Tomb Raider, Hitman)


Are there links to testing for this? Doom capped at 60 FPS doesn't even utilize a single 1080 over 60%. I find it hard to believe that bandwidth limitations are necessarily the reason why 1080s in SLI on x8/x8 PCI-e 3.0 are seeing a performance hit. Then again, I haven't seen any results that really show this in the first place. I'm not trying to discredit the guy or you, but I would just like to see conclusive results.


----------



## BrainSplatter

Quote:


> Originally Posted by *pez*
> 
> Is there links to testing for this? Doom capped at 60 FPS doesn't struggles to even utilize a single 1080 over 60%. I find it hard to believe that bandwidth limitations are necessarily the reason why 1080s in SLI on x8/x8 PCI-e 3.0 are seeing a performance hit. Then again, I haven't seen any results that really show this in the first place. I'm not trying to discredit the guy or you, but I would just like to see conclusive results.


The main thread is primarily in German (you can use Google Translate) but also has screenshots:
http://www.forum-3dcenter.org/vbulletin/showthread.php?p=11034642#post11034642

Also note latest results regarding the HB-bridge which is a similar issue:


----------



## emett

Quote:


> Originally Posted by *toncij*
> 
> Just got my 1080 MSI GamingX - at 2109MHz compared to TitanX at 1465MHz and gains go from 0% (BF4 5K) to 9%.
> 
> I think I might just return it for refund...


You're playing 5K BF4 with a single Titan X?


----------



## Asus11

Quote:


> Originally Posted by *emett*
> 
> You're playing 5k BF with a single Titan x?


this is what got me too









Yeah, maybe the Titan X runs close to the 1080 in some games, but in other games the 1080 will destroy it..

Coming from a guy who recently had a Titan X under water @ 1470/8000, a Kingpin @ 1500, a 1070 @ 2176, and now a 1080 @ 2152 lol.

The Kingpin/Titan X is faster in BF4 than the overclocked 1070, but a 1500MHz Kingpin exactly matched an overclocked 1070 in Project CARS.

Plus overall my loop runs cooler; the 1080, on the other hand, is faster in BF4, albeit by a little, but I need all the extra performance @ 3440.

The Titan X would get very hot with a flashed BIOS and unlimited PL compared to this 1080, and considering I'm only running GPU/CPU off a single 240mm rad, I need the cooler stuff.


----------



## pez

Quote:


> Originally Posted by *BrainSplatter*
> 
> The main thread is primarily in German (u can use googl translate) but also has screenshots:
> http://www.forum-3dcenter.org/vbulletin/showthread.php?p=11034642#post11034642
> 
> Also note latest results regarding the HB-bridge which is a similar issue:


Still reading through the thread. The reason I'm so skeptical is based on the review below:
https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/10.html

Granted, the GTX 980 is severely outperformed by the GTX 1080 now, but we're seeing a very small difference there, and that difference doesn't become significant until the slot in use is PCI-e 1.1 x8. Unfortunately it doesn't seem possible to measure how much bandwidth the GPU is actually trying to pull from the lanes. The issues seem more related to not-so-great scaling off the bat.


----------



## BrainSplatter

Quote:


> Originally Posted by *pez*
> 
> The reason I'm so skeptical is based on the review below:https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/10.html


I know. It seems very game-specific. In the same review you can see that for Ryse the difference is already more like 10%:
https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/15.html

And it's also similar to the HB-bridge video, where many games show minimal differences and some show dramatic ones.


----------



## pez

Quote:


> Originally Posted by *BrainSplatter*
> 
> I know. It seems very game specific. In the same thread u can see that for Ryse the difference is already more like 10%:
> https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/15.html
> 
> And it's also similar to the HB-bridge video where many games show minimal differences and some show dramatic differences.


Agreed.

I found this video as well:





I'm pretty happy with how he did the comparison, considering he did the best he could to loop the same portion of each game for the tests, so I think it's a really good representative. He is seeing quite a difference in some games, and not so much in others. I definitely hope to see someone do a similar test with the GTX 1080s soon. I plan on going SLI, and I think I'd be ok with the performance hit until something worthwhile comes along. Thank you for posting links and for the gentlemanly discussion!


----------



## bastian

Quote:


> Originally Posted by *Eorzean*
> 
> Does anyone know the difference between the MSI Gaming and Gaming X? Is the Gaming X just factory OCed?


X is factory overclocked, Non-X is not. Both are the same otherwise. Might as well get the Non-X as it is slightly cheaper.
Quote:


> Originally Posted by *toncij*
> 
> Just got my 1080 MSI GamingX - at 2109MHz compared to TitanX at 1465MHz and gains go from 0% (BF4 5K) to 9%.
> 
> I think I might just return it for refund...


Not every game/application benefits much from a 1080 over a 980 Ti/Titan X.


----------



## Jpmboy

Quote:


> Originally Posted by *emett*
> 
> Will SLI 1080s be ok on PCI-E 2 8x with a 3930k 4.7???


yes... but 2.0 x8 will affect concurrent bandwidth across the PCIe bus; you'll lose a few FPS compared to 3.0 x16. The high-bandwidth bridge can compensate some, but not much. Frankly, I doubt you would notice a difference gaming at 150FPS vs 130FPS.








Quote:


> Originally Posted by *toncij*
> 
> Just got my 1080 MSI GamingX - at 2109MHz compared to TitanX at 1465MHz and gains go from 0% (BF4 5K) to 9%.
> 
> I think I might just return it for refund...


at that resolution the video ram is in play. If you DO game at 5K with one card, wait for the full die (P100), not the Ti.


----------



## toncij

Quote:


> Originally Posted by *Asus11*
> 
> this is what got me too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeh maybe the titan X runs close to the 1080 but other games the 1080 will destory it..
> 
> coming from a guy who recently had a titan x underwater @ 1470/8000 a kingpin @ 1500 1070 @ 2176 and now a 1080 @ 2152 lol
> 
> the kingpin/ titan x is faster in BF4 than the 1070 overclocked but in games like projects cars a 1500mhz kingpin matched exactly a 1070 overclocked in project cars
> 
> plus overall my loop runs cooler, the 1080 on the other hand is faster albeit by a little in bf4 but need all the extra performance @ 3440
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the titan x would get very hot with a flashed bios and unlimited PL compared to this 1080, considering im only running GPU/CPU off a single 240mm I need the cooler stuff


Well, at Auto (medium) it runs at 60 FPS and above. At Ultra, 1440p runs at 75-90; at High it runs at 90-110. Moving from 1280MHz to 1470MHz got me a small boost, to 95-100 and 100-130 max.
The 1080 @ 2.1ish (2050-2120) gives me an additional small boost, moving it to 110-120 and 120-145. Most other games see a similar effect. Seems like the 1080 really adds nothing significant.

I'm talking about the MSI 1080...


----------



## toncij

Quote:


> Originally Posted by *bastian*
> 
> X is factory overclocked, Non-X is not. Both are the same otherwise. Might as well get the Non-X as it is slightly cheaper.
> Not every game/application benefits much from a 1080 over a 980 Ti/Titan X.


Quote:


> Originally Posted by *Jpmboy*
> 
> yes... but 2.0 x8 will affect concurrent bandwidth across the PCIe bus; you'll lose a few FPS compared to 3.0 x16. The high-bandwidth bridge can compensate some, but not much. Frankly, I doubt you would notice a difference gaming at 150FPS vs 130FPS.
> 
> 
> 
> 
> 
> 
> 
> 
> at that resolution the video ram is in play. If you DO game at 5K with one card, wait for the full die (P100), not the Ti.


Yes, I know. But so many benchmarks in reviews show much better results and more gain, so I really hoped we'd get at least some boost there.

Too bad... now I'm looking to maybe try whether going FE and AIO would give me some ability to go past 2-2.1GHz...


----------



## Jpmboy

Quote:


> Originally Posted by *toncij*
> 
> Yes, I know. But so many benchmarks in reviews show much better results and more gain that I really hoped we'd get at least some boost there.
> 
> Too bad... now looking to maybe try if going *FE and AIO would give me some ability to go past 2-2.1GH*z...


not until the power ceiling is removed.


----------



## RGSPro

Quote:


> Originally Posted by *skline00*
> 
> Keep me posted.


https://goo.gl/photos/L4Byh26rLE1vSEDS8

Here is the "flutter" sound of the coil whine while running Heaven. Fans and pumps are off, so that's all the 1080s. The beeping at the end is me bringing the pumps back on, with a warning that they haven't spun up yet.


----------



## nexxusty

Quote:


> Originally Posted by *chronicfx*
> 
> We will have to see benchmarks on that. They claim that the HB bridge runs at 650MHz and the older bridges at 450MHz, and both use dual fingers. So we will see.


I was wondering if 2 bridges had the same effect. Not surprised if they do.


----------



## nexxusty

Quote:


> Originally Posted by *emett*
> 
> Will SLI 1080s be ok on PCI-E 2 8x with a 3930k 4.7???


3930Ks have 40 PCIe lanes....

You will get full x16 links on both cards. Also... my 3930K ran PCIe 3.0 just fine. Try the tool from Nvidia to force PCIe 3.0 on X79 boards.

Test without overclocks; some chips cannot do it.

In the end, you should have two GPUs with x16 3.0 links. Hopefully.


----------



## ValSidalv21

Hey guys, I need some advice.

A local shop just got the FE and G1 in stock and I'm not sure which one to get. The G1 is some 20$ cheaper, but it's also really ugly. I prefer the looks of the reference design.

So my questions are... is the FE as bad as most people say when it comes to noise and thermal performance vs aftermarket cards like the G1? What about throttling? Can one keep a stable 2000+ MHz at acceptable noise levels?

I kind of like the STRIX, but it's not available yet and I'm not that good at waiting







Also, it will most likely be more expensive than these two (maybe even 50$ above the FE), so I'm not sure it's worth it.


----------



## nexxusty

Quote:


> Originally Posted by *ValSidalv21*
> 
> Hey guys, I need some advise.
> 
> A local shop just got the FE and G1 in stock and I'm not sure which one to get. The G1 is some 20$ cheaper, but it's also really ugly. I prefer the looks of the reference design.
> 
> So my questions are... is the FE that bad as most people say when it comes to noise and thermal performance vs the aftermarket cards like the G1? What about throttling? Can one keep a stable 2000+ MHz at acceptable noise levels?
> 
> I kind of like the STRIX, but it's not available yet and I'm not that good at waiting
> 
> 
> 
> 
> 
> 
> 
> Also, most likely will be more expensive than these two (maybe even 50$ above the FE) so not sure it's worth it.


I have an FE and it does 2050MHz stable (no clock fluctuations) at 80% fan. Haven't tested lower fan speeds.

Not too loud for me; keeps the card around 65-70C loaded.

I went with the FE because the difference this time around is nil. Maybe 50MHz or so, it seems.

Hacked BIOSes will be released for the FE first as well; I'm positive of that.


----------



## sherlock

Quote:


> Originally Posted by *ValSidalv21*
> 
> Hey guys, I need some advise.
> 
> A local shop just got the FE and G1 in stock and I'm not sure which one to get. The G1 is some 20$ cheaper, but it's also really ugly. I prefer the looks of the reference design.
> 
> So my questions are... is the FE that bad as most people say when it comes to noise and thermal performance vs the aftermarket cards like the G1? What about throttling? Can one keep a stable 2000+ MHz at acceptable noise levels?
> 
> I kind of like the STRIX, but it's not available yet and I'm not that good at waiting
> 
> 
> 
> 
> 
> 
> 
> Also, most likely will be more expensive than these two (maybe even 50$ above the FE) so not sure it's worth it.


If you don't play with headphones on, don't get an FE, as you might need 100% fan to keep it below 75C at max OC (like mine). Get the G1 and don't bother looking at the card after you install it.


----------



## nexxusty

75C at 100% fan? Your system isn't being cooled well. Or so it would seem, anyway.

As I said, I do 80% fan at 70C max. My case airflow is perfect, or as perfect as one could have it anyway.

I'd look at your airflow... 100% fan for 75C doesn't seem right... 100% fan puts me down in the low 60s.

Could it be where you live? Ambient temps? My PC shop never goes over 74F... it's temperature controlled.


----------



## ValSidalv21

Quote:


> Originally Posted by *sherlock*
> 
> If you don't play with headphone on don't get a FE, as you might need 100% fan to keep it below 75C at max OC(like mine). Get the G1 and don't bother looking at the card after you install it.


Damn, that's bad. I was thinking a custom profile with 80% max fan speed would be sufficient and not too loud, but I guess not.

About not looking at the G1 card... well, I have the Air 540 right next to me with its huge window panel; it's impossible not to look








Quote:


> Originally Posted by *nexxusty*
> 
> 75c at 100% fan? Your system isint being cooled well. Or so it would seem anyway.
> 
> As I said, I do 80% fan at 70c MAX. My case airflow is perfect, as perfect as one could have it anyway.
> 
> I'd look at your airflow.... 100% fan for 75c doesn't seem right... 100% fan puts me down in the low 60's.
> 
> Could it be where you live? Ambient temps? My PC shop never goes over 74f... it's temperature controlled.


Ok, this is more encouraging. My room ambient is 25-26C at worst, and case temps are 1-2C on top of that.


----------



## ChevChelios

G1 >>> FE


----------



## ValSidalv21

Quote:


> Originally Posted by *ChevChelios*
> 
> G1 >>> FE


I know man, but I would rather avoid the G1 if there's not a huge difference in performance/noise between the two.


----------



## nexxusty

Quote:


> Originally Posted by *ValSidalv21*
> 
> Damn, that's bad. I was thinking a custom profile with 80% max fan speed would be sufficient and not to loud, but guess not.
> 
> About not looking at the G1 card... well I have the Air 540 right next to me with it's huge window panel, It's impossible not to look
> 
> 
> 
> 
> 
> 
> 
> 
> Ok, this is more encouraging. My room ambient is 25-26c at worst, and case temps are 1-2c on top of that.


So that would put your case at around 82F. Mine is not far off, at around 76-77F.

I'd expect you to load around 72-73C max at 80% fan; in that area, definitely not much more.
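A quick sanity check of the Celsius/Fahrenheit numbers being thrown around here:

```python
# Celsius to Fahrenheit, to sanity-check the ambient/case figures above.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

# A 26C room plus a ~2C case delta lands right around the quoted 82F.
print(round(c_to_f(28), 1))  # 82.4
# And 25C is 77F, close to the "never over 74F" temperature-controlled shop.
print(c_to_f(25))  # 77.0
```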


----------



## CallsignVega

Quote:


> Originally Posted by *pez*
> 
> This is actually a good idea. I dropped 10C in BF4 from doing this on my old PCS+ 7870 MYST. I've done it to a couple cards before, but this was by far the most extreme variance I'd ever seen from swapping paste.
> Very nice!
> 
> What fans are on your Noctua cooler? Thermalright?


They are Noctua fans:

http://noctua.at/en/nf-p14r-redux-1500-pwm


----------



## sherlock

Quote:


> Originally Posted by *nexxusty*
> 
> 75c at 100% fan? Your system isint being cooled well. Or so it would seem anyway.
> 
> As I said, I do 80% fan at 70c MAX. My case airflow is perfect, as perfect as one could have it anyway.
> 
> I'd look at your airflow.... 100% fan for 75c doesn't seem right... 100% fan puts me down in the low 60's.
> 
> Could it be where you live? Ambient temps? My PC shop never goes over 74f... it's temperature controlled.


There are certain batches of FE cards that have badly applied TIM and run really hot even at high fan % (I did have both 120% power and 100% voltage in MSI AB); mine is most likely one of those.


----------



## nexxusty

Quote:


> Originally Posted by *sherlock*
> 
> There are certain batches of FE that have badly applied TIM that run really hot even at high % fan(I did have both 120% power and 100% voltage in MSI AB), mine is most likely one of those.


SSDD. I'm not surprised. With cards going down the line, it stands to reason a couple of them will have too much TIM applied (because we all know how liberal they are with TIM application!!).

I was thinking of replacing the TIM on my FE... however, now I'm thinking it won't do much, if anything.


----------



## Jpmboy

Quote:


> Originally Posted by *nexxusty*
> 
> SSDD. I'm not surprised. Cards going down the line, it stands to reason a couple of them will have too much TIM applied (Because we all know how liberal they are with TIM application!!).
> 
> I was thinking of replacing the TIM on my FE... however now I'm thinking it won't do much if anything.


If the TIM is poorly applied, replacing it will lower temps a lot. If not, you will probably still be using better TIM than the OEM paste.


----------



## kx11

I have to say that even the 1080 can't handle Quantum Break @ 1440p


----------



## pez

Quote:


> Originally Posted by *CallsignVega*
> 
> They are Noctua fans:
> 
> http://noctua.at/en/nf-p14r-redux-1500-pwm


Nice! Thanks for the link. I'll have to look into those. Can't wait for the Chromax covers to come out, either.


----------



## ChevChelios

QB is broken on a quantum level









thankfully Forza 6 Apex seems to run just fine even on 970-980 level cards, so I expect 1080 will absolutely smash Forza Horizon 3 @ 1440p ...


----------



## emett

Quote:


> Originally Posted by *nexxusty*
> 
> 3930k's have 40 PCIE Lanes....
> 
> You will get full 16x links on both cards. Also.... my 3930k ran PCIE 3.0 just fine. Try the tool from Nvidia to set PCIE to 3.0 on X79 boards.
> 
> Test without overclocks. Some chips cannot do it.
> 
> In the end, you should have 2 GPU's with 16x 3.0 links. Hopefully.




AWESOME! Totally forgot that with the 3930K they'll run at PCI-E 2.0 16x. Just tested with an old Titan, as I'd never taken notice before, and of course you are right. THANK YOU!
Also when I had SLI Titans the Nvidia PCI-E 3 hack worked, but it doesn't seem to be working for my single 1080 atm.

W00t, will order my second G1 today.
There's life in this 3930K yet!!
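For anyone weighing the x16 2.0 vs 3.0 question, the spec line rates put rough numbers on it. A quick sketch of the per-direction usable bandwidth math (spec values for the encodings; this is just illustration, not a measurement):

```python
# Rough usable per-direction PCIe link bandwidth, from spec line rates.
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding (80% efficiency).
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding (~98.5% efficiency).

def pcie_usable_gb_per_s(gen, lanes):
    """Usable bandwidth in GB/s (decimal), one direction."""
    if gen == 2:
        return 5.0 * (8 / 10) / 8 * lanes    # 0.5 GB/s per lane
    if gen == 3:
        return 8.0 * (128 / 130) / 8 * lanes # ~0.985 GB/s per lane
    raise ValueError("only gen 2 and 3 handled here")

print(pcie_usable_gb_per_s(2, 16))  # x16 @ 2.0 -> 8.0 GB/s
print(pcie_usable_gb_per_s(3, 16))  # x16 @ 3.0 -> ~15.75 GB/s
```

So x16 2.0 is roughly half of x16 3.0 per direction, which is why the 3.0 hack matters for bandwidth-heavy SLI setups.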


----------



## nexxusty

Quote:


> Originally Posted by *emett*
> 
> AWESOME! Totally forgot that with the 3930K they'll run at PCI-E 2.0 16x. Just tested with an old Titan, as I'd never taken notice before, and of course you are right. THANK YOU!
> Also when I had SLI Titans the Nvidia PCI-E 3 hack worked, but it doesn't seem to be working for my single 1080 atm.
> 
> W00t, will order my second G1 today.
> There's life in this 3930K yet!!


Np man, glad I could jog the memory. Hehe.

Nvidia might have disabled the tool in the latest drivers. Not sure. I'd definitely ask about that.

3930k's are still beefy CPU's. TONS of life left in X79.


----------



## emett

Quote:


> Originally Posted by *nexxusty*
> 
> 3930k's are still beefy CPU's. TONS of life left in X79.


Yeah true, i'm running 4.7 24/7 on stock volts haha. Golden CPU


----------



## CapKrunch

Hello folks,

I have been waiting for the EVGA GTX 1080 FTW to be available, and I just went to Best Buy to buy a new Xbox One controller for my PC cuz my old one broke. I just happened to see one GeForce GTX 1080 FE for 699 dollars at Best Buy and couldn't believe they had one.

Now, I'm getting a nice fat paycheck tomorrow and I'm not sure if I should forget about EVGA and buy that GeForce card from Best Buy?

Right now, I have a 970 in my system and I've always wanted a 1080 to set everything at max at 1440p and be happy.

What do yall think?


----------



## MrDerrikk

Quote:


> Originally Posted by *CapKrunch*
> 
> Hello folks,
> 
> I have been waiting for the EVGA GTX 1080 FTW to be available, and I just went to Best Buy to buy a new Xbox One controller for my PC cuz my old one broke. I just happened to see one GeForce GTX 1080 FE for 699 dollars at Best Buy and couldn't believe they had one.
> 
> Now, I'm getting a nice fat paycheck tomorrow and I'm not sure if I should forget about EVGA and buy that GeForce card from Best Buy?
> 
> Right now, I have a 970 in my system and I've always wanted a 1080 to set everything at max at 1440p and be happy.
> 
> What do yall think?


You're in the exact same position I'm in, except I'm looking for surround support instead. The FTW looks cheaper (at least I got it cheaper) and has more goodies, but I guess it comes down to how much patience you have. Something else to keep in mind is that the FTW has had bad reviews as far as overclocking goes; however, I'm not sure how much of that is reviewer incompetence versus drawing a bad card in the silicon lottery.


----------



## ChevChelios

dont buy FE

get FTW or another custom card

patience will be rewarded


----------



## CapKrunch

Waiting is something I'm not good at, but y'all are right. I'll be patient and wait for some goodies.

Thanks guys


----------



## ChevChelios

I am still waiting for my G1 myself

FEs are aplenty, but I keep on waiting


----------



## Shaded War

Quote:


> Originally Posted by *MrDerrikk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CapKrunch*
> 
> Hello folks,
> 
> I have been waiting for the EVGA GTX 1080 FTW to be available, and I just went to Best Buy to buy a new Xbox One controller for my PC cuz my old one broke. I just happened to see one GeForce GTX 1080 FE for 699 dollars at Best Buy and couldn't believe they had one.
> 
> Now, I'm getting a nice fat paycheck tomorrow and I'm not sure if I should forget about EVGA and buy that GeForce card from Best Buy?
> 
> Right now, I have a 970 in my system and I've always wanted a 1080 to set everything at max at 1440p and be happy.
> 
> What do yall think?
> 
> 
> 
> You're in the exact same position I'm in, except I'm looking for surround support instead. The FTW looks cheaper (at least I got it cheaper) and has more goodies, but I guess it comes down to how much patience you have. Something else to keep in mind is the FTW has had bad reviews so far as overclocking goes, however I'm not sure how much of that is incompetence on the part of reviewers or drawing a bad card in the silicon lottery.
Click to expand...

I ended up snagging a Zotac 1080 AMP for $640 and just installed it today. Much better load noise levels and temps than an FE while being cheaper, so just hang in there for anything besides an FE to come in stock. OC to 2100 / 11000 no problem and near-silent gaming.

I'm also coming off a single GTX 970 for surround displays, and the 1080 is running BF4 Ultra 4xMSAA at 60fps all day long. One of my favorite graphics cards to come out since the AMD 7970.


----------



## MrDerrikk

Quote:


> Originally Posted by *Shaded War*
> 
> I ended up snagging a Zotac 1080 AMP for $640 and just installed it today. Much better load noise levels and temps than a FE while being cheaper, so just hold in there for anything besides FE to come in stock. OC to 2100 / 11000 no problem and near silent gaming.
> 
> I'm also coming off a single GTX 970 for surround displays, and the 1080 is running BF4 Ultra 4xmsaa at 60fps all day long. One of my favorite graphics cards to come out since the AMD 7970.


How's the Zotac going temps-wise compared to other AIB cards? I've always been wary of them due to reports of sub-standard TIM being used on the cards and of sag.


----------



## ChevChelios

Just from the visuals I'd be wary of the 1080 AMP Extreme sagging too .. that thing is big as a house

but the regular 1080 AMP looks fine

dunno about temps


----------



## Shaded War

Quote:


> Originally Posted by *MrDerrikk*
> 
> How's the Zotac going temps-wise compared to other AIB cards? I've always been wary of them due to reports of sub-standard TIM being used on the cards and of sag.


I haven't adjusted any voltages or fan curves; I just bumped up the TDP and did +150 core, +490 memory and called it done. I'm about to play some BF4 and I'll leave GPU-Z running in the background to record temps and clock speeds during a full multiplayer round.


----------



## Shaded War

I played 30 minutes of BF4 multiplayer on the Zotac 1080 AMP and the average temp was 72°C with an average 2060MHz core clock on my quick, unrefined overclock.
I know for sure the card goes over 2100, but I haven't tested the overclocking potential in depth. Changing the TIM would probably lower temps some, but they are good enough that I don't feel the need to.

LINK to a 3DMark run I made.

With vsync enabled in games, I have to take the side panel off my case and hold my head near the GPU to hear the fans or coil whine. It makes less noise than my Noctua NF-F12 fans with the low-noise resistor adapter and the fan controller turned down. When you hit super high FPS with vsync off, the coil whine gets loud, but it doesn't sound like normal whine; it's similar to the sound of a low-rpm HDD spinning up, only louder. The fans themselves are nearly silent, though. The coil whine is a little disappointing, but unless you're at something like 200FPS it's still quieter than my Gigabyte 970 G1, which had no whine.

The unit itself is rather wide and heavy, but it doesn't sag. The backplate feels like it's made of either carbon fiber or fiberglass, and it holds everything together nicely. The fan shroud is made of metal with accents in the same carbon-look material, finished with a carbon fiber hydro dip.

The only other thing worth mentioning is that I can't figure out how to change the RGB lights. They are listed in the options in the Zotac FireStorm overclocking tool, but they're greyed out and don't do anything. I tried to disable them with the GeForce Experience program, but that doesn't do anything either.


----------



## MrDerrikk

Sounds like Zotac have hit upon another winner then, might take one of those if my FTW doesn't show up soon.


----------



## LBear

Been checking all week for the AMP Extreme since the listing went up on Newegg/Amazon. Really hope this card arrives soon or I might end up going EVGA.


----------



## Crazy G

Quote:


> Originally Posted by *Shaded War*
> 
> LINK to a 3DMark run I made.


Man, I've got the same results with a Titan X @ 1460/7900/1.218v! Just ordered a G1 arriving Tuesday. Seems that I'll return it, sadly.


----------



## toncij

Quote:


> Originally Posted by *Crazy G*
> 
> Man, I've got the same results with a Titan X @ 1460/7900/1.218v! Just ordered a G1 arriving Tuesday. Seems that I'll return it, sadly.


I'd have thought you couldn't possibly get 23535 with a Titan X at 1460. Can you show us? :O


----------



## Crazy G

I've got over 22k in 3DMark 11, but in Fire Strike that's pretty much the max I've got.


----------



## toncij

Quote:


> Originally Posted by *Crazy G*
> 
> Man, I've got the same results with a Titan X @ 1460/7900/1.218v! Just ordered a G1 arriving Tuesday. Seems that I'll return it, sadly.


I'd have thought you couldn't possibly get 23535 with a Titan X at 1460. Can you show us?
Quote:


> Originally Posted by *Crazy G*
> 
> I get over 22k in 3DMark 11, but in Fire Strike that's pretty much the max I've got.


I can't get past 20k in Fire Strike - 23k sounds impossible? Link?


----------



## Crazy G

There's some misunderstanding. I meant these FS results: http://www.3dmark.com/3dm/12648136?


----------



## toncij

Quote:


> Originally Posted by *Crazy G*
> 
> Man, I've got the same results with a Titan X @ 1460/7900/1.218v! Just ordered a G1 arriving Tuesday. Seems that I'll return it, sadly.


Quote:


> Originally Posted by *Crazy G*
> 
> There are some misunderstanding. I mentioned this FS results http://www.3dmark.com/3dm/12648136?


So, did you get same results with a TX at 1460 or not?


----------



## Crazy G

Yeap

http://www.3dmark.com/fs/5327694


----------



## Kielon

Quote:


> Originally Posted by *Crazy G*
> 
> Yeap
> 
> http://www.3dmark.com/fs/5327694


The air-cooled EVGA 1080 SC is like 16% faster when it comes to graphics score...
http://www.3dmark.com/compare/fs/5327694/fs/8917602


----------



## Crazy G

You have a top multi-threaded CPU and quite a high OC on your 1080. Perhaps I'll keep the arriving G1. Like every new GPU generation, they toss us an average 15% performance bump to milk our $ until next year brings another 15% increase and more milking... it sucks. They have technology years ahead that could be used today, but the milking is what they really want.....


----------



## Kielon

GPU clocks reported by 3DMark may be misleading, as real clocks are way lower due to power throttling. A custom BIOS should fix it, though.
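One way to see the throttling is to log the clock during a run (GPU-Z or similar) and compare the average to the peak; a tiny sketch with made-up sample values, just to illustrate the point:

```python
# Compare the peak clock (what a benchmark headline shows) against the
# average sustained clock across logged samples during a run.
# Sample values below are made up for illustration.

samples_mhz = [2088, 2076, 2050, 2012, 1987, 1999, 2012, 2025]

peak = max(samples_mhz)
sustained = sum(samples_mhz) / len(samples_mhz)
print(f"peak {peak} MHz, sustained avg {sustained:.0f} MHz")
```

The gap between the two numbers is the power/thermal throttle at work.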


----------



## Clockster

My Gigabyte GTX 1080 Xtreme premium pack is on the way.
Should be here in the next 2 hours. Will post some pics for you guys.


----------



## CallsignVega

Quote:


> Originally Posted by *Clockster*
> 
> My Gigabyte GTX1080 Extreme premium pack is on the way.
> Should be here in the next 2 hours. Will post some picks for you guys.


Make sure to hand spin the fans to see if they hit the shroud before install.


----------



## Clockster

Quote:


> Originally Posted by *CallsignVega*
> 
> Make sure to hand spin the fans to see if they hit the shroud before install.


Checked it and it's perfect.

Gorgeous thing this


----------



## PasK1234Xw

Just got my HB SLI bridge and, shocker, I see improvements, unlike what others have claimed. Firestrike went up too.

before single bridge

stock 38k

2Ghz 40k
http://www.3dmark.com/3dm/12298819

After HB Bridge

stock
http://www.3dmark.com/3dm/12309510

2Ghz
http://www.3dmark.com/3dm/12661311


----------



## toncij

One question, not trolling: how many games actually work with SLI lately?


----------



## Spiriva

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Just got my HB SLI bridge and shocker i see improvements unlike what others have claimed also firestrike went up
> 
> before single bridge
> 
> stock 38k
> 
> 2Ghz 40k
> http://www.3dmark.com/3dm/12298819
> 
> After HB Bridge
> 
> stock
> http://www.3dmark.com/3dm/12309510
> 
> 2Ghz
> http://www.3dmark.com/3dm/12661311


Nice gain with the HB bridge. I just placed an order for the EVGA HB bridge; hope I'll see a gain too.
So close to 30k, just a little bit more OC and you've got it!


----------



## kx11

Anyone who's got Rise of the Tomb Raider: turn on VXAO and try to launch the game.

It crashes for me every time.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *kx11*
> 
> anyone whos got Rise of the Tomb Raider , turn on VXAO and try to launch the game
> 
> it crashes with me all the time


I'll try when I'm home; I'm away until tomorrow.


----------



## AllGamer

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Just got my HB SLI bridge and shocker i see improvements unlike what others have claimed also firestrike went up
> 
> before single bridge
> 
> stock 38k
> 
> 2Ghz 40k
> http://www.3dmark.com/3dm/12298819
> 
> After HB Bridge
> 
> stock
> http://www.3dmark.com/3dm/12309510
> 
> 2Ghz
> http://www.3dmark.com/3dm/12661311


Very nice!









with that confirmation, then the HB bridge and a 2nd GTX1080 is worth it


----------



## Spikeyjohnson

Quote:


> Originally Posted by *kx11*
> 
> anyone whos got Rise of the Tomb Raider , turn on VXAO and try to launch the game
> 
> it crashes with me all the time


I had this same issue. I looked into it, and can't remember where I found the answer, but if you go down one setting from VXAO, it will run. From what I read, it seems to be a driver issue that is being ironed out by the developer with the help of Nvidia.


----------



## kx11

Quote:


> Originally Posted by *Spikeyjohnson*
> 
> I had this same issue. I looked it, and can't remember where I found the answer, but if you go down 1 setting from VXAO, it will run. From what I read, it seems to be a driver issue that is being ironed out by the Developer with the help of Nvidia?


If you go down from VXAO then HBAO+ is on. I want VXAO, since Nvidia promoted the crap out of it.


----------



## nexxusty

Quote:


> Originally Posted by *ChevChelios*
> 
> I am still waiting for my G1 myself
> 
> FEs are aplenty, but I keep on waiting


Lol, for what? 50-100MHz?

Complete waste of time IMO. The only reason not to buy an FE is to buy a non-FE ref card and save $100.


----------



## sherlock

Quote:


> Originally Posted by *nexxusty*
> 
> Lol for what? 50-100mhz?
> 
> Complete waste of time IMO. Only reason not to buy an FE is to buy a Non FE ref card and save $100.


There are people wanting a quieter and cooler card than the FE; clocks aren't really that important for the 1080 at this point.

My 1080 Xtreme Gaming just came in; quite nice to have 60-65C temps on a sub-50% silent fan. The 1080 FE I had went for $800 + shipping on eBay, so I pretty much got all my money back on that anyway.



The middle fan doesn't touch the X and doesn't buzz as a result, a relief after reading about all the QC issues this card has had online.


----------



## nexxusty

Quote:


> Originally Posted by *sherlock*
> 
> There are people wanting a quieter and cooler card than FE, clocks aren't really that important for 1080 at this point.
> 
> My 1080 Xtreme gaming just came in, quite nice to have 60-65 temp on sub 50% silentfan. The 1080 FE I had went for $800+shipping on ebay so I pretty much got all my money back on that anyway.


Good point.

Didn't even think about that.

You enjoying your Xtreme Gaming?


----------



## sherlock

Quote:


> Originally Posted by *nexxusty*
> 
> Good point.
> 
> Didn't even think about that.
> 
> You enjoying your Xtreme Gaming?


So far so good: 1974MHz / 60-65C in 3DMark and Heaven out of the box. Got the whole weekend to tweak it with Afterburner. If it doesn't work out I can always return it (a few bent heatsink pins on the right end) or eBay it after seeing how the 1080 FTW that comes in next Wednesday does; that's the advantage of being a 1080 owner in this market.


----------



## ChevChelios

Quote:


> Originally Posted by *nexxusty*
> 
> *Lol for what*? 50-100mhz?
> 
> Complete waste of time IMO. Only reason not to buy an FE is to buy a Non FE ref card and save $100.


............ the cooler ..............

which has been the major reason to buy non-ref only since the dawn of time


----------



## nexxusty

Quote:


> Originally Posted by *ChevChelios*
> 
> ............ the cooler
> ..............


Yeah that's been answered. Get with the times. ;-)


----------



## ChevChelios

you were asking me, so I answered you


----------



## nexxusty

Quote:


> Originally Posted by *ChevChelios*
> 
> you were asking me, so I answered you


No I know bro, just being affable.


----------



## Naked Snake

So I got my EVGA 1080. The fan is really loud at 70%; maybe it's because I sit right beside my Phantom 820, but oh well. I've achieved 2063MHz on the core @ 73C and I'm sure I can push more; just doing a fast test.


----------



## AllGamer

SLI 2x GTX 1080 of different speed

Currently I have the MSI GTX 1080 FE = stock everything.
I plan to pick up a new MSI GTX 1080 Sea Hawk EK for the water loop

I originally planned to get 2x FE and put them on water.

but now knowing the new stuff are coming re: Sea Hawk with EK blocks

I was thinking either getting 2x Sea Hawk EK

or keep current FE + EK block
and then add a new Sea Hawk with EK
to do SLI

the Sea Hawk EK will be factory clocked higher than FE

So if I put them in SLI, I know it should theoretically work, but the Sea Hawk will probably run at a lower speed due to the FE's stock speed.

But I should be able to use MSI Afterburner to make both cards run at the same speed, right?!

Can someone confirm this works?
Has anyone had similar experiences with older GTX models from different manufacturers / speed ratings?
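The per-card offset arithmetic is simple enough; a sketch (the FE boost number is the published spec, but the factory-OC clock here is a placeholder, not the real Sea Hawk figure):

```python
# To run two differently-clocked cards at the same target via
# Afterburner-style per-card offsets, each card needs its own offset.
# Sea Hawk stock clock below is a placeholder, not a real product spec.

def offset_for_target(stock_boost_mhz, target_mhz):
    """Per-card core offset (MHz) needed to reach a common target clock."""
    return target_mhz - stock_boost_mhz

fe_stock = 1733       # GTX 1080 FE rated boost clock
seahawk_stock = 1847  # hypothetical factory-OC boost clock
target = 2000

print("FE offset:", offset_for_target(fe_stock, target))             # +267
print("Sea Hawk offset:", offset_for_target(seahawk_stock, target))  # +153
```

In practice boost behavior also depends on power/thermal headroom, so the actual running clocks may still differ even with matched targets.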


----------



## Jpmboy

Quote:


> Originally Posted by *AllGamer*
> 
> SLI 2x GTX 1080 of different speed
> 
> Currently I have the MSI GTX 1080 FE = stock everything.
> I plan to pick up a new MSI GTX 1080 Sea Hawk EK for the water loop
> 
> I originally planned to get 2x FE and put them on water.
> 
> but now knowing the new stuff are coming re: Sea Hawk with EK blocks
> 
> I was thinking either getting 2x Sea Hawk EK
> 
> or keep current FE + EK block
> and then add a new Sea Hawk with EK
> to do SLI
> 
> the Sea Hawk EK will be factory clocked higher than FE
> 
> so If I put them on SLI, I know it should theoretically work, but the Sea Hawk will probably run at a lower speed due the FE stock speed.
> 
> but I should be able to use MSI Afterburner to make both cards run at the same speed, right ?!?!
> 
> Can some one confirm if this works?
> anyone which have had similar experiences with older GTX models from different manufactures / speed ratings.


usually you can just flash both cards to the same bios and they will play well together. Otherwise, they will work fine, but you will likely need to clock each independently.


----------



## AllGamer

Quote:


> Originally Posted by *Jpmboy*
> 
> usually you can just flash both cards to the same bios and they will play well together. Otherwise, they will work fine, but you will likely need to clock each independently.


Oh yeah, you are right, I totally forgot we can download / flash / customize BIOS with the BIOS editor floating around.

In that case I'll probably pick up an EK block for my existing FE edition, and just pick up the EK edition when they restock.

then I just need to download the firmware from the EK edition and flash it on the FE edition.

Thanks


----------



## moustang

Just arrived.



Got a little business to take care of, then I'll be swapping out the cooler to hybrid cooling, and then I should be ready to go in about 3 hours or so.


----------



## ChevChelios

Quote:


> Originally Posted by *moustang*
> 
> Just arrived.
> 
> 
> 
> Got a little business to take care of, then I'll be swapping out the cooler to hybrid cooling, and then I should be ready to go in about 3 hours or so.


damn, the MSI 1070/1080s look so sexy

if they didn't cost so much here compared to all the other models I'd definitely go MSI over anything else


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> usually you can just flash both cards to the same bios and they will play well together. Otherwise, they will work fine, but you will likely need to clock each independently.


So these cards can be safely BIOS flashed? Which BIOS can the FE take?


----------



## reset1101

Hi. I just received a Palit 1080 today. It's amazing and absurdly silent, but I'm having some weird behaviour in games at times: frame drops I didn't have with the 980 Ti. I have removed the old drivers with DDU and installed the latest Nvidia drivers.

Thanks for your help


----------



## Setzer

Quote:


> Originally Posted by *moustang*
> 
> Just arrived.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Got a little business to take care of, then I'll be swapping out the cooler to hybrid cooling, and then I should be ready to go in about 3 hours or so.


Nice! Mine should be arriving tomorrow, or Monday


----------



## gree

Quote:


> Originally Posted by *moustang*
> 
> Just arrived.
> 
> 
> 
> Got a little business to take care of, then I'll be swapping out the cooler to hybrid cooling, and then I should be ready to go in about 3 hours or so.


Why get the x instead of the seahawk?


----------



## immortalkings

Quote:


> I played 30 minutes of BF4 multiplayer on the Zotac 1080 AMP and the average temp was 72°C with an average 2060 core clock with my quick unrefined overclock.


What country do you live in? My Zotac AMP is hitting 82C and my average is 80C. I hope my card isn't faulty. I'll try changing the thermal compound when I get some; I think the ambient temp in my country is high.


----------



## moustang

Quote:


> Originally Posted by *gree*
> 
> Why get the x instead of the seahawk?




Why pay extra for the Seahawk when I already had a hybrid cooling setup that I could install for free?

I was running GTX 770 SLI with hybrid cooling. I only had to remove the G10 bracket and AIO from one of the 770s and install it on the Gaming X.

Interesting side note here....

With the hybrid cooling it's defaulting to 1975mhz on the GPU. I haven't touched a thing.


----------



## Jpmboy

Quote:


> Originally Posted by *AllGamer*
> 
> Oh yeah, you are right, I totally forgot we can download / flash / customize BIOS with the BIOS editor floating around.
> 
> in that case I'll probably pick up an EK block for my existing FE edition, and just pickup the EK edition when they restock.
> 
> then I just need to download the firmware from the EK edition and flash it on the FE edition.
> 
> Thanks


Well, don't get too excited just yet.. we can't mod a BIOS atm. But NVFlash will save and flash a BIOS from/to the card. The main thing is to make sure that Windows 10 is updated (no, really - otherwise nothing will work);
must have at least this version:


Quote:


> Originally Posted by *Asus11*
> 
> so these cards can be safetly bios flashed? which bios can the FE take?


any with the same power section.

this nvflash works:

NVFlash_Certs_Bypassed_v5.287_x64.zip 1155k .zip file
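For anyone following along, the usual flow with that NVFlash build is backup first, then flash. A dry-run sketch (it only prints the command lines; the filenames are placeholders, and `--save` / `-6` are the commonly used save and force-override switches; double-check against the tool's own help before running anything):

```python
# Typical NVFlash flow: back up the current BIOS, then flash the new one.
# This only builds and prints the command lines (dry run).
# Filenames are placeholders; verify flags against `nvflash64 --help`.
import subprocess

def nvflash_commands(backup_file, new_bios):
    """Return the backup and flash command lines, in order."""
    return [
        ["nvflash64", "--save", backup_file],  # save the current BIOS first
        ["nvflash64", "-6", new_bios],         # flash, overriding ID mismatch
    ]

for cmd in nvflash_commands("backup.rom", "seahawk.rom"):
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually run (at your own risk)
```

Keeping the saved `backup.rom` somewhere safe is the whole point; it's your way back if the new BIOS misbehaves.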


----------



## Naked Snake

So I got a dud for mem OC; it can't get stable past +50 on the memory.







Temps are good for an FE, but sadly the fan drives me crazy if it's running at more than 60%, and at that point temps suck.


----------



## bastian

Mine has arrived. Zero coil whine and a great overclocker. +250 on the core and +500 on the memory. That brings the core up to 2164 boost. Memory goes to 5508. Haven't tried any higher yet. And no modifications to the voltage.



Interestingly, the Non-X has a different BIOS from the X. The Non-X is sold exclusively through Newegg at the moment. As you can see, I can go up to 121 on the power limit, whereas the X can only do 107. Both the Non-X and X have the same power phases and 6+8 pin. The Non-X just doesn't come with as much factory OC as the X.

I have the fan speed set fixed at 50 percent, or around 1200-1300 rpm. Reaches around 70-73 under load. Even going as high as 60-70 percent fan speed is fairly quiet.

This is the best AIB card, IMHO.


----------



## moustang

Quote:


> Originally Posted by *bastian*
> 
> Interestingly, the Non-X has a different BIOS from the X. The Non-X is sold exclusively to Newegg at the moment. As you can see I can go up to 121 on the power, whereas the X's can only do 107. Both the Non-X and X have the same power phases and 6+8 pin. The Non-X just doesn't come factory OC as much as the X.


Congrats on the card.

But just as an FYI, if you use the older Afterburner you can set the power to 121 on the Gaming X as well.

Of course I tried it on mine and it wasn't stable at that setting. It crashed after about 10 minutes, but it took it and ran for a while. I'm now running a more conservative and stable overclock with the core at 2154mhz.


----------



## bastian

Quote:


> Originally Posted by *moustang*
> 
> Congrats on the card.
> 
> But just as an FYI, if you use the older Afterburner you can set the power to 121 on the Gaming X as well.
> 
> Of course I tried it on mine and it wasn't stable at that setting. It crashed after about 10 minutes, but it took it and ran for a while. I'm now running a more conservative and stable overclock with the core at 2154mhz.


Oh? I am using the latest Afterburner beta. I'm so close to 2.2. I have to see if I can get there, even if the boost can't hold it for much!


----------



## ChevChelios

yeah Gaming (X) is a winner .. now if only MSI didnt charge so much


----------



## moustang

Quote:


> Originally Posted by *bastian*
> 
> Oh? I am using the latest Afterburner beta. I'm so close to 2.2. I have to see if I can get there, even if the boost can't hold it for much!


I hit 2226mhz

It lasted about 3 minutes and then crashed. Not heat related either, it was only at 46C at the time.

I've got my memory up at 5535 and have had no problems at all. I'm thinking about pushing it to 5550 and see if it holds.


----------



## moustang

Quote:


> Originally Posted by *ChevChelios*
> 
> yeah Gaming (X) is a winner .. now if only MSI didnt charge so much


That's the one downside, but at the same time their VRAM and VRM cooling is second to none. Since I knew I was going to add hybrid cooling it became a choice between the Gaming X or buying another cheaper card and then having to spend another $40-$80 on heatsinks for the VRAM and VRM to keep them cool. None of the other cards have all of the VRAM covered with a single heat spreader plus the VRM covered with their own heatsink. Once all was said and done there was only like a $20 price difference and it was worth the $20 to me just to not have to deal with the hassle of it all.


----------



## moustang

OK, I think I've found just about the stable limits of my card.

Core = 2126MHz. If I try to push it above 2130MHz it crashes. 2126MHz runs just fine for extended playing.
Memory = 5551MHz. I tried 5582MHz but got artifacting after about 10 minutes. 5551MHz runs without problems.

I could probably spend the next day pushing each setting up 1MHz at a time to find the very limits, but it ran for 30 minutes straight without a single hiccup at 2126MHz core and 5551MHz memory. I'll let it sit there, at least until a modded BIOS becomes available.

It should also be noted that there is no difference between a power limit of 100 and 121. It ran just as stable and just as fast at 100.
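Side note for anyone mapping their limits: rather than stepping 1MHz at a time, a binary search over the offset narrows the limit down in a handful of runs. A sketch where `is_stable` is a hypothetical stand-in for a real stress-test pass (assumes stability is monotonic and the low end of the range is stable):

```python
# Binary-search the highest stable core offset instead of stepping 1 MHz
# at a time. `is_stable` is a stand-in for a real stress-test pass.
# Assumes stability is monotonic in the offset and `lo` itself is stable.

def find_max_stable(lo, hi, is_stable):
    """Highest offset in [lo, hi] that passes the stability test."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if is_stable(mid):
            best = mid
            lo = mid + 1   # stable: try higher
        else:
            hi = mid - 1   # crashed/artifacted: back off
    return best

# Toy example: pretend anything up to +137 is stable.
print(find_max_stable(0, 200, lambda off: off <= 137))  # -> 137
```

Eight or nine stress-test runs cover a 0-200 range, versus hundreds of 1MHz steps; real-world crashes aren't perfectly monotonic, so it's a starting point, not a guarantee.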


----------



## fayzaan

finally got mine too







, haven't overclocked or anything yet. Will soon and let you guys know the results


----------



## dentnu

Hi guys, I am planning on buying a 1080 and wanted to know what the best brand of all of them is to get at the moment. I plan to do a hybrid G10 bracket install on it once I get it, so I will not be using the stock cooler. I have been using EVGA for over 10 years and was planning on getting the FTW or SC, but just saw that the MSI Gaming (X) has independent VRAM and VRM heat spreaders which look to be compatible with the G10 bracket.

1. Does anyone here have the MSI Gaming (X) and has been able to install the G10 bracket and keep those heat spreaders on?

2. How is the warranty and RMA process on the MSI cards; are they as good as EVGA?

3. It looks like the EVGA FTW and other cards with more power connectors don't really overclock that much more; should I just get the cheapest card I can find?

Thanks


----------



## emett

Am I right in thinking running 2 ribbon SLI bridges will be the same as the HB SLI bridge?


----------



## AllGamer

Quote:


> Originally Posted by *emett*
> 
> Am I right in thinking running 2 ribbon SLI bridges will be the same as the HB SLI bridge?


nope, check a few pages back

a guy actually tested before (2 ribbon)

and after with actual HB SLI bridges

quite a big difference in speed

--- EDIT ---

found it
http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/1600_50#post_25287111
Quote:


> Originally Posted by *PasK1234Xw*
> 
> Just got my HB SLI bridge and shocker i see improvements unlike what others have claimed also firestrike went up
> 
> before single bridge
> 
> stock 38k
> 
> 2Ghz 40k
> http://www.3dmark.com/3dm/12298819
> 
> After HB Bridge
> 
> stock
> http://www.3dmark.com/3dm/12309510
> 
> 2Ghz
> http://www.3dmark.com/3dm/12661311


----------



## moustang

Quote:


> Originally Posted by *dentnu*
> 
> Hi guys I am planning on buying a 1080 and wanted to know what is the best brand out of all of them to get at the moment ? I plan to do a hybrid G10 bracket install on it once I get it so will not be using the stock cooler. I have been using EVGA for over 10 years and was planing on getting the FTW or SC but just saw the MSI gaming (X) has independent vram and vrm heat spreader which looks to be compatible with the G10 bracket.
> 
> 1. Does anyone here have the MSI gaming (X) and has been able to install the G10 bracket and keep those heat spreaders on ?


Yes, see my previous posts here. I'm running the Gaming X with the G10 bracket and the Kraken X41 AIO. Factory VRAM heatspreader and VRM heatsink were untouched and are still in place. Only the GPU heatsink was removed. I even left the backplate on.

Quote:


> 2.How is the warranty and RMA process on the MSI cards are they as good as EVGA ?


Can't say for certain since I've never had to RMA any of the 6 MSI products that I've owned. Wish I could say the same about the EVGA products I've owned, but I'm on my second PSU in 3 years.

Quote:


> 3.It looks like the EVGA FTW and other cards with more power connectors don't really overclock that much more should I just get the cheapest card I can find ?


A lot of people are going that route, but if you're going to add the G10 and AIO cooling then I suggest you take a good look at the VRAM and VRM cooling on whatever card you're interested in. If it's inadequate then you'll be buying more heatsinks to be adding to them, and that sort of defeats the purpose of buying the cheapest card.


----------



## emett

Thanks for the link Allgamer but where does he say he was running two ribbons?


----------



## emett

Never mind, it's confirmed two ribbon cables give the same boost as the HB bridge, @ the end of the Hardware Unboxed video.


----------



## Bloodymight

Quote:


> Originally Posted by *moustang*
> 
> OK, I think I found right about the stable limits of my card.
> 
> Core = 2126mhz. If I try to push it above 2130mhz it crashes. 2126mhz runs just fine for extended playing.
> Memory = 5551mhz. I tried 5582mhz but got artifacting after about 10 minutes.5551mhz runs without problems.
> 
> I could probably spend the next day trying to push each setting up 1mhz at a time to find the very limits, but it ran for 30 minutes straight without a single hiccup at 2126mhz core and 5551mhz memory. I'll let it sit there, at least until a modded BIOS becomes available.


Try testing your clocks with the Unreal Tournament alpha (it's free). That game got my card crashing after a while even though it was stable for ~4-5 hours of gaming/testing (Witcher 3, several 3DMark runs, Heaven benchmark, Ark).


----------



## fayzaan

Quote:


> Originally Posted by *bastian*
> 
> 
> 
> Mine has arrived. Zero coil whine and a great overclocker. +250 on the core and +500 on the memory. That brings the core up to 2164 boost. Memory goes to 5508. Haven't tried any higher yet. And no modifications to the voltage.
> 
> 
> 
> Interestingly, the Non-X has a different BIOS from the X. The Non-X is sold exclusively to Newegg at the moment. As you can see I can go up to 121 on the power, whereas the X's can only do 107. Both the Non-X and X have the same power phases and 6+8 pin. The Non-X just doesn't come factory OC as much as the X.
> 
> I have the fan speed set fixed at 50 percent, or around 1200-1300 rpm. Reaches around 70-73 under load. Even going as high as 60-70 percent fan speed is fairly quiet.
> 
> This is the best AIB card, IMHO.


Hey bastian, how are you testing the stability of your overclock? I have the same card, but I'm having difficulty even getting +130 on the core to be stable, even with core voltage up to 100. I'm running Firestrike for testing and keep getting driver crashes, even though temps don't go above 75C.


----------



## Xeq54

MSI released the "reviewers" OC BIOS for the Gaming X 1080 and 1070. Just flashed it. I don't see much difference because I have the shunt mod, but base clock is now 1706MHz and memory is 10105MHz.


----------



## Ragnarook

Finally got over 7000 in Valley bench











4790k @ 5ghz, both 1080 @ 2164mhz and the mem @ +550


----------



## Jpmboy

Quote:


> Originally Posted by *Ragnarook*
> 
> Finally got over 7000 in Valley bench
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4790k @ 5ghz, both 1080 @ 2164mhz and the mem @ +550


run Heaven 4.0. 1.0 is ancient.


----------



## stocksux

Getting ready to join the club. Finally snuck an order in on newegg after getting an in stock alert on my phone. So stoked! Ordered the Asus Strix OC. Also got notice the EK waterblock for it will be here on the same day the card gets here! (next Wednesday)


----------



## bastian

Quote:


> Originally Posted by *fayzaan*
> 
> Hey Bastion, how are you testing the stability of your overclock? I have the same card, but I am having difficulty even getting 130+ core to be stable, even with core voltage up to 100. I am running Firestrike for testing, and keep getting driver crash. Even though temps don't go above 75c.


I run Valley in a loop and also run Witcher 3 to check stability.


----------



## XCalinX

Man. I'm dying for custom bios to come out. The stock bios is such a limiting factor for my watercooled 1080...


----------



## traxtech

Yeah no kidding, surprised it's going awhile


----------



## Ragnarook

Quote:


> Originally Posted by *Jpmboy*
> 
> run Heaven 4.0. 1.0 is ancient.


Both Heaven 4.0 & Valley 1.0 are from Feb 2013, so they are both pretty ancient.

"Download Unigine Valley Benchmark. Shortly after releasing the Heaven benchmark Unigine Valley is now released. Valley is available for Windows, Linux and Mac OSX operating systems, 32bit and 64bit respectively."


----------



## BrightCandle

Quote:


> Originally Posted by *toncij*
> 
> One question, not trolling: how many games actually work with SLI lately?


Hardware Unboxed (YouTube) and Guru3D have both done SLI testing, and the basic answer seems to be that it's sufficiently worth it in about 2/3 of games, as long as you're at 1440p wanting high frame rates, or at 4K.

That isn't actually that great, because the 2x 670/680 and 2x 970 crowds both got more than 90% of games showing at least 50% scaling. It's not as good as it used to be, but that isn't really SLI's fault; it's down to changes in the way games are written.


----------



## GTANY

Quote:


> Originally Posted by *bastian*
> 
> 
> 
> Mine has arrived. Zero coil whine and a great overclocker. +250 on the core and +500 on the memory. That brings the core up to 2164 boost. Memory goes to 5508. Haven't tried any higher yet. And no modifications to the voltage.
> 
> 
> 
> Interestingly, the Non-X has a different BIOS from the X. The Non-X is sold exclusively to Newegg at the moment. As you can see I can go up to 121 on the power, whereas the X's can only do 107. Both the Non-X and X have the same power phases and 6+8 pin. The Non-X just doesn't come factory OC as much as the X.
> 
> I have the fan speed set fixed at 50 percent, or around 1200-1300 rpm. Reaches around 70-73 under load. Even going as high as 60-70 percent fan speed is fairly quiet.
> 
> This is the best AIB card, IMHO.


Is an SLI bridge included with the card?

Indeed, I ordered 2 MSI 1080 Gamings for SLI and I already own an SLI bridge. If a second one is included, it would stand in for an HB SLI bridge.


----------



## bastian

Quote:


> Originally Posted by *GTANY*
> 
> Is a SLI bridge included with the card ?
> 
> Indeed, I ordered 2 MSI 1080 Gaming for a SLI and I already own a SLI bridge. If a second one is included, it would replace a HB-SLI bridge.


No SLI bridge.


----------



## GTANY

OK, thank you for your reply.


----------



## Jpmboy

Quote:


> Originally Posted by *Ragnarook*
> 
> Both Heaven 4.0 & Valley 1.0 are from Feb 2013, so they are both pretty ancient.
> 
> "Download Unigine Valley Benchmark. Shortly after releasing the Heaven benchmark Unigine Valley is now released. Valley is available for Windows, Linux and Mac OSX operating systems, 32bit and 64bit respectively."


so then Heaven 1.0 is what? archaic?









run unigine tropics... even older.


----------



## Ragnarook

Quote:


> Originally Posted by *Jpmboy*
> 
> so then Heaven 1.0 is what? archaic?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> run unigine tropics... even older.


Haha yes it must be







Because Heaven 4.0 and Valley 1.0 are equally old.

http://www.phoronix.com/scan.php?page=article&item=unigine_valley_preview&num=1


----------



## Bogga

I prefer Heaven to see if my clocks are stable. I can pass Valley and 3DMark with a clock that won't pass Heaven...


----------



## toncij

Quote:


> Originally Posted by *moustang*
> 
> Yes, see my previous posts here. I'm running the Gaming X with the G10 bracket and the Kraken X41 AIO. Factory VRAM heatspreader and VRM heatsink were untouched and are still in place. Only the GPU heatsink was removed. I even left the backplate on.
> Can't say for certain since I've never had to RMA any of the 6 MSI products that I've owned. Wish I could say the same about the EVGA products I've owned, but I'm on my second PSU in 3 years.
> A lot of people are going that route, but if you're going to add the G10 and AIO cooling then I suggest you take a good look at the VRAM and VRM cooling on whatever card you're interested in. If it's inadequate then you'll be buying more heatsinks to be adding to them, and that sort of defeats the purpose of buying the cheapest card.


Did you have to mod the G10 at all? I have Corsair's HG10 N980 on a Titan X, planning to get a 1080 for my wife's machine.

Quote:


> Originally Posted by *Ragnarook*
> 
> Finally got over 7000 in Valley bench
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4790k @ 5ghz, both 1080 @ 2164mhz and the mem @ +550


Test with a bit less than 5GHz maybe?


----------



## Jpmboy

Quote:


> Originally Posted by *Bogga*
> 
> I prefer Heaven to see if my clocks are stable. I can pass Valley and 3DMark with a clock that wont pass Heaven...


^^This. Heaven 4.0 loads the card much harder than Heaven 1.0, or Valley.








Quote:


> Originally Posted by *toncij*
> 
> Did you have to mod G10 any? I have Corsair's HG10 N980 on a TitanX, planing to get 1080 for a wife's machine.
> Test with a bit less than 5GHz maybe?


No reason to... the score is pretty low already.
Something ain't right.
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_20


----------



## moustang

Quote:


> Originally Posted by *Bloodymight*
> 
> Try to test your clocks with Unreal Tournament alpha(it's free) that game got my card crashing after a while even though it was stable for ~4-5 hours of gaming/testing (Witcher 3, several 3Dmark runs, Heaven Benchmark, Ark)


I would never test hardware with Alpha software.

Alpha software itself is buggy and unstable. That's why it's Alpha software. It's not even stable enough to be tested at a Beta stage yet. That means that if you get a crash you really have no idea if the crash is because of the hardware or the software.


----------



## jase78

Quote:


> Originally Posted by *stocksux*
> 
> Getting ready to join the club. Finally snuck an order in on newegg after getting an in stock alert on my phone. So stoked! Ordered the Asus Strix OC. Also got notice the EK waterblock for it will be here on the same day the card gets here! (next Wednesday)


Same thing here. NowInStock finally gave me enough time to click the link and order. I also snagged a Strix OC from Newegg. It's scheduled to arrive Tuesday.


----------



## moustang

Quote:


> Originally Posted by *toncij*
> 
> Did you have to mod G10 any? I have Corsair's HG10 N980 on a TitanX, planing to get 1080 for a wife's machine.


Nope, no changes at all, unless you consider relocating one foam pad that's used to support the end of the card a mod. Not any trouble though, just pulled it off and put it on in a slightly different spot so it wouldn't come down on the fan connector on the card.

Other than that it was a straight swap. I spent longer removing the old cables that went to my second 770 than I did installing the G10 on the 1080.


----------



## Ragnarook

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^This. Heaven 4.0 loads the card much harder than Heaven 1.0, or Valley.


That might be true; for me it passes Valley, Heaven, and 3DMark. But Valley doesn't take 10 minutes to run like Heaven does.

However, that wasn't what we were talking about. You said I should run Heaven because Valley was ancient; I just wanted to point out that Heaven 4.0 was released back in Feb 2013, just like Valley.
So the claim that anyone should run Heaven 4.0 instead of Valley 1.0 because Heaven 4.0 is somewhat newer is false.


----------



## dentnu

Quote:


> Originally Posted by *moustang*
> 
> Yes, see my previous posts here. I'm running the Gaming X with the G10 bracket and the Kraken X41 AIO. Factory VRAM heatspreader and VRM heatsink were untouched and are still in place. Only the GPU heatsink was removed. I even left the backplate on.
> Can't say for certain since I've never had to RMA any of the 6 MSI products that I've owned. Wish I could say the same about the EVGA products I've owned, but I'm on my second PSU in 3 years.
> A lot of people are going that route, but if you're going to add the G10 and AIO cooling then I suggest you take a good look at the VRAM and VRM cooling on whatever card you're interested in. If it's inadequate then you'll be buying more heatsinks to be adding to them, and that sort of defeats the purpose of buying the cheapest card.


Thanks for the reply. I am going to give the MSI Gaming card a shot since it works so well with the G10. Which one do you have, the Gaming or the Gaming X? I just set up alerts through NowInStock for all the EVGA cards and the MSI Gaming and X versions. Whichever comes in stock first I plan to get; hope it's the MSI card.


----------



## moustang

I've got the Gaming X version because I bought it at my local Microcenter. The Gaming version is a Newegg exclusive (at least for now).

It is my understanding that both cards are identical physically and the only difference is the factory preset clock settings. Both should manually overclock identically.


----------



## fat4l

There's still this dilemma in my head, guys...

Should I keep the Asus 980 Ti Strix in SLI with waterblocks, or go and get a 1080 with a waterblock?

Price-wise: the same.
Performance-wise: I would expect "up to" 50% more performance out of 980 Ti SLI.

1. Not sure how the 980 Ti will perform in DX12 titles
2. Not going to change graphics cards within 1-2 years
3. Not sure about SLI support
4. I play at 1440p, so... not sure if the 1080 can get close to 100fps+
5. These Asus Strix 980 Ti cards are voltage locked to 1.21V, so watercooling may not really be worth it
6. The 1080 is voltage locked too, so... beh...

What to do, lol...


----------



## chronicfx

Quote:


> Originally Posted by *fat4l*
> 
> There's this dilemma still in my head guys ......
> 
> Should I keep Asus 980Ti strix in SLI with waterblocks or go and get 1080 with a waterblock ?
> 
> 
> 
> 
> 
> 
> 
> 
> Price-wise=the same.
> Performance-wise=I would expect "up to" 50% more performance out of 980Ti SLI.
> 
> 1. Not sure how 980Ti will perform under DX12 titles
> 2. Not gonna change graphics cards in 1-2years time
> 3. Not sure about SLI support
> 4. I play in 1440p so ....not sure if 1080 can get close to 100fps+
> 5. These asus strix 980Ti cards are voltage locked to 1.21V so watercooling may not be really worth it
> 6. 1080 is voltage locked too so.....beh...
> 
> What to do lol...


Keep the 980 Ti SLI unless you are going 1080 SLI. They will perform fine. If you are looking for a good investment, a G-Sync monitor for your BenQ would be the best money spent. Really, I can't stress enough that all of the stutters are GONE since I went G-Sync.


----------



## fat4l

Quote:


> Originally Posted by *chronicfx*
> 
> Keep the 980 Ti SLI unless you are going 1080SLI. They will perform fine, if you are looking for a good investment a gsync monitor for your benq would be best money spent. Really I can't stress enough that all of the stutters are GONE.... Since I went GSYNC.


Changed the sig now







I already have a G-Sync PG289Q


----------



## bfedorov11

Quote:


> Originally Posted by *fat4l*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> There's this dilemma still in my head guys ......
> 
> Should I keep Asus 980Ti strix in SLI with waterblocks or go and get 1080 with a waterblock ?
> 
> 
> 
> 
> 
> 
> 
> 
> Price-wise=the same.
> Performance-wise=I would expect "up to" 50% more performance out of 980Ti SLI.
> 
> 1. Not sure how 980Ti will perform under DX12 titles
> 2. Not gonna change graphics cards in 1-2years time
> 3. Not sure about SLI support
> 4. I play in 1440p so ....not sure if 1080 can get close to 100fps+
> 5. These asus strix 980Ti cards are voltage locked to 1.21V so watercooling may not be really worth it
> 6. 1080 is voltage locked too so.....beh...
> 
> What to do lol...


I would go with the 1080. I guess it depends on the games that you play: do they actually have SLI support? I went from 2x Titan X to a single 1080 for 4K 60Hz and I am happy. I'll never go back to SLI until I start seeing support in games at release again.


----------



## AllGamer

if you are running GTX980 non-Ti then upgrade to GTX 1080

if you already own GTX980 Ti, stay with what you have until GTX 1080 Ti comes out


----------



## fat4l

The problem is I can get 2x 980Ti's very cheap= same price as 1x 1080....
Thats the decision to make...


----------



## AllGamer

Quote:


> Originally Posted by *fat4l*
> 
> The problem is I can get 2x 980Ti's very cheap= same price as 1x 1080....
> Thats the decision to make...


In that case you'll be better off with 1 new GTX 1080; no point picking up yesterday's tech when you can get the current generation.

I thought you already had an SLI GTX 980 Ti setup, which is why you weren't getting what is currently new.

And when they push out the GTX 1080 Ti, just get a 2nd GTX 1080 to SLI with the current one.

Whenever a Ti version comes out, the previous model always drops in price by half.


----------



## Clockster

Mmmm, I got my GTX 1080 Extreme and I absolutely adore the card.
The software is also great... except for the part where some of my Steam games launch and close instantly when the software is running.
Obviously I'm worried about a VAC ban, so I'm kind of scared to run it...

Ideas?


----------



## Jpmboy

Quote:


> Originally Posted by *Ragnarook*
> 
> That might be true, for me it passes both valley, heaven and 3dmark. But valley doesnt take 10 mins to run like heaven.
> 
> However that wasnt what we talked about, you said i should run heaven cause valley was ancient, i just wanted to point out that heaven 4.0 was released back in feb 2013, just like valley.
> So that anyone should run heaven 4.0 instead of valley 1.0 because "heaven 4.0" is some what newer is false.


erm - not quite. Wasn't it Heaven 1.0 that you posted, not Heaven 4.0?


----------



## Pragmatist

Quote:


> Originally Posted by *Clockster*
> 
> Mmmm I got my GTX1080 Extreme and I absolutely adore the card.
> The software is also great...except for the part where some of my steam games launch and close instantly when the software is running.
> Obviously I'm scared of a vac ban so kind of scared to run it...
> 
> Ideas?


You won't get a vacation for using software meant for your graphics card or the like. C'mon dude. It's not like it's hooking into the game or modifying the memory.


----------



## emett

Hooked on 4K DSR downscaled to 2K atm. W3 looks incredible.


----------



## Ragnarook

Quote:


> Originally Posted by *Jpmboy*
> 
> erm - not quite. Was it not Heaven 1.0 that you posted, not Heaven 4.0?


Aha, okay, you thought it was Heaven 1.0 I had run. It's been a misunderstanding; it was Valley 1.0 my picture was from.


----------



## guyinthecorner1

I have a Gigabyte G1 gaming 1080 and I have been experiencing a fan issue. I have a profile set so the fans don't spin until 60°C. However at times, the fans rev up very fast at idle around 40-50°C for a few seconds. This occurs every 30 seconds or so. Has anyone else had this problem?
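In case it helps anyone reproduce it, here's a way to log the behavior (assuming `nvidia-smi` from the driver package is on your PATH; the query field names are from NVIDIA's own docs and the output filename is just an example):

```shell
# Log fan speed, GPU temp, and graphics clock once per second while idling
# (Ctrl+C to stop), then check whether the fan spikes line up with temp/clock jumps.
nvidia-smi --query-gpu=timestamp,fan.speed,temperature.gpu,clocks.gr --format=csv -l 1 > fan_log.csv
```

If the fan revs while temp and clocks stay flat, it points at the fan controller/profile rather than a real thermal event.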


----------



## axiumone

Haven't seen anyone take one of these apart, so here you go.

EVGA ACX 3.0 teardown.



See the rest below.


Spoiler: Warning: Spoiler!


----------



## kx11

Dual 1080 Strix finally arrived


----------



## KillerBee33

Hello.
MSI reference card coming this Tuesday. Was wondering if I can do the same thing I did with the 980 reference using the EVGA Hybrid AIO, or do I have to wait for an official unit?
Thanx in advance


----------



## axiumone

Look here.

http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


----------



## KillerBee33

Quote:


> Originally Posted by *axiumone*
> 
> Look here.
> 
> http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


Thanx







Got paranoid when I saw the air temps on a reference 1080


----------



## kx11

The Strix is ripping it.

Surprisingly, this Strix GPU is light compared to the Titan X XTREMEGAMING and the 1080 FE.

No kidding, this GPU feels lighter than the FE.


----------



## Jpmboy

Quote:


> Originally Posted by *Ragnarook*
> 
> Aha okay you tought it was Heaven 1.0 I had run. Its been a missunderstanding, It was Valley 1.0 my picture was from


ahh - my bad. sorry.


----------



## aylan1196

Guys, anyone have the EVGA FE BIOS?


----------



## Asus11

Quote:


> Originally Posted by *aylan1196*
> 
> Guys any one with evga fe bios


All FE cards have the same BIOS.


----------



## aylan1196

And what's the best nvflash version to use to flash a different BIOS?


----------



## aylan1196

I need the EVGA one for Precision XOC.


----------



## axiumone

Quote:


> Originally Posted by *aylan1196*
> 
> I need the evga for the prescient oxc


The latest version of Precision also asked me for my EVGA card's serial number. I don't think you'll be able to get around it just using the BIOS; you'll probably need someone to share their serial with you...


----------



## fat4l

So guys, I decided to go with the 1080.









I'm not sure whether it's worth paying the premium for AIB cards, or whether it's fine to go with the much cheaper reference design...








Hmmmm.

I wonder which cards have the ability to go over the 1.25V voltage lock? Maybe the Kingpin?
Which one is the best to go with if I'm putting a custom EK waterblock on it?


----------



## aylan1196

Then msi afterburner I guess


----------



## Asus11

Quote:


> Originally Posted by *fat4l*
> 
> So guys I decided to go with 1080
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not sure if it's worth to pay the premium for AIB cards or it's fine to go with reference design and much cheaper.....
> 
> 
> 
> 
> 
> 
> 
> 
> Hmmmm.
> 
> I wonder what cards have the ability to go over 1.25v voltage lock? Maybe kingpin ?
> Which one is the best to go with if I'm custom EK watercooling it ?


Go with an MSI AERO or FE for custom water, or any of EVGA's ACX reference designs.

Basically, grab the cheapest reference board you can get.


----------



## fat4l

Quote:


> Originally Posted by *Asus11*
> 
> go with a MSI AERO or FE for custom water or any of EVGA ACX reference designs
> 
> basically grab the cheapest ref board you can get


and the reason behind it?


----------



## looniam

Quote:


> Originally Posted by *axiumone*
> 
> *The latest version of precision also asked me for my evga cards serial number.* I don't think you'll be able to get around just using the bios, you'll probably need someone to share their serial with you...


/lurk

WUT?!?!


----------



## Asus11

Quote:


> Originally Posted by *fat4l*
> 
> and the reason behind it?


because all 1080s under water will most likely perform the same


----------



## Xeq54

Well, thanks to the new BIOS from MSI for the Gaming X, I just passed 3DMark at 2156-2177MHz on air cooling. Kind of interesting: the new BIOS does not allow for a higher voltage, but the voltage stays at 1.093V under load, so I was able to go higher. With the original BIOS, I was able to pass 3DMark with a 2088MHz max clock.

http://www.3dmark.com/3dm/12725517 Graphics score 24366


----------



## ChevChelios

New BIOSes are already appearing?

Where do you get them?


----------



## fat4l

Quote:


> Originally Posted by *Asus11*
> 
> because all 1080s under water will most likely perform the same


And is there any noticeable coil whine on FE edition ?


----------



## dentnu

Quote:


> Originally Posted by *ChevChelios*
> 
> new BIOSes already appearing ?
> 
> where do you get it ?


The BIOS he is talking about was released a few days ago by MSI and is for the Gaming X 1080. They released it because they got a lot of **** regarding this BIOS, as it was supposedly on all the 1080 Gaming X cards they sent to reviewers. The BIOS is clocked a bit higher than what the Gaming X cards are shipping with from the factory. If you have an MSI 1080 Gaming X card, you can download the BIOS straight from the MSI support website and upgrade your card.

http://www.bit-tech.net/news/hardware/2016/06/22/msi-defends-bios-hack/1


----------



## Asus11

Quote:


> Originally Posted by *fat4l*
> 
> And is there any noticeable coil whine on FE edition ?


Pot luck.









My 1070 FE had a lot.

This 1080 FE has barely any at all.


----------



## Setzer

How does one go about installing this BIOS from MSI?
Is it the same old shady deal of using USB drives and weird commands during boot - or is there some software from their side that can do it?


----------



## Asus11

Quote:


> Originally Posted by *Setzer*
> 
> How does one go about installing this BIOS from MSI?
> Is it the same old shady deal of using USB drives and weird commands during boot - or is there some software from their side that can do it?


most likely new nvflash software for pascal
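If it works like previous generations, the routine is probably something like this — flag names are from older nvflash builds, so check `--help` on whatever Pascal-aware build shows up, and the ROM filename here is just a placeholder:

```shell
# Typical nvflash routine from past generations (verify flags on your build):
nvflash --save backup.rom      # back up the card's current BIOS first
nvflash --protectoff           # disable the EEPROM write protect
nvflash -6 gamingx_oc.rom      # flash the new BIOS, overriding ID-mismatch prompts
```

Flash at your own risk, and always keep that backup.rom somewhere safe.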


----------



## fat4l

Quote:


> Originally Posted by *geggeg*
> 
> Speaking of which,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't link the review links here but those results should help.


I'm using 2x QD4 13x19mm and I love them...
I saw the review, and these 13x19s must have very low restriction, I suppose...


----------



## skline00

fat4l, I use the same Koolance QD4s in my 4790K rig with an external MO-RA3 420 Pro radiator and they work VERY well.


----------



## VSG

Yeah, they are great if you don't mind the price increase and the limited options.


----------



## fat4l

Quote:


> Originally Posted by *skline00*
> 
> fat 41 I use the same Koolance QD4s in my 4790k rig with an external MO-RA3-420 Pro radiator and they work VERY well.


You, sir, don't know it, but you are one of those who inspired me to get the MO-RA3.








Not many ppl own these big rads ...


----------



## Vellinious

Quote:


> Originally Posted by *fat4l*
> 
> You sir dont know about it but you are on of those who inspired me to get mora3
> 
> 
> 
> 
> 
> 
> 
> 
> Not many ppl own these big rads ...


R'amen. The MO RAs are beastly. Using one in my rig as well.


----------



## pez

Quote:


> Originally Posted by *fat4l*
> 
> There's this dilemma still in my head guys ......
> 
> Should I keep Asus 980Ti strix in SLI with waterblocks or go and get 1080 with a waterblock ?
> 
> 
> 
> 
> 
> 
> 
> 
> Price-wise=the same.
> Performance-wise=I would expect "up to" 50% more performance out of 980Ti SLI.
> 
> 1. Not sure how 980Ti will perform under DX12 titles
> 2. Not gonna change graphics cards in 1-2years time
> 3. Not sure about SLI support
> 4. I play in 1440p so ....not sure if 1080 can get close to 100fps+
> 5. These asus strix 980Ti cards are voltage locked to 1.21V so watercooling may not be really worth it
> 6. 1080 is voltage locked too so.....beh...
> 
> What to do lol...


I'd honestly stick with your current SLI setup. A single 1080 will be a bit better, but I don't think necessarily perceptible at 1440p.
Quote:


> Originally Posted by *axiumone*
> 
> Haven't seen anyone take one of these apart, so here you go.
> 
> EVGA ACX 3.0 Tear down.
> 
> 
> 
> See the rest below.
> 
> 
> Spoiler: Warning: Spoiler!


Nice. Looks like taking off that cheesy plating gives it potential to look less flashy.


----------



## Antykain

I've had my Gigabyte GTX 1080 G1 Gaming card for about a week and a half or so... Really loving it so far! I upgraded from an EVGA 780 Classy and the performance increase was pretty impressive. Going to be putting her under water as soon as EK gets the FC block released. It's listed on their website already, just not in stock yet. Not expecting much more performance gain by putting it under water, but... it just has to be done.












Still haven't completely tested the card to see how much OC headroom it has... that is still to come.


----------



## fayzaan

So I tried overclocking my GTX 1080, and I'm having a real hard time getting it stable at 2GHz. I can get it stable around 1960ish...

If I go over 2GHz then the driver sometimes crashes. I tried upping the voltage, which seemed to help, but I still get crashes at times.

I also tried the new method of raising the clocks on a per-voltage-point basis (whatever they call it). That does help, because I can raise the clocks at the 1.093V point specifically, but it won't go much higher.

Still really happy with the card: no more multi-GPU stress, smooth gameplay, and most games work at 4K just fine.

Do you guys think I'll have better luck with custom BIOSes? And what about this Gaming X BIOS? Can I use it? My card is the Newegg-exclusive MSI GTX 1080 Gaming (non-X).


----------



## kx11

just a cool shot of Strix 1080 SLi


----------



## ChevChelios

Quote:


> Originally Posted by *Antykain*
> 
> I've had my Gigabyte GTX 1080 G1 Gaming card for about a week and half or so.. Really loving it so far! I upgraded from a EVGA 780 Classy and the performance increase was pretty impressive. Going to be putting her underwater as soon as EK gets the FC block released. It's listed on their website already, just not in stock yet. Not expecting to get much more performance gains by putting it underwater, but.. it just has to be done.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still have not completely tested the card to see how much OC headroom it has.. that is coming still.


Are you using the GB Xtreme Engine or MSI Afterburner to OC the G1? Any issues?

Any fan noise/spin-up issues?


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> just a cool shot of Strix 1080 SLi


How is your oc for both ?


----------



## kx11

Quote:


> Originally Posted by *fat4l*
> 
> How is your oc for both ?


I could reach 2114MHz and a little more, but a stable OC while gaming @ 4K is no more than 1950MHz, so yeah, there's throttling. However, there's almost no fan noise even while heavy gaming stress is applied to the GPUs.

I'm using the Asus LED SLI bridge and it's doing a good job scaling performance @ 4K. This was done on the OC-mode BIOS provided by Asus; there are also Gaming and Silent modes, and Silent will downclock the core and memory clocks by -50.

Overall a solid GPU, and it feels lighter than the FE.

One more thing I love is that the power connector slots are upside down, so the locks are on top of the GPU for easy removal.


----------



## ValSidalv21

Got myself the FE today; it's my first reference card. Can't wait to see how it compares vs. the 980 Ti G1 from my sig.


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> i could reach 2114 and a little with , but stable OC while gaming @ 4k should be no more than 1950mhz , so yeah there's throttling however there's almost no fan noise coming out while heavy gaming stress is applied on the GPUs
> 
> i'm using asus LED SLi bridge and it's doing a good job scaling the performance @ 4k , this was done on the OC mode bios provided by Asus there's also Gaming and Silence mode too , Silence will downclock the Core+MEM clocks by -50
> 
> 
> 
> 
> 
> 
> 
> 
> 
> over all solid GPU and feels lighter than FE
> 
> one more thing i love is that the Power connector slots are upside down so the locks are on top of the GPU for easy removal


What would you achieve if you ramped the fans up to 100%?
I'm deciding which 1080 to buy, and as I'm watercooling I need a good card.
However, I kind of feel that all the custom cards are the same as the FE...


----------



## webmi




----------



## Klocek001

Not a bad OC, considering it's my first day owning a 1080. I think it boosts to 2100 when I jump into the game and then settles around 2063MHz after some time.
Boost out of the box is almost 1900MHz as long as I can keep it around 70 degrees. BTW, this is TW3 at 1440p with every slider absolutely maxed.
I was a bit afraid of reference cooling as I've always bought AIB, but I'm genuinely surprised by it. 60% is silent, 70% is almost inaudible, 80% is what I'd like to run in OC mode, and 90% is for the hot days (loud, but nothing too loud by any means). I expected to hear a jet engine at 100%, but frankly I could live with that; it's only slightly audible when I play games without headphones.
Above all, the FE cooler has a very pleasant airflow/wind sort of sound, nothing like the roar of the Windforce at high rpm.


----------



## ChevChelios

Quote:


> Originally Posted by *Klocek001*
> 
> 
> 
> not a bad oc achieved considering it's my day one of owning 1080. I think it boosts to 2100 when I jump in the game and then settles around 2063MHz after some time.
> Boost of out the box is almost 1900MHz as long as I can keep it around 70 degrees. btw this is TW3 1440p with every slider absolutely maxed.
> was a bit afraid of reference cooling as I always bought AIB, but I'm genuinely surprised with it. 60% is silent, 70% is almost inaudible. 80% is what I'd like to run in oc mode. 90% is for the hot days (loud,but nothing too loud by any means). I expected to hear a jet engine at 100% but frankly I could live with that, only slightly audible when I play games without headphones.
> above all, the FE cooler has a very pleasant airflow/wind sort of sound, nothing like the roar of Windforce at high rpm.


what is it that writes Gsync On on your screen there ?


----------



## Klocek001

Quote:


> Originally Posted by *ChevChelios*
> 
> what is it that writes Gsync On on your screen there ?


lol it's a gsync screen so it says gsync


----------



## ChevChelios

Quote:


> Originally Posted by *Klocek001*
> 
> lol it's a gsync screen so it says gsync


I mean, is it software or some setting in the monitor's OSD?


----------



## Klocek001

Quote:


> Originally Posted by *ChevChelios*
> 
> I mean is it software or some setting in the monitors OSD ?


It's the G-Sync indicator in the NVIDIA Control Panel: Display > G-Sync Indicator.


----------



## Malinkadink

I'm somewhat of a Gigabyte loyalist, as I've had a very good experience so far with my 970 G1. Is there really any reason to go for the 1080 Xtreme at $699 over the 1080 G1 at $649?


----------



## pez

Quote:


> Originally Posted by *Malinkadink*
> 
> I'm somewhat of a gigabyte loyalist as i've had a very good experience so far with my 970 g1. Is there really any reason to go for the 1080 Xtreme for $699 over the 1080 g1 for $649?


That premium package is very nice and I'd say you get quite a bit for your money. If you like the look/design of it, that's also a benefit. Surprisingly it grew on me, but I ended up with a G1.

However, there was a rumor of a recall in APAC due to QC issues: fans hitting the plastic 'X', and bent fins, the fins being the bigger problem. You could order one and might get lucky, but overall, people seem to be happier with their G1s so far.


----------



## nexxusty

Quote:


> Originally Posted by *moustang*
> 
> I would never test hardware with Alpha software.
> 
> Alpha software itself is buggy and unstable. That's why it's Alpha software. It's not even stable enough to be tested at a Beta stage yet. That means that if you get a crash you really have no idea if the crash is because of the hardware or the software.


You played it?

It doesn't crash. Ever. Unless you're unstable. It's an "Alpha" by name only.


----------



## Talon2016

@Bastian Nah, my Gaming X goes up to 121% power limit as well. Maybe they're just shipping with different versions of the vBIOS?

I'm running 2100MHz on the core, stable so far. I'll test whether I can go further later today. Oddly enough, last week this was artifacting and unstable when I tried it; artifacts were immediately noticeable and BF4 would crash soon after. Possibly I didn't apply enough voltage, but I tried again and played for a few hours at 2100MHz with stable clocks, no throttling, and zero artifacts. With fans at 50% I sat around 69-70C.


----------



## Setzer

My Gaming X goes up to 121% as well


----------



## Kylar182

Sooo... why are these cards not overclocking well? The best result is at number 39 on Fire Strike Ultra, and it's a 6950X with 4-way 1080s. It's over 2k points lower than my Titan Xs. Are they overrated, or just bad overclockers?


----------



## ChevChelios

From 1750+MHz stock to close to 2100 (and sometimes 2100+) is not bad; that's a 17%+ core OC.

Not as good as the 980 Ti, but not bad.


----------



## bastian

Quote:


> Originally Posted by *Setzer*
> 
> My Gaming X goes up to 121% as well


They must be tweaking the vBIOS then, because I know some people with the X are only able to go up to 107.


----------



## Robilar

Grabbed the Asus FE. With my monitor, I won't really need to overclock. By the time a 1440p version of my monitor is out, I expect the 1080 Ti will be released, and I will upgrade at that point.

http://s1201.photobucket.com/user/RobilarOCN/media/DSC01617_zpssmll0rn2.jpg.html

http://s1201.photobucket.com/user/RobilarOCN/media/1080_zpsxpkseb64.jpg.html


----------






## Benjiw

Quote:


> Originally Posted by *Kylar182*
> 
> Sooo... Why are these cards not overclocking well? The best set is at number 39 on Firestrike Ultra and it's a 6950x and 4 Way 1080. It's over 2k less points than my Titan X's. Are they overrated or just bad overclockers?


Too early to tell yet; people haven't BIOS-modded them that I know of. As for them not overclocking well, they're faster at stock than the cards out now and they overclock to 2GHz in most cases, so I wouldn't say they're bad. Unless you mean extreme LN2 overclocking, but that's a whole different kettle of fish.


----------



## Kylar182

Quote:


> Originally Posted by *Benjiw*
> 
> Too early to tell yet, people haven't bios modded them that I know of? As for them not overclocking well, they're faster stock than cards out now and they overclock to 2ghz in most cases so I wouldn't say they're bad. Unless you mean extreme LN2 overclocking then that's a whole different kettle of fish.


I do not mean LN2; mine are on water and they don't thermal throttle. No one really does LN2 on cards anyhow; it's just not practical even with a good bench. I understand the speed, but if the pipe is small or the drivers suck they'll still be ****. I suppose only time will tell.


----------



## Robilar

True enough, but consider that the vast majority of people buying this card are doing so to actually play games rather than bench; it's faster at stock than my 980 Ti was overclocked. It will last me until the 1080 Ti comes out, at which point I might upgrade again.

Personally, I would never buy cards solely because of their bench scores. I get that it gives us a rough performance idea compared to older-gen cards, but ultimately benching serves no other purpose. Min/max frame rates are all that I am concerned about.


----------



## KillerBee33

Is Pascal BiosTweaker out yet with NVFlash?


----------



## moustang

Quote:


> Originally Posted by *nexxusty*
> 
> You played it?
> 
> It doesn't crash. Ever. Unless you're unstable. It's an "Alpha" by name only.


Just because it doesn't crash that does not mean it's stable or bug free. Ask anyone who has run an SLI rig for any period of time how many games they've found that stuttered or had laggy performance until an SLI fix was released for the game. The games all ran, they didn't crash, but performance was definitely hindered by the game.


----------



## moustang

Quote:


> Originally Posted by *bastian*
> 
> They must be tweaking the vBIOS then, cause I know some people with the X are only able to go up to 107.


Nope, just using a different version of Afterburner than what shipped with the card will allow the 121% power setting. Mine did the same thing right out of the box. I didn't install any of the software that came with the card; I just installed the card and the latest Nvidia drivers, then opened the version of Afterburner I had been using for my 770s, and it let me go up to 121.

As far as I can tell it doesn't make any difference, though. If it was unstable at 107, it was unstable at 121. In fact, I've found no advantage to increasing it over the default 100; my GPU runs just fine at 2100MHz and the 100% power level.


----------



## IF6WAS9

Quote:


> Originally Posted by *bastian*
> 
> They must be tweaking the vBIOS then, cause I know some people with the X are only able to go up to 107.


The gaming bios lets you go to 121% and the OC bios only goes to 107%. So the people reporting 107% have either flashed to the OC bios or are using the MSI Gaming APP to set it to OC mode.


----------



## Kylar182

Quote:


> Originally Posted by *Robilar*
> 
> True enough but when you consider that the vast majority of people buying this card are doing so to actually use it to play games rather than bench
> 
> Personally I would never buy cards solely because of their bench scores? I get that it gives us a rough performance idea compared to older gen cards but ultimately benching serves no other purpose. Min/Max frame rates are all that I am concerned about.


Mmm, I disagree. People buy cards because of Fire Strike or Heaven scores. Honestly, I don't know anyone who doesn't buy a card based on a bench score (including in-game benchmarks). So either the card is **** or the drivers suck. Its overclocking frequency is above anything on the benchmarks now, but it's still lower on the scores and framerates. Something's up, and I really do want to know what. Personally I'm waiting on Pascal Titans, but if Pascal sucks I'd rather just wait it out.


----------



## moustang

Quote:


> Originally Posted by *Setzer*
> 
> How does one go about installing this BIOS from MSI?
> Is it the same old shady deal of using USB drives and weird commands during boot - or is there some software from their side that can do it?


Assuming you have the MSI GTX 1080 Gaming X, you just download the BIOS update and run the .bat file included with it; you can install it from within Windows, no need for anything else.

Read the PDF that comes with the download to see which version you want to install. One boots to Gaming mode by default, the other boots to OC mode. Just run the .bat file, select the mode you want to boot to, and that's it. It does everything else.


----------



## Talon2016

Quote:


> Originally Posted by *IF6WAS9*
> 
> The gaming bios lets you go to 121% and the OC bios only goes to 107%. So the people reporting 107% have either flashed to the OC bios or are using the MSI Gaming APP to set it to OC mode.


This. I flashed my Gaming X to the "Reviewers OC vBIOS" and my power limit was reduced to a 107% max setting in Afterburner. Went back to my stock Gaming X vBIOS and the max setting was 121% again. I also found the review BIOS to be less overclockable; my default vBIOS provides better overclocking, and it's also a newer revision.


----------



## i7monkey

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ *this*. The power limit is killing these cards!!


If the 1080's power limit was increased from 120% to, say, 150%, wouldn't it overload the card and damage it in the long run (or quickly) if cards drew more than 120%?


----------



## Jpmboy

Quote:


> Originally Posted by *Robilar*
> 
> Grabbed the Asus FE. With my monitor, won't really need to overclock. By the time a 1440P version of my monitor is out, I expect the 1080Ti will be released and I will upgrade at that point.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1201.photobucket.com/user/RobilarOCN/media/DSC01617_zpssmll0rn2.jpg.html
> 
> http://s1201.photobucket.com/user/RobilarOCN/media/1080_zpsxpkseb64.jpg.html


ASIC quality is not supported on this card, and the number in your screenshot is incorrect. Update your version of GPU-Z.

Quote:


> Originally Posted by *i7monkey*
> 
> If the 1080's power limit was increased from 120% to say 150%, wouldn't it overload the card and damage it in the longrun (or quickly) if cards used more than 120?


The "%" - like 150% - is a value calculated from the base entry in the BIOS. So if you mean: would increasing the TDP to, say, 350W overload the card? No. The BIOS has hardware triggers that will shut the card down in the event of over-temp. Of course, overclocking any hardware adds stress to the parts, and if you do, you need to be able to manage the increased thermals. This is no different from mods that increased the TDP and PL on every generation preceding Pascal. I've been running SLI Titans at 1.274V and a 475W power limit (+100% over stock) for, what, 2 years? Still very strong cards.
lol - this IS Overclock.net, right?
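To make the percent-to-watts relationship concrete, here's a minimal sketch. The 180W figure is the GTX 1080 FE's spec TDP; the function name and the idea of multiplying a slider percentage against the BIOS base entry are illustrative assumptions, not a dump of the actual vBIOS logic:

```python
# Toy sketch: how a power-limit slider value maps to a wattage cap.
# Assumes a 180 W base TDP (GTX 1080 FE spec); the vBIOS "base entry"
# is what the slider percentage is multiplied against.

def power_limit_watts(base_tdp_w: float, slider_percent: float) -> float:
    """Convert a power-limit slider value (e.g. 121) into a wattage cap."""
    return base_tdp_w * slider_percent / 100.0

print(power_limit_watts(180, 100))  # stock cap: 180.0 W
print(power_limit_watts(180, 121))  # 121% slider: 217.8 W
```

So a "150%" limit on a 180W card would mean a 270W cap, which is why the percentage alone says nothing without knowing the base TDP it is computed from.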
Quote:


> Originally Posted by *Kylar182*
> 
> Mmm, I disagree. People buy cards because of Firestrike or Heaven scores. Honestly I don't know anyone that doesn't buy a card based on a bench score (including in game benchmarks). So either the card is **** or the drivers suck. It's overclocking freq is above anything on benchmarks now but it's still lower on the scores and framerates. Somethings up, I really do want to know. Personally I'm waiting on Pascal Titans but if Pascal sucks I'd rather just wait it out.


Fire Strike on BWE seems bugged, but 3DMark 11 does not. Heaven and Valley are running properly IMO, and in all of these the 1080 is doing better than any previous generation. Don't forget to compare ambient-cooling results to ambient-cooling results. Check 3DMark 11 Performance and Extreme for 1 and 2 cards on the HOF, and also check OCN's own benchmark threads for Valley and Heaven 4.0.


----------



## i7monkey

Quote:


> Originally Posted by *Jpmboy*
> 
> The "%" - like 150% - is a value calculated from the base entry in bios... so if you mean that by increasing the TDP to say.. 350W would that overload the card? No. The bios has hardware triggers that will shut down the card in the event of over temp. Of course, overlocking any hardware adds stress to the parts... and if you do, you need to be able to manage the increased thermals. This is no different than mods that increase the TDP and PL on every generation preceding Pascal.I've been running SLI titans at 1.274V and a 475W power limit (+100% over stock) for what - 2 years? Still very strong cards.
> lol - this IS Overclock.net, right?


What I meant to ask is: do these cards have the power circuitry/capability to run at higher wattage without damaging the hardware (assuming temps and voltage are kept in perfect check)?


----------



## Malinkadink

Anyone running 1080 G1s care to report their OC results? If it can generally hit 2.1GHz with relative ease, I think I'll gladly pick it up over the Xtreme variant, which seems to be having QC problems with fans rubbing and bent heatsink fins.


----------



## Pendulum

Quote:


> Originally Posted by *axiumone*
> 
> Haven't seen anyone take one of these apart, so here you go.
> 
> EVGA ACX 3.0 Tear down.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> See the rest below.


Thanks for the tear down. I've got the same card on the way now.
I have an OC'd EVGA GTX 460 that's been running for 6 years straight in my media rig with 0 issues. Figured I'd give EVGA another go.
I like the way the Strix looks a little better but they are even harder to find in-stock.


----------



## AllGamer

Quote:


> Originally Posted by *Robilar*
> 
> Grabbed the Asus FE. With my monitor, won't really need to overclock. By the time a 1440P version of my monitor is out, I expect the 1080Ti will be released and I will upgrade at that point.


Congrats! Seems like you finally made up your mind. You've been going back and forth for a while in the other thread.

I was almost sure you'd have gone for the ASUS Strix instead of the FE model, so that was a bit surprising, since you never mentioned it in the other thread.


----------



## ValSidalv21

Didn't have much time to play with my card; I've been busy all day doing backups so I can replace my LTSB Windows 10 with the latest Pro. Now I have tons of junk on my PC. I thought that only the latest beta driver required the updated Windows version; guess I was wrong.

Anyway, so far I've managed +190 on the core; I haven't tried to overclock the memory yet. The best Fire Strike Graphics Score I got is 23,746, although for some weird reason the higher the Graphics Score I achieve, the lower the Combined Score gets. It goes from 8,903 with a Graphics Score of 23,381, then 8,826/23,642, then 8,778/23,697, then 8,774/23,703, all the way down to 8,684 for my best Graphics Score. Almost linear.


----------



## wsarahan

Guys,

Finally! Both my ACX 3.0 SC cards will arrive tomorrow. How are you overclocking?

Just like the 900 series, using the core and memory sliders, or using the new OC curve?

If you're using the old method, are you changing the voltage %? If yes, how much are people using overall to stay safe?

Thanks

Sent from my iPhone using Tapatalk


----------



## Jpmboy

Quote:


> Originally Posted by *i7monkey*
> 
> What I meant to say is that do these cards have the power circuitry/capability to run at higher watts without damaging the hardware (assume temps and voltage are in perfect check).


as much as any other reference PCB does. There's only one way to know for sure isn't there...


----------



## wsarahan

And another question: does the ASIC value still make a difference?

People always said to put the highest-ASIC card on top and the lowest on the bottom; should I still do that? My higher-ASIC cards always had a higher boost. Does this still happen with 1080s?

Sent from my iPhone using Tapatalk


----------



## smonkie

Quote:


> Originally Posted by *Klocek001*
> 
> 
> 
> Not a bad OC for my first day of owning a 1080. It boosts to 2100MHz when I jump into a game, then settles around 2063MHz after some time.
> Boost out of the box is almost 1900MHz as long as I can keep it around 70 degrees. BTW, this is TW3 at 1440p with every slider absolutely maxed.
> I was a bit afraid of reference cooling as I always bought AIB, but I'm genuinely surprised by it. 60% is silent, 70% is almost inaudible, 80% is what I'd like to run in OC mode, and 90% is for hot days (loud, but nothing too bad by any means). I expected to hear a jet engine at 100%, but frankly I could live with it; it's only slightly audible when I play games without headphones.
> Above all, the FE cooler has a very pleasant airflow/wind sort of sound, nothing like the roar of a Windforce at high rpm.


How someone could say ANYTHING at ~3000 rpm is "almost inaudible" is beyond me. Sure, we all have different hearing, but c'mon, 3000 rpm is loud.


----------



## Jpmboy

Quote:


> Originally Posted by *wsarahan*
> 
> And another question, the asic value still makes the difference?
> People always said to put the highest asic card on top and the lowest on bottom, should I still do that? My higher asic cards always had a higher boost, this still happens with 1080's?
> Enviado do meu iPhone usando Tapatalk


ASIC is not supported on the 1080.


----------



## wsarahan

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> And another question, the asic value still makes the difference?
> People always said to put the highest asic card on top and the lowest on bottom, should I still do that? My higher asic cards always had a higher boost, this still happens with 1080's?
> Enviado do meu iPhone usando Tapatalk
> 
> 
> 
> ASIC is not supported on the 1080.
Click to expand...

So the boost will be the same on both cards? On my 980 Ti, the higher-ASIC card has a better boost.

Thanks

Sent from my iPhone using Tapatalk


----------



## wsarahan

How much should I start with on core and memory? I want to hit 2000 on the core, and I have no idea on memory. What do you suggest, guys?


----------



## chronicfx

Just got mine straight from NVIDIA. Using 2 ribbon cables atm.



So far just a first push of +112 core and +200 mem. My slider power limit goes up to 120% so I will have to give it a push more than the +50 I have now. I use Astro A50 headphones and have an air conditioner in the window next to me so the fan speed profile of 50c-75c (40%-100% fan) does not bother me. Here is Heaven 4.0 1440p with the first push I stated.


----------



## Robilar

Quote:


> Originally Posted by *AllGamer*
> 
> Congrats! seems like you finally made up your mind
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You've been going back and forth for a while in the other thread.
> 
> I was almost sure you'll have gone for the ASUS Strixx instead of the FE model, that was a bit surprising since you never mentioned it on the other thread.


A couple of things changed my mind. I had a credit at a local store that was going to waste, and they had an Asus FE in stock. Plus, with my current monitor I can easily max out BF4 and still see 200FPS. I plan to eventually grab the 1440p version of the Z35 when it is released, by which time hopefully the 1080 Ti will also be out.

I've been calling every computer company in Toronto for the last two weeks and they have zero stock, as they are all still filling pre-orders.

Plus, I just sold my 980 Ti and would otherwise be without a video card.


----------



## chronicfx

Quote:


> Originally Posted by *Robilar*
> 
> A couple of things changed my mind. I had a credit at a local store that was going to waste and they had an Asus FE in stock. Plus with my currently monitor, I can easily max out BF4 and still see 200FPS. I plan to eventually grab the 1440P version of the Z35 when it is released by which time hopefully the 1080Ti will also be out.
> 
> I've been calling every computer company in Toronto for the last two weeks and they have zero stock as they are all still filling pre-orders.
> 
> Plus I just sold my 980Ti and would be without a video card


Just put both of my 980 Tis (EVGA Hybrid + MSI 6G) on a shelf. Not sure what to do with them... sell them, or give them to my nephews who are visiting next week and hope my brother doesn't try to buy them a Pentium processor to go with them. I'm sure there are a lot of sharks circling to get them cheap, so don't PM me with BS offers, lmao. If I put them up, it will be in the marketplace at a fair price; I'm just not sure what that is now, and I can't go by the money grubbers on eBay. I at least want to get one of these 1080s' worth back if I go that route. Better do that before AMD drops in a few days though, eh?


----------



## Naked Snake

So is it normal that games fire up at 2100MHz or above and then drop to 2088-2063MHz the rest of the time? I don't think the cause is temps; I'm hitting 69C on my card @ 90% fan. But I find it weird that it cannot maintain 2100MHz all the time and the core fluctuates between 2088-2063MHz on my FE; in some games it even drops to 2050MHz for the full session.


----------



## chronicfx

Quote:


> Originally Posted by *Naked Snake*
> 
> So is it normal that games fires at 2100mhz or up and then they drop and run at 2088-2063 Mhz all the time??? I don't think that the cause is because of the temps, I'm hitting 69C on my card @ 90% Fan but I find it weird that it cannot mantain 2100mhz all the time and core fluctuates between 2088-2063mhz on my FE, some games it even drops to 2050mhz for the full game session.


I think certain temperatures are drop points. I found that too; when you approach 70C it drops a bit. Totally normal. You really need water, or to stay in the low 60s all the time, to avoid the throttling.


----------



## Naked Snake

Quote:


> Originally Posted by *chronicfx*
> 
> I think certain temperatures are drop points. I found that too. I think when you approach 70 it drops a bit. Totally normal. You really need water or be in the low 60's all the time to avoid the throttling.


Wow, I didn't think of that, and I guess you're right. I just did a quick test and it actually drops from 2100 to 2088-2076-2063 as the temp changes from 68-69-70-71C. Damn, I don't have any cash to go watercooling right now, haha.


----------



## axiumone

Quote:


> Originally Posted by *Naked Snake*
> 
> Wow, I didn't think of that, and I guess you're right. I just did a quick test and it actually drops from 2100 to 2088-2076-2063 as the temp changes from 68-69-70-71C. Damn, I don't have any cash to go watercooling right now, haha.


It's trickier than just temps too. I've found that even if the temps are stable you can power throttle. Really pushing the GPU up to its tdp at higher resolutions will make it throttle even if your temps are in check.


----------



## jtom320

Question for anyone on water with these.

I've got a 360 and a 240 rad with GTs. The 360 is exhaust, the 240 is intake. Temps are around 49-50 degrees after about an hour and a half of Valley; I've never really seen it go above 50 in a game.

OC'd to 2050. Could probably push that farther.

Anyway my question is if other watercoolers are getting similar temps. My mATX case doesn't make it easy to get great airflow and it's currently negative pressure. Plus my Typhoons are running around 1000 RPM. So I'm fairly certain everything's cool. Just curious though for other watercooled results in general.


----------



## chronicfx

Quote:


> Originally Posted by *jtom320*
> 
> Question for anyone on water with these.
> 
> I got a 360 and a 240 rad with GT's. 360 is exhaust, 240 is intake. Temps are around 49-50 degrees about after about an hour and a half of valley. Never really seen it go above 50 in a game.
> 
> OC'd to 2050. Could probably push that farther.
> 
> Anyway my question is if other watercoolers are getting similar temps. My mATX case doesn't make it easy to get great airflow and it's currently negative pressure. Plus my Typhoons are running around 1000 RPM. So I'm fairly certain everything's cool. Just curious though for other watercooled results in general.


50c is solid temps for sure. I am sure with a larger case and more room they may go down some but you are fine. Those RADS should absorb all these GPU have to throw at it no problem.


----------



## Naked Snake

Quote:


> Originally Posted by *axiumone*
> 
> It's trickier than just temps too. I've found that even if the temps are stable you can power throttle. Really pushing the GPU up to its tdp at higher resolutions will make it throttle even if your temps are in check.


Damn, true. I just finished testing the difference between 1080p, 1440p and 4K while checking at the same temps: at 4K my GPU is at 100-99% load instead of 99-97%, and it's throttling harder.

At least now I know this is normal behavior for this card. My last 970 SLI setup ran a rock-solid OC without any kind of drop at any resolution as long as my temps were good; I don't think I ever hit a TDP limit with those cards (a VRAM limit, sure). Anyway, I'll have to learn to live with this new kind of GPU Boost. It was kind of weird at first, since I've never experienced this behavior with older cards.


----------



## bfedorov11

Quote:


> Originally Posted by *jtom320*
> 
> Question for anyone on water with these.
> 
> I got a 360 and a 240 rad with GT's. 360 is exhaust, 240 is intake. Temps are around 49-50 degrees about after about an hour and a half of valley. Never really seen it go above 50 in a game.
> 
> OC'd to 2050. Could probably push that farther.
> 
> Anyway my question is if other watercoolers are getting similar temps. My mATX case doesn't make it easy to get great airflow and it's currently negative pressure. Plus my Typhoons are running around 1000 RPM. So I'm fairly certain everything's cool. Just curious though for other watercooled results in general.


What are your temps while gaming? I would flip those other fans around and set all the rads as intake. Doesn't that case have a single exhaust fan?

I'm running a similar setup with two 240mm rads and AP15 fans: GPU @ 2150, 6700K @ 4.7, and the GPU temp never breaks 45. My fans will go up to 1850rpm though. Only using a single 120mm 2000rpm exhaust fan.


----------



## Klocek001

I was really hoping for higher.


----------



## pez

Quote:


> Originally Posted by *nexxusty*
> 
> You played it?
> 
> It doesn't crash. Ever. Unless you're unstable. It's an "Alpha" by name only.


Quote:


> Originally Posted by *moustang*
> 
> Just because it doesn't crash that does not mean it's stable or bug free. Ask anyone who has run an SLI rig for any period of time how many games they've found that stuttered or had laggy performance until an SLI fix was released for the game. The games all ran, they didn't crash, but performance was definitely hindered by the game.


This. Regardless of how well it runs for you, an Alpha game is a poor representation of a benchmark unless you're benchmarking performance of the game's own release versions. The engine is probably stable, but it doesn't mean that every component of it is in that particular game.
Quote:


> Originally Posted by *Robilar*
> 
> True enough but when you consider that the vast majority of people buying this card are doing so to actually use it to play games rather than bench, it's faster at stock than my 980Ti was overclocked. It will last me until the 1080Ti comes out at which point I might upgrade again.
> 
> Personally I would never buy cards solely because of their bench scores? I get that it gives us a rough performance idea compared to older gen cards but ultimately benching serves no other purpose. Min/Max frame rates are all that I am concerned about.


Quote:


> Originally Posted by *Robilar*
> 
> A couple of things changed my mind. I had a credit at a local store that was going to waste and they had an Asus FE in stock. Plus with my currently monitor, I can easily max out BF4 and still see 200FPS. I plan to eventually grab the 1440P version of the Z35 when it is released by which time hopefully the 1080Ti will also be out.
> 
> I've been calling every computer company in Toronto for the last two weeks and they have zero stock as they are all still filling pre-orders.
> 
> Plus I just sold my 980Ti and would be without a video card


Knew there was a reason I liked ya. I'm the same way about card purchases; I couldn't care less about the bench scores.

And I just got a 4K panel, so I'm somewhat looking forward to potential SLI....if I don't decide to go mini ITX with my own build.


----------



## jtom320

Quote:


> Originally Posted by *bfedorov11*
> 
> What are temps while gaming? I would flip those other fans around. Set all the rads as intake. Doesn't that case have a single exhaust fan?
> 
> I'm running a similar setup with two 240mm rads and ap15 fans. Gpu @ 2150, 6700k @ 4.7, gpu temp never breaks 45. My fans will go up to 1850 though. Only using a single 120mm 2000rpm exhaust fan.


It really depends on the game. GTA 5 at 4k hit 49.

Thing is, I'm using hard tubing and it gets in the way of the rear fan, so one of the rads has to be flipped. I'm thinking about adding a 120 at the bottom of the case as intake, just to even out air in/air out. I've never had anything with negative air pressure before.

I know my temps aren't bad at all, but if this case were an inch longer I could put in an exhaust, and I think it'd be a pretty big decrease. Just a shame. I'm bigger on the silence thing than I am on the temp thing.


----------



## versions

Sitting here bored as all hell waiting for my EVGA FE to show up. The customer service people at UPS assured me it would arrive yesterday, and of course it didn't, so I'm assuming it will show up today. I emptied the loop yesterday in preparation, and the waterblock from EK showed up (one day earlier with DHL than UPS, despite my ordering it a day later and the block not being in stock when I ordered). Excited to finally get my hands on it now.


----------



## JedixJarf

Quote:


> Originally Posted by *Klocek001*
> 
> 
> 
> I was really hoping for higher.


Lol, 100% would be amazing


----------



## fat4l

Quote:


> Originally Posted by *Klocek001*
> 
> 
> 
> I was really hoping for higher.


Why is it showing such high values all the time ?


----------



## Nizzen

Quote:


> Originally Posted by *fat4l*
> 
> Why is it showing such high values all the time ?


Use the new version, and you'll see why: no ASIC read support for Pascal...


----------



## Bloodymight

Quote:


> Originally Posted by *Naked Snake*
> 
> So is it normal that games fires at 2100mhz or up and then they drop and run at 2088-2063 Mhz all the time??? I don't think that the cause is because of the temps, I'm hitting 69C on my card @ 90% Fan but I find it weird that it cannot mantain 2100mhz all the time and core fluctuate between 2088-2063mhz on my FE, some games it even drops to 2050mhz for the full game session.


My 1080 starts throttling at 50°C already, from 2100 to 2088, and steps down further every 10°C. Above 70°C or 80°C it steps down every 5°C.
I tested it up to 96°C, where my clock was at 1938MHz.

Past certain temperatures it even increased its voltage.

btw I found this post which explains it a bit further but take it with a grain of salt


__
https://www.reddit.com/r/4prnpi/any_point_for_water_cooling_1080_for_overclocking/d4ni6vg
Quote:


> Semiconductor engineer here (not at NVIDIA but we use the exact same process technology at the same FAB).
> 
> Yes, the TSMC 16FF+ process is subject to the standard process slowdown at higher temperatures. This means you should *expect the best OC performance between 25'C and 50'C* junction temp and you will see *rolloff all the way up to the max temp*. Note: this process does have inversion (like all modern ones) so going "super cold" (below 0'C) is actually worse for performance (maybe only relevant for my friends in Finland).
> 
> In other news it is a GREAT process (best we've seen in years) and *it is running faster than expected on our early samples, but with lower deviation.* *This means the "silicon lottery" is less swingy than early in 28nm* (which has now settled down)
> 
> TL;DR : Yes, if you are a real tweaker, you should liquid cool below 50'C for best OC performance.


that might explain why Pascal feels limited at 2100MHz

AMD's Polaris might have similar properties.
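
To picture the stepping described above, here's a toy Python sketch. The ~13MHz bin size and the temperature thresholds are assumptions pulled from the numbers quoted in this thread, not anything official from NVIDIA:

```python
# Toy model of the temperature stepping reported above (NOT official NVIDIA
# behavior): one ~13MHz boost bin lost at 50C, another every 10C up to 70C,
# then one every 5C beyond that.
BIN_MHZ = 13  # approximate size of one GPU Boost bin

def boost_clock(max_clock_mhz, temp_c):
    """Estimate the sustained boost clock at a given core temperature."""
    bins_lost = 0
    if temp_c >= 50:
        bins_lost += 1 + (min(temp_c, 70) - 50) // 10  # one bin per 10C, 50-70C
    if temp_c > 70:
        bins_lost += (temp_c - 70) // 5                # one bin per 5C above 70C
    return max_clock_mhz - bins_lost * BIN_MHZ

for t in (45, 50, 69, 96):
    print(t, boost_clock(2100, t))
```

It lands close to the first step (2087 at 50°C vs the reported 2088) but overshoots the deep-throttle figure at 96°C, so treat it as illustration only.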


----------



## BrainSplatter

Quote:


> it is running faster than expected on our early samples, but with lower deviation. This means the "silicon lottery" is less swingy than early in 28nm


It's the same as what we see in CPUs: it seems the smaller the chip structures, the smaller the OC range (which probably makes sense physically). Sandy Bridge and Ivy Bridge had a wider OC spread than Skylake or Broadwell CPUs.

If you look at the overclocking statistics from siliconlottery.com on the variation in Skylake top clock speed (at the same maximum voltage), you see that the clock-speed range for a given voltage is really pretty small:

4.9GHz: 2%
4.8GHz: 17%
4.7GHz: 59%
4.6GHz: 93%

Translated into Pascal clock speeds, that would be (the chips are obviously not directly comparable, but it gives an idea):
2.20GHz: 2%
2.15GHz: 20%
2.10GHz: 60%
2.05GHz: 93%
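
The percentages above are cumulative bins; as a small sketch (the Pascal clock values are guesses, as stated, not measured data), you can read them as "fraction of chips reaching at least this clock":

```python
# The siliconlottery.com Skylake bins quoted above, plus the speculative
# Pascal mapping (guessed clocks, not measured data). Percentages are
# cumulative: the fraction of chips reaching AT LEAST that clock.
skylake = {4.9: 0.02, 4.8: 0.17, 4.7: 0.59, 4.6: 0.93}
pascal = {2.20: 0.02, 2.15: 0.20, 2.10: 0.60, 2.05: 0.93}

def frac_reaching(table, clock_ghz):
    """Fraction of chips expected to reach at least clock_ghz."""
    hits = [f for c, f in table.items() if c >= clock_ghz]
    return max(hits) if hits else 0.0  # above the highest bin: essentially none

print(frac_reaching(skylake, 4.7))  # 0.59
print(frac_reaching(pascal, 2.10))  # 0.6
```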


----------



## Pandora's Box

Well... Just bought a GTX 1080 (Founders Edition) to replace my Titan X. I am doing a complete PC overhaul, as I want to move to a small-form-factor case with an ITX motherboard. There's no way I can keep my Titan X at its current clock speeds and overvoltage in the ITX case I will be using without the card severely throttling. The HardwareCanucks review of the 1080 is what pushed me over the edge: 30% faster on average at 1440p over a Titan X.


----------



## Jpmboy

Quote:


> Originally Posted by *jtom320*
> 
> Question for anyone on water with these.
> I got a 360 and a 240 rad with GT's. 360 is exhaust, 240 is intake. Temps are around 49-50 degrees about after about an hour and a half of valley. Never really seen it go above 50 in a game.
> OC'd to 2050. Could probably push that farther.
> Anyway my question is if other watercoolers are getting similar temps. My mATX case doesn't make it easy to get great airflow and it's currently negative pressure. Plus my Typhoons are running around 1000 RPM. So I'm fairly certain everything's cool. Just curious though for other watercooled results in general.


with only a UniBlock, I never see gpu core temps above 38C - ever (which is approx +15C over ambient). this is with a single 360 (fat) rad and a 5.0GHz 6700K.


----------



## minisale

Hi guys,

just found the best bios for flash: Inno 3D X3

No 3DMark throttling anymore: a constant 2126MHz during all runs. Power limit peaked at 75% (versus 120% before).

Please re-check:

https://drive.google.com/file/d/0B8SwuRi3m-GcVDBsSHMwSXpQRVU/view?usp=sharing


----------



## TurricanM3

Quote:


> Originally Posted by *minisale*
> 
> Hi guys,
> 
> just found the best bios for flash: Inno 3D X3
> 
> No 3DMark throttling anymore. Constantly 2126MHz during all runs. Powerlimit was max 75% (before 120%)
> 
> Please re-check:
> 
> https://drive.google.com/file/d/0B8SwuRi3m-GcVDBsSHMwSXpQRVU/view?usp=sharing


Must be a cool guy you got that bios from.


----------



## aylan1196

How do you flash the GTX 1080?


----------



## scaramonga

Anyone tried the new *Precision XOC 6.02* yet? Does it still not work correctly with non-EVGA FE cards and exhibit the same errors people have been getting, such as OC Scanner running of its own accord?


----------



## AllGamer

Happy Happy! Joy Joy!









Just placed an order for a pair of these babies:


GeForce GTX 1080 SEA HAWK EK X

Now... what to do with my current GTX 1080?









Contemplating selling it while it's still hot,
or else transplanting it to the Home Theatre rig to use with the HTC Vive (also on back order).


----------



## minisale

https://drive.google.com/file/d/0B8SwuRi3m-GcVDBsSHMwSXpQRVU/view?usp=sharing
Quote:


> Originally Posted by *TurricanM3*
> 
> Must be a cool guy you got that bios from.


Best guy in the world.


----------



## aylan1196

Guys, can I flash my MSI FE with the EVGA FE BIOS? If so, please help.
By the way, I flashed my FE with the GTX 1080 X8 BIOS from the MSI website but can't revert back. Please advise.


----------



## fat4l

What about asus 1080 matrix / evga kingpin ? Full voltage control is what I want


----------



## Jpmboy

Quote:


> Originally Posted by *minisale*
> 
> https://drive.google.com/file/d/0B8SwuRi3m-GcVDBsSHMwSXpQRVU/view?usp=sharing
> best guy of the world.


reference board bios?


----------



## Jpmboy

Quote:


> Originally Posted by *aylan1196*
> 
> Guys can I flash my msi fe withe evga fe bios if so please any help
> By the way I flashed my fe with gtx 1080 x8 bios from msi website but can't revert back pls advise


did you issue the command: _nvflash --protectoff_ ?


----------



## aylan1196

And then, after protect off?


----------



## aylan1196

Please, can you link me to the nvflash version? Thanks


----------



## aylan1196

Help guys


----------



## seabiscuit68

Just got mine last weekend
EVGA GTX 1080 SuperClock ACX 3.0

It is okay so far. I am having the issue where my overclocked monitor won't work after it reboots or goes to sleep. This is highly annoying. Very quiet. Very cool. Good looking card. Plays everything great.

Just curious though - how many of you guys plan on selling these off to get the Ti when it comes out? How much do you think you will be able to get for it?


----------



## aylan1196

Guys, I can't find any nvflash that works properly. I'm screwed; I want to reflash my original BIOS. Please, anyone?


----------



## Outcasst

Quote:


> Originally Posted by *aylan1196*
> 
> Guys I can't find any nvflash to work properly Iam screwed I want to reflash my orig bios plz any one


http://www.overclock.net/t/1601329/gtx-1070-1080-bios-who-has-it/0_100#post_25299351


----------



## Outcasst

Flashed Strix OC BIOS to EVGA Founder's card. No problems so far.


----------



## AllGamer

Quote:


> Originally Posted by *Outcasst*
> 
> Flashed Strix OC BIOS to EVGA Founder's card. No problems so far.


Sweet! Got a link to the ASUS Strix BIOS?









I want to give it a whirl too


----------



## Outcasst

Quote:


> Originally Posted by *AllGamer*
> 
> Sweet!, got a link to the ASUS Strix BIOS?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wan't to give a swirl too


https://www.techpowerup.com/vgabios/183797/asus-gtx1080-8192-160522

Max fan speed is 3600RPM. Equal to 90% on the FE BIOS.

I've actually flashed most of the BIOSes on the TPU site. They all work just fine; however, most of them have very low max fan speeds, so they're useless to me.


----------



## Jpmboy

Quote:


> Originally Posted by *aylan1196*
> 
> and then after protect off
> ?


First disable the video driver in Device Manager.

Then open an elevated command prompt in the folder holding the nvflash.exe file and run:

nvflash --list (lists cards and PCIe IDs)
nvflash --protectoff
nvflash -6 newromname.rom

Exit and restart.

2 cards:

flash commands for 2 card SLI with no PLX chip on the MB:

1) disable SLI
2) open dev manager and disable drivers on both cards
3) open the NVFlash folder:

Win7: shift-rt-click in the folder > "open command window here"
W8/8.1/10: file>open> command prompt as admin

4) type:

nvflash --list (note the GPU PCIe indices - probably 0, 1, 2, etc. unless you have a PLX chip)
nvflash -i1 --protectoff
nvflash -i1 -6 newromname.rom
Hit Y every time asked

5) Type:

nvflash -i0 --protectoff
nvflash -i0 -6 newromname.rom
Hit Y when asked

Exit the command window after the flash finishes

6) enable the drivers in dev manager. DO NOT enable sli yet
7) reboot and enable SLI
8) reboot again

Quote:


> Originally Posted by *aylan1196*
> 
> plz can u link me the nvflash version ty


I posted a link to the one that works a few days ago...

NVFlash_Certs_Bypassed_v5.287_x64.zip 1155k .zip file


----------



## aylan1196

Tried all methods, no use.
Please advise.


----------



## pez

Gigabyte updated the Xtreme Gaming OC app to be quite a bit more functional. It's still far from perfect, but a good deal less atrocious than the first iteration. Looks like they took away OSD (didn't work for me anyways) and there's still no way to expand the monitoring window (attached or detached) like you can in MSI AB.


----------



## Asus11

Quote:


> Originally Posted by *Outcasst*
> 
> https://www.techpowerup.com/vgabios/183797/asus-gtx1080-8192-160522
> 
> Max fan speed is 3600RPM. Equal to 90% on the FE BIOS.
> 
> I've actually flashed most of the BIOS's on TPU site. They all work just fine, however most of them have very low max fan speeds so they're useless to me.


I'm on water, so the fan profiles mean nothing to me, but:

have you noticed better performance? Less throttling?

Also, when flashed, did the clocks basically turn into a Strix card's? lol

Also, did you remove some power limits?

Thanks


----------



## Outcasst

Quote:


> Originally Posted by *Asus11*
> 
> im on water, so the fan profiles mean nothing to me but
> 
> have you noticed better performance? less throttling?
> 
> also when flashed did the clocks basically turn into a strix card lol?
> 
> also did you remove some power limits??
> 
> thanks


Throttling seems about the same; however, the maximum power draw I saw was 114%, whereas on the FE BIOS I was constantly touching 119%. GPU-Z indicated it's still being limited by power at some points.

The Strix OC BIOS is boosting to 2088 down to 2036 on a heaven 4.0 run, and that's stock, no manual overclocking. And it's completely stable.


----------



## Asus11

Quote:


> Originally Posted by *Outcasst*
> 
> Throttling seems about the same, however the maximum power draw I saw was 114% whereas on the FE BIOS I was constantly touching 119%, however GPU-Z indicated it's still being limited by power at some points.
> 
> The Strix OC BIOS is boosting to 2088 down to 2036 on a heaven 4.0 run, and that's stock, no manual overclocking. And it's completely stable.


holy moly









sounds tempting, is the way to flash exactly the same as previous card but with the pascal nvflash?

--protectoff

--backup.rom

-6 newbios.rom

?










think I may give it a go


----------



## Outcasst

If you've got the right version which I linked to a few posts back, all you need to do is run

nvflash -6 strix.rom

then press Y

And Nvflash will take care of everything else. I've never used the protectoff command.

And of course do the backup command if you haven't got a copy of your original BIOS already.


----------



## KillerBee33

Quote:


> Originally Posted by *Outcasst*
> 
> If you've got the right version which I linked to a few posts back, all you need to do is run
> 
> nvflash -6 strix.rom
> 
> then press Y
> 
> And Nvflash will take care of everything else. I've never used the protectoff command.
> 
> And of course do the backup command if you haven't got a copy of your original BIOS already.


Assuming the NVFlash folder with the new BIOS in it is in C:\, at a Command Prompt:
cd C:\nvflash (ENTER)
nvflash "name of BIOS.rom" (ENTER)
"Y" (ENTER)
On Win10, run the Command Prompt not as Admin.
DON'T FORGET TO DISABLE THE GPU IN DEVICE MANAGER BEFORE FLASHING


----------



## Outcasst

Quote:


> Originally Posted by *KillerBee33*
> 
> DONT FORGET TO DISABLE GPU IN DEVICE MANAGER BEFORE FLASHING


Nvflash does that part for you.


----------



## KillerBee33

Quote:


> Originally Posted by *Outcasst*
> 
> Nvflash does that part for you.


I still do it manually, from Kepler days


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> I still use it manually , from KEPLER days


me too. I really gotta ask joedirt to issue a version that does not automatically disable the drivers... it can lead to a borked flash with SLI if you do not disable SLI first and/or have kboost enabled.


----------



## Asus11

Quote:


> Originally Posted by *Outcasst*
> 
> If you've got the right version which I linked to a few posts back, all you need to do is run
> 
> nvflash -6 strix.rom
> 
> then press Y
> 
> And Nvflash will take care of everything else. I've never used the protectoff command.
> 
> And of course do the backup command if you haven't got a copy of your original BIOS already.


rep + 1

just did it, took 2 mins

boosts to 2088 on its own when I just ran valley









will play around more with it haha


----------



## wsarahan

Will install both my ACX 3.0 SC cards today. No way to see which one has the better ASIC and better boost, right? Just install one at a time and check the max boost for each one individually? Thanks


----------



## Cial00

Quote:


> Originally Posted by *Asus11*
> 
> rep + 1
> 
> just did it, took 2 mins
> 
> boosts to 2088 on its own when I just ran valley
> 
> 
> 
> 
> 
> 
> 
> 
> 
> will play around more with it haha


Awesome news. Have u tested a stable OC?


----------



## Asus11

Quote:


> Originally Posted by *Cial00*
> 
> Awesome news. Have u tested a stable OC?


Seems the same; you may be able to get a slightly better OC, like 13MHz more on the core.

The real benefit of this, I think, is the stock speeds: at stock it goes to 2088.


----------



## urbanshaft

got my strix 1080 today
without touching her she gets 2100mhz boost
but goes to 2050 in games




very silent


----------



## Jpmboy

Quote:


> Originally Posted by *Asus11*
> 
> seems the same, you be able to get a slightly better oc, like 13 more on the core
> 
> real benefits to this I think are stock speeds, at stock goes to 2088


power limit _should_ be higher.


----------



## Outcasst

Quote:


> Originally Posted by *Jpmboy*
> 
> power limit _should_ be higher.


It's slightly higher. I just reverted back to the stock BIOS and I'm seeing dips down to 2025MHz, whereas the lowest I saw was 2036 while looping Heaven on the Strix BIOS.


----------



## chronicfx

Quote:


> Originally Posted by *urbanshaft*
> 
> got my strix 1080 today
> without touching her she gets 2100mhz boost
> but goes to 2050 in games
> 
> 
> 
> 
> very silent


That thing is a monster!!! WOW!


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> power limit _should_ be higher.


Quote:


> Originally Posted by *Outcasst*
> 
> It's slightly higher. I just reverted back to the stock BIOS and i'm seeing dips down to 2025MHz whereas the lowest I saw was 2036 while looping heaven on the Strix BIOS.


I'll also echo what Outcasst has said.

It's very slightly higher, and it holds the clocks better, which is a nice addition. But here's the main thing I found out:

I'm on water, and under 50°C I stay at the boost I overclocked to; as soon as I hit 50°C it drops by 13MHz. I'm tempted to use CLU on the GPU - what do you guys reckon? Is it safe enough, and should I see at least a 3°C difference? I'd put thermal pads around the GPU die; the only thing worrying me is whether it could tarnish the die itself. I had no trouble on my CPUs in the past, but I'm not familiar with the GPU die.


----------



## chronicfx

Quote:


> Originally Posted by *Asus11*
> 
> will also echo what outcasst has said
> 
> its very slightly higher, and it holds the clocks better which is a nice addtion but whats most thing I found out
> 
> im on water and under 50c I stay at the boost I overclocked to, as soon as I hit 50c it drops by 13, im tempted to use CLU on the GPU what do you guys reckon? is it safe enough and should I see atleast 3c difference? I know it can run etc but id put thermal pads around the gpu die only thing worrying me is could it tarnish the gpu die itself? on my cpus in the past I had no troubles but not familiar with the gpu die


CLU is an iffy call for a GPU because it goes on so thin; it may cause more non-contact issues than it solves with thermal transfer. You can give it a try, but when I tried it (several times in a single day) on my 7990 I got even worse temps, and I'm pretty good at applying it, having used it about 10 times across three different CPUs with great results. I was more inclined to think my thermal pad thickness was the issue, but that said, your block needs to be "perfect" to get a good mount with liquid metal. Just my own experience: I like thermal paste on GPUs and CLU on my CPUs. The dies are big enough on a GPU to handle the lower heat transfer per cm².

jpmboy and I are both chemists, so I respect his opinion and his ability to look at it from many angles - just a perk from the line of work. If he says give it a go, then by all means. My attempt did not work, but then again I suspected I had a bad thermal pad (the blue one that was different from the rest) that was too thick, even to the extent of calling XSPC about it.


----------



## Asus11

Quote:


> Originally Posted by *chronicfx*
> 
> CLU is an iffy call for a GPU because it is so thin. It may cause more non-contact issues than it solves with thermal transfer. You can give it a try, but I tried it once (several times in a single day) on my 7990 and I got even worse temps and I am pretty good at using it after applying it about 10 times to three different CPU's with great results, of course I was more inclined to think my thermal pads thickness were the issue, but that said your block needs to be "perfect" to get a good mount with liquid metal . Just my own experience, I like thermal paste on GPU's and CLU on my CPU's. The dies are big enough on the GPU to handle the lower heat transfer/cm^2.
> 
> jpmboy and I are both chemists so I respect his opinion and ability to look at it from many angles. Just a perk from the line work. If he says give it a go, then by all means. My attempt did not work, but then again I suspected I had a bad thermal pad (the blue one that was different from the rest) that was too thick even to the extent of calling XSPC about it.


thanks for your input









May stick to the thermal paste then; have Kryonaut on it atm, but I'm 100% sure GELID Extreme is better

maybe ill get some better fans


----------



## Jpmboy

Quote:


> Originally Posted by *Asus11*
> 
> seems the same, you be able to get a slightly better oc, like 13 more on the core
> 
> real benefits to this I think are stock speeds, at stock goes to 2088


I'm less concerned with reported "clocks" or frequencies than with productivity. Does the BIOS actually lead to higher FPS or benchmark scores? For example, going back a ways: flashing the 980 Strix BIOS (by Shammy) to a 980 Kingpin actually ran lower clocks but was much more productive (faster in real tests) due to it being a more efficient BIOS.
Quote:


> Originally Posted by *Outcasst*
> 
> It's slightly higher. I just reverted back to the stock BIOS and i'm seeing dips down to 2025MHz whereas the lowest I saw was 2036 while looping heaven on the Strix BIOS.


what was the best avg FPS from each?
Quote:


> Originally Posted by *Asus11*
> 
> thanks for your input
> 
> 
> 
> 
> 
> 
> 
> 
> may stick to the thermal paste then, have kyronaut on it atm but im 100% sure gelid extreme is better
> maybe ill get some better fans


lol - I'd go by @chronicfx 's recommendation. He's the guy running a million+ dollar NMR with a liquid-helium-cooled superconducting magnet that uses LN2 just to cool the LHe... or, oh, wait... it's a million-dollar mass spec, not the NMR.








(no really)

But: if you already have Grizzly on the die and still see 50C... figure out how to drop the coolant temp somehow (aquarium chillers work wonders). CLU is okay to use with a uniblock; unfortunately, full-cover blocks will have trouble making good contact at CLU's layer thickness.


----------



## wsarahan

Guys, can you clear up a little doubt for me?

I tried both my cards alone, one at a time, in Heaven - no OC, nothing, just to test.

One boosts to 1987 and the other to 1974. Is this difference normal? Should I put the 1987 one on top, then? Both are EVGA ACX 3.0 SC.

Thanks

Sent from my iPhone using Tapatalk


----------



## Asus11

I will do some benchies, but I doubt I'll see anything; I'll do them anyway.

Yeah, I'll stick to Grizzly, and I don't think I'll be adding anything to the PC's watercooling system. If this were just a bench rig I would, but it does pretty damn well off a 240mm rad for both CPU/GPU:
the CPU never goes over 70°C @ 4.9 and the GPU max is 52°C.

Might just upgrade the fans, or run the fans at full for benching but not for gaming, as that would be annoying as hell.









heres my last firestrike http://www.3dmark.com/3dm/12574399

also

best valley was 123.6


----------



## Outcasst

Quote:


> Originally Posted by *Jpmboy*
> 
> what was the best avg FPS from each?


I just did a run on each BIOS: I got 115.1 FPS on the Strix BIOS and 116.3 on the FE BIOS set to the same clocks, at 1080p.


----------



## Asus11

best with FE

http://www.3dmark.com/3dm/12574399

best with strix

http://www.3dmark.com/3dm/12774160


----------



## KillerBee33

Quote:


> Originally Posted by *Asus11*
> 
> best with FE
> 
> http://www.3dmark.com/3dm/12574399
> 
> best with strix
> 
> http://www.3dmark.com/3dm/12774160


4.8 on that 6700...what Voltage?


----------



## Asus11

Quote:


> Originally Posted by *KillerBee33*
> 
> 4.8 on that 6700...what Voltage?


1.35v

also went back to 4.8 to test the strix..

then flash bios back to FE and tried exactly like I did in the strix and it seems FE bios is superior

http://www.3dmark.com/3dm/12774758

v

http://www.3dmark.com/fs/9010837


----------



## Outcasst

Quote:


> Originally Posted by *Asus11*
> 
> 1.35v
> 
> also went back to 4.8 to test the strix..
> 
> then flash bios back to FE and tried exactly like I did in the strix and it seems FE bios is superior
> 
> http://www.3dmark.com/3dm/12774758
> 
> v
> 
> http://www.3dmark.com/fs/9010837


This is my finding as well. FE BIOS is slightly better.


----------



## Jpmboy

Quote:


> Originally Posted by *Outcasst*
> 
> I just did a run on each BIOS, I got 115.1 FPS on the Strix BIOS and 116.3 on The FE BIOS set to the same clocks. 1080p


Quote:


> Originally Posted by *Asus11*
> 
> best with FE
> 
> http://www.3dmark.com/3dm/12574399
> 
> best with strix
> 
> http://www.3dmark.com/3dm/12774160


Thanks guys! +1 to both!

Yeah, some BIOSes are just more efficient and can be more productive at the same or lower clocks: things like rail alignment, timings, interval alignment, etc. Higher frequency is not always better, especially with the hard-coded error correction these new architectures have.


----------



## emett

Anyone else got their cards not downclocking at the desktop? It's not the 144Hz bug, as I have dropped to 119Hz and still have the issue.
Using MSI Afterburner. Cards running at 1695 on the desktop :S


----------



## axiumone

Quote:


> Originally Posted by *emett*
> 
> Anyone else got their cards not down clocking at Desktop? It's not the 144hz bug as I have dropped to 119hz and still have the issue.
> Using MSI Afterburner. Cards running at 1695 for desktop :S


Using more than one screen per chance?


----------



## emett

No just a 1440p. Even dropping to 59hz hasn't stopped it..


----------



## chronicfx

Quote:


> Originally Posted by *emett*
> 
> Anyone else got their cards not down clocking at Desktop? It's not the 144hz bug as I have dropped to 119hz and still have the issue.
> Using MSI Afterburner. Cards running at 1695 for desktop :S


Sorry, I missed this. What is the 144Hz bug?


----------



## emett

Reinstalling drivers fixed it.


----------



## bfedorov11

I tried the strix bios on my evga fe and had a game freeze up at like 1700mhz in a couple minutes. I was using the evga sc bios a few days ago, had a driver crash while using firefox.. never had that happen before. I'm on a pretty fresh w8 install.

Not sure if the software readings are correct, but it seemed like the Strix BIOS provided more voltage. It didn't help with my max OC, so I don't see the point of using it if it generates more heat. The FE BIOS would rarely jump over 1.050V, but the Strix was at 1.06x fairly often.

I'm sticking to my stock bios.


----------



## emett

So, having a drama atm with SLI. When I restart I can select 144Hz, but as soon as I run a 3D program it reverts back to 59Hz. Anyone know about this issue?


----------



## emett

Seems it was due to running 2 different length SLI ribbons.


----------



## steeludder

Quote:


> Originally Posted by *Asus11*
> 
> 1.35v
> 
> also went back to 4.8 to test the strix..
> 
> then flash bios back to FE and tried exactly like I did in the strix and it seems FE bios is superior
> 
> http://www.3dmark.com/3dm/12774758
> 
> v
> 
> http://www.3dmark.com/fs/9010837


Are you still experiencing throttling after putting your 1080 under water?


----------



## Asus11

Quote:


> Originally Posted by *steeludder*
> 
> Are you still experiencing throttling after putting your 1080 under water?


once it hits like 50c it will down clock and go back up and down

and occasionally it will hit the power limit but very rarely


----------



## steeludder

Quote:


> Originally Posted by *Asus11*
> 
> once it hits like 50c it will down clock and go back up and down
> 
> and occasionally it will hit the power limit but very rarely


Downclock at 50C?? Seriously?


----------



## wsarahan

Guys, how are you?

This is the max I achieved on my 2 EVGA SC 1080s.



This makes the boost 2079 on the cores.

I realized that changing the voltage in Afterburner didn't help at all to bump the core and the memory. Is this normal? And is the final result of 2079 OK for an SLI?

Thanks


----------



## Maintenance Bot

Quote:


> Originally Posted by *wsarahan*
> 
> Guys how are you
> 
> This is the max i achieved at my 2 EVGA Sc 1080
> 
> 
> 
> This makes the boost 2079 on cores
> 
> I realized that changing the voltage at afterburner did nothelped at nothing to bump the core and the memos, is is normal? And the final result of 2079 for an sli is ok?
> 
> Thanks


Yes, 2079 is a good OC for SLI. I saw no improvement with voltage adjustment either.


----------



## Jpmboy

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Yes, 2079 is good oc for sli. I saw no improvement with voltage adjustment either.


I'm not convinced the voltage adjustment actually changes the voltage.


----------



## Naked Snake

Is it weird that I've got a better and stable OC with the new EVGA XOC 6.0.2? lol

I've used the new thing with the OC Scanner and it yielded better results than using MSI Afterburner (well, just a little bit better). I'm stable at 2088MHz, rarely dropping to 2076MHz, with EVGA XOC; with Afterburner I was at 2076MHz, dropping to 2063-2050MHz with a manual OC using the sliders.


----------



## moustang

Quote:


> Originally Posted by *wsarahan*
> 
> Guys how are you
> 
> This is the max i achieved at my 2 EVGA Sc 1080
> 
> This makes the boost 2079 on cores
> 
> I realized that changing the voltage at afterburner did nothelped at nothing to bump the core and the memos, is is normal? And the final result of 2079 for an sli is ok?
> 
> Thanks


2079 is quite respectable. Any improvement above that wouldn't be noticeable outside of a benchmark test. You would never see the difference.

And I too have found that changing the voltage made absolutely no difference. The cards are clearly not limited by their voltage when it comes to overclocking, their primary limit is thermal throttling. The lower you can get the temps the more consistent their speed will be. Even a lower clocked card can outperform a higher clocked card if you can keep it cooler.


----------



## Jpmboy

Quote:


> Originally Posted by *moustang*
> 
> 2079 is quite respectable. Any improvement above that wouldn't be noticeable outside of a benchmark test. You would never see the difference.
> 
> And I too have found that changing the voltage made absolutely no difference. The cards are clearly not limited by their voltage when it comes to overclocking, their primary limit is thermal throttling. The lower you can get the temps the more consistent their speed will be. Even a lower clocked card can outperform a higher clocked card if you can keep it cooler.


Once you control the temperatures it's really the Power limit that is holding these cards back. I can keep the core below 23C thru any benchmark or game loop... only to SLAM into the PL and drop clocks by 1 or more bins. This is really annoying!
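
One way to tell the two limiters apart is to export a sensor log (GPU-Z and Afterburner can both log clock, temperature, and power %) and check, at each clock drop, whether power was pegged at the limit or the temperature had crossed a step. A hypothetical Python helper - the thresholds and the made-up sample log are assumptions based on numbers in this thread, not any tool's real export format:

```python
# Classify why a clock drop happened, from (clock_mhz, temp_c, power_pct)
# sensor samples. The thresholds are assumptions based on numbers in this
# thread, not anything read out of GPU-Z/Afterburner themselves.
POWER_LIMIT_PCT = 120.0   # slider maximum on the FE BIOS
TEMP_STEP_C = 50          # first temperature bin reported in this thread

def classify_drops(samples):
    """Return (index, reason) for each sample whose clock fell vs the previous one."""
    reasons = []
    for i in range(1, len(samples)):
        clock, temp, power = samples[i]
        if clock >= samples[i - 1][0]:
            continue  # clock held or rose: nothing to classify
        if power >= POWER_LIMIT_PCT:
            reasons.append((i, "power limit"))
        elif temp >= TEMP_STEP_C:
            reasons.append((i, "temperature step"))
        else:
            reasons.append((i, "other (voltage/utilization)"))
    return reasons

# Made-up log: steady, then a power-limited drop, then a thermal one.
log = [(2100, 23, 110.0), (2088, 23, 120.0), (2076, 51, 105.0)]
print(classify_drops(log))  # [(1, 'power limit'), (2, 'temperature step')]
```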


----------



## wsarahan

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> Guys how are you
> 
> This is the max i achieved at my 2 EVGA Sc 1080
> 
> 
> 
> This makes the boost 2079 on cores
> 
> I realized that changing the voltage at afterburner did nothelped at nothing to bump the core and the memos, is is normal? And the final result of 2079 for an sli is ok?
> 
> Thanks
> 
> 
> 
> Yes, 2079 is good oc for sli. I saw no improvement with voltage adjustment either.

Thanks for helping, guys.

I tested two loops of Firestrike and 3 Heaven Ultra loops. Is that enough?

Sent from my iPhone using Tapatalk


----------



## emett

100fps Witcher 3 on Ultra @1440p. My life is complete.


----------



## Outcasst

Quote:


> Originally Posted by *wsarahan*
> 
> Thanks for helping guys
> 
> I tested two loops at firestrike and 3 heaven ultra loops is that enough?
> 
> Enviado do meu iPhone usando Tapatalk


In my personal experience, I have found GTA 5 to be the best for finding a stable overclock.


----------



## moustang

Quote:


> Originally Posted by *Jpmboy*
> 
> Once you control the temperatures it's really the Power limit that is holding these cards back. I can keep the core below 23C thru any benchmark or game loop... only to SLAM into the PL and drop clocks by 1 or more bins. This is really annoying!


I'm not convinced it's the power limit.

My temps are controlled (under 50C at all times), but I hit the same GPU limit regardless of the power limit being at 100, 107, or 121%. The power setting makes no difference to stability or overclock.

There is definitely something limiting the cards, but I don't believe it's the power setting itself.


----------



## Maintenance Bot

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm not convinced the voltage adjustment actually changes the voltage.


Yeah, kinda thought the same thing. Got an AMP Extreme inbound; maybe it has voltage check points on the PCB. I will check when it gets here.


----------



## chronicfx

Quote:


> Originally Posted by *emett*
> 
> Seems it was due to running 2 different length SLI ribbons.


Mine are two different lengths (only by an inch or so), but I am not having this issue with my XB270HU.


----------



## wsarahan

Quote:


> Originally Posted by *Outcasst*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> Thanks for helping guys
> 
> I tested two loops at firestrike and 3 heaven ultra loops is that enough?
> 
> Enviado do meu iPhone usando Tapatalk
> 
> 
> 
> In personal experience I have found GTA 5 to be the best at finding a stable overclock.

Thanks, I'll try later.

Sent from my iPhone using Tapatalk


----------



## Jpmboy

Quote:


> Originally Posted by *moustang*
> 
> I'm not convinced it's the power limit.
> 
> My temps are controlled (under 50C at all times), but I hit the same GPU limit regardless of the power being at 100, 107, or 121. The power setting makes no difference on stability nor overclock.
> 
> There is definitely something limiting the cards, but I don't believe it's the power setting itself.


Using AB: if you leave the PL at 100% and do a max clock run, then set it to 120% and do the same, you can see the PL cut in on the AB sensor graph (or GPU-Z if you prefer that). I'm not able to post the screenies till this evening...
Quote:


> Originally Posted by *Maintenance Bot*
> 
> Yeah kinda thought the same thing. Got a amp extreme inbound, maybe they got v check points on the pcb, I will check when it gets here.


cool - really looking for that data! Somewhere I have DMM read points for the naked reference card... somewhere.
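For anyone who wants to see the PL cut-in without waiting for screenshots, the same signal can be pulled from the command line. A minimal sketch, assuming a driver with the standard `nvidia-smi` query fields available; the helper names and the 2 W margin are my own, not anything from AB or GPU-Z:

```python
# Sketch: flag power-limit throttling from nvidia-smi CSV samples.
# Samples can be collected with the real query interface, e.g.:
#   nvidia-smi --query-gpu=clocks.gr,power.draw,power.limit \
#              --format=csv,noheader,nounits -l 1

def parse_samples(csv_lines):
    """Parse 'core MHz, draw W, limit W' rows into float tuples."""
    samples = []
    for line in csv_lines:
        clock, draw, limit = (float(x) for x in line.split(","))
        samples.append((clock, draw, limit))
    return samples

def power_limited(samples, margin_w=2.0):
    """True if any sample's draw sits within margin_w of the board limit."""
    return any(draw >= limit - margin_w for _, draw, limit in samples)
```

If the 100% run pins against the limit while the 120% run doesn't, it's the PL that is cutting clocks, which is exactly what the AB sensor graph shows.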


----------



## aylan1196

So guys, what's the best recent BIOS for the GTX 1080 FE?
I can maintain 2050 on both cards in SLI; anything more and the display driver crashes.


----------



## KillerBee33

You Know it!


----------



## Asus11

Quote:


> Originally Posted by *KillerBee33*
> 
> You Know it!


Nice choice, MSI FE, same as mine.

Also, anyone else have EVGA Precision XOC where the manual run button is greyed out?


----------



## wsarahan

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *moustang*
> 
> I'm not convinced it's the power limit.
> 
> My temps are controlled (under 50C at all times), but I hit the same GPU limit regardless of the power being at 100, 107, or 121. The power setting makes no difference on stability nor overclock.
> 
> There is definitely something limiting the cards, but I don't believe it's the power setting itself.
> 
> 
> 
> using AB if you leave the PL at 100% and do a max clock run... then set it to 120% and do the same, you can see the PL cut in in the AB sensor graph (or gpuZ if you prefer that). I'm not able to post the screenies till this evening..
> Quote:
> 
> 
> 
> Originally Posted by *Maintenance Bot*
> 
> Yeah kinda thought the same thing. Got a amp extreme inbound, maybe they got v check points on the pcb, I will check when it gets here.
> 
> Click to expand...
> 
> cool - really looking for that data! Somewhere I have DMM read points for the naked reference card... somewhere.
Click to expand...

Does this mean the power limit is not reaching 120%?

Thanks

Sent from my iPhone using Tapatalk


----------



## JedixJarf

Quote:


> Originally Posted by *wsarahan*
> 
> This means that the power limit is not reaching 120%?
> 
> Thanks
> 
> Sent from my iPhone using Tapatalk


The only way to know is to watch your voltage; if it sits at 1.093 V (I think that's the number), then you are hitting the limit for the Pascal 1080 at this time.


----------



## aylan1196

So when do we expect to see a custom vBIOS for the 1080? Any ETA?


----------



## wsarahan

Guys, can you help me again with one thing?

With my 1080 SLI, when I start a game both cards have the same clock, for example 2079/2079.

After some minutes one card goes to 2063 but the other one stays at 2079. This didn't happen with my 980 Ti SLI rig. Is it normal? If not, what should I do to keep both cards at the same clock?

In MSI Afterburner 4.3 beta 4 the cards are synced.

Thanks


----------



## Robilar

Could the hotter card be throttling? I've had a few SLI setups in the past, and typically the top card runs much hotter than the lower one.

Or the bios on the two cards might not match. Are they exactly the same cards?


----------



## wsarahan

Quote:


> Originally Posted by *Robilar*
> 
> Could the hotter card be throttling? I've had a few SLI setups in the past, and typically the top card runs much hotter than the lower one.
> 
> Or the bios on the two cards might not match. Are they exactly the same cards?


Yep, the BIOSes are the same, at least according to GPU-Z.

The top card is hotter, but not by much, and the card that drops to the lower core clock when they're not equal is the bottom one.

They stay synced for some minutes, and then one holds its clock and the other drops a little, 10MHz maybe.


----------



## Spiriva

Quote:


> Originally Posted by *Jpmboy*
> 
> Once you control the temperatures it's really the Power limit that is holding these cards back. I can keep the core below 23C thru any benchmark or game loop... only to SLAM into the PL and drop clocks by 1 or more bins. This is really annoying!












It's the same for me: with core voltage at 0% or at 100% (or anything in between) the card runs at 1.025 V anyhow. Same for my second card, except that one runs at 1.050 V all the time. Same with the power limit at 0% or 120%.
EVGA FE 1080, both of them.


----------



## wsarahan

Quote:


> Originally Posted by *Spiriva*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> Once you control the temperatures it's really the Power limit that is holding these cards back. I can keep the core below 23C thru any benchmark or game loop... only to SLAM into the PL and drop clocks by 1 or more bins. This is really annoying!
> 
> It's the same for me: with core voltage at 0% or at 100% (or anything in between) the card runs at 1.025 V anyhow. Same for my second card, except that one runs at 1.050 V all the time. Same with the power limit at 0% or 120%.
> EVGA FE 1080, both of them.
Click to expand...

Do you have the issue with different clocks when SLI is active? Just wanna know if it's normal.

Sent from my iPhone using Tapatalk


----------



## kx11

AC Syndicate at 4K 60fps @ ultra is smooth sailing on the 1080 Strix.

here's a video


----------



## wsarahan

Quote:


> Originally Posted by *wsarahan*
> 
> Guys can you help me again with one thing?
> 
> At my 1080 SLI when i start a game both cards have same clock, for example 2079/2079
> 
> After some minutes one card goes to 2063 but the other one stays at 2079, this not happened with my 980TI SLi rig, is it normal? If no what should i so to maintain both cards equal?
> 
> At MSI Afterburner 4.3 beta 4 the cards anre synced
> 
> Thanks


It's weird; if I alt-tab out of the game and apply the overclock again, it goes back to normal.
Some MSI Afterburner or driver bug maybe?

Tks


----------



## Asus11

Quote:


> Originally Posted by *kx11*
> 
> AC syndicate 4k 60fps @ ultra is smooth sailing on 1080strix
> 
> here's a video


NICE VID


----------



## seabiscuit68

Hey - just noticed my EVGA GTX 1080 Superclock has an idle temp of 55+ degrees. I happened to notice this was also posted in a review for the card online. Why is the idle so high for this card? Should I be worried?


----------



## wsarahan

Quote:


> Originally Posted by *seabiscuit68*
> 
> Hey - just noticed my EVGA GTX 1080 Superclock has an idle temp of 55+ degrees. I happened to notice this was also posted in a review for the card online. Why is the idle so high for this card? Should I be worried?


My SC is 33 idle; I'm running SLI and the other one is 35 idle, but I'm using 100% fan speed, so I think you are OK.


----------



## wsarahan

Quote:


> Originally Posted by *wsarahan*
> 
> My SC is 33 idle; I'm running SLI and the other one is 35 idle, but I'm running an aggressive fan curve, 100% even at idle, so I think you are OK.


----------



## seabiscuit68

I actually figured it out: the fan profile keeps the fan at 0% until it hits around 60C. It's warmer because the fan is literally off...


----------



## wsarahan

Quote:


> Originally Posted by *wsarahan*
> 
> It`s weird, if i alt tab at the game and appply the over again it comes to normal
> Some msi afterburner or driver bug maybe?
> 
> Tks


I think I found the solution.

If I bump the core voltage slider +50% in Afterburner, the problem is solved and both cards stay at the same core clock. Is that normal? Any problem with bumping it +50? Thanks


----------



## JedixJarf

Quote:


> Originally Posted by *wsarahan*
> 
> I think i found the solution
> 
> If i bumped +50 the % core at Afterburner the problem is solved and both cards stay with the same core, is it normal? Any problem to bump +50? Tks


No prob at all, just generates more heat. If that's stable, cool. I have it pumped to the max personally.


----------



## wsarahan

Quote:


> Originally Posted by *JedixJarf*
> 
> No prob at all, just generates more heat. If that's stable, cool. I have it pumped to the max personally.


These cards are so cool compared to my old 980 Ti SLI.

EVGA did a good job with these coolers; the top card barely reaches 60C.

But about the voltage: maybe the bottom card needed a little extra juice to stabilize the core, and that's why it's fixed?


----------



## LBear

Will be joining the ranks; got an MSI Gaming X delivered today. Can't wait to get off work and do some benches. Hoping to see a decent leap coming from a 970.


----------



## emett

Quick question: 3930K at stock volts at 4.7GHz, and I'm getting 70C on one or two cores in BF4 with SLI 1080s. The other cores are much lower. Are those temps fine? I've never really monitored CPU temps.


----------



## rv8000

Quote:


> Originally Posted by *seabiscuit68*
> 
> Hey - just noticed my EVGA GTX 1080 Superclock has an idle temp of 55+ degrees. I happened to notice this was also posted in a review for the card online. Why is the idle so high for this card? Should I be worried?


If you have a 144Hz monitor or are running dual monitors, the card isn't going to its lower power state at idle. Check with AB; I suspect your card is sitting around a 1220MHz clock speed at desktop idle. If that's the case, there isn't much, AFAIK, that you can do besides manually turning the fan on or setting up a new curve.
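The performance state itself can be read directly, which makes this check scriptable instead of eyeballing AB. A small sketch; the function and the 500 MHz threshold are illustrative, but `pstate` and `clocks.gr` are real `nvidia-smi` query fields:

```python
# Sketch: decide whether the card is properly idling.
# One reading:  nvidia-smi --query-gpu=pstate,clocks.gr \
#                          --format=csv,noheader,nounits

def is_idle_downclocked(pstate, core_mhz, idle_max_mhz=500):
    """True if the GPU is in a deep perf state (P5 or deeper) AND the
    core clock is low. Being pinned at P0 with ~1220 MHz on the desktop
    is the classic high-refresh symptom described in this thread."""
    level = int(pstate.lstrip("P"))
    return level >= 5 and core_mhz < idle_max_mhz
```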


----------



## emett

Also seeing idle at 1640MHz! That's crazy, with a single 1440p 144Hz monitor.


----------



## Spiriva

Quote:


> Originally Posted by *wsarahan*
> 
> Do you have the issue with different clocks when sli active? Just wanna know if it`s normal
> 
> Sent from my iPhone using Tapatalk


I have both my cards cooled with EK waterblocks; they both stay at 2200MHz all the time while gaming. In benchmark programs I've seen that they can sometimes differ; one can be at 2190 and the other at 2203 for a little while.


----------



## JedixJarf

Quote:


> Originally Posted by *wsarahan*
> 
> This cards are so cold comapriig to my old 980TI SLI
> 
> EVGA made a good job with this coolers, it`s almost not reaching 60C at top card
> 
> But about the voltage, maybe the bottom card needed a little juice to stable the core and that`s why it fixed?


Yep, not all cards clock equally.


----------



## pez

Quote:


> Originally Posted by *wsarahan*
> 
> Guys can you help me again with one thing?
> 
> At my 1080 SLI when i start a game both cards have same clock, for example 2079/2079
> 
> After some minutes one card goes to 2063 but the other one stays at 2079, this not happened with my 980TI SLi rig, is it normal? If no what should i so to maintain both cards equal?
> 
> At MSI Afterburner 4.3 beta 4 the cards anre synced
> 
> Thanks


Quote:


> Originally Posted by *Robilar*
> 
> Could the hotter card be throttling? I've had a few SLI setups in the past, and typically the top card runs much hotter than the lower one.
> 
> Or the bios on the two cards might not match. Are they exactly the same cards?


I'm pretty sure they're supposed to clock-match each other. SLI usually downclocks to match the slowest card for consistency; that had always been the case before, I believe.

Unless something changed with SLI (something with GPU Boost, I'd think), it might just be the readings from AB.


----------



## immortalkings

Quote:


> Originally Posted by *emett*
> 
> Also seeing idle at 1640mhz! Thats crazy.. With a single 1440p 144hz monitor.


Try checking the running background apps; that was my issue before as well. I closed my Corsair Link and the card went back to a 200MHz-ish idle clock with better idle temps.


----------



## hemon

Hi,

I want to buy the GTX 1080, but I would like to know if I really need a second power connector if I OC.

Would my card throttle because of the power limit when overclocked? Would a modded BIOS help with the power throttling?

Cheers.


----------



## Luxer

you can never be too safe


----------



## famich

Quote:


> Originally Posted by *Outcasst*
> 
> In personal experience I have found GTA 5 to be the best at finding a stable overclock.


Me too, do test the OC with this game!


----------



## emett

Hrmm. A reinstall, maybe.
Quote:


> Originally Posted by *Luxer*
> 
> you can never be too safe


Yeah Boy! Welcome to the club.


----------



## fitzy-775

Just picked up a Gigabyte G1 Gaming today and I'm loving it so far. Such a huge difference over the GTX 780 I had.


----------



## gerbil80

Quote:


> Originally Posted by *fitzy-775*
> 
> just picked up a gigabyte G1 gaming today and loving it so far. Such a huge difference over the gtx 780 i had


Very nice, that's the exact route I'm looking to go... did you manage to double your frame-rate?

Also, any coil whine on your 1080 G1 gaming?


----------



## fitzy-775

Yea, I am getting more than double in some games, and no coil whine. I ran Fallout 4 at maxed settings at 1440p and was getting 90+ fps. I am going to have to reinstall GTA5 and run it at max to see what fps I can get.


----------



## pez

I've only noticed coil whine on my G1 with extreme frame rates (e.g. Crysis 3, where I apparently get 3k FPS on the beginning loading screens). At normal game loads and frame rates I'm not hearing any.


----------



## emett

Yeah, I get coil whine at 3k fps too, but I click continue and it goes away.


----------



## scaramonga

Quote:


> Originally Posted by *emett*
> 
> Also seeing idle at 1640mhz! Thats crazy.. With a single 1440p 144hz monitor.


Set the refresh rate to 120Hz; this will keep the card clocked down until NVIDIA fixes the driver problem.


----------



## emett

Nah, doesn't work on one monitor for me. I have the power setting in the control panel on default and all is good after games now. Thanks though.


----------



## barsh90

Quote:


> Originally Posted by *scaramonga*
> 
> Set refresh to 120hz, this will keep the card clocked down, until Nvidia fixes the driver problem.


This problem has been going on since the 980 Ti release last year. If NVIDIA hasn't fixed it by now, I doubt they ever will. Shame...


----------



## scaramonga

Yup.

Single monitor here, 1440p, on Adaptive or Optimal Power (makes no difference):

*144hz*:


*120hz*:


120Hz is also much brighter than 144Hz, something I ain't noticed before?


----------



## emett

Two cards idling at 1695MHz ain't ideal. I'll get rid of them if it comes back and there is no fix in sight.


----------



## emett

120hz brighter? Huh.


----------



## Jpmboy

Quote:


> Originally Posted by *emett*
> 
> Also seeing idle at 1640mhz! Thats crazy.. With a single 1440p 144hz monitor.


Guys, this is not new. 144Hz monitors hold Maxwell and Pascal at the P0 or P1 clock state. Drop to 120Hz and they will idle at P8.


----------



## Setzer

Quote:


> Originally Posted by *fitzy-775*
> 
> Yea I am getting more than double in some games, and no coil whine. I ran fallout 4 maxed setting at 1440p and was getting 90+ fps. I am going to have to reinstall GTA5 and run it at max to see what fps i can get.


My 780 managed ~High settings at 1440p with about 50 fps.
The 1080 runs everything maxed out at 1440p with an average of 100 fps.


----------



## mr2cam

Picked up an MSI 1080 Armor at a Micro Center. On air I was able to hit 2025 on the core; I threw my G10/H105 cooler on it and am now able to hit 2100MHz on the core. Haven't really pushed the memory yet; what are people hitting with the memory on these things? I think I'm at like 10,600 or so.


----------



## Fabse

Just got a mail confirming that my MSI Geforce GTX 1080 Seahawk X has been mailed. Should be picking it up tomorrow. Super excited!


----------



## Shadowdane

Can't wait, I have 2 EVGA 1080 SC cards being delivered today!

Been running on a GTX 960 for the past 4 months, so this is going to be a HUGE upgrade! lol


----------



## Fabse

I'm upgrading from 2x 580's in SLI, so I should feel quite a difference too.


----------



## Crazy G

Yesterday I received a Gigabyte GTX 1080 G1 Gaming. After many hours of benches and Black Death tracks in CloD and modded IL-2 1946, I just couldn't get rid of stutters. It worked fine in benchmarks (3DMark 11, Fire Strike, Valley, Heaven, etc.), no stutters. Different drivers, NVIDIA CP, AB, OC Guru II, the RT fps limiter, all to no avail. This morning I reinstalled my BIOS-modded Titan X and everything got smooth again. The gains in benchmarks were minimal, so I'll return this card. I just couldn't find the cause of these stutters. I'm on a 4790K @ 4.6GHz, 16GB RAM on a Sabertooth Z97 and Win 7 64. I'm puzzled.
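Stutter like this is easier to pin down with frame times than by eye, since averages (and most benchmark scores) hide it. A minimal sketch, assuming you can export per-frame times in milliseconds (FRAPS- or PresentMon-style logs); the 2x-median spike rule is my own heuristic, not a standard metric:

```python
def stutter_ratio(frametimes_ms, spike_factor=2.0):
    """Fraction of frames slower than spike_factor times the median.

    A smooth run stays near 0; visible hitching pushes it up even when
    average FPS looks fine, matching the 'benchmarks OK, games stutter'
    symptom."""
    ordered = sorted(frametimes_ms)
    median = ordered[len(ordered) // 2]
    spikes = sum(1 for t in frametimes_ms if t > spike_factor * median)
    return spikes / len(frametimes_ms)
```

Comparing this number between the 1080 and the Titan X on the same track would make the "it stutters" case concrete.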


----------



## nexxusty

Quote:


> Originally Posted by *Fabse*
> 
> I'm upgrading from 2x 580's in SLI, so I should feel quite a difference too.


ROFL. Yeah, I'd say. Hehe.


----------



## nexxusty

Quote:


> Originally Posted by *Crazy G*
> 
> Yesterday I received a Gigabyte GTX 1080 G1 Gaming. After many hours of benches and Black Death tracks in CloD and modded IL-2 1946, I just couldn't get rid of stutters. It worked fine in benchmarks (3DMark 11, Fire Strike, Valley, Heaven, etc.), no stutters. Different drivers, NVIDIA CP, AB, OC Guru II, the RT fps limiter, all to no avail. This morning I reinstalled my BIOS-modded Titan X and everything got smooth again. The gains in benchmarks were minimal, so I'll return this card. I just couldn't find the cause of these stutters. I'm on a 4790K @ 4.6GHz, 16GB RAM on a Sabertooth Z97 and Win 7 64. I'm puzzled.


Windows 7.

Seriously, why don't you have W10 on that system? There's no good reason not to, so just put W10 on it.

I guarantee your frame times will be much better. W7 is a joke now. I have NO IDEA why people still use it on newer systems.


----------



## Crazy G

Really???? How come the benches were OK? I love Win7.


----------



## nexxusty

Quote:


> Originally Posted by *Crazy G*
> 
> Really???? How come the benches are OK?


Benchmarks are always the same, so there aren't many variables for the PC to deal with. Gaming is completely different.

Also, those games may not play well with Windows 7 now. It could be just the games. Benchmarks usually work on older OSes, though usually with less performance.

Considering it's free, do it. Now. You only have about 1 day before MS stops free upgrades.

At the very least, upgrade so you can get your key and downgrade back to 7 if you wish. Then you will have a hardware ID linked to that PC and can put W10 on it anytime for free in the future.


----------



## fitzy-775

I just got my Gigabyte G1 Gaming 1080 today and I haven't had a problem with it, and I am using Windows 10.


----------



## Crazy G

I see... but then I'll have to reinstall my whole bucketload of stuff. Too lazy to do it now; it will take days to have it all tuned again.


----------



## nexxusty

Quote:


> Originally Posted by *fitzy-775*
> 
> I just got my gigabyte g1 gaming 1080 today and i haven't had a problem with it and I am using windows 10


These cards were literally MADE for W10. Compatibility with older OSes is just that... compatibility. It means you won't get the best performance out of it, but it will work.

Glad to hear your card is fine.
Quote:


> Originally Posted by *Crazy G*
> 
> I see...but then I´ll have to reinstalled all my bucket load of stuff.. too lazy to do it now, will take days to have all tuned again.


Oh I know.... Hehe. I just went through this with my 1080 and new mobo/RAM. It sucks, but man, it will be worth it!

Get some redbulls and get at er! Haha.


----------



## Crazy G

Guess so, but last night I barely slept and I'm quite tired today. Anyway, thanks for the info/tip. I'll wait until the 1080 Ti comes out and will update my whole rig and... WINDOWS 10!!!


----------



## dubldwn

EVGA added a non-OC FTW.



EDIT: Apparently, these DTs are FTWs that didn't quite pass binning. So, for people who want the RGB lighting, dual BIOS, bigger fans, etc. at a lower price.

http://forums.evga.com/1080-FTW-DT-m2506026.aspx#2506032


----------



## mr2cam

Quote:


> Originally Posted by *Crazy G*
> 
> Yesterday I received a Gigabyte GTX 1080 G1 Gaming. After many hours of benchs and Black Death tracks in CloD and IL2 1946 modded, I just couldn´t get rid of stutters. It worked fine on benchmarks, 3DMark 11 and Fire Strike, Valley, Heaven, etc. no stutters. Different drivers, Nvidia CP, AB, OC Guru II, RT fps limiter, to no avail. This morning reinstalled my Titan X BIOS modded and everything got smooth again. The gains on benchmarks were minimal so I´ll return this card. I just couldn´t find out why these stutters. I´m on a 4790k @ 4.6GHz, 16Ghz RAM on a Sabertooth Z97 and Win 7 64. I´m puzzled.


I was getting some stutters in Crysis 3; watching MSI Afterburner, the card was changing clock speeds (throttling) quite a bit. After I put it under water it stopped moving and just stays put at 2050. Overwatch was butter smooth; I'll try Crysis 3 again tonight and report back.


----------



## scaramonga

One also has to remember that the drivers for this card have not matured yet. We have only a couple of driver sets plus a hotfix at present, so let's just see what future versions bring, if anything.


----------



## x7007

I posted this to NVIDIA customer service.

Does anyone here play with a 3D TV and TriDef? Can you please tell me if COD Black Ops III works for you? Windows 10 x64 + 368.51 + GTX 1080 + TriDef 7.0 3DTV used to work fine. Also, GTA V crashes instantly as soon as I start a new game and reach the hostages, ALWAYS at the SAME PLACE and SPOT. I can't figure it out!

Game issue: Call of Duty Black Ops III, fully updated or freshly installed without the newest updates.

Someone else reported the same issue with a 970 in the TriDef forums; he could only get 3D working by using 3D Vision at 720p and using the game's internal renderer to upscale to a higher resolution.
Here is a link to the forum thread: https://www.tridef.com/forum/viewtopic.php?f=2&t=6607

Using the GTX 1080, or the 970 I had before, with TriDef 7.0 (the newest version), any profile that used to work doesn't work with the latest driver.

The game works fine in 2D mode, but as soon as I launch it from the TriDef menu the game goes to a black screen right after the logo, when you need to press Enter to Continue.

What comes after is "Display driver nvlddmkm stopped responding and has successfully recovered", and the clock gets stuck at 753MHz or so until a restart, or until a driver reset (which can be done with CRU 1.2.6) restores the clock and driver as if the system had been restarted.

I get all these errors in the event viewer as soon as "Press Enter to Continue" is shown and the game actually needs to start rendering:

\Device\Video3
NVRM: Graphics TEX Exception on (GPC 0, TPC 3): TEX NACK / Page Fault

\Device\Video3
Variable String too Large

\Device\Video3
Graphics Exception: ESR 0x505a24=0x80000000 0x505a28=0x0 0x505a2c=0x0 0x505a34=0x0

\Device\Video3
Graphics Exception: ESR 0x50da24=0x80000000 0x50da28=0x0 0x50da2c=0x0 0x50da34=0x0

\Device\Video3
NVRM: Graphics TEX Exception on (GPC 1, TPC 3): TEX NACK / Page Fault

\Device\Video3
Variable String too Large

\Device\Video3
NVRM: Graphics TEX Exception on (GPC 2, TPC 3): TEX NACK / Page Fault

\Device\Video3
Variable String too Large

\Device\Video3
Graphics Exception: ESR 0x515a24=0x80000000 0x515a28=0x0 0x515a2c=0x0 0x515a34=0x0

\Device\Video3
Graphics Exception: ESR 0x51da24=0x80000000 0x51da28=0x0 0x51da2c=0x0 0x51da34=0x0

\Device\Video3
NVRM: Graphics TEX Exception on (GPC 3, TPC 3): TEX NACK / Page Fault

\Device\Video3
Variable String too Large

Faulting application name: BlackOps3.exe, version: 0.0.0.0, time stamp: 0x5765991f
Faulting module name: unknown, version: 0.0.0.0, time stamp: 0x00000000
Exception code: 0xc0000005
Fault offset: 0x0000000000000000
Faulting process id: 0x2658
Faulting application start time: 0x01d1d10fcc74a5d3
Faulting application path: C:\Gamez\Call of Duty Black Ops III\BlackOps3.exe
Faulting module path: unknown
Report Id: 674d8a47-c3aa-4a3d-afb3-17bde1758926
Faulting package full name:
Faulting package-relative application ID:

Display driver nvlddmkm stopped responding and has successfully recovered.

It was working in older drivers; I'm not really sure which, because I waited for the GTX 1080 to arrive and 368.39 is the newest driver that installed correctly, and I didn't run the game with the 970 beforehand to check. I just don't want to try all kinds of drivers, but I'm sure one of the older ones works.


----------



## Warrimonk

Quote:


> Originally Posted by *dubldwn*
> 
> EVGA added a non-oc FTW
> 
> 
> 
> EDIT: Apparently, these DTs are the FTWs that didn't quite pass testing. So, for people who want RGB lighting, dual BIOS, lower price, bigger fans, etc.
> 
> http://forums.evga.com/1080-FTW-DT-m2506026.aspx#2506032


So this pretty much confirms that when you buy a higher-end model you are getting a higher-binned card. The lower-end models have the potential to reach higher clock speeds, but not with the stability required for retail release.


----------



## fayzaan

Yea, I agree with you. I have an MSI Gaming GTX 1080, the non-X version (Newegg exclusive), and I can't hit 2000 on the core stable. It has to be around 1960/1970; otherwise I get a driver crash. Do you guys think that with a custom BIOS I will be able to overclock more?


----------



## wsarahan

Guys, how are you?

I'm having an issue here with my 1080 SLI.

Sometimes (almost always) when I launch a game, one of the cards doesn't enter 3D mode, I think; its voltage stays at idle and the boost doesn't happen. I have to alt-tab and change some value in Afterburner to see the card's voltage change and the boost kick in.

Both images were taken while gaming; you can see that one card boosts and the other doesn't. SLI is active, and the voltage doesn't increase either.

What can cause this? Anyone with the same issue with 1080 SLI?

Thanks


----------



## Pandora's Box

Those of you having issues with your card running at high clock speeds on the desktop when running high refresh rate:

Try using DDU and reinstalling the latest driver. I am on 368.51 with a 165Hz 1440p display and my Titan X downclocks to 135MHz.

http://www.wagnardmobile.com/?q=display-driver-uninstaller-ddu-


----------



## Maintenance Bot

Quote:


> Originally Posted by *Pandora's Box*
> 
> Those of you having issues with your card running at high clock speeds on the desktop when running high refresh rate:
> 
> Try using DDU and reinstalling the latest driver. I am on 368.51 on a 165Hz 1440P display and my Titan X downclocks to 135Mhz.
> 
> http://www.wagnardmobile.com/?q=display-driver-uninstaller-ddu-


368.51 cured my desktop screen flickering. I'm running a high refresh rate monitor also.


----------



## Setzer

Quote:


> Originally Posted by *Maintenance Bot*
> 
> 368.51 cured my desktop screen flickering. Im running a high refresh rate monitor also.


Screen flickering/artifacts when logging in and for the next 5 seconds? I get that, but then it stops and is never an issue again until reboot.


----------



## zGunBLADEz

Does anybody know if the EK VGA Supremacy will fit on the Founders without removing the mem/VRM plate from the front or doing heavy modifications to it?

I already have the block but haven't checked yet...


----------



## Maintenance Bot

Quote:


> Originally Posted by *Setzer*
> 
> Screen flickering/artifacts when logging in and the next 5 seconds? I get that, but then it stops and is never an issue until reboot.


Yes, and every time an app was opened or the mouse moved. The issue is all gone since installing 368.51.


----------



## Setzer

I have 368.39 installed through GeForce Experience, and when scanning, it just tells me it's already the latest version.
Where do I find .51?


----------



## Maintenance Bot

It was a hotfix driver.

http://nvidia.custhelp.com/app/answers/detail/a_id/4166/~/geforce-hotfix-driver-368.51


----------



## fishyswaz

Quote:


> Originally Posted by *wsarahan*
> 
> Guys how are you?
> 
> I`m having an issue here with my 1080 SLI
> 
> Sometimes, almost always when i launch a game one of the cards do not enter in 3d mode i think, the voltage stay in idle mode and the boost do not happen, i have to alt tab and change some value at Afeterburner to see the card voltage change enad the boost happen
> 
> This both images are while in gaming, you can see that one card boost the other one not, sli is active, voltage do not increase as well
> 
> What can cause this? Anyone with the same issue with 1080 SLi?
> 
> Thanks


I've got the exact same issue as well (2× SCs in SLI under water). It also happens with benchmarking programs. I've tried DDU, a manual uninstall, and Precision X OC. It happens with and without an overclock. Early days though, I guess, as I've also had inaccurate monitoring with AB.


----------



## scaramonga

Hotfix driver here, on a freshly installed OS, so no need for DDU, but the issue still remains. No clocking down unless at 120Hz.


----------



## Pandora's Box

Quote:


> Originally Posted by *scaramonga*
> 
> Hotfix driver here, on freshly installed OS, so no need for DDU, but issue still remains. No clocking down, unless 120hz.


Are you absolutely sure you are on the driver version you think you are? Under Windows 10, Microsoft can and will automatically install an older version of the GeForce drivers. I've had this happen in the past, and the card would not downclock afterwards. DDU can stop Windows Update from doing this.


----------



## Setzer

Quote:


> Originally Posted by *Maintenance Bot*
> 
> It was a hotfix driver.
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4166/~/geforce-hotfix-driver-368.51


Thanks! But I still get the artifact lines for the first 3-5 seconds after boot.

Also tried going from 165Hz to 120, 100, and 60, and my GPU doesn't downclock at all on the desktop.


----------



## mr2cam

Quote:


> Originally Posted by *Setzer*
> 
> Thanks! But I still get the artifact lines for the first 3-5 seconds after boot.
> 
> Also tried going from 165hz to 120, 100, and 60 - and my GPU doesn't downclock at all when on the desktop


How many monitors?


----------



## Setzer

Quote:


> Originally Posted by *mr2cam*
> 
> How many monitors?


Just this one


----------



## scaramonga

Quote:


> Originally Posted by *Pandora's Box*
> 
> Are you absolutely sure you are on the driver version you think you are? Under Windows 10 microsoft can and will automatically install an older version of GeForce drivers. Ive had this happen in the past and the card would not downclock after. DDU can disable Windows Update doing this.


368.51 set.

Microsoft will do nothing of the sort on my system, as the Windows Update service is the first thing I disable on Win10, amongst other things. I manually update when I see fit, not Microsoft.


----------



## MrPlankton

I'm curious whether anyone has gotten their hands on the ASUS Turbo or MSI Aero versions of the GTX 1080, and how their temps compare to the reference cards.


----------



## spyui

Will we ever see a custom BIOS for the GTX 1080 released to the public? I ask because I want to know which card I should buy for my water-cooled SLI setup.


----------



## Shadowdane

Got my 1080 cards!! Wow, these things run a lot cooler compared to my 980 Ti cards!

Haven't messed much with overclocking yet, but so far this seems stable:
~2050 to 2063MHz core (seems to flip between these two clock speeds)
5508MHz memory

Need to test the memory at a slightly lower speed; a few people have mentioned GDDR5X seems to hit a performance wall at some point, and performance actually starts dropping before you see artifacts.
Also need to see if the core can go higher; the voltage slider in MSI Afterburner didn't seem to do anything. One card hit 1.04v and the other sat at 1.06v for those clock speeds. At 0% or 100%, the voltage didn't change at all.

Only ran through regular 3DMark Firestrike so far (*27,880 Points*)
http://www.3dmark.com/fs/9049109
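That GDDR5X behavior means the score-vs-offset curve peaks and then falls before artifacts ever appear, so it pays to record a benchmark score at each memory step instead of just watching for corruption. A tiny helper to illustrate the idea; purely a sketch, and the names are mine:

```python
def best_memory_offset(scores):
    """Given {memory_offset_mhz: benchmark_score}, return the offset
    with the highest score. Clock added past this point is likely being
    eaten by GDDR5X error correction/retries instead of producing
    frames."""
    return max(scores, key=scores.get)
```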


----------



## emett

Good to see more SLI guys joining the fold. What monitors are you guys running?


----------



## LBear

Any tips for OCing? Went from +25 to +130MHz on the core; the card boosts to 2100 then throttles down to between 2075 and 2050MHz throughout benching. Temps are between 65-69C. Is there a way to prevent the throttling?


----------



## Shadowdane

Quote:


> Originally Posted by *emett*
> 
> Good to see more SLI guys joining the fold. What monitors are you guys running?


Asus ROG Swift PG278Q here

Just did some GTA5, everything at Ultra/V.High @ 1440p with 4x TXAA.

First tried a 100fps limit in Afterburner, and it was a solid 100fps with no dips at all!! Tried a 120fps cap, but in some places it was getting dips below 120fps.

Turned off the fps cap for a bit and it bounced between 100fps and 144fps in some spots with G-Sync turned on.


----------



## chronicfx

Where are the bridges... I need the one-slot EVGA or the three-slot NVIDIA.


----------



## Shadowdane

Quote:


> Originally Posted by *LBear*
> 
> Any tips for OCing? I went from +25 to +130MHz on the core and the card boosts to 2100, then throttles down to between 2075-2050MHz throughout benching. Temps are between 65-69°C. Is there a way to prevent the throttling?


Not until a Pascal BIOS tweaker app is programmed... yeah, my card hits the 120% power limit and starts to downclock slightly. Granted, dropping 30-50MHz is what, a 0.2-0.3fps difference? I'm not even worried about it.


----------



## Shadowdane

Quote:


> Originally Posted by *chronicfx*
> 
> Where are the bridges... I need the one-slot EVGA or the three-slot NVIDIA.


I used 2 ribbon bridges... works fine here.
But I'll probably grab an HB SLI bridge when I can find one.


----------



## blackforce

Quote:


> Originally Posted by *scaramonga*
> 
> 368.51 set.
> 
> Microsoft will do nothing of the sort on my system, as the Windows Update service is first thing I disable on Win10, amongst other things. I manually update, when I see fit, not Microsoft.


Agreed, I did the same thing, and Microsoft has never installed anything, just the updates for Windows.


----------



## Pendulum

I've just been poking around with mine since it's in a backup system currently.
Peak is 2062-2050 stepping down to and holding 2025. Fan profile is on auto - aggressive.
I'd have to say I'm pretty happy with it, breaking 2GHz without it getting hot is all I wanted.

+75 is where I start to lose stability, once I get it into my other system (parts on order) I'll have to tinker with it some more.


----------



## chronicfx

Quote:


> Originally Posted by *Shadowdane*
> 
> I used 2 ribbon bridges... works fine here.
> But I'll probably grab a HB SLI bridge when I can find one.


I am doing the same, but there is some new data from a guy who tested this on Reddit, and from the comments there seems to be some thought that the drivers don't actually give the "need a better bridge" warning, but just fall back to using only one finger anyway. What are your thoughts on this? He does have some data to look at.

https://www.reddit.com/r/4qjhy3/gtx_1080_sli_5k_revisited_this_time_trying_2x_sli/


----------



## kantxcape

Quote:


> Originally Posted by *Spiriva*
> 
> I have both my cards cooled with EK waterblocks; they both stay at 2200MHz all of the time while gaming. In benchmark programs I've seen that they can sometimes differ; one can be at 2190 and the other at 2203 for a little while.


Which models do you have?


----------



## Spiriva

Quote:


> Originally Posted by *emett*
> 
> Good so see more SLI guys joining the fold. What monitors you guys running?


I use an Asus PG348Q with 1080 SLI; it's a pretty near perfect match.








Quote:


> Originally Posted by *kantxcape*
> 
> Which models do you have?


I got two Evga Founders Edition.


----------



## emett

So I am running two same-length SLI ribbons, and every time I load a 3D application my refresh rate drops back to 59Hz, and I need to reboot to reset back to 144Hz.
If I just run 1 ribbon I don't have this issue. I've tried different ribbons and get the same results. HELP ME!!

Any ideas what may be causing this?


----------



## emett

Man, this had better not come down to running a FreeSync monitor. Surely not...
I can foresee that even with the HB bridge I'd have this issue.


----------



## KillerBee33

Ehh....


----------



## Bloodymight

Maybe the OP should consider updating the first post, or let someone else do it!

There isn't even a leaderboard; it has been almost a month since the first post was updated.


----------



## Klocek001

Quote:


> Originally Posted by *LBear*
> 
> Any tips for OCing? I went from +25 to +130MHz on the core and the card boosts to 2100, then throttles down to between 2075-2050MHz throughout benching. Temps are between 65-69°C. Is there a way to prevent the throttling?


That's not throttling; that's how GPU Boost works. Lower temp = higher clocks. When you jump into a game your GPU is at 40 degrees; that's why the boost is so ridiculously high. Once it hits around 70 degrees it stabilizes. The throttling point is above 80 degrees.
Nvidia would have to sell all 1080s with a water block exclusively to prevent this from happening, since a GPU like the 1080 on air cooling will never be able to sustain temps around 50 degrees.
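The temperature-stepping behavior described above can be sketched as a toy model. Everything here is illustrative: the ~13 MHz bin size matches what Pascal owners commonly report, but the temperature thresholds are made-up round numbers, not NVIDIA's actual boost tables.

```python
# Illustrative model of the GPU Boost behavior described above (assumed
# numbers, not NVIDIA's real algorithm): the boost clock steps down in
# ~13 MHz bins as the GPU warms up, with hard throttling only past the
# temperature target.

BIN_MHZ = 13  # GPU Boost 3.0 adjusts clocks in roughly 13 MHz steps


def boost_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Return an estimated sustained clock for a given temperature."""
    if temp_c < 40:        # cold GPU, right after loading a game
        bins_dropped = 0
    elif temp_c < 55:
        bins_dropped = 1
    elif temp_c < 70:
        bins_dropped = 2
    elif temp_c < 83:      # stabilized under sustained load
        bins_dropped = 3
    else:                  # past the temp target: actual throttling
        bins_dropped = 6
    return max_boost_mhz - bins_dropped * BIN_MHZ


# A card that boosts to 2100 MHz cold settles in the 2050-2075 range once
# warm, which is exactly the "throttling" described above.
print(boost_clock(2100, 38))   # cold
print(boost_clock(2100, 67))   # warmed up
print(boost_clock(2100, 85))   # past the temp target
```

In other words, the 2100 → 2050-2075 drop is the boost table walking down a few bins with temperature, not the card hitting its throttle point.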


----------



## FreeElectron

Any information or speculations about the 1080 TI?


----------



## dentnu

Well, I just got my MSI 1080 Gaming X, and it looks like I got a good one: it can do 2139MHz on the core and 5580MHz on the memory, tested for about 45 minutes in Heaven. The card is still on air; I plan on putting a Kraken X41 and G10 bracket on it soon and hope to push it further. Overall I am very happy with it. I was able to hit over 20000 in Fire Strike for the first time with one card. The MSI 1080 Gaming X sure does look nice; I did not think I would like the look of it so much. I can highly recommend it, as from other posts on here it seems most who have the card have been able to hit 2100+ core clocks and +500 memory. Will keep tweaking it and will report back once I install the G10.


----------



## emett

Anyone else here running an MG279Q with SLI 1080s?

Need help; see my posts on the last page please.


----------



## Alwrath

ZOTAC AMP EXTREMES ARE ON NEWEGG, $730 a pop. I just picked one up; got tired of waiting for the FTW I pre-ordered on Amazon. Plus it's my birthday today, so f it.









Hopefully it can OC. The cooler should keep this triple slot beast cool.


----------



## skline00

Alwrath, HAPPY Birthday! :thumb:


----------



## Alwrath

Quote:


> Originally Posted by *skline00*
> 
> Alwrath, HAPPY Birthday! :thumb:


hehe thanks man.


----------



## Maintenance Bot

Quote:


> Originally Posted by *Alwrath*
> 
> ZOTAC AMP EXTREMES ARE ON NEWEGG, $730 a pop. I just picked one up; got tired of waiting for the FTW I pre-ordered on Amazon. Plus it's my birthday today, so f it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully it can OC. The cooler should keep this triple slot beast cool.


Hey happy b-day.

Just got mine yesterday. It is a large GPU. I'm getting temps in the low 50s Celsius at 95% fan speed.




----------



## gerbil80

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Hey happy b-day.
> 
> Just got mine yesterday. It is a large GPU. I'm getting temps in the low 50s Celsius at 95% fan speed.
> 
> 


Any noticeable difference in OC results between these?


----------



## fat4l

Sooo nice....
http://shop.aquacomputer.de/product_info.php?products_id=3406





Plus an active backplate cooling the back of the VRM with a heatpipe!


----------



## Maintenance Bot

Quote:


> Originally Posted by *gerbil80*
> 
> Any noticeable difference in OC results between these?


So far, 2114 for the Zotac. The FE I have won't go much over 2000MHz even on a good day.


----------



## boredgunner

Quote:


> Originally Posted by *Maintenance Bot*
> 
> So far, 2114 for the Zotac. The FE I have won't go much over 2000MHz even on a good day.


Nice job with the Zotac. 2088 is my max for the MSI ARMOR OC, though temps usually drop it to 2063.


----------



## wsarahan

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LBear*
> 
> Any tips for OCing? Went from +25 to +130mhz on the core and the card boost to 2100 then throttles down between 2075-2050mhz thoughout benching. Temps are between 65-69c. Is there a way to prevent throttle?
> 
> 
> 
> That's not throttling, that's how gpu boost works. Lower temp = higher clocks. When you jump into a game your gpu is at 40 degrees, that's why the boost is so ridiculously high. Once it hits around 70 degrees it stabilizes. Throttling point is above 80 degrees.
> Nvidia would have to sell all 1080s with a water block exclusively to prevent this from happening since a GPU like 1080 on air cooling will never be able to sustain temps around 50 degrees.

Yep, Boost 3.0 works completely differently from the 2.0 used up through the 980s; the clock comes down much more easily than before.

I'm reaching 2089/5400 on my SLI rig. This is a good OC, I think, considering that OCing in SLI is way more difficult.

Sent from my iPhone using Tapatalk


----------



## Maintenance Bot

Quote:


> Originally Posted by *boredgunner*
> 
> Nice job with the Zotac. 2088 is my max for the MSI ARMOR OC, though temps usually drop it to 2063.


It will bench all day at 2114, but typical game clocks seem to be about 2063 like yours. It is nice not having any power-limit clock shifting.


----------



## boredgunner

Quote:


> Originally Posted by *Maintenance Bot*
> 
> It will bench all day at 2114, but typical game clocks seem to be about 2063 like yours. It is nice not having any power-limit clock shifting.


Just noticed we have almost the same PC: 6700K with an AIO, probably the exact same RAM, same base GPU, almost the same monitor, Sound Blaster sound card, Corsair Carbide case, 80 Plus Platinum certified PSUs. A benchmark comparison would be interesting.


----------



## fireyfire

I see a lot of people asking for recommendations for non-reference cards. Go for the Zotac AMP! or AMP! Extreme cards. My AMP! overclocks quite well on water (NZXT Kraken G10), and I am able to get 2202MHz, though it drops down to 2190 after long loads. So if you want a great OC, go for the Zotac non-Founders cards.

Edit: Included images of clock speed and curve.


----------



## Maintenance Bot

Quote:


> Originally Posted by *boredgunner*
> 
> Just noticed we have almost the same PC. 6700k with AIO, probably the same exact RAM, same base GPU, same monitor almost, Sound Blaster sound card, Corsair Carbide case, 80 Plus Platinum certified PSUs. Benchmark comparions would be interesting.


Hey nice catch








Quote:


> Originally Posted by *fireyfire*
> 
> I see a lot of people asking for recommendations for non reference cards. Go for the Zotac AMP! Or AMP! Extreme cards, my AMP! Overclocks quite well on water (NZXT Kraken G10) and I am able to get 2202 MHZ, but it drops down to 2190 after long loads. So if you want a great OC go for the Zotac non founders cards.
> 
> Edit: Included images of clock speed and curve.


Were you able to gain more speed by adjusting the voltage/frequency curve? I have yet to try that.


----------



## fireyfire

Yes, that's what I did; I was only able to manage about 2140 without adjusting the curve. I mainly adjusted the offsets at the high end of the curve; I didn't waste my time on the low end and left that stock.


----------



## AllGamer

Very nice!

It's good to have additional choices on top of EK, XSPC, and Koolance.

Quote:


> Originally Posted by *fat4l*
> 
> Sooo nice....
> http://shop.aquacomputer.de/product_info.php?products_id=3406
> 
> 
> 
> 
> 
> + active backplate cooling the back of the vrm with heatpipe!


----------



## skline00

Nice to see all of these new water blocks. I'm using the standard EK GTX 1080 acrylic/copper block with my Zotac GTX1080 FE (only one available at the time!). Very pleased. Absolutely no thermal throttling.


----------



## glnn_23

I've started playing around with the power curve instead of manual input on my EVGA FE with an EK block. I'm definitely able to hit higher frequencies this way. Running Fire Strike Extreme, I've hit 2227MHz at a max temp of 34°C.


----------



## versions

Saw some posts about the card not clocking down in idle with 144Hz monitors. I have an MG279Q as well as a 2560x1440 60Hz monitor with one GTX 1080. Mine clocks down in idle.


Quote:


> Originally Posted by *glnn_23*
> 
> I've started playing around with the power curve instead of manual input on my EVGA FE with an EK block. I'm definitely able to hit higher frequencies this way. Running Fire Strike Extreme, I've hit 2227MHz at a max temp of 34°C.


Could you explain how you went about it? I haven't played a lot with the curve yet, but I wasn't able to get any higher than with an offset; I might've done it wrong though.


----------



## glnn_23

A good place to start is here.

http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,3.html

After that, it is a matter of fine-tuning the frequency to the power point. Using driver 368.51 helps too.


----------



## KillerBee33

Quote:


> Originally Posted by *glnn_23*
> 
> A good place to start is here.
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,3.html
> 
> After that it is a matter of fine tuning the frequency to the power point. Using driver 368.51 helps too.


Power @ 110%, Temp Limit @ 84°C, Core +220, Mem +350. No need to mess with voltage just yet.


----------



## versions

Quote:


> Originally Posted by *glnn_23*
> 
> A good place to start is here.
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,3.html
> 
> After that it is a matter of fine tuning the frequency to the power point. Using driver 368.51 helps too.


Thanks for the link. The new driver actually did seem to help a bit. I keep bumping into the power limit all the time, so I hardly ever get close to 1.1V. Hopefully there'll be a custom BIOS eventually so this annoying limit can be bypassed.


----------



## Boweezie

I was finally able to order my EVGA GTX 1080 FTW from EVGA's website!!! Now the waiting begins...


----------



## dentnu

I have been running 3DMark with my 1080 and have been seeing some really strange results. Every time I run 3DMark at the same clocks, my final score differs by about 100 to 200 points. I have never seen this before; all my other cards would always show the same scores, with maybe a slight difference of 20 to 50 points. Is anyone else experiencing this issue? Is this normal? I looked at the clocks and they are not downclocking. I've been beating my head against a wall trying to figure out what is wrong. Any help or advice would be greatly appreciated, thanks.
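For what it's worth, a 100-200 point swing on a roughly 20,000 point score is under about 1% run-to-run variation, which boost clocks bouncing between bins can easily explain. A quick sketch with hypothetical scores:

```python
# Put dentnu's swings in perspective: how big are 100-200 point swings
# relative to a ~20,000 point Fire Strike score? The scores below are
# hypothetical, chosen to match the reported spread.
from statistics import mean, pstdev

scores = [20110, 20240, 19980, 20190, 20060]

avg = mean(scores)
spread = max(scores) - min(scores)          # worst-case run-to-run gap
cv_pct = pstdev(scores) / avg * 100         # std dev as a % of the mean

print(f"average:    {avg:.0f}")
print(f"max spread: {spread} points ({spread / avg * 100:.2f}%)")
print(f"std dev:    {cv_pct:.2f}% of the mean")
```

On these numbers the worst-case gap is about 1.3% and the standard deviation well under 1%, i.e. within normal noise for a boost-clocked card rather than a sign something is wrong.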


----------



## Outcasst

I have been playing around with the curve in MSI Afterburner. I didn't have much success in getting a higher clock speed, BUT I was able to practically eliminate the downclocking.

On the curve graph, you just need to change one of the points. For example, I chose the point at 1025mV and changed its value to 2063MHz.

Heaven has been looping for 10 minutes now, with no throttling at all and a max power usage of 113%. I'm thinking that as long as you can find a stable overclock that doesn't reach 118-119% power usage, you're not going to see any throttling at all.

The only downside of this is that your maximum clock isn't going to be as high, but on the other hand you aren't going to see clock speeds bouncing around.


----------



## Jpmboy

Pascal is capable of some pretty robust error correction, so it's probably best to actually assess performance/productivity at high clocks. You may find that a stable 2100+ core is actually giving lower FPS than a lower clock would (on ambient cooling; cryogenic cooling has effects on signal margins). Just my two cents.


----------



## Outcasst

Quote:


> Originally Posted by *Jpmboy*
> 
> Pascal is capable of some pretty robust error correction, so probably best to actually assess performance/productivity at high clocks. You may find that a stable 2100+ core is actually giving lower FPS than a lower clock would (ambient cooling. cryogenic cooling has effects on signal margins). Just my


Seems to be the case. I just ran some benchmarks using the curve compared to just putting in +170, and even though the clocks are higher on the curve, the +170 is getting about 3fps more. Interesting.
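This matches Jpmboy's point above: with Pascal's error correction, the sensible way to pick an offset is by measured FPS, not by the highest clock that survives. A trivial sketch, with hypothetical benchmark numbers:

```python
# Pick the core offset by measured FPS rather than by clock speed, since a
# higher clock can silently lose performance to error correction. The
# benchmark results below are hypothetical.

def best_offset(results):
    """results: {offset_mhz: avg_fps}. Return the offset with the best FPS."""
    return max(results, key=results.get)


bench = {
    +130: 97.2,
    +150: 98.1,
    +170: 99.4,   # fastest measured result...
    +190: 98.6,   # ...even though +190 held a higher clock on paper
}
print(best_offset(bench))  # 170
```

The takeaway is to benchmark each candidate offset with the same scene and settings, then keep whichever offset scores highest, regardless of what the clock readout says.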


----------



## Naked Snake

Quote:


> Originally Posted by *Outcasst*
> 
> I have been playing around with the curve in MSI Afterburner. Didn't have much success in getting a higher clockspeed BUT i was able to practically eliminate it downclocking.
> 
> On the curve graph, you just need to change one of the points. For example, I chose the point at 1025mV. I changed the value to 2063MHz.
> 
> Heaven has been looping for 10 minutes now, with no throttling at all and a max power usage of 113%. I'm thinking as long as you can find a stable overclock that doesn't reach 118-119% power usage then I don't think you're going to see any throttling at all.
> 
> The only downside of this is that your maximum clock isn't going to be as high, but on the other hand you aren't going to see any clock speeds bouncing around.


I'm always stable in Heaven; I mean my clocks are stable at 2114MHz all the time without changing, as long as I can maintain low temps with 100% fan. But in games everything changes. I've actually tried what you just said in your post, but sadly it's not working for me; I still hit, for example, 2063MHz like you do, but in games I now bounce from 2050 to 2063MHz. It's the same with my old OC, bouncing from 2114MHz to 2088MHz. Kind of hating myself for not waiting for an EVGA Hybrid 1080 lol
Quote:


> Originally Posted by *Jpmboy*
> 
> Pascal is capable of some pretty robust error correction, so probably best to actually assess performance/productivity at high clocks. You may find that a stable 2100+ core is actually giving lower FPS than a lower clock would (ambient cooling. cryogenic cooling has effects on signal margins). Just my


So should we test the FPS gains from the core OC like we do with the mem OC? For example, if I gain about 5fps with the mem at 463MHz but games feel choppy, and at 400MHz they stutter less but I only gain 4fps, while at stock memory I don't see any stuttering but lose all my FPS advantage.


----------



## emett

Quote:


> Originally Posted by *versions*
> 
> Saw some posts about the card not clocking down in idle with 144Hz monitors. I have an MG279Q as well as a 2560x1440 60Hz monitor with one GTX 1080. Mine clocks down fine in idle.


Hi, that was me having that issue, and I have an MG278Q. My issue was fixed with a clean install.
I have another drama that won't go away: when running 2 SLI ribbons my PC will boot @ 144Hz, but after opening any 3D app it will revert back to 59Hz, and then I need to reboot to get back to 144Hz.
Anyone have any insight into this?


----------



## RJacobs28

Just got 2 Asus Strix 1080s, and I can't hit Run on EVGA Precision X's OC Scanner page. Any idea what I'm doing wrong?


----------



## Asus11

Quote:


> Originally Posted by *RJacobs28*
> 
> Just got 2 Asus Strix 1080s and I can't hit Run in EVGA Precision X's OC Scanner page? Any idea what I'm doing wrong?


Same here. I think it's only for EVGA cards, and a serial number may have to be given with your email address to enable it. I'm not 100% sure though; if someone knows more about it, please do chime in.


----------



## Naked Snake

Quote:


> Originally Posted by *RJacobs28*
> 
> Just got 2 Asus Strix 1080s and I can't hit Run in EVGA Precision X's OC Scanner page? Any idea what I'm doing wrong?


As far as I know from some guys on Reddit, you cannot use it with non-EVGA cards; you can do it manually but cannot run it in automatic mode. I have no problem with it on my EVGA FE.

PS: and yes, it did ask me for my card's serial; I registered it on the EVGA webpage, and everything is working fine with the OC Scanner in version 6.0.2.


----------



## Crazy G

Quote:


> Originally Posted by *dentnu*
> 
> I have been running 3dmark with my 1080 and have been seeing some really strange results. Everytime I run 3dmarks at the same clocks my final score is always differnt by about 100 to 200 points. I have never seen this before all my other cards would always show the same scores with maybe a slight differnce of 20 to 50 points. Is anyone experiencing this issue is this normal. I looked at the clocks and they are not downclocking been beating my head against a wall trying to figure out what is wrong. Any help or adivse would be greatly appreciated thanks


SSDs don't much like running benchmarks over and over.


----------



## RJacobs28

Thanks guys!


----------



## dentnu

Quote:


> Originally Posted by *Crazy G*
> 
> SSDs don't much like running benchmarks over and over.


Wow, so you're saying that my SSD is responsible for the score difference? Interesting.


----------



## Crazy G

When SSDs became more popular (cheaper), I read an interview about how running the same benchmark to excess dropped results.

BTW, I have an EVGA Titan X SC as well with a modded BIOS. Did you notice a big difference with the 1080?


----------



## RJacobs28

All sorts of weird behaviour at the moment, but I realise it's very early days.
I'm seeing horizontal flickering, and with G-Sync disabled I can't have my panel above 120Hz or the image is filled with faint horizontal lines.
Never had any issues with 2 980s, but these 1080s are playing up.









Here's to early adoption, eh?!

Edit: Probably doesn't help that the HB Bridge is non-existent in Australia.


----------



## Asus11

Quote:


> Originally Posted by *RJacobs28*
> 
> All sorts of weird behaviour at the moment, but I realise it's very early days.
> I'm seeing horizontal flickering, and with G-Sync disabled I can't have my panel above 120Hz or the image is filled with faint horizontal lines.
> Never had any issues with 2 980s, but these 1080s are playing up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heres to early adoption eh?!
> 
> Edit: Probably doesn't help that the HB Bridge is non existent in Australia.


http://nvidia.custhelp.com/app/answers/detail/a_id/4166/~/geforce-hotfix-driver-368.51


----------



## Alwrath

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Hey happy b-day.
> 
> Just got mine yesterday. It is a large gpu. Im getting temps in the low 50's celsius at 95% fan speed.
> 
> 


Is that the in-game load temp? Man, I can't wait till this card gets here on Tuesday next week. I have the day off, and I will definitely post the max in-game OC I can get out of this bad boy, and temps.

I may end up putting some Arctic Silver on it to see if temps go down at all. This card should be the best you can get, air-cooling wise.


----------



## RJacobs28

Quote:


> Originally Posted by *Asus11*
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4166/~/geforce-hotfix-driver-368.51


You sir, are a god amongst men.


----------



## fat4l

Installation of the WB





I really like the shapes... even more than the EK blocks.


----------



## versions

Quote:


> Originally Posted by *fat4l*
> 
> Installation of the WB
> 
> 
> 
> 
> 
> I rly like the shapes...even more than EK blocks


I have an Aqua Computer block for my R9 290X, looks the same as this one. It is a very heavy water block, much heavier than the EK blocks.


----------



## fat4l

Quote:


> Originally Posted by *versions*
> 
> I have an Aqua Computer block for my R9 290X, looks the same as this one. It is a very heavy water block, much heavier than the EK blocks.


Exactly. And the active-cooling backplate is also nice!
I had it on my 295X2, and the card ran soooo cool.

The only problem is, they don't seem to be doing blocks for AIB cards. I'm planning on getting the EVGA 1080 FTW!


----------



## Maintenance Bot

Quote:


> Originally Posted by *Alwrath*
> 
> Is that in game load temp? Man I cant wait till this card gets here on Tuesday next week. I have the day off and will definetly post the max in game oc I can get out of this bad boy and temps
> 
> 
> 
> 
> 
> 
> 
> I may end up putting some arctic silver on it to see if temps go down at all. This card should be the best you can get air cooled wise.


Yeah, about 53-54°C. It would be a wise decision to redo the stock TIM.


----------



## manurap46

GTX 1080 iChill X3, 2177/2800MHz


----------



## emett

So it seems I can't run the HB bridge with my CPU overclocked to 4.7.
This overclock has been stable for 3 years. I don't get a crash, but the refresh rate drops to 59Hz on my monitor.


----------



## axiumone

Quote:


> Originally Posted by *emett*
> 
> So it seems can't run the HB with my CPU over clocked to 4.7.
> This overclock has been stable for 3 years. I don't get a crash but the refresh rate drops to 59hz on my monitor.


Ran it fine with a 6700K @ 4.8. 165Hz across 5 displays and no crashes.


----------



## emett

I think it's something between the 3930K being an older-gen CPU, running 2133 RAM, and forcing PCI-E 3.0.
No matter the volts I throw at it, the issue is still there.
The 4.7 CPU overclock is stable at a -0.010 offset.

So I'm just running one SLI ribbon atm; I would like to sort it out though.


----------



## S4ch4Z

This is where playing with the voltage curve in Afterburner got my EVGA FE with the stock BIOS in Heaven Extreme.
But it can't hold these clocks constantly, thanks to this ridiculous power limit restriction: I'm getting a better score at 2100 flat...
This card really deserves a custom BIOS!


----------



## Shadowdane

These 1080 cards are a beast! They really need a custom BIOS though; clearly power limited, one of my cards is bouncing off the 120% power limit. It's really looking like a single 8-pin isn't enough for these cards. Granted, I'd hope that with a custom BIOS I can squeeze some additional juice from the 8-pin, maybe close to 170-175W? All I want is clocks that stay stable over 2GHz; currently they bounce up and down quite a bit.

Both cards sit around 1.02v to 1.06v (it fluctuates a bit), and clocks sit between 2000MHz and 2063MHz. I tried to push higher than a +100MHz offset and get a driver crash. The memory had no problem pushing +500MHz for 11GHz! I tried messing with the clock/voltage curve editor, but it didn't seem to do anything; the cards just ran at stock clocks when I tried the curve option.


http://www.3dmark.com/fs/9087337

[edit]
Looks like I need to do more testing; I think my memory overclock is too high... got corrupted textures after gaming for about ~20-25 minutes.
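The single-8-pin worry checks out on paper: by spec the slot plus one 8-pin delivers 225W, while the FE's 180W board power at the 120% slider already asks for 216W. A quick sketch of that arithmetic (spec ratings, not measured draw):

```python
# Rough power-budget arithmetic behind "a single 8-pin isn't enough".
# These are the connector spec ratings and the GTX 1080 FE's rated board
# power, not measurements from any particular card.

SLOT_W = 75          # PCIe x16 slot, per spec
EIGHT_PIN_W = 150    # one 8-pin PCIe power connector, per spec
TDP_W = 180          # GTX 1080 FE rated board power
POWER_LIMIT = 1.20   # max power-limit slider on the reference BIOS

available = SLOT_W + EIGHT_PIN_W      # 225 W of spec budget
max_allowed = TDP_W * POWER_LIMIT     # 216 W at the 120% limit

print(f"spec budget:   {available} W")
print(f"120% limit:    {max_allowed:.0f} W")
print(f"headroom left: {available - max_allowed:.0f} W")
```

So a custom BIOS asking for ~170-175W through the 8-pin alone would still be within the connector's 150W spec plus the slot's 75W, but there is almost no headroom left above the existing 120% limit without exceeding the spec budget.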


----------



## boredgunner

Some impressive results just from overclocking my GTX 1080.

Stock:
http://www.3dmark.com/3dm11/11367101

Overclocked (typically 2063 MHz core, memory is 11016 MHz):
http://www.3dmark.com/3dm11/11378256

GTX 980 Ti at 1488 MHz core:
http://www.3dmark.com/3dm11/11342689


----------



## ChevChelios

Quote:


> Originally Posted by *boredgunner*
> 
> Some impressive results just from overclocking my GTX 1080.
> 
> Stock:
> http://www.3dmark.com/3dm11/11367101
> 
> Overclocked (typically 2063 MHz core, memory is 11016 MHz):
> http://www.3dmark.com/3dm11/11378256
> 
> GTX 980 Ti at 1488 MHz core:
> http://www.3dmark.com/3dm11/11342689


So, 25% faster than a 1488MHz 980 Ti.


----------



## Jpmboy

Quote:


> Originally Posted by *ChevChelios*
> 
> so 25% faster than a 1488mhz 980Ti


Less if you compare graphics scores only... it's good though; 3DMark 11 Extreme is very tough on a GPU.


----------



## ChevChelios

I did compare graphics only:

10761 / 8600 = 1.25


----------



## boredgunner

Quote:


> Originally Posted by *ChevChelios*
> 
> i did compare graphics only
> 
> 10761 / 8600 = 1.25


Makes sense considering I used the same exact CPU and CPU overclock for both tests. The GTX 1080 is also bringing pleasant benefits in games, perhaps Fallout 4 most of all (no more sub 60 FPS dips even with mods, so far at least). For reference, The Vanishing of Ethan Carter Redux now treads in ULMB territory; constant 120 FPS with V-Sync which is a requirement for ULMB in my opinion.


----------



## kx11

My Asus Strix 1080s crash 3DMark when I try to OC higher than a 10560MHz memory clock.


----------



## fitzy-775

So I was just looking at my Gigabyte card last night and noticed that none of my fans were moving until I got into a game. Is this normal now?


----------



## fitzy-775

I have been overclocking my card in MSI Afterburner, but I can't change my core voltage even though I have it enabled in the settings.


----------



## kx11

Turns out Asus GPU Tweak II is kinda useless for OCing. MSI AB got me far enough to do a 10868MHz mem OC; this was done with a 96% manual fan speed, and the top GPU temp was 75°C at maximum.


----------



## Pendulum

I'm probably going to end up sending my EVGA SC back shortly; I lost the 10-series coil whine lottery. I really don't want to go through the RMA process, but this is intolerable.
I'll give it a few days to see if it corrects itself before sending it in. (It probably won't do anything at all.)

Sounds identical to this guy's under load regardless of fan / gpu usage.


----------



## boredgunner

Quote:


> Originally Posted by *kx11*
> 
> my asus Stix 1080s crash 3dmark when i try to OC higher than 10560memory clock


That's a shame. It seems most people are capable of getting 11000 MHz. I doubt you're missing a whole lot though.
Quote:


> Originally Posted by *Pendulum*
> 
> Probably going to end up sending my EVGA SC back here shortly, I lost the 10 series coil whine lottery. I really don't want to go through the RMA process but this is intolerable.
> I'll give it a few days to see if it will correct itself before sending it in. (probably won't do anything at all)


This is a major reason why I suggest going for a model with an aftermarket PCB. I haven't seen any coil whine complaints for the MSI ARMOR/GAMING/GAMING X, EVGA FTW, or the new Zotac AMP Extreme, and very few for the Gigabyte XTREME Gaming.


----------



## dentnu

Quote:


> Originally Posted by *fitzy-775*
> 
> I have been overclocking my card in msi after burner. But i can't change my core voltage even tho i have it enabled in the settings.


I have an MSI 1080 Gaming X, and I also used Afterburner, unlocked the voltage, and raised it all the way up, but did not see any change in my voltage. Anyone know what is going on with this? Thanks.


----------



## wsarahan

Guys, one question:

My EVGA SC 1080 SLI is running about 20°C colder than my 980 Ti Galax HOF GOC SLI setup did.

Is this normal, or did my old Galax 980 Ti SLI have some temp issues? Now it reaches 55°C max on the hottest card; the 980 Ti setup hit 75/80°C.

The fans are at the same speed.

Thanks


----------



## MrTOOSHORT

Quote:


> Originally Posted by *wsarahan*
> 
> Guys one question
> 
> My EVGA SC1080 SLI is something about 20C colder than my 980TI Galax HOF GOC SLi setup
> 
> Is this normal or my old galax sli 980ti had some issues in the temps? now it reaches 55C max in the hottest card, the 980ti setuo had 75/80c
> 
> The fans is at same speed
> 
> tks


Normal. The 980 Tis draw around ~300W, while the 1080s draw around ~200W: a big die on the old 28nm process vs a small die on the 16nm process. A 1080 doesn't take much to keep cool. Since you live in Brazil, you made a good decision switching to 1080s.


----------



## boredgunner

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> normal, the 980 TIs are running around ~300w, while the 1080s are running around ~200w. Big die on old 28nm process vs small die on 16nm process. 1080 doesn't take much to keep cool. Since you live in Brazil, you made a good decision to switch out for 1080s.


On that note, it's funny that the temperatures on my previous MSI GTX 980 Ti Lightning (maxed voltage) were slightly lower than on my current MSI GTX 1080 Armor 8G, and with a slightly less aggressive fan profile too. And this GTX 1080 never breaks 72°C. I think that's a testament to how good the Lightning cooler is, as it's monstrous compared to the itty bitty Armor cooler.


----------



## MrTOOSHORT

@boredgunner

I hear that cooler was the best on the 980TI generation.


----------



## fayzaan

Which version of Afterburner? I updated to 4.3 beta 4. If I increase the voltage it does go up, but I believe it also depends on temperature and other factors. Even with increased voltage, though, I can't seem to go over a 1999MHz clock speed.








Also wanted to mention, for those that have the MSI Gaming card: I added a fan to the backplate, and at idle my temps dropped from 38C (at 100% fan speed) to 32C. Might help get higher clocks under load.


----------



## wsarahan

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Normal. The 980 Tis run around ~300W while the 1080s run around ~200W: a big die on the old 28nm process vs. a small die on the 16nm process. A 1080 doesn't take much to keep cool. Since you live in Brazil, you made a good decision switching to 1080s.


Thanks for the explanation


----------



## uggy

Is there any news about a custom BIOS for the 1080?

And how about 3-way SLI? NVIDIA has been very back and forth with this. My understanding is that the enthusiast key is not obtainable at the moment?
Or is there a way?


----------



## Pendulum

Quote:


> Originally Posted by *boredgunner*
> 
> This is a major reason as to why I suggest going for a model with an aftermarket PCB. I haven't seen any coil whine complaints for MSI ARMOR/GAMING/GAMING X, EVGA FTW, the new Zotac AMP Extreme, and very few for the Gigabyte XTREME Gaming.


I definitely agree with you on that. I wanted a FTW, but due to stock I got impatient and ordered an EVGA SC. I sold my 980s a while back and have been running a single GTX 460 since, so it's going to be painful to go back to it.









I'll probably just return it after I finish moving houses next week. I'll likely look for a Gigabyte or Asus card after this.


----------



## ChevChelios

I wonder if the *single* 8-pin on G1 is ever going to be a power issue if a custom BIOS ever comes out for 1080 ..

I decided to get G1 over Palit Gamerock Premium or Gainward Phoenix GLH even though those had 8+6 and dual BIOSes .. just that much faith and trust in Gigabyte


----------



## dentnu

Quote:


> Originally Posted by *fayzaan*
> 
> which version of Afterburner? I updated to 4.3 beta 4, if I increase the voltage it does go up, but I believe it also depends on temperature and other stuff as well. But even with increased voltage, I can't seem to go over 1999 clock speed
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Also wanted to mention...for those that have MSI Gaming card, I added a fan to the backplate, and on idle my temps dropped from 38c (at 100% fan speed) to 32c. Might help get higher clocks on load.


I am using the same version of Afterburner as you. If it is raising the voltage then I can't tell a difference, as it always maxes out at 1.062V. Maybe it has to do with temps, but my card hits a max of 40C with my Kraken X41 and G10 on it, so I would think those temps are low enough for the card to hit its max voltage. I guess my card is already set to max voltage in the factory BIOS. Who knows? This new GPU Boost 3.0 is a step backwards compared to 2.0; NVIDIA just added more restrictions. Can't wait for a modded BIOS that removes all this BS.


----------



## fayzaan

Hmm, did you set the power limit and temp limit to full? Also, are you manually overclocking or using the curve? If you use the curve, you can specify a higher speed at a higher voltage, and if temps are good it should increase core clock and voltage based on the curve you set. I am on air and was able to get my card to use 1.093V, but it doesn't seem stable for me.
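For anyone avoiding the curve editor: the trick fayzaan describes can be sketched in a few lines. This is an illustration of the idea only; `vf_points` is a made-up list of (millivolts, MHz) pairs, not Afterburner's actual data structure, and real curves have far more points.

```python
def lock_curve(vf_points, v_lock_mv, f_target_mhz):
    """Flatten a voltage/frequency curve at a chosen point.

    Every point at or above v_lock_mv is clamped to f_target_mhz, so
    boost has no higher bin to wander into; points below are capped at
    the target to keep the curve monotonic.

    vf_points: list of (millivolts, MHz) tuples, ascending by voltage.
    """
    return [(v, f_target_mhz if v >= v_lock_mv else min(f, f_target_mhz))
            for v, f in vf_points]

# Example: lock 1975MHz at the 1000mV point on a toy four-point curve.
curve = [(800, 1700), (900, 1850), (1000, 1950), (1062, 2000)]
locked = lock_curve(curve, 1000, 1975)
```

Everything from 1000mV upward then requests 1975MHz, which is roughly what dragging a point up in Afterburner's curve editor accomplishes.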


----------



## wsarahan

Quote:


> Originally Posted by *Pendulum*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boredgunner*
> 
> This is a major reason as to why I suggest going for a model with an aftermarket PCB. I haven't seen any coil whine complaints for MSI ARMOR/GAMING/GAMING X, EVGA FTW, the new Zotac AMP Extreme, and very few for the Gigabyte XTREME Gaming.
> 
> 
> 
> I definitely agree with you on that. I wanted a FTW, but due to stock I got impatient and ordered an EVGA SC. I sold my 980s a while back and have been running a single GTX 460 since, so it's going to be painful to go back to it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll probably just return it after I finish moving houses next week. I'll likely look for a Gigabyte or Asus card after this.
Click to expand...

I think I got lucky with my 1080 SC SLI: I got two good cards reaching a 2089MHz stable SLI OC, and with one card only I think I can go higher. Here in Brazil we don't have all the card options you have in Europe and the USA; EVGA and MSI dominate here, and we only find the basic models.

Enviado do meu iPhone usando Tapatalk


----------



## boredgunner

Quote:


> Originally Posted by *Pendulum*
> 
> I definitely agree with you on that. I wanted a FTW, but due to stock I got impatient and ordered an EVGA SC. I sold my 980s a while back and have been running a single GTX 460 since, so it's going to be painful to go back to it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll probably just return it after I finish moving houses next week. I'll likely look for a Gigabyte or Asus card after this.


I've seen numerous coil whine complaints for the ASUS ROG Strix, but most say it's minor. It also uses a direct-touch heatpipe design without great contact. See this image from TechPowerUp:



As for the Gigabyte XTREME Gaming, you will probably have to pull out the shroud in the middle so that it doesn't hit the middle fan. It seems to be a great card otherwise though, perhaps top two along with the Zotac AMP Extreme.


----------



## dentnu

Yeah, power limit and temp limit are maxed out. I am manually overclocking it and can hit 2126MHz on the core, and my memory can do 5557MHz. My memory clocks will go all the way to 5760MHz "stable" (tested for an hour in Heaven and 3DMark). The weird thing is that any memory clock higher than 5557MHz makes my Heaven and 3DMark scores drop, which is crazy, because I can run the memory at 5760MHz for over an hour with no artifacts or driver crashes. I've been trying to figure out what it is; so far I've used DDU to uninstall my driver multiple times and tested different drivers, and it's the same thing. I don't think the card is bad, since it otherwise works correctly. I think it's the BIOS causing the issue, but who knows. I looked at that curve overclocking graph, but after one look at it I hated it. Overclocking a GPU should not need to be so complex... Thanks


----------



## supermi

Quote:


> Originally Posted by *dentnu*
> 
> Yea power limit and temp limit is maxed out. I am manually overclocking it and can hit 2126MHZ on core and my memory can do 5557MHz.. My memory clocks can go all the way to 5760MHz stable tested it for 1 hour in Heaven and 3dmark. The wired thing is any memory clocks higher than 5557MHz cause my heaven and 3dmark scores to drop which is crazy cause I can run memory clocks @ 5760MHz for over 1hour with no artifact or driver crashes. Been trying to figure out what it is but so far tried using DDU to uninstall my driver multiple times and tested different drivers and same thing. Don't think the cards bad cause its working correctly. I think its the bios causing these issue but who knows. I looked at that curve overclock graph but after taking one look at it hated it. Overclocking a GPU should not need to be co complex... Thanks


You are seeing error correction in your vram, lower the vram OC to the point you are getting fps increases.


----------



## dentnu

Quote:


> Originally Posted by *supermi*
> 
> You are seeing error correction in your vram, lower the vram OC to the point you are getting fps increases.


Oh ok, so that's what it is. Cool, thanks.


----------



## chronicfx

Got tired of waiting, so I picked up two EVGA PRO SLI V2 LED bridges and took the tops off... Any way to tell if they are running correctly at the higher rate and equivalent to the HB bridge, besides benchmarks?


----------



## emett

A couple inches under where you turn on SLI there will be a notification saying you can get better performance with an HB bridge. If it detects one, the notification isn't there.


----------



## RJacobs28

Quote:


> Originally Posted by *chronicfx*
> 
> Got tired of waiting, so I picked up two EVGA PRO SLI V2 LED bridges and took the tops off... Any way to tell if they are running correctly at the higher rate and equivalent to the HB bridge, besides benchmarks?


It would be interesting to see whether there is any difference between the HB Bridge and 2 LED Bridges.
If there isn't a difference - I'll do the same.
The lack of availability on these new bridges is pretty disappointing.


----------



## Jquala

Currently I have a FTW model that can hit a power target of 130%... I swear it was 120% yesterday. It has zero effect on my overclocking ability though. I'm surprised how unimpressively FTWs overclock compared to FE cards. I had two of three FE cards hit 2100-2126MHz and one hit 2050MHz. My FTW can loop Valley/Heaven for hours at +99 on the core (2050-2088MHz), but it will artifact and then crash at +100 (2088-2113MHz); it can't make it through a loop. My question is: do you guys think that when a block comes out for it, it will get over that bump? Or should I sell it and go back to my FEs with blocks?


----------



## chronicfx

Quote:


> Originally Posted by *Jquala*
> 
> Currently. I have a FTW model that can hit a power target of 130%...I swear it was 120% yesterday. It has 0 effect on my overclock ability though. I'm surprised how unimpressive FTW overclock a compared to FE cards. I had 2/3 FE cards hit 2126-2100 and one 2050. My FTW can loop valley/heaven for hours at +99 on core (2050-2088mhz) but will artifact and then crash at +100(2088-2113mhz).cant make it through a loop. My question is you guys think when a block comes out for it it will get over that bump? Or should I sell it and go back to my FE with blocks


The FEs are obviously better here: they have higher overclocking ability along with higher temps, so if you were going to squeeze anything extra out of them, it would be from an FE on water.


----------



## KillerBee33

Any word on a Pascal BIOS tweaker?
AB and Precision can't fix the power and VRel caps.


----------



## uberwootage

Just went from an i5 4690K and a GTX 970 Strix to an i7 6700K @ 4.9GHz, an NVIDIA GTX 1080 FE, a Gigabyte Gaming 5 and some Crucial Ballistix Elite.

It was a nice jump. Expensive, but nice. Then I went on Amazon, ordered the GTX 980 Ti Hybrid water cooler, slapped it on, and now I'm sitting solid at 2139MHz on the core, and my load temps have never gone over 48C.


----------



## boredgunner

Quote:


> Originally Posted by *uberwootage*
> 
> i7 6700k @ 4.9ghz


Wow.


----------



## bp7178

Quote:


> Originally Posted by *boredgunner*
> 
> Wow.


My 6700K will do 4.9GHz, but the voltage/temps are a bit too high for long-term use IMO. With a fixed voltage of 1.44V it will run IBT and Prime95, but core temps hit 95C at times.

At 4.8GHz and an adaptive voltage offset of +0.185V, temps never go over 85C even under very heavy IBT/Prime95 loads.

Back on the 1080s though: both of mine can run at 2100MHz (+200) with the memory at 5555MHz (+550). A few posts ago someone wrote about the VRAM clock being too high causing a loss of FPS because of error checking. Is there a good benchmark/quick test to check for this?


----------



## versions

Quote:


> Originally Posted by *boredgunner*
> 
> Wow.


Yep, pretty lucky there. The most mine will do is 4.8, but I'm happy with that.
Quote:


> Originally Posted by *bp7178*
> 
> My i6700k will do 4.9Ghz, but the voltage/temp is a bit too high for long term use IMO. With a fixed voltage of 1.44 it will run IBT and Prime 95, but core temp will hit 95 at times.
> 
> At 4.8Ghz and an adaptive voltage of .185, temps never go over 85 even under very heavy IBT/Prime 95 loads.
> 
> Back on the 1080s though, both of mine can run at 2100Mhz (+200) with the memory at 5555Mhz (+550). A few posts ago someone wrote about the VRAM being too high causing a loss of FPS because it was error checking. Is there a good benchmark/quick test to check for this?


Delid it. I saw a 20C temperature drop when I did. Of course assuming you have good cooling and the reason it's getting hot isn't because you have a 212 EVO.


----------



## uberwootage

Quote:


> Originally Posted by *versions*
> 
> Yep, pretty lucky there. The most mine will do is 4.8, but I'm happy with that.
> Delid it. I saw a 20C temperature drop when I did. Of course assuming you have good cooling and the reason it's getting hot isn't because you have a 212 EVO.


I might delid. I have a 3D printer, so I can make the jig to do it. I load under water at 69C; a little hot, but meh. Typical temps are in the 38-55C range. Right now I'm printing a cover for the 1080 stock shroud to hide the pump from the GTX 980 Ti Hybrid cooler.


----------



## bp7178

Quote:


> Originally Posted by *versions*
> 
> Yep, pretty lucky there. The most mine will do is 4.8, but I'm happy with that.
> Delid it. I saw a 20C temperature drop when I did. Of course assuming you have good cooling and the reason it's getting hot isn't because you have a 212 EVO.




I actually own a 212 EVO. Of course, it's not being used at the moment.

I did the delid thing, first with Coollaboratory Liquid Ultra, then with just Gelid Extreme. Same-ish results. I just hit a point where I needed a LOT more voltage to be stable, and with that came the heat. There's this odd point with overclocking where you add maybe 0.02V and gain +10C.

I can get through Firestrike at 4.9GHz no problem, and it doesn't REALLY make that much of a difference.

The first 6700K I owned would do 4.6GHz at the same voltage this one does 4.9GHz at. It's all the lottery.

Tonight I played The Division for a few hours; the CPU averaged about 60C and the 1080s at 2100MHz were in the very low 40C range. Furmark does way more to tax the power limit than actually playing a game: I don't think I ever saw more than 80% power in game, even when the GPUs were at 90%+ usage.


----------



## Jquala

Quote:


> Originally Posted by *bp7178*
> 
> My i6700k will do 4.9Ghz, but the voltage/temp is a bit too high for long term use IMO. With a fixed voltage of 1.44 it will run IBT and Prime 95, but core temp will hit 95 at times.
> 
> At 4.8Ghz and an adaptive voltage of .185, temps never go over 85 even under very heavy IBT/Prime 95 loads.
> 
> Back on the 1080s though, both of mine can run at 2100Mhz (+200) with the memory at 5555Mhz (+550). A few posts ago someone wrote about the VRAM being too high causing a loss of FPS because it was error checking. Is there a good benchmark/quick test to check for this?


Check this out: my i7 6700K can run 4.8GHz @ 1.42V, [email protected] stable 5hr OCCT (January), [email protected] stable 5hr OCCT (April), and now 4.9GHz needs 1.504V to be stable for 5hr OCCT. Why is my CPU degrading so quickly? I intentionally left out temps because I delidded, and I never exceed 69C on any core even at 1.504V.


----------



## Jquala

Quote:


> Originally Posted by *chronicfx*
> 
> The FEs are obviously better here: they have higher overclocking ability along with higher temps, so if you were going to squeeze anything extra out of them, it would be from an FE on water.


Right now I have a GTX 1080 SC from Newegg. Do you think I should return it and get a FE? Or do you think the SC is the exact same reference board as the FE, just with the ACX cooler and factory tuning? I'm afraid NVIDIA may have done some preliminary binning for the FE and then given the rest to the AIBs. Just my conspiracy theory.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *manurap46*
> 
> 
> 
> 
> 
> 
> 
> gtx 1080 Ichill x3 2177/2800mhz


Wowser!


----------



## RJacobs28

Is there a guide to overclocking 1080s anywhere? I can't get my pair over 1860MHz.


----------






## Jpmboy

Quote:


> Originally Posted by *bp7178*
> 
> 
> 
> I actually own a 212 EVO. Of course, its not being used at the moment.
> 
> I did the delid thing, first w Coollabartory Ultra, then with just Gelid Extreme. Same-ish results. I just hit a point where I needed a LOT more voltage to be stable, and with that came the heat. There's these odd point with overclocking where you add maybe .02 volts, and gain +10C.
> 
> I can get through Firestrike at 4.9 no problem, and it doesn't REALLY make that much of a difference.
> 
> The first 6700k I owned would do 4.6Ghz at the same voltage this one will do 4.9Ghz. Its all the lottery.
> 
> Tonight I played The Division for a few hours, CPU averaged about 60 C and the 1080s at 2100Mhz were in the very low 40 C range. Furmark does way more to tax the power limit then actually playing a game. I don't think I ever saw anything more than 80% power in game when the GPU were even at 90%+.


that's because furmark IS a power virus.


----------



## chronicfx

Quote:


> Originally Posted by *bp7178*
> 
> 
> 
> What you have here is what I would call a first world problem. You are raising the CPU in a rich exotic mansion and it just doesn't feel like it has to do any hard work
> 
> 
> 
> 
> 
> 
> 
> Can't you just hire a pentium to do the other 100MHz?


----------



## BrainSplatter

For anyone interested, here is a link to the Inno3D X3 1080 BIOS
Quote:


> Originally Posted by *bp7178*
> 
> Back on the 1080s though, both of mine can run at 2100Mhz (+200) with the memory at 5555Mhz (+550). A few posts ago someone wrote about the VRAM being too high causing a loss of FPS because it was error checking. Is there a good benchmark/quick test to check for this?


One way of checking this is to run Valley with 8xMSAA in a window and pause it. Then you can adjust the memory speed in software (e.g. MSI Afterburner) while you observe the FPS.
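That watch-the-FPS loop can be automated as a simple sweep. A minimal sketch, assuming you can script both the offset and an FPS readout; `measure_fps` is a hypothetical stand-in for whatever paused-Valley readout you use, not a real API:

```python
def find_best_mem_offset(measure_fps, offsets):
    """Sweep memory-clock offsets and return the one with the highest FPS.

    GDDR5X error correction means FPS can drop past a point even though
    the card never artifacts or crashes, so we keep the offset that
    scored best rather than the highest offset that survived.
    """
    best_offset, best_fps = None, float("-inf")
    for off in offsets:
        fps = measure_fps(off)  # e.g. paused Valley at 8xMSAA after applying off
        if fps > best_fps:
            best_offset, best_fps = off, fps
    return best_offset, best_fps
```

This matches dentnu's symptom a few posts back: a higher offset can survive an hour of Heaven yet still score worse than a lower one.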


----------



## Setzer

Quote:


> Originally Posted by *Setzer*
> 
> Screen flickering/artifacts when logging in and the next 5 seconds? I get that, but then it stops and is never an issue until reboot.


I now get this flickering constantly...


----------



## ChevChelios

https://www.reddit.com/r/4liseq/now_that_the_1080_and_1070_are_out_how_is_the/

So, does the 1080 support 10-bit color via DisplayPort?


----------



## bp7178

Quote:


> Originally Posted by *Jquala*
> 
> Check this out. My i7 6700k can run 4.8ghz @1.42 [email protected] stable 5hr occt(January) [email protected] stable 5hr occt(April) now 4.9ghz 1.504v stable 5hr occt. Why is my cou degrading so quickly? I intentionally left out temps because I delid and I never exceed 69C on any core even at 1.504v


What are you using to read your temps? I wouldn't take temps for granted. Heat is what will kill it, and you shouldn't underestimate how even slight increases in voltage cause exponentially more heat.

When I first got my 6700K, I could boot into Windows and run the older version of P95 at 5GHz. Shortly thereafter I noticed I needed a lot more voltage to stay stable at those higher clocks. I took that as a clue and decided to just leave it at 4.8GHz with an adaptive offset, so most of the time it's just humming along on a very low core voltage with CPU package temps at 26-27C.
Quote:


> Originally Posted by *BrainSplatter*
> 
> For anyone interested here is a link to the Inno 3X 1080 BIOS
> One way of checking this is to run Valley with 8xMSAA in a window and pausing it. Then u can adjust the memory speed in software (e.g. MSI Afterburner) while u observe the FPS.


Thanks. I'm going to give that a try.


----------



## x7007

Does anyone have an issue with the fans? Every time the fan curve steps up 1%, the game stutters. Using MSI Afterburner to set a steep curve, so it jumps straight from 40% to 75% instead of stepping through 41%, 42%, etc. depending on temperature, fixes the issue.

Does no one else have an issue like this? If NVIDIA already had a fan issue on the Founders Edition, I wonder why there wouldn't be an issue on other fans as well. It does happen to other people; this fix takes effect right after just an ALT+TAB, with nothing else changed.

Kinda like this issue:

http://www.tomsguide.com/answers/id-2325218/nvidia-card-stuttering-games-videos-transparency.html

micro-freezing

INNO3D GTX 1080 iChill X3


----------



## KillerBee33

Quote:


> Originally Posted by *x7007*
> 
> Does any one has an issue with the Fans ? every time the Fan Curve goes up 1% the game stuttering. using MsiAfterburner to set a Direct curve which means it will go straight to 75% from 40% and not 41%,42% depends on the temperature fix the issue.
> 
> No one has an issue like this ? because I wonder if nvidia had an issue with the fan already in the Founder Edition , why there wouldn't an issue on other fans as well. it does happens to other people , this is a straight fix that fix the issue after just ALT + TAB, nothing else changes.
> 
> Kinda like this issue
> 
> http://www.tomsguide.com/answers/id-2325218/nvidia-card-stuttering-games-videos-transparency.html
> 
> micro-freezing
> 
> INNO3D GTX1080 Ichill x3


Try this.


27%@42-70%@65-100%@75
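KillerBee33's `27%@42-70%@65-100%@75` notation reads as fan-percent-at-temperature points. As a sketch of how such a curve behaves between the points (assuming Afterburner-style linear interpolation, which is an assumption rather than a documented guarantee):

```python
def fan_speed(temp_c, curve=((42, 27), (65, 70), (75, 100))):
    """Piecewise-linear fan curve.

    curve: (temperature C, fan %) points, ascending by temperature;
    output is clamped to the first/last point outside that range.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Halfway between the 65C and 70% point and the 75C and 100% point, the fan would sit at 85%.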


----------



## qLiixz

Got my 1080 to 2101/5515 MHz stable with the Inno 3d X3 Bios. This is a Gainward Phoenix GLH with an Accelero Xtreme IV.


----------



## BrightCandle

I seem to have nothing but issues with MSI Afterburner. It causes quite a bit of stuttering in games and a dramatic performance problem with OBS Studio. It might be a very popular overclocking tool, but it's also **** right? Is there any way to lock my overclock in so I don't need the tool constantly running in the background causing problems?


----------



## KillerBee33

Quote:


> Originally Posted by *BrightCandle*
> 
> I seem to have nothing but issues with MSI Afterburner. It seems to cause quite a bit of stuttering in games and causes a dramatic performance problem with OBS Studio. It might be a very popular overclocking tool but its also **** right? Any way to lock my overclock in so I don't need the tool running constantly in the background causing problems?


----------



## x7007

Quote:


> Originally Posted by *KillerBee33*
> 
> Try this.
> 
> 
> 27%@42-70%@65-100%@75


Yeah, I tried that. I tried a curve just as a test and it did fix the issue; I'm trying not to use a curve now, just to see if it could be some other issue.

I also have a weird issue where the clock gets stuck at 1648MHz or so, the same way the GTX 970 would get stuck at 748MHz or 648MHz. There are one or two BIOS settings that should fix it; I fixed it once, but then I changed the BIOS settings again. It could be VRM Spread Spectrum or CPU Power Phase or something close to it. I didn't change much, so I'll find out what causes the issue.


----------



## Martin778

Quote:


> Originally Posted by *bp7178*
> 
> What are you using to read your temps? I wouldn't take temps for granted. Heat is going to be what kills it and you can't underestimate how even slight increases in voltage will cause exponentially more heat.
> 
> When I first got my 6700K I could boot into windows and run the older version of P95 at 5Ghz. Shortly thereafter, I noticed I needed a lot more voltage to keep stable at those higher clocks. I took that as a clue and decided to just leave it at 4.8Ghz with an adaptive offset so most of the time is just humming along with a very low core voltage at CPU package temps at 26-27C.
> Thanks. I'm going to give that a try.


Jesus, 1.45-1.5V on a Skylake. No wonder it's degrading.









p.s.
Anybody have any info on when EVGA is going to ship cards from the step-up program? I've already been waiting more than a month.


----------



## KillerBee33

Quote:


> Originally Posted by *x7007*
> 
> Ye I tried that I tried some curve just for test and it did fix the issue, I'm trying not to use curve now just to see if it could be any other issue.
> 
> I have the weird issue the clock stuck on 1648 mhz or something same like 970 GTX was stuck on 748 mhz or 648 mhz .. There is one or 2 bios settings that should fix it, I fixed it once but I changed the bios settings again. It could be VRM Spread Spectrum or CPU Power Phase or something close to it. I didn't touch much so I'll know what could cause the issue.


Can't wait for a Pascal BIOS tweaker.
I've had the same fan settings for a few years now on reference cards; works great.


----------



## Spiriva

Quote:


> Originally Posted by *KillerBee33*
> 
> Can't wait for PascalBiosTweaker
> Had same FAn settings for few years now on reference cards , works great


The 1080 has been out for over a month now; you have to wonder if there will ever be a "Pascal BIOS tweaker". Maybe it's not coming at all.


----------



## KillerBee33

Quote:


> Originally Posted by *Spiriva*
> 
> 1080 been out over a month now, you gotta wonder if there will ever be a "PascalBiosTweaker". Maybe its not coming at all.


A 1080 BIOS is floating around, and someone is tweaking it.








Just thinking out loud here


----------



## Spiriva

Quote:


> Originally Posted by *KillerBee33*
> 
> 1080 BIOS is floating around , someone is tweaking it
> 
> 
> 
> 
> 
> 
> 
> 
> Just thinking out loud here


Of course I hope it will be released too, but the 1080 has been out a decent amount of time now and so far there is no sign of a BIOS tweaker.
So who knows, maybe we will sit here in another month or two, still hoping for a BIOS tweaker that won't show up.


----------



## KillerBee33

Quote:


> Originally Posted by *Spiriva*
> 
> Ofc i hope it will be released too, but 1080 been out a decent amount of time now and so far there is no sight of a bios tweaker.
> So who knows , maybe we will sit here in another month or two and continue hoping for a bios tweaker that wount show


Using AB or any other OC software makes no sense to me; I'm just testing. Also waiting for a good hybrid AIO for the 10 series.
By then a tweaker, or a custom BIOS for each manufacturer, will be available.








Not a statement, just my hopes.


----------



## Outcasst

Quote:


> Originally Posted by *Spiriva*
> 
> 1080 been out over a month now, you gotta wonder if there will ever be a "PascalBiosTweaker". Maybe its not coming at all.


That's not the problem. You can already edit the BIOS quite easily with hex values if you know how, but there's no way to flash it until a modded NVFlash is working and available.


----------



## Ascendor81

Gigabyte released a new BIOS for their G1 recently, and the BIOS zip file includes NVFlash, which I used on my Asus Strix non-OC to flash the OC BIOS. It works. But I think these BIOSes have valid certificates. We need a modded NVFlash to flash non-certified BIOSes. Am I right?


----------



## ChevChelios

Quote:


> Gigabyte released a new BIOS for their G1 recently


What's the difference from the original G1 BIOS?


----------



## grimboso

Looking at, for example, this review http://hexus.net/tech/reviews/graphics/93494-evga-geforce-gtx-1080-ftw/?page=11 total system power consumption is 250W. Add 100W for the CPU OC, fans and two pumps. Will an AX760 run SLI FTW cards? I know it will probably push towards its limits, but it should be doable since the cards don't OC that much.


----------



## Jquala

Quote:


> Originally Posted by *Martin778*
> 
> Jesus, 1.45-1.5v on a Skylake. No wonder it's degrading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> p.s
> Anybody have any info on when EVGA is going to ship cards from the step-up program? I'm already waiting more than a month.


[email protected] around 65C doesn't sound too outrageous, right? Even at 5GHz the hottest core I've seen is 69C.


----------



## Jquala

No one's answered my question, but both my FTW and FE cards can go up to a 130% power limit. 1.093V is still the highest voltage they'll go, though.


----------



## axiumone

Quote:


> Originally Posted by *grimboso*
> 
> Looking at for example this review http://hexus.net/tech/reviews/graphics/93494-evga-geforce-gtx-1080-ftw/?page=11 total power consumption is 250. Add 100 for cpu OC, fans and two pumps. Will an ax760 run sli ftw cards? I know it Will probably push towards limits, but it should be doable since the Cards dont OC that much.


My SLI 1080 ACXs @ 2050MHz and [email protected] pull about 570 watts from the wall max. So you will probably be fine.
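To put grimboso's arithmetic in one place, here is a back-of-the-envelope PSU check. All the wattages below are ballpark figures pulled from the posts above, not measurements, and the 80% derating is a rule of thumb, not a spec:

```python
def psu_headroom(psu_watts, gpu_watts, n_gpus, rest_watts, margin=0.8):
    """Rough PSU sanity check: estimated draw vs. a derated capacity.

    Keeping a PSU at or under ~80% load is the usual efficiency and
    longevity rule of thumb; transient GPU power spikes are not modeled.
    """
    total = gpu_watts * n_gpus + rest_watts
    return total, total <= psu_watts * margin

# ~225W per overclocked FTW-class card, ~100W for CPU OC, fans and two pumps.
total, ok = psu_headroom(760, 225, 2, 100)
```

That lands at 550W, roughly 72% of an AX760, which also lines up with the ~570W wall reading above once PSU losses are included.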


----------



## chronicfx

Accidentally posted this in the wrong thread... but

Speaking of Dremels and shaving: I ordered two EVGA Pro SLI V2 long bridges and am running them side by side (both with the fancy top heatspreaders and LEDs removed) while I wait to see if there is any reason for an HB bridge. They overlap slightly, so I had to use a flat Lego piece to space one of them out a bit, so it could still sit level without being pushed all the way in and ending up crooked; I will try to get a pic. Is it possible to shave a little bit from each SLI bridge so they sit flush together? As in, maybe half a mm from the left side of one and half a mm from the right side of the other? What tool would I use?

Pics:

Pics:


----------



## shremi

Guys, I need some help here. I just got hold of one 1080 ACX 3.0; since I heard SLI was broken for VR, I decided to try a single 1080. It is a great card, but I can't seem to overclock it one bit. It seems overclocking has changed with this generation of cards?

This is what I have always done with other cards before they go under water: set the fan speed to 85%, raise the power limit and temp limit to the maximum, and do a first run of Heaven so I can compare my results stock vs. overclocked. However, the card does its own thing: it boosts to around 2020MHz at the start, then slowly steps down until it hits 1945MHz or so. Temps are in check, and I haven't even added a single thing to the clock or memory.

What am I missing here? How are you guys getting constant clock speeds? I have tried to search around, and it seems most throttling comes from heat, but my card starts downclocking at around 50C.

TIA

Shremi


----------



## supermi

Quote:


> Originally Posted by *shremi*
> 
> Guys need some help here.... So i just got a hold of one 1080 ACX 3.0 since i heard sli was broken for VR i decided to try out the 1080 ..... It is a great card but however i can seem to overclock a damn bit with this card it seems that overclocked has changed with this gen of cards ???
> 
> This is what i have been doing as always with other cards before they go underwater ..... Set fan speed to 85% up the power limit and the temp limit to the maximum and load Heaven for a first run so i can compare my results stock vs overclocking ..... However the card decides to do its own thing it boost around the 2020 at start slowly going down in steps until i hit like 1945 or something .....Temps are in check and i haven't even added a single thing to the clock or memory ...
> 
> What am i missing here ??? How are you guys getting constant clock speeds ??? I have tried to search around and it seems that most throttling becomes from heat but my card starts downclocking at around the 50C
> 
> TIA
> 
> Shremi


You just described Boost 3.0, and the "why are my clocks bouncing around" that comes with it. The first voltage/core-clock throttle step starts somewhere in the upper 40s C, so from the way yours is throttling you can see it is working as intended.


----------



## dentnu

Quote:


> Originally Posted by *shremi*
> 
> Guys need some help here... So I just got a hold of one 1080 ACX 3.0. Since I heard SLI was broken for VR I decided to try out the 1080... It is a great card, but I can't seem to overclock a bit with this card. It seems overclocking has changed with this gen of cards?
> 
> This is what I have always done with other cards before they go under water... Set the fan speed to 85%, up the power limit and the temp limit to the maximum, and load Heaven for a first run so I can compare my results stock vs. overclocked... However, the card decides to do its own thing: it boosts to around 2020 at the start, slowly going down in steps until it hits 1945 or something... Temps are in check and I haven't even added a single thing to the clock or memory...
> 
> What am I missing here? How are you guys getting constant clock speeds? I have tried to search around and it seems that most throttling comes from heat, but my card starts downclocking at around 50C.
> 
> TIA
> 
> Shremi


What you are experiencing is a feature of all 1080s called GPU Boost 3.0. Once your temps hit around 50C the card starts lowering its clock speed. From what I read it is a side effect of the move to 16nm FinFET. This is just the way it is for now, until we can start modding the BIOS and remove it. GPU Boost 3.0 is as much a safety measure so you don't fry your card as it is a feature. Your card is working correctly; if you want it to stop downclocking then you have to make sure your temps don't hit 50C or above.
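For anyone curious what that step-down behavior looks like in numbers, here's a minimal Python sketch of the temperature-binned stepping described above. The ~13 MHz step size and the 50C/5C bin thresholds are assumptions based on what people in this thread report, not NVIDIA-published figures:

```python
# Illustrative sketch of GPU Boost 3.0-style temperature stepping.
# Step size (~13 MHz per bin) and thresholds are assumptions drawn from
# behavior reported in this thread, not NVIDIA-published numbers.

def boost_clock(base_boost_mhz, temp_c, step_mhz=13, first_step_c=50, bin_c=5):
    """Return the effective boost clock after thermal step-downs.

    Below first_step_c the card holds its full boost; every bin_c degrees
    above that sheds one step_mhz bin, which is why clocks 'walk down'
    from ~2025 toward ~1950 MHz as the card warms from 50C into the 70s.
    """
    if temp_c < first_step_c:
        return base_boost_mhz
    bins = 1 + (temp_c - first_step_c) // bin_c
    return base_boost_mhz - bins * step_mhz

print(boost_clock(2025, 45))  # cool card: full boost, 2025
print(boost_clock(2025, 50))  # first step down: 2012
print(boost_clock(2025, 70))  # several bins later: 1960
```

Plug in your own observed start clock and step size; the shape of the walk-down is the point, not the exact numbers.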


----------



## Drewminus

Hey, I was just wondering if someone with a Gigabyte 1080 Gaming G1 could measure its actual length for me. The specs on their website are different to the specs on most retailers, so I'm not sure exactly how long it is.


----------



## shremi

Quote:


> Originally Posted by *supermi*
> 
> You just described Boost 3.0 and the classic "*** are my clocks bouncing around" reaction. They start the first voltage/core clock throttle step somewhere in the upper 40s (C), so from the way yours is stepping down it is working as intended.


Quote:


> Originally Posted by *dentnu*
> 
> What you are experiencing is a feature of all 1080s called GPU Boost 3.0. Once your temps hit around 50C the card starts lowering its clock speed. From what I read it is a side effect of the move to 16nm FinFET. This is just the way it is for now, until we can start modding the BIOS and remove it. GPU Boost 3.0 is as much a safety measure so you don't fry your card as it is a feature. Your card is working correctly; if you want it to stop downclocking then you have to make sure your temps don't hit 50C or above. Good luck


Thanks guys for the quick reply! Well this F*****g sucks... So I guess you guys just keep adding core and deal with the throttling? Or what's the best overclocking method for these cards?

What I am thinking of doing is to start Heaven, let it go for a couple of loops, and then start adding core and mem?


----------



## dentnu

Quote:


> Originally Posted by *shremi*
> 
> Thanks guys for the quick reply! Well this F*****g sucks... So I guess you guys just keep adding core and deal with the throttling? Or what's the best overclocking method for these cards?
> 
> What I am thinking of doing is to start Heaven, let it go for a couple of loops, and then start adding core and mem?


Yep, it does suck. My suggestion is to keep pushing the core till you hit a wall, then leave Heaven running and see how far it downclocks. The higher the temps, the lower the clocks. I myself put a G10 and a Kraken X41 on it and don't see any downclocking since my max temps are 40C. If watercooling is an option for you, that would be the route to take to completely eliminate it.
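If you want to catch exactly where the steps happen while Heaven loops, one option is to log core clock and temperature once a second with `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits -l 1` and look for the drops afterwards. Here's a rough Python sketch of the log-parsing side; it's shown on canned sample output (those numbers are made up for illustration), so you'd wire it to the real command output yourself:

```python
# Sketch: find throttle steps in a clock/temp log produced by
# `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits -l 1`.
# Sample rows below are made up for illustration.

def parse_samples(csv_lines):
    """Return (clock_mhz, temp_c) tuples from nvidia-smi CSV rows."""
    samples = []
    for line in csv_lines:
        clock, temp = (int(v.strip()) for v in line.split(","))
        samples.append((clock, temp))
    return samples

def throttle_events(samples):
    """Return (temp, old_clock, new_clock) for every core clock step-down."""
    return [(t, prev_c, c)
            for (prev_c, _), (c, t) in zip(samples, samples[1:])
            if c < prev_c]

sample_output = ["2025, 46", "2025, 49", "2012, 52", "1999, 58", "1999, 60"]
samples = parse_samples(sample_output)
print(throttle_events(samples))  # -> [(52, 2025, 2012), (58, 2012, 1999)]
```

That gives you the temperature each step happened at, which makes it obvious whether the bins line up with the ~50C onset people are seeing.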


----------



## orlfman

Quote:


> Originally Posted by *Setzer*
> 
> I now get this flickering constantly...


It's a known problem:

https://forums.geforce.com/default/topic/939358/geforce-1000-series/gtx-1080-flickering-issue/

https://forums.geforce.com/default/topic/943822/geforce-drivers/announcing-geforce-hotfix-driver-368-51-released-6-17-16-/

The hotfix driver doesn't appear to fix it for everyone; it's very random. Most complaints are coming from G-Sync high refresh rate users, with a few coming from non-G-Sync and low refresh rate users.

No ETA from Nvidia. Their latest reply sounds like they only just started looking into it...


----------



## Setzer

I actually fixed it by using DDU to uninstall everything (default setting).
Re-installed drivers and GeForce Experience, and it's gone!


----------



## barsh90

Can I just use 2 regular ribbon bridges instead of that fancy HB Nvidia bridge for 2 1080s?


----------



## supermi

Quote:


> Originally Posted by *barsh90*
> 
> Can I just use 2 regular ribbon bridges instead of that fancy HB Nvidia bridge for 2 1080s?


If they're the same length etc., it seems to work at least up to the 4K 60 Hz range.


----------



## bp7178

With two OC'd 1080s feeding a Dell 34" UHD, I saw zero difference between one ribbon, two ribbon, an old style PCB 3-way bridge, and the new HB bridge.


----------



## grimboso

Quote:


> Originally Posted by *axiumone*
> 
> My sli 1080 acx @2050 and [email protected] pull about 570 watts from the wall max. So, you will probably be fine.


That is reassuring, thanks for your reply!

My 6700K sits at 4.7 right now, because I didn't find the added temp/voltage of going to 4.8 to be worth it (1.42 V). It's not high to be honest, but I see 1.45 V spikes when going from idle to a game (LLC-5 on a Z170 Sabertooth).

What other components are you using? I am going to run two D5 pumps, 2 fan controllers, and a total of 12 fans. Not that these draw a lot of power, but things add up in the end.


----------



## x7007

Does anyone have a high DPC latency issue with the 1080? Using LatencyMon I see jumps every couple of seconds.

It was working fine with the 970.


----------



## schoolofmonkey

Hey guys

I'm wondering, do the Gigabyte GTX 1080 FE cards have the stupid warranty sticker on the heatsink screws?
I want to grab an EKWB waterblock for the Predator 360, and the retailers here are a little funny about the stickers being broken.

I know EVGA doesn't, but I can't find one in store.


----------



## ChevChelios

http://www.gigabyte.com/products/product-page.aspx?pid=5915#bios

Is this the new G1 BIOS? What does it do? Is it to enable Gaming/OC mode by default, or are there other changes?

Can you flash this on any G1 1080 unit safely?


----------



## ValSidalv21

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Hey guys
> 
> I'm wondering do the Gigabyte GTX1080 FE cards have the stupid warranty sticker on the heatsink screws?
> I want to grab a EKWB waterblock for the Predator 360 and here the retailers are a little funny about the stickers being broken.
> 
> I know eVGA doesn't, but I can't find one in store.


None on mine besides that ugly GIGABYTE sticker on the side which I removed anyway, not sure if it voids warranty









I was also looking at the same combination from EK as you, but it turns out the Air 540 does not support the 360 radiator without a mod. Now I don't know if the Predator 240 will be enough to cool the CPU + GPU. I think I'll stick with the FE heatsink.
Quote:


> Originally Posted by *ChevChelios*
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5915#bios
> 
> is this the new G1 BIOS ? What does it do ? Is it to enable Gaming/OC mode by default or some other changes ?
> 
> Can you flash this on any G1 1080 unit safely ?


According to the provided info it modifies the fan duty cycle, and yes, you can safely flash it on any 1080 G1.


----------



## Hilpi234

Does anyone own an *"Inno3D iChill X3/4 V2"*? It has 25 MHz less factory OC.


----------



## ayaya119

Can anybody share the BIOS of the HOF 1080 Limited?

thx!


----------



## Setzer

Well, I spoke too soon. After a reboot, the flickering is constantly back, same as before...
Nvidia, please fix this mess!


----------



## ChevChelios

seems like setting to 120Hz is the only solution atm to 1080 + 144hz flickering issue, if the hotfix doesnt help

hope they sort this out ASAP


----------



## Setzer

Going from 165hz to 120hz, and then back to 165hz seemed to "fix" it.


----------



## ChevChelios

Im pretty sure thats still only until the next reboot (you can also go 144-120-144 to "fix" it)

only 120Hz will remain flickerless after a restart

someone also said for him going to 160Hz helped, dunno if he tried after a restart

someone with a 1070 said they dont get this, it might be 1080 only

Im also not sure if windows version matters


----------



## gerbil80

Quote:


> Originally Posted by *ChevChelios*
> 
> seems like setting to 120Hz is the only solution atm to 1080 + 144hz flickering issue, if the hotfix doesnt help
> 
> hope they sort this out ASAP


Set the desktop refresh rate to 120 Hz (Screen) and set the global 3D settings to "use highest available refresh rate", and it will stop flickering while still using 144 Hz in games (3D).


----------



## ChevChelios

Quote:


> Originally Posted by *gerbil80*
> 
> Set the desktop refresh rate to 120 Hz (Screen) and set the global 3D settings to "use highest available refresh rate", and it will stop flickering while still using 144 Hz in games (3D).


hmm, thats pretty neat, thx

will this method also make it use 165Hz (instead of 144) in games if your monitor is OCed to 165 ?


----------



## KillerBee33

Any news on NVFlash & BiosTweaker?


----------



## ChevChelios

wonder if a custom BIOS will allow us to get to that 2300-2400 MHz, or will that remain a pipe dream


----------



## KillerBee33

Quote:


> Originally Posted by *ChevChelios*
> 
> wonder if a custom BIOS will allow us to get to that 2300-2400 MHz, or will that remain a pipe dream


The 980 ran from 1127 to 1557 @ 1.25 V. Hoping this will be the same deal.


----------



## gerbil80

Quote:


> Originally Posted by *ChevChelios*
> 
> hmm, thats pretty neat, thx
> 
> will this method also make it use 165Hz (instead of 144) in games if your monitor is OCed to 165 ?


I don't know, I'm afraid, but I guess if the monitor is 144 Hz native and the OC is manual, then the driver would force 144 Hz and the monitor would use 165 as configured... interesting one, that. Try it and let us know.


----------



## axiumone

Quote:


> Originally Posted by *grimboso*
> 
> That is reassuring, thanks for your reply!
> 
> My 6700k sits at 4.7 right now, because I didnt find the added temp/volt going to 4.8 to be worth it (1.42). It's not high to be honest, but I see 1.45 spikes when going from idle to game (llc-5 on a z170 sabertooth).
> 
> What other various components are you using? I am going to run two D5 pumps, 2 fan controllers and a total of 12 fans. Not that these draw a lot of power, things add up in the end


4 sata ssd's, 1 sata hdd, 1 pcie ssd, 8 x 120mm 2000 rpm fans, corsair h100i. I think that's about it.
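For anyone doing the same math at home, here's a quick sketch of the headroom arithmetic. All the per-component wattages besides the ~570 W wall figure quoted above are ballpark assumptions, not measurements; the point is just that the small loads add up:

```python
# Rough PSU headroom math for a build like the ones above. Per-component
# wattages are ballpark assumptions, not measurements; only the 570 W
# wall figure comes from the thread.

load_watts = {
    "sli_1080s_plus_cpu_wall": 570,   # measured at the wall, per the post above
    "d5_pump_1": 23,                  # typical D5 rating
    "d5_pump_2": 23,
    "fans_120mm": 12 * 3,             # ~3 W per fan, 12 fans
    "fan_controllers": 2 * 5,         # ballpark
    "drives_misc": 30,                # SSDs/HDD, motherboard extras

}

total = sum(load_watts.values())
print(f"estimated wall draw: {total} W")
for psu in (750, 850, 1000):
    print(f"  {psu} W PSU headroom: {psu - total} W")
```

Swap in your own numbers; even generous estimates for pumps, fans, and drives only add roughly 100-150 W on top of the GPU+CPU draw.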


----------



## zGunBLADEz

Quote:


> Originally Posted by *ChevChelios*
> 
> wonder if a custom BIOS will allow us to get to that 2300-2400 MHz, or will that remain a pipe dream


I dont think so, unless you can keep your core way cold


----------



## zGunBLADEz

Quote:


> Originally Posted by *x7007*
> 
> Does anyone have HIGH DPC issue with 1080 ? using Latency Mon I have jumps every couple seconds .
> 
> It was working fine with 970


Yup, same here. It's somewhat fixable by putting your PC on the High Performance power plan instead of Balanced in the power settings; it just takes longer to hit the fan XD


----------



## escalibur

Asus Strix, EVGA SuperClocked, Gigabyte G1, Inno3D iChill X3, and Zotac AMP! Extreme review in Russian:


ASUS ROG Strix GeForce GTX 1080 (STRIX-GTX1080-O8G-GAMING)
EVGA GeForce GTX 1080 SC GAMING ACX 3.0 (08G-P4-6183-KR)
Gigabyte GeForce GTX 1080 G1 Gaming (GV-N1080G1 GAMING-8GD)
Inno3D iChill GeForce GTX 1080 X3 (C108V3-2SDN-P6DNX)
Zotac GeForce GTX 1080 AMP! Extreme Edition (ZT-P10800B-10P)

http://www.hardwareluxx.ru/index.php/artikel/hardware/grafikkarten/38836-roundup-5x-geforce-gtx-1080-custom-design-test.html


----------



## x7007

This is after 5 hrs IDLE

Please open a ticket so nvidia will fix it ASAP

https://nvidia-submit.custhelp.com/app/ask


----------



## ChevChelios

Quote:


> Originally Posted by *x7007*
> 
> This is after 5 hrs IDLE
> 
> Please open a ticket so nvidia will fix it ASAP
> 
> https://nvidia-submit.custhelp.com/app/ask


what am I looking at here ?


----------



## x7007

This is after 5 hrs





Please open a ticket so nvidia will fix it ASAP
Quote:


> Originally Posted by *ChevChelios*
> 
> what am I looking at here ?


It shows there is an issue with the driver, or with how the driver interacts with the card. It causes interrupts; on the user side, higher numbers or spikes mean stuttering, audio skipping, lag, a slow computer, and eventually being unable to play properly or even talk in voice chat. The cause is Nvidia's driver system file, since that's the only thing we changed. Download the program and see for yourself.

http://www.resplendence.com/download/LatencyMon.exe


----------



## Waitng4realGPU

Quote:


> Originally Posted by *ChevChelios*
> 
> wonder if a custom BIOS will allow us to get to that 2300-2400 MHz, or will that remain a pipe dream


Just a hype train ready to derail hard.......................


----------



## zGunBLADEz

Does anybody know the size in mm of the square area?


----------



## AlanAlberino

No custom bios yet right? Didn't want to read 215 pages to check...


----------



## bp7178

Quote:


> Originally Posted by *AlanAlberino*
> 
> No custom bios yet right? Didn't want to read 215 pages to check...


Nope.


----------



## orlfman

Quote:


> Originally Posted by *ChevChelios*
> 
> Im pretty sure thats still only until the next reboot (you can also go 144-120-144 to "fix" it)
> 
> only 120Hz will remain flickerless after a restart
> 
> someone also said for him going to 160Hz helped, dunno if he tried after a restart
> 
> someone with a 1070 said they dont get this, it might be 1080 only
> 
> Im also not sure if windows version matters


It's 1070 owners as well, and even a few 9xx series owners have crept out too.

https://forums.geforce.com/default/topic/939358/geforce-1000-series/gtx-1080-flickering-issue/

https://forums.geforce.com/default/topic/943822/geforce-drivers/announcing-geforce-hotfix-driver-368-51-released-6-17-16-/

it started off with 1080 owners. then when the 1070 was released, 1070 users started to pop up too.

People are not even sure if it's a high refresh rate issue, as there are now quite a few users complaining about it on 60 Hz max displays. G-Sync displays and non-G-Sync, DisplayPort and DVI, single and multi monitor. It's completely random, and people's "fixes" have been all over the place: some fix it by setting the refresh rate lower, others by setting adaptive or maximum performance in the Nvidia control panel; for others nothing works.

This has been going on for a little over a month now. The last response from Nvidia in those threads was July 2nd, and they only NOW asked people to start posting their displays, graphics cards, and refresh rates. Pretty ridiculous on Nvidia's part.


----------



## KillerBee33

Quote:


> Originally Posted by *orlfman*
> 
> its 1080 owners as well. even a few 9xx series owners have crept out too.
> 
> https://forums.geforce.com/default/topic/939358/geforce-1000-series/gtx-1080-flickering-issue/
> 
> https://forums.geforce.com/default/topic/943822/geforce-drivers/announcing-geforce-hotfix-driver-368-51-released-6-17-16-/
> 
> it started off with 1080 owners. then when the 1070 was released, 1070 users started to pop up too.
> 
> people are not even sure if its a high refresh rate issue as there are now quite a few users complaining about it on 60hz max displays. gsync displays to non gsync. displayport and dvi. single and multi monitor. its completely random. people's "fixes" have been all over the place. some fix it by setting the refresh rate lower. others by setting adaptive or maximum performance in the nvidia control panel. others nothing works.
> 
> this has been going on for a little over a month now. last response from nvidia in those threads was july 2nd and they NOW asked people to start posting their displays, graphic cards, and refresh rates. pretty ridiculous on nvidias part.


I have no flickering issues but NVIDIA has a HotFix driver addressing this issue
http://nvidia.custhelp.com/app/answers/detail/a_id/4166/~/geforce-hotfix-driver-368.51
Tried installing it on a 1080: FAILED.


----------



## ChevChelios

I'm not sure I believe those complaints about 60 Hz non-G-Sync monitors flickering on older cards.

Or at least it's another issue.. not sure where it would even come from suddenly, or why a regular 60 Hz monitor would suddenly start flickering.

however the 1080 + 144hz (Gsync) desktop flicker bug seems to be real

you can set desktop to 120hz and keep 144 for games while waiting for driver fix


----------



## orlfman

Quote:


> Originally Posted by *KillerBee33*
> 
> I have no flickering issues but NVIDIA has a HotFix driver addressing this issue
> http://nvidia.custhelp.com/app/answers/detail/a_id/4166/~/geforce-hotfix-driver-368.51
> Tried installing it on a 1080 , FAILED.


I linked the hotfix (it's my second link), and if you read through it, it doesn't fix desktop flickering (only flickering in fullscreen games), nor does it fix game flickering for a good number of the users who installed it.

I too have a 1080 and haven't noticed any flickering, but a good number of people have. Nearly a month and 21 pages of complaints in one thread, plus another 8 in the hotfix thread.

I feel sorry for those who RMA'd their cards or sent back their monitors thinking it was a hardware issue. A few in the main thread have already said they sent back their cards because Nvidia hasn't made it clear whether it's software or hardware. With Pascal in short supply they won't be getting replacement cards for a while...

Edit:
Also, you're not the only one having issues with failing driver installs. A lot of others have had similar issues, along with TDRs.


----------



## KillerBee33

Quote:


> Originally Posted by *orlfman*
> 
> i linked the hotfix (its my second link) and if you read through the hotfix it doesn't fix desktop flickering (only fixes flickering in fullscreen games) and nor does it fix game flickering for a good number of the users who installed it.
> 
> i too have a 1080 and haven't noticed any flickering but a good amount have been. nearly a month and 21 pages of complaints in one thread and another 8 in the hotfix thread.
> 
> i feel sorry for those who rma their cards or sent back their monitors thinking it was a hardware issue. a few in the main thread have already stated they sent back their cards due to nvidia not making it clear if its software or hardware. with pascal being in short supply they won't be getting replacement cards for awhile....
> 
> edit:
> also you're not the only one having issues with failing driver installs. a lot of others have been having similar issues along with tdr's.










Not sure what to say here; sending a card back because of a bad driver sounds rather rational.
I'm locked to 60 Hz and run all games in full screen with VSYNC on. A single game has always had this issue: Just Cause 3 flickered on the 980, and now it flickers on the 1080.


----------



## zGunBLADEz

Anyone with an MSI GTX 1080 SEA HAWK X, the one with the AIO from Corsair,

who can provide a BIOS...


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> anyone with a MSI GTX 1080 SEA HAWK X the one with the AiO from corsair?
> 
> That can provide a bios...


Any idea what model the 1080 Sea Hawk uses: H55, H75 or H90?


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Any idea what model 1080 Sea hawk uses , H55 , H75 or H90?


Here's the review:
https://www.guru3d.com/articles-pages/msi-geforce-gtx-1080-sea-hawk-x-review,2.html

I don't see any mention of what they are using..

But the overclocking is one of the highest I have seen on an Nvidia reference board.


----------



## mr2cam

I am not getting this flicker issue other people are getting with my 1080 / XB270HU, running 368.39. Although I am getting the green / black background bug in overwatch, very annoying..


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> heres the review
> https://www.guru3d.com/articles-pages/msi-geforce-gtx-1080-sea-hawk-x-review,2.html
> 
> I dont see no mention on what they are using..
> 
> But the overclocking is one of the highests i have seen in nvidia reference
> https://www.guru3d.com/articles-pages/msi-geforce-gtx-1080-sea-hawk-x-review,2.html


Got it







Good to know, btw: "The MSI GTX 1080 SEA HAWK X makes good use of the Corsair (H55) closed loop liquid-cooling solution."
Now wondering, if I purchase an H55 or H90, will it have a bracket for the GPU?


----------



## uberwootage

The GTX 980 Ti and Titan X brackets work. You missed out: I bought the GTX 980 Ti EVGA Hybrid cooler on Amazon for $60. Mounted right up. Loads hit a max of 48C at 2165 MHz. I think it's back up to $100 now, but for $60 that was one hell of a deal, and it works great with the stock FE cooler.


Quote:


> Originally Posted by *KillerBee33*
> 
> Got it
> 
> 
> 
> 
> 
> 
> 
> Good to know btw. The MSI GTX 1080 SEA HAWK X makes good use of the Corsair (H55) closed loop liquid-cooling solution.
> Now wondering if purchase H55 or H90 will it have a bracket for GPU


----------



## KillerBee33

Quote:


> Originally Posted by *uberwootage*
> 
> The GTX 980 Ti and Titan X brackets work. You missed out: I bought the GTX 980 Ti EVGA Hybrid cooler on Amazon for $60. Mounted right up. Loads hit a max of 48C at 2165 MHz. I think it's back up to $100 now, but for $60 that was one hell of a deal, and it works great with the stock FE cooler.


Hehe, I had it in my Amazon basket at $59 while waiting for the 1080; next time I looked it was $99. I have one installed on my 980 and it does a good job; the highest I've seen was 60C on a hot day with the AC off. I was thinking of trying something else this time.


----------



## VSG

Yeah as mentioned above, the existing hybrid AIO cooling solutions for reference Maxwell boards work for reference Pascal also.


----------



## KillerBee33

Pascal cards are a bit thinner than Maxwell, so what I did to the 980 probably won't work.




But I wouldn't mind trying


----------



## Ascendor81

They fixed their fan curve.


----------



## uberwootage

Quote:


> Originally Posted by *KillerBee33*
> 
> Hehe , i had it in my Amazon basket @ 59$ wile waiting for the 1080 , next time i looked it was 99$ . Have one installed on my 980 , does a good job , highest i've seen was 60 on a hot day AC off. Was thinking of trying something else this time


As soon as I saw it I texted my buddy at 11:30, 911, and told him to order it and Amazon Prime that lol. It was picked up at 12:30 lol. Nothing like last minute. Pascal seems to respond really well to water. $60, you can't beat it. Heck, I would have bought it just because it was cheap lol.

I haven't water cooled a card since the 6600GT days lol. So I'm really happy at how flat the temps are. Can't wait for the TDP to be unlocked so this can really start to show its teeth.


----------



## KillerBee33

Quote:


> Originally Posted by *uberwootage*
> 
> As soon as I saw it I texted my buddy at 11:30, 911, and told him to order it and Amazon Prime that lol. It was picked up at 12:30 lol. Nothing like last minute. Pascal seems to respond really well to water. $60, you can't beat it. Heck, I would have bought it just because it was cheap lol.
> 
> I haven't water cooled a card since the 6600GT days lol. So I'm really happy at how flat the temps are. Can't wait for the TDP to be unlocked so this can really start to show its teeth.


YeAHHH, I think every single one of us is waiting for Pascal NVFlash and a BIOS tweaker.




I'm just looking into some options, but the main idea is still to wait for the EVGA Hybrid AIO for the 10 series.




By the way, the H55 is the cheapest of the three available from Corsair on that SEA HAWK; don't know why, just felt I should mention it. I think the 1080 might benefit from the H90 and its 140 mm radiator.


----------



## Pendulum

As mentioned earlier my card is whining under load / high FPS. However, the card is whining ~70% less now and barely audible when my system fans are running at a higher speed in a silent room.
Completely inaudible with my headphones on with nothing playing on them.

What I found worked best for me to eliminate some of the sound was just breaking in the card, I ran it at 120% power OC'd as high as it would go and just kept it under load for hours at a time. Immediately after exiting a game I would run Heaven benchmark to keep it loaded.
Locking frame rate to 60 fps (60Hz monitor) helped a lot since it whines under high fps.
Running a slightly more aggressive fan setup to drown the sound out more.

I'm probably going to keep the card, it doesn't dip under 2012MHz at 65C after endless hours of stress, it doesn't dip down to 1999MHz until reaching 73C+. Overall I'm happy with the performance of the card so far.


----------



## barsh90

Quote:


> Originally Posted by *bp7178*
> 
> With two OC'd 1080s feeding a Dell 34" UHD, I saw zero difference between one ribbon, two ribbon, an old style PCB 3-way bridge, and the new HB bridge.


Does anyone know what's the point of the new HB bridge then?


----------



## BrainSplatter

Quote:


> Originally Posted by *barsh90*
> 
> Does anyone know what's the point of the new HB bridge then?


Depends very much on game. New Doom and Witcher 3 in 4K seem to show clear difference.


----------



## ChevChelios

finally got and installed my G1 1080

368.39 WHQL drivers

absolutely no issues so far, fans dont spin in idle/desktop and my temps there are still ~48-49C

in an hour+ of Overwatch, I had max 65C and a max of ~44-45% fan .. 1923 Mhz rock solid stable core for that entire OW session - I didnt do any manual OC yet, didnt even enable G1 OC mode via the Gigabyte utility, didnt flash new BIOS (it seems to be for fans and mine work fine already), this is straight out of the box (which is by default in Gaming Mode)..

I heard there is some 1080 driver issue with OW (?), but I have the default Optimal Power (not Max Performance) set in NVCP, and everything was perfect in OW - GPU utilization, 1923Mhz stable core etc. (however I do have high performance set in windows power settings) .. no crashes/drops or anything .. I also dont have GF experience installed

havent gotten my XB271HU yet, so I cant check 144Hz flicker bug yet


----------



## boredgunner

Is anyone else getting terrible ArmA 3 performance on their GTX 1080? My friend and I are; we got far better performance on our previous Titan X and GTX 980 Ti respectively. Must be a driver issue.


----------



## flexus

Quote:


> Originally Posted by *Jquala*
> 
> [email protected] around 65c doesn't sound too outrageous right? Even at 5ghz hottest core I've seen is 69c


Have you tried a lower voltage? I don't have Skylake, but I had the same experience as you when I dialed in my i7-4790K a couple of years ago; more voltage actually hurt stability.
To get a stable clock at 4.8 GHz I had to back down to 1.3 V.


----------



## Romir

I just wanted to report the Koolance GPU-220 universal gpu block works perfectly with the FE. The mounting bolts are thin enough to fit through both the stock backplate and the cooling plate holes.

After 15m in the Witcher 3 I observed 31c gpu temps at stock and then 34c when overclocked. Even at those temps my card isn't stable beyond 2139mhz in games. Underwhelming, but fully expected.

I actually had to adjust my radiator fan profiles because the air water deltaT wasn't rising above 5c with one 480mm radiator cooling it. That kept the fans on the second 480 from spinning up.


----------



## ssgwright

anyone know where I can pick one of these up? everywhere seems to be sold out


----------



## boredgunner

Quote:


> Originally Posted by *ssgwright*
> 
> anyone know where I can pick one of these up? everywhere seems to be sold out


It's a race. Spam F5 until one becomes available. Although I would not buy just any GTX 1080, only one of a select few, and none of them the Founders Edition (unless you plan on water cooling).


----------



## ssgwright

I am watercooling but what are these "select few" you speak of?


----------



## AllGamer

Quote:


> Originally Posted by *Romir*
> 
> I just wanted to report the Koolance GPU-220 universal gpu block works perfectly with the FE. The mounting bolts are thin enough to fit through both the stock backplate and the cooling plate holes.
> 
> After 15m in the Witcher 3 I observed 31c gpu temps at stock and then 34c when overclocked. Even at those temps my card isn't stable beyond 2139mhz in games. Underwhelming, but fully expected.
> 
> I actually had to adjust my radiator fan profiles because the air water deltaT wasn't rising above 5c with one 480mm radiator cooling it. That kept the fans on the second 480 from spinning up.


ohh! thanks for the info

this is good news for those with FE










Now I can mod my current FE for hybrid water/air using the Koolance universal GPU-220 block.

I wanted to do this for a while already, but the EVGA stuff mentioned in the Hybrid thread was not easy to find.

Incidentally, EVGA mentioned they will release generic AIO GPU blocks compatible with all FE cards.

Well, now that you've found the Koolance works, this is much better: since it's not an AIO, it can be added to any existing loop if the system is already on water.


----------



## chronicfx

I have an XB270HU monitor running at 144Hz without any desktop flickering. I wonder if it is only "overclocked" monitors that are having the issue?


----------



## Shadowdane

Quote:


> Originally Posted by *barsh90*
> 
> Does anyone know what's the point of the new HB bridge then?


The HB SLI bridge is designed to improve frame pacing and frame latency at high resolutions.
All it will do is reduce stuttering and micro-stuttering; you might see a slight improvement in minimum framerates.

I don't know why people thought this was going to bring better FPS performance
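For what it's worth, the raw bandwidth difference is easy to ballpark. Assuming the commonly cited interface clocks (legacy flexible bridge: one link at 400 MHz; Pascal HB bridge: two links at 650 MHz; these are community-reported figures, not official specs), the relative bandwidth works out like this:

```python
# Ballpark of SLI bridge bandwidth relative to a legacy ribbon bridge.
# Link counts and clocks are commonly cited community figures, not
# NVIDIA-published specs: legacy = 1 link @ 400 MHz, HB = 2 links @ 650 MHz.

def relative_bandwidth(links, clock_mhz, baseline=(1, 400)):
    """Bandwidth relative to the baseline (links, clock_mhz) bridge."""
    base_links, base_clock = baseline
    return (links * clock_mhz) / (base_links * base_clock)

print(relative_bandwidth(1, 400))  # legacy ribbon: 1.0x
print(relative_bandwidth(2, 650)) # HB bridge: 3.25x the baseline
```

More inter-GPU bandwidth mostly buys smoother frame delivery at 4K+ rather than a higher average FPS, which matches what people are reporting in this thread.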


----------



## boredgunner

Quote:


> Originally Posted by *ssgwright*
> 
> I am watercooling but what are these "select few" you speak of?


MSI Armor and Gaming X (virtually no coil whine reports, good price on the former and very good cooling on the latter, good overclocking), Gigabyte Xtreme Gaming (very good cooler, minimal coil whine reports, lots of people seem to be getting over 2.1 GHz core; you just need to pull the shroud out a bit so the middle fan doesn't hit it), Zotac AMP Extreme (perhaps the best cooler and overclocker), and maybe the EVGA FTW (haven't seen coil whine reports, very good cooler).


----------



## chronicfx

Quote:


> Originally Posted by *Shadowdane*
> 
> HB SLI Bridge is designed to improve frame-pacing and frame latency at high resolutions.
> All it will do is reduce stuttering and micro-stuttering might see a slight improvement in minimum framerates.
> 
> I don't know why people thought this was going to bring better fps performance


Several games are showing better performance. The example screenshot I am giving is 5K, but you cannot deny that it is an "improved" bridge.



Okay, the screenshot is no good, but here is the link; just drag through it until you see all the games they tested.


----------



## Shadowdane

Quote:


> Originally Posted by *chronicfx*
> 
> Several games are showing better performance. The example screenshot I am giving is 5k but you cannot deny that it is an "improved" bridge.
> 
> 
> 
> Okay screenshot is no good but here is the link, just drag through it until you see all the games they tested.


Yeah, it seems it does depend on the game, and also on whether you're running very high resolutions, basically 4K or higher.


----------



## L4TINO

I own an MSI GTX 1080 Gaming X and I seem to have hit a rather standard silicon lottery, only being able to run a 2012 MHz core clock stable.
I've been trying different ways to get it higher, but no luck so far, using MSI Afterburner 4.3.0 beta 4.
I have been hearing about the memory clock holding FPS back if you go too high; I haven't tested that yet, but I have been able to get it to 5400 MHz stable.
Currently, upping the voltage to the max of 1.0810 V (even reaching 1.0940 V rarely), or lowering it to stock at around 1.0500 V while under load, does not change the max overclock for me.
I even tried switching the BIOS from Gaming to OC and back with the new BIOS that MSI released.


----------



## fayzaan

I have the same experience: I can get the card to run at 2012, but then due to temps it drops to 1999 and usually settles at around 1974. I was not able to get it stable any higher. Are you getting driver crashes? That's what happens when I try to go over 2000 on the core.

I have the MSI GTX 1080 Gaming (non-x)


----------



## x7007

About the DPC issue:

Do you guys have HPET enabled? It seems enabling HPET fixes the issue. The first jump to 1530+ always happens when I start LatencyMon, so it must be something with the power saving of the 1080, but it doesn't jump to an extreme 10000 anymore.




I'm still checking, but I don't see the high jumps anymore!

So whoever has a 1080 should enable HPET. I don't know why; they must have changed something with Pascal to get better timings.

With the BIOS settings you see in my sig, I don't have the DPC issue and I don't have the idle clock issue with the 1080. The throttling issue must come from setting CPU Power Duty Control to Extreme instead of T.Probe, and CPU Power Phase Control to Extreme instead of Optimized; that's what I think from all the tests. I didn't have the issue for 7 days when I used similar settings (which I don't remember exactly), but back then I didn't use Extreme on CPU Power Phase Control; when I did, I think the problem occurred. So it's not 100% an NVIDIA problem; it might just be heat on the motherboard.


----------



## Willp1108

hi guys,

I see lots of people owning the MSI gtx 1080 card.
Just want to ask if anyone owns the Asus GTX 1080 Strix? If yes, how is it performing? Are the fans loud under load?


----------



## L4TINO

Quote:


> Originally Posted by *fayzaan*
> 
> I have the same experience: I can get the card to run at 2012, but then due to temps it drops to 1999 and then usually settles at around 1974. I was not able to get it stable any higher. Are you getting driver crashes? That's what happens when I try to go over 2000 on the core.
> 
> I have the MSI GTX 1080 Gaming (non-x)


My card starts at 2050MHz but drops to 2032 and then 2012, and that's within a few seconds. You get driver crashes when the overclock is not stable.
My clock is stable at 2012MHz.


----------



## max883

The MSI GTX 1080 Gaming X is a VERY good card! Think I got a good one.







I tested the most demanding games at 4K. Temps never got above 73C. Can't hear the card!

2100MHz core

11,000 mem

25,200P GPU score in Fire Strike


----------



## L4TINO

Quote:


> Originally Posted by *max883*
> 
> The MSI GTX 1080 Gaming X is a VERY good card! Think I got a good one.
> 
> 
> 
> 
> 
> 
> 
> I tested the most demanding games at 4K. Temps never got above 73C. Can't hear the card!
> 
> 2100MHz core
> 
> 11,000 mem
> 
> 25,200P GPU score in Fire Strike


It's very odd. I had an MSI 980 Ti Gaming with a GPU score of 20198,
and now with the 1080 Gaming X I get 21374.
I've always gotten lower scores than everyone; it would be great to know the reason, but I have never looked into it.


----------



## Eorzean

I just listed my 1070 for sale after only owning it for a few days now.

For 1440p gaming, I want my card to last a little longer.

Now I just need to find an MSI Gaming 1080 in stock somewhere. Does anyone know if the X is binned higher now that they've been out for awhile? Or is it a complete money grab? My 'X' 1070 was only able to OC to around 2050Mhz stable... Anyway, I'll probably just grab the non-X because it's 80 bucks cheaper.


----------






## Setzer

Quote:


> Originally Posted by *ChevChelios*
> 
> hmm, thats pretty neat, thx
> 
> will this method also make it use 165Hz (instead of 144) in games if your monitor is OCed to 165 ?


If, like me, you're using an ASUS PG279Q overclocked to 165Hz through the built-in screen menu/software, then there is *NO* 144Hz option left








165 has taken over the slot of the 144 option.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Eorzean*
> 
> I just listed my 1070 for sale after only owning it for a few days now.
> 
> For 1440p gaming, I want my card to last a little longer.
> 
> Now I just need to find an MSI Gaming 1080 in stock somewhere. Does anyone know if the X is binned higher now that they've been out for awhile? Or is it a complete money grab? My 'X' 1070 was only able to OC to around 2050Mhz stable... Anyway, I'll probably just grab the non-X because it's 80 bucks cheaper.


There has been no binning of GPUs this round so far. The last time there was some sort of binning was the EVGA KPE 980 Ti, for ASIC percentage.

It's all luck of the draw what kind of clocks you get with a 1080.


----------



## ChevChelios

Quote:


> Originally Posted by *Setzer*
> 
> If, like me, you're using an ASUS PG279Q overclocked to 165Hz through the built-in screen menu/software, then there is *NO* 144Hz option left
> 
> 
> 
> 
> 
> 
> 
> 
> 165 has taken over the slot of the 144 option.


i see, cool

so far Ive heard the following:

- "Just updated the BIOS for my GTX 1080 G1 Gaming and the flickering is gone."
- someone said simply setting 165/160Hz for everything removed the flicker for him
- set 120Hz for the Windows desktop and then leave 144/165Hz for 3D gaming
- for some, setting the NVCP power option to Maximum Performance helps
- some don't get the flicker at all, and for some the hotfix driver helps

I also heard 165Hz removes the split vertical line issue on high-refresh G-Sync monitors for some, will try that too


----------



## keem21

To anyone using Zotac 1080 FE cards: changing the thermal paste resulted in an 8C temperature drop. Zotac must have used low-quality paste.


----------



## BrainSplatter

Quote:


> Originally Posted by *keem21*
> 
> To anyone using Zotac 1080 FE cards: changing the thermal paste resulted in an 8C temperature drop. Zotac must have used low-quality paste.


I thought all FE cards came from the same factory and were manufactured according to specs from NVIDIA.


----------



## pez

Quote:


> Originally Posted by *ChevChelios*
> 
> finally got and installed my G1 1080
> 
> 358.39 WHQL drivers
> 
> absolutely no issues so far, fans dont spin in idle/desktop and my temps there are still ~48-49C
> 
> in an hour+ of Overwatch, I had max 65C and a max of ~44-45% fan .. 1923 Mhz rock solid stable core for that entire OW session - I didnt do any manual OC yet, didnt even enable G1 OC mode via the Gigabyte utility, didnt flash new BIOS (it seems to be for fans and mine work fine already), this is straight out of the box (which is by default in Gaming Mode)..
> 
> I heard there is some 1080 driver issue with OW (?), but I have the default Optimal Power (not Max Performance) set in NVCP, and everything was perfect in OW - GPU utilization, 1923Mhz stable core etc. (however I do have high performance set in windows power settings) .. no crashes/drops or anything .. I also dont have GF experience installed
> 
> havent gotten my XB271HU yet, so I cant check 144Hz flicker bug yet


I need to remove GeForce Experience and mess with the power settings in the NVIDIA CP (unless it's linked to the Windows power settings). I'm getting some strange graphical glitching in OW. Nothing crazy, but it's a tad strange.


----------



## KillerBee33

Quote:


> Originally Posted by *pez*
> 
> I need to remove GeForce Experience and mess with the power settings in the NVIDIA CP (unless it's linked to the Windows power settings). I'm getting some strange graphical glitching in OW. Nothing crazy, but it's a tad strange.


There are 3 options: Optimal, Adaptive and Performance. Set it to Adaptive; it works best on my MSI FE. Setting it to Performance stays at the highest clock at all times, and setting it to Optimal runs @ 1100MHz in some games. Adaptive is something to try







Also, delete all game saves if you used GFE to optimize, and try clean in-game settings. Optimized with GFE, The Witcher 3 can't even do 1440p without dipping into the 40s; after a fresh save it runs @ 1620p flawlessly.


----------



## ChevChelios

I tried setting Maximum Performance, and for the Windows desktop it didn't change anything - my idle clock was still 240MHz and it even still downclocked me to PCI-E x16 1.1 (under load it goes back to PCI-E 3.0)

I did notice though that with _Optimal Power_ my GPU never went above 16XX Mhz during a match of Heroes of the Storm







(Ultra settings as well) .. not sure if its a bug or the game is just that "light" that it cant even tax a 1080 enough to push it to 19XX Mhz .. OW (Epic settings) did push me to 19XX Mhz

I would assume with Maximum Performance even HotS would set my clock to 19XX MHz, since it's a 3D app

I'll try Adaptive and see if my HotS clocks go higher

will probably leave it at Adaptive in the end

Im never touching GF experience
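For anyone comparing these power modes, clocks can be logged from the command line instead of eyeballing an overlay. A rough sketch, not an official tool: the nvidia-smi query flags are standard, but the parsing helper and sampling loop are my own, and it obviously requires the NVIDIA driver to be installed:

```python
import subprocess
import time

def parse_smi_csv(text):
    """Turn `nvidia-smi --format=csv,noheader,nounits` output like
    '1923, 65' into a list of (sm_clock_mhz, temp_c) tuples."""
    rows = []
    for line in text.strip().splitlines():
        clock, temp = (field.strip() for field in line.split(","))
        rows.append((int(clock), int(temp)))
    return rows

def sample_gpu(samples=10, interval=1.0):
    """Snapshot the SM clock and temperature once per `interval` seconds,
    so Optimal / Adaptive / Max Performance runs can be compared."""
    rows = []
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=clocks.sm,temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True).stdout
        rows.extend(parse_smi_csv(out))
        time.sleep(interval)
    return rows
```

Run it once per NVCP setting during the same game scene; if Optimal Power really is capping a light game at 16XX MHz, it shows up in the log.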


----------



## pez

Quote:


> Originally Posted by *KillerBee33*
> 
> There are 3 options: Optimal, Adaptive and Performance. Set it to Adaptive; it works best on my MSI FE. Setting it to Performance stays at the highest clock at all times, and setting it to Optimal runs @ 1100MHz in some games. Adaptive is something to try
> 
> 
> 
> 
> 
> 
> 
> Also, delete all game saves if you used GFE to optimize, and try clean in-game settings. Optimized with GFE, The Witcher 3 can't even do 1440p without dipping into the 40s; after a fresh save it runs @ 1620p flawlessly.


Interesting. The one thing that bothers me is I get stuck at ~900MHz if I lock my screen. I haven't looked too far into it, but I remember this being mentioned before. Though I'm not sure a fix was really mentioned.


----------



## KillerBee33

Quote:


> Originally Posted by *ChevChelios*
> 
> I tried setting Maximum Performance, and for the Windows desktop it didn't change anything - my idle clock was still 240MHz and it even still downclocked me to PCI-E x16 1.1 (under load it goes back to PCI-E 3.0)
> 
> I did notice though that with _Optimal Power_ my GPU never went above 16XX Mhz during a match of Heroes of the Storm
> 
> 
> 
> 
> 
> 
> 
> (Ultra settings as well) .. not sure if its a bug or the game is just that "light" that it cant even tax a 1080 enough to push it to 19XX Mhz .. OW (Epic settings) did push me to 19XX Mhz
> 
> I would assume with Maximum Performance even HotS would set my clock to 19XX Mhz since its a 3D app
> 
> I'll try Adaptive and see if my HotS clocks go higher
> 
> will probably leave it at Adaptive in the end
> 
> Im never touching GF experience


Might be just me, but every game I use GFE to optimize becomes unplayable; I always delete these game settings folders and start fresh. You can keep the game save itself and put it back, but the game settings must be deleted. GFE is one of the worst pieces of software to have







Installing nVidia Drivers should be this>


----------



## ChevChelios

I always set game settings and graphics personally in-game, always going into the advanced tab as well; it's a sacred process and I'm never letting some filthy app do that instead


----------



## KillerBee33

Quote:


> Originally Posted by *pez*
> 
> Interesting. The one thing that bothers me is I get stuck at ~900MHz if I lock my screen. I haven't looked too far into it, but I remember this being mentioned before. Though I'm not sure a fix was really mentioned.


Same here. I noticed in Far Cry 4 @ 1620p the 1080 showed an 1130MHz clock, and 1580 at the highest, I think; that's why I decided to look into the power settings


----------



## mr2cam

Does anyone else notice that the cooler your card runs, the less voltage it requires? Is that the way GPU boost 3.0 works?


----------



## KillerBee33

Quote:


> Originally Posted by *mr2cam*
> 
> Does anyone else notice that the cooler your card runs, the less voltage it requires? Is that the way GPU boost 3.0 works?


It might be the power issue. Have you checked the clock it runs at when the voltage is low?


----------



## mr2cam

Quote:


> Originally Posted by *KillerBee33*
> 
> It might be the power issue. Have you checked the clock it runs at when the voltage is low?


Ya, during a benchmark the clock will hold a steady 2100MHz. If the card is at like 40C it will run more voltage than at, let's say, 28C, which doesn't make sense to me


----------



## KillerBee33

Quote:


> Originally Posted by *mr2cam*
> 
> Ya, during a benchmark the clock will hold a steady 2100MHz. If the card is at like 40C it will run more voltage than at, let's say, 28C, which doesn't make sense to me


28C ...? What are you cooling it with?


----------



## c0nsistent

Have any SLI 980 owners here recently upgraded to a 1080, and was it worth it?

I'm looking to SLI my 980 because the HSF is damaged and I can't RMA it, but the card works fine. Basically I won't get much by selling it used, so I'm looking at the SLI upgrade path instead of the 1080.
I'd like to know if microstutter and issues of that nature are still a big hindrance.


----------



## mr2cam

Quote:


> Originally Posted by *KillerBee33*
> 
> 28C ...? What are you cooling it with?


H105, highest temp I have seen was 40c


----------



## KillerBee33

Quote:


> Originally Posted by *mr2cam*
> 
> H105, highest temp I have seen was 40c


Nice , NZXT bracket?


----------



## zGunBLADEz

Quote:


> Originally Posted by *Romir*
> 
> I just wanted to report the Koolance GPU-220 universal gpu block works perfectly with the FE. The mounting bolts are thin enough to fit through both the stock backplate and the cooling plate holes.
> 
> After 15m in the Witcher 3 I observed 31c gpu temps at stock and then 34c when overclocked. Even at those temps my card isn't stable beyond 2139mhz in games. Underwhelming, but fully expected.
> 
> I actually had to adjust my radiator fan profiles because the air water deltaT wasn't rising above 5c with one 480mm radiator cooling it. That kept the fans on the second 480 from spinning up.


I'm wondering if you can measure the space, as I have an EK Supremacy VGA (49.5mm all around) for it, but I don't know if I have to shave some metal to make it fit


----------



## zGunBLADEz

Quote:


> Originally Posted by *x7007*
> 
> About the DPC issue:
> 
> Do you guys have HPET enabled? It seems enabling HPET fixes the issue. The first jump to 1530+ always happens when I start LatencyMon, so it must be something with the power saving of the 1080, but it doesn't jump to an extreme 10000 anymore.
> 
> 
> 
> I'm still checking, but I don't see the high jumps anymore!
> 
> So whoever has a 1080 should enable HPET. I don't know why; they must have changed something with Pascal to get better timings.
> 
> With the BIOS settings you see in my sig, I don't have the DPC issue and I don't have the idle clock issue with the 1080. The throttling issue must come from setting CPU Power Duty Control to Extreme instead of T.Probe, and CPU Power Phase Control to Extreme instead of Optimized; that's what I think from all the tests. I didn't have the issue for 7 days when I used similar settings (which I don't remember exactly), but back then I didn't use Extreme on CPU Power Phase Control; when I did, I think the problem occurred. So it's not 100% an NVIDIA problem; it might just be heat on the motherboard.


still have the issue with hpet on here


----------



## mr2cam

Quote:


> Originally Posted by *KillerBee33*
> 
> Nice , NZXT bracket?


Yes sir. I keep it relatively cool in my computer room also, I like it at about 70F. The 1080 seems to run quite a bit cooler than my 980 Ti, I'm guessing because of the lower power draw, but I figured with the smaller die that it would even out; guess not


----------



## KillerBee33

Quote:


> Originally Posted by *mr2cam*
> 
> Yes sir. I keep it relatively cool in my computer room also, I like it at about 70F. The 1080 seems to run quite a bit cooler than my 980 Ti, I'm guessing because of the lower power draw, but I figured with the smaller die that it would even out; guess not


Heh, 70F, that's winter temps.. Well, I keep fishing, but it looks like I might wait for the EVGA AIO; I want to try and keep the whole reference design with some Dremeling


----------



## mr2cam

Quote:


> Originally Posted by *KillerBee33*
> 
> Heh, 70F, that's winter temps.. Well, I keep fishing, but it looks like I might wait for the EVGA AIO; I want to try and keep the whole reference design with some Dremeling


Ya I took an MSI Armor card and put the G10 bracket on it, was extremely simple.


----------



## KillerBee33

Quote:


> Originally Posted by *mr2cam*
> 
> Ya I took an MSI Armor card and put the G10 bracket on it, was extremely simple.


The 1080 is a lot thinner than the 980, so I'm hoping EVGA releases a smaller pump
 
 
 
 
 
 
 

If not, then an H90 and the NZXT Kraken; I really wanted to keep the reference look.


----------



## pphx459

Anyone having issues with their screen not automatically turning off after the set time with the 1080s?


----------



## looniam

Quote:


> Originally Posted by *mr2cam*
> 
> Does anyone else notice that the cooler your card runs, the less voltage it requires? Is that the way GPU boost 3.0 works?


Well, heat adds resistance, which could require more voltage; that then adds more heat... wouldn't be surprised if Boost 3.0 takes advantage of the reverse.
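That feedback loop is easy to sketch with a toy model. To be clear, the numbers below are invented and this is not NVIDIA's actual Boost 3.0 table; it only illustrates the direction of the temperature/leakage/voltage relationship being discussed:

```python
def required_voltage_mv(temp_c, base_mv=1000.0, mv_per_degree=1.5, floor_mv=800.0):
    """Toy model of a temperature-compensated voltage target.

    Hotter silicon leaks more current, so holding the same clock takes a
    bit more voltage; a cooler card can hold that clock at less.
    All coefficients here are made up, for illustration only.
    """
    return max(floor_mv, base_mv + mv_per_degree * (temp_c - 25.0))

# Matches the observation in the thread: the same steady 2100MHz run
# needs less voltage at 28C than at 40C under this model.
cool = required_voltage_mv(28)   # lower target
warm = required_voltage_mv(40)   # higher target
```

The interesting part is that the loop is self-reinforcing in both directions: lower voltage means less heat, which in turn means less required voltage.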


----------



## Romir

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Im wondering if you can measure the space as i have a ek supremacy vga (49.5mm all around) for it but i dont know if i have to shave some metal to make it fit


47.65mm as measured with my calipers just now. The Koolance GPU-220 is 45mm.

BTW, my MCW82 base measures at 49mm wide and I'm not sure if a MCW60's raised core area would clear the plate.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Romir*
> 
> 47.65mm as measured with my calipers just now. The Koolance GPU-220 is 45mm.
> 
> BTW, my MCW82 base measures at 49mm wide and I'm not sure if a MCW60's raised core area would clear the plate.


So I need to shave some metal then
 
 
 
 
 
 
 
 

Both sides are the same size?? Top to bottom and left to right?

PS: repped for the info, thanks


----------



## moustang

Quote:


> Originally Posted by *c0nsistent*
> 
> Have any SLI 980 owners here recently upgraded to a 1080 and was it worth it?
> 
> I'm looking to SLI my 980 because the HSF is damaged and I can't RMA it, but it works fine. Basically I wont get much by selling it used, so I'm looking towards the SLI upgrade path instead of the 1080.
> I'd like to know if microstutter and issues of that nature are still a big hindrance.


I ran SLI 770s for years before upgrading to the 1080. Microstutter is not the issue it used to be, but once in a while it still rears its ugly head when a new game comes out and there isn't a proper SLI profile for it yet. Usually that's fixed within a few days to a week after release.

As for performance, SLI 980s will outperform a single 1080.


----------



## AllGamer

Quote:


> Originally Posted by *pphx459*
> 
> anyone having issues with their screen not automatically turning off after the set time with the 1080's?


I have the opposite problem:

after the screens go into sleep mode, when I come back and wake up the PC,

only 2 of the screens will come back on; the 3rd screen will remain in sleep mode.


----------



## pphx459

Quote:


> Originally Posted by *AllGamer*
> 
> I have the opposite problem:
> 
> after the screens go into sleep mode, when I come back and wake up the PC,
> 
> only 2 of the screens will come back on; the 3rd screen will remain in sleep mode.


Hmm, I think my issue is related to Overwatch. After several hours of playing, the screen won't go to sleep after I exit. If I play for only an hour or so it's fine. Strange..


----------



## kx11

Galax HOF 1080 Benchmarks are out

https://linustechtips.com/main/topic/621072-1080-hof-benchmarks-and-more-pictures/

one badass GPU


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> Galax HOF 1080 Benchmarks are out
> 
> https://linustechtips.com/main/topic/621072-1080-hof-benchmarks-and-more-pictures/
> 
> one badass GPU


I had 970 and 980 reference cards from GALAX, both with NVIDIA vendor chips; both good cards.


----------



## scaramonga

Quote:


> Originally Posted by *x7007*
> 
> About the DPC issue:
> 
> Do you guys have HPET enabled? It seems enabling HPET fixes the issue. The first jump to 1530+ always happens when I start LatencyMon, so it must be something with the power saving of the 1080, but it doesn't jump to an extreme 10000 anymore.
> 
> I'm still checking, but I don't see the high jumps anymore!
> 
> So whoever has a 1080 should enable HPET. I don't know why; they must have changed something with Pascal to get better timings.
> 
> With the BIOS settings you see in my sig, I don't have the DPC issue and I don't have the idle clock issue with the 1080. The throttling issue must come from setting CPU Power Duty Control to Extreme instead of T.Probe, and CPU Power Phase Control to Extreme instead of Optimized; that's what I think from all the tests. I didn't have the issue for 7 days when I used similar settings (which I don't remember exactly), but back then I didn't use Extreme on CPU Power Phase Control; when I did, I think the problem occurred. So it's not 100% an NVIDIA problem; it might just be heat on the motherboard.


The jury is still out on all this DPC stuff; I really don't know what to make of it all.



HPET on in BIOS and off in Windows 10.

*Edit*

2nd kick at the ball:


----------



## ssgwright

Finally getting my 1080. I've got to know without reading over 200 pages... can the BIOS be edited? Are there any available?


----------



## x7007

Quote:


> Originally Posted by *scaramonga*
> 
> Jury is still out on all this DPC stuff, I really don't know what to make of it all?
> 
> 
> 
> HPET on in BIOS and off in Windows 10.
> 
> *Edit*
> 
> 2nd kick at the ball:


So you don't have any jumps with the driver? Did you just install the NVIDIA driver, or did you change anything in the NVCP? What else did you do or change?

Which Intel IRST drivers do you use, 14.8 or 15.0? Are you in RAID 0 or AHCI?

Can you show me the Drivers tab? We need to see if the total execution time is high there


----------



## Kielon

Quote:


> Originally Posted by *max883*
> 
> The MSI GTX 1080 Gaming X is a VERY good card! Think I got a good one.
> 
> 
> 
> 
> 
> 
> I tested the most demanding games at 4K. Temps never got above 73C. Can't hear the card!
> 
> 2100MHz core
> 
> 11,000 mem
> 
> 25,200P GPU score in Fire Strike


Could you link your score please? I get a 24,906 GPU score @ 2164MHz: http://www.3dmark.com/fs/9149743


----------



## x7007

No one else has the DPC issue?

I'm not sure it's the 1080 anymore; even without the drivers, or with the device disabled, I have jumps.. why do I have those jumps? I never had them before.

When you open Chrome, you don't have jumps?


----------



## versions

Quote:


> Originally Posted by *x7007*
> 
> No one else has DPC issue ?
> 
> I'm not sure if it's 1080 anymore, even without drivers or device disabled I have jumps.. why do I have those jumps, I never had those.
> 
> When you open Chrome you don't have jumps ?


While I'm not familiar with what a DPC issue is, I haven't noticed anything out of the ordinary with my 1080.


----------



## x7007

Quote:


> Originally Posted by *versions*
> 
> While I'm not familiar with what a DPC issue is, I haven't noticed anything out of the ordinary with my 1080.


Did you check it? There are no random jumps to 15000 or 10000?

Can you list the drivers you are using, and check the Drivers tab in LatencyMon?


----------



## versions

Quote:


> Originally Posted by *x7007*
> 
> Did you check it ? there are no random jumps to 15000 or 10000 ?
> 
> can you list the drivers you are using and check the Drivers tab in latency mon ?


Again, I don't know what this is all about. Just wanted to say that my computer seems to be working perfectly normally since I installed the 1080; I haven't noticed anything out of the ordinary. I'm also not at home at the moment, so I don't have access to it. I am using 368.51.


----------



## x7007

Quote:


> Originally Posted by *versions*
> 
> Again, don't know what this is all about. Just wanted to say that my computer seems to be working perfectly normally since I installed the 1080, haven't noticed out of the ordinary. I'm also not at home at the moment so don't have access to it. I am using 368.51.


We are not asking whether it works perfectly normally; we are asking you to check LatencyMon to see if it shows the issue. Other people have the same issue. I didn't have it with the 970.
I disabled every device I had: USB3, the Creative ZxR, the GTX 1080, and the SolarFlare network device. I had 7-30 in LatencyMon, not a single jump. As soon as I re-enabled the GTX 1080, the jumps were back.

I don't know if it's the power management on Optimal Power, but I did try to change it and it didn't help. With the 970 I think it worked on Adaptive, but still, I tried that too. Something in the drivers doesn't work properly.

Look at this link:

https://forums.geforce.com/default/topic/941579/gtx-1080-high-dpc-latency-and-stuttering/

*If I move my G502 mouse (1000Hz USB polling) very fast, non-stop, I get non-stop high DPC, between 600-800. What is your USB polling set to?*


----------



## ChevChelios

I don't see anything bad happening in games (or videos), everything runs smooth, but maybe I'm not seeing it or looking right

I definitely don't have any audio crackle and never did in my life

how does stutter present itself?


----------



## x7007

Quote:


> Originally Posted by *ChevChelios*
> 
> I dont see anything bad happening in games (or videos), runs smooth, but maybe Im not seeing it or looking right
> 
> I definitely dont have any audio crackle and never did in my life
> 
> how does stutter present itself ?


It's hard to notice. Why not just run LatencyMon and check? That's what we need to know


----------



## ChevChelios

alright i will later today

is it free software ?

do I need the "DPC Latency checker" program or some other one (Latency Monitor ?)


----------



## x7007

Quote:


> Originally Posted by *ChevChelios*
> 
> alright i will later today
> 
> is it free software ?
> 
> do I need the "DPC Latency checker" program or some other one (Latency Monitor ?)


Latencymon

http://www.resplendence.com/download/LatencyMon.exe
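A quick built-in cross-check alongside LatencyMon: Windows ships a performance counter for DPC load that typeperf can sample from an ordinary command prompt. It won't attribute time to individual drivers the way LatencyMon's Drivers tab does, but it is enough to see the spikes people are describing (-si is the sample interval in seconds, -sc the sample count):

```bat
:: Total DPC time across all cores, one sample per second, 30 samples
typeperf "\Processor(_Total)\% DPC Time" -si 1 -sc 30

:: Per-core view, handy when a single core is eating all the DPCs
typeperf "\Processor(*)\% DPC Time" -si 1 -sc 5
```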


----------



## ChevChelios

alright Ill check it

also, just in case - I have Win10 x64; in my BIOS I have ACPI HPET enabled (and I see it in Device Manager), but nothing else - I have not forced HPET via a cmd line .. as far as I understand, on Win10 it's best to leave HPET enabled in the BIOS (if the option is there at all) but not force anything in Windows .. Win10 will choose by itself what's best to use
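Since forcing HPET "via a cmd line" keeps coming up: that refers to the useplatformclock boot flag, set from an elevated command prompt and taking effect after a reboot. For reference only, note the undo; many Win10 users report things behave best with the flag left unset, letting Windows choose:

```bat
:: Inspect the current boot entry; look for a useplatformclock line
bcdedit /enum {current}

:: Force Windows to use HPET as the platform timer (reboot to apply)
bcdedit /set useplatformclock true

:: Undo: delete the flag so Windows picks its own timer source
bcdedit /deletevalue useplatformclock
```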


----------



## Eorzean

Quote:


> Originally Posted by *x7007*
> 
> We are not asking whether it works perfectly normally; we are asking you to check LatencyMon to see if it shows the issue. Other people have the same issue. I didn't have it with the 970.
> I disabled every device I had: USB3, the Creative ZxR, the GTX 1080, and the SolarFlare network device. I had 7-30 in LatencyMon, not a single jump. As soon as I re-enabled the GTX 1080, the jumps were back.
> 
> I don't know if it's the power management on Optimal Power, but I did try to change it and it didn't help. With the 970 I think it worked on Adaptive, but still, I tried that too. Something in the drivers doesn't work properly.
> 
> Look at this link:
> 
> https://forums.geforce.com/default/topic/941579/gtx-1080-high-dpc-latency-and-stuttering/
> 
> *If I move my G502 mouse (1000Hz USB polling) very fast, non-stop, I get non-stop high DPC, between 600-800. What is your USB polling set to?*


Do you have Intel speedstep on? Disabling all the power saving features on my CPU helped bring down the latency stutters to acceptable levels on the 1070 for me. I still get spikes to 450-500, but it stays in the green at least. Before that it spiked all over the place, even at idle (the spikes are related to your CPU speed ramping down and back up again). It's a very weird issue and seems to only be affecting some people.


----------



## ChevChelios

Ive always had all CPU power savings on

if I see some high latency today Ill try disabling them


----------



## x7007

Quote:


> Originally Posted by *Eorzean*
> 
> Do you have Intel speedstep on? Disabling all the power saving features on my CPU helped bring down the latency stutters to acceptable levels on the 1070 for me. I still get spikes to 450-500, but it stays in the green at least. Before that it spiked all over the place, even at idle (the spikes are related to your CPU speed ramping down and back up again). It's a very weird issue and seems to only be affecting some people.


I always enabled EIST only, since I didn't see any effect on SSD 4K speed, games, programs, Windows, or YouTube 4K. EIST shouldn't cause an issue with the GTX 1080 if it didn't with the 970, am I right? I don't know why some people don't have the problem; they don't give much information - BIOS settings, software, or Windows settings.


----------



## Eorzean

Quote:


> Originally Posted by *x7007*
> 
> I always enabled EIST only, since I didn't see any effect on SSD 4K speed, games, programs, Windows, or YouTube 4K. EIST shouldn't cause an issue with the GTX 1080 if it didn't with the 970, am I right? I don't know why some people don't have the problem; they don't give much information - BIOS settings, software, or Windows settings.


EIST caused issues with my 1070 but not my 960. Same setup. Same everything... All I did was pull my 1070 and swap it with the 960 to see if any other hardware was the culprit. It turned out the 960 was perfectly fine and never went into the red under any circumstances. The 1070, however, spiked to the heavens. Turned EIST off, and it stopped spiking as high (note: under most conditions; FS Ultra benchmarks stressed it too much, I guess). If you do have latency issues, it's worth a shot. Hopefully some more people with absolutely no issues can start providing more information and we can diagnose what the problem is. NVIDIA is pretty much AWOL on the matter and hasn't responded, even in their own forums, yet.


----------



## ChevChelios

Quote:


> In my case it seems like i only get high latency readings when idling with my 1080 , *seems like it's a combination of Intel speedstep stuff and pascals new low power modes.*
> 
> When i set high power mode in windows 10 x64 CP latency much better , but even with that when the system detects inactivity low power modes still kick in and higher latency readings return again.
> 
> I get between 1000us and 3000us when idling , but when the card is being utilised the readings are perfectly normal , lower than my 780 , gaming is buttery smooth , no audio problems.


could this be it ?

also I realized that I haven't installed any specifically downloaded Intel chipset/LAN etc. drivers for my ASRock Z77 Pro3 mobo - Win10 apparently handled all that automatically .. should I track down some drivers for it and install them? and if so, do I look on Intel's site or on ASRock's site?


----------



## Ascendor81

My scores with my ASUS Strix non-OC (Graphics Score 11 229), 8+2 phase.

I BIOS-flashed the Strix OC BIOS and did -15MHz in Afterburner to maintain stability in games (GTA V, ultra max details, for over 2 hours): http://www.3dmark.com/fs/9140664

This is SLI with 2 of the same card, with the same BIOS flashed, at -10MHz in Afterburner: http://www.3dmark.com/fs/9156456

Photo: it is blood red in real life









http://imgur.com/ryBUHqN

 (looks like smoke in there, is just the red led light reflecting off the side panel window).


----------



## x7007

Quote:


> Originally Posted by *Eorzean*
> 
> EIST caused issues with my 1070 but not my 960. Same setup. Same everything... All I did was pull my 1070 and swap it with the 960 to see if any other hardware was the culprit. It turned out the 960 was perfectly fine, and never went into the red under any circumstances. The 1070, however, spiked to the heavens. Turned EIST off, and it stopped spiking as high (note: under most conditions, FS Ultra benchmarks stressed it too much, I guess). If you do have latency issues, it's worth a shot. Hopefully some more people with absolutely no issues can start providing more information and we can diagnose what the problem is. Nvidia is pretty much AWOL on the matter and haven't responded, even in their own forums about it yet.


Disabled EIST, and it still jumps to 500-800-12000.

Idle is fine without EIST, but as soon as I open YouTube or Chrome and something is working, the DPC goes up. Did Nvidia change something in Pascal about how it works with Windows timings?

Whoever tests this: open YouTube in Chrome or Edge, and while something is playing in the background, check the DPC. The whole point is why there is high DPC while playing something. Idle is easy to fix; like he said, with EIST disabled there are no more jumps, so we are on the right track. I just need you to confirm that when you open YouTube or Chrome you get big spikes every second while watching a video.

*Watching a 1080p MKV video causes the GTX 1080 to run at max clock speed, and while watching the video I don't have any jumps. That means it's the power saving that causes the spikes.*

It must be the boost clock working wrongly and causing DPC spikes because its timing isn't correct; other people see low DPC because they test wrongly, or have something set that hides the issue.


----------



## ChevChelios

it could be Pascal power savings / Pascal GPU Boost 3.0, or a combo thereof

I'll test it on my rig, but I haven't seen any tangible lag/stutter anywhere yet, so it isn't affecting my real-world performance


----------



## x7007

Quote:


> Originally Posted by *ChevChelios*
> 
> it could be Pascal power savings/Pascal GPU boost 3.0 or combo thereof
> 
> Ill test it on my rig, but I havent seen any tangible lag/stutter yet anywhere, so it isnt affecting my real world performance


It's weird. It must be something with the driver or the card's memory that has a timing issue with the system; something goes nuts after it goes up in clock speed and then drops back to saving mode. It just doesn't return to saving mode correctly.


----------



## Eorzean

Quote:


> Originally Posted by *x7007*
> 
> Disabled EIST , and it still jumps to 500-800-12000
> 
> IDLE is fine without EIST, but as soon as I open youtube or chrome and something is working the DPC goes up. did nvidia changed something with pascal , how it works with windows timings or something ?
> 
> Whoever tests this: open YouTube in Chrome or Edge, and while something is playing in the background, check the DPC. The whole point is why there is high DPC while playing something. Idle is easy to fix; like he said, with EIST disabled there are no more jumps, so we are on the right track. I just need you to confirm that when you open YouTube or Chrome you get big spikes every second while watching a video.


Does it have the message written in green, or the one in red telling you it's not suitable for real-time audio? Mine would spike up to 500 every now and then, but it stayed in the green telling me it was ok. You should open up CPUZ to see if your CPU is still fluctuating speeds. Turn off every power saving / turbo feature your mobo has.
Quote:


> Originally Posted by *ChevChelios*
> 
> it could be Pascal power savings/Pascal GPU boost 3.0 or combo thereof
> 
> Ill test it on my rig, but I havent seen any tangible lag/stutter yet anywhere, so it isnt affecting my real world performance


Just run LatencyMon to even see if you're affected. Mine was jumping high while idle with EIST enabled.... and was even worse while running FS.


----------



## x7007

Quote:


> Originally Posted by *Eorzean*
> 
> Does it have the message written in green, or the one in red telling you it's not suitable for real-time audio? Mine would spike up to 500 every now and then, but it stayed in the green telling me it was ok. You should open up CPUZ to see if your CPU is still fluctuating speeds. Turn off every power saving / turbo feature your mobo has.


It shows me the error, but not in red; just black text saying the system might have buffer underruns, sound skipping and such. It's not red at the moment, or not anymore. The speeds aren't fluctuating anymore since I disabled EIST, or set the minimum CPU speed in Windows power options to 100% instead of 5% when using EIST.
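For anyone who'd rather script the "minimum processor state = 100%" workaround than dig through the Power Options GUI, something like this from an elevated command prompt should do it (these are powercfg's built-in aliases for the active scheme and the minimum processor state; a sketch only, adjust for your own plan):

```shell
:: Set the minimum processor state of the active power plan to 100%
:: (PROCTHROTTLEMIN is the alias for "Minimum processor state").
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 100
:: Re-apply the scheme so the change takes effect immediately.
powercfg /setactive SCHEME_CURRENT
```

To undo it, set the value back to 5 the same way.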

Watching MKV movies works as before.
Games don't seem to have any issues, stuttering, or FPS drops, even the heavy ones.

But the desktop with power saving has big issues with YouTube, Chrome, HTML5 and Flash; it just has big spikes. Even when using Skype there is an issue.

Did any of you test YouTube? Skype? Chrome with many tabs? Something that doesn't max out the GPU clock speed? As long as you test it correctly you will have the same issue .. it's a driver issue, it can't be anything else.

Did you check your GPU clock speed with NVIDIA Inspector or MSI Afterburner? Does it drop to the power-saving state, 291 MHz core / 405 MHz memory? What happens then when you try to watch YouTube or do something light, and what happens after you did something heavy and the GPU clock goes back to power saving?
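If you don't have Inspector or Afterburner handy, nvidia-smi (which ships with the driver) can log the P-state and clocks once per second while you test; a quick sketch:

```shell
:: Poll P-state plus core and memory clocks every second (Ctrl+C to stop).
:: At idle you should see P8 with the low power-saving clocks.
nvidia-smi --query-gpu=pstate,clocks.gr,clocks.mem --format=csv -l 1
```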


----------



## Eorzean

Quote:


> Originally Posted by *x7007*
> 
> It shows me the error, but not in red. just black with the system might have buffer underrun and sound skipping and such. but it's not red at the moment or not anymore. the speed are not fluctuating anymore since I disable EIST or set minimum cpu speed in windows power options to 100% instead 5% for use with EIST.


These are the options I ticked that greatly reduced the latency:



I did pretty extensive tests with my 1070 and a 960, and I'll repost them here:

This is before, with EIST and Intel Turbo Boost enabled:
Quote:


> Did some tests between my GTX 1070 and 960, same hardware and software, only the video card was swapped.
> 
> Idle 1070 (high latency): http://i.imgur.com/eLGnbm5.jpg
> Idle 960 (low latency): http://i.imgur.com/nHNCnXH.jpg
> 
> Under load 1070 FS (high latency): http://i.imgur.com/WBOV2fS.jpg
> Under load 1070 FS Ultra (high latency): http://i.imgur.com/mNaheTS.jpg
> 
> Under load 960 FS (low latency): http://i.imgur.com/3wOmPiq.jpg
> Under load 960 FS Ultra (low latency): http://i.imgur.com/C3WUwCV.jpg


And this is after, with my CPU Ratio Mode set to 'Fixed', which automatically disables EIST and Intel Turbo Boost (I went back and set it back to Dynamic, and then manually turned off EIST to see if there was a difference and there wasn't). Much better:

 *Updated img*

Running that, playing YouTube, and browsing in Waterfox (I just switched from Chrome, heh), my card stayed with the green / suitable for real-time audio message.

But running FS Ultra gave me the red message telling me my DPC was too high, which is why I'm not completely convinced the 'workaround' fixed it:

http://i.imgur.com/mNaheTS.jpg

Also to note, even though it stayed in the green, there were still pretty high spikes of 450-500, that I DIDN'T get with my 960 at all. In the thread over at Nvidia, most people who reported 'fixing' it, still had pretty high spikes IIRC.


----------



## x7007

Quote:


> Originally Posted by *Eorzean*
> 
> These are the options I ticked that greatly reduced the latency:
> 
> 
> 
> I did pretty extensive tests with my 1070 and a 960, and I'll repost them here:
> 
> This is before, with EIST and Intel Turbo Boost enabled:
> And this is after, with my CPU Ratio Mode set to 'Fixed', which automatically disables EIST and Intel Turbo Boost (I went back and set it back to Dynamic, and then manually turned off EIST to see if there was a difference and there wasn't). Much better:
> 
> 
> 
> http://imgur.com/WBOV2fS
> 
> 
> Running that, playing YouTube, and browsing Chrome, my card stayed with the green / suitable for real-time audio message.
> 
> But running FS Ultra gave me the red message telling me my DPC was too high, which is why I'm not completely convinced the 'workaround' fixed it:
> 
> http://i.imgur.com/mNaheTS.jpg
> 
> Also to note, even though it stayed in the green, there were still pretty high spikes of 450-500, that I DIDN'T get with my 960 at all. In the thread over at Nvidia, most people who reported 'fixing' it, still had pretty high spikes IIRC.


So we know the issue: something in the power management isn't working as intended in some scenarios. A driver update should fix it, since the 970 had issues too when it was first released.

EIST shouldn't cause issues like this. The 970 in my laptop maxes out at 250 or so. Windows 8.1 x64, 368.51.


----------



## ChevChelios

I use Firefox 32-bit, and obviously I use HTML5 and sometimes Flash all the time - video streams etc. - all fine

with _Optimal Power_ in NVCP my GPU clock is 240 even while using youtube/twitch .. videos don't lag

I don't use Chrome or Skype


----------



## Avant Garde

Is this : https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/1/

the problem with all GTX1080s?


----------



## KillerBee33

Quote:


> Originally Posted by *Eorzean*
> 
> These are the options I ticked that greatly reduced the latency:
> 
> 
> 
> I did pretty extensive tests with my 1070 and a 960, and I'll repost them here:
> 
> This is before, with EIST and Intel Turbo Boost enabled:
> And this is after, with my CPU Ratio Mode set to 'Fixed', which automatically disables EIST and Intel Turbo Boost (I went back and set it back to Dynamic, and then manually turned off EIST to see if there was a difference and there wasn't). Much better:
> 
> 
> 
> http://imgur.com/WBOV2fS
> 
> 
> Running that, playing YouTube, and browsing Chrome, my card stayed with the green / suitable for real-time audio message.
> 
> But running FS Ultra gave me the red message telling me my DPC was too high, which is why I'm not completely convinced the 'workaround' fixed it:
> 
> http://i.imgur.com/mNaheTS.jpg *Wait, this is the wrong img, let me re-run the test to prove that it's now low again (I went and unboxed my card just to re-test it, to prove it is now much better.*
> 
> Also to note, even though it stayed in the green, there were still pretty high spikes of 450-500, that I DIDN'T get with my 960 at all. In the thread over at Nvidia, most people who reported 'fixing' it, still had pretty high spikes IIRC.


I moved up to 4.6 GHz manually with the same settings, without a single issue so far.


Changed FIXED mode back to DYNAMIC so it doesn't run @ 4.6 100% of the time.


----------



## Eorzean

Quote:


> Originally Posted by *x7007*
> 
> So we know the issue , something with the power management which works not intended in some senerios. Driver should fix the issue cause the 970 had issues too when it was first released.
> 
> Eist shouldn't cause issues like this 970 at my laptop max is 250 or so. Windows 8.1 x64 368.51


That's good to hear, didn't know the 970 had similar issues.


----------



## x7007

Quote:


> Originally Posted by *Avant Garde*
> 
> Is this : https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/1/
> 
> the problem with all GTX1080s?


Yes, that's the thread. It's not just us; I'm trying to figure this out, or get Nvidia to release a driver fix for it. It is a problem and it shouldn't be like this. I have top gear and hardware precisely so I don't have issues like these; the hardware is supposed to be the best in all terms of performance and power saving, and the power saving shouldn't affect performance in any way or cause issues, visible to the naked eye or not.

Quote:


> Originally Posted by *ChevChelios*
> 
> I use Firefox 32-bit and obviously I use HTML5 and sometimes flash all the time - videos streams etc. - all fine
> 
> with _Optimal Power_ in NVCP my GPU clock is 240 even while using youtube/twitch .. videos dont lag
> 
> I dont use Chrome of Skype


Please try Waterfox, which is a 64-bit browser, because we all use 64-bit Chrome for obvious reasons, and check LatencyMon so you can show us what you get when watching YouTube. Saying you don't have issues without checking is just staying blindfolded to an issue we can see exists.


----------



## flexus

Have you tried a lower voltage? I don't have Skylake, but I had the same experience as you when I dialed in my i7-4790K a couple of years ago; more voltage actually hurt stability.
To get a stable clock at 4.8 GHz I had to back down to 1.3 V.
Quote:


> Originally Posted by *Jquala*
> 
> [email protected] around 65c doesn't sound too outrageous right? Even at 5ghz hottest core I've seen is 69c


Quote:


> Originally Posted by *x7007*
> 
> No one else has DPC issue ?
> 
> I'm not sure if it's 1080 anymore, even without drivers or device disabled I have jumps.. why do I have those jumps, I never had those.
> 
> When you open Chrome you don't have jumps ?


Have you tried to disable hardware acceleration in Chrome?


----------



## L4TINO

Yup, now that you guys have mentioned this latency problem, I have noticed it as well. I did a test with LatencyMon and mine is also in the red.


----------



## bfedorov11

New drivers out. Same issue.


----------



## x7007

Quote:


> Originally Posted by *flexus*
> 
> Have you tried at lover voltage? I don`t have skylake, but I had same experience as you when I dialed in my i7-4790K for a couple of years ago, more volt actually hurt the stability.
> To get a stable clock at 4.8 ghz I had to back down to 1.3v.
> 
> Have you tried to disable hardware acceleration in Chrome?


Disabling hardware acceleration doesn't help, and why would I do that? I never have; we're supposed to use hardware acceleration with a piece of gear that costs more than $500. I don't know who thought of that in the first place. For more than 10 years I've always used hardware acceleration, never disabled it.

Quote:


> Originally Posted by *bfedorov11*
> 
> New drivers out. Same issue.


The build didn't change much; we need a new version branch, 369 or 370. 368 is bad.


----------



## JedixJarf

Quote:


> Originally Posted by *ChevChelios*
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5915#bios
> 
> is this the new G1 BIOS ? What does it do ? Is it to enable Gaming/OC mode by default or some other changes ?
> 
> Can you flash this on any G1 1080 unit safely ?


My G1 came with this bios installed.


----------



## flexus

Quote:


> Originally Posted by *x7007*
> 
> disabling hardware acceleration doesn't help, and why would I do that ? I never did, we suppose to use Hardware with a piece of gear that cost more than 500$ . I don't know who thought about this from the first place, for more 10 years I always hardware, never disabled it.
> The build didn't change much, we need new version build to 369 or 370. 368 is bad


I remember it was a driver issue that gave black artifacts in Chrome along with freezing and high GPU usage, so disabling hardware acceleration was a temporary solution then.
It is common to use workarounds while waiting for a fix from a vendor.


----------



## boredgunner

Two strange things I've noticed with my GTX 1080. Anyone else suffering from either of these?


- Poor performance in ArmA 3. My friend gets the same and he has two different GTX 1080s in two different PCs, each with the same issue I believe. Even a GTX 980 ran it better.
- Driver settings (via NVIDIA Inspector or Control Panel) simply don't work in any Source title I've tried (HL2, HL2 E1, HL2 E2, and Source mods that use HL2 E2's exe).
I'm on Windows 10 Pro 64-bit.


----------



## H3avyM3tal

Any point in waiting for a 3rd party version to be in stock if I can get a founders now? What say you?


----------



## boredgunner

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Any point in waiting for a 3rd party version to be in stock if I can get a founders now? What say you?


Founder's Edition tends to get really hot and throttle in many cases, it's a coil whine lottery and the cooler is bound to be audible under load for most people. I waited for 3rd party and do not regret it.


----------



## flexus

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Any point in waiting for a 3rd party version to be in stock if I can get a founders now? What say you?


I`ve just ordered the MSI GeForce GTX 1080 Sea Hawk EK X as it would be the same price to slap on a WB and backplate myself. If you don`t have a custom loop, then maybe the MSI GeForce GTX 1080 Sea Hawk X with Corsair Hydro H55 would interest you.


----------



## ChevChelios

I tried it and it's definitely power management related

(1) with _Optimal Power_ in NVCP (Windows power is still High Performance), 2 browsers and some twitch/YT tabs open and running/playing: my GPU clock was still 240 Mhz and LatMon readings really bad







.. 3 of them - 800, 1500, 2800 and the current (changing one) was going from 800-900 to 1200-1300

then I changed NVCP power to _Max Performance_ and opened LatMon again - this time 3 values of ~200/300/400 and current latency was ~100+ .. this is still with twitch/YT tabs playing and GPU clock was ~1700 Mhz

(2) _Optimal Power_ and idle: readings ~300, 800, 900-1000 us, current latency changing between 50 & 500

_Max Performance_ and idle: the same as above, which makes sense since they were both @ 240 Mhz GPU

(3) _Optimal Power_, Overwatch match: 1923 MHz, current latency ~100, ISR ~200; the others spiked kinda high

_Max Performance_, Overwatch match: 1923 Mhz the same, since the same max clocks

this was all with CPU power savings still enabled in BIOS. Based on how it responded to the GPU power-saving changes, and on reports from others, I imagine disabling CPU power savings would improve it further, but I don't want to run the CPU at max clocks 24/7, and I still don't see anything negative in actual performance

from what I can tell, higher & constant clocks = less latency .. which kind of makes sense, no?


----------



## nexxusty

I noticed while running GPU-Z D3D animation that my latency was extremely low....

Only ran this for 2 minutes, however it seemed to have an effect on latency.


----------



## ChevChelios

also the last latency reading - the hard pagefault - was always high in all my testing, but that may be because I had only a 2048-2048 MB page file set (on the SSD where the OS is .. not sure if it should be on the system SSD or on the media HDD). Since I have 16GB RAM I want it to use RAM, not the page file. I changed it to 4096-4096 now, dunno if it even matters


----------



## ChevChelios

Quote:


> Originally Posted by *nexxusty*
> 
> I noticed while running GPU-Z D3D animation that my latency was extremely low....


I just tried this too ...

started the GPU-Z render test, and after it was running (nothing else was) with the clock @ 1936 MHz, started LatMon .. the latencies were 60-150; 165; 190; ~250 (the last pagefault one was high again) and didn't go any higher than that .. ran a few minutes; as soon as I stopped the render test, the core clock went to 240 MHz again and the latencies went up

^ this was again with CPU power savings all still enabled


----------



## bfedorov11

Quote:


> We have only received one feedback from the original poster. It would help if we had more feedbacks so I have complete system information to pass along to QA:


http://surveys.nvidia.com/index.jsp?pi=6e7ea6bb4a02641fa8f07694a40f8ac6

https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/10/


----------



## x7007

Quote:


> Originally Posted by *bfedorov11*
> 
> http://surveys.nvidia.com/index.jsp?pi=6e7ea6bb4a02641fa8f07694a40f8ac6
> 
> https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/10/


Everyone, please open a support ticket and submit driver feedback so we get a fast fix.

https://nvidia-submit.custhelp.com

Tell them it's a problem with idling under Nvidia's power-saving management: when playing, or otherwise running at a higher clock speed, the issue doesn't happen, so it is directly linked to the GPU clock speed.

Copy and paste this into the Nvidia ticket or support form if that makes it easier for you:

Code:


The issue is with power management causing high DPC latency when the GPU clock is low (291 MHz core / 405 MHz memory, P8). When playing a game, Blu-ray, or MKV movie, the GPU clock maxes out and the DPC issue doesn't happen; latency stays low like it should. Disabling the GTX 1080's driver from Device Manager instantly fixes the issue, and with no Nvidia driver installed (any driver from 368.25 to 368.69) the issue doesn't happen.

The issue doesn't happen when:

- the GPU clock / memory clock are forced above P5
- the Nvidia device is disabled without restarting
- no Nvidia drivers are installed


----------



## snafua

Ran a DPC check against my 980 and 1080.



http://imgur.com/LoI7Z


One thing I noticed is the drivers aren't so unified anymore. I have to do a driver re-install if I swap the 1080 with anything else and vice versa.
I don't recall having to do that with previous Nvidia cards and drivers.

Survey Sent.


----------



## aberrero

Found the EVGA Superclock available for retail and jumped on it.

Still not sure if it was the right move or if I should wait for it to

I've got a couple days to cancel--I'm tempted to just stick with my 290x until a 1080Ti is announced.


----------



## scaramonga

Card is now clocking down properly with the latest driver set (368.69) on my 1080 @ 144Hz, so I guess that's something.


----------



## Setzer

^2nd this


----------



## jprovido

Finally got my Strix 1080. It's a BIG upgrade from my 970, perfect for my 1440p 144Hz monitor.




2050 MHz at stock voltage. Performs great, runs cool too.


----------



## amvnz

I came from an AMD HD 7970 trifire setup to an Nvidia GTX 1080. The card is as powerful as all 3 combined, without any VRAM limitations or CrossFire compatibility reliance. Really happy. The DPC latency is horrible though; I've gone from a consistent 20us to a ****storm, with values going up to 4500us and an average of 1000us at idle.


----------



## x7007

so we have update, good that's a start



http://imgur.com/h9hChJZ


----------



## amvnz

Quote:


> Originally Posted by *x7007*
> 
> so we have update, good that's a start
> 
> 
> 
> http://imgur.com/h9hChJZ


Good news.


----------



## Eorzean

Quote:


> Originally Posted by *x7007*
> 
> so we have update, good that's a start
> 
> 
> 
> http://imgur.com/h9hChJZ


----------



## bp7178

Quote:


> Originally Posted by *boredgunner*
> 
> Founder's Edition tends to get really hot and throttle in many cases, it's a coil whine lottery and the cooler is bound to be audible under load for most people. I waited for 3rd party and do not regret it.


I completely disagree with this. My two EVGA FE cards were great before I put them under water, even better after. The stock FE coolers were SO much quieter than the 980 Ti FTW cards they replaced, all with less heat and less power consumption and more performance. Coil whine on either card is MUCH less than the 980 Ti cards as well, and is only audible when running Furmark or the like. In game, if there is any coil whine, it's not detectable. Both cards are humming along at 2100 MHz core / 5400 MHz memory without issue. When I added the EK blocks, I gained even more cooling; playing The Division on a 34" monitor for a few hours, the cards never get above the low 40s.

The biggest reason I upgraded from the 980Ti is how much heat and noise they put out, plus how power hungry they were.


----------



## x7007

Does anyone know if bcdedit /set disabledynamictick yes does anything useful on Windows 10? I can't figure out if it's good or not.
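For anyone who wants to try it, the commands look like this from an elevated prompt (a reboot is needed for the change to apply, and no promises it helps DPC on every setup):

```shell
:: Check whether the value is currently set:
bcdedit /enum {current}
:: Disable the dynamic (tickless) timer, forcing the classic periodic tick:
bcdedit /set disabledynamictick yes
:: To revert later:
bcdedit /deletevalue disabledynamictick
```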


----------



## moustang

Quote:


> Originally Posted by *bp7178*
> 
> I completely disagree with this. My two EVGA FE cards were great before I put them under water, even better after. The stock FE coolers were SO much more quiet than the 980Ti FTW cards they replaced, all with less heat and less power consumption with more performance. Coil whine on either card is MUCH less than the 980Ti cards as well, and is only audible when running Furmark or the like. In game, if there is any coil whine, its not detectable. Both cards are humming along at 2100Mhz 5400Mhz memory w/o issue. When I added the EK blocks, I gained more cooling. Playing The Division on a 34" monitor for a few hours and the cards never get above the low 40s.
> 
> The biggest reason I upgraded from the 980Ti is how much heat and noise they put out, plus how power hungry they were.


You may disagree but he is mostly correct.

For example, the MSI Gaming and Gaming X cards are whisper quiet even under full load and run significantly cooler than the FEs did. There is very minimal, if any throttling, and no reason to change fan profiles. Coil whine is non-existent in the MSI cards. I've been using mine for 2 weeks and have yet to hear a hint of coil whine in any benchmark or game.

And yes, my Gaming X is humming along quite nicely at 2113mhz and 5554mhz memory, without an EK block.


----------



## xTesla1856

To all Asus Strix owners: How happy are you with the cards? What are temps and noise like? I'm still torn between all the partner cards, I had an MSI Gaming 980Ti that had horrible coil whine and temps and I'm not a fan of the new EVGA designs. Thanks for helping out


----------



## ChevChelios

Quote:


> Originally Posted by *x7007*
> 
> so we have update, good that's a start
> 
> 
> 
> http://imgur.com/h9hChJZ


so we wait for new drivers?

I'm still keeping CPU & GPU power savings on


----------



## bfedorov11

Quote:


> Originally Posted by *x7007*
> 
> Does anyone know if bcdedit/set disabledynamictick yes does anything useful for windows 10 ? I can't figure if it's good or not.


No change here on 8.1. Only thing that helps is setting power option to high performance.
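If anyone wants to flip that from the command line instead of the Control Panel, this should work (the GUID is the stock "High performance" scheme that ships with Windows; verify yours with /list first):

```shell
:: Show all power schemes and their GUIDs; the active one is marked with *
powercfg /list
:: Activate the built-in "High performance" plan.
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c
```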


----------



## aberrero

Quote:


> Originally Posted by *xTesla1856*
> 
> To all Asus Strix owners: How happy are you with the cards? What are temps and noise like? I'm still torn between all the partner cards, I had an MSI Gaming 980Ti that had horrible coil whine and temps and I'm not a fan of the new EVGA designs. Thanks for helping out


I've been reading reviews all day and ended up going with EVGA. Seems to clock the highest and the fan is literally off when the card is idle. It is the quietest and coolest of the cards I looked at. I don't like the design either, but it has a great backplate and you don't see the underside much once it's installed anyway.


----------



## Unkzilla

Owned a Fury X for under 30 days, and today was the day I encountered one issue too many .. easily the worst card I've owned since the X1950. Anyone claiming AMD is on top of their drivers is full of BS.

Anyway, sold the Fury on eBay; 1080 acquired.







Went down to my local retailer (MSY) and they had this in stock ready to go.



Will be interesting to see its speed out of the box and what I can OC to, exciting times


----------



## xTesla1856

Quote:


> Originally Posted by *Unkzilla*
> 
> Owned a FuryX for under 30 days and today was the day I encountered one issue too many.. easily the worst card i've owned since the x1950 . Anyone claiming AMD is on top of their drivers is full of BS.
> 
> Anyway, sold the Fury on ebay, 1080 acquired
> 
> 
> 
> 
> 
> 
> 
> Went down to my local retailer (MSY) and they had this in stock ready to go.
> 
> 
> 
> Will be interesting to see its speed out of the box and what I can OC to, exciting times


I'm much in the same boat, looking to ditch these Furys for a 1080


----------



## CannedBullets

Gainward? Never really heard of them. I'm still on my 770. I'm just waiting for the EVGA GTX 1080 FTW to be back in stock.


----------



## xTesla1856

Quote:


> Originally Posted by *CannedBullets*
> 
> Gainward? Never really heard of them. I'm still on my 770. I'm just waiting for the EVGA GTX 1080 FTW to be back in stock.


Gainward is big in Europe and Asia; they make good cards and are usually the cheapest of all AIBs.


----------



## ChevChelios

Gainward = better Palit

cheap too


----------



## BrainSplatter

Palit/Gainward actually have one of the most effective 1080/1070 cooler designs atm (2.5/3 slots). It's only that they seem to have coil whine more often than more established brands, and their warranty is not that great (2 years, and you have to contact the seller instead of RMA'ing through them directly).


----------



## escalibur

Has anyone spotted any info/pics/whatever of Gigabyte 1080 Xtreme Water apart from promo graphics?


----------



## Unkzilla

Quote:


> Originally Posted by *CannedBullets*
> 
> Gainward? Never really heard of them. I'm still on my 770. I'm just waiting for the EVGA GTX 1080 FTW to be back in stock.


Apparently the Phoenix is one of the better custom 1080s .. custom PCB, and it has a 6-pin + 8-pin. Could only really find one review, but if you are interested: http://www.hardwareunboxed.com/gainward-gtx-1080-phoenix-glh-review-best-board-partner-1080/

Card looks seriously impressive out of the box and the quality seems very good. Just need to figure out how to change the color of the LED on it now.


----------



## Unkzilla

Quote:


> Originally Posted by *Unkzilla*
> 
> Apparently the Phoenix is one of the better custom 1080s .. custom PCB, and it has a 6-pin + 8-pin. Could only really find one review, but if you are interested: http://www.hardwareunboxed.com/gainward-gtx-1080-phoenix-glh-review-best-board-partner-1080/
> 
> Card looks seriously impressive out of the box and the quality seems very good. Just need to figure out how to change the color of the LED on it now


Last post, in case this is of use to anyone.

Out of the box, untweaked, this card runs at 1911 MHz after 15 min of Valley.

Setting power to 120% and adding +210 MHz core clock, I'm getting a stable 2064 MHz with no voltage added. The card starts at 2114 and drops to 2064 after 15 min, sits at 69 deg, and is silent... possibly the quietest overclocked card I have ever owned.

Recommended!


----------



## xTesla1856

Quote:


> Originally Posted by *Unkzilla*
> 
> Last post in case this is of use for anyone
> 
> Out of the box untweaked this card runs at 1911mhz after 15min of Valley
> 
> Setting power to 120% and adding +210mhz core clock , I'm getting a stable 2064mhz - no voltage added. Card starts at 2114 and drops to 2064 after 15min. 69deg and is silent... possibly the quietest overclocked card I have ever owned
> 
> Recommended !


Thanks for your experience, I'm now leaning even more towards that card


----------



## Unkzilla

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm much in the same boat, looking to ditch these Furys for a 1080


You won't regret it... I can't believe how much quieter this card is, even overclocked, compared to the Fury @ stock.

Also, overclocking just seems to work with very little effort. Not sure how it went with your Fury, but mine was very erratic, and after 60-90 min straight of gaming nearly any OC would make my games CTD.

I've only got one game on my system with a built-in benchmark, which might not be the best indicator of performance difference, but my Fury @ 1150 was getting 72 fps in The Division and now I'm getting 105 fps with the 1080 @ 2063 MHz .. roughly +45%. Crazy.


----------



## KillerBee33

Any news on Pascal NVFlash & BiosTweaker?


----------



## Alwrath

Quote:


> Originally Posted by *moustang*
> 
> You may disagree but he is mostly correct.
> 
> For example, the MSI Gaming and Gaming X cards are whisper quiet even under full load and run significantly cooler than the FEs did. There is very minimal, if any throttling, and no reason to change fan profiles. Coil whine is non-existent in the MSI cards. I've been using mine for 2 weeks and have yet to hear a hint of coil whine in any benchmark or game.
> 
> And yes, my Gaming X is humming along quite nicely at 2113mhz and 5554mhz memory, without an EK block.


Sounds like he had a positive experience with the FE cooler. Maybe they aren't as bad as people claim. Maybe people just like to complain?


----------



## KillerBee33

Quote:


> Originally Posted by *Alwrath*
> 
> Sounds like he had a positive experience with the FE cooler. Maybe they arent as bad as people claim. Maybe people just like to complain?


It's not as loud as the 980 but a bit hotter; liquid is a must.
My fan profile is very aggressive, 100% @ 75 degrees, but it still gets into the 80s.


----------



## boredgunner

Quote:


> Originally Posted by *moustang*
> 
> You may disagree but he is mostly correct.
> 
> For example, the MSI Gaming and Gaming X cards are whisper quiet even under full load and run significantly cooler than the FEs did. There is very minimal, if any throttling, and no reason to change fan profiles. Coil whine is non-existent in the MSI cards. I've been using mine for 2 weeks and have yet to hear a hint of coil whine in any benchmark or game.
> 
> And yes, my Gaming X is humming along quite nicely at 2113mhz and 5554mhz memory, without an EK block.


Also, he used the EVGA GTX 980 Ti FTW as an example, but EVGA's 900-series ACX coolers are known to be quite loud, unlike the 1080's ACX 3.0. Every review of aftermarket GTX 1080s compares them to the FE, so you can just look at their results and see for yourself. The FE is hotter and louder than every aftermarket GTX 1080 in every review I've seen.

MSI really cares about noise. Again, the loudest components in my PC are two Silverstone AP182s running at about 1000 RPM. My MSI GTX 1080 Armor OC cannot be heard over them even at 100%. I'm surprised a card with such a small heatsink never gets loud; top fan speed is about 2300 RPM for reference, according to MSI Afterburner. The GAMING and GAMING X are bound to be even quieter and much cooler.


----------



## Robstar

Does anybody have issues with the fans on the Gigabyte GTX 1080 G1 ?

The fan closest to the ports started to make a rattling noise (two weeks after purchase) from 45% up to 80% RPM (tested with MSI Afterburner Beta 4).
I updated the BIOS and NVIDIA drivers just in case, but it is definitely a hardware problem, most likely the bearings.


----------



## ChevChelios

Quote:


> Originally Posted by *Robstar*
> 
> Does anybody have issues with the fans on the Gigabyte GTX 1080 G1 ?
> 
> The fan closest to the ports started to make a rattling noise (2 weeks after purchase) from 45% up to 80% RPM (testet with MSI Afterburner Beta 4).
> I updated the BIOS and Nvidia Drivers just in case but it is definitely a problem on the hardware side, most likely the bearings.


Wait, do you mean the fan randomly spins up from 45% to 80% for no reason?

Or that at 80% it rattles?

My G1 is OK, but I've had it for less than two weeks and the fans have never gone over 45-50% so far.


----------



## moustang

Quote:


> Originally Posted by *Alwrath*
> 
> Sounds like he had a positive experience with the FE cooler. Maybe they arent as bad as people claim. Maybe people just like to complain?


Sounds like he was comparing his FE to his 980 Ti rather than to the custom AIB 1080s that the person he was responding to was talking about.

The person he was quoting was comparing the FE to third-party cooling on the 1080, and there simply is no comparison there. Take the MSI 1080 Gaming, for example: you can get it for the same price as the FE, but its stock cooling is FAR superior, keeping the card significantly cooler while being much quieter at the same time. That means far less throttling and far less noise, which makes for better performance from a card that is easier on the ears.


----------



## THEROTHERHAMKID

Anyone having problems installing new drivers for the 1080? Mine keeps failing and will only install through Windows Update; it won't install through GeForce Experience or from the download on NVIDIA's website.


----------



## snafua

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Anyone having problems installing new drivers for the 1080? Mine keeps failing and will only install updates through windows update won't install through experience or from the download from nvidia website?


Let me guess, Windows 10? You have to disable its automatic driver install.

http://winsupersite.com/windows-10/stop-automatic-driver-updates-windows-10


----------



## x7007

A fix is coming soon!!

ManuelGuzman said:
OK we found a system we were able to reproduce this on. Thank you all for your assistance. Once I have further information I will update everyone.

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Anyone having problems installing new drivers for the 1080? Mine keeps failing and will only install updates through windows update won't install through experience or from the download from nvidia website?


Try a custom install, and delete all the folders not needed in the NVIDIA folder, like GeForce Experience, Update and such.
Don't clean profiles; I had an issue once where it didn't install when "Clean Profiles" was selected.

And don't install through Windows Update. Just download the drivers from NVIDIA after you've cleaned the old ones out with DDU 1.6.0.4.


----------



## THEROTHERHAMKID

Yes, Windows 10. Thanks for the reply, but that didn't work; it still won't install. It won't even install the normal GeForce Experience, only the beta.


----------



## KillerBee33

Quote:


> Originally Posted by *x7007*
> 
> Fix is soon !!
> 
> ManuelGuzman said:
> OK we found a system we were able to reproduce this on. Thank you all for your assistance. Once I have further information I will update everyone.
> Try custom Install , Delete all the folder not needed in the Nvidia folder like GFE Experience , Update and such.
> Don't clean Profiles, I had an issue one it didn't install when Clean Profiles was selected.
> 
> And don't install through windows update. just download the drivers from nvidia after you cleaned it with DDU 1.6.0.4


1. http://www.wagnardmobile.com/forums/viewtopic.php?f=5&t=276
2. 
Extract the downloaded driver, delete everything except the files shown in the image, then run setup.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *x7007*
> 
> Fix is soon !!
> 
> ManuelGuzman said:
> OK we found a system we were able to reproduce this on. Thank you all for your assistance. Once I have further information I will update everyone.
> Try custom Install , Delete all the folder not needed in the Nvidia folder like GFE Experience , Update and such.
> Don't clean Profiles, I had an issue one it didn't install when Clean Profiles was selected.
> 
> And don't install through windows update. just download the drivers from nvidia after you cleaned it with DDU 1.6.0.4


Thanks for the reply, but that didn't work; it still won't install other than through Windows Update.


----------



## x7007

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Thanks for the reply but didn't work still won't install other than through windows update?


Check the event viewer log and post it.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *x7007*
> 
> Check the event viewer log and post it.


Sorry, but how do I post the Event Viewer log?


----------



## THEROTHERHAMKID

Sorted now, cheers for the replies. No idea why it just decided to install.
Repped


----------



## ChevChelios

Quote:


> Originally Posted by *x7007*
> 
> Fix is soon !!
> 
> ManuelGuzman said:
> OK we found a system we were able to reproduce this on. Thank you all for your assistance. Once I have further information I will update everyone.


Do you have a link to where he said it?


----------



## ChevChelios

https://forums.geforce.com/default/topic/947950/geforce-drivers/gtx-1080-gsync-forcing-vsync-on-/

this is a bug too apparently

kind of a bummer if it forces Vsync ON when your fps goes over 144


----------



## PasK1234Xw

Quote:


> Originally Posted by *ChevChelios*
> 
> https://forums.geforce.com/default/topic/947950/geforce-drivers/gtx-1080-gsync-forcing-vsync-on-/
> 
> this is a bug too apparently
> 
> kind of a bummer if it forces Vsync ON when your fps goes over 144


This isn't a bug; it's been like that for a while now. Now that we have Fast Sync, I'm sure it won't be this way for much longer.


----------



## boredgunner

Quote:


> Originally Posted by *PasK1234Xw*
> 
> This isn't a bug its been like that for while now. Now that we have fast sync im sure it wont be this way for much longer.


Fast Sync is said to be a stutter fest by most people who use it.


----------



## ChevChelios

Quote:


> This isn't a bug its been like that for while now


They changed that long ago so that you can choose whether to have Vsync ON or OFF with G-Sync:

http://www.geforce.com/whats-new/articles/g-sync-gets-even-better

But this bug apparently forces Vsync back ON even if you have set it to OFF.

Dunno if it will also force it ON if you set it to Fast Sync.

If this persists I'll probably just set my monitor to 165Hz so that I can have up to 165 fps before G-Sync stops working.

That, and/or cap frames globally to 140/160.


----------



## x7007

Quote:


> Originally Posted by *ChevChelios*
> 
> do you have a link where he said it ?


https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/post/4923509/#4923509


----------



## aberrero

Quote:


> Originally Posted by *boredgunner*
> 
> Fast Sync is said to be a stutter fest by most people who use it.


Are you aware of any frametime benchmarks for it?


----------



## boredgunner

Quote:


> Originally Posted by *aberrero*
> 
> Are you aware of any frametime benchmarks for it?


Not that I've seen; only comments so far. I haven't tried it myself though.


----------



## t1337dude

Any reviews on the Gigabyte 1080 Xtreme yet? In theory it looks like the best card, but I'm afraid the execution might not be so hot. I was reading reports of the fans clipping on the GPU for a few people.

And is coil whine not as much of an issue this time around as it was with some 980 Ti models?


----------



## chronicfx

G-Sync... Guys... G-Sync







It is worth it. I know there is a helluva premium but watch your stutter fest nightmares vanish!


----------



## dante`afk

I got lucky today and sniped 2x evga 1080 ftw+.

Gonna see if one is enough or not. What's the best way to overclock, EVGA Precision, or flashing some miracle BIOS that has come up? And what is this curve people are talking about?


----------



## KillerBee33

Has anyone tried the NEW Fast Sync option in NVCP?


----------



## TK421

does 10 bit color work with DP?

*with actual 10 bit monitor


----------



## ssgwright

I got my Zotac 1080 FE today and threw an EK waterblock on it. So far I'm running +190 on the core and +300 on the memory (heard this was the best to ensure optimal performance due to throttling?). Here's my Firestrike:



how's it look? what kind of overclocks are you guys getting?


----------



## chronicfx

Quote:


> Originally Posted by *ssgwright*
> 
> I got my Zotac 1080 FE today and threw a EK waterblock on it. So far I'm running +190 on the core and +300 (heard this was the best to ensure optimal performance due to throttle?) Here's my firestrike:
> 
> 
> 
> how's it look? what kind of overclocks are you guys getting?


I have SLI; this score looks good. I get 10500 graphics and 9850 total with two cards. I am at +200 core and +400 mem, +90 voltage and 120% power; this was a first push, an "insta overclock" for me. I have been running it for a week now and have had no driver-related crashes or artifacts. The boost is 2078 and it eventually settles to either 2038 or 2050, game dependent. If you are more than half that, you are good in my book.









As for a CPU comparison, I run my 6700K at 4.9 and my physics score can be just above or below 15000 for any given run.


----------



## ssgwright

Quote:


> Originally Posted by *chronicfx*
> 
> I have SLI, this score looks good. I get 10500 Graphics and 9850 total with two cards. I am +200 core and +400 mem, +90 voltage and +120% power this was a first push and "insta overclock" for me. I have been running it for a week now and have had no driver related crashes or artifacts. To boost is 2078 and eventually settles to either 2038 or 2050 game dependant. If you are more than half that you are good in my book
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for CPU comparison I run my 6700K at 4.9 and my physics can be just above or below 15000 for any given run.


thanks!


----------



## aberrero

Quote:


> Originally Posted by *chronicfx*
> 
> G-Sync... Guys... G-Sync
> 
> 
> 
> 
> 
> 
> 
> It is worth it. I know there is a helluva premium but watch your stutter fest nightmares vanish!


OLED. I'm not dropping a grand on a monitor that isn't OLED and at least 3k.


----------



## chronicfx

Quote:


> Originally Posted by *aberrero*
> 
> OLED. I'm not dropping a grand on a monitor that isn't OLED and at least 3k.


I love OLED and have had stock in it since infancy, but the burn-in is gonna make it a poor choice for computers unless they figure out how to mitigate it. Worse than plasma, from what I hear.


----------



## aberrero

Quote:


> Originally Posted by *chronicfx*
> 
> I love OLED and have had stock in it since infancy but the burn in is gonna make it a poor choice for computers unless they figure out how to mitigate it. Worse than plasma from what I hear.


It has already gotten better. Dell's 30" OLED display is $5,000 though, and it doesn't support G-Sync.

Let's be honest: G-Sync is an awful technology that needs to die. FreeSync is much better (or at least more consumer/manufacturer friendly), and if NVIDIA stopped pushing G-Sync so hard, every new mid-to-high-end monitor would have Adaptive-Sync, and this wouldn't be an issue.


----------



## immortalkings

Quote:


> Let's be honest: G-Sync is an awful technology that needs to die. Freesync is much better (or at least is more consumer/manufacturer friendly), and if nVidia stopped pushing G-Sync so hard every new mid-high end monitor would have aSync, and this wouldn't be an issue.


Sorry, but I'll have to disagree with this. G-Sync is better than FreeSync, or at least about the same, but the premium price is too much. Still, it helps a lot of gamers play more smoothly with less stutter and tearing.


----------



## boredgunner

Quote:


> Originally Posted by *aberrero*
> 
> Let's be honest: G-Sync is an awful technology that needs to die. Freesync is much better (or at least is more consumer/manufacturer friendly), and if nVidia stopped pushing G-Sync so hard every new mid-high end monitor would have aSync, and this wouldn't be an issue.


This is so not true. First of all, G-SYNC is the best variable refresh rate solution, showing the benefits of a hardware implementation. Monitors with a G-SYNC module have less input lag and often a better overdrive implementation, particularly the IPS monitors (see the Acer XF270HU and Eizo FS2735 and their overdrive implementations). NVIDIA doesn't force people to use G-SYNC, and they can't prevent manufacturers from using FreeSync instead (which is why there are so many more FreeSync monitors, which negates your last point).


----------



## chronicfx

Have you ever used it? It's been nothing but "play it and forget it" since I got a G-Sync monitor. Until the GPU companies have solved all of their stuttering problems, I feel it is necessary for SLI. I have played at least 8 games from start to end since I bought this monitor and not a single hitch or stutter has broken the immersion: 980 Tis and now 1080s, super smooth with an XB270HU. Before that, 290X tri-fire on triple-monitor 1440p, 7970 tri-fire on a single 1440p, GTX 680 SLI at 1080p, and GTX 280 tri-SLI at 1080p all had stutter in many games. Yes, I keep pretty up-to-date hardware, CPU and GPU, so when I spend two grand a year I don't want stuttering. G-Sync has been the only thing to solve those problems. So before you knock it because you can't afford it, just realize it pretty much is a necessity for SLI gamers. With a single GPU there will be minimal stutter, but if you want that scaling you need the second GPU and a G-Sync monitor. If you don't buy the monitor, don't buy the second GPU; that is my 2 cents.







Please don't trash it without showing some data.


----------



## aberrero

Quote:


> Originally Posted by *boredgunner*
> 
> This is so not true. First of all, G-SYNC is the best variable refresh rate solution, showing the benefits of being a hardware implementation. Monitors with a G-SYNC module have less input lag and often times a better overdrive implementation, particularly on the IPS monitors (see the Acer XF270HU and Eizo FS2735 and their overdrive implementation). NVIDIA doesn't force people to use G-SYNC, and they can't prevent manufacturers from using FreeSync instead (which is why there are so many more FreeSync monitors which negates your last point).


My problem is that I don't want to be stuck with a monitor that only works with one brand of card. I don't believe any monitors support both. It's also frustrating that G-Sync monitors tend to be "gaming" oriented, with overdrive and other features that make them bad for productivity or photography. I get that G-Sync is better, but it doesn't do me any good if I can't actually feel comfortable buying into it.


----------



## 337drew

I purchased the ASUS GTX 1080 Strix this past weekend and I wanted to share some feedback. I picked up the non-OC version at a local shop. I chose the Strix because I felt it looked the best and I'd read that it was very good on low sound levels. Boy, was I wrong... After installing it, I loaded up the Heaven benchmark and immediately heard the dreaded coil whine. It's terribly bad. There's no way I'll live with it. I plan on returning it this weekend. I didn't spend $600+ to hear the equivalent of nails on a chalkboard while I try to enjoy myself. You can absolutely hear it over both fan noise and with headphones on.

Here's what I've tried thus far to mitigate it. None of the below worked.

1) Ran GPU stress tests for over 72 hours at every profile and power target you can think of. My hope was that it would somehow subside after breaking in. I cannot detect any change after plenty of usage.
2) Flashed the BIOS from the non-OC to the OC version found on TechPowerUp and earlier in this thread. No change whatsoever.
3) Swapped the Seasonic Platinum Fanless 520 PSU for a Corsair HX750. No change.

For me, the ONLY thing that both changed the tone and minimized the coil whine was to lower the power target below 60%. At 60% power you can't go much higher than 1600MHz, and I obviously don't plan on keeping the card just to run it at those speeds. Sadly it's going back... I'm looking very closely at the EVGA 1080 FTW, because I know they have good customer service.

On the bright side, the OC BIOS worked like a champ. My card evens out at about 2015-2025 on the OC profile under benchmarks.


----------



## chronicfx

If you can hear it with headphones return it. Your only other option is trying a different PSU.


----------



## 337drew

Thx for reminding me.. I did try my other PSU. I have a Seasonic Platinum Fanless 520, and a Corsair HX750. No noticeable change in coil whine at all.


----------



## Spiriva

How many have already tried the strix1080xoc.rom on your card(s)? I'm gonna give it a go this weekend when I've got some time to play around with the computer, and see if this BIOS with 1.24V will push the cards higher than 2200MHz.









http://forum.hwbot.org/showthread.php?p=452427#post452427


----------



## ChevChelios

Quote:


> Originally Posted by *TK421*
> 
> does 10 bit color work with DP?
> 
> *with actual 10 bit monitor


curious too


----------



## bfedorov11

Quote:


> Originally Posted by *Spiriva*
> 
> How many have already tried the strix1080xoc.rom on your card(s) ? Im gonna give it ago this weekend when I got some time over to play around with the computer. See if this bios with 1,24V will push the cards higher then 2200mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forum.hwbot.org/showthread.php?p=452427#post452427


Will it work on FE cards?


----------



## Ragnarook

Quote:


> Originally Posted by *Spiriva*
> 
> How many have already tried the strix1080xoc.rom on your card(s) ? Im gonna give it ago this weekend when I got some time over to play around with the computer. See if this bios with 1,24V will push the cards higher then 2200mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forum.hwbot.org/showthread.php?p=452427#post452427


I flashed my first card just now with that bios



2316mhz with 1.193v so far


----------



## TK421

Quote:


> Originally Posted by *ChevChelios*
> 
> does 10 bit color work with DP?
> *with actual 10 bit monitor
> curious too


I found some info that NVIDIA blocks 10-bit functionality on GTX graphics cards, forcing people to buy Quadros.

However, this feature was supposedly re-enabled some months ago in a driver revision. Can anyone confirm, especially for the 1080?


----------



## Ragnarook

Quote:


> Originally Posted by *bfedorov11*
> 
> Will it work on FE cards?


Yes, I just flashed my EVGA FE 1080 with this (I flashed one of my two cards to try first). So far 2316MHz on 1.193V.
It won't let me post pictures here because I'm new, but I posted it on a Swedish page.

http://www.sweclockers.com/forum/post/16255447


----------



## xTesla1856

Last night, half drunkenly, I was thinking the following and I need your advice: I know this is the 1080 Owner's Club, but if I were to find two brand-new 980 Tis (Classified or Matrix) for roughly the price of one 1080, should I spring for the Tis instead? I know all the implications SLI brings, and the compatibility issues. Or would my best move be to get a 1080 now and maybe add a second sometime down the line?


----------



## fripon

Quote:


> Yes i just flashed my evga fe 1080 with this (flashed one of two cards to try first). So far 2316mhz on 1.193v.
> It wount let me post pictures here because im new, but i posted it on a Swedish page.


How did you flash it? I have used the NVFlash from the source plus Dirty Joe's one... I always get an error that the BIOS file can't be read.


----------



## bfedorov11

Quote:


> Originally Posted by *Ragnarook*
> 
> Yes i just flashed my evga fe 1080 with this (flashed one of two cards to try first). So far 2316mhz on 1.193v.
> It wount let me post pictures here because im new, but i posted it on a Swedish page.
> 
> http://www.sweclockers.com/forum/post/16255447


It worked, but I am unable to increase voltage on the card. It still doesn't go past 1.050.


----------



## Ragnarook

Quote:


> Originally Posted by *fripon*
> 
> How did you flash it? I have use the NVflash from the source + dirty Joes one...always get the error the bios file can´t be read....


I used the file from hwbot, and then did:

nvflash --index=0 --save 1080org.rom
nvflash --index=0 --protectoff
nvflash --index=0 -6 strix1080xoc.rom

--index=0 is because I have two cards; for the next card it would be --index=1, then the same thing again.

Also, to raise the voltage you need to use MSI Afterburner's curve editor (Ctrl+F to bring it up).
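For anyone nervous about typing those by hand, the sequence above can be wrapped in a loop for a two-card system. This is a dry-run sketch: NVFLASH is deliberately set to `echo` so the script only prints what it would do; point it at the real nvflash binary, entirely at your own risk, to actually flash. Back up the stock BIOS first.

```shell
# Dry-run sketch of the flashing sequence above for two cards.
# NVFLASH="echo nvflash" means nothing is written; swap in the real
# nvflash path (at your own risk) to perform the flash.
NVFLASH="echo nvflash"
for i in 0 1; do
  $NVFLASH --index=$i --save "1080org_card$i.rom"  # back up the stock BIOS first
  $NVFLASH --index=$i --protectoff                 # disable EEPROM write protection
  $NVFLASH --index=$i -6 strix1080xoc.rom          # -6 overrides the subsystem ID mismatch check
done
```

The -6 override is what lets a cross-vendor BIOS like the Strix ROM go onto an EVGA FE card, which is also why keeping the saved 1080org ROMs around matters.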


----------



## CerN

Awesome stuff, guys. Curious to see if the extra power pins on the AIB cards give any extra headroom with this BIOS.


----------



## bfedorov11

Oh wow... I never knew you could hold Ctrl to adjust the curve. I was just using Shift.









Running 2250 with 1.2v steady







Didn't run it for long though. Will play some more in the morning.


----------



## ChevChelios

Quote:


> I never knew you could hold ctrl to adjust the curve


is that in afterburner ?


----------



## grimboso

What kind of FPS do you get with the 2300MHz overclocks?

I am still debating whether I should go SLI or not. I just ordered an Acer Predator X34A, and according to the reviews I've seen you get close to 100 fps in most games on a single 1080. Drop a few settings and you're fine at 100.

Besides, not all the games I play support SLI either.


----------



## Ragnarook

I have now flashed both my cards (EVGA 1080 FE).

The first card did 2200MHz on ~1.030V and now does 2316MHz on 1.200V.
The second card isn't as good; it used to do 2150-2164MHz on ~1.080V, and now it does 2200MHz on 1.200V.

The difference in temps is not very big: on 1.030V it maxes out at around 37C, and on 1.200V it maxes out around 41C.
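Those numbers also show why the extra voltage is expensive relative to the clock gain: dynamic power scales roughly with voltage squared times frequency, so 2200MHz @ 1.030V to 2316MHz @ 1.200V costs far more power than the ~5% clock bump suggests. A rough estimate — the CV²f scaling is a textbook approximation applied to the figures above, not a measurement of these cards:

```shell
# Rough CV^2*f estimate of the extra dynamic power for
# 2200 MHz @ 1.030 V  ->  2316 MHz @ 1.200 V
awk 'BEGIN {
  scale = (1.200 / 1.030) ^ 2 * (2316 / 2200)  # voltage-squared ratio times clock ratio
  printf "~%.0f%% more dynamic power\n", (scale - 1) * 100
}'
# prints "~43% more dynamic power"
```

Which is consistent with water keeping the temperature rise small (37C to 41C) where an air cooler would struggle.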


----------



## ChevChelios

Quote:


> Originally Posted by *Ragnarook*
> 
> I now flashed both my cards (evga 1080 FE)
> 
> The first card i had did 2200mhz on ~1.030v now does 2316mhz on 1.200v
> The second card isnt as good, it use to do 2150-2164mhz on ~1.080v, now it does 2200mhz on 1.200v.
> 
> The different in temps are not very big. on 1.030v it maxes out at around 37c and on 1.200v it maxes out around 41c.


Wow, 2200-2300MHz? Grats.

Is it worth flashing that on an AIB air cooler, or with temps of 70C+ will you not see stable clocks of 2200+ anyway?


----------



## Clockster

Quote:


> Originally Posted by *Ragnarook*
> 
> Yes i just flashed my evga fe 1080 with this (flashed one of two cards to try first). So far 2316mhz on 1.193v.
> It wount let me post pictures here because im new, but i posted it on a Swedish page.
> 
> http://www.sweclockers.com/forum/post/16255447


Can you run 3DMark Fire Strike Extreme or Ultra, or if you don't own them, run the normal 3DMark?


----------



## BrainSplatter

Quote:


> Originally Posted by *xTesla1856*
> 
> if I were to find 2 brand new 980Ti (Classified or Matrix) for the price of roughly one 1080, should I spring for the Ti's instead? I know all the implications SLI brings and compatibility issues. Or would my best move be to get a 1080 now and maybe add a 2nd sometime down the line?


I am currently testing this because I am still waiting for my 1080. I am evaluating two new Zotac 980 Ti Omegas, which have a large 2.5-slot cooler (the hardware seems identical to the Extreme version) and cost essentially the same as one "well known brand" 1080 (750-800 Euros). One of them actually has an ASIC in the top 92% and can do Firestrike Ultra at about 1550MHz on air (5134 graphics score):
http://www.3dmark.com/fs/9188715

My mobo + case is also large enough to have them in SLI. Performance-wise it's great and just in the right area (4K @ 30-60fps @ max settings). Now, the one big downside is the heat + noise. While the coolers are very capable and also very quiet in single-GPU mode, using them in SLI puts out a tremendous amount of heat, which needs some serious fan power to be removed from the case. I always use a framerate limiter for smoother gameplay and to avoid unnecessary GPU utilization, but the heat that comes up from my small desk is quite astonishing, lol. In winter this could easily heat our living room.









So, performance is great, but heat + noise are to be considered. I am still considering keeping them until the 1080 Ti / next Titan with hopefully 50% more performance arrives, because that 50% extra would bring me pretty much into the desired performance range. A single 1080 doesn't do that.

But if you have the money, get a 1080 instead, or maybe 2x 1070 until the Ti/Titan comes out.


----------



## ChevChelios

What is the sweet spot for memory OC on the 1080?

Do you have to go for 11000, or is 10400-10600 or so the best?

I heard a lower memory OC gives a marginally better core OC, and that @ 11000 memory the fps may drop a bit?

This is for 2560x1440 res, just in case.


----------



## KillerBee33

This is as far as it goes on AIR without a BIOS flash: http://www.3dmark.com/3dm/13036005


----------



## fripon

Quote:


> I used the file from hwbot, and then it did:
> 
> nvflash --index=0 --save 1080org.rom
> nvflash --index=0 --protectoff
> nvflash --index=0 -6 strix1080xoc.rom
> 
> --index=0 is because i have two cards, next card would be --index=1 and do the same thing again.
> 
> Also remember to up the volt you need to use msi after burner and use the graph overclocker (ctrl f to bring it up)


Thx for help









But I am stuck @ 1.14V even with the curve editor in MSI Afterburner.


----------



## fat4l

Never buy ASUS graphics cards.
Their customer support is very bad: no BIOS support, no card support. Very, very poor, and often there's coil whine.
Go with EVGA, or even MSI or Gigabyte.


----------



## Setzer

Quote:


> Originally Posted by *fat4l*
> 
> Never buy asus graphics cards.
> *Their customer support is very bad. No bios support. No card support.* Very very poor and often theres coil whine.


This!
I had an ASUS motherboard whose BIOS got gimped after updating it with THEIR OWN update tool. ASUS told me I had to buy a new chip, refusing to send me one free of charge despite it being their own tool that gimped my BIOS chip.
A friend of mine has an ASUS Strix 970, and it keeps locking up. Drivers don't solve anything, with NVIDIA telling him to contact ASUS, and ASUS telling him to buy a new card or contact the retailer.

If it works, ASUS is pretty good, but if you need support, you're on your own.
I hope my monitor never fails.


----------



## uberwootage

Quote:


> Originally Posted by *Ragnarook*
> 
> I flashed my first card just now with that bios
> 
> What does your curve look like for your clocks?
> 
> 
> 
> 2316mhz with 1.193v so far


Quote:


> Originally Posted by *KillerBee33*
> 
> YeAHHH , i think every single one of us waiting for Pascal NVFlash and Tweaker
> 
> 
> 
> 
> 
> 
> 
> 
> I'm just looking into some options but main idea still wait for EVGA Hybrid AIO for 10 Series
> 
> 
> 
> 
> 
> 
> 
> 
> By the way H55 is the Cheapest of the 3 available from Corsair on that SEA HAWK , don't know why just felt to mention .I think it might benefit from H90 and 140MM radiator for 1080.


Honestly, at 2165 the temps I've seen are amazing.


----------



## achilles73

Sorry for the noob question, but is it (already) possible to flash that "Strix modified BIOS" on an MSI 1080 Gaming X?
Or do we need a hard mod on the 1080 too?
Thanks.


----------



## dante`afk

I don't have my cards yet, but some questions:

Is there any miracle BIOS everyone is using? What's the standard way of OCing here, MSI AB / EVGA Precision, or BIOS shenanigans?


----------



## Ragnarook

Quote:


> Originally Posted by *achilles73*
> 
> Sorry for the noob question, but it's (already) possible to flash that "strix modified bios" in a MSI 1080 Gaming X ?
> Or we need a hard mod on the 1080 too ?
> Thanks.


It should be fine for all 1080 cards.


----------



## vmanuelgm

Quote:


> Originally Posted by *Ragnarook*
> 
> It should be fine for all 1080 cards.


Hey Ragnarook, could you post a FS Ultra run at 2300 with your 1080?

Thanks in advance...


----------



## Bdonedge

Ordered a GTX 1080 Gigabyte G1 just now.

I own the G1 1070; gonna sell it once the 1080 comes in so there's no gaming downtime. Why did I do this? I can't tell you... wanted the best, lol.

What's up, exclusive club members.


----------



## seabiscuit68

Can someone with Rise of the Tomb Raider download the new patch and do a before/after of the async compute implementation? I'm curious to see if there is any real difference for Pascal.


----------



## ChevChelios

Quote:


> Originally Posted by *seabiscuit68*
> 
> Can someone with Rise of the Tomb Raider download the new patch and do a before / after the implementation of Asych? I'm curious to see if there is any real difference for Pascal.


Not sure about Pascal, but on the R9 Nano there is some improvement, though it's mostly from the patch itself (better CPU usage?).

Only a few % are from async itself.

I suspect the patch will improve DX12 on Pascal as well, but async will give ~0-1%.


----------



## axiumone

Quote:


> Originally Posted by *seabiscuit68*
> 
> Can someone with Rise of the Tomb Raider download the new patch and do a before / after the implementation of Asych? I'm curious to see if there is any real difference for Pascal.


At least in SLI and Surround, the latest patch is way behind in DX12: still about a 20% performance deficit relative to DX11. It's a start, I guess.


----------



## TK421

Quote:


> Originally Posted by *TK421*
> 
> found some info that nvidia block 10 bit functionality on gtx graphics card, forcing people to buy quadro?
> 
> however, this feature has been reenabled some months ago on a driver revision, can anyone confirm? especially for the 1080?


anyone can confirm?


----------



## seabiscuit68

Quote:


> Originally Posted by *ChevChelios*
> 
> not sure about Pascal, but on the R9 Nano there is some improvement, though it's mostly from the patch itself (better CPU usage?)
> 
> only a few % are from async itself
> 
> I suspect the patch will improve DX12 on Pascal as well, but async will give ~0-1%


I guess I was asking purely out of curiosity, to see whether NVIDIA's claims that Pascal is better at implementing async compute are true.


----------



## ChevChelios

Quote:


> Originally Posted by *seabiscuit68*
> 
> I guess I was asking purely out of curiosity, to see whether NVIDIA's claims that Pascal is better at implementing async compute are true.


well it's better than Maxwell









the new RotR patch doesn't even support async on Maxwell







on Pascal it does


----------



## achilles73

Quote:


> Originally Posted by *Ragnarook*
> 
> It should be fine for all 1080 cards.


OK, thank you!
I'll take a look over the next few days to see what my MSI 1080 Gaming X can do.


----------



## BrainSplatter

Quote:


> Originally Posted by *ChevChelios*
> 
> the new RotR patch doesnt even support async on Maxwell
> 
> 
> 
> 
> 
> 
> 
> on Pascal it does


That's an obvious optimization. The only thing enabling async on Maxwell does is activate the async software emulation in the driver. It just took people a while to realize, because NVIDIA tried to obfuscate Maxwell's lack of hardware async support for as long as possible.


----------



## ChevChelios

I'm mostly glad for the CPU optimizations in DX12

that's the thing that's needed the most (in general) IMO

to make use of all those cores and threads


----------



## pez

Did some OC'ing last night and only got to benchmark a couple of games (Hitman Absolution and GTA V). Running +110MHz on the GPU and +536MHz on the memory. The power target is maxed at 108% via the Xtreme Gaming software and I haven't touched the voltage... don't really plan to yet, either. Boost through both benchmarks goes up to (and maintains) 2063MHz. Seems 4K likes the memory clocks a tad better, but still nothing big enough to make a drastic difference. That being said, I just back-ordered my second G1 on Newegg. Now I need to find a nice bridge...


----------



## bfedorov11

Quote:


> Originally Posted by *ChevChelios*
> 
> is that in afterburner ?


yeah

http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,2.html
Quote:


> Put an anchor on the edge of screen and use linear curve scaling via dragging any point with mouse and holding button ("linear" mode).
> Apply the same fixed offset to all curve points via dragging any point with mouse and holding button (let's call that "basic" mode).


I never understood the curve.. it's almost like the axis labels are reversed.
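If it helps to picture what those two modes from the guide actually do, here's a minimal Python sketch of a V/F curve. The point values are made up for illustration; Afterburner's real curve has far more points:

```python
# A GPU voltage/frequency curve as Afterburner draws it:
# x-axis = voltage (mV), y-axis = boost clock (MHz). Values are illustrative.
curve = {800: 1700, 900: 1850, 1000: 1975, 1093: 2050}

def basic_offset(curve, offset_mhz):
    """'Basic' mode: the same fixed MHz offset applied to every point."""
    return {v: f + offset_mhz for v, f in curve.items()}

def linear_scale(curve, max_offset_mhz):
    """'Linear' mode: anchor the lowest-voltage point and scale the
    offset up linearly toward the highest-voltage point."""
    vmin, vmax = min(curve), max(curve)
    return {v: f + max_offset_mhz * (v - vmin) / (vmax - vmin)
            for v, f in curve.items()}

print(basic_offset(curve, 100))   # every point shifted by +100 MHz
print(linear_scale(curve, 100))   # +0 MHz at 800 mV, up to +100 MHz at 1093 mV
```

The axis confusion is understandable: you drag points vertically (frequency) at fixed voltages, so voltage is the input and frequency the output, not the other way around.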


----------



## ChevChelios

does curve OC give higher clocks than just setting PowerTarget (and possibly core voltage to +100) to max via AfterBurner ?


----------



## bfedorov11

Quote:


> Originally Posted by *ChevChelios*
> 
> does curve OC give higher clocks than just setting PowerTarget (and possibly core voltage to +100) to max via AfterBurner ?


yeah, because the voltage slider doesn't do anything.

I'm actually getting lower scores.. my best was ~2170MHz, 5934 overall in FS Ultra... running at 2200MHz and 5500 mem I scored below 5800. Maybe it's because it's a Strix BIOS on an FE card. The card will bench 2260MHz but I can't get valid FS results.

I also updated drivers the other day, but I doubt that would drop me 100 points.


----------



## Alwrath

Just got my GeForce 1080 AMP Extreme in the mail yesterday.







Huge upgrade from the Radeon 290. Didn't have much time with it, but I've tested Crysis 3 and Doom so far. Highest stable OC is 2114 with no voltage increase; with a custom fan profile set, the card doesn't go over 50C during gaming and the power target stays mostly around 80%. Yeah, I'm in heaven right now.


----------



## Ragnarook

Quote:


> Originally Posted by *vmanuelgm*
> 
> Hey Ragnarook, could you post a FS Ultra with 2300 in your 1080???
> 
> Thanks in advance...


I already flashed the EVGA 1080 original BIOS back to both cards. I did that because my "second" card didn't respond very well to more voltage; it went from 2164 to 2200 but wouldn't go any higher at all.
Even at 1.200v it still refused to go higher. My "first" card, however, went up to 2316MHz. Wish I had two that could do that.


----------



## ssgwright

Quote:


> Originally Posted by *Ragnarook*
> 
> Yes, I just flashed my EVGA FE 1080 with this (flashed one of my two cards to try first). So far 2316MHz at 1.193v.
> It won't let me post pictures here because I'm new, but I posted it on a Swedish page.
> 
> http://www.sweclockers.com/forum/post/16255447


I have a Zotac FE and it won't let me flash; ID mismatch?


----------



## boredgunner

Quote:


> Originally Posted by *Ragnarook*
> 
> I already flashed the EVGA 1080 original BIOS back to both cards. I did that because my "second" card didn't respond very well to more voltage; it went from 2164 to 2200 but wouldn't go any higher at all.
> Even at 1.200v it still refused to go higher. My "first" card, however, went up to 2316MHz. Wish I had two that could do that


Woah hold on, which cards do you have?


----------



## axiumone

Quote:


> Originally Posted by *ssgwright*
> 
> I have a Zotac FE and it won't let me flash, id mismatch?


Use nvflash 5.292. That let me get past the ID mismatch and flash.

http://www.majorgeeks.com/files/details/nvflash.html


----------



## uberwootage

How are you guys getting to 1.2v? I flashed the BIOS and I'm only hitting ~1.1v.


----------



## TK421

can anyone confirm that the GTX 1080 supports 10-bit color output over DP to a true 10-bit monitor?


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> How you guys getting to 1.2v. I flashed the bios and I'm only hitting 1.1ish


same here?


----------



## DOOOLY

Should I upgrade to a 1080 from 780s in SLI?


----------



## nexxusty

Quote:


> Originally Posted by *chronicfx*
> 
> Have you ever used it? Never have I just clicked "play and forget it" since I got a G-Sync monitor... Until the GPU companies have solved all of their stuttering problems I feel it is necessary for SLI... I have played at least 8 games from start to end since I bought this monitor and not a single hitch or stutter has broken me from the immersion ... 980 Ti's and now 1080's super smooth with an XB270HU.. Before that 290x tri fire with triple monitor 1440p, 7970 trifire single 1440p, GTX 680 SLI 1080p, GTX 280 Tri SLI 1080p all had stutter in many games. Yes I keep pretty up-to-date hardware CPU and GPU so when I spend 2 grand a year I don't want stuttering... G-Sync has been the only thing to solve those problems.. So before you knock it because you can't afford it.. Just realize it pretty much is a necessity for SLI gamers. Single GPU there will be minimal stutter but if you want that scaling you need the second GPU and a G-Sync monitor... If you don't buy the monitor don't buy the second GPu that is my 2 cents.
> 
> 
> 
> 
> 
> 
> 
> Please don't trash it without showing some data.


Having more G-SYNC experience (and monitors) than the average person, I can say G-SYNC does NOT fix a bad game engine.

UE3 games, for example. G-SYNC does nothing to help the stutters. Try playing BL2 with PhysX and tell me you don't get stutters.

I also find it hard to believe that you played 8 games start to finish without a stutter. Lol. This is 2016; there's no way that out of 8 games not one of them has a bad engine.


----------



## bfedorov11

Quote:


> Originally Posted by *uberwootage*
> 
> How you guys getting to 1.2v. I flashed the bios and I'm only hitting 1.1ish


Afterburner: Ctrl+F, hold Shift to bring the curve points down, then hold Ctrl to bring the speed up. This was the only way I could get more voltage to the card.


----------



## boredgunner

Quote:


> Originally Posted by *aberrero*
> 
> My problem is that I don't want to be stuck with a monitor that only works with one brand of card. I don't believe any monitors support both. It's also frustrating that GSync monitors tend to be "gaming" oriented with overdrive and other features that make them bad for productivity or photography. I get that G Sync is better but it doesn't do me any good if I can't actually feel comfortable about buying into it.


For sure. In an ideal world NVIDIA, AMD, Intel and others would all support an open VESA Adaptive-Sync standard like FreeSync.

Also, overdrive doesn't make a monitor bad for productivity or photography. It's just that monitors excellent for the latter aren't good for gaming, and vice versa.
Quote:


> Originally Posted by *nexxusty*
> 
> Having more GSYNC experience (and monitors themselves) than the average person, I can say GSYNC does NOT fix a crap game engine.
> 
> UE3 games for example. GSYNC does nothing to help stutters. Try playing BL2 with PhysX and tell me you don't get stutters.


Yup, G-SYNC isn't magical. It's excellent but not magic. UE3 in particular is a great example, several games on that engine are stutterfests with or without G-SYNC.


----------



## boredgunner

Quote:


> Originally Posted by *DOOOLY*
> 
> Should I upgrade to the 1080 from 780 in sli ?


----------



## vmanuelgm

Quote:


> Originally Posted by *Ragnarook*
> 
> I already flashed the EVGA 1080 original BIOS back to both cards. I did that because my "second" card didn't respond very well to more voltage; it went from 2164 to 2200 but wouldn't go any higher at all.
> Even at 1.200v it still refused to go higher. My "first" card, however, went up to 2316MHz. Wish I had two that could do that


But was that diamond card stable at 2300???

I would like to see some benches at that clock on air/water...


----------



## nexxusty

Quote:


> Originally Posted by *DOOOLY*
> 
> Should I upgrade to the 1080 from 780 in sli ?


Oh, without a doubt. 780 SLI is weak by comparison.

You're going to get about a 65% increase in performance.


----------



## bfedorov11

Looks like I have some degradation after using that BIOS. I reflashed the original BIOS 3 times, reinstalled drivers with DDU 3 times, rolled back to the .39 driver.. I get crashes at 2150 when my card could do almost 2180. It seems it also affects memory, so maybe it's power related. The only other change I can't undo is the 3DMark update. Max GPU temp was 43.

I think there is a reason why nobody is posting higher bench numbers with that BIOS.


----------



## KillerBee33

Patiently waiting for Pascal BIOS TOOL.


----------



## TK421

Quote:


> Originally Posted by *bfedorov11*
> 
> Looks like I have some degradation after using that bios. I reflashed the original bios 3 times, reinstalled drivers with ddu 3 times, rolled back to .39 driver.. I get crashes at 2150 when my card could do almost 2180. It seems it also affects memory too so may be is power related. Only other change I can't undo is 3dmark update. Max gpu temp was 43.
> 
> I think there is a reason why nobody is posting higher bench numbers with the bios.


did the Asus XOC 1.24v BIOS bork the FE card?

even with a curve OC in Afterburner I cannot get 1.24v

does anyone know if the GTX 1080 supports 10-bit output over DP to a true 10-bit monitor?


----------



## nexxusty

You boys should stop flashing BIOSes... it's not a bright idea.

No idea what the difference is until you flash, boot and check voltages and clocks?

I'm sorry to be so blunt.


----------



## Accuracy158

Quote:


> Originally Posted by *axiumone*
> 
> At least in sli and surround the latest patch is way behind in dx12. Still about a 20% performance deficiency over dx11. It's a start I guess.


Really? I quickly tried the in-game benchmark with my SLI 970s today and was getting 54 FPS without SLI and 85 FPS after enabling it. (I think I have AA and motion blur turned down, everything else maxed at 2560x1440.) ...my 1080 is still in the mail.


----------



## nexxusty

Quote:


> Originally Posted by *Accuracy158*
> 
> Really I tried out the in game benchmark fast with my SLI 970s today and was getting 54 FPS with out SLI and 85 fps after enabling it. (I think i have AA and motion blur turn down everything else max 2560x1440) ...1080 is still in the mail.


Did you enable DX11 and re-test SLI? You should be at 102 FPS according to what he's saying.....


----------



## TK421

Quote:


> Originally Posted by *nexxusty*
> 
> You boys should stop flashing BIOS's... it's not a bright idea.
> 
> No idea what the difference is until you flash, boot and check voltages and clocks?
> 
> I'm sorry to be so blunt.


just impatient I guess

I still have a week left before I can do an exchange with Microcenter


----------



## nexxusty

Quote:


> Originally Posted by *TK421*
> 
> just impatient I guess
> 
> I still have a week left before i can do an exchange with microcenter


I hear you man. Trust me when I say it's taking everything in me not to try what you guys are doing. Lol.

I WIIIISH we had a Microcenter in Canada. Even just one. Was saying this the other day to someone playing The Division. He was talking about going down to Microcenter and I called him a bastard for it, hehe.


----------



## Ragnarook

Quote:


> Originally Posted by *boredgunner*
> 
> Woah hold on, which cards do you have?


I got two EVGA 1080 FEs
Quote:


> Originally Posted by *KillerBee33*
> 
> Patiently waiting for Pascal BIOS TOOL.


The Strix XOC BIOS will let you run at 1.200v; what else would you want a Pascal BIOS tool for, other than raising the voltage?
Quote:


> Originally Posted by *ssgwright*
> 
> I have a Zotac FE and it won't let me flash, id mismatch?


nvflash --protectoff (add --index=0 if you have an SLI setup, then do the same again with --index=1)
nvflash -6 strix1080xoc.rom (same again: --index=0 for the first card, --index=1 for the second)
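For anyone following along later, the full sequence people in this thread are using looks roughly like the below. This is a sketch only: flags differ between nvflash versions, the ROM filename is whatever you downloaded or saved, and flashing a BIOS from another vendor's card can brick yours, so back up first.

```
nvflash --save backup.rom      (back up your current BIOS before anything else)
nvflash --protectoff           (disable the EEPROM write protection)
nvflash -6 strix1080xoc.rom    (-6 lets you acknowledge the ID mismatch prompt)
```

For SLI, run each command once per card, first with --index=0 and then with --index=1.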


----------



## boredgunner

Quote:


> Originally Posted by *Ragnarook*
> 
> I got two Evga 1080 FE


Water cooled I assume?


----------



## Ragnarook

Quote:


> Originally Posted by *boredgunner*
> 
> Water cooled I assume?


Yep, that's correct =)


----------



## nexxusty

Quote:


> Originally Posted by *Ragnarook*
> 
> I got two Evga 1080 FE
> The strix ocx bios will let you run at 1.200v, what els would you like to do with a pascal bios tool then to raise the voltage ?


I have an EVGA FE too... stop. You're killing me here. Lol.

1.200v would be so sweet. I shall wait a little longer though.
Quote:


> Originally Posted by *Ragnarook*
> 
> Yep, thats correct =)


I believe your GPUs were in a pic a few pages back? With universal GPU blocks on them?

I didn't notice if there was any VRM or RAM cooling on them.... necessary or?


----------



## marc0053

I successfully installed the Asus Strix XOC BIOS posted by Dancop on a Gigabyte G1 and gained about 25MHz over the stock G1 BIOS, under water in both cases.
Idle temps between both BIOSes are the same for me at around 25C, and load temps are 1C higher with the XOC BIOS, at a maximum of 37C.

XOC bios
http://hwbot.org/submission/3258437_marc0053_gpupi___1b_geforce_gtx_1080_16sec_178ms

G1 bios
http://hwbot.org/submission/3242251_


----------



## boredgunner

I'm too much of a coward to try flashing the BIOS. My card gets too hot at 1.075v on air anyway. I plan to water cool my next GPU though, so who knows...


----------



## KillerBee33

@Ragnarook
Everything you see here and then some. After bricking my old 970 with someone else's BIOS, I like to do things myself.


----------



## Ragnarook

Quote:


> Originally Posted by *nexxusty*
> 
> I have an EVGA FE too... stop. You're killing me here. Lol.
> 
> 1.200v would be so sweet. I shall wait a little longer though.


For one of my cards it made a nice difference (from 2200MHz to 2316MHz); on the other card it didn't do much at all, getting it up to 2200 from ~2150-2164MHz. So both cards could run at 2200MHz, but my better card (card 1) hits 2200 without raising the voltage at all, running at 1.030v, while the second one (card 2) will only hit 2200MHz with 1.200v, though it can do 2150-2164MHz (depending on the game/benchmark) at its original 1.080v.

So in my case I would have used this BIOS if I only had one card (card 1), but since I run SLI and my second card (card 2) isn't as good as card 1, there isn't much point for me in running it 36-50MHz faster at 1.200v.


----------



## nexxusty

Quote:


> Originally Posted by *Ragnarook*
> 
> For one of my cards it did a nice differens (from 2200mhz to 2316mhz) on the other card it didnt do much at all tho, got it up too 2200 from ~2150-2164mhz. So both card could run at 2200mhz, but my better card (card 1) hits 2200 w/o raising the volt at all, running at 1.030v. And the second one (card 2) will only hit 2200mhz with 1.200v but can do 2150-2164mhz (depening on game/benchmark program) at its original volt 1.080v
> 
> So in my case i would have used this bios if i only had one card (card 1) but since i run sli and my second card (card 2) isnt as good as card 1 there isnt much of a point for me to run it 36-50mhz faster at 1.200v


Hmm. Well, that report makes me less inclined to expect big gains from my FE. She does 2075MHz right now at 80% fan.

Meaningless I suppose, now that I think about it. You have yours under water, stock BIOS or not.

I realized the pic I thought you posted wasn't yours. It was JPMBoy's. Do you have full waterblocks on yours?


----------



## Ragnarook

Quote:


> Originally Posted by *nexxusty*
> 
> Hmm. Well that report makes me less inclined to expect lots of gains from my FE. She does 2075 mhz right now at 80% fan.
> 
> Meaningless I suppose now that I think about it. You have yours under water, stock BIOS or not.
> 
> I realized the pic I thought you posted wasn't yours. It was JPMBoy's, do you have full waterblocks on yours?


Yep, EK-FC1080 GTX Nickel waterblocks on both cards, with backplates.


----------



## nexxusty

Quote:


> Originally Posted by *Ragnarook*
> 
> Yep, EK-FC1080 GTX Nickel waterblocks on both cards, with backplates.


Yeah.... lol, that's a world of difference from my stock FE. I suppose 2075MHz is decent.

Well, that settles it.. I have to get water on this card immediately. There seems to be at least 125MHz left in my core.


----------



## fireyfire

I just saw the Asus Strix XOC modded BIOS. I'm currently able to run my Zotac AMP! Edition card at 2202MHz at stock settings (on water). It maxes out at about 1.083v, so this voltage boost should give me a large boost in OC, maybe even up to 2400MHz? Maybe I'm not that lucky.


----------



## xer0h0ur

Gents, I have a question for you. I just got my Gigabyte FE GTX 1080 and an EK-FC1080 Nickel block. Are we still able to use the FE's backplate along with this block? Or do I need to buy EK's to use a backplate with this block?


----------



## Ragnarook

Quote:


> Originally Posted by *fireyfire*
> 
> I just saw the ASUS STRIX XOC modded BIOS, I am currently able to run my zotac amp! edition card at 2202 MHz at stock settings (On water) It maxes at about 1.083 MV and this voltage boost should give me a large boost in OC, maybe even up to 2400 MHZ? Maybe im not that lucky.


Only one way to find out








Quote:


> Originally Posted by *xer0h0ur*
> 
> Gents I have a question for you, I just got my Gigabyte FE GTX 1080 and EK-FC1080 Nickel block. Are we still able to use the FE's backplate along with this block? Or do I need to buy EK's to use a backplate with this block?


You will need an EK backplate to use with that waterblock.


----------



## nexxusty

Quote:


> Originally Posted by *fireyfire*
> 
> I just saw the ASUS STRIX XOC modded BIOS, I am currently able to run my zotac amp! edition card at 2202 MHz at stock settings (On water) It maxes at about 1.083 MV and this voltage boost should give me a large boost in OC, maybe even up to 2400 MHZ? Maybe im not that lucky.


Expect around 2300mhz like Ragnarook up there.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Gents I have a question for you, I just got my Gigabyte FE GTX 1080 and EK-FC1080 Nickel block. Are we still able to use the FE's backplate along with this block? Or do I need to buy EK's to use a backplate with this block?


Even without actually "Knowing" I can say with confidence the answer is a big NO.

It's plastic anyway. Ditch it.


----------



## xer0h0ur

Ah, I hadn't pulled it out of the anti-static bag yet to notice that it's plastic. LOL, what genius made the decision to use plastic for a backplate? Thanks for the answers, boys.


----------



## mr2cam

So the Strix BIOS seems to be the best of all the 1080 BIOSes at this point? I have an MSI card and the highest my voltage will go is 1.043v while under water, no matter where I put the voltage slider. Guessing I would see a little benefit from the Strix XOC BIOS?


----------



## Ragnarook

Quote:


> Originally Posted by *mr2cam*
> 
> So the Strix bios seems to be the best out of all the 1080's at this point? I have an MSI card and the highest my voltage will go is 1043 while under water, no matter where I put the voltage slider. Guessing I would see a little benefit with the strix-x bios?


Only one way to find out







The voltage slider won't do anything with the Strix BIOS either, though; you need to use the graph (Ctrl+F in MSI Afterburner) to overclock, and then it will raise the voltage too.


----------



## nexxusty

Whoa, secretive...


----------



## Accuracy158

Quote:


> Originally Posted by *nexxusty*
> 
> Did you enable DX11 and re-test SLi? You should be at 102fps according to what he is saying.....


Actually, I just re-ran it a few times and was getting a higher framerate in DX12... DX11 was only 79 FPS. Of course this is the built-in benchmark and not actual gameplay. Like I said before, it's also GTX 970s for now; I don't have a second 1080 yet or a high-bandwidth SLI bridge.


----------



## nexxusty

Quote:


> Originally Posted by *Accuracy158*
> 
> Actualy I just re-ran it a few times and was getting a higher framerate in Dx12... Dx11 was only 79 FPS. Of course this is the built in benchmark and not actual game play. Like I said before it's also GTX 970s for now I don't have a second 1080 yet or high bandwidth SLI bridge.


Hmmmm. Maxwell doesn't get a perf boost, Pascal does?

Albeit still 20% behind DX11... I guess we'll see.


----------



## mr2cam

Quote:


> Originally Posted by *Ragnarook*
> 
> Only one way to find out
> 
> 
> 
> 
> 
> 
> 
> The voltage slider wount do anything with the strix bios either tho, you need to use the graph (ctrl+f in msi afterburner) to overclock and then it will up the volt too.


I tried to use the graph with my bios and it didn't change anything


----------



## ssgwright

the new BIOS let me add 67MHz to my core at 1.2v, but honestly it was only a slight gain in benchmarks, so I flashed back. I'm on water running 2100/5500.


----------



## versions

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ah, I had not pulled it out of the anti-static bag yet to notice that its plastic. LOL, what genius made the decision to use plastic for a backplate? Thanks for the answers boys.


Yeah, the FE backplate is garbage. It's a shame because the front of the card looks and feels very nice, and then you have that crap on the back ruining it.


----------



## pez

Still looks better than no backplate to me.


----------



## kx11

Quote:


> Originally Posted by *chronicfx*
> 
> I have SLI, this score looks good. I get 10500 Graphics and 9850 total with two cards. I am +200 core and +400 mem, +90 voltage and +120% power this was a first push and "insta overclock" for me. I have been running it for a week now and have had no driver related crashes or artifacts. To boost is 2078 and eventually settles to either 2038 or 2050 game dependant. If you are more than half that you are good in my book
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for CPU comparison I run my 6700K at 4.9 and my physics can be just above or below 15000 for any given run.


with my 1080 Strix cards I can't do that; I can reach +75 on the core clock and +420 on the mem with the 120% power target and +100 voltage


----------



## Jared Pace

nexxusty, if you have G-Sync, find the FPS that BL2 can handle for avg/max, then edit WillowEngine.ini and change PhysXLevel=2 and MaxParticleResize=256.

Then put a framerate cap near the avg FPS and turn G-Sync on. For me it's like a 38-45 FPS range. Looks very smooth and stays at 45 FPS even with huge PhysX counts on Overpower 8 UVHM 4-player.
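For reference, the file being described is the game's engine config, typically somewhere under Documents\My Games\Borderlands 2\WillowGame\Config\. The exact section and key casing vary between installs, so treat this as a sketch of the two edits rather than an exact recipe; find the existing keys rather than adding new ones:

```
; WillowEngine.ini
PhysXLevel=2            ; keep GPU PhysX enabled (High)
MaxParticleResize=256   ; cap particle buffer growth to tame stutter spikes
```

Combined with a frame cap just under your average FPS, G-Sync then only has to absorb small dips instead of big swings.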


----------



## alawadhi3000

I'm ready.


----------



## nexxusty

Quote:


> Originally Posted by *Jared Pace*
> 
> nexxusty, if you have G-sync, find the FPS that BL2 can handle for avg/max, then edit willowengine.ini and change Physxlevel=2 and maxparticleresize=256
> 
> Then put a framerate cap near the avg FPS & turn G-sync on. For me its like 38-45 fps range. Looks very smooth and stays at 45 fps even with hugh physx counts on Overpower 8 UVHM 4 player


If this helps even a little I'll kiss you.

I'll install BL2 when I get home, have a new Dell 27" 1440p G-Sync to play with.
Thanks for the tip bro!


----------



## nexxusty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ah, I had not pulled it out of the anti-static bag yet to notice that its plastic. LOL, what genius made the decision to use plastic for a backplate? Thanks for the answers boys.


It's not a HORRIBLE decision.... it adds rigidity, and the card doesn't get hot enough for the plastic to have an insulating effect. Or so it seems.

Still... agreed. Jesus, am I an NV fanboy without even knowing it? Why did I defend them there? Lol.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> If this helps even a little I'll kiss you.
> 
> I'll install BL2 when I get home, have a new Dell 27" 1440p G-Sync to play with.
> Thanks for the tip bro!


BL2 runs @ 2160p, locked @ 60 FPS, not a single drop on a 1080. Same goes for BLTPS.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> BL2 runs @ 2160p, locked @ 60fps not a single drop on a 1080. So as BLTPS.


Not with PhysX enabled.

Please don't respond saying it runs without a framedrop with PhysX on... lol. You seem like someone who values your credibility.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Not with PhysX enabled.
> 
> Please don't respond saying it runs without a framedrop with PhysX on... lol. You seem like someone who values your credibility.


everything MAXED OUT


----------



## kx11

Quote:


> Originally Posted by *nexxusty*
> 
> Not with PhysX enabled.
> 
> Please don't respond saying it runs without a framedrop with PhysX on... lol. You seem like someone who values your credibility.


he might be right. NVIDIA's new power management option (Optimal Power) seems to hold 60 FPS in games amazingly well


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> everything MAXED OUT


Well... there goes your credibility... J/K.

BL2 can't run flawlessly with PhysX on. That's a fact. Frame times are RIDICULOUS with PhysX on. You seriously don't notice that?

Or are you one of those people who think they have the only system in the world that runs BL2 with PhysX on? Heh.









Regardless I'm not getting into this again. Pretty sure I've already had it out with someone on this very thread and on this very subject. Not again, lol.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Not with PhysX enabled.
> 
> Please don't respond saying it runs without a framedrop with PhysX on... lol. You seem like someone who values your credibility.


Just for you, Sir, I will make a video...


----------



## nexxusty

Here we go... lol.

Alright I'll play ball. Let's see this.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Well... there goes your credibility... J/K.
> 
> BL2 can't run flawlessly with PhysX on. That's a fact. Frame times are RIDICULOUS with PhysX on. You seriously don't notice that?
> 
> Or are you one of those people who think they have the only system in the world that runs BL2 with PhysX on? Heh.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regardless I'm not getting into this again. Pretty sure I've already had it out with someone on this very thread and on this very subject. Not again, lol.


By the way, my 980 ran BL2 @ 1620p with drops only in Caustic Caverns and Thousand Cuts.....


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> By the way my 980 ran BL2 @ 1620 with drops only in Caustic Caverns and Thousand Cuts.....


I haven't tried BL2 in over two years. You're getting me excited that something has changed... lol.

Caustic and Thousand Cuts are the worst for sure. Same with the first level leading up to the ship; I always experienced FPS drops there. Gave up, as I thought the game just had way too many draw calls.

Each successive upgrade bringing little to no performance gain seemed to confirm this.


----------



## KillerBee33

This was a few years back, without a BIOS mod. Yes, this is my video. I had a dispute about changing PhysX to CPU; here is the outcome, and I'll make a new one just for you, Sir.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> This was few years back without BIOS MOD , yes this is my Video. Had a dispute about changing PhysX to CPU , here is the outcome, and i'll make a new one just for you Sir.


I've tried PhysX on the CPU as well as a dedicated PhysX card... it didn't work like that when I did.

That was on an..... i7 920 @ 4.4GHz and a 470 at 840MHz core. A 5930K @ 4.4GHz and a 1080 is a big difference from then.

Interesting.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> I've tried PhysX on CPU as well as dedicated PhysX... didn't work like that when I did.
> 
> This was on an..... i7 920 @ 4.4ghz and a 470 at 840MHz core. 5930k @ 4.4ghz and a 1080 is a big difference from then.
> 
> Interesting.


This video was made on an Alienware X51 with an i7 4770 plus a 970, all on a 330W PSU jammed in a 13x12x5 inch box








You can't be serious that a 1080 can't run this


----------



## Jared Pace

Quote:


> Originally Posted by *KillerBee33*
> 
> This was few years back without BIOS MOD , yes this is my Video. Had a dispute about changing PhysX to CPU , here is the outcome, and i'll make a new one just for you Sir.


edit: nvm, I see you have 60 FPS with PhysX set to use the 970

just set my 980 to PhysX in the options on the new driver and I can keep higher than 45 FPS now; guess they fixed something that used to be a problem in BL2


----------



## dante`afk

Question about flashing:

is NVFlash 5.292.0 the most recent one for flashing a 1080?
what is the command? just nvflash yourfilenamehere.rom ?
is it safe to do it in Windows?

is the Asus Strix OC the only BIOS able to get 1.2v? What about the Inno3D?

is anyone here with an EVGA 1080 FTW? what are your results?


----------



## mouacyk

Quote:


> Originally Posted by *versions*
> 
> Yeah, the FE backplate is garbage. It's a shame because the front of the card looks and feels very nice, and then you have that crap on the back ruining it.


You obviously didn't know about NVIDIA's premium materials.

@Ragnarook - Why no FS benchmark at 2300? Would like to see that OC backed up.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> This video was made on an Alienware X51 with i7 4770 plus 970 and all that on 330W psu jammed in a 13X12X5 inch BOX
> 
> 
> 
> 
> 
> 
> 
> 
> You cant be serious that 1080 cant run this


I'm serious about the engine being UE3 and a garbage implementation to boot.

In no way am I saying a 1080 SHOULDN'T run this. The fact remains.... until proven wrong of course.


----------



## axiumone

Quote:


> Originally Posted by *dante`afk*
> 
> Question about flashing:
> 
> is NVFlash 5.292.0 the most recent one for flashing the 1080?
> what is the command? just nvflash yourfilenamehere.bios ?
> is it safe to do it in Windows?
> 
> is the Asus Strix OC the only one able to get 1.2v? What about the Inno3D?
> 
> is anyone here with an EVGA 1080 FTW? What are your results?


nvflash -i0 -6 romname.rom

-i0 tells it which card to flash; use the --list command to figure out your card's index number.
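To put the whole sequence in one place, this is roughly how a flash usually goes (a sketch, not gospel: flag spellings can differ between nvflash builds, so check `nvflash --help` on your version before running anything):

```shell
# List detected GPUs with their index numbers (the 0 in -i0)
nvflash --list

# Save a backup of the current BIOS before flashing anything
nvflash -i0 --save backup.rom

# Flash the new BIOS to adapter 0; -6 allows a board/PCI subsystem ID mismatch
nvflash -i0 -6 romname.rom
```

Keep the backup.rom somewhere safe; it's the only easy way back to stock if the new BIOS misbehaves.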


----------



## bfedorov11

Borderlands 2? I get 100+ fps in 4k, maxed, high physx.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> I'm serious about the engine being UE3 and a garbage implementation to boot.
> 
> In no way am I saying a 1080 SHOULDN'T run this. The fact remains.... until proven wrong of course.


Yeah, I've spent many months in BL. Those drops were always annoying, but it's mainly a few maps; the rest of the game runs great even in heavy battles








Video on its way with Sanctuary, Thousand Cuts and Highlands. 2 out of 3 maps run steady, and of course Thousand Cuts will never be fixed, to compare









Keep in mind I'm recording in the background @ 4K; a 2GB video will take a few minutes to upload and process on YT


----------



## looniam

Quote:


> Originally Posted by *KillerBee33*
> 
> This was a few years back without a BIOS mod; yes, this is my video. Had a dispute about changing PhysX to CPU, here is the outcome, and I'll make a new one just for you, sir.


that would have saved me sooo many arguments a few years ago.

_always use "auto" for physX._

+REP from a physX lover.


----------



## KillerBee33

The 1080p version will be available as soon as YouTube is ready for it, a few hours as usual, but here it is. V-Sync ON


----------



## ArakniD

Ok.. Gigabyte 1080 FE owner here. I've loaded the Strix OC bios successfully and now I can run all the way up to 1.2v without an issue.

Now this bios has lost the power usage readout.. that is somewhat important, I feel..??

Is there a bios out there that still has a power cap setting.. but raised by, say.. 30%?

I'm unsure of the power usage at the higher clock rates!

If I let 2000 MHz @ 1.05v = 108% power;

speed x volts x factor = power;

factor = 0.00051428571428571429

2300 MHz x 1.2v x factor = 142%

The power phase circuits use the part NTMFD4C85N to regulate the Vcore.. these can handle, at 85 degC (maximum air-cooled temp), let's say 108% power draw.

If we water cool, those parts are kept an estimated 45 degC lower.. so let's assume these power phase chips can handle 30% higher load when kept at a reasonable temperature.. (major assumptions here)..

I """think""" 2.3GHz @ 1.2v (if stable) is the maximum power draw that these cards could sustain..

It would pull an additional 42% over the card's rated TDP.

So what is safe? 2.2GHz?

Please, any more tech wizards out there? Please elaborate
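The scaling in the post above can be written out in a few lines. To be clear, this is just the same linear model (power proportional to clock x voltage), with the assumed calibration point of 2000 MHz @ 1.05v = 108% TDP; nothing here is measured:

```python
# Back-of-envelope power model from the post: power% ~ clock_mhz * vcore * k.
# k is calibrated from the assumed point 2000 MHz @ 1.05 V = 108% of TDP.

def power_percent(clock_mhz: float, vcore: float, k: float) -> float:
    """Estimated board power as a percentage of rated TDP."""
    return clock_mhz * vcore * k

# Solve for k from the calibration point.
# (The post's 0.000514... is the same factor expressed as a fraction, not a percent.)
k = 108.0 / (2000 * 1.05)  # ~0.0514 %/(MHz*V)

print(round(power_percent(2300, 1.20, k)))  # 142 -- matches the 142% above
print(round(power_percent(2200, 1.15, k)))  # 130 -- one data point for the "is 2.2GHz safe?" question
```

Crude on purpose: it ignores how leakage current rises with temperature, which is exactly why the water-cooling assumption matters.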


----------



## mr2cam

Flashed to the Strix bios and I'm not able to adjust my power limit or temp limit in MSI Afterburner; also Ctrl+F does nothing now. Any ideas why?


----------



## scaramonga

*Waits patiently for the rest of lab rats to test before doing anything*


----------



## mr2cam

I'm almost wondering if it was a failsafe that MSI put into their cards. Went ahead and flashed back to the stock bios and can now control power limit and temp limit. I am going to wait until someone figures out how to mod the GP104 bios.


----------



## ssgwright

I ran a higher clock at 1.2 with the Strix bios and it actually performed worse than my stock FE bios


----------



## fitzy-775

So since I've had this 1080 my monitor keeps flickering pretty badly; is this happening to anyone else?


----------



## boredgunner

Quote:


> Originally Posted by *fitzy-775*
> 
> So since I've had this 1080 my monitor keeps flickering pretty badly, is this happening to anyone else?


144 Hz flickering is a known issue. One I don't suffer from though.


----------



## Alwrath

Ran the Firestrike demo with my Core i5 760 @ 4.2 and my GeForce 1080 AMP Extreme at 2114MHz core / 10800MHz memory (STOCK VOLTAGE):

Total score 15004, Graphics 23806, FPS 115, FPS 93

Looks like my CPU is still kicking ass and chewing bubblegum, even after 5 1/2 years


----------



## scaramonga

Quote:


> Originally Posted by *fitzy-775*
> 
> So since I've had this 1080 my monitor keeps flickering pretty badly, is this happening to anyone else?


Use the latest drivers. It still happens though, more so on 144Hz monitors. Also: you MUST remove older drivers via DDU, along with profiles. And DO NOT import any profiles via Nvidia Inspector!!

When done, and installing new driver set, choose 'clean install' via driver options.

Now, what I'd recommend you do, is what I do, and have been doing for years.

1. Download latest driver set and extract that set to any folder you wish.
2. Gaze in amazement at all the added bloat within said folder.
3. Say goodbye to most of it, and Delete until you have as below:










4. Go to Programs & Features and....

Uninstall Vulkan ****.
Uninstall PhysX.
Uninstall any other Nvidia ****.
Uninstall Nvidia drivers.
Do NOT reboot.
Pull plug from ethernet connection to internet.

5. Run the very latest DDU, and choose to run in 'safe mode'.
6. Uninstall and clean Nvidia stuff.
7. After boot, run setup from driver folder with removed bloat, and tick 'clean install' option.
8. Reboot.
9. Connect ethernet back.


----------



## boredgunner

Quote:


> Originally Posted by *Alwrath*
> 
> Ran Firestrike demo with my Core i5 760 @4.2 and 2114 mhz core 10800 mhz memory Geforce 1080 amp extreme ( STOCK VOLTAGE ) :
> 
> Total score 15004,Graphics 23806,fps 115,fps 93
> 
> Looks like my cpu is still kicking ass and chewing bubblegum, even after 5 1/2 years


I was so proud of my 4.2 GHz overclock on that CPU. Funny thing is, that was the normal overclock for it. Everyone was getting it, despite its stock speed being 2.8 GHz. That's a 50% overclock. I wish current day Intel CPUs overclocked that well... a 50% overclock on the i7 6700k would be 6 GHz. My i5 760 was hot as hell at 4.2 GHz though.

With that being said, you would notice improvements with a newer CPU, i7 especially. I went from the i5 760 @ 4.2 GHz to an i7 2600 non-K (it was free) and saw some very nice improvements. The i7 6700k didn't make much of a difference in the games I tested, but supposedly it does in a few like Fallout 4. I'd upgrade my CPU/mobo/RAM if I were you.


----------



## Alwrath

Quote:


> Originally Posted by *boredgunner*
> 
> I'd upgrade my CPU/mobo/RAM if I were you.


Oh I plan to, as soon as Zen and Skylake-E benches hit the web


----------



## bfedorov11

Quote:


> Originally Posted by *mr2cam*
> 
> Flashed to the strix bios and im not able to adjust my powerlimit or temp limit on msi afterburner, also ctrl+f does nothing now, any ideas why?


That's because the PT was removed.

I have no idea if this really does anything, but in AB options, under voltage control, I had it set to standard MSI. The first few times I was trying it, I couldn't get it to increase voltage either. Not sure what fixed it.

The guy that posted the bios does not recommend using it on other cards.

http://forum.hwbot.org/showpost.php?p=452701&postcount=11


----------



## ArakniD

Quote:


> Originally Posted by *bfedorov11*
> 
> The guy that posted the bios does not recommend using it on other cards.
> 
> http://forum.hwbot.org/showpost.php?p=452701&postcount=11


Seems like a fairly standard recommendation... works on EVGA, Gigabyte and MSI cards tho...

But what is the safe limit on a FE?


----------



## uberwootage

I did some data sheet checking, took my system to work, put her up on ye olde Agilent, and ran some 1.2v benches at 2.265GHz; everything was fine. All the measurements were within spec, and I don't think you will hit the risk zone until about 1.25 - 1.3v, if they ever unlock that.

Now, am I going to say I checked it and it's good to go, green light that ****? No. I just did some measurements; I did not dig down to the nitty gritty and compare circuits between the cards. Also keep in mind the tolerance on the components is not the same for every card, so what would be fine on my card could make your card cry. But I will be running that bios as a daily driver until something better comes out.

Side note: the Corsair CX600 was retired today. Hooked it up to see how badly I was beating on it, and it was pretty bad lol. Replaced it with an EVGA SuperNOVA 1kW and it's much, much more stable. The CX600 was fine, but I was hitting it hard with water pumps (2 loops), fans, USB stuff and overclocks. And it's old; after years of being smacked like that, time to replace it before it gets revenge and decides to kill my mobo or GPU. I had an old PSU take its wrath out on a mobo once: lost the onboard NIC and vaporized a trace off the 24-pin. But I got it working again. Good ol' Athlon 64 days lol


----------



## ArakniD

Quote:


> Originally Posted by *uberwootage*
> 
> I did some data sheet checking and took my system to work and put her up on ye old agilent and ran some 1.2v benches at 2.265ghz and everything was fine. All the measurements were withent spec and I don't think you will hit the risk zone until about 1.25 - 1.3v if they ever unlock that.


What was the current draw? I've got an Agilent DMM at work with ethernet.. it will log the 8-pin OK if I make up an intercept cable..

But it will ignore the PCIe slot source.. does Vcore only source from the 8-pin header? And all other rails source from the PCIe slot??

I'm sure some log dumps here would be nice?


----------



## uberwootage

Quote:


> Originally Posted by *ArakniD*
> 
> What was the current draw ? I've got an Agilent DMM at work with ethernet.. that will log the 8pin OK if I makeup a intercept cable..
> 
> But it will ignore the pcie source.. does vcore only source from the 8pin header? And all other rails source from pcie??
> 
> I'm sure some log dumps here would be nice?


Monday I'll toss some up. I was just writing them down and left the paper at work. It was just a personal thing; didn't know anyone would be interested in it lol. I can log it; I test fiber-optic receivers, so I have access to a few million dollars in test equipment. Only reason: I'm an RF engineer, I get all the fun toys.

Monday I'll get some numbers and, if I have time, get some logs and upload them, or at the very least put up a video on YouTube.


----------



## metal409

Not sure if anyone else tried this or cares, but the studs used to mount the FE backplate are the same thread that EK uses for their waterblocks. I just had to slightly shorten them to work.


----------



## ArakniD

Quote:


> Originally Posted by *uberwootage*
> 
> Monday I'll toss some up. I was just writing them down and left the paper at work. It was just a personal thing dident know anyone would be interested in it lol. I can log it I test fiberoptic receivers so I have access to a few million dollars in test equipment. Only reason. I'm an RF engineer I get all the fun toys.
> .


Ah, top bloke.

Your multichannel equipment will be far superior to my lesser single-channel equip.

A few MHz steps around 1900, 2000, 2100, 2200, with 1.05 to 1.2v at each step where stable, should do







a few 20-second runs? Voltage, current and clock?

Should be able to predict power draw at MHz x voltage then.. you get the idea









Should then be able to say what's """safe""" at what temp..


----------



## ArakniD

Quote:


> Originally Posted by *metal409*
> 
> Not sure if anyone else tried this or cares, but the studs used to mount the FE backplate are the same thread that EK uses for their waterblocks. I just had to slightly shorten them to work.


I did the same. Be careful when removing the block in future! They didn't apply thread lock to the inserts! So it's easy to unscrew the insert along with the screw and put pressure on the PCB. That got me last night! Broke one stud putting it back together. Luckily it didn't damage the card....

EK.. plz apply thread lock


----------



## uberwootage

Quote:


> Originally Posted by *ArakniD*
> 
> Ah top bloke.
> 
> Your multichannel equipment will be far superior to my lesser single chan equip.
> 
> A few MHz steps around 1900 2000 2100 2200 with 1.05 to 1.2v at each step where stable should do
> 
> 
> 
> 
> 
> 
> 
> few 20s runs? Voltage, current and clock?
> 
> Should be able to predict power draw at MHz x voltage then.. you get the idea
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should then be able to say what's """safe""" at what temp..


Yeah, and they are more robust than people think. There are the specs they list, and then higher specs they are actually fine at. Same thing with what we do: our internal spec is way higher than the customers' specs and what we spec them for afterwards. We have amps that are spec'd for a 5v bias, but we found you can run them fine at 19v and get way more performance out of them. You just gotta get an idea of what their internal spec is and see how much margin they spec in to cover for slight failures.

The equipment and machine shop access is nice. But still not worth dealing with all the peasants there.


----------



## Riadon

Would there be anything wrong with me using the XOC BIOS with a 1070, or does it only work for 1080s?


----------



## KillerBee33

Looks like the "X" likes to be boosted








http://www.3dmark.com/3dm/13067828


----------



## dante`afk

if I'm lucky both of my cards could arrive today. Scheduled delivery is Monday, but per the tracking they are already in the city.


----------



## axiumone

The evga hybrid kit is available. Comes with the pump/rad plus a blower. It'll be usable on any reference board.

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


----------



## ArakniD

The Strix OC bios rocks.. I could not get my 1080 FE card above 1900MHz with the default bios. Now I've pushed 2100MHz rock solid at 1.15v.. I'll keep pushing up until it stops..

(edit) 2150 is too high.. 2100 is just right @ 1.15v with 1400MHz memory. I guess I didn't strike the OC lottery with a 2300-capable unit


----------



## kx11

since I have the original Strix cards running SLI, MSI AB is reporting that the cards don't pass 2063 core clock no matter what, while memory clocks can reach 11190MHz


----------



## Jpmboy

Quote:


> Originally Posted by *boredgunner*
> 
> I was so proud of my 4.2 GHz overclock on that CPU. Funny thing is, that was the normal overclock for it. Everyone was getting it, despite its stock speed being 2.8 GHz. That's a 50% overclock. I wish current day Intel CPUs overclocked that well... a 50% overclock on the i7 6700k would be 6 GHz. My i5 760 was hot as hell at 4.2 GHz though.
> 
> With that being said, you would notice improvements with a newer CPU, i7 especially. I went from the i5 760 @ 4.2 GHz to an i7 2600 non-K (it was free) and saw some very nice improvements. The i7 6700k didn't make much of a difference in the games I tested, but supposedly it does in a few like Fallout 4. I'd upgrade my CPU/mobo/RAM if I were you.


An only-average 5960X does a 50% OC (4.5GHz) on all 8 cores. A 6950X is close (4.3-4.4 being average on 10 cores); I'm running 4.5 ATM. So there are, and have been, CPUs since the 920 that run 50% over stock.
But, IMO, a good 4.7-4.8GHz 6700K is the best gaming value ATM. Don't expect to see 50%-over-stock OCs as a regular thing at 14nm.


----------



## Rhadamanthis

Quote:


> Originally Posted by *axiumone*
> 
> The evga hybrid kit is available. Comes with the pump/rad plus a blower. It'll be usable on any reference board.
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


damn, not for europe damnnnnnnnnnnnnnnnnnn


----------



## boredgunner

Quote:


> Originally Posted by *Jpmboy*
> 
> An only-average 5960X does a 50% OC (4.5GHz) on all 8 cores. A 6950X is close (4.3-4.4 being average on 10 cores); I'm running 4.5 ATM. So there are, and have been, CPUs since the 920 that run 50% over stock.
> But, IMO, a good 4.7-4.8GHz 6700K is the best gaming value ATM. Don't expect to see 50%-over-stock OCs as a regular thing at 14nm.


Yeah the i7 6700k is the best overall gaming CPU (I wish mine could hit 4.8 or even 4.7 though), I just miss those crazy overclocks.


----------



## KillerBee33

Quote:


> Originally Posted by *axiumone*
> 
> The evga hybrid kit is available. Comes with the pump/rad plus a blower. It'll be usable on any reference board.
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


[email protected] If I can get that bracket only, I'd go with an H90; EVGA hasn't changed anything at all








Ugly @ss shroud just got even worse with more metal


----------



## uberwootage

Quote:


> Originally Posted by *Riadon*
> 
> Would there be anything wrong with me using the XOC BIOS with a 1070, or does it only work for 1080s?


You will brick it. Just the different RAM alone: different timings and speeds. Do not flash it to a 1070.


----------



## SweWiking

Quote:


> Originally Posted by *ArakniD*
> 
> StrixX OC bios rocks.. I could not get my 1080 FE card above 1900 MHz with the default bios. Now I've pushed 2100 MHz rock solid at 1.15v.. I'll keep pushing up until it stops..
> 
> (edit) 2150 is too high.. 2100 is just right @ 1.15v with 1400 MHz memory. I guess I didnt strike the OC lottery with a 2300 capable unit


I agree, 1.15v seems to be the sweet spot. My EVGA 1080 FE did just 2038MHz, but with this bios I could get an extra 100MHz out of it. Adding more voltage after 1.15v just made the card unstable, even at lower clock speeds.


----------



## Noshuru

Quote:


> Originally Posted by *SweWiking*
> 
> I agree 1.15v seems to be a sweet spot. My 1080 evga fe did just 2038mhz. But with this bios i could get an extra 100 mhz out of it. Adding more volt after 1.15v just made the card unstable even at lower clock speed.


How do I limit it to 1.15v?


----------



## arrow0309

Quote:


> Originally Posted by *Alwrath*
> 
> Just got my geforce 1080 amp extreme in the mail yesterday.
> 
> 
> 
> 
> 
> 
> 
> Huge upgrade from radeon 290. Didnt have much time with it but tested crysis 3 and Doom so far, highest stable oc is 2114 with no voltage increase, custom fan profile set and card doesnt go over 50c during gaming, power target stays mostly around 80%. Yeah im in heaven right now ?


+Rep

I ordered one yesterday as well; quite a rocking monster-OC-out-of-the-box video card, congrats mate















Hope I'm not gonna get any less than yours









And also,
Hi everybody, I'm also gonna join soon with the Zotac 1080 Amp Extreme


----------



## SweWiking

Quote:


> Originally Posted by *Noshuru*
> 
> How do I limit it to 1.15v?


Use MSI AB and the graph (curve) overclocking method, and make sure the highest point is at 1.150v; just match that to whatever MHz you need.










Like here for example, you can see I got 2062MHz at 1.150v.
You can lower every "point" by shift-clicking any point and pulling it down a bit, then just pull the one you need, in this example 1.150v, up to the MHz you need. In this example 2062MHz.

It says -14 here because the Strix XOC bios defaults to 2088MHz. Once you go above 2088MHz it will start saying + instead.

*Sorry my message looks like hell, I'm typing on my cell


----------



## ssgwright

I think I found my Zotac FE sweet spot; I'm able to get 1.093v on the stock bios, and max for me is 2126/5400


----------



## Naked Snake

Guys, are you using the Strix bios on the FE on air or water?


----------



## nexxusty

Quote:


> Originally Posted by *axiumone*
> 
> The evga hybrid kit is available. Comes with the pump/rad plus a blower. It'll be usable on any reference board.
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


Well... you rock. Lol.

Waiting for this. Thanks!


----------



## SweWiking

Quote:


> Originally Posted by *Naked Snake*
> 
> Guys are you using the Strix bios on FE on air? or water?


I use an EK block.


----------



## uberwootage

Quote:


> Originally Posted by *SweWiking*
> 
> Use msi a:b and that graph overclocking way. And make sure that the highest point is at 1150v and just match that to whatever mhz you need.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Like here for exemple you can see I went got 2062mhz at 1.150v
> You can lower every "Point" by shift + click any Point and pull them down a bit, then just pull the one you need, in this example 1.150v, up to the MHz you need. In this example 2062MHz.
> 
> It says -14 here because the strix xoc bios is default 2088mhz. Once you go above 2088mhz it will start saying + instead.


Yeah, I've seen that too. Something is going on with the Asus bios making it unstable at 1.2v. I think if you can't hit 1.2 on that bios, you might clock higher with the stock bios once we can up the TDP.
Quote:


> Originally Posted by *Naked Snake*
> 
> Guys are you using the Strix bios on FE on air? or water?


Water, but for some reason the Strix bios can clock higher yet needs more voltage to be stable at lower clocks.


----------



## Jordel

Got my hands on two cards; one is a plain EVGA GeForce GTX 1080 ACX 3.0, and the other one is an MSI GeForce GTX 1080 ARMOR 8G OC

I'm running both at 2000MHz core, 11008MHz memory, 1.043v on the EVGA and 0.981v on the MSI

Cooled by air, not sure if I'll take the plunge on water cooling these. I'm considering putting them in my girlfriend's computer once "1080 Ti"/"Titan P" hits, and slap some waterblocks on those instead.


----------



## pez

Anyone have experience with back ordering through Newegg? I did so with my second G1 and it says it's in packaging already. I got a preorder/backorder confirmation yesterday and now an order confirmation this morning with the packaging status. I'm not sure if this is the norm, but I'd be surprised if it was actually going to ship out Monday/Tuesday for as long as the backorder was up.


----------



## Visceral

1080/1070 Hybrid kits are out.

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1

With tax and shipping it's about $135


----------



## arrow0309

Quote:


> Originally Posted by *Jordel*
> 
> Got my hands on two cards; one is a plain EVGA GeForce GTX 1080 ACX 3.0, and the other one is an MSI GeForce GTX 1080 ARMOR 8G OC
> 
> I'm running both at 2000MHz core, 11008MHz memory, 1.043v on the EVGA and 0.981v on the MSI
> 
> Cooled by air, not sure if I'll take the plunge on water cooling these. I'm considering putting them in my girlfriend's computer once "1080 Ti"/"Titan P" hits, and slap some waterblocks on those instead.


Welcome to OCN, you hardcore gamers (assuming your girlfriend is a gamer as well)


----------



## Alwrath

Quote:


> Originally Posted by *arrow0309*
> 
> +Rep
> 
> I've ordered one yesterday as well, that's quite a nice rocking newest monster oc "out of the box" video card, congrats mate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope I'm not gonna get any less than yours
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And also,
> Hi everybody, I'm also gonna join soon with the Zotac 1080 Amp Extreme


Welcome to the club. This Zotac has pretty much the best cooling you can get for the GTX 1080, next to water cooling. Excellent card so far. Will probably put some voltage into it eventually and get 2200/2300MHz if I'm lucky ?


----------



## arrow0309

Quote:


> Originally Posted by *Alwrath*
> 
> Welcome to the club. This Zotac is the best cooling you can get for the gtx 1080 next to water cooling pretty much. Excellent card so far. Will probably put some voltage into it eventually and get 2200/2300 mhz if im lucky ?


Thanks
And yeah, let me know if you can squeeze out even more from the beast; 2200 would be real nice for me, anything above is a myth








Can't wait to have it, I hope I'll get it before next weekend.
Nice cooling as it seems; I'll probably leave it like that, on air, for a while, and then if Bitspower makes a wb for this series like they did for the 980 (Ti)s I might buy one.
I have to say that your temps are simply great; did you somehow replace the thermal grease on the GPU?
'Cause I have an Italian friend and he's getting 70 C with the stock fan curve and 65 with a modified (1:1) one.

Also, looking at your sig, are you thinking of a CPU & system upgrade too?
'Cause you may benefit some in minimum framerate and frametimes


----------



## Alwrath

Quote:


> Originally Posted by *arrow0309*
> 
> Thanks
> And yeah, let me know if you can squeeze out even more from the beast, 2200 would be real nice for me, anything above is a mith
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait to have it, I hope I'll get it before the next weekend.
> Nice cooling as it seems, I'll probably leave it like that, on air for a while and then, if the Bitspower will make a wb for this series as well as the 980 (ti)'s I might buy one.
> I have to say that your temps are simply great, did you somehow replace the thermal grease of the gpu?
> Cause I have an Italian friend and he's getting 70 C with the stock fan curve and 65 with a modified (1:1) one.
> 
> Also looking on your sig, are you also thinking of a cpu & system upgrade?
> Cause' you may benefit some from this in minimum framerate and frametimes


Stock thermal paste. The Phantom 820 is a great case: big fans and all the hot air gets out of the case fast, excellent airflow. Water cooling the CPU also helps. I will replace the thermal paste with Arctic Silver ceramic when I go for the next overclock.

As soon as Zen and Skylake-E benches and reviews come out I'll upgrade my CPU for sure, but it would be funny to just keep on trucking with this first-gen i5 till 2020. Everyone would be like "this guy lol"


----------



## Jordel

Quote:


> Originally Posted by *arrow0309*
> 
> Welcome to the ocn you hard core gamers (assuming your girlfriend is a gamer as well)


Thank you! We sure are! She's rocking a 980 Ti right now, but I'm looking to bump her up to these two 1080s, and put the 980 Ti to some other use!


----------



## ArakniD

Quote:


> Originally Posted by *SweWiking*
> 
> I agree 1.15v seems to be a sweet spot. My 1080 evga fe did just 2038mhz. But with this bios i could get an extra 100 mhz out of it. Adding more volt after 1.15v just made the card unstable even at lower clock speed.


Ah okay... I kept adding voltage AND MHz... I'll retry by adding MHz only. I had to drop memory clocks by 50MHz after long runs, as the temps started slowing the max memory speed down.. keep an eye on that too.

I can get 8000 in a FurMark 1080p run on my 2500K.. I feel like it's holding me back now in games. Time for a 6700K or Zen!


----------



## SiriusLeo

*Okay, here's the best of what I've got so far. I've gone through three Nvidia GTX 1080 Founders Edition cards and this is the best of the lot thus far. This is the STOCK bios for the Founders Edition card. These numbers are with an EK waterblock installed. Overclocking this card with GPU Boost 3.0 makes me really miss my 980 Tis with custom bios.*

*Before installing the EK water block I was unable to go over 2050 core, even with max fan speed. It would hit 60c and the voltage would start jumping all over the place, adding instability. After installing the water block on this card I haven't seen temps over 39c, and it will contentedly run along at 2100 core (no fluctuations) for hours. The core will go up to 2126MHz and is 24/7 stable there, but these settings in the graph seem to be the sweet spot. The memory on this card will OC all the way to +1000MHz (max slider) and is benchmark stable. It's 24/7 stable at +800MHz, but I see little to ZERO difference in performance past where I have it set now (+464MHz), which seems to be the sweet spot for the memory for now.*

*Adding more volts via the slider on the left tends to only add instability. This is what the core and memory are set to in the GPU-Z graph above.*

*Best 3DMark Extreme I've been able to get: 11,724. Trying to push a 12,000 graphics score without flashing the BIOS.*

*Quick snapshot.*


----------



## dentnu

Well, I have spent most of the day testing the XOC bios on my MSI 1080 Gaming X. I was able to push my card to 2300MHz @ 1.2v, which is great, but I am getting lower benchmark scores and FPS in Heaven and Firestrike. I tested my card using the XOC bios at different voltages and core frequencies, but I could never reach or beat my Firestrike score of 20135 on my stock bios with my max clocks of 2139MHz core and 5580MHz on the memory.

I even tested the XOC bios using the same clocks and volts I used with my stock bios and could not reach or even get close to my top score of 20135 in Firestrike. I kept getting a score of 19610, and I noticed my FPS was a lot lower in Heaven. I tried reinstalling drivers and the XOC bios, same thing. I do not know what is going on, but the XOC bios is no good on my card.

While I can push my core super high and it's stable, with no artifacting or driver crashing, it's no good since I am losing FPS and getting lower scores in 3DMark. I went ahead and flashed my stock bios back and tested, and sure enough got my 20000+ score in Firestrike. I highly recommend you guys test the XOC bios against your stock bios and make sure you are getting higher FPS and higher scores in benchmarks.

Link to my top score, stock bios:

Fire Strike 1.1 Score = 20135
http://www.3dmark.com/fs/9053405

Fire Strike Extreme 1.1 Score = 11141
http://www.3dmark.com/fs/9073628


----------



## uberwootage

Quote:


> Originally Posted by *dentnu*
> 
> Well I have spent most of the day testing the XOC Bios on my MSI 1080 Gaming X. I was able to push my card to +2300MHz @ 1.2v. Which is great but I am getting lower benchmark scores and FPS in Heaven and Firestrike. I tested my card using the XOC bios at different voltages and core freq. but I could never reach or beat my firestrike score of 20135 on my stock bios with my max clocks of 2139MHz core and 5580MHz on the memory.
> 
> I even tested the XOC bios using the same clocks and volts I used with my stock bios and could not reach or even get close to my top score of 20135 in firestrike. I kept getting a score of 19610 and I noticed my FPS was allot lower Haven. I tried reinstalling drivers and XOC bios and same thing. I do not know what is going on but the XOC bios is no good on my card.
> 
> While I can push my core super high and its stable with no artifacting nor driver crashing it is not any good since I am losing FPS and getting lowers scores in 3dmark. I went ahead and flashed my stock bios back and tested and sure enough got my +20000 score in firestrike. I highly recommend you guys test the XOC bios and compare with your stock bios and make sure you are getting higher FPS and high scores in benchmarks.
> 
> Link to my top score on firestrike
> 
> http://www.3dmark.com/fs/9053405


Yeah, that bios is typical Asus: toss up some numbers to distract you from the drop in stuff that counts. Asus lovers, I'm sorry, but after the Sabertooth I had, everything Asus I have got has been junk. Like my 970 Strix that could not overclock because Asus put a bios on it that crippled the card so it could not be overclocked at all compared to a G1 or other brands.

From what I've seen, the only good thing is that the TDP is unlocked in that bios. Clocks with that bios take more voltage than others to be stable, and you take a good FPS hit when using it.

For all you with i5s: I tested a GTX 1080 FE with a crap-clocking 4690K and it ran nice. 25,000 GPU score. http://www.3dmark.com/3dm/12757481

We really need a better bios. Just bump the TDP to 150 or 200%.


----------



## ssgwright

Quote:


> Originally Posted by *dentnu*
> 
> Well I have spent most of the day testing the XOC bios on my MSI 1080 Gaming X. I was able to push my card to +2300MHz @ 1.2V, which is great, but I am getting lower benchmark scores and FPS in Heaven and Firestrike. I tested my card using the XOC bios at different voltages and core frequencies, but I could never reach or beat my Firestrike score of 20135 on my stock bios with my max clocks of 2139MHz core and 5580MHz on the memory.
> 
> I even tested the XOC bios using the same clocks and volts I used with my stock bios and could not reach or even get close to my top score of 20135 in Firestrike. I kept getting a score of 19610, and I noticed my FPS was a lot lower in Heaven. I tried reinstalling the drivers and the XOC bios, same thing. I do not know what is going on, but the XOC bios is no good on my card.
> 
> While I can push my core super high and it's stable with no artifacting or driver crashing, it is no good since I am losing FPS and getting lower scores in 3DMark. I went ahead and flashed my stock bios back, tested, and sure enough got my 20000+ score in Firestrike. I highly recommend you guys test the XOC bios against your stock bios and make sure you are getting higher FPS and higher scores in benchmarks.
> 
> Link to my top score on firestrike stock bios
> 
> Fire Strike 1.1 Score = 20135
> http://www.3dmark.com/fs/9053405
> 
> Fire Strike Extreme 1.1 Score = 11141
> http://www.3dmark.com/fs/9073628


same here, better overclocking with the XOC bios but worse performance than the stock bios (using an FE card)


----------



## dentnu

Quote:


> Originally Posted by *ssgwright*
> 
> same here, better overclocking with the XOC bios but worse performance than the stock bios (using an FE card)


Thanks for letting us know, I thought I might be the only one who experienced it. If more people are having the same issue, then I see no point in using this bios. Yes, you can hit higher overclocks, but you don't get any benefit from it; in fact you get worse performance than with your stock bios.


----------



## TK421

so how long until we get a proper bios editor lol?


----------



## sok0

Just finished watercooling my G1 1080 Gaming. Temps max out at 43C. Stock bios, using MSI Afterburner.

http://www.3dmark.com/3dm/13084696


----------



## ssgwright

not bad, here's mine.... it's all about tweaking the curve!

http://www.3dmark.com/3dm/13085185


----------



## KillerBee33

Quote:


> Originally Posted by *sok0*
> 
> Just finished watercooling my G1 1080 gaming. Temps max out at 43c . Stock bios using MSI afterburner.
> 
> http://www.3dmark.com/3dm/13084696


I'll see your watercooled G1 and raise you an MSI FE on air








http://www.3dmark.com/3dm/13067828


----------



## Benjiw

Quote:


> Originally Posted by *KillerBee33*
> 
> I'll see your watercooled G1 and raise you MSI FE on air
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/13067828


Looks like your ram speed pulled you ahead there. Nice scores though guys.


----------



## KillerBee33

Quote:


> Originally Posted by *Benjiw*
> 
> Looks like your ram speed pulled you ahead there. Nice scores though guys.


Goes easily to +600, but it kills the performance after +550. Can't wait to see what this thing will do with proper power management @ 1.25V under water


----------



## scaramonga

Quote:


> Originally Posted by *SiriusLeo*
> 
> *Okay, here's the best of what I've got so far. I've gone through three nVidia GTX 1080 Founders Edition cards and this is the best of the lot thus far. This is the STOCK bios for the Founders Edition card. These numbers are with an EK waterblock installed. Overclocking this card with GPU Boost 3.0 makes me really miss my 980 Tis with custom bios.
> 
> 
> 
> 
> 
> 
> 
> *


Interesting.

I take it that you must have the EVGA FE card, as Precision XOC will not work properly with other non-EVGA FE cards, like my PNY variant? Regardless, I'm kinda regretting getting rid of my 980 Ti also, as there is not much margin for error on the 1080, with or without 8+6 input; it really doesn't seem to matter. EK block fitted here also, but now I'm wondering if there was any need to do so?


----------



## sherlock

Quote:


> Originally Posted by *scaramonga*
> 
> Interesting.
> 
> I take it that you must have the EVGA FE card, as Precision XOC will not work properly with other non-EVGA FE cards, like my PNY variant? Regardless, I'm kinda regretting getting rid of my 980 Ti also, as there is not much margin for error on the 1080, with or without 8+6 input; it really doesn't seem to matter. EK block fitted here also, but now I'm wondering if there was any need to do so?


The only thing differentiating an EVGA FE from any other FE is the box; it has nothing to do with whether Precision XOC will work with it or not. You either have an odd card that doesn't respond or a bad version of XOC, that's about it.


----------



## scaramonga

Quote:


> Originally Posted by *sherlock*
> 
> The only thing differentiating an EVGA FE from any other FE is the box; it has nothing to do with whether Precision XOC will work with it or not. You either have an odd card that doesn't respond or a bad version of XOC, that's about it.


So ALL features of Precision XOC work with non-EVGA cards, and I mean ALL?

I must have an 'odd' card then


----------



## ssgwright

Quote:


> Originally Posted by *scaramonga*
> 
> So ALL features of Precision XOC work with non-EVGA cards, and I mean ALL?
> 
> I must have an 'odd' card then


yup


----------



## scaramonga

Not that I'd use it 24/7 anyway, as it's a buggy mess ATM, what with the OC starting whenever it feels like it and version numbers not tying up with what has been installed, so I'll be the judge of what's wrong in that scenario.









If it turns out to be my card, then so be it, but I'm thinking otherwise, for the moment.

I prefer not to use software clocking anyway, as I'd much rather do it at BIOS level, so flash and forget, but it seems this is looking unlikely for the 1080, which is a shame.


----------



## Accuracy158

Quote:


> Originally Posted by *boredgunner*
> 
> I was so proud of my 4.2 GHz overclock on that CPU. Funny thing is, that was the normal overclock for it. Everyone was getting it, despite its stock speed being 2.8 GHz. That's a 50% overclock. I wish current day Intel CPUs overclocked that well... a 50% overclock on the i7 6700k would be 6 GHz. My i5 760 was hot as hell at 4.2 GHz though.
> 
> With that being said, you would notice improvements with a newer CPU, i7 especially. I went from the i5 760 @ 4.2 GHz to an i7 2600 non-K (it was free) and saw some very nice improvements. The i7 6700k didn't make much of a difference in the games I tested, but supposedly it does in a few like Fallout 4. I'd upgrade my CPU/mobo/RAM if I were you.


I would say the typical overclock on the i5 760 was more like 4GHz. I could hit 4.2GHz, but just barely, and it wasn't worth the extra voltage required. Some chips wouldn't do 4GHz at all, but with the base clock all the way back at 2.8GHz, overclocking still gave a nice jump.









I know this isn't 1080 related, but I still have a secondary PC with this fairly cheap old CPU that I continue to pair with modern GPUs. It wouldn't be holding up as well as it does if it didn't OC so well.


----------



## bfedorov11

Quote:


> Originally Posted by *SiriusLeo*
> 
> 
> 


Wouldn't water bypass your gpu block with all 4 ports hooked up?

Don't think I posted my FS with the xoc bios. Same thing.. lower scores.

2177/5557 xoc
http://www.3dmark.com/fs/9209254

same speed.. best with the FE bios
http://www.3dmark.com/fs/8897015


----------



## Menthol

Quote:


> Originally Posted by *ssgwright*
> 
> yup


After I install Precision XOC it asks for a name, email, and serial number of the card before it opens. Are you saying it still works if you don't enter that info, or does it only ask if it sees an EVGA bios?


----------



## nexxusty

Quote:


> Originally Posted by *bfedorov11*
> 
> 
> Wouldn't water bypass your gpu block with all 4 ports hooked up?
> 
> Don't think I posted my FS with the xoc bios. Same thing.. lower scores.
> 
> 2177/5557 xoc
> http://www.3dmark.com/fs/9209254
> 
> same speed.. best with the FE bios
> http://www.3dmark.com/fs/8897015


This looks super clean... surely his GPU temps reflect a working loop.


----------



## jprovido

my Asus Strix can only go up to 2088MHz with +100mV. Disappointed


----------



## SiriusLeo

Quote:


> Originally Posted by *scaramonga*
> 
> Interesting.
> 
> I take it that you must have the EVGA FE card, as Precision XOC will not work properly with the likes of other non-branded FE cards?, like my PNY variant. Regardless. I'm kinda regretting getting rid of my 980ti also, as there is not much margin for error on the 1080, with, or without 8+6 input, it really don't seem to matter. EK block fitted here also, but now I'm wondering if there was any need to do so?


Nah, I've been picking up the nVidia retail boxes from Best Buy every time they come in stock. It's not EVGA. When Precision asked for a name, email and SN, I just put in the info/SN from one of my old EVGA-branded 980s. So far, I've had better success/stability with Precision than Afterburner.


----------



## SiriusLeo

Quote:


> Originally Posted by *bfedorov11*
> 
> 
> Wouldn't water bypass your gpu block with all 4 ports hooked up?
> 
> Don't think I posted my FS with the xoc bios. Same thing.. lower scores.
> 
> 2177/5557 xoc
> http://www.3dmark.com/fs/9209254
> 
> same speed.. best with the FE bios
> http://www.3dmark.com/fs/8897015


Parallel configurations not only look clean but also perform as well as serial configurations. EVEN if it did perform marginally worse (and it doesn't here), I'd still set it up this way because it looks way more aesthetically pleasing!


----------



## L4TINO

I will mention again that I have an MSI 1080 Gaming X.

I've taken a look at both MSI's 1080 Gaming X OC bios, which is standard with my card, and the strix1080xoc bios.
With my standard OC bios I have noticed that even when altering the frequency curve it does not change the voltage, which sits between 1.04 and 1.0930V max under load. So my conclusion for the standard MSI OC bios is that the voltage is capped, which only let me get to 2025MHz stable under quick tests.

With the strix1080xoc bios I was able to modify the frequency curve to reach a max voltage of 1.200V, which got me a 2080MHz clock in a quick test. The temperatures also changed with this bios, showing lower temperatures even with higher voltage.

Idle temp on the MSI standard bios shows 51C.
Idle temp on the strix1080xoc bios shows 38C.

Now I'm wondering why the temperatures are so different, and why the MSI bios has a capped voltage.


----------



## Spiriva

Quote:


> Originally Posted by *L4TINO*
> 
> I will mention again that I have an MSI 1080 Gaming X.
> 
> I've taken a look at both MSI's 1080 Gaming X OC bios, which is standard with my card, and the strix1080xoc bios.
> With my standard OC bios I have noticed that even when altering the frequency curve it does not change the voltage, which sits between 1.04 and 1.0930V max under load. So my conclusion for the standard MSI OC bios is that the voltage is capped, which only let me get to 2025MHz stable under quick tests.
> 
> With the strix1080xoc bios I was able to modify the frequency curve to reach a max voltage of 1.200V, which got me a 2080MHz clock in a quick test. The temperatures also changed with this bios, showing lower temperatures even with higher voltage.
> 
> Idle temp on the MSI standard bios shows 51C.
> Idle temp on the strix1080xoc bios shows 38C.
> 
> Now I'm wondering why the temperatures are so different, and why the MSI bios has a capped voltage.


Do the fans on the MSI Gaming X not spin until your card hits a certain temp, while the strix xoc bios makes the fans spin all the time no matter the temp of the card?


----------



## achilles73

Quote:


> Originally Posted by *Spiriva*
> 
> Do the fans on the MSI Gaming X not spin until your card hits a certain temp, while the strix xoc bios makes the fans spin all the time no matter the temp of the card?


Yes, that's true. The fans on the MSI Gaming X only start to spin after 60°C, so that's the reason for the different temps compared to the Strix card, whose fans always spin.


----------



## Setzer

Yeah, unless you've touched the fan profile on the MSI Gaming X, the fans don't spin until you hit 60C


----------



## Avant Garde

I would prefer to have them always spinning; they make almost no noise anyway and give a lot better idle temps, so....


----------



## achilles73

Quote:


> Originally Posted by *L4TINO*
> 
> I will mention again that I have an MSI 1080 Gaming X.
> 
> I've taken a look at both MSI's 1080 Gaming X OC bios, which is standard with my card, and the strix1080xoc bios.
> With my standard OC bios I have noticed that even when altering the frequency curve it does not change the voltage, which sits between 1.04 and 1.0930V max under load. So my conclusion for the standard MSI OC bios is that the voltage is capped, which only let me get to 2025MHz stable under quick tests.
> 
> With the strix1080xoc bios I was able to modify the frequency curve to reach a max voltage of 1.200V, which got me a 2080MHz clock in a quick test. The temperatures also changed with this bios, showing lower temperatures even with higher voltage.
> 
> Idle temp on the MSI standard bios shows 51C.
> Idle temp on the strix1080xoc bios shows 38C.
> 
> Now I'm wondering why the temperatures are so different, and why the MSI bios has a capped voltage.


Hi, can you give some feedback on how the strix1080xoc bios works on your MSI Gaming X (I have one too)?
Did you gain or lose performance with the strix bios, and how high a stable core clock can you obtain?
Thanks.


----------



## KillerBee33

Unrelated....








Anyone have a broken or unused EVGA Hybrid kit lying around? I just want the bracket around the dial


----------



## L4TINO

Quote:


> Originally Posted by *achilles73*
> 
> Yes, that's true. The fans on the MSI Gaming X only start to spin after 60°C, so that's the reason for the different temps compared to the Strix card, whose fans always spin.


I was aware of the Gaming X only starting to spin at 60C, but yeah, that did go over my head with the strix bios; I had not checked the fan speeds, I was too concentrated on getting higher clocks.

Goes to show how quiet the fans are









I tested the MSI OC bios by turning the fans to 25% till it cooled down to around 35C. I then turned the fans off and it stayed at 35C at idle, didn't go back to the 50C... probably not relevant to anything, but I did test it lol

I will do more testing soon and will be back with further results. Thanks for the reminder about the fan speeds.


----------



## achilles73

Quote:


> Originally Posted by *L4TINO*
> 
> I was aware of the Gaming X only starting to spin at 60C, but yeah, that did go over my head with the strix bios; I had not checked the fan speeds, I was too concentrated on getting higher clocks.
> 
> Goes to show how quiet the fans are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tested the MSI OC bios by turning the fans to 25% till it cooled down to around 35C. I then turned the fans off and it stayed at 35C at idle, didn't go back to the 50C... probably not relevant to anything, but I did test it lol
> 
> I will do more testing soon and will be back with further results. Thanks for the reminder about the fan speeds.


Just flashed the strix modded bios on my MSI Gaming X, and I've been playing with the MSI AB "curve" overclock.
Just did some quick benches and gaming, and I can confirm that the higher core clocks I can get with 1.15V or 1.20V give worse performance than lower core clocks... it's strange... something is wrong with this strix modded bios... could it be related to power?
Another thing is that even at 1.2V core I can't reach 2200MHz.

For example, my msi gaming x:

original bios @ 2076 core is faster than strix bios @ 2152 core
(3dmark ultra, valley, rainbowsix siege, BF1)


----------



## L4TINO

OK, I just flashed back to strix1080xoc and checked the fans straight away; they are off


----------



## uberwootage

To me it feels like that strix bios is just giving you numbers. I don't think the clocks on the card are really what they are being set to. Some bios trickery. I mean, I wouldn't put it past Asus to do something like this. Like maybe there is a +100MHz or so offset they "calibrated" into it, so 2.1GHz is really 2GHz. This could be why the bios itself is total crap for stability vs stock bioses. I think it's one of those "Hey look, we're cool, our card clocks to 2.3GHz" things, but it loses to Founders Editions at 2GHz, so don't bench it, just look at the clocks. I bet they'd put 5.0 badges on their V6 Mustangs too.

Could I be wrong? Yeah, chances are, but there is something going on with that bios, and I do not think what gets reported in MSI AB are really the real clocks. Anyone have a Kill A Watt and want to do a test? Stock bios at a set clock, then the XOC at the same one, and see if they are pulling the same amount of power. If we get X watts with stock, then we'd get a few less with the XOC.
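For anyone who wants to run that comparison without a Kill A Watt, here's a minimal software-side sketch (assuming `nvidia-smi` is available; note it reports GPU board power, not wall power, so it won't match a wall meter exactly). The helper function and log file names are mine, purely for illustration:

```python
# Log board power once per second while a fixed-clock benchmark loops,
# once per bios, e.g.:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader -l 1 > stock.log
# then average each log and compare.

def average_power(log_lines):
    """Average a list of nvidia-smi power.draw samples like '178.32 W'."""
    watts = [float(line.strip().split()[0]) for line in log_lines if line.strip()]
    return sum(watts) / len(watts)

if __name__ == "__main__":
    # Sample data standing in for the contents of the two log files.
    stock = ["178.3 W", "181.0 W", "179.9 W"]
    xoc = ["170.1 W", "168.4 W", "169.7 W"]
    print(f"stock avg: {average_power(stock):.1f} W")
    print(f"delta vs XOC: {average_power(stock) - average_power(xoc):.1f} W")
```

If the XOC bios averages noticeably fewer watts at the "same" clock, that would support the idea that the reported clock isn't the real one.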


----------



## KillerBee33

Has anyone checked the in-game FPS differences from an OC, rather than just benchmark scores?


----------



## uberwootage

Quote:


> Originally Posted by *KillerBee33*
> 
> Has anyone checked the in-game FPS differences from an OC, rather than just benchmark scores?


Did some tests in the only game I play, World of Warcraft, using /timetest.

4K, everything maxed out, 8x MSAA: there is a drop. I need around 2175 on the core and +550 on the memory to get the same average FPS I get with an FE bios at 2065 with +550 on the memory. At the same clock with the XOC I lose FPS.

Based on a 10-run average; I'll do some more tests later. The FE bios I used was 86.04.11.00.0C, TDP set to 120, and the card is watercooled. From what I've been seeing it's looking like an 8-10% drop, but I'm only going off one game and we really need more data. That's what I'm seeing so far.

Going to get some food. I'll do some more testing when I get home


----------



## BigBeard86

This is strange. Could it be that the increased voltage is somehow affecting performance? Has anyone tried to use the stock voltage with the same clocks prior to flash, to see if there is any difference?

Could it be something with memory timings?


----------



## Spiriva

Has anyone who flashed the strix bios and says they get worse performance but higher clock speeds checked what happens if you don't put it at 1.20V but set it a little lower?
I read some users said that 1.150V was a good spot, and then just overclock a bit more than you were able to with your stock bios.

Example of what I mean: your stock bios does 2050MHz; then instead of going 1.200V and 2200MHz, put it at ~1.150V, just add 30-50MHz more, and do some benchmarks. Is it still worse performance than with stock?

What I'm thinking is that if you push the memory too high you will get worse performance; maybe the GPU is somehow the same.


----------



## Jquala

I have an EVGA FE 1080 with a waterblock, and I have an MSI Seahawk EK X 1080 coming tomorrow. Will I be able to use a terminal to connect the two? It seems the MSI card looks bigger, or does the waterblock all line up?


----------



## uberwootage

OK, so I did some tests. The strix bios sucks.

Strix bios vs Nvidia's FE (the 01 version); I'll test the 0C version tonight.

World of Warcraft running at 4K, all settings maxed out, 8x MSAA, 5-run average using the in-game benchmark. Same driver, and both cards were running at 1.05V. The GTX 1080 is watercooled; temps never went over 45C. The FE was set to 120% TDP in Afterburner.

XOC 2177MHz

137.468

XOC 2GHz

137.468

FE 2GHz

145.719

GPU-Z was running logs. Despite the overclock on the XOC I saw no improvement in FPS; the 5-run averages are spot on. FE at 2GHz: an 8.251 FPS increase clock for clock. That's a 6% increase.

Final thoughts: it looks like it's just tossing up numbers; there's no real increase with that bios whether I overclocked it or left it at 2GHz. The FE is another story; clock increases show gains in games. With the 177MHz overclock I should have seen an increase, but nothing. In fact the numbers are the same, and I did not expect that.

The FE bios is faster. 86.04.17.00.01 is the version I tested; I will test 86.04.11.00.0C later tonight, since that's the other Nvidia FE bios.

The XOC gives you some numbers to look at in Afterburner but offers you nothing in terms of performance. Clock for clock, 6% is a big gap, and the gap gets bigger if you clock the FE higher. I've seen gains of over 10% in early tests that I did.

So here are some real-world numbers. It backs up what everyone has been thinking: the bios offers less performance.
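For reference, the clock-for-clock comparison above is just this arithmetic (FPS numbers taken from the post; the helper function name is mine):

```python
def pct_gain(baseline_fps, test_fps):
    """Percent FPS gain of test over baseline."""
    return (test_fps - baseline_fps) / baseline_fps * 100

xoc_2ghz = 137.468  # XOC bios at 2.0GHz, 5-run average
fe_2ghz = 145.719   # FE bios at the same clock

if __name__ == "__main__":
    gain = pct_gain(xoc_2ghz, fe_2ghz)
    print(f"FE is {fe_2ghz - xoc_2ghz:.3f} FPS ({gain:.1f}%) faster clock for clock")
```

Same math for any pair of averaged runs, so anyone repeating the test in another game can report a comparable percentage.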


----------



## dentnu

Quote:


> Originally Posted by *BigBeard86*
> 
> This is strange. Could it be that the increased voltage is somehow affecting performance? Has anyone tried to use the stock voltage with the same clocks prior to flash, to see if there is any difference?
> 
> Could it be something with memory timings?


Yeah, I did, and got worse performance on the XOC bios.

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/2490#post_25333479


----------



## skline00

SiriusLeo, I've never seen a loop piped like that. Are you getting proper circulation through the cpu block and gpu block?


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Unrelated....
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone have a broken or unused EVGA Hybrid kit lying around? I just want the bracket around the dial


Can we order these from Evga? Let's see...


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Can we order these from Evga? Let's see...


Just the bracket? If you can find it for sale I will love you


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Just the bracket? If you can find it for sale I will love you


Hehe, I've been wanting just the bracket myself. Just emailed them. We shall see.

I'll update as soon as I know.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Hehe, I've been wanting just the bracket myself. Just emailed them. We shall see.
> 
> I'll update as soon as I know.


After seeing EVGA hasn't changed the AIO unit at all, I want to try a Corsair H90.
Not planning on using the EVGA shroud...


----------



## SiriusLeo

Quote:


> Originally Posted by *skline00*
> 
> SiriusLeo, I never saw a loop piped like that. Are you getting proper circulation from the cpu block and gpu block?


Yeah, it's a parallel configuration (passing through both blocks equally/simultaneously). Cooling is overall the same as serial (fully passing through one block before the other).

Average GPU temp is *37c* after 1 hour of full load, (overclocked to 2100mhz core with 1.081 volts) and stock Bios.
Average CPU temp under AIDA64 load is *56c*, (overclocked to 4.5ghz with 1.280 volts).
Average room temps *24c*.

I'm running a D5 pump at 70% through a single EK CoolStream CE 560mm radiator with 4 140mm EK Vardar fans set to 40% (about 1000rpm).

Here is some info about it.


----------



## scaramonga

Quote:


> Originally Posted by *KillerBee33*
> 
> Just the bracket? If you can find it for sale I will love you


Just order a new kit, remove bracket, then send kit back saying you ordered by mistake, profit!


----------



## KillerBee33

Quote:


> Originally Posted by *scaramonga*
> 
> Just order a new kit, remove bracket, then send kit back saying you ordered by mistake, profit!


Ehh, I can't do that... I do have one installed on a 980, but it's for sale on eBay


----------



## nizmoz

So what is the best 1080 card out there to get? I have my eyes on the EVGA 1080 FTW but I want to make sure my money is spent on the best.







Let me know!


----------



## uberwootage

Quote:


> Originally Posted by *nizmoz*
> 
> So what is the best 1080 card out there to get? I have my eyes on the EVGA 1080 FTW but I want to make sure my money is spent on the best.
> 
> 
> 
> 
> 
> 
> 
> Let me know!


Hard to say. Founders are clocking the highest, but for that you're going to need better cooling. The stock FE heatsink is OK: turn the fan to around 65%, and at stock or with a light OC the temps will be fine. At stock the fan is not too loud, better than the 980 Ti's at 75%; you start to hear it at 100%, but it is still not as loud as some of the older cards.

A single 8-pin is not affecting the clocks, so no need for the 8+6 or dual 8-pins. Here are my picks.

Founders Edition, if you plan on getting better cooling. From everything I've seen, these cards are clocking higher, but until we get an unlocked-TDP bios mod for them we won't really know their limit. The reason I list this first is that Nvidia said they use the highest-quality components in these. They won't say "binning", but that means using the higher-end GPUs. They might have legal issues, given their contracts with MSI, EVGA and the rest, if they said they bin the chips used in the partners' high-end cards. But they are binning as a matter of fact, because if a chip can't pass as a 1080 it gets cut down to a 1070, so they know what they have.

Galaxy HOF. Don't need to say anything about that card. I would have gotten one if I could get one here. They are beastly, but then again it's a Galaxy.

Zotac AMP Extreme.

MSI Gaming Z

Gigabyte Xtreme Gaming


----------



## boredgunner

Quote:


> Originally Posted by *nizmoz*
> 
> So what is the best 1080 card out there to get? I have my eyes on the EVGA 1080 FTW but I want to make sure my money is spent on the best.
> 
> 
> 
> 
> 
> 
> 
> Let me know!


Are you in North America, Europe, or elsewhere? This is a big factor as many cards aren't available in certain places.

If you're in the US and planning on using stock cooling, then it seems the best would be the Zotac AMP Extreme followed by the Gigabyte XTREME Gaming. Monstrous cooling and better typical overclocking than the others.

If you plan on water cooling, Founder's Edition is a good choice although maybe it's better to get the EVGA SC or Gigabyte G1 Gaming instead for the price? They are cheaper and use the same reference PCB, and as a bonus have better cooling in case you want to use the card for a little while without a water block.

From what I've seen EVGA FTW models overclock worse than all the others on average, although it looks the best.


----------



## nizmoz

Thanks for the detailed response. It looks like you didn't add the FTW to your list, so you think the second best to the Galaxy is the Zotac? I see they have a 5-year warranty on it. I've never had experience with their video cards though. The last MSI card I had, had terrible support.


----------



## nizmoz

Quote:


> Originally Posted by *boredgunner*
> 
> Are you in North America, Europe, or elsewhere? This is a big factor as many cards aren't available in certain places.
> 
> If you're in the US and planning on using stock cooling, then it seems the best would be the Zotac AMP Extreme followed by the Gigabyte XTREME Gaming. Monstrous cooling and better typical overclocking than the others.
> 
> If you plan on water cooling, Founder's Edition is a good choice although maybe it's better to get the EVGA SC or Gigabyte G1 Gaming instead for the price? They are cheaper and use the same reference PCB, and as a bonus have better cooling.
> 
> From what I've seen EVGA FTW models overclock worse than all the others on average, although it looks the best.


I am in the USA, Texas to be exact. It does seem Zotac may be the way to go, even though I have no experience with them.


----------



## fireyfire

Quote:


> Originally Posted by *uberwootage*
> 
> Hard to say. Founders are clocking the highest, but for that you're going to need better cooling. The stock FE heatsink is OK: turn the fan to around 65%, and at stock or with a light OC the temps will be fine. At stock the fan is not too loud, better than the 980 Ti's at 75%; you start to hear it at 100%, but it is still not as loud as some of the older cards.
> 
> A single 8-pin is not affecting the clocks, so no need for the 8+6 or dual 8-pins. Here are my picks.
> 
> Founders Edition, if you plan on getting better cooling. From everything I've seen, these cards are clocking higher, but until we get an unlocked-TDP bios mod for them we won't really know their limit. The reason I list this first is that Nvidia said they use the highest-quality components in these. They won't say "binning", but that means using the higher-end GPUs. They might have legal issues, given their contracts with MSI, EVGA and the rest, if they said they bin the chips used in the partners' high-end cards. But they are binning as a matter of fact, because if a chip can't pass as a 1080 it gets cut down to a 1070, so they know what they have.
> 
> Galaxy HOF. Don't need to say anything about that card. I would have gotten one if I could get one here. They are beastly, but then again it's a Galaxy.
> 
> Zotac AMP Extreme.
> 
> MSI Gaming Z
> Gigabyte Xtreme Gaming


I have a Zotac card with dual 8-pins and am seeing clocks over 2200MHz. I left the power limit and temperature limit at default, as raising them made no difference. I only see 60-80% power usage anyway (the card has a 230W TDP, so that is 138 to 184W). I can agree that the two 8-pins, or anything more than one 8-pin, probably does not make a difference. Most people with Zotac AMP/AMP Extreme cards seem to have higher overclocks.
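As a sanity check, the percent-of-TDP-to-watts conversion above is just (a tiny sketch; the function name is mine, the 230W figure is the board TDP from the post):

```python
def power_draw_watts(tdp_watts, power_pct):
    """Convert a monitoring tool's power-limit percentage to watts."""
    return tdp_watts * power_pct / 100

TDP = 230  # Zotac AMP Extreme board power, per the post above

if __name__ == "__main__":
    low = power_draw_watts(TDP, 60)
    high = power_draw_watts(TDP, 80)
    print(f"60-80% of {TDP}W TDP = {low:.0f}-{high:.0f}W")
```

The same conversion works for any card: plug in that card's TDP, since the percentage readout is always relative to the board's own limit.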


----------



## snafua

I've been reading a lot of tests where people flash the Strix bios and the high clocks don't deliver the performance. Are there any tests of the Strix card itself doing the same?

I assume it would, but it would be nice to know.
Sorry if I missed the comparison somewhere here.


----------



## nexxusty

Quote:


> Originally Posted by *scaramonga*
> 
> Just order a new kit, remove bracket, then send kit back saying you ordered by mistake, profit!


LOL. A little social engineering never hurt anyone...


----------



## Jquala

I'm kind of in a weird predicament and I really need the forum's help on this. I ordered an MSI Seahawk EK. I absolutely love the design of the waterblock. Currently I have a Founders Edition GTX 1080 with an EKWB waterblock, and it clocks to 2176. I assumed that when I plug those cards in the terminals would align; however, I did not notice how much bigger the block on the MSI card is. I seek the great wisdom of this forum on possible solutions, or whether anyone has tried a combination of two different waterblocks from the same generation and company.


----------



## pez

Quote:


> Originally Posted by *boredgunner*
> 
> Are you in North America, Europe, or elsewhere? This is a big factor as many cards aren't available in certain places.
> 
> If you're in the US and planning on using stock cooling, then it seems the best would be the Zotac AMP Extreme followed by the Gigabyte XTREME Gaming. Monstrous cooling and better typical overclocking than the others.
> 
> If you plan on water cooling, Founder's Edition is a good choice although maybe it's better to get the EVGA SC or Gigabyte G1 Gaming instead for the price? They are cheaper and use the same reference PCB, and as a bonus have better cooling in case you want to use the card for a little while without a water block.
> 
> From what I've seen EVGA FTW models overclock worse than all the others on average, although it looks the best.


Double checking, but I don't think the G1 is reference; it just happens to have 1x 8-pin.

Edit: http://www.techspot.com/review/1190-gigabyte-geforce-gtx-1080-g1-gaming/


----------



## Avant Garde

Just go with any of these non-reference GPUs. Seriously, they're all pretty bad overclockers, but they're all powerful out of the box; 2050-2100MHz or 2200MHz, the difference in raw performance is almost non-existent. Personally, I will get the EVGA FTW; it looks the best so far (although that illuminated logo could be smaller) and I don't have to use THE STICK to keep my GPU from breaking out of the GPU slot and falling...


----------



## yenclas

Hi,

I'm Spanish, sorry for my bad English.

Last week I received my Palit 1080 GameRock.

Good card, cool and quiet.

I'm thinking of replacing the thermal paste or installing a G10 bracket with a Corsair AIO, but the card has a warranty sticker and I don't know if removing it voids the warranty. Anyone know?

The card is stable at 2100MHz with +100 vcore and +450MHz on the mem (120% TDP limit), but with the fan set at 80% (a lot of noise).

If I leave the fan on auto, the card throttles too much.

How can I change the light color?

Thank you very much


----------



## ChevChelios

How far can the 1080's memory go before it starts giving worse performance? Is 11000 the limit, or lower than 11000?
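(Side note for anyone comparing the numbers quoted in this thread: the "11000" figures are the effective data rate, not the clock GPU-Z reports. A rough sketch of the bookkeeping — the base figures are my assumptions for a reference card, not official values:)

```python
# GDDR5X bookkeeping (community convention, not an official formula).
# GPU-Z reports the command clock (~1251 MHz on a reference GTX 1080);
# Afterburner works on the double-rate figure (~5005 MHz), and the
# "effective" number people quote here is that figure doubled again.

BASE_DDR_MHZ = 5005  # assumed Afterburner base reading for a reference 1080

def effective_rate(offset_mhz, base_ddr=BASE_DDR_MHZ):
    """Afterburner memory offset -> the effective MT/s figure quoted on forums."""
    return 2 * (base_ddr + offset_mhz)

stock = effective_rate(0)      # 10010, i.e. the "10000 effective" stock spec
oced = effective_rate(575)     # 11160, for a +575 Afterburner offset
```

So offsets in the +500 range are what produce the "11,000+" memory numbers people report.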


----------



## moustang

Quote:


> Originally Posted by *boredgunner*
> 
> Are you in North America, Europe, or elsewhere? This is a big factor as many cards aren't available in certain places.
> 
> If you're in the US and planning on using stock cooling, then it seems the best would be the Zotac AMP Extreme followed by the Gigabyte XTREME Gaming. Monstrous cooling and better typical overclocking than the others.


Define "better typical overclocking".

And where exactly are you getting these numbers?

I'm asking because from user comments here and elsewhere hitting 2100+ core and 11,000+ memory is the norm with the MSI Gaming X. I am personally running 2113mhz core and 11,108 memory on my Gaming X, and that's not pushed to the limits. I've run the core as high as 2156mhz without any problems, but keep it clocked lower for my 24/7 use. And it stays cool enough to never throttle at those speeds either.

How much higher are the Zotac and Gigabyte cards getting?


----------



## BrainSplatter

Quote:


> Originally Posted by *yenclas*
> 
> but the card have a Warranty stick that i don't know if I remove it void warranty. Anyone know this ?


Yes, it will void the warranty. Only a few companies allow hardware modifications within the warranty; one of them is EVGA.


----------



## Jorginto

Guys, I need a piece of advice. Are the Palit JetStream and the Gainward Phoenix practically the same card? (Same goes for the GameRock vs. the Phoenix GLH.)


----------



## BrainSplatter

Yes, I think so. Gainward belongs to Palit. The fans may be different (the fan blades look different, I think) but the cooler seems to be the same otherwise.


----------



## KickAssCop

Which is the best 1080 GTX in terms of overclockability, fan noise and temperatures?


----------



## hemon

Quote:


> Originally Posted by *KickAssCop*
> 
> Which is the best 1080 GTX in terms of overclockability, fan noise and temperatures?


If you ask me: the MSI Gaming X, definitely!


----------



## Avant Garde

They are all a few % apart from each other. Just go for the most price competitive CUSTOM PCB one.


----------



## KillerBee33

Quote:


> Originally Posted by *hemon*
> 
> According to me: MSI Gaming X, definitely!


MSI is known to lock voltage on their GPUs, so I wouldn't say MSI is best for OC.

If you can get your hands on an NVIDIA card, I'd say you're lucky; I had three of them, great overclockers.


----------



## Spiriva

Quote:


> Originally Posted by *KickAssCop*
> 
> Which is the best 1080 GTX in terms of overclockability, fan noise and temperatures?


I think you just have to play the lottery; both of my EVGA FE 1080s overclock to 2200 MHz on the stock BIOS (EK water block), and yet I've read about EVGA FE 1080s that won't get over 2025 MHz.


----------



## boredgunner

Quote:


> Originally Posted by *moustang*
> 
> Define "better typical overclocking".
> 
> And where exactly are you getting these numbers?
> 
> I'm asking because from user comments here and elsewhere hitting 2100+ core and 11,000+ memory is the norm with the MSI Gaming X. I am personally running 2113mhz core and 11,108 memory on my Gaming X, and that's not pushed to the limits. I've run the core as high as 2156mhz without any problems, but keep it clocked lower for my 24/7 use. And it stays cool enough to never throttle at those speeds either.
> 
> How much higher are the Zotac and Gigabyte cards getting?


I'm basing it on verified user reviews on newegg and amazon, and what users are saying here on OCN and also on Reddit. I've seen users claim Zotac AMP Extreme models and to a lesser extent Gigabyte XTREME Gaming reach 2200 MHz or close to it quite often. MSI is good too though on average, with > 2100 MHz being common.
Quote:


> Originally Posted by *KickAssCop*
> 
> Which is the best 1080 GTX in terms of overclockability, fan noise and temperatures?


Overclockability is somewhat of a lottery but if you're in the US, you might want to prioritize the Zotac AMP Extreme, followed by Gigabyte XTREME Gaming and MSI GAMING X.

As for noise, the three I just mentioned along with the EVGA FTW, MSI ARMOR, and ASUS ROG Strix all seem to be very quiet, perhaps on roughly the same level.

Now for temperatures, I would bet on the Zotac AMP Extreme being the best, followed closely by the Gigabyte XTREME Gaming and MSI GAMING X. EVGA FTW runs very cool too. Temps should never be an issue for these.


----------



## nexxusty

Quote:


> Originally Posted by *Spiriva*
> 
> I think you just have to play the lottery, both my Evga FE 1080 overclock to 2200mhz with the stock bios (ek waterblock) and then Ive read about Evga FE 1080´s who wount get over 2025mhz.


Not one of those 1080s had a water block on it, then..... Hehe.

Seems to me the caveat is cooling. The LEAST a 1080 should do on a water block is 2100 MHz.


----------



## boredgunner

Quote:


> Originally Posted by *nexxusty*
> 
> Not one of those 1080's had a waterblock on them then..... Hehe.
> 
> Seems to me the caveat is cooling. The LEAST a 1080 should do on a waterblock is 2100mhz.


For sure. Temperatures are holding me back with the MSI ARMOR, it gets really hot when the voltage nears 1.1v and throttles. Tiny little heatsink.


----------



## skline00

SiriusLeo: Thank you for the explanation on parallel water cooling. I had forgotten about that.


----------



## nexxusty

Quote:


> Originally Posted by *boredgunner*
> 
> For sure. Temperatures are holding me back with the MSI ARMOR, it gets really hot when the voltage nears 1.1v and throttles. Tiny little heatsink.


Not surprised.

What I'm seeing is people either forgetting that a cooler VRM has a higher conversion efficiency... or they just plain didn't know.

This combined with lower core temps equals some nice gains.


----------



## Phinix

Hi all, I have been following and reading this thread for a while and now I need advice.

Originally I was waiting to get the Gigabyte 1080 Xtreme but then they announced the WaterForce variant.

Since it seems that cooling is the main difference between all stock bios 1080s, I'm not sure if I should get the Gigabyte 1080 Xtreme WaterForce or just buy a reference EVGA 1080 and get a G10 bracket and Corsair H110.

The EVGA route ends up being cheaper by around 150 USD - when converted - in my country.

Thoughts?


----------



## nexxusty

Quote:


> Originally Posted by *Phinix*
> 
> Hi all, I have been following and reading this thread for a while and now I need advice.
> 
> Originally I was waiting to get the Gigabyte 1080 Xtreme but then they announced the WaterForce variant.
> 
> Since it seems that cooling is the main difference between all stock bios 1080s, I'm not sure if I should get the Gigabyte 1080 Xtreme WaterForce or just buy a reference EVGA 1080 and get a G10 bracket and Corsair H110.
> 
> The EVGA route ends up being cheaper by around 150 USD - when converted - in my country.
> 
> Thoughts?


The G10 doesn't cool the card well enough without heatsinks on the VRM. Pfft, and the RAM too.


----------



## grimboso

Anyone heard anything about FTW waterblocks or a release date for the classy? Tried google but couldn't find any good info.


----------



## Crono180

I validated my card with gpu-z but when I fill out the member form it says incorrect validation. Anyone can help me?


----------



## Jpmboy

Quote:


> Originally Posted by *snafua*
> 
> I've been reading a lot of tests with flashing the Strix bios and high clocks not pushing out the performance. Are there any tests on the Strix itself doing the same?
> 
> I assume it would be but would be nice to know.
> Sorry if I missed the comparison somewhere here.


the Strix PCB has different components (power section chokes, etc.) than the MSI or FE models... and different memory timings too. Several users posted about lower performance with higher clocks when the Strix BIOS was flashed to a different PCB some weeks ago. Luckily no one cooked a component (different I2C commands control these different VRMs/chokes). I suspect there is a lot of error correction at the higher clocks in these cross-flashed examples. Sometimes a cross flash can really work well (the 980 Strix BIOS from Shammy on the 980 Kingpin was one - nicknamed the KingStrix at the time)... probably more blind luck than anything.


----------



## TK421

Quote:


> Originally Posted by *Jpmboy*
> 
> the Strix PCB has different components (power section chokes, etc.) than the MSI or FE models... and different memory timings too. Several users posted about lower performance with higher clocks when the Strix BIOS was flashed to a different PCB some weeks ago. Luckily no one cooked a component (different I2C commands control these different VRMs/chokes). I suspect there is a lot of error correction at the higher clocks in these cross-flashed examples. Sometimes a cross flash can really work well (the 980 Strix BIOS from Shammy on the 980 Kingpin was one - nicknamed the KingStrix at the time)... probably more blind luck than anything.


have you figured out a way to unlock the pwr limit beside shorting the resistors?

maybe a pencil mod for memory voltage would be nice too


----------



## nyk20z3

Whoever is in need of a 1080: they have one on clearance for around $600 at Micro Center in Yonkers, NY, lol.

I saw it today when I went in; it's open box, of course.


----------



## nexxusty

Quote:


> Originally Posted by *nyk20z3*
> 
> Whoever is in need of a 1080: they have one on clearance for around $600 at Micro Center in Yonkers, NY, lol.
> 
> I saw it today when I went in; it's open box, of course.


Micro Center.... hate you.


----------



## WolfenWind

Not sure if this is the place to post this but if anyone here plays Overwatch what are your settings?

I can't seem to get more than 70fps at 2560x1440 at Ultra with a 1080 strix at the moment.


----------



## Crazy9000

Quote:


> Originally Posted by *WolfenWind*
> 
> Not sure if this is the place to post this but if anyone here plays Overwatch what are your settings?
> 
> I can't seem to get more than 70fps at 2560x1440 at Ultra with a 1080 strix at the moment.


Sounds like you have the frame rate limiter on. Turn it to disabled in video settings, as well as vsync.


----------



## ChevChelios

Quote:


> Originally Posted by *WolfenWind*
> 
> Not sure if this is the place to post this but if anyone here plays Overwatch what are your settings?
> 
> I can't seem to get more than 70fps at 2560x1440 at Ultra with a 1080 strix at the moment.


I get ~200-250 fps (well, 200+ at least) @ 1080p on full Epic settings.


----------



## alawadhi3000

Quote:


> Originally Posted by *grimboso*
> 
> Anyone heard anything about FTW waterblocks or a release date for the classy? Tried google but couldn't find any good info.


As per EKWB, the FTW block will be released in late July, so it's around two weeks away.


----------



## auraofjason

Quote:


> Originally Posted by *WolfenWind*
> 
> Not sure if this is the place to post this but if anyone here plays Overwatch what are your settings?
> 
> I can't seem to get more than 70fps at 2560x1440 at Ultra with a 1080 strix at the moment.


Is your render scale at 100%?


----------



## WolfenWind

Yeah, it is.

I don't remember setting an FPS limiter option, so I'll definitely have to check that out when I get back. It sort of makes sense, because I have been messing with my settings and it seems really stable/stuck at 69/70 fps.


----------



## Crazy9000

Quote:


> Originally Posted by *WolfenWind*
> 
> Yeah it is.
> 
> I don't remember setting a fps limiter option so I'll definitely have to check that out when i get back. It sorta makes sense because I have been messing with my settings and it seems really stable/stuck at 69/70 fps.


I'm fairly sure it's on by default; I had the exact same issue when I was trying to get the frame rate up for my new 144 Hz monitor.


----------



## dante`afk

finally in da club. daaam 100% asic?









http://abload.de/image.php?img=captureisrpo.png http://abload.de/image.php?img=20160711_1804266lrav.jpg


----------



## Vellinious

Quote:


> Originally Posted by *dante`afk*
> 
> finally in da club. daaam 100% asic?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://abload.de/image.php?img=captureisrpo.png http://abload.de/image.php?img=20160711_1804266lrav.jpg


ASIC quality isn't supported correctly yet in GPUz for Pascal. The readings are incorrect.


----------



## jase78

Quote:


> Originally Posted by *WolfenWind*
> 
> Yeah it is.
> 
> I don't remember setting a fps limiter option so I'll definitely have to check that out when i get back. It sorta makes sense because I have been messing with my settings and it seems really stable/stuck at 69/70 fps.


It 100 percent has a limit on it until you change it. Guaranteed you can hit 144. I have a 1080 Strix with everything all the way up on Ultra and it stays pegged at 140-144.


----------



## dante`afk

Can I use DVI to Mini DisplayPort, since the FTW has only 1x DVI and both of my screens are DVI only?

So would it work to connect 1x DVI and 1x DisplayPort on the same card?

I'd think female DVI to male DisplayPort?


----------



## nexxusty

Quote:


> Originally Posted by *dante`afk*
> 
> can I use dvi to mini displayport since the FTW has only 1xdvi and both of my screens are only dvi ?
> 
> so would it work to connect 1x dvi and 1x displayport on the same card?
> 
> I'd think female DVI to male displayport?


Yup. Don't use the DisplayPort though; use an HDMI to DVI adapter instead.

Cheaper, and it's the same thing for less.


----------



## Atomicbomb22

I have a brand new GTX 1080 that my friend is giving me tomorrow for a balance he owes me. Not sure if it'll be an EVGA FE or MSI Aero.

PM me if you want it. It'll be brand new, factory sealed. $650. I'll do PayPal if you want that buyer protection, since I'm new here.

And it seems a crippled BIOS is what's dragging the overclockability into the gutter. I think NVIDIA has too much of a lead over AMD, and they have a target maximum generational gain that they accidentally or cost-effectively surpassed...

The target is probably 90%. You never want to overdo it if you're in the business of providing regular gains for shareholders.

I'm thinking if someone unlocks the power of these in the BIOS, we'll have so many tick-tock barriers of cooling and power draw to break through.

This has never happened before, where everyone is so puzzled over these cards' full potential. They're gimped. All the evidence is there. I'm firmly against anyone who says it's the smaller-nanometer process that puts us very close to max potential out of the box. No way that's true; there are other things at work.

Source: Myself/MBA and BS in Finance

Sent from my SM-G935T using Tapatalk


----------



## nexxusty

Quote:


> Originally Posted by *Atomicbomb22*
> 
> I have a 1080 gtx brand new that my friend is giving me tomorrow for a balance he owes me. Not sure if it'll be EVGA FE or MSI Aero.
> 
> Pm me if you want it. It'll be brand new factory sealed. $650. I'll do paypal if u want that buyer protection since I'm new here.
> 
> And it seems crippled bios is what's bringing the overclockability to the gutter. I think Nvdia has too much of a lead against AMD and they have a target maximum generational gain they accidentally or cost-effectively surpassed...
> 
> The target is probably 90%. You never want to overdo it if you're in the business of providing regular gains for shareholders.
> 
> I'm thinking if someone unlocks the power of these in the bios, we'll have so many tick-tock barriers of cooling and power-draw to break through.
> 
> This has never happened before where everyone is so puzzled over these cards full potential. They're gimped. All the evidence is there. I'm firmly against anyone who says it's the smaller nanometer process that's making us very close to max potential out of the box. Not way that's true, there's other things at work.
> 
> Source: Myself/MBA and BS in Finance
> 
> Sent from my SM-G935T using Tapatalk


Only n00bs are puzzled.... TiN overclocked the crap out of an EVGA FE. Remove the power limit via a hardware mod, cool it well, and they'll clock to 2200-2300 easily.


----------



## Atomicbomb22

^Just confirmed. It's an EVGA GTX 1080 FE, still shrink-wrapped. $650 plus 15 bucks for shipping anywhere in the continental US. International shipping is possible. PM me. I'll have pics tomorrow.

Sent from my SM-G935T using Tapatalk


----------



## xer0h0ur

So whoever told me I can't use the FE backplate with the EKWB was wrong. It's brutally easy; just reuse most of the hex screws that were holding the PCB to the air cooler. I installed the card into my loop and everything is working great. Going from a water-blocked 295X2 + 290X to this may have left behind some multi-GPU power, but it also left the Crossfire profile nightmare in the past. I am going to stick to single powerful GPU setups from now on, regardless of whether they're from AMD or NVIDIA.


----------



## nexxusty

Quote:


> Originally Posted by *xer0h0ur*
> 
> So whomever told me I can't use the FE backplate with the EKWB was wrong. Its brutally easy to do so by reusing most of the hex screws that were holding the PCB to the air cooler. I installed the card into my loop and everything is working great. Going from waterblocked 295X2 + 290X to this may have left behind some multi-gpu power but it also left the Crossfire profile nightmare in the past. I am going to be sticking to single powerful GPU setups from now on. Regardless of if its from AMD or Nvidia.


LOL yeah someone posted after our responses that the stock backplate fits with EK blocks.

First time....


----------



## stoker

Quote:


> Originally Posted by *xer0h0ur*
> 
> So whomever told me I can't use the FE backplate with the EKWB was wrong. Its brutally easy to do so by reusing most of the hex screws that were holding the PCB to the air cooler. I installed the card into my loop and everything is working great. Going from waterblocked 295X2 + 290X to this may have left behind some multi-gpu power but it also left the Crossfire profile nightmare in the past. I am going to be sticking to single powerful GPU setups from now on. Regardless of if its from AMD or Nvidia.


Nice 

Care to share some pics?


----------



## xer0h0ur

Sure, lemme upload em to my webspace


----------



## xer0h0ur




----------



## jtom320

That's kind of cool. I'm glad it works.

I'm pretty sure EK specifically states that the reference backplate doesn't work, which I think is where that came from. Good to know it does.


----------



## xer0h0ur

Yeah, it's not like I wanted it that way. When I ordered the EKWB block, the retailer didn't have backplates in stock. This is a compromise till I get one.


----------



## WolfenWind

Quote:


> Originally Posted by *Crazy9000*
> 
> I'm fairly sure it's on by default, I had the exact same issue when I was trying to get it up for my new 144hz monitor.


I think the issue is Overwatch's FPS limit setting; I had it set to display-based rather than off. Weird, I assumed that setting it to display-based would allow the G-Sync functionality to work. Now it's higher, but I'm confused, ha!


----------



## KickAssCop

When is the Classified out?


----------



## nexxusty

Quote:


> Originally Posted by *WolfenWind*
> 
> I think the issue is on overwatch's limit fps setting I had it set to display based rather than off. Weird I assumed that setting it to display based would allow the G-Sync functionality to work. Now it's higher, but I'm confused ha!


The game doesn't know you are running G-Sync. Frame limiters should always be off when running G-sync.


----------



## x7007

Quote:


> Originally Posted by *xer0h0ur*


What is that big thing with the fans clamped to the back of the case? Is it custom made?


----------



## versions

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah its not like I wanted it that way. When I ordered the EKWB the retailer didn't have backplates in stock. This is a compromise till I get one.


Not sure how relevant this is for you as I'm Swedish, but when I ordered my water block and backplate from EKWB, neither was in stock, though the site said it would not cause major shipping delays. I ordered on a Thursday with express shipping and had them home by Monday morning. Perhaps it'll be similar for the US EK store, if they have one of those.


----------



## pez

Quote:


> Originally Posted by *WolfenWind*
> 
> Not sure if this is the place to post this but if anyone here plays Overwatch what are your settings?
> 
> I can't seem to get more than 70fps at 2560x1440 at Ultra with a 1080 strix at the moment.


Quote:


> Originally Posted by *auraofjason*
> 
> Is your render scale at 100%?


Quote:


> Originally Posted by *WolfenWind*
> 
> Yeah it is.
> 
> I don't remember setting a fps limiter option so I'll definitely have to check that out when i get back. It sorta makes sense because I have been messing with my settings and it seems really stable/stuck at 69/70 fps.


Quote:


> Originally Posted by *WolfenWind*
> 
> I think the issue is on overwatch's limit fps setting I had it set to display based rather than off. Weird I assumed that setting it to display based would allow the G-Sync functionality to work. Now it's higher, but I'm confused ha!


Yep, glad you got that worked out. I have mine set this way and it actually runs really smoothly since I don't have g-sync. No input lag between it on and off for me in the game and it's super smooth, so I'm very happy.


----------



## ChevChelios

http://www.overclock.net/t/1605618/nvidia-gtx-1080-and-1070-suffering-from-extraordinarily-high-dpc-latency-issues-and-stuttering

So do any of you guys get this?

I don't mean latency readings, but problems in actual usage, i.e. lag or stutter in games, etc.

I do get higher latency at idle (240 MHz clock), but it doesn't affect anything or any game so far.


----------



## ChevChelios

And yes, for Overwatch you should enable the in-game FPS limiter (while keeping V-Sync off) if you don't have G-Sync, and disable it (and use G-Sync) if you do.

If you don't have G-Sync, another option is to try Fast Sync via the NVCP. Some people said Fast Sync gave them stutter in BF4, but I tried it in OW and didn't see any problems.


----------



## pez

I've noticed various differences in each game I've tried. FastSync did something quite terrible with Crysis (original) but I think it heavily relies on 60+ FPS at all times. It was a quick test and some settings need to be tweaked, but I'll have to give some more testing a go soon. I'm definitely curious to see what kinda results I'll get with x8/x8 PCI-e 3.0 for SLI.


----------



## ChevChelios

Fast Sync absolutely needs/prefers that you are above your max refresh rate at all times, so your minimum fps stays above 60/100/144/165.

That's when it's supposed to work best and provide a stutter-free, tear-free experience when your fps is overkill.

Although, frankly, at 144+ fps on 144 Hz tearing is almost invisible even with V-Sync off, so IMO it's fine to just leave V-Sync off then.


----------



## WolfenWind

Quote:


> Originally Posted by *ChevChelios*
> 
> http://www.overclock.net/t/1605618/nvidia-gtx-1080-and-1070-suffering-from-extraordinarily-high-dpc-latency-issues-and-stuttering
> 
> so do any of you guys get this ?
> 
> I dont mean latency readings, but problems in actual usage - i.e. lag or stutter in games etc. ?
> 
> I do get higher latency in idle (240 MHz clock), but it doesnt effect anything or any game so far


No lag or stutter for me.


----------



## xer0h0ur

Quote:


> Originally Posted by *x7007*
> 
> What is that big thing behind with the fans that clawed to the case ? is it custom made ?


LOL, that is an Alphacool NexXxoS Monsta 360mm radiator. This mATX case doesn't have space inside for radiators other than 120s, so I bought a Koolance external case mount and a PCI pass-through along with that radiator to cool my loop. It's overkill now, but back when a 295X2 and a 290X were dumping tons of heat into the loop, I needed it.


----------



## Jpmboy

Quote:


> Originally Posted by *TK421*
> 
> have you figured out a way to unlock the pwr limit beside shorting the resistors?
> 
> maybe a pencil mod for memory voltage would be nice too


no, sorry. The only way to increase the PL is by altering the resistance of the resistors you already know about (CLU or a solder pen). I don't have any reason to think a small increase in vdimm (Fvdd) is gonna help the already tight timings on these stacked GDDR5 configs... 1.7+ V, maybe.
Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, that is an Alphacool NexXxos Monsta 360mm radiator. This mATX case doesn't have the space inside for radiators other than 120's so I bought a Koolance external case mount and a PCI pass thru along with that radiator to cool my loop. Its overkill now but back when it was a 295X2 and 290X dumping tons of heat into the loop, I needed it.


lol - big difference between the 1080 and a fire breathing 295x2 (tho I'm still running mine - amazing how long the 295x2 has remained a relevant single slot solution!).


----------



## xer0h0ur

The reason I ditched the 295X2 was how much heat the card blasted into my loop, coupled with the fact that the GTX 1080 already equaled or surpassed its performance without needing any Crossfire profiles. So now, even in games where I couldn't Crossfire, I am getting Crossfired-295X2 performance or better. That works for me. I basically traded 3DMark tri-fire e-peen for real-world performance and zero multi-GPU headaches. I had a GTX 690, then a 295X2; neither solution was, in the end, what it was cracked up to be.

I was playing last night @ 2088 MHz on the GPU and the temp was barely cracking 41C.

So far I have only pushed it as far as:

Core Voltage : +100%
Power Limit : 120 %
Core Clock : +225
Memory Clock : +575

Benched Firestrike 5 times, played Dying Light for an hour and CSGO for an hour without issue. I am sure there is still some more performance to squeeze out of it.
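(A side note for anyone reproducing settings like these: the offsets above are Afterburner fields, but you can watch the resulting clocks, temperature, and power draw from a terminal with `nvidia-smi`, which ships with the driver — a monitoring sketch, not part of the original post:)

```shell
# Log SM clock, memory clock, GPU temperature and board power once per
# second while a benchmark runs; stop with Ctrl+C.
nvidia-smi --query-gpu=clocks.sm,clocks.mem,temperature.gpu,power.draw \
           --format=csv -l 1
```

Handy for spotting whether a run that "passed" was actually throttling partway through.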


----------



## Rivered082

I was playing around with the voltage curve in Precision XOC for my FTW, and I found something interesting. I can set the curve to manual, select only one voltage point to offset, and hit over 2.2 GHz stable. I chose just 1.05 V and was able to push it +200 MHz stable, which gets me to 2202 MHz. I also tested all of the other points and was able to hit 2226 MHz stable at 1.093 V with the voltage offset at 100%.

However, whenever I start adding offsets to the other voltage points, it becomes unstable. Why would a single offset for a single voltage be much more stable at a much higher offset than when the other voltage offsets are added in? For stability I have to keep everything below 1.05 V at +75, 1.05 V and 1.062 V at +100, and 1.075 V at +125.
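One plausible explanation (my illustration, not confirmed Precision XOC behavior): GPU Boost 3.0's voltage/frequency curve is kept non-decreasing, so raising a single point drags every later point up to at least that frequency, flattening the top of the curve, whereas offsetting every point forces the card through the shakier intermediate steps. A toy sketch with made-up frequencies:

```python
# Toy model of a monotonic V/F curve editor. The five voltage points match
# the ones discussed above; the base frequencies are invented for illustration.

def apply_offsets(curve, offsets):
    """curve: list of (voltage, mhz) sorted by voltage.
    offsets: dict voltage -> MHz offset. Returns the monotonically clamped curve."""
    result = []
    floor = 0  # highest frequency so far; the curve may never dip below it
    for volt, mhz in curve:
        target = mhz + offsets.get(volt, 0)
        floor = max(floor, target)
        result.append((volt, floor))
    return result

base = [(1.050, 2002), (1.062, 2012), (1.075, 2025), (1.081, 2037), (1.093, 2050)]

# Offsetting only the 1.050 V point flattens the whole tail at 2202 MHz:
single = apply_offsets(base, {1.050: 200})
# Offsetting every point keeps the curve rising through the middle steps:
spread = apply_offsets(base, {v: 150 for v, _ in base})
```

In the `single` case the card would sit at one voltage/frequency pair; in the `spread` case it still transitions through each intermediate point, which is where the instability could come from.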


----------



## xer0h0ur

I haven't tried using Precision yet. Only have been using Afterburner Beta 4.3.0.


----------



## nexxusty

Quote:


> Originally Posted by *ChevChelios*
> 
> FastSync absolutely needs/prefers that you are above your max refresh rate at all times, so your min fps is above 60/100/144/165
> 
> thats when its supposed to work best and provide stutter, tear free experience when your fps is overkill
> 
> although frankly at 144+ fps on 144 Hz tearing is almost non-visible even with Vsync Off, so IMO its ok to just leave Vsync Off then


Not true when G-Sync and Fast Sync are enabled.

G-sync handles 0-144fps and fast sync takes over past your monitors refresh. Fast sync is sick.

I suspect people having issues with it have a crap monitor.


----------



## x7007

Quote:


> Originally Posted by *WolfenWind*
> 
> No lag or stutter for me.


Did you even check LatencyMon? It's not about stutter or lag; the program shows that something is wrong. Run it for 30 minutes, then post back screenshots.

Quote:


> Originally Posted by *nexxusty*
> 
> Not true when G-Sync and Fast Sync are enabled.
> 
> G-sync handles 0-144fps and fast sync takes over past your monitors refresh. Fast sync is sick.
> 
> I suspect people having issues with it have a crap monitor.


G-Sync already turns V-Sync on automatically, so using G-Sync caps your fps.
Using G-Sync + Fast Sync lets the fps go above your monitor's refresh rate.

The big question is: can we use Fast Sync with a 60 Hz monitor, or should it only be used with 100 Hz monitors and above?

I have a Philips 7007 TV, so from a technical point of view I am not sure yet whether I can use it.


----------



## nexxusty

Quote:


> Originally Posted by *x7007*
> 
> Did you even check Latencymon ? it's not stutter or lag, the program shows that something is wrong. Run for 30 min then post back screenshots.
> Gsync already enabling Vsync ON automatically so using Gsync is capping your fps.
> Using Gsync + Fast Sync let the fps go above your monitor XX Hz


G-sync absolutely does NOT automatically enable Vsync. What in the world are you talking about?

As for your second statement. That is exactly what I said beforehand....


----------



## boredgunner

Quote:


> Originally Posted by *nexxusty*
> 
> G-sync absolutely does NOT automatically enable Vsync. What in the world are you talking about?
> 
> As for your second statement. That is exactly what I said beforehand....


Enabling G-SYNC does set V-Sync to Force On in NVIDIA Control Panel, although V-Sync will only kick in when your frame rate reaches your refresh rate.
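The interaction described above can be summarized as a toy decision rule (my simplification of the thread's statements, not driver code):

```python
# Which sync mechanism governs at a given frame rate, per the behavior
# described in this thread: with G-Sync on, the driver forces V-Sync on,
# but V-Sync only engages once the frame rate reaches the refresh rate;
# below that, G-Sync's variable refresh governs. Fast Sync, if selected,
# discards excess frames above refresh instead of stalling the GPU.

def governing_sync(fps, refresh_hz, gsync=True, fast_sync=False):
    if gsync and fps < refresh_hz:
        return "g-sync"      # variable refresh tracks the frame rate
    if fast_sync:
        return "fast sync"   # excess frames dropped, no back-pressure
    return "v-sync"          # classic sync paces the GPU at the refresh rate
```

So "V-Sync: Force On" in the control panel and "G-Sync takes over" are both right; they just apply on opposite sides of the refresh rate.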


----------



## nexxusty

Quote:


> Originally Posted by *boredgunner*
> 
> Enabling G-SYNC does set V-Sync to Force On in NVIDIA Control Panel, although V-Sync will only kick in when your frame rate reaches your refresh rate.


It's not enabled. G-sync takes over.

Ugh.... this has been covered before people.

I'm also totally the type of person that both doesn't know what he's talking about and talks crap... lol.

Just trust me on this....


----------



## x7007

Quote:


> Originally Posted by *nexxusty*
> 
> G-sync absolutely does NOT automatically enable Vsync. What in the world are you talking about?
> 
> As for your second statement. That is exactly what I said beforehand....


It does. My laptop (ASUS G751JT) is G-Sync enabled, and if I enable G-Sync it turns V-Sync on.

V-Sync is also turned on when you enable NVIDIA 3DTV.

http://nvidia.custhelp.com/app/answers/detail/a_id/2342

http://forums.guru3d.com/showthread.php?t=399791

Manual on Geforce forums

https://forums.geforce.com/default/topic/489962/3d-vision/3d-and-vsync/post/3511551/#3511551

"Hi

Our driver overrides the in-game V-Sync settings and always forces V-Sync on. It does not matter what the game's menu says."

Quote:


> Originally Posted by *boredgunner*
> 
> Enabling G-SYNC does set V-Sync to Force On in NVIDIA Control Panel, although V-Sync will only kick in when your frame rate reaches your refresh rate.


That's what I meant; it is forcing it on. With the 353.xx drivers they said to set V-Sync to off, but they changed it in the newer drivers. As boredgunner said, V-Sync won't actually engage as long as you don't pass the monitor's refresh rate.


----------



## nexxusty

Quote:


> Originally Posted by *x7007*
> 
> It does, only my laptop Gsync enabled Asus G751JT , if I enables GSYNC it enabled Vsync ON .
> 
> Vsync will be turned ON when you enable Nvidia 3DTV too.
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/2342
> 
> http://forums.guru3d.com/showthread.php?t=399791
> 
> Manual on Geforce forums
> 
> https://forums.geforce.com/default/topic/489962/3d-vision/3d-and-vsync/post/3511551/#3511551
> 
> "*Hi
> 
> Our driver forces overrides Vsync in-game settings and always forced Vsync on. It does not matter what the game's menu says.*"


That's when you select vsync and DON'T have G-sync enabled.

Again... G-sync takes over all sync functions...


----------



## x7007

Quote:


> Originally Posted by *nexxusty*
> 
> That's when you select vsync and DON'T have G-sync enabled.
> 
> Again... G-sync takes over all sync functions...


Again, we're talking about something NVIDIA did and later changed; we're saying the same thing.

I just said V-Sync is globally set to ON when you enable G-SYNC or 3DTV, that's all. Whether it actually engages is a technical detail, and you already know how that works.


----------



## nexxusty

Quote:


> Originally Posted by *x7007*
> 
> again, we are talking about something nvidia did and change, we are talking about the same thing.
> 
> I just said Vsync is Globally enabled to ON when you enable Gsync or 3DTV .. that's all . it does that, working or not on the technical of it as you already know how it works.


Alright, I wasn't sure if there was some confusion.

Yes, it does get set to "ON" when enabling G-sync. It just doesn't really act like V-Sync. Hehe.


----------



## Rivered082

Quote:


> Originally Posted by *xer0h0ur*
> 
> I haven't tried using Precision yet. Only have been using Afterburner Beta 4.3.0.


I used all of them so far. Afterburner and XOC are the most stable for me. I like XOC because of the scan tool for EVGA cards. The thing that I don't get is why offsetting a single voltage point allows for a very high overclock, but when I add offsets to the other points it becomes unstable. 1.05v can hit 2202mhz stable for me if it is the only voltage point with an offset. 1.062, 1.075, and 1.081 are not as stable that high, but 1.093 is.

In order to hit 1.093v I have to push the voltage slider to 100%. If I create a curve from just 1.05v to 1.093v, then I have to lower the offsets of all five voltage steps, because the steps in between are not as stable when clocked that high. I also noticed that I couldn't just offset 1.05 and 1.093 and have it skip from one to the other without going through the other three points: it will stop at 1.05v no matter how high the voltage slider is turned up if the points in between are offset lower than 1.05 and 1.093.


----------



## goinskiing

Quote:


> Originally Posted by *Jquala*
> 
> Currently. I have a FTW model that can hit a power target of 130%...I swear it was 120% yesterday. It has 0 effect on my overclock ability though. I'm surprised how unimpressive FTW overclock a compared to FE cards. I had 2/3 FE cards hit 2126-2100 and one 2050. My FTW can loop valley/heaven for hours at +99 on core (2050-2088mhz) but will artifact and then crash at +100(2088-2113mhz).cant make it through a loop. My question is you guys think when a block comes out for it it will get over that bump? Or should I sell it and go back to my FE with blocks


I have the same card and have had the exact same experience. Maybe it's because I don't understand the whole curve thing.


----------



## Rivered082

Quote:


> Originally Posted by *goinskiing*
> 
> I have the same card and have had the exact same experience. Maybe because it's because I don't understand the whole curve thing.


When testing them individually, I was able to hit these offsets stable, but only individually. Whenever I added them together on a curve I instantly got a crash:

1.093v +200
1.081v +200
1.075v +175
1.062v +175
1.050v +200
1.046v +150
1.031v +150
1.025v +150
1.012v +150
1.000v +175
0.993v +175
0.981v +200
0.975v +200
0.962v +200


----------



## uberwootage

He pretty much sums up what everyone's been seeing.


----------



## goinskiing

Quote:


> Originally Posted by *Rivered082*
> 
> When testing them individually, I was able to hit these offsets stable, but only individually. Whenever I added them together on a curve I instantly got a crash:
> 
> 1.093v +200
> 1.081v +200
> 1.075v +175
> 1.062v +175
> 1.050v +200
> 1.046v +150
> 1.031v +150
> 1.025v +150
> 1.012v +150
> 1.000v +175
> 0.993v +175
> 0.981v +200
> 0.975v +200
> 0.962v +200


So, you keep the same curve and just manually (in Precision X) move just one of these points?


----------



## goinskiing

Quote:


> Originally Posted by *uberwootage*
> 
> 
> 
> 
> 
> 
> He pretty much sums up what everyone's been seeing.


Yup, most of the custom PCBs just seem to change the out-of-the-box clocks and do next to nothing for additional headroom.


----------



## Rivered082

Quote:


> Originally Posted by *goinskiing*
> 
> So, you keep the same curve and just manually (in Precision X) move just one of these points?


I set the curve to manual with a +0 offset on all points, and test just one single voltage with an offset at a time. I reset each point that I test back to zero before testing the next. I get really impressive results from each point individually, but they don't seem to work well together when I put them each at their max stable voltage that I tested. I'm so confused as to why they work well as a single offset, but become very unstable as part of an offset curve. Look at my pictures in my post a page back to see what I am talking about. You may have to zoom in to see where I selected the 1.05v offset by itself.


----------



## xzamples

Can somebody more knowledgeable tell me what the major differences are between:

GIGABYTE 1080 G1 GAMING - $650

and

GIGABYTE 1080 XTREME GAMING - $700

xtreme gaming only gets 3-6 more FPS than g1 gaming as far as performance goes, i'm pretty sure temps are identical.... so what justifies the extra $50 for the xtreme gaming edition?


----------



## boredgunner

Quote:


> Originally Posted by *xzamples*
> 
> Can somebody more knowledgeable tell me what the major differences are between:
> 
> GIGABYTE 1080 G1 GAMING - $650
> 
> and
> 
> GIGABYTE 1080 XTREME GAMING - $700
> 
> xtreme gaming only gets 3-6 more FPS than g1 gaming as far as performance goes, i'm pretty sure temps are identical.... so what justifies the extra $50 for the xtreme gaming edition?


Different PCB with better components on the XTREME Gaming. Better cooler on the XTREME Gaming, typically a better overclocker than the G1 Gaming I'd wager (I typically see > 2100 MHz from users with the XTREME Gaming but not with the G1 Gaming).

The G1 Gaming also has loads of coil whine complaints.


----------



## xzamples

Quote:


> Originally Posted by *boredgunner*
> 
> Different PCB with better components on the XTREME Gaming. Better cooler on the XTREME Gaming, typically a better overclocker than the G1 Gaming I'd wager (I typically see > 2100 MHz from users with the XTREME Gaming but not with the G1 Gaming).
> 
> The G1 Gaming also has loads of coil whine complaints.


Both cards have three Windforce fans, so how is the cooling much better?
Both have LEDs.
Both have back plates.

if xtreme gaming gets > 2100 MHz then what does G1 gaming get? > 2060 ???

i mean look how close these #'s are

g1 gaming

Core Clock 1721 MHz in OC Mode 1695 MHz in Gaming Mode
Boost Clock 1860 MHz in OC Mode 1835 MHz in Gaming Mode

xtreme gaming:

Boost: 1936 MHz / Base: 1784 MHz in OC mode
Boost: 1898 MHz / Base: 1759 MHz in Gaming mode

i mean the only justification for that $50 on the xtreme gaming edition is for the VR link, SLI bridge, mouse pad, power cable, and other accessories that come with it


----------



## ChevChelios

Quote:


> The G1 Gaming also has loads of coil whine complaints.


Haven't heard of many, and I don't have any coil whine at all on my G1 1080.


----------



## xer0h0ur

Coil whine. I haz it :/

Going to do the same thing I did with my 295X2 and leave it in CSGO's menu for hours at about 900 FPS. That made most of my coil whine go away; one can only hope it will do the same here. Then again, it may just be that my power supply doesn't play nice. I have a brand new EVGA SuperNOVA 1000 G2 sitting right next to me. I should probably try it to see if I get less coil whine, but screw all that re-cabling. Too lazy for that right now.


----------



## boredgunner

Quote:


> Originally Posted by *xzamples*
> 
> both cards have 3 windforce fans, how is it much of better cooling?
> both have LED's
> both have back plates
> 
> if xtreme gaming gets > 2100 MHz then what does G1 gaming get? > 2060 ???
> 
> i mean look how close these #'s are
> 
> g1 gaming
> 
> Core Clock 1721 MHz in OC Mode 1695 MHz in Gaming Mode
> Boost Clock 1860 MHz in OC Mode 1835 MHz in Gaming Mode
> 
> xtreme gaming:
> 
> Boost: 1936 MHz / Base: 1784 MHz in OC mode
> Boost: 1898 MHz / Base: 1759 MHz in Gaming mode
> 
> i mean the only justification for that $50 on the xtreme gaming edition is for the VR link, SLI bridge, mouse pad, power cable, and other accessories that come with it


G1 Gaming seems to have good cooling too, but the XTREME Gaming has a much larger heatsink. I haven't seen the actual heatsink design of both in detail though, not sure on heatpipe count and arrangements. But fans aren't everything; take my MSI Armor vs MSI Twin Frozr for example. Twin Frozr (GAMING X) runs MUCH cooler.

Stock speed doesn't say much about potential overclocking either. Overclocking is always a lottery though, and yes they shouldn't be a world apart. I've just seen considerably more > 2100 MHz reports for the XTREME Gaming than G1 Gaming (and some 2200+ MHz reports too).

I wish I waited for the XTREME Gaming to come in stock, but then again I might still be waiting to this day.


----------



## goinskiing

Quote:


> Originally Posted by *Rivered082*
> 
> I set the curve to manual with a +0 offset on all points, and test just one single voltage with an offset at a time. I reset each point that I test back to zero before testing the next. I get really impressive results from each point individually, but they don't seem to work well together when I put them each at their max stable voltage that I tested. I'm so confused as to why they work well as a single offset, but become very unstable as part of an offset curve. Look at my pictures in my post a page back to see what I am talking about. You may have to zoom in to see where I selected the 1.05v offset by itself.


What about your other setting? Overvoltage, power %, mem clock, fan curve?


----------



## Jorginto

Guys, a piece of advice. I can choose between GTX 1080 Asus Strix or FE + EK waterblock. What would you prefer? Is that 8 pin enough to power a watercooled card?


----------



## kx11

a tip from an anonymous person indicates that Nvidia might remove the " Disable SLi " option and replace it with " turn off 2nd GPU" meaning SLi will always be on until you shut off the 2nd GPU

kinda weird IMO


----------



## Vellinious

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, a piece of advice. I can choose between GTX 1080 Asus Strix or FE + EK waterblock. What would you prefer? Is that 8 pin enough to power a watercooled card?


Yes, with Pascal and a modded bios to increase the power limit, a single 8 pin will be plenty to power it.


----------



## Rivered082

Quote:


> Originally Posted by *goinskiing*
> 
> What about your other setting? Overvoltage, power %, mem clock, fan curve?


This is with everything at default. Power % makes no difference, as the FTW never gets close to 100% TDP; at least mine doesn't. I did set a fan curve, though, to keep temps in the mid-50C range at load. The high clocks from the single voltage offsets came with the memory at +0. I can push it to +600, but it adversely affects performance past +450. The only way for me to test 1.075, 1.081, and 1.093 was to push the voltage slider to 25%, 50%, and 100% respectively.

What confuses me is why I cannot create a stable curve based on the individual results for each voltage step. I can either hit over 2.2ghz with a single voltage having an offset, or a little over 2.1ghz with a standard curve with an offset of +100 across the board.


----------



## goinskiing

Quote:


> Originally Posted by *Rivered082*
> 
> This is with everything default. Power % makes no difference as the FTW never gets close to 100% TDP, at least mine doesn't. I did a fan curve, though to keep temps in the mid 50c range at load. The high clocks from the single voltage offsets when testing came with the memory at +0. I can push it to +600 but it adversely affects performance after +450. The only way for me to test and check 1.075, 1.081, and 1.093 were to push voltage 25%, 50%, and 100% respectively.
> 
> What confuses me is why I cannot create a stable curve based on the individual results for each voltage step. I can either hit over 2.2ghz with a single voltage having an offset, or a little over 2.1ghz with a standard curve with an offset of +100 across the board.


This is excellent info, thanks! Sounds like I have some playing around to do. I want to break down the 2100MHz wall. :-D


----------



## chronicfx

Quote:


> Originally Posted by *xzamples*
> 
> both cards have 3 windforce fans, how is it much of better cooling?
> both have LED's
> both have back plates
> 
> if xtreme gaming gets > 2100 MHz then what does G1 gaming get? > 2060 ???
> 
> i mean look how close these #'s are
> 
> g1 gaming
> 
> Core Clock 1721 MHz in OC Mode 1695 MHz in Gaming Mode
> Boost Clock 1860 MHz in OC Mode 1835 MHz in Gaming Mode
> 
> xtreme gaming:
> 
> Boost: 1936 MHz / Base: 1784 MHz in OC mode
> Boost: 1898 MHz / Base: 1759 MHz in Gaming mode
> 
> i mean the only justification for that $50 on the xtreme gaming edition is for the VR link, SLI bridge, mouse pad, power cable, and other accessories that come with it


I believe the cheaper one has three medium-sized fans, while the more expensive one has two large fans and a medium fan that sits overlapping the other two and spins in the opposite direction; it's explained on the site. Plus slightly better components on the more expensive one.

http://www.gigabyte.us/products/product-page.aspx?pid=5921#kf


----------



## chronicfx

Quote:


> Originally Posted by *chronicfx*
> 
> I believe the cheaper one has three medium-sized fans, while the more expensive one has two large fans and a stacked fan that sits overlapping the other two and spins in the opposite direction; it's explained on the site. Plus slightly better components on the more expensive one. Does the G1 have the spill-resistant coating on the PCB?
> 
> http://www.gigabyte.us/products/product-page.aspx?pid=5921#kf


----------



## Andreadeluxe

I successfully flashed my MSI GTX 1080 Armor OC with the MSI Gaming X OC BIOS.


----------



## Derko1

Just received an MSI 1080 EK Sea Hawk X. I hooked it up and got these results from 3dmarks vs my 2x7970s... http://www.3dmark.com/compare/fs/9287645/fs/8697663

I would like to OC it right off the bat and see how far up I can get... I put the power limit and temp limit all the way up and did +50 on the core clock, and everything is running perfectly fine. Temps are barely hitting 39-40C under load. How much more would I be able to go before I need to start adding voltage? I haven't touched mem yet... but will once I get the core dialed in.


----------



## jedimasterben

Quote:


> Originally Posted by *Derko1*
> 
> Just received an MSI 1080 EK Sea Hawk X. I hooked it up and got these results from 3dmarks vs my 2x7970s... http://www.3dmark.com/compare/fs/9287645/fs/8697663
> 
> I would like to OC it right off the bat, see how far up I can get... I put the card with power limit and temp limit all the way to the top and did +50 on the core clock and everything is running perfectly fine. Temps are hitting barely 39-40C on load. How much more would I be able to go up before I need to start adding voltage? I haven't touch mem yet... but will do once I get the core dialed in.


Can you pretty please give us a BIOS dump? The Sea Hawk X uses an FE PCB, but I'm wondering if it has a higher power limit than a standard FE, since it 'only' gives +5% power limit, and in the sole review so far it barely hits that PL at 2152MHz with 1.062v.







http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_sea_hawk_x_review,39.html

Also, I'm jelly at that load temp. I am using an EVGA 980ti hybrid cooler on mine with dual Scythe GT AP-14 in push/pull and I'm still hitting 53-54C full load after gaming for a while.


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> Can you pretty please give us a BIOS dump? The Sea Hawk X uses a FE PCB, but I'm wondering if it has a higher power limit than standard FE due to it 'only' giving +5% power limit and in the sole review so far it has it barely hitting that PL at 2152MHz with 1.062v.
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_sea_hawk_x_review,39.html


Not the same Sea Hawk X... it's the EK waterblock edition.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127952

Would anything bad happen if I just tried those settings in one shot to see if they work?


----------



## jedimasterben

Quote:


> Originally Posted by *Derko1*
> 
> Not the same Sea Hawk X... it's the EK waterblock edition.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127952
> 
> Would anything bad happened if I just tried those settings in one shot and see if they worked?


Oh, you got the better one then









You can definitely give those settings a go and see how it ends up! edit: nvm, the board is custom lol


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> Oh, you got the better one then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can definitely give those settings a go and see how it ends up! edit: nvm, the board is custom lol


They didn't work. Got artifacts and what not.









Link me to somewhere on how to do it and I will keep playing with it and see how high I get. So running at +100 core voltage is ok?


----------



## jedimasterben

Quote:


> Originally Posted by *Derko1*
> 
> They didn't work. Got artifacts and what not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link me to somewhere on how to do it and I will keep playing with it and see how high I get. So running at +100 core voltage is ok?


Changing the voltage probably won't do you any favors IMHO. I didn't realize in the review they were running their VRAM at +700, most are only capable of +450-500, with a few hitting +530-550, so for starters, I would leave VRAM at default and play with the core only and see where you get.


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> Changing the voltage probably won't do you any favors IMHO. I didn't realize in the review they were running their VRAM at +700, most are only capable of +450-500, with a few hitting +530-550, so for starters, I would leave VRAM at default and play with the core only and see where you get.


So don't do the voltage at +100? I am down to +145 and voltage at +100 and am still getting artifacts.


----------



## jedimasterben

Quote:


> Originally Posted by *Derko1*
> 
> So don't do the voltage at +100? I am down to +145 and voltage at +100 and am still getting artifacts.


Usually more voltage just causes you to hit the power limit wall faster. What is Afterburner reporting your voltage, clocks, etc at once they level out?


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> Usually more voltage just causes you to hit the power limit wall faster. What is Afterburner reporting your voltage, clocks, etc at once they level out?


Using heaven, power is leveling out at around 80 to 85, clocks at 2126, mem at 5006... how do I see voltage? I don't see it listed anywhere. I see VID usage going from like 59 to 65 or so. This is doing +145 core and keeping the voltage at +100.
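As a side note on that "mem at 5006" reading: Afterburner appears to report GDDR5X at half its effective transfer rate, so the number can be decoded like this (a sketch only; the x2 relationship and the 256-bit bus are assumptions based on the figures in this thread and the 1080's published specs):

```python
# Hypothetical decoding of Afterburner's GDDR5X memory clock readout.
afterburner_mhz = 5006                 # value shown in Afterburner
effective_mts = afterburner_mhz * 2    # effective transfer rate, MT/s

# GTX 1080 has a 256-bit bus = 32 bytes per transfer.
bandwidth_gbs = effective_mts * 1e6 * 32 / 1e9
print(effective_mts, round(bandwidth_gbs, 1))  # 10012 320.4
```

That 10,012 MT/s figure matches the "10,012mhz Memory" numbers quoted elsewhere in the thread, and the resulting ~320 GB/s lines up with the stock 1080's rated bandwidth.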


----------



## jedimasterben

Quote:


> Originally Posted by *Derko1*
> 
> Using heaven, power is leveling out at around 80 to 85, clocks at 2126, mem at 5006... how do I see voltage? I don't see it listed anywhere. I see VID usage going from like 59 to 65 or so. This is doing +145 core and keeping the voltage at +100.


You likely haven't unlocked voltage control and monitoring in the Afterburner settings. Should be on the initial page, then for monitoring you can actually disable anything you don't need to monitor (I only have it log power, core clock, voltage, temperature, GPU usage, memory usage, total CPU usage, and frame rate).


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> You likely haven't unlocked voltage control and monitoring in the Afterburner settings. Should be on the initial page, then for monitoring you can actually disable anything you don't need to monitor (I only have it log power, core clock, voltage, temperature, GPU usage, memory usage, total CPU usage, and frame rate).


It is unlocked. Voltage sits at a constant 1.062, whether I have core voltage at +0 or +100... I'm guessing it's supposed to go up?


----------



## jedimasterben

Quote:


> Originally Posted by *Derko1*
> 
> It is unlocked. Voltage is at 1.062 constant. That is whether I had my core voltage to +0 or +100... I'm guessing it's supposed to go up?


It is different with Boost 3.0, core clock and voltage are kind of willy nilly now. You can manually adjust the clock/voltage curve by hitting Ctrl+F and manually changing each state, but I haven't personally had much luck with that at this point, possibly due to my card hitting its power limit pretty quickly.


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> It is different with Boost 3.0, core clock and voltage are kind of willy nilly now. You can manually adjust the clock/voltage curve by hitting Ctrl+F and manually changing each state, but I haven't personally had much luck with that at this point, possibly due to my card hitting its power limit pretty quickly.


How do I know I'm hitting my power limit? It seems like my core does not change despite my adding to the core voltage.

Edit: Using that ctrl+F method, I was able to get my voltage up to 1.093. Trying different clocks now and see what works.


----------



## xzamples

Quote:


> Originally Posted by *chronicfx*
> 
> I believe the cheaper one has three medium sized fans and the more expensive one has two large fans and a medium fan that sits overlapping the other two and spins in an opposite direction, it explains it on the site. Plus slightly better components on the more expensive.
> 
> http://www.gigabyte.us/products/product-page.aspx?pid=5921#kf


if it truly has better components it would cost more than the extra $50, especially considering the accessories it comes with... those all have to be included with the cost to determine ROI


----------



## Raisingx

Got an MSI GTX 1080 8G Gaming X, but it emits a buzzing sound at over 40% GPU load, or even with FPS locked to 60; weirdly enough, it makes no noise at thousands of FPS.

My PSU is an EVGA SuperNOVA G2 1300W. Is this normal? I can hear the noise while playing because I don't use headphones... does this go away, or should I RMA?

Had an MSI 980 Ti before and it made no such noise :\


----------



## Derpinheimer

People in the past have claimed that leaving the card alone for a while as it buzzes can fix the noise, i.e. leave it running the program it buzzes in for a few hours.

I never saw improvement myself, IIRC, but it's worth a shot. Otherwise it's permanent without a hardware mod.


----------



## Raisingx

Quote:


> Originally Posted by *Derpinheimer*
> 
> People in the past have claimed leaving the card alone for awhile as it buzzes can fix the noise; i.e. leave it running the program that its buzzing on for a few hours)
> 
> I never saw improvement with myself IIRC, but its worth a shot. Otherwise its permanent without hardware mod.


Tried that overnight twice and it didn't work. So tired of playing Russian roulette with expensive hardware; first it was an XB270HU monitor and now this.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *ChevChelios*
> 
> havent heard many and I dont have any coil whine at all on my G1 1080


My G1 has coil whine.


----------



## xTesla1856

I've had terrible coil whine in the past with MSI, ASUS and Gigabyte. Only cards that didn't have it were from EVGA. Might be anecdotal though


----------



## Avant Garde

Anybody here with EVGA FTW ? I've opted for that one...


----------



## Jquala

https://redd.it/4sls5a

Please go there and stomach through my post. I've been gathering information regarding different ASIC%, FE VS AIB and what it means for overclockability. If I can get more results I can add more to my findings and hopefully publish a concrete consensus of what the best overclocking option will be for future consumers


----------



## goinskiing

Quote:


> Originally Posted by *Avant Garde*
> 
> Anybody here with EVGA FTW ? I've opted for that one...


I do.


----------



## bobytt

Quote:


> Originally Posted by *Avant Garde*
> 
> Anybody here with EVGA FTW ? I've opted for that one...


Mine is coming Friday


----------



## ChevChelios

anyone tried running the bench in DX12 after the latest async RotR patch ?

I compared DX11 vs DX12, with the latest patch and I got an overall score increase of ~10-15 in DX12 and min fps increased significantly in 2 of the tests


----------



## pez

Quote:


> Originally Posted by *xzamples*
> 
> Can somebody more knowledgeable tell me what the major differences are between:
> 
> GIGABYTE 1080 G1 GAMING - $650
> 
> and
> 
> GIGABYTE 1080 XTREME GAMING - $700
> 
> xtreme gaming only gets 3-6 more FPS than g1 gaming as far as performance goes, i'm pretty sure temps are identical.... so what justifies the extra $50 for the xtreme gaming edition?


Via GB and some reading I've done, better components, better PCB, etc. I've got some coil whine on my card at higher FPS, but it's more of a low buzz (I'm pretty sure this indicates memory rather than GPU like coil whine normally is). I don't notice this during gaming, and especially not so since I'm using headphones 99% of the time I'm gaming.

Also, the 'bundle' you get with the Xtreme Gaming is pretty nice, TBH. I imagine it's a HB SLI bridge it comes with (or at least the equivalent of a LED bridge) and that's worth a good chunk, IMO. If you can get over the fact that it takes up essentially 3 slots, and you like the design, then I don't see why not. I still think MSI is being ridiculous with their card pricing.

Quote:


> Originally Posted by *kx11*
> 
> a tip from an anonymous person indicates that Nvidia might remove the " Disable SLi " option and replace it with " turn off 2nd GPU" meaning SLi will always be on until you shut off the 2nd GPU
> 
> kinda weird IMO


I wonder if this will be a global setting or one that can be set on a per-game basis? I could imagine that setting up an individual application to 'disable SLI' could look a bit daunting to an inexperienced user. It would be cool to disable the second GPU in a game that already runs faster than you need, and then possibly enable it for PhysX. The Borderlands series comes to mind







. I think games like Trine also use PhysX. Random, but it's there







. Also, the video below from LTT channel kinda explains how multi-GPU solutions may go different courses. That's the only other reason I could think that they would make it such an explicit option. Disable SLI, but still allow it to use the second GPU as it pleases.


----------



## DarX098

Can I join the club now?


----------



## ilgello

Hello guys, did Anyone try to mod a 980 EK Full cover Waterblock to fit a 1080?


----------



## Derko1

Quote:


> Originally Posted by *jedimasterben*
> 
> It is different with Boost 3.0, core clock and voltage are kind of willy nilly now. You can manually adjust the clock/voltage curve by hitting Ctrl+F and manually changing each state, but I haven't personally had much luck with that at this point, possibly due to my card hitting its power limit pretty quickly.


Where can I read up on using this curve mode for OC'ing? I was able to do a max of 1.093v and was stable at +150, but I would like to obviously fine tune it.


----------



## Shadowdane

Quote:


> Originally Posted by *Jquala*
> 
> https://redd.it/4sls5a
> 
> Please go there and stomach through my post. I've been gathering information regarding different ASIC%, FE VS AIB and what it means for overclockability. If I can get more results I can add more to my findings and hopefully publish a concrete consensus of what the best overclocking option will be for future consumers


Last I checked, nothing can read ASIC% on Pascal cards yet... GPU-Z gives an unsupported error when trying to read it on my GTX 1080 cards.


----------



## SiriusLeo

Quote:


> Originally Posted by *Rivered082*
> 
> When testing them individually, I was able to hit these offsets stable, but only individually. Whenever I added them together on a curve I instantly got a crash:
> 
> 1.093v +200
> 1.081v +200
> 1.075v +175
> 1.062v +175
> 1.050v +200
> 1.046v +150
> 1.031v +150
> 1.025v +150
> 1.012v +150
> 1.000v +175
> 0.993v +175
> 0.981v +200
> 0.975v +200
> 0.962v +200


This is interesting. I went into PrecisionX and did the same as you; tested every voltage point to find the max stable overclock per voltage point. Even though I can give my card up to 1.093v it appears to not really like anything much over 0.962v while just overclocking the Core and 1.013v when overclocking the Core + Memory. After 1.013v I see zero benefit at all to additional voltage for overclock-ability on either Core or Memory.

Although my Core can reach 2126mhz and Memory at 12,006mhz I've found that it's not 100% stable across the board after hours of testing. These figures below are 100% stable across the board after hours of testing.

*0.962v = 2062mhz Core // 10,012mhz Memory = lowest volts to highest Core without touching the memory.

1.013v = 2075mhz Core // 11,520mhz Memory = lowest volts to highest Core with Memory overclock.*

What I found is that bumping the volts higher than 1.013v starts to really affect my memory OCs, and my overall benchmark scores suffer. Before I did this I was running *1.062v = 2100mhz Core // 10,934mhz Memory.*

If I wanted to drop down to 2000mhz Core I could get away with running as little as *0.950v = 2000mhz Core // 11,520mhz Memory.*

*1.013v = 2075mhz Core // 11,520mhz Memory* is equal to or better than *1.062v = 2100mhz Core // 10,934mhz Memory* for me, and my power draw is significantly lower. I rarely hit the power limit anymore when watching GPU-Z.

Maybe the trick with Pascal is to see how HIGH you can overclock and how LOW you can get your volts. The new OC meta.
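The "high clock, low volts" idea can be sanity-checked with the usual dynamic-power approximation, P ∝ V²·f. This is a rough sketch only; it ignores static leakage and board losses, so treat the ratio as a ballpark:

```python
# Compare the two operating points from the post above using P ~ V^2 * f.
def rel_power(volts: float, mhz: float) -> float:
    """Relative dynamic power (arbitrary units)."""
    return volts ** 2 * mhz

low = rel_power(1.013, 2075)    # undervolted point
high = rel_power(1.062, 2100)   # original point

saving = (1 - low / high) * 100
print(f"~{saving:.0f}% less dynamic power at the 1.013v point")
```

A roughly 10% cut in dynamic power for a ~1% clock loss is consistent with the power-limit headroom SiriusLeo reports.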


----------



## achilles73

Quote:


> Originally Posted by *Derko1*
> 
> Where can I read up on using this curve mode for OC'ing? I was able to do a max of 1.093v and was stable at +150, but I would like to obviously fine tune it.


There's a video at the end that explains how it works.
http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,1.html


----------



## Tideman

Just received two Zotac 1080 AMP! Editions

Boosted to 1962 right out of the box (tested with Heaven at 1440p). They hit 83c/73c on the stock fan settings. Really want to see how they handle 4k but still waiting for my replacement rog swift to arrive.. Hopefully this week.


----------



## Jquala

FFS. TIL Reddit is full of idiots who just regurgitate the same thing over and over without even reading the post. Just FYI, the data was collected not to show off how many cards I had, or how high the ASIC quality of one of my cards was, but the fact that there was a linear trend with the current (officially unsupported) GPU-Z ASIC scores.


----------



## nexxusty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Coil whine. I haz it :/
> 
> Going to do the same thing I did with my 295X2 and leave it CSGO's menu for hours at about 900 FPS. That made most of my coil whine go away. One can only hope it will do the same. Then again it may just be my power supply doesn't play nice. I have a brand new EVGA SuperNOVA 1000 G2 sitting here next to me. I should probably try it to see if I have less coil whine but screw all that re-cabling. Too lazy for that right now.


You all have coil whine.... load up Doom in Vulkan with VSYNC off. Enjoy 3500-4000fps and in turn, coil whine.


----------



## boredgunner

Quote:


> Originally Posted by *nexxusty*
> 
> You all have coil whine.... load up Doom in Vulkan with VSYNC off. Enjoy 3500-4000fps and in turn, coil whine.


I will try gaming at thousands of FPS out of curiosity. An unrealistic scenario that has no bearing on the real world for most of us though.


----------



## dante`afk

Quote:


> Originally Posted by *Avant Garde*
> 
> Anybody here with EVGA FTW ? I've opted for that one...


i got 2 of them.

//////

I ran some tests yesterday comparing the new HB bridge and the old flexible ones and could not see any difference. Is it maybe because my board only supports 8x/8x and not 16x/16x?


----------



## aberrero

Quote:


> Originally Posted by *xTesla1856*
> 
> I've had terrible coil whine in the past with MSI, ASUS and Gigabyte. Only cards that didn't have it were from EVGA. Might be anecdotal though


My EVGA 1080 has coil whine. It has definitely decreased since I bought it though. Might actually be totally gone now.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jquala*
> 
> https://redd.it/4sls5a
> 
> Please go there and stomach through my post. I've been gathering information regarding different ASIC%, FE VS AIB and what it means for overclockability. If I can get more results I can add more to my findings and hopefully publish a concrete consensus of what the best overclocking option will be for future consumers


Someone here said that the GPU-Z ASIC readings are false as it's not properly supporting Pascal yet, so it sounds like you've wasted a lot of time.

As a side note I am still overclocking my FE card to find out its limits. Haven't had a crash yet operating at:

Core Voltage : +100%
Power Limit : 120%
Core Clock : + 225
Memory Clock : +575

Which results in like 2113MHz core clock or somewhere thereabouts. Played Dying Light, CSGO and Thief so far without issue as well as benchmarked 3dmark Firestrike and Thief without artifacts or crashing. I wanted to test more last night but I had to go to sleep early. Will give it another crack tonight when I get out of work.


----------



## Vellinious

Quote:


> Originally Posted by *xer0h0ur*
> 
> Someone here said that the GPU-Z ASIC readings are false as it's not properly supporting Pascal yet, so it sounds like you've wasted a lot of time.
> 
> As a side note I am still overclocking my FE card to find out its limits. Haven't had a crash yet operating at:
> 
> Core Voltage : +100%
> Power Limit : 120%
> Core Clock : + 225
> Memory Clock : +575
> 
> Which results in like 2113MHz core clock or somewhere thereabouts. Played Dying Light, CSGO and Thief so far without issue as well as benchmarked 3dmark Firestrike and Thief without artifacts or crashing. I wanted to test more last night but I had to go to sleep early. Will give it another crack tonight when I get out of work.


His biggest mistake, even though ASIC quality as shown in GPUz is wrong, was that he didn't also log ambient and core temps for the clocks he's tracking. I can pretty much guarantee that ambient and core temps are a larger determining factor in the clocks he saw (discounting COMPLETELY, overclocker ability), especially since ASIC isn't reporting correctly yet in GPUz.


----------



## KillerBee33

Any news on BIOS Tool?


----------



## ssgwright

Quote:


> Originally Posted by *KillerBee33*
> 
> Any news on BIOS Tool?


waiting patiently as well


----------



## KillerBee33

Quote:


> Originally Posted by *ssgwright*
> 
> waiting patiently as well


Same here







New driver coming this week, promising to fix a bunch of issues; at least that.


----------



## xer0h0ur

Quote:


> Originally Posted by *Vellinious*
> 
> His biggest mistake, even though ASIC quality as shown in GPUz is wrong, was that he didn't also log ambient and core temps for the clocks he's tracking. I can pretty much guarantee that ambient and core temps are a larger determining factor in the clocks he saw (discounting COMPLETELY, overclocker ability), especially since ASIC isn't reporting correctly yet in GPUz.


Fair point. For what it's worth, I am waterblocked so temps aren't really going to be an issue for me. Even with voltage cranked it's not going past 41C.


----------



## jprovido

Any tips to improve my OC? I'm at 2063MHz core / 11000MHz RAM with +50mV. Raising the voltage doesn't really help with stability. Does flashing the BIOS help me get 2100? I have an ASUS ROG Strix GTX 1080. Temps are in check, I just can't get it to clock higher.


----------



## Vellinious

Quote:


> Originally Posted by *xer0h0ur*
> 
> Fair point. For what it's worth, I am waterblocked so temps aren't really going to be an issue for me. Even with voltage cranked it's not going past 41C.


There are still layers there, even more so than Maxwell, where certain temp levels just aren't going to offer any kind of stability at certain clocks. And certainly with Pascal, where certain temp levels will cause the card to down clock.

I look forward to getting mine, and doing some testing with various ambient temps to see where those layers really are. The 980tis were the same way...just not as pronounced.


----------



## xer0h0ur

Man, I haven't had an Nvidia card since the GTX 690 so any insight or information you can give me is welcome.


----------



## lapino

Received my EVGA GTX 1080 FTW today (Belgium), very happy with it. Card looks amazing, fits very nicely in my case, fun with the RGB LEDs, and temps seem to stay below 80°C with a 60% fan speed, which is almost inaudible with case fans at a decent speed for airflow.


----------



## ChevChelios

Quote:


> Originally Posted by *KillerBee33*
> 
> New driver coming this week, promising to fix a bunch of issues; at least that.


source ?


----------



## KillerBee33

Quote:


> Originally Posted by *ChevChelios*
> 
> source ?


http://www.pcworld.com/article/3094251/components-graphics/vr-problems-plaguing-nvidias-geforce-gtx-1080-gtx-1070-will-be-fixed-this-week.html

http://www.tomshardware.com/news/nvidia-gtx-1080-vr-boost,32235.html


----------



## ChevChelios

thx


----------



## Jquala

I don't understand this. I had acknowledged that GPU-Z is outputting inaccurate, yet not meaningless, numbers. If you have, say, a 93% card today and GPU-Z 0.9 comes out tomorrow, you'll probably see a 66% ASIC score, while someone with 100% might see 83%. It's wrong only in the sense that it is applying the Maxwell formula to Pascal, but in terms of power vs. efficiency it still follows a linear trend. People can discount GPU-Z ASIC scores all they want, but the numbers don't lie.
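To illustrate the point with a sketch (the endpoint numbers are made up; nobody outside TechPowerUp knows the real Pascal formula): if the new GPU-Z just applies a different but still monotonic formula, every score shifts, yet the ranking of cards is untouched.

```python
# If a new GPU-Z release remaps scores with any monotonic formula, the
# absolute numbers change but the ordering of cards does not. The remap
# endpoints below are placeholder assumptions, not the real formula.
def remap(old_score, old_lo=90.0, old_hi=100.0, new_lo=66.0, new_hi=83.0):
    """Linearly rescale a score from one range to another."""
    frac = (old_score - old_lo) / (old_hi - old_lo)
    return new_lo + frac * (new_hi - new_lo)

cards = [91.0, 93.0, 97.5, 100.0]        # hypothetical current readings
rescaled = [remap(c) for c in cards]

# Order is unchanged, so "inaccurate but not meaningless" holds for ranking.
assert rescaled == sorted(rescaled)
print(rescaled)
```

In other words, even if 0.9.0 shifts every number, a card that reads higher than another today should still read higher tomorrow.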


----------



## KillerBee33

Quote:


> Originally Posted by *Jquala*
> 
> I don't understand this. I had acknowledged that GPU-Z is outputting inaccurate, yet not meaningless, numbers. If you have, say, a 93% card today and GPU-Z 0.9 comes out tomorrow, you'll probably see a 66% ASIC score, while someone with 100% might see 83%. It's wrong only in the sense that it is applying the Maxwell formula to Pascal, but in terms of power vs. efficiency it still follows a linear trend. People can discount GPU-Z ASIC scores all they want, but the numbers don't lie.


What is ASIC exactly?


----------



## Jquala

Quote:


> Originally Posted by *Vellinious*
> 
> His biggest mistake, even though ASIC quality as shown in GPUz is wrong, was that he didn't also log ambient and core temps for the clocks he's tracking. I can pretty much guarantee that ambient and core temps are a larger determining factor in the clocks he saw (discounting COMPLETELY, overclocker ability), especially since ASIC isn't reporting correctly yet in GPUz.
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Fair point. For what it's worth, I am waterblocked so temps aren't really going to be an issue for me. Even with voltage cranked it's not going past 41C.
> 
> 
> 
> First, GPU-Z is displaying inaccurate but not false numbers. In context it's displaying ASIC quality along a 90-100% scale. People with cards near the lower 90s are less efficient than cards in the upper 95%+. An easy way to tell is to pop open your VF curve; I'll bet my bottom dollar a 100% ASIC card will run at a higher frequency at 0.843v than a 91% card.
> 
> Second, all FE cards were put under water, never reaching over 44C, while the AIB cards had a fan curve that kept most from hitting 65C. These results were done without the interference of thermal throttling. Can people for one moment remember that I didn't form these numbers from my opinion, but the numbers formed my opinion.
Click to expand...


----------



## Vellinious

Quote:


> Originally Posted by *Jquala*
> 
> I don't understand this. I had acknowledged that GPU-Z is outputting inaccurate, yet not meaningless, numbers. If you have, say, a 93% card today and GPU-Z 0.9 comes out tomorrow, you'll probably see a 66% ASIC score, while someone with 100% might see 83%. It's wrong only in the sense that it is applying the Maxwell formula to Pascal, but in terms of power vs. efficiency it still follows a linear trend. People can discount GPU-Z ASIC scores all they want, but the numbers don't lie.


Because there's nothing proven there yet, and you're speculating as to what it "might" be when it happens. You're submitting statistical data with unknown variables, based on assumption. It's no more concrete than if someone were to say: I've tested 3 ASUS cards and they overclock better than the MSI cards that I've seen reviews on, and therefore ASUS is better.

If you don't understand that using numbers like these flies in the face of everything statistical, comparative analysis is supposed to do, then I'm not really sure what to tell you. lol


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> What is ASIC exactly?


The TLDR: It's the measurement of voltage leak in the core.


----------



## Jquala

Quote:


> Originally Posted by *KillerBee33*
> 
> What is ASIC exactly?


It is a feature in GPU-Z 0.8.7 and earlier that reports a percentage indicating how power efficient your card is. Someone with a higher ASIC score can run the same clocks with less power. There was a general trend with Maxwell, and still presently with Pascal, that higher ASIC scores deliver higher OC headroom. I personally logged every GTX 1080 that passed through my hands.


----------



## KillerBee33

Quote:


> Originally Posted by *Vellinious*
> 
> The TLDR: It's the measurement of voltage leak in the core.


OK, how does higher ASIC benefit gamers?


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> OK how does higher AISC benefit gamers?


For someone that doesn't overclock? It doesn't really help at all. The card may run a few degrees c cooler with a higher ASIC, because it's running a tad less voltage at the boost clocks, but....that's really about it.


----------



## KillerBee33

Quote:


> Originally Posted by *Vellinious*
> 
> For someone that doesn't overclock? It doesn't really help at all. The card may run a few degrees c cooler with a higher ASIC, because it's running a tad less voltage at the boost clocks, but....that's really about it.


So by that logic, a 65.2 ASIC should be a BAD card?


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> So by that Logic a 65.2 ASIC should be BAD CARD?


Eh...it's a good probability it won't overclock very well. If overclocking is the end game for the user, then, yes...it's less than desirable. If the user is just going to be gaming, I doubt they'd even notice.


----------



## Jquala

Quote:


> Originally Posted by *Vellinious*
> 
> Because there's nothing proven there yet, and you're speculating as to what it "might" be when it happens. You're submitting statistical data with unknown
> variables, based on assumption. It's no more concrete than if someone were to say, I've tested 3 ASUS cards and they overclock better than the MSI cards that I've seen reviews on, and therefore, ASUS is better.
> 
> If you don't understand that using numbers like these, fly in the face of everything statistical, comparative analysis is supposed to do, then I'm not really sure what to tell you. lol


My variables were very clear. The only mystery is how the ASIC quality will be displayed in GPU-Z 0.9.0 vs. how it is displayed now. I set every card to run its own VF curve with Precision, then I dialed in the highest OC it could remain stable at in a 20-minute loop. The result was that cards with higher ASIC readings had greater overclocking headroom, in almost equal intervals between each percentage point. I have good reason to believe that when GPU-Z 0.9 comes out it will support my claim. I'm not just going off what reviewers have stated, although their reviews align with my own results; I stress tested, benched and logged each card in a controlled environment.
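For anyone who wants to run the same trend check on their own logs: the analysis boils down to fitting a least-squares line through (ASIC %, max stable clock) pairs, where a positive slope means ASIC is predicting headroom. The data points below are hypothetical, for illustration only, not the actual log from the post.

```python
# Minimal trend check: log (ASIC %, max stable clock) per card, then fit
# an ordinary least-squares line and look at the sign/size of the slope.
def fit_line(points):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in points)
    var = sum((x - mean_x) ** 2 for x, _ in points)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# (ASIC %, highest 20-min-stable core clock in MHz) -- made-up sample log
log = [(91.0, 2025), (93.0, 2063), (95.5, 2088), (98.0, 2114)]

slope, intercept = fit_line(log)
print(f"~{slope:.1f} MHz of headroom per ASIC point")
```

With enough cards logged this way, the slope (and how tightly the points hug the line) is the whole argument in one number.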


----------



## KillerBee33

Quote:


> Originally Posted by *Vellinious*
> 
> Eh...it's a good probability it won't overclock very well. If overclocking is the end game for the user, then, yes...it's less than desirable. If the user is just going to be gaming, I doubt they'd even notice.


This is my 980, game stable. Any thoughts?


----------



## Jquala

Quote:


> Originally Posted by *Vellinious*
> 
> Eh...it's a good probability it won't overclock very well. If overclocking is the end game for the user, then, yes...it's less than desirable. If the user is just going to be gaming, I doubt they'd even notice.


It only matters if you're among the people who care more about pushing their cards than about how many actual frames we get from doing it. In the grand scheme of things a gamer would probably not notice and shouldn't concern themselves with it. I figured people on overclock.net might find it peculiar.


----------



## Jquala

I can't even read what % it is, 65% or 85%, and is it running on stock BIOS and voltage?


----------



## nexxusty

Quote:


> Originally Posted by *Jquala*
> 
> My variables were very clear. The only mystery is how the ASIC quality will be displayed in GPU-Z 0.9.0 vs. how it is displayed now. I set every card to run its own VF curve with Precision, then I dialed in the highest OC it could remain stable at in a 20-minute loop. The result was that cards with higher ASIC readings had greater overclocking headroom, in almost equal intervals between each percentage point. I have good reason to believe that when GPU-Z 0.9 comes out it will support my claim. I'm not just going off what reviewers have stated, although their reviews align with my own results; I stress tested, benched and logged each card in a controlled environment.


Interesting theory nonetheless.

I think certain people here don't like or appreciate theories. From what I understand we might have actual ASIC readings in the future, as opposed to none at all. Ever... lol.

I don't think this was a waste of time. Good job.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jquala*
> 
> First, GPU-Z is displaying inaccurate but not false numbers. In context it's displaying ASIC quality along a 90-100% scale. People with cards near the lower 90s are less efficient than cards in the upper 95%+. An easy way to tell is to pop open your VF curve; I'll bet my bottom dollar a 100% ASIC card will run at a higher frequency at 0.843v than a 91% card.
> 
> Second, all FE cards were put under water, never reaching over 44C, while the AIB cards had a fan curve that kept most from hitting 65C. These results were done without the interference of thermal throttling. Can people for one moment remember that I didn't form these numbers from my opinion, but the numbers formed my opinion.


Are you high? An inaccurate reading IS a false reading... No one is arguing with you about ASIC readings in GPU-Z following a linear curve. People, including myself, are simply stating it's 100% worthless to take an ASIC reading right now that is inaccurate.


----------



## Twinnuke

Gigabyte G1 1080 Incoming.



Obviously I don't have it in front of me yet due to shipping. I should have gotten 2 day T_T. But I was expecting it to come from New Jersey.


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> This is my 980 GAME STABLE . Any thoughts?


Looks ok to me, but the guys in the 980 owners thread would probably be a bit more helpful.


----------



## Vellinious

Quote:


> Originally Posted by *Jquala*
> 
> My variables were very clear. The only mystery is how the ASIC quality will be displayed in GPU-Z 0.9.0 vs. how it is displayed now.


A contradiction.

Look....I personally believe your hypothesis is correct, but it's far from sound and can't be proven until ASIC quality reads correctly from GPUz. Speculation is speculation, no matter how fervently you believe it.


----------



## KillerBee33

Quote:


> Originally Posted by *Vellinious*
> 
> Looks ok to me, but the guys in the 980 owners thread would probably be a bit more helpful.


Helpful with disproving the ASIC theory? We had this discussion in a few other threads; there is no actual proof that ASIC numbers mean a better or worse GPU, at least we couldn't find it.








Some 980Ti guys with 78 ASIC couldn't get over 1418MHz .


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> Helpful with disproving the ASIC theory? We had this discussion in a few other threads; there is no actual proof that ASIC numbers mean a better or worse GPU, at least we couldn't find it.
> 
> 
> 
> 
> 
> 
> 
> 
> Some 980Ti guys with 78 ASIC couldn't get over 1418MHz .


The 980 and 980TI are different cores, for starters. The GM204s overclock quite a bit higher. And 1580 on a GM204 isn't high....not even close. Pretty average, really.

When you start looking at how well a card will overclock, there are SEVERAL factors you need to take into consideration, especially with Maxwell and Pascal. First and foremost, ambient temp and core temp. Then overall build quality of the GPU. The skill of the person trying to overclock it. Power delivery....the list goes on and on and on. SO, while other things do play a role in how well a card will overclock, the ASIC quality of the core is a very good indicator of just how well a GPU will overclock. Especially since Maxwell....and as the processes continue to get smaller, it'll play an even larger role, because adding more voltage to the small process GPUs is becoming more and more difficult without some really extreme cooling.

Go to the Kingpin Forums and ask him for help overclocking a classy / KPE. The first thing you get asked is what the ASIC is. Why? Because they all know, as well as most of us do, that ASIC matters...especially with Maxwell and more than likely, Pascal... Time will tell.


----------



## superkyle1721

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> Looks ok to me, but the guys in the 980 owners thread would probably be a bit more helpful.
> 
> 
> 
> Helpful with disproving the ASIC theory? We had this discussion in a few other threads; there is no actual proof that ASIC numbers mean a better or worse GPU, at least we couldn't find it.
> 
> 
> 
> 
> 
> 
> 
> 
> Some 980Ti guys with 78 ASIC couldn't get over 1418MHz .
Click to expand...

ASIC is an indicator. For instance, let's say your card had a higher ASIC score. Does that always mean you will be able to clock it higher with an unlocked BIOS? No it doesn't, as you have noticed. What is certain is that your card could achieve the same clock speed at a reduced voltage if it had a higher ASIC. That much is fact. There are many variables at play with graphics cards. ASIC is just a single indicator, but it does have its relevance to many here.

Sent from my iPhone using Tapatalk


----------



## xer0h0ur

My 290X had a lower ASIC than either of the GPUs on my 295X2, and the 290X overclocked better. ASIC to me isn't an end-all be-all indicator of overclockability. But I have never been a physical volt-modding overclocker, so I may be wrong about extreme overclocking.


----------



## Vellinious

Quote:


> Originally Posted by *xer0h0ur*
> 
> My 290X had lower ASIC than either of the GPUs on my 295X2 and the 290X overclocked better. ASIC to me isn't an end all be all indicator of overclockabilty. But I have never been a physical volt modding overclocker so I may be wrong with extreme overclocking.


The Hawaii architecture was altogether different, and it was much harder to use ASIC as an indicator. They were more along the lines of Kepler, in some ways, where the lower ASIC cards were pretty good under water because they could take the extra voltage. I had a 290X that would run Firestrike all day long at 1310 / 1820, and gave me some GREAT runs for scores at 1333 / 1862. It was a 74% ASIC GPU.


----------



## uberwootage

OK, is GPU-Z calculating ASIC correctly? How did they get these numbers? How did they calibrate my card and my motherboard sensors to read it correctly? "Checks for a cal sticker on case."

ASIC means nothing. Without calibration there is a huge margin for error, which makes ASIC mean nothing. Anyone who has worked with electronics knows that just because software says something does not mean it's correct. I work with high-speed fiber optic receivers, 100+ GHz stuff. The data we get means nothing until we load a correction factor. And even then, if we change a fiber patch cord it throws off everything.

The only people who can give us a valid ASIC reading is NVIDIA. I could write a program that reads it at over 9000, but it does not mean your GPU is going to go super saiyan. Anyone who thinks whatever ASIC number GPU-Z reports is an accurate reading has a very crude understanding of electronics.


----------



## xer0h0ur

Quote:


> Originally Posted by *Vellinious*
> 
> The Hawaii architecture was altogether different, and it was much harder to use ASIC as an indicator. They were more along the lines of Kepler, in some ways, where the lower ASIC cards were pretty good under water because they could take the extra voltage. I had a 290X that would run Firestrike all day long at 1310 / 1820, and gave me some GREAT runs for scores at 1333 / 1862. It was a 74% ASIC GPU.


Yup, both cards were under water with EK blocks. Still have them both, will be selling em soon. The ASIC on the 290X was 69% and the ASIC on the 295X2 was 81% and 84%


----------



## KillerBee33

Quote:


> Originally Posted by *Vellinious*
> 
> The 980 and 980TI are different cores, for starters. The GM204s overclock quite a bit higher. And 1580 on a GM204 isn't high....not even close. Pretty average, really.
> 
> When you start looking at how well a card will overclock, there are SEVERAL factors you need to take into consideration, especially with Maxwell and Pascal. First and foremost, ambient temp and core temp. Then overall build quality of the GPU. The skill of the person trying to overclock it. Power delivery....the list goes on and on and on. SO, while other things do play a role in how well a card will overclock, the ASIC quality of the core is a very good indicator of just how well a GPU will overclock. Especially since Maxwell....and as the processes continue to get smaller, it'll play an even larger role, because adding more voltage to the small process GPUs is becoming more and more difficult without some really extreme cooling.
> 
> Go to the Kingpin Forums and ask him for help overclocking a classy / KPE. The first thing you get asked is what the ASIC is. Why? Because they all know, as well as most of us do, that ASIC matters...especially with Maxwell and more than likely, Pascal... Time will tell.


My first question was exactly that: how does ASIC affect gamers who overclock, not people entering a worldwide competition? I just don't see why one would overpay for a GPU with high ASIC, or let himself believe it's a better card. Also, I don't see what the chip model has to do with this; if you say high is better, then high should be better for every chip. I'm dropping this. Hopefully some will look into this before chasing a high-ASIC card.


----------



## Vellinious

Quote:


> Originally Posted by *uberwootage*
> 
> OK, is GPU-Z calculating ASIC correctly? How did they get these numbers? How did they calibrate my card and my motherboard sensors to read it correctly? "Checks for a cal sticker on case."
> 
> ASIC means nothing. Without calibration there is a huge margin for error, which makes ASIC mean nothing. Anyone who has worked with electronics knows that just because software says something does not mean it's correct. I work with high-speed fiber optic receivers, 100+ GHz stuff. The data we get means nothing until we load a correction factor. And even then, if we change a fiber patch cord it throws off everything.
> 
> The only people who can give us a valid ASIC reading is NVIDIA. I could write a program that reads it at over 9000, but it does not mean your GPU is going to go super saiyan. Anyone who thinks whatever ASIC number GPU-Z reports is an accurate reading has a very crude understanding of electronics.


lol...now that's funny


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> My first question was exactly that: how does ASIC affect gamers who overclock, not people entering a worldwide competition? I just don't see why one would overpay for a GPU with high ASIC, or let himself believe it's a better card. Also, I don't see what the chip model has to do with this; if you say high is better, then high should be better for every chip. I'm dropping this. Hopefully some will look into this before chasing a high-ASIC card.


And I answered that question....for gamers, it really doesn't matter. Scroll back a ways.... For overclockers and people that run benchmarks, it matters. lol


----------



## gagac1971

About ASIC readings... of all the cards I've had over the last 10 years, the best overclockers have been the ones with higher ASIC readings.
What more can I say...


----------



## looniam

i'll just leave this here:
Quote:


> Originally Posted by *thechosenwon*
> 
> I should check back here more often, your explanation is not exactly correct.
> 
> Easiest way to explain ASIC and how it relates to you guys is that it is a measurement of the quality of the gpu and how well it can scale at a set baseline voltage.
> Higher asic means it needs less voltage for XXX clock. Lower asic means it will need more voltage for XXX clock.
> Prior to maxwell, *ASIC HAD LESS meaning or I should say was less significant* because you got a lot of voltage scaling out of Kepler. You could take a lower asic 780kpti card for example, give it lots of volts and it can hit the same clocks as a higher asic counterpart running less voltage but clocking higher. On Maxwell, we do not have this luxury of running 1.4v+ on the gpus using air/water cooling
> 
> 
> 
> 
> 
> 
> 
> . This means that ASIC is more relevant because "typically and in most cases" a higher ASIC card will get you more clocks with the lowest possible voltage at the end of the day considering the voltage limits on Maxwell. For the sole reason of not being able to add much voltage on air/water with maxwell, ASIC becomes more relevant this gen, not the other way around.
> If you take ten pieces 80% ASIC and 10 pieces of 70% ASIC, and see how high each one clocks with min def voltage for KP980ti cards of 1.16v under 3d load, you will see the higher ASIC cards clocking the highest. There are always exceptions ofc, so not all cards will fall in line like that. Which brings us to the next part, leakage.
> 
> ASIC does not accurately reflect leakage unfortunately and THIS is the lottery part of the equation and what can cause a high asic card to "underperform" to expectations. *HIGHER ASIC DOES NOT EQUAL LOWER LEAKAGE*, it is the opposite. Higher ASIC has higher leakage PERIOD. Leakage and gpu scalability/headroom/ASIC VALUE scale linearly together. Lowest ASIC cards have lowest leakage, this is one reason why on Kepler you were able to increase the voltage on a low ASIC part so much and get good scaling on air water, because they had lowest leakage. Can a lower asic card clock higher than a higher asic card, YES! It is because of leakage values and all cards are different. The "lottery" is the LEAKAGE. Some high asic gpus have insane leakage numbers, This is THE SOLE reason why a high ASIC card may fall short on air/water. So much nonsense about ASIC on the net since 980KPti launched, what I explained is the real deal. Hope it helps you to understand more.
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> AISC represents how much "leakage" a chip has.
> 
> higher ASIC is lower leakage and won't need as much voltage to get the same clock speed as a chip with a lower ASIC.
> 
> prior to maxwell, ASIC had more meaning but now plays a much smaller role in OCing. your 75% ASIC card will boost slightly higher out of the box (w/default voltage) than a 72% ASIC card. but how much more you get out of it is still the silicon lottery (as usual) and what kind of temps its dealing with - lower=better.
Click to expand...


----------



## lyang238

I just about came in my pants when I got home and realized this was delivered today!

I

**Edit**
More pics installed


----------



## jedimasterben

Quote:


> Originally Posted by *lyang238*
> 
> I just about came in my pants when I got home and realized this was delivered today!
> 
> I


Oh sweet baby jesus, please give a BIOS dump!!


----------



## CannedBullets

I've finally been able to order a GTX 1080 FTW for its MSRP. I'm going to install it as soon as possible before my knee surgery leaves me on crutches.


----------



## Jquala

Quote:


> Originally Posted by *Vellinious*
> 
> A contradiction.
> 
> Look....I personally believe your hypothesis is correct, but it's far from sound and can't be proven until ASIC quality reads correctly from GPUz. Speculation is speculation, no matter how fervently you believe it.


I only believe my prediction because the numbers I've logged are indicating it. A false reading doesn't mean it's absolutely useless, is all I'm saying. Of course it's speculation until GPU-Z 0.9.0 comes out; I never said anything was concrete. The thing I'm having trouble wrapping my head around is the idea that, because GPU-Z does not currently support ASIC quality, the readings from 0.8.7 are obsolete. I'm trying to say maybe they are not, considering the cards with higher readings presently are still the ones clocking higher. I'm really not trying to start any heated arguments. It was something I decided to do in order to help people decide whether or not to get an FE over, say, a midrange AIB card. This was solely done to collect numbers the community can utilize.


----------



## uberwootage

Quote:


> Originally Posted by *Vellinious*
> 
> lol...now that's funny


No, what's funny is you're spouting off about it yet you can't show any facts.

Explain to me how GPU-Z detects the ASIC value of a given GPU. Go ahead, let me know. Because right now the new version shows the reading is not supported. An older version gives me 99.5%, and 100% with another version. So, 2 versions, 2 different results. "Update your software"? Here you go: if I use 0.2.3 I get a core of 5005MHz. OMG, I have the fastest GPU in the world. I mean, GPU-Z says it and it can't be wrong, it only reports the truth. And the new versions don't list the ASIC, so we have to use the old versions, which kept it in even though the reading was very inaccurate.

Here we go guys. Proof that a 100% ASIC, or 99.5%, or whatever it feels like reporting (because that changes, since it's so spot-on accurate), lets you run a GTX 1080 at 5GHz on the core. Disregard the new version, because GPU-Z removed the ASIC reading from it, because NVIDIA did not want anyone to know that they can produce flawless chips, and they also hide the real clocks of the card. That's why the 1080 is so fast. It's 5GHz, GPU-Z told me so.



Also lower ASIC chips clock higher. Ask anyone with LN2.

Now if you dont mind i have to look up an old cpuz so i can see if i can have a -vid on my i7 so i can tell everyone it pulls 5ghz at - volts. Pretty much makes its own power its amazing. Because software is so accurate to measure things. Tomorrow at work im putting all the flukes in the trash. From now on we use software.


----------



## Jquala

Quote:


> Originally Posted by *nexxusty*
> 
> Interesting theory nonetheless.
> 
> I think certain people here don't like or appreciate theories. From what I understand we might have actual ASIC readings in the future, as opposed to none at all. Ever... lol.
> 
> I don't think this was a waste of time. Good job.


Thank you for that. In a forum of overclockers I figured it was worth reporting my findings. If 99 people find it useless, I'm glad it could help direct at least one. The trend between the "false ASIC" readings and the cards' ability to OC was too glaring for me to ignore, despite knowing GPU-Z isn't currently fitted to give us the ASIC scores we're familiar with.
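For anyone who wants to sanity-check this kind of trend themselves, correlating logged ASIC readings against max stable clocks is a one-liner's worth of statistics. A minimal sketch; the numbers below are invented purely for illustration and are not measurements from real cards:

```python
# Does the (possibly bogus) old-GPU-Z ASIC reading track the max stable
# core clock? Pearson's r over a small hand-logged table.
# NOTE: the data here is hypothetical, for demonstration only.
import math

asic = [64.1, 68.3, 71.0, 74.5, 79.2, 83.6]   # ASIC quality, %
clock = [2012, 2025, 2050, 2063, 2088, 2114]  # max stable core, MHz

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson(asic, clock):.3f}")
```

An r near 1.0 would support the trend, though with only a handful of samples it's weak evidence either way.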


----------



## DokoBG

Hey guys do we have a solid custom bios yet for these babies to unlock the power limits and some voltage + remove Boost 3.0 ? I have my Asus ROG Strix OC 1080 and i see that there is a bios from Dancop that doesn't look very appealing at the moment. Is there anything else out ?


----------



## uberwootage

Quote:


> Originally Posted by *CannedBullets*
> 
> I've finally been able to order a GTX 1080 FTW for its msrp. I'm going to install it as soon as possible before my knee surgery leaves me in crutches.


Grats. Let us know your clocks and voltages. I want to see how more of the higher-end offerings stack up to the FEs. It feels like there's an artificial wall in all the BIOSes that keeps them from going higher. Even the Zotac AMP Extreme that JayzTwoCents tested hit a wall hard, and that's a card I didn't expect to. Almost like Nvidia is saying "Hey, new contract: you wanna use these chips, then you have to put this in your BIOS." If that's the case, I really want to see how well they do once we get rid of that. But so far the Strix BIOS that supposedly "gets rid of the TDP" sucks and hits a wall even harder. It displays higher clocks, but I don't think it's actually hitting them, because there's no improvement in fps from 2GHz to 2.2GHz. So who knows what they're doing. I'd really like to see what some of the higher-end cards with more power phases can do, but right now we're not even topping out the stock FE's power.

I mean, the 1080 is fast. Really fast, I love it. But it feels like it's got a weight tied to its leg. It feels like it could be so much faster, but nope, nothing. We just need time to get a proper BIOS and really let the 1080 stretch its legs.

Anyway, let me know how it runs. I was looking at one of them, but Microcenter only had one in stock, that's an hour drive for me, and the way they sell there it would have been gone, so since Best Buy had an FE in stock I just picked up the FE. Regretted it until I got the waterblock on it. Now I love it.


----------



## kx11

The new GPU-Z will tell SLI owners whether they have a high-bandwidth bridge or not.


----------



## uberwootage

Quote:


> Originally Posted by *lyang238*
> 
> I just about came in my pants when I got home and realized this was delivered today!
> 
> I


I second that. If you can get us a BIOS dump, I will smack that rep button lol


----------



## xer0h0ur

They disabled ASIC readings on this version


----------



## lyang238

Quote:


> Originally Posted by *uberwootage*
> 
> I second that. If you can get us a BIOS dump, I will smack that rep button lol


DONE!!!

MSIGTX1080SeahawkXBIOS.zip 147k .zip file


----------



## lapino

Benchmarked my GTX1080 FTW in 3DMark. Is this a score to be expected?


----------



## Works4me

Got these 2 little things hooked up yesterday...





(for some reason the camera makes the white LED strips look purple)
Gonna put them under water later this month. Two blocks do not come cheap, and as I haven't hit any thermal issues or throttling yet, I fail to see the point of doing it right now. When a better BIOS is available, maybe.


----------



## aberrero

With the fan auto off feature on these cards, and how cool they run generally, I think water cooling is really not worth it. I replaced an AIO 290x and it is really nice to not hear the pump running all the time anymore.


----------



## KickAssCop

10595 is a damn good score. I score 9000 and some change on my 980 Tis maxed out. :up:


----------



## Snabeltorsk

Quote:


> Originally Posted by *lapino*
> 
> Benchmarked my GTX1080 FTW in 3DMark. Is this a score to be expected?


Thats a bit low on graphics score, usually ppl get around 24000 points.


----------



## Tideman

Quote:


> Originally Posted by *aberrero*
> 
> With the fan auto off feature on these cards, and how cool they run generally, I think water cooling is really not worth it. I replaced an AIO 290x and it is really nice to not hear the pump running all the time anymore.


Yeah, I'm pleased with the idle temps on my 1080 AMPs. In fan-off mode, they idle in the low 30s.

My top card hits 83C load though while the bottom reaches only 73C. This is on factory clocks/default fan. So can't say I'm happy with those temps, but I guess this is just the nature of non-blower cards? Might have to crank up my case fans.


----------



## pez

That's one reason I've always attempted to avoid directly stacked SLI cards/motherboards. I love my Z97 Pro for that. With my 970s I was seeing a 5-6C temperature delta between the top and bottom card. Should everything go right today with FedEx, I should have my second card today to test for any temperature issues. I'd be very happy to see that same delta carry over with 2 G1s.


----------



## Tideman

Quote:


> Originally Posted by *pez*
> 
> That's one reason I've always attempted to avoid directly stacked SLI cards/motherboards. I love my Z97 Pro for that. With my 970s I was seeing a 5-6C temperature delta between the top and bottom card. Should everything go right today with FedEx, I should have my second card today to test for any temperature issues. I'd be very happy to see that same delta carry over with 2 G1s.


My cards aren't stacked though, I have 2 slots between them.

Good luck with your G1s. Would be interested to know how they hold up, temp wise.


----------



## skline00

Custom water cooling really helps cooling in SLI/CF setups.


----------



## lapino

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Thats a bit low on graphics score, usually ppl get around 24000 points.


Ran again with a freshly booted PC, got around 22,000 for GPU now. Any idea what I could do to improve this? Some BIOS setting for PCIe? Anything else?


----------



## Spiriva

Using the Strix OC BIOS on my two EVGA FEs I saw a gain in MHz (from 2200MHz to 2264MHz); both cards did 2200MHz and then both did 2264MHz. However, as many people have reported, the MHz went up but the performance went down. 3DMark, Valley, and GTA5 all took a performance dive with the Strix BIOS at the higher clocks.

I tested running the cards on the Strix OC BIOS at 2200 but with higher voltage to see what would happen, and sure enough, even at the same MHz as the EVGA FE BIOS, with just more volts added, it performed worse.

Has anyone tested the Seahawk BIOS posted a few pages back? Any gain?


----------



## pez

Quote:


> Originally Posted by *Tideman*
> 
> My cards aren't stacked though, I have 2 slots between them.
> 
> Good luck with your G1s. Would be interested to know how they hold up, temp wise.


Ah, my mistake. I assumed, from how most boards run x16/x16 SLI in slots 1 and 2, that that was the case. Are you seeing those temps in benchmarks or actual gaming? I'm not one that's big on synthetic benchmarks, but I'll give it a shot and see.









----------



## kx11

Galax 1080 HOF review ( german )






this card is huge


----------



## jedimasterben

Quote:


> Originally Posted by *lyang238*
> 
> DONE!!!
> 
> MSIGTX1080SeahawkXBIOS.zip 147k .zip file


yesssssssssssss

Thank you! Will flash it as soon as can!


----------



## KillerBee33

Quote:


> Originally Posted by *lyang238*
> 
> DONE!!!
> 
> MSIGTX1080SeahawkXBIOS.zip 147k .zip file


Hey, got a GPUZ screenshot after flashing ? Thnx


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> Hey, got a GPUZ screenshot after flashing ? Thnx


I will have Afterburner graphs from before and after.


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> I will have Afterburner graphs from before and after.


Thnx. Any idea what changes in this bios?


----------



## achilles73

Quote:


> Originally Posted by *jedimasterben*
> 
> I will have Afterburner graphs from before and after.


Which 1080 card will you flash the MSI 1080 Seahawk BIOS on?
I was thinking of flashing it on my MSI Gaming X, but I think they have the same clocks...
I'll wait to see how it works out for you.
Thanks.


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> Thnx. Any idea what changes in this bios?


From the single review done on it so far, it looks like it has a higher power limit. The sea hawk X is a FE board, so I figured I would give it a shot and test it out.

Quote:


> Originally Posted by *achilles73*
> 
> In wich 1080 card you will flash the msi 1080 seahawk ?
> I was thinking to flash it in my msi gaming x, but i think they have the same clocks...
> Will wait to see how it works with you.
> Thanks.


The sea hawk X is a FE card, so you won't want to flash it on a custom card, you're already bypassing the power limit as it is









I will be flashing it to my Asus FE with an EVGA hybrid on it.


----------



## achilles73

Quote:


> Originally Posted by *jedimasterben*
> 
> From the single review done on it so far, it looks like it has a higher power limit. The sea hawk X is a FE board, so I figured I would give it a shot and test it out.
> The sea hawk X is a FE card, so you won't want to flash it on a custom card, you're already bypassing the power limit as it is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be flashing it to my Asus FE with an EVGA hybrid on it.


Ok, thanks for the info and clarification.


----------



## Spiriva

Quote:


> Originally Posted by *jedimasterben*
> 
> From the single review done on it so far, it looks like it has a higher power limit. The sea hawk X is a FE board, so I figured I would give it a shot and test it out.
> The sea hawk X is a FE card, so you won't want to flash it on a custom card, you're already bypassing the power limit as it is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be flashing it to my Asus FE with an EVGA hybrid on it.


From this review ? http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_sea_hawk_x_review,39.html

You gonna flash it now or later today? Sitting at work and pretty bored, so it would be more fun to read about your findings with this BIOS.


----------



## Peacecamper

Hello,

Short question: did anybody of you who installed the EK waterblock experience a big increase in coil whine with it? I've read three reports about this in German forums so far, and I have the same issue myself. When I had the FE cooler installed I heard no coil whine at all (although it might have been masked by the fan noise). But now that I have the EK cooler on the card, it's freaking me out. Two of the guys who encountered this already switched to different waterblocks and voilà, the coil whine was gone. I ordered a Watercool Heatkiller IV myself, which will arrive next week, and will test this as well.
Somehow the EK cooler seems to intensify the noise the coils produce. Or the other coolers are just that much better at damping it.

By the way, I saw that you can use the EK cooler with the FE backplate. What about the Heatkiller, can I use the original backplate with it as well? Should be possible, right?


----------



## KillerBee33

Quote:


> Originally Posted by *Spiriva*
> 
> From this review ? http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_sea_hawk_x_review,39.html
> 
> You gonna flash it now or later today? Sitting at work and pretty bored, so it would be more fun to read about your findings with this BIOS.


Are these Firestrike scores measured as GPU or overall? They never clarify that.


----------



## Spiriva

Quote:


> Originally Posted by *KillerBee33*
> 
> Are these Firestrike scores measured as GPU or overall? They never clarify that.


I would assume they mean the overall score, especially since they post this picture right under it:








that also says 20789


----------



## jedimasterben

Quote:


> Originally Posted by *Spiriva*
> 
> From this review ? http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_sea_hawk_x_review,39.html
> 
> You gonna flash it now or later today? Sitting at work and pretty bored, so it would be more fun to read about your findings with this BIOS.


Well, I'm at work too, but I left my computer on at home and don't have a full day today, so I'm running three Firestrike runs right now on the stock FE BIOS at +170/+500 and logging with AB, then will flash the BIOS and apply the same overclock and will see what happens, especially with power limit.


----------



## KillerBee33

Quote:


> Originally Posted by *Spiriva*
> 
> I would assume they mean the overall score, especially since they post this picture right under it:
> 
> 
> 
> 
> 
> 
> 
> 
> that also says 20789


Hmm, it doesn't really make much sense then.







We may all have 1080s, but the rest of the components play a big role in an overall score.


----------



## Vellinious

Quote:


> Originally Posted by *KillerBee33*
> 
> Hmm, it doesn't really make much sense then.
> 
> 
> 
> 
> 
> 
> 
> We may all have 1080s, but the rest of the components play a big role in an overall score.


Most reviewers do this....tells you just how much they actually know about the hardware they're reviewing.


----------



## chronicfx

Quote:


> Originally Posted by *Works4me*
> 
> Got these 2 little things hooked up yesterday...
> 
> 
> 
> 
> 
> (for some reason the camera makes the white LED strips look purple)
> Gonna put them under water later this month. Two blocks do not come cheap, and as I haven't hit any thermal issues or throttling yet, I fail to see the point of doing it right now. When a better BIOS is available, maybe.


Nice GPU score, you're hitting about 1000 points more than me at not too different clocks (I'm at 2075 core / 10800 mem). Hmmm, is it that you're on the extreme platform, x16/x16? Or faster RAM? My RAM is 24GB at 3000MHz CAS 15! My graphics score with two FE cards on air is ~10500 with a 2075 clock (but mine boosts lower through most of it, about 2000 on the first test and 2050 on the second). What's your RAM speed? Better put that 5930K under water too! You are cooking it pretty good, hitting 88C with just the physics load. Don't worry, instead of heat I am juicing mine with vcore at 1.5V and 4.9GHz.







Nice rig. I owned the 980 Ti 6G and it was a great card; you will be happy!


----------



## Sourcesys

Hi guys, just got my Palit 1080.

I used MSI Afterburner and OC'ed it to 1987 core clock and 10800 mem clock, and it started showing artifacts when going higher.

It seems like the AB voltage slider does nothing to increase the voltage.

Any tips or advice?

Is this the limit of the card?

Should I send it back?


----------



## jedimasterben

Well, results are so far a bust, guys. The MSI BIOS seems to have the same power limit, during Firestrike I was bumping against it the entire time, same as on the stock BIOS, and score was around 600 points less, though that is probably due to clock speed differences between the FE and the Sea Hawk X, which I tried to account for, but apparently didn't









So looks like we're still waiting for a BIOS tweaker.


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> Well, results are so far a bust, guys. The MSI BIOS seems to have the same power limit, during Firestrike I was bumping against it the entire time, same as on the stock BIOS, and score was around 600 points less, though that is probably due to clock speed differences between the FE and the Sea Hawk X, which I tried to account for, but apparently didn't
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So looks like we're still waiting for a BIOS tweaker.


Uhumm, as expected....+1 for taking one for the Team


----------



## Snabeltorsk

Msi.1080.Seahawk.EK.zip 149k .zip file


This is the MSI Sea Hawk EK version BIOS, if anyone wants to try it for something.








Firestrike Ultra running @ +118 core, +480 mem @ 1.093V.
Max TDP 117%, max temp 41C.
No VRel perf cap at all during the test.


----------



## Joshwaa

EVGA FTW ACX 3.0 coming in Monday. I think I'm having shipper's remorse that I didn't opt for next-day air to get it tomorrow. Talked to EK and they said the water block will be available in August. Is it August yet? Time to retire the 780 Ti with the EK full cover.


----------



## uberwootage

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Msi.1080.Seahawk.EK.zip 149k .zip file


Quote:


> Originally Posted by *jedimasterben*
> 
> Oh sweet baby jesus, please give a BIOS dump!!


Awesome. I'll test it when I get home. Right now I'm running the Corsair Seahawk version at 2155 on the core, so I'm curious to see if this can give any more.


----------



## Works4me

Quote:


> Originally Posted by *chronicfx*
> 
> Nice GPU score, you're hitting about 1000 points more than me at not too different clocks (I'm at 2075 core / 10800 mem). Hmmm, is it that you're on the extreme platform, x16/x16? Or faster RAM? My RAM is 24GB at 3000MHz CAS 15! My graphics score with two FE cards on air is ~10500 with a 2075 clock (but mine boosts lower through most of it, about 2000 on the first test and 2050 on the second). What's your RAM speed? Better put that 5930K under water too! You are cooking it pretty good, hitting 88C with just the physics load. Don't worry, instead of heat I am juicing mine with vcore at 1.5V and 4.9GHz.
> 
> 
> 
> 
> 
> 
> 
> Nice rig. I owned the 980 Ti 6G and it was a great card; you will be happy!


I had this system under water already (GTX 980 Gaming 4G in SLI), but I sold those cards and waited for the 1080s to arrive, so now I need 2 new water blocks.
My RAM is only 2400, and yes, I'm on the extreme platform.


----------



## nexxusty

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Thats a bit low on graphics score, usually ppl get around 24000 points.


Yup. 24,000+ here.


----------



## JonnyBigBoss

One month in... I love my GTX 1080. In hindsight I probably didn't need to go higher than a GTX 1070, but it feels nice knowing that for the first time ever I have something that is high-end. It is so much more quiet and powerful than I hoped for.


----------



## nexxusty

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> One month in... I love my GTX 1080. In hindsight I probably didn't need to go higher than a GTX 1070, but it feels nice knowing that for the first time ever I have something that is high-end. It is so much more quiet and powerful than I hoped for.


Trust me. Going for lower end video cards is not the way. Always buy the fastest single card.

You'll never have thoughts of "what if" because you know you have the best.

I used to buy midrange and over clock my way into high end. They don't make GPU's that way anymore. Last GPU's you could do that with were the 6950/6970's....

I'd say you made the right choice.


----------



## boredgunner

Quote:


> Originally Posted by *nexxusty*
> 
> Trust me. Going for lower end video cards is not the way. Always buy the fastest single card.
> 
> You'll never have thoughts of "what if" because you know you have the best.
> 
> I used to buy midrange and over clock my way into high end. They don't make GPU's that way anymore. Last GPU's you could do that with were the 6950/6970's....
> 
> I'd say you made the right choice.


Agreed 100%. Going with the fastest single card is the most fool-proof option. To hell with dual GPU cards or buying less when you can afford more.


----------



## BrainSplatter

Quote:


> Originally Posted by *boredgunner*
> 
> Agreed 100%. Going with the fastest single card is the most fool-proof option.


But then you have to get another one, because two of the next-tier-down cards are 50% faster in half of the games. At least NVIDIA saved us from getting 3- and 4-way SLI.


----------



## kx11

watch this shaky video of HOF 1080 SLi , does the top GPU cover 2 or 3 PCI slots ??


----------



## arrow0309

Quote:


> Originally Posted by *kx11*
> 
> watch this shaky video of HOF 1080 SLi , does the top GPU cover 2 or 3 PCI slots ??


3 slots, he has a fourth, free one between them


----------



## Jorginto

Is there any bios editing guide? My Asus FE arrived just today and I'd like to give it some mo' juice and increase the power limit.


----------



## graymoon

Hi, I have a problem with my card. I had a 1080p 60Hz monitor and the core clock at idle was 200-300MHz, but today I changed my monitor to 2K 144Hz and the core clock stays at 1200MHz and won't come down. Is there any solution for this?


----------



## boredgunner

Quote:


> Originally Posted by *graymoon*
> 
> Hi, I have a problem with my card. I had a 1080p 60Hz monitor and the core clock at idle was 200-300MHz, but today I changed my monitor to 2K 144Hz and the core clock stays at 1200MHz and won't come down. Is there any solution for this?


What's 2k? Although the problem is probably with 144 Hz. Did you try lowering refresh rate to 120 Hz just to test?


----------



## graymoon

Quote:


> Originally Posted by *boredgunner*
> 
> What's 2k? Although the problem is probably with 144 Hz. Did you try lowering refresh rate to 120 Hz just to test?


BenQ XL2730Z. Yeah, I tried 120Hz and the core clock stayed at 230MHz.


----------



## ChevChelios

Quote:


> Originally Posted by *nexxusty*
> 
> Trust me. Going for lower end video cards is not the way. Always buy the fastest single card.
> 
> You'll never have thoughts of "what if" because you know you have the best.
> 
> I used to buy midrange and over clock my way into high end. They don't make GPU's that way anymore. Last GPU's you could do that with were the 6950/6970's....
> 
> I'd say you made the right choice.


Quote:


> Originally Posted by *boredgunner*
> 
> Agreed 100%. Going with the fastest single card is the most fool-proof option. To hell with dual GPU cards or buying less when you can afford more.


exactly why I ended up getting 1080 over 1070


----------



## kx11

Quote:


> Originally Posted by *arrow0309*
> 
> 3 slots, he has a fourth, free one between them


Hmmm, I see.

So if I got 2, I might not be able to connect them in PCIe slots 1 + 3? Kind of a problem.


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> Awesome. I'll test it when I get home. Right now I'm running the Corsair Seahawk version at 2155 on the core, so I'm curious to see if this can give any more.


Anyone tested this on an FE card yet? Is the MSI EK "Seahawk" an FE card?


----------



## ChaosBlades

Quote:


> Originally Posted by *ssgwright*
> 
> anyone test this on a FE card yet? Is the MSI EK "seahawk" an FE card?


Sea Hawk X is Reference
Sea Hawk EK is Custom


----------



## dentnu

Just in case you guys missed it the 3dmark time spy benchmark came out today.

Here is my top score so far in the new 3dmark time spy benchmark.

http://www.3dmark.com/spy/15962


----------



## pez

Second card arrived today, as well as my bridge. It's somewhat growing on me. The top card is getting pretty warm, but given that I'm wearing headphones a majority of the time and the in-game performance I'm getting is fantastic, I've got no complaints.










EDIT:
Couple install/build pics:


----------



## MerkageTurk

Just purchased my one, will arrive Saturday

Woop Woop


----------



## uberwootage

Running the ref. seahawk bios on a fe with water cooling and the new drivers I'm at 2,175 on the core.


----------



## jedimasterben

Quote:


> Originally Posted by *uberwootage*
> 
> Running the ref. seahawk bios on a fe with water cooling and the new drivers I'm at 2,175 on the core.


What were you getting before?


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> Running the ref. seahawk bios on a fe with water cooling and the new drivers I'm at 2,175 on the core.


can you hook me up with that bios?


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> can you hook me up with that bios?


I tested this on an FE direct from Nvidia.

seahawk.zip 149k .zip file


----------



## gagac1971

Hey guys, nice news here...
For those of us in Europe, EVGA will soon have the Hybrid water cooler kit in stock for 129 euro.
Here is the link: http://eu.evga.com/Products/Product.aspx?pn=400-HY-5188-B1
I can't wait to slap this baby on my Gigabyte FE and see if anything changes in overclocking.
At least temps will be great, and no more 100% fans...


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> I tested this on an FE direct from Nvidia.
> 
> seahawk.zip 149k .zip file


Works fine, but it's no different from my stock BIOS... scoring and overclocking are almost exactly the same.


----------



## nexxusty

Quote:


> Originally Posted by *boredgunner*
> 
> What's 2k? Although the problem is probably with 144 Hz. Did you try lowering refresh rate to 120 Hz just to test?


1440p. 1080p=1k, 1440p=2k, Something=3k, 2160p=4k.

Heh.


----------



## ssgwright

here's my timespy bench results, what are you guys getting? 7,752


----------



## Maintenance Bot

Quote:


> Originally Posted by *ssgwright*
> 
> here's my timespy bench results, what are you guys getting? 7,752


Here is mine, almost identical gpu score to yours. http://www.3dmark.com/spy/19240


----------



## dante`afk

I'm bored and wanted to check out the asus bios.

is this going to brick? never had such an error message?


----------



## ssgwright

Quote:


> Originally Posted by *dante`afk*
> 
> I'm bored and wanted to check out the asus bios.
> 
> is this going to brick? never had such an error message?


No, you get that error when flashing a different vendor's BIOS. You're fine as long as you're not trying to flash a custom card's BIOS onto a reference card, or vice versa.


----------



## dante`afk

Quote:


> Originally Posted by *ssgwright*
> 
> No, you get that error when flashing a different vendor's BIOS. You're fine as long as you're not trying to flash a custom card's BIOS onto a reference card, or vice versa.


Well, the FTW is a custom card, no? I intend to flash the Asus Strix BIOS on my FTW. Do it, or rather nah?


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> Works fine, but it's no different from my stock BIOS... scoring and overclocking are almost exactly the same.


What stock BIOS are you using? There are two BIOSes out for FEs (the Nvidia-made ones): 86.04.17.00.01 and 86.04.11.00.0C. All the branded FEs (Asus, EVGA and whatnot) are using 01, and the Nvidia ones are using a mix of 01 and 0C.


----------



## ssgwright

I'm not sure; I'd try to find out beforehand just in case. I've never bricked a card from flashing, but I have had a card stop responding, and then I had to throw in another card to boot with so I could flash the first one back. Pain in the arse, but it comes with the territory.


----------



## ssgwright

my stock bios is the zotac FE bios.


----------



## dante`afk

hmmm. doesn't want to go through?


----------



## ssgwright

first type "nvflash --protectoff"

then flash the card
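Putting the scattered steps in this thread together, the usual nvflash sequence looks roughly like the sketch below. This is a sketch from memory, not verified against every nvflash build (flag names can differ between versions), and flashing the wrong BIOS can kill a card, so back up first and proceed at your own risk:

```shell
# Back up the current BIOS before touching anything (filename is arbitrary)
nvflash --save backup.rom

# Disable the EEPROM write protection, as mentioned above
nvflash --protectoff

# Flash the new BIOS; -6 overrides the PCI subsystem ID mismatch
# prompt you get when flashing another vendor's BIOS
nvflash -6 seahawk.rom
```

If the flash goes wrong, the backup from the first step plus a second GPU to boot from is what gets you back to a working card.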


----------



## ssgwright

here's my bios version: 86.04.17.00.01


----------



## DADDYDC650

Hi!


----------



## dante`afk

this command opens another window but I cannot do anything in that window, nor in the previous cmd window.


----------



## ssgwright

Did you type nvflash --protectoff?

Make sure to use 2 dashes and no space: --protectoff


----------



## dante`afk

the protectoff command worked after reboot, however when I tried flashing again then the same error appeared.


----------



## ssgwright

Quote:


> Originally Posted by *ssgwright*
> 
> here's my timespy bench results, what are you guys getting? 7,752


I scored 20 points higher with my BIOS: 7,777.

Of course, I got a time error or something...


----------



## Mad Pistol

I know they're not 1080's, but...



... and yes, I need to dust my case.


----------



## Leipatemeibbaa

GTX 1080 SLI with such a low Valley Extreme HD score. Anybody have any idea?

My CPU is a 4930K @ 4.4GHz. I don't think my CPU is bottlenecking the video cards.

One thing I notice is that during benchmarking both cards are at 79% load instead of 100%.


----------



## Vellinious

Quote:


> Originally Posted by *Leipatemeibbaa*
> 
> 
> 
> GTX 1080 SLI with such a low Valley Extreme HD score. Anybody have any idea?
> 
> My CPU is a 4930K @ 4.4GHz. I don't think my CPU is bottlenecking the video cards.
> 
> One thing I notice is that during benchmarking both cards are at 79% load instead of 100%.


Valley is super CPU dependent. The higher the core clock on your CPU, the higher you'll score.


----------



## Benjiw

Quote:


> Originally Posted by *Vellinious*
> 
> Valley is super CPU dependent. The higher the core clock on your CPU, the higher you'll score.


Can confirm.


----------



## xer0h0ur

So do we have any conclusive results on which is the best BIOS to use for overclocking FE cards? Of course I only mean in terms of picking up performance instead of losing performance with higher clocks.

To whomever was asking about coil whine using EK blocks, I have coil whine on my card but its nothing particularly alarming.


----------



## ChaosBlades

Quote:


> Originally Posted by *xer0h0ur*
> 
> So do we have any conclusive results on which is the best BIOS to use for overclocking FE cards? Of course I only mean in terms of picking up performance instead of losing performance with higher clocks.
> 
> To whomever was asking about coil whine using EK blocks, I have coil whine on my card but its nothing particularly alarming.


If you can hit 2.1GHz you're set; that's pretty much the limit for everyone right now. If not, I would wait for an actual custom BIOS instead of playing around with other AIB BIOSes. Generally when doing that, the results end up mediocre at best.


----------



## Ragnarook

Quote:


> Originally Posted by *Leipatemeibbaa*
> 
> 
> 
> GTX 1080 SLI with such a low Valley Extreme HD score. Anybody have any idea?
> 
> My CPU is a 4930K @ 4.4GHz. I don't think my CPU is bottlenecking the video cards.
> 
> One thing I notice is that during benchmarking both cards are at 79% load instead of 100%.




This is mine, CPU at 5GHz, with two 1080s. Make sure you have v-sync turned off, otherwise you will get a lower score.
*GPUs both at 2200MHz (EVGA 1080 FE stock BIOS 86.04.17.00.80 on both), EVGA HB SLI bridge.


----------



## aberrero

The EVGA artifact tester is telling me I get artifacts, but I can't actually see any in game. Any thoughts on if I should trust it?


----------



## xer0h0ur

Quote:


> Originally Posted by *yusky03*
> 
> If you can hit 2.1GHz you're set; that's pretty much the limit for everyone right now. If not, I would wait for an actual custom BIOS instead of playing around with other AIB BIOSes. Generally when doing that, the results end up mediocre at best.


Yeah, 2114MHz here.


----------



## Sheyster

EVGA 1080 FTW #1 arrived from Amazon today. It boosts to 2025 MHz out of the box, and runs pretty damn cool with the stock fan curve. I'll play with the OC a bit tomorrow.


----------



## t1337dude

I'm in the club. My overpriced ($760) MSI Gaming X 8G 1080 came today. I really like the build quality on it compared to the Gigabyte 980 Ti G1 I had. It's prettier and also much quieter! The G1's fans had a bit of a grinding noise in their sound signature, and I was not a fan. This thing is a whisper under load by comparison. Just ran regular Firestrike and got 22,550 in OC mode.

Was it worth the upgrade over the 980 Ti? I'm seeing a 25% increase in my games on average, and I'm satisfied with that. But my Firestrike score seems a bit low. What do you guys think? Any suggestions?


----------



## Sourcesys

It seems my Palit 1080 is voltage locked. Any idea how I can bypass this?


----------



## Setzer

Quote:


> Originally Posted by *nexxusty*
> 
> 1440p. 1080p=1k, 1440p=2k, Something=3k, 2160p=4k.
> 
> Heh.


No.
1080p = very close to actual 2K.
1440p = close to 2.6K.
2160p = very close to actual 4K.

2K = 2048 wide (2 x 1024, where 1024 = 1K), so it's 2048 x 1152.
3K = 3072 wide (3 x 1024), for 3072 x 1728.
4K = 4096 wide (4 x 1024), for 4096 x 2304.
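The widths above all follow the same 1K = 1024 rule, with the heights coming from a 16:9 aspect ratio; a quick sketch of that arithmetic (function name is just for illustration):

```python
def k_resolution(k: int) -> tuple[int, int]:
    """Width is k multiples of 1024 ("1K"); height assumes a 16:9 aspect ratio."""
    width = k * 1024
    height = width * 9 // 16
    return width, height

for k in (2, 3, 4):
    print(f"{k}K -> {k_resolution(k)}")
# 2K -> (2048, 1152)
# 3K -> (3072, 1728)
# 4K -> (4096, 2304)
```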


----------



## CannedBullets

I've never done GPU overclocking since it seems more stressful on the hardware than CPU overclocking, but I want to push my EVGA 1080 FTW to 2000MHz.

Where would I get started? What's a good guide for overclocking Pascal GPUs?


----------



## Ragnarook

Has anyone been running 3DMark Time Spy yet?









I got this result.

4790k @ 5ghz, both 1080 @ 2200mhz (nvidia 368.81 driver)


----------



## Joshwaa

Like I stated earlier, my 1080 FTW should be here Monday, but I have a question to ask. I am running Win 7 Pro. I have loaded up Win 10 a couple of times to try, but my FPS was always lower on Win 10. I am getting jealous seeing people run the Time Spy benchmark and want to join in the fun. Has anyone else noticed Win 10 being slightly slower in FPS? Maybe there was an adjustment I needed to make. The games I play now are DirectX 11, but I am sure I will need DirectX 12 in the future. Thoughts?


----------



## ChevChelios

Quote:


> 1080 @ 2200mhz


how ?

teach me sensei


----------



## Ragnarook

Quote:


> Originally Posted by *ChevChelios*
> 
> how ?
> 
> teach me sensei


110% pure luck







But the difference between 2050/2100/2200MHz isn't very big; in real games it's maybe around 1-3 FPS.


----------



## axiumone

Quote:


> Originally Posted by *Ragnarook*
> 
> Has anyone been running 3DMark Time Spy yet?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got this result.
> 
> 4790k @ 5ghz, both 1080 @ 2200mhz (nvidia 368.81 driver)


Oh dang, that's 1.1k more than what I got. [email protected] and 1080 sli at 2000mhz.

Let's see if the hybrid coolers will let me stabilize my clocks a little more.


----------



## DADDYDC650

These two bad boys will finally meet today.


----------



## DamiNQN

Quote:


> Originally Posted by *DADDYDC650*
> 
> These two bad boys will finally meet today.


I bought the same pair; they haven't arrived yet...
Let me know how it works.


----------



## uberwootage

We need to start a database: card brand, driver version, BIOS version, max stable clocks, and type of cooling. Just one nice big list.


----------



## Antsu

Just got my Gigabyte G1 Gaming 1080 and was wondering which bios is compatible with this card? I was thinking of flashing the sea hawk bios.


----------



## Jpmboy

Quote:


> Originally Posted by *Setzer*
> 
> No.
> 1080p = very close to actual 2K.
> 1440p = would be close to 2.6K
> 2160p = very close to actual 4K
> 
> 2K = 2048 wide (2 x 1024, 1024 = 1K) so it's 2048 x 1152.
> 3K = 3072 wide (3 x 1024) for 3072 x 1728
> 4K = 4096 wide (4 x 1024) for 4096 x 2304.


1600P ~ 3K


----------



## Leipatemeibbaa

Quote:


> Originally Posted by *Ragnarook*
> 
> 
> 
> This is mine, CPU at 5GHz with two 1080s. Make sure you have v-sync turned off, otherwise you will get a lower score.
> *GPUs both at 2200MHz (EVGA 1080 FE stock BIOS 86.04.17.00.80 on both), EVGA HB SLI bridge.


I assume the v-sync you are talking about is in the NVIDIA Control Panel -> Manage 3D Settings? I made a 3DMark profile there; I need to check whether it is off or not.
My Valley benchmark run had both cards at 2100MHz. I was expecting at least 160 FPS for GTX 1080 SLI.


----------



## boredgunner

Quote:


> Originally Posted by *Setzer*
> 
> No.
> 1080p = very close to actual 2K.
> 1440p = would be close to 2.6K
> 2160p = very close to actual 4K
> 
> 2K = 2048 wide (2 x 1024, 1024 = 1K) so it's 2048 x 1152.
> 3K = 3072 wide (3 x 1024) for 3072 x 1728
> 4K = 4096 wide (4 x 1024) for 4096 x 2304.


People just need to stop with the "K" terminology, although I suppose 4K is okay since it almost always refers to 3840 x 2160. That is exactly 4x the pixel count of 1920 x 1080, so mathematically 1920 x 1080 can't be "2K" if 3840 x 2160 is "4K". 2560 x 1440 can't be 2K either, because it isn't half the pixel count of 3840 x 2160.


----------



## Avant Garde

What's the thermal threshold for the EVGA FTW? My fans are spinning at ~1055RPM at idle...


----------



## boredgunner

Quote:


> Originally Posted by *Avant Garde*
> 
> What's the thermal threshold for EVGA FTW ? My fans are spinning @1055rpm in IDLE...


What is your temp limit set to? I think default and max are the same as FE.


----------



## Avant Garde

I thought that during desktop usage and browsing the fans should not spin at all? I've just installed PrecisionX OC, though...


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> We need to start a database: card brand, driver version, BIOS version, max stable clocks, and type of cooling. Just one nice big list.


Great idea.


----------



## Ragnarook

Quote:


> Originally Posted by *Leipatemeibbaa*
> 
> I assume the v-sync you are talking about is in the NVIDIA Control Panel -> Manage 3D Settings? I made a 3DMark profile there; I need to check whether it is off or not.
> My Valley benchmark run had both cards at 2100MHz. I was expecting at least 160 FPS for GTX 1080 SLI.


Yes, add valley.exe there and turn v sync off, so it looks like this picture:


----------



## axiumone

Quote:


> Originally Posted by *Avant Garde*
> 
> What's the thermal threshold for EVGA FTW ? My fans are spinning @1055rpm in IDLE...


What are your idle clocks? Are you running a 144-165hz display?


----------



## Avant Garde

Everything is on AUTO, trust me.







GPU: 253MHz and MEM: 405MHz

Currently I'm on a 60Hz Dell P2414H


----------



## Leipatemeibbaa

Quote:


> Originally Posted by *Ragnarook*
> 
> Yes, add valley.exe there and turn v sync off, so it looks like this picture:


Thank you!

I ran a Firestrike Ultra and got a 10118 score, which seems OK for GTX 1080 SLI; it's just the Valley score that doesn't seem good at all...


----------



## AllGamer

Hey guys, not sure if anyone caught this one yet, but here is more proof that the new HB SLI bridge makes a big difference in some games and a minor one in others.

http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/

Now, if only EK would release their HB SLI bridges, then I'll be set.

All the new EVGA Pro SLI bridges are sold out; don't confuse them with the old EVGA Pro SLI.


----------



## Zurv

weeeee.....



and before anyone says anything... I went 4-way when 3/4-way SLI would still work with the unlock key (in theory, as the keys never came out). Now they block anything but approved benchmarking tools from using anything other than 2-way. Yes, it was too late for me to return the cards, because I had already water-blocked them. Also, yes, a 1080 isn't enough to run games maxed at 4K. That is why I had 4 Titan Xs, and 4-way SLI worked fine for the games I cared about.


----------



## nexxusty

Quote:


> Originally Posted by *Zurv*
> 
> weeeee.....
> 
> 
> 
> (and before anyone says anything... I went 4-way when 3/4-way SLI would still work with the unlock key. Now they block anything but approved benchmarking tools from using anything other than 2-way. Yes, it was too late for me to return the cards, because I had already water-blocked them. Also, yes, a 1080 isn't enough to run games maxed at 4K. That is why I had 4 Titan Xs, and 4-way SLI worked fine for the games I cared about.)


When she works she works, not debating that hehe. This is more "Masturdebating" than anything anyway...


----------



## Zurv

Quote:


> Originally Posted by *AllGamer*
> 
> Hey guys, not sure if anyone caught this one yet, but here is more proof that the new HB SLI bridge makes a big difference in some games and a minor one in others.
> 
> http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/
> 
> Now, if only EK would release their HB SLI bridges, then I'll be set.
> 
> All the new EVGA Pro SLI bridges are sold out; don't confuse them with the old EVGA Pro SLI.


It would have been nice if they had included the LED bridge in the 1080 test. From the NVIDIA docs, the new HB and LED bridges should perform the same unless you are over 5K. The flex bridges are poo.


----------



## axiumone

Quote:


> Originally Posted by *Zurv*
> 
> weeeee.....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> and before anyone says anything... I went 4-way when 3/4-way SLI would still work with the unlock key (in theory, as the keys never came out). Now they block anything but approved benchmarking tools from using anything other than 2-way. Yes, it was too late for me to return the cards, because I had already water-blocked them. Also, yes, a 1080 isn't enough to run games maxed at 4K. That is why I had 4 Titan Xs, and 4-way SLI worked fine for the games I cared about.


LMAO. Great comment!









Edit - Now, you have to do everything in your power to stay near the top of the hall of fame.


----------



## Zurv

Quote:


> Originally Posted by *axiumone*
> 
> LMAO. Great comment!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit - Now, you have to do everything in your power to stay near the top of the hall of fame.


Yeah, I'd think most people would stay away from 4 cards because it is pretty worthless. Do people really buy stuff just to benchmark? (I care about it for gaming.)
In the past I've made it to the top of a list only by doing crazy stuff to my cards. What I benchmarked with was NOT what I would be using every day.







That said, because the GTX 1080 doesn't really OC much, what I benched in 3DMark is what I'm using 100% of the time.

It was NVIDIA marketing that took away 3/4-way SLI from games. Maybe this is bad marketing now







I wish they would just go back to not officially supporting it... but right now they are actively locking 3/4-way out of everything other than approved benchmarking apps.


----------



## nexxusty

Quote:


> Originally Posted by *Zurv*
> 
> Yeah, I'd think most people would stay away from 4 cards because it is pretty worthless. Do people really buy stuff just to benchmark? (I care about it for gaming.)
> In the past I've made it to the top of a list only by doing crazy stuff to my cards. What I benchmarked with was NOT what I would be using every day.
> 
> 
> 
> 
> 
> 
> 
> That said, because the GTX 1080 doesn't really OC much, what I benched in 3DMark is what I'm using 100% of the time.
> 
> It was NVIDIA marketing that took away 3/4-way SLI from games. Maybe this is bad marketing now
> 
> 
> 
> 
> 
> 
> 
> I wish they would just go back to not officially supporting it... but right now they are actively locking 3/4-way out of everything other than approved benchmarking apps.


Have we tried renaming a game's .exe to the same name as 3DMark's?

I can't see them putting any more time into the game/app detection routine. Could be dead wrong though.


----------



## Zurv

Quote:


> Originally Posted by *nexxusty*
> 
> Have we tried renaming the .exe of a game to the same as 3dmark?
> 
> I cannot see them putting any more time into the Game/App detection routine. Could be dead wrong though.


I was thinking of playing around with NVIDIA Inspector. But SLI is picky; a profile for 3DMark works totally differently from the always-needing-to-be-tweaked Witcher 3.









Right now I have 3 games I can't play at the graphics level / 60+ FPS I used to be able to play at when I had my Titan Xs (4-way SLI).


----------



## nexxusty

Quote:


> Originally Posted by *Zurv*
> 
> I was thinking of playing around with NVIDIA Inspector. But SLI is picky; a profile for 3DMark works totally differently from the always-needing-to-be-tweaked Witcher 3.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Right now I have 3 games I can't play at the graphics level / 60+ FPS I used to be able to play at when I had my Titan Xs (4-way SLI).


Try doing the rename and messing with AFR modes, etc... maybe even AA SLi mode if that still exists.


----------



## jtom320

Honestly, I think at the end of the day the two-way limitation for SLI is a great thing. It takes dev resources and focuses them on two-way, which in theory should lend itself to better support and better scaling in general. 3- and 4-way users are such a minuscule group that those resources are really better spent on the much larger, but still minuscule, group of two-way users.

I also don't really believe that two 1080s aren't enough for gaming at 4K, considering the framerates I've seen with one 1080 at 4K. Not a locked 60 in everything, certainly, but it's fast enough to play most things at 60 with settings set just a hair under ultra.

I don't say any of this to piss multi-card owners off, but the benefits for the many are simply a better use of everyone's time.


----------



## Zurv

Quote:


> Originally Posted by *jtom320*
> 
> Honestly, I think at the end of the day the two-way limitation for SLI is a great thing. It takes dev resources and focuses them on two-way, which in theory should lend itself to better support and better scaling in general. 3- and 4-way users are such a minuscule group that those resources are really better spent on the much larger, but still minuscule, group of two-way users.
> 
> I also don't really believe that two 1080s aren't enough for gaming at 4K, considering the framerates I've seen with one 1080 at 4K. Not a locked 60 in everything, certainly, but it's fast enough to play most things at 60 with settings set just a hair under ultra.
> 
> I don't say any of this to piss multi-card owners off, but the benefits for the many are simply a better use of everyone's time.


Sadly, game devs don't spend any time working on SLI. That is why new drivers need to be released.







At most it will be NVIDIA's driver team doing the work... so... that is their job.

That said, if games used the DX12 multi-GPU feature to make multiple GPUs (of the same model) look like one logical video card to a game, that would be the best option. But by the time games start doing that, I'm sure I'll be 2-3 rounds of video cards on. (I assume I'll be on Titan-whatever-the-next-one-is by the end of the year... 2-way 1080 SLI makes me sad.)

Also, "hair under ultra"... not for me. I have very little time to play games, and when I do I want them to look 100% the best they can. That also means getting the Dell OLED screen when it comes out. (I have a 4K LG OLED and games look GREAT!)


----------



## jtom320

Quote:


> Originally Posted by *Zurv*
> 
> Sadly, game devs don't spend any time working on SLI. That is why new drivers need to be released.
> 
> 
> 
> 
> 
> 
> 
> At most it will be NVIDIA's driver team doing the work... so... that is their job.
> 
> That said, if games used the DX12 multi-GPU feature to make multiple GPUs (of the same model) look like one logical video card to a game, that would be the best option. But by the time games start doing that, I'm sure I'll be 2-3 rounds of video cards on. (I assume I'll be on Titan-whatever-the-next-one-is by the end of the year... 2-way 1080 SLI makes me sad.)
> 
> Also, "hair under ultra"... not for me. I have very little time to play games, and when I do I want them to look 100% the best they can. That also means getting the Dell OLED screen when it comes out. (I have a 4K LG OLED and games look GREAT!)


It's not for me either. My point, however, is that two 1080s are going to be fine for any game on the market at 60 FPS / 4K. At least any game I know of.


----------



## MrTOOSHORT

Picked up a GTX 1080 Seahawk EK X today from Memoryexpress:



Can't wait to get it installed, which will be later on today or tonight.


----------



## AllGamer

Quote:


> Originally Posted by *jtom320*
> 
> It's not for me either. My point however is that two 1080s are going to be fine for any game on the market at 60 fps / 4k. At least any game I know of.


On a single monitor, a GTX 1080 is more than enough for anyone, even at stock speeds.

But once you go multi-monitor (3+1) or play 3D-glasses games (NVIDIA 3D Vision), it's just barely enough, averaging 33 FPS.

Having two GTX 1080s improves the situation quite a bit.


----------



## AllGamer

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Picked up a GTX 1080 Seahawk EK X today from Memoryexpress:
> 
> 
> 
> Can't wait to get it installed, which will be later on today or tonight.


Congrats!

The 2 I ordered will not arrive until next month... not enough stock.

There are approx. 30 on back order. I'm surprised Memory Express was able to get their hands on some before the rest of the Canadian retailers.


----------



## Avant Garde

EVGA FTW

Stock :





OC [Afterburner: Power Limit = 120% ; Core Clock = +100 ; Fan Speed = 50%] :


----------



## Zurv

Quote:


> Originally Posted by *jtom320*
> 
> It's not for me either. My point however is that two 1080s are going to be fine for any game on the market at 60 fps / 4k. At least any game I know of.


Witcher 3, The Division, and at points Hitman and TR, with maxed settings, NVIDIA GameWorks et al. (and that is even with lower AA settings). All things I needed 4 Titan Xs for before (borderline... the GameWorks effects 100% need it, and those are still off with "ultra"). Two 1080s is about the same in-game perf as 3 Titan Xs, which is impressive for the power usage and price, but not good enough for me when I play a game.

All will drop below 60 with maxed settings. So... if less than 60 is fine for you, then fine. If less than the best settings is fine for you, fine... but not for me, and I could play with them in the past. I don't want "fine".

This reminds me of when people would say a 980 Ti worked great at 4K... it didn't.

I'm not telling you what you should find acceptable. But I'm not sure how you can tell me that playing at lower settings/perf than I used to have is fine for me.


----------



## DOOOLY

So I went and picked up the cheapest 1080 I could find, the MSI Aero. I am just waiting for my EK water block to come. Anyone else have an MSI Aero?


----------



## Zurv

Quote:


> Originally Posted by *DOOOLY*
> 
> So I went and picked up the cheapest 1080 I could find, the MSI Aero. I am just waiting for my EK water block to come. Anyone else have an MSI Aero?


nice







Does the Aero use the same layout as the FE? i.e., you're not worried about the water block fitting? (Well... I guess you checked the website... so it is kind of a dumb question.)
It is smart to get the cheapest one. All cards seem to perform the same, and the issue would be thermal limiting, which doesn't matter under water.


----------



## DOOOLY

I looked up compatibility before buying the card







No worries here. It's the same layout as the FE. I did notice how thin the PCB on this Aero is; is it like that on the FE editions?


----------



## Jpmboy

Time spy Benchmark !

http://www.overclock.net/t/1606006/3d-mark-time-spy-benchmark-top-30/0_20#post_25351132


----------



## kx11

Is there any guide to OCing using the MSI AB curve panel thing?!


----------



## bfedorov11

Quote:


> Originally Posted by *kx11*
> 
> is there any guide to OC using MSI AB curve panel thing ?!!


http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3.html


----------



## Avant Garde

I've just noticed that my FTW is running its PCI-E 3.0 x16 link at 1.1 speed instead of 3.0.



Firestrike 1.1 result :



EDIT:

Under load it goes to 16 3.0 and here it is, Firestrike 1.1 second run :


----------



## Snabeltorsk

Quote:


> Originally Posted by *Avant Garde*
> 
> I've just noticed that my FTW is running its PCI-E 3.0 x16 link at 1.1 speed instead of 3.0.
> 
> 
> 
> Firestrike 1.1 result :
> 
> 
> 
> EDIT:
> 
> Under load it goes to 16 3.0 and here it is, Firestrike 1.1 second run :


That's normal; it goes to 3.0 under load and 1.1 at idle.
Nothing to be worried about, mine do the same.


----------



## Visceral

Installing my EVGA hybrid cooler on my FE tomorrow. I'm assuming there is no "killer" BIOS I should flash it with? Currently it hits 2.1GHz and generally stays around 2.05.


----------



## uberwootage

Quote:


> Originally Posted by *Visceral*
> 
> Installing my hybrid cooler from evga on my FE tomorrow. I'm assuming there is no "killer" bios I should flash it with? Currently it hits 2.1 and generally stays around 2.05


I'm using a Sea Hawk BIOS. It got me an extra 10MHz, but the FE BIOS is the best-clocking BIOS.


----------



## gagac1971

Gigabyte GTX 1080 FE on Valley...
Quote:


> Originally Posted by *Visceral*
> 
> Installing my hybrid cooler from evga on my FE tomorrow. I'm assuming there is no "killer" bios I should flash it with? Currently it hits 2.1 and generally stays around 2.05


Hey man, tell us about your hybrid kit experience later... and whether anything changes in the overclock due to the lower temps...
I am about to get that kit for my 1080 from EVGA Europe...


----------



## gagac1971

My FE has the same overclock... it starts at 2140MHz and settles at 2068MHz...


----------



## Hydrored

#56 on my third run. I'm happy with my 1080s: http://www.3dmark.com/fs/9348629


----------



## axiumone

Damn. I installed two hybrid coolers tonight. Even with them keeping the temps under 50C, they're not doing much good: 2050MHz drops down to 1987 when the power limit kicks in, in SLI.

I can't wait for a higher-power-limit BIOS to stabilize these clocks.


----------



## ssgwright

I want to lock the clocks... I'm sick of this voltage and clock adjustment crap. I'm not worried about saving power when I play; I want it locked. I miss old-school cards: set a clock and that's what it runs at no matter what, and if it's not stable it artifacts or locks up, and that's it...


----------



## Hydrored

My cards won't do better than 2114 no matter the voltage slider.


----------



## Ascendor81

This dude on /r/nvidia has a BIOS for his Xtreme Waterforce that allows him to reach a 150% power limit. I had him upload it to TechPowerUp, but it is still not showing up.

https://www.reddit.com/r/4syn4l/my_1080_xtreme_waterforce_arrived_today_oc/


----------



## Spiriva

Quote:


> Originally Posted by *jscheema*
> 
> This dude on /r/Nvidia has a bios for his Xtreme Waterforce that allows him to get to 150% power limit. I had him upload it to techpowerup, but it is still not showing up.
> 
> 
> https://www.reddit.com/r/4syn4l/my_1080_xtreme_waterforce_arrived_today_oc/


Would be nice to flash the FE with that BIOS to see if the 150% power limit works.


----------



## aberrero

Quote:


> Originally Posted by *ssgwright*
> 
> i want to lock the clocks... I'm sick of this voltage and clock adjustment crap... I'm not worried about saving power when I play I want it locked...I miss old school cards set a clock and that's what it runs at no matter what, if it's not stable it artifacts or locks up and that's it...


Being able to slow down means it can cool down and therefore boost higher.
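As a rough illustration of that idea (a toy model only, not NVIDIA's actual GPU Boost algorithm): the sustained clock scales with whatever thermal headroom is left under the temperature limit, so a cooler card holds a higher clock.

```python
def boost_clock(base_mhz: int, max_mhz: int, temp_c: float, temp_limit_c: float = 83.0) -> int:
    """Toy model: scale the clock linearly with remaining thermal headroom."""
    headroom = max(0.0, min(1.0, (temp_limit_c - temp_c) / temp_limit_c))
    return int(base_mhz + (max_mhz - base_mhz) * headroom)

# At the 83C limit there is no headroom left, so the card falls back to its
# base clock; a water-cooled card sitting at 45C sustains a much higher boost.
print(boost_clock(1607, 2100, 83.0))
print(boost_clock(1607, 2100, 45.0))
```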


----------



## evelnj

Quote:


> Originally Posted by *jscheema*
> 
> This dude on /r/Nvidia has a bios for his Xtreme Waterforce that allows him to get to 150% power limit. I had him upload it to techpowerup, but it is still not showing up.
> 
> 
> __
> https://www.reddit.com/r/4syn4l/my_1080_xtreme_waterforce_arrived_today_oc/

https://www.techpowerup.com/vgabios/184613/184613

----------



## yenclas

Quote:


> Originally Posted by *evelnj*
> 
> https://www.techpowerup.com/vgabios/184613/184613


Are you sure this BIOS has the power limit at 150%?


----------



## Antsu

Quote:


> Originally Posted by *evelnj*
> 
> https://www.techpowerup.com/vgabios/184613/184613


Someone brave enough to try it?


----------



## ffodi

Quote:


> Originally Posted by *yenclas*
> 
> Are you sure this BIOS has the power limit at 150%?


150% is "just a number"... the important thing is to know the base (100%) power target in watts. E.g., if 100% is 180W, then 150% is 270W; but if 100% is 270W, then the max will be slightly over 400W.









I think this Waterforce BIOS has the same limits as the Xtreme Gaming, which has a default PT of 250W and a max of 375W (yes, that is 150%). The Xtreme Gaming BIOS is already available on TPU...

I did some research on the PT of different cards (checked the different BIOSes on TPU with the help of a hex editor). These are the default PTs (I hope these are the right values and I did not make any mistake):

Asus Strix GTX1080 *198W*
Gainward Phoenix GS GTX1080 *200W*
Gigabyte G1 Gaming GTX1080 *200W*
Gigabyte Xtreme Gaming GTX1080 *250W*
MSI Gaming X GTX1080 *270W*
Palit SuperJetstream GTX1080 *200W*
Zotac AMP Extreme GTX1080 *320W*

+1 Ref, 1080FE *180W*

The MSI Gaming has a max PT of ~300W, the Zotac and the Xtreme Gaming cards have it around 375W, and the rest of the cards have a max PT below 300W.
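The power-limit arithmetic above is just a percentage of the base power target; a minimal sketch (hypothetical function name) of that calculation:

```python
def power_target_watts(base_watts: float, limit_percent: float) -> float:
    """A power-limit slider scales the 100% (base) power target."""
    return base_watts * limit_percent / 100.0

print(power_target_watts(180, 150))  # FE base: 270.0 W
print(power_target_watts(270, 150))  # MSI Gaming X base: 405.0 W, "slightly over 400W"
print(power_target_watts(250, 150))  # Xtreme Gaming base: 375.0 W
```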


----------



## Sourcesys

Suddenly my card goes from 1950MHz to a stable 2050MHz @ 1.093V without me doing anything?!









What's going on here? The driver crashes at 2076MHz though.

Is there any news on a BIOS tweaker? I want to try higher voltages.


----------



## uberwootage

Quote:


> Originally Posted by *Antsu*
> 
> Someone brave enough to try it?


Just woke up. I'll try it. Worst case, I switch to onboard video and blind-flash it back.

Works OK. Going to try for more, but here's the first test:

First test: 2164 on the core, 550 on the memory. TDP hit 89.8. No throttling at all, 47C max temp, using a GTX 980 Ti hybrid cooler in place of the stock cooler.


----------



## Antsu

Quote:


> Originally Posted by *uberwootage*
> 
> Just woke up. I'll try it. Worst case I go to onboard and blind flash it back


Went ahead and flashed the Gigabyte Xtreme Gaming BIOS, which also has the 150% power slider.

Sadly my card doesn't go over 110% in the graph and is still power limited.


----------



## uberwootage

Quote:


> Originally Posted by *Antsu*
> 
> Went ahead and flashed the gigabyte extreme bios, which also has the 150% power slider.
> 
> Sadly my card doesn't go over 110% in the graph and is still power limited.


What card are you running? GPU-Z is not showing mine as power limited now. Doing a run at 2177 right now.

2177 stable.


----------



## Antsu

Quote:


> Originally Posted by *uberwootage*
> 
> What card are you running? GPU-Z is not showing mine as power limited now. Doing a run at 2177 right now.
> 
> 2177 stable.


I worded that badly. Indeed I meant that the BIOS seems to work fine but my card is the problem.

Gigabyte G1 Gaming 1080.


----------



## uberwootage

Quote:


> Originally Posted by *Antsu*
> 
> I worded that badly. Indeed I meant that the BIOS seems to work fine but my card is the problem.
> 
> Gigabyte G1 Gaming 1080.


I'm on an FE. I did not see much gain, but running the stock and Sea Hawk BIOSes I topped out at 2164 and 2177; with this one I top out at 2195.


----------



## fat4l

Is it true that FE cards clock better/higher (in general) than AIB cards?


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> Is it true that FE cards clock better/higher (in general) than AIB cards?


Yeah.

Update on the BIOS: I ran Valley 100% stable twice at 2180; now I can only run it at 2164. But I can play games at 2190 with that BIOS. Did a big 40-man BG with a ton of AoE and it ran fine.

Also, I'm sure it has to do with my Windows install; it's from 3 PCs ago, lol. I'll do a fresh install, but this SSD has had the same Windows on it since I had my i7 4700K, then my i5 4690K, and now this system, lol.

The BIOS runs great water-cooled, no throttling. I did see a slight increase in max stable clocks and a good increase in game-stable clocks, testing in a 40-man BG in WoW at 4K with everything maxed, MSAA x8 and render scale at 200%. The 1.093V max BIOS limit is pretty crappy though; I would have loved for it to be 1.1 or 1.15V.


----------



## versions

Quote:


> Originally Posted by *Sourcesys*
> 
> Suddenly my card goes from 1950MHz to a stable 2050MHz @ 1.093V without me doing anything?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's going on here? The driver crashes at 2076MHz though.
> 
> Is there any news on a BIOS tweaker? I want to try higher voltages.


Funny you say that; my card appears to have become magically stable at 2139MHz, up from 2126MHz, despite not being stable there before. I haven't tested it fully, but it seems good so far. Perhaps it's the new driver?


----------



## boredgunner

Quote:


> Originally Posted by *fat4l*
> 
> Is it true that FE cards clock better/higher (in general) than AIB cards?


On water yes.


----------



## uberwootage

Ok, so after using that Waterforce BIOS, here is what I came up with.

Unlike with the Strix XOC BIOS, overclocking improves results. It's a little behind the scores and frames I get with reference-based BIOSes, but not by much. I think my dirty Windows install is giving me issues; it's an old install that's been used on a few PCs. The Waterforce BIOS ran nice and stable, and I did see an improvement in max overclock. However, the FE and Sea Hawk BIOSes perform better. So far the Waterforce gives the highest clocks that actually improve performance, unlike the Strix XOC: when I tested that from 2GHz to 2.2GHz there was zero FPS increase. To be exact, 0.000.

But the Sea Hawk running at 2151 is slightly faster than the Waterforce running at 2164.

Again, this is what I saw in my testing. It could work better or worse for you.

So I saw a 2.9% drop in average FPS and a 1.2% drop in max FPS (6.55% at worst). Again, not as bad as the Strix XOC BIOS; that thing is trash and had an average 6% drop.

Later today I'll do more testing on a fresh install and do some run averages. I went back to the Sea Hawk BIOS, but I haven't written the Waterforce off yet; I plan on going back to it and spending some time with it to see if there is a chance it will be my daily driver.

Waterforce left, Sea Hawk right.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> Ok, so after using that Waterforce BIOS, here is what I came up with.
> 
> Unlike with the Strix XOC BIOS, overclocking improves results. It's a little behind the scores and frames I get with reference-based BIOSes, but not by much. I think my dirty Windows install is giving me issues; it's an old install that's been used on a few PCs. The Waterforce BIOS ran nice and stable, and I did see an improvement in max overclock. However, the FE and Sea Hawk BIOSes perform better. So far the Waterforce gives the highest clocks that actually improve performance, unlike the Strix XOC: when I tested that from 2GHz to 2.2GHz there was zero FPS increase. To be exact, 0.000.
> 
> But the Sea Hawk running at 2151 is slightly faster than the Waterforce running at 2164.
> 
> Again, this is what I saw in my testing. It could work better or worse for you.


So what do you actually do?
Buy an FE card, use the Strix OC BIOS (despite the fact that it uses a custom PCB??) = profit?


----------



## versions

Quote:


> Originally Posted by *uberwootage*
> 
> Ok, so after using that Waterforce BIOS, here is what I came up with.
> 
> Unlike with the Strix XOC BIOS, overclocking improves results. It's a little behind the scores and frames I get with reference-based BIOSes, but not by much. I think my dirty Windows install is giving me issues; it's an old install that's been used on a few PCs. The Waterforce BIOS ran nice and stable, and I did see an improvement in max overclock. However, the FE and Sea Hawk BIOSes perform better. So far the Waterforce gives the highest clocks that actually improve performance, unlike the Strix XOC: when I tested that from 2GHz to 2.2GHz there was zero FPS increase. To be exact, 0.000.
> 
> But the Sea Hawk running at 2151 is slightly faster than the Waterforce running at 2164.
> 
> Again, this is what I saw in my testing. It could work better or worse for you.


Sea Hawk EK X has a custom PCB and Sea Hawk X has reference, right? Which one are you talking about? If you're talking about the Sea Hawk X with reference PCB, what difference is there to the Founders Edition BIOS?


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> So what do u do actually ?
> Buy an FE card, use strix OC bios(despite the fact it uses custom pcb ?? ) = profit ?


This is what you do.

Buy an FE. Flashing the Strix OC BIOS = a 6% drop in your frame rates, and overclocking provides absolutely no increase in performance. It's the only BIOS that does this; every other BIOS shows some gain from overclocking. The Strix OC showed me absolutely zero.

My testing with a 5 run average:

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/2540#post_25335325

I have a Strix GTX 1080 sitting here, and no matter what, my FE always clocks higher than it, even with non-FE BIOSes and even if I use the same BIOS on the Strix. JayzTwoCents even said his FEs clock higher than all the cards he tested, and that was in his Zotac AMP Extreme review. The thing is, the FEs need to be watercooled.

From everything I've seen, FEs are binned. At first I thought it was just a better BIOS, but after getting this Strix and flashing it with an FE BIOS I saw no improvement in clocks. So this Strix will be going to a friend of mine; I managed to pick one up at Microcenter, and he wanted one, so I told him I would get it if I could do some testing on it first.

As of now, for stock cards I'll take anything but an FE. The heatsink on the FE is nice and works OK if you turn the fan up; I found 65% is where it sits, I can't hear it and it cools pretty well, but it still can't compare to aftermarket heatsinks. If you plan on overclocking and not going water, go with a card that has a better cooler. But if you can watercool it, pick up the EVGA GTX 980 Ti Hybrid kit when it goes for $60 on Amazon, remove the cover over the heatsink, remove the heatsink, and put the water block on. You won't go over 50C even overclocked in a hot room. You really do not need the shroud that's in the 1080 hybrid kits, so save some money and pick up the 980 Ti kit for $60. Just keep the rear fan shroud from the stock heatsink (do not do what GamersNexus did). It installs just like it would normally, tubes out the side and all, and the stock heatsink parts other than the heatsink and heatsink cover fit without any mods.

Quote:


> Originally Posted by *versions*
> 
> Sea Hawk EK X has a custom PCB and Sea Hawk X has reference, right? Which one are you talking about? If you're talking about the Sea Hawk X with reference PCB, what difference is there to the Founders Edition BIOS?


Just the Seahawk X. I got an EK BIOS but I just have not tested it yet. I don't see there being any improvement over the other, and even then, from what I see the Seahawk X BIOS is only better by a hair. Outside of benchmarks you won't see a gain over an FE BIOS. I'll test it when I get a chance.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> This is what you do.
>
> Buy an FE. Flashing the Strix OC BIOS = a 6% drop in your frame rates, and overclocking provides absolutely no increase in performance. It's the only BIOS that does this; every other BIOS shows some gain from overclocking. The Strix OC showed me absolutely zero.
>
> My testing with a 5 run average:
>
> http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/2540#post_25335325
>
> I have a Strix GTX 1080 sitting here, and no matter what, my FE always clocks higher than it, even with non-FE BIOSes and even if I use the same BIOS on the Strix. JayzTwoCents even said his FEs clock higher than all the cards he tested, and that was in his Zotac AMP Extreme review. The thing is, the FEs need to be watercooled.
>
> From everything I've seen, FEs are binned. At first I thought it was just a better BIOS, but after getting this Strix and flashing it with an FE BIOS I saw no improvement in clocks. So this Strix will be going to a friend of mine; I managed to pick one up at Microcenter, and he wanted one, so I told him I would get it if I could do some testing on it first.
>
> As of now, for stock cards I'll take anything but an FE. The heatsink on the FE is nice and works OK if you turn the fan up; I found 65% is where it sits, I can't hear it and it cools pretty well, but it still can't compare to aftermarket heatsinks. If you plan on overclocking and not going water, go with a card that has a better cooler. But if you can watercool it, pick up the EVGA GTX 980 Ti Hybrid kit when it goes for $60 on Amazon, remove the cover over the heatsink, remove the heatsink, and put the water block on. You won't go over 50C even overclocked in a hot room. You really do not need the shroud that's in the 1080 hybrid kits, so save some money and pick up the 980 Ti kit for $60. Just keep the rear fan shroud from the stock heatsink (do not do what GamersNexus did). It installs just like it would normally, tubes out the side and all, and the stock heatsink parts other than the heatsink and heatsink cover fit without any mods.
> Just the Seahawk X. I got an EK BIOS but I just have not tested it yet. I don't see there being any improvement over the other, and even then, from what I see the Seahawk X BIOS is only better by a hair. Outside of benchmarks you won't see a gain over an FE BIOS. I'll test it when I get a chance.


Thanks and +rep.

I will be watercooling the card, as I'm watercooling already.
So, should I just pick ANY FE card, or does it have to be an Nvidia FE?
Will this one be fine?
https://www.overclockers.co.uk/palit-geforce-gtx-1080-founders-edition-8192mb-gddr5x-pci-express-graphics-card-gx-036-pl.html
Thanks


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Picked up a GTX 1080 Seahawk EK X today from Memoryexpress:
> 
> 
> 
> Can't wait to get it installed, which will be later on today or tonight.


Nice choice... that's like an ASUS Poseidon? Or water only?
Quote:


> Originally Posted by *ssgwright*
> 
> i want to lock the clocks... I'm sick of this voltage and clock adjustment crap... I'm not worried about saving power when I play I want it locked...I miss old school cards set a clock and that's what it runs at no matter what, if it's not stable it artifacts or locks up and that's it...


Download the new EVGA PrecisionX; it will ask you for your card's SN. Enable K-Boost and the card runs at P0 always. Or, use NV Inspector to set all P-states to the max stock boost, then adjust clocks for P0. Save it as an NVI profile.


----------



## MrTOOSHORT

@Jpmboy

It's a preapplied EK block on a Gaming X PCB.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> Nice choice... that's like an ASUS Poseidon? Or water only?
> Download the new EVGA PrecisionX; it will ask you for your card's SN. Enable K-Boost and the card runs at P0 always. Or, use NV Inspector to set all P-states to the max stock boost, then adjust clocks for P0. Save it as an NVI profile.


Can it overclock higher than FE?


----------



## ssgwright

can someone hook me up with an EVGA FE bios? I want the full functionality of precision.


----------



## uberwootage

Quote:


> Originally Posted by *versions*
> 
> Sea Hawk EK X has a custom PCB and Sea Hawk X has reference, right? Which one are you talking about? If you're talking about the Sea Hawk X with reference PCB, what difference is there to the Founders Edition BIOS?


Quote:


> Originally Posted by *Gary2015*
> 
> Can it overclock higher than FE?


Everything I've seen shows FEs clock higher when watercooled.

Quote:


> Originally Posted by *ssgwright*
> 
> can someone hook me up with an EVGA FE bios? I want the full functionality of precision.


You will also need a serial number from an EVGA FE to unlock the software.


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> I'm on an FE. Didn't see much gain, but running the stock and Seahawk BIOSes I topped out at 2,164 and 2,177. With this one I top out at 2,195.


Please stoppppp. You're killing me here.

I want to flash a bios so badly. My patience is diminishing.


----------



## ssgwright

can someone hook me up with a serial number!!! please!!!!


----------



## ssgwright

Sweet, I used an old SN from my old 780 and it worked!


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> Everything I've seen shows FEs clock higher when watercooled.
>
> You will also need a serial number from an EVGA FE to unlock the software.


Sooo bro, will I be good with the
PNY Nvidia GeForce GTX 1080 Founders Edition 8GB GDDR5X
or the
Palit GeForce GTX 1080 "Founders Edition" 8192MB GDDR5X PCI-Express Graphics Card
??

This is what we have to buy + waterblock = WIN, YEAH?


----------



## NBAasDOGG

Hi gentlemen,

Finally got my Gigabyte GTX 1080 Xtreme Gaming. Nice, big, cool card, but I'm disappointed with the overclock. My max stable overclock is around 2067-2101MHz using the 100% voltage and 150% power sliders. This card really can do 150% power, but how can I get to 2100-2200? I've noticed that some of you can hit 2150MHz+, but how?
Btw, does flashing BIOSes change fan speeds or other things?


----------



## KillerBee33

Did anyone install EVGA Hybrid Kit for 10Series yet? Any major changes from the previous Kit?


----------



## versions

Quote:


> Originally Posted by *fat4l*
> 
> Sooo bro, will I be good with the
> PNY Nvidia GeForce GTX 1080 Founders Edition 8GB GDDR5X
> or the
> Palit GeForce GTX 1080 "Founders Edition" 8192MB GDDR5X PCI-Express Graphics Card
> ??
>
> This is what we have to buy + waterblock = WIN, YEAH?


I'd buy EVGA because of their official position on warranty for cards that have had their coolers removed. While I've heard that people have gotten away with returning cards to other manufacturers, to my knowledge no one else openly supports it.

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hi gentlemen,
>
> Finally got my Gigabyte GTX 1080 Xtreme Gaming. Nice, big, cool card, but I'm disappointed with the overclock. My max stable overclock is around 2067-2101MHz using the 100% voltage and 150% power sliders. This card really can do 150% power, but how can I get to 2100-2200? I've noticed that some of you can hit 2150MHz+, but how?
> Btw, does flashing BIOSes change fan speeds or other things?


So far it appears that the Founders Edition is the best overclocker, even though I'm pretty sure NVIDIA said they were not binned. This is especially true under water, because the FE cooler isn't the best. Here's a forum post on that:
http://www.overclock.net/t/1604713/cb-gtx-1080-which-partner-card-is-the-best/50#post_25337561


----------



## xer0h0ur

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hi gentlemen,
>
> Finally got my Gigabyte GTX 1080 Xtreme Gaming. Nice, big, cool card, but I'm disappointed with the overclock. My max stable overclock is around 2067-2101MHz using the 100% voltage and 150% power sliders. This card really can do 150% power, but how can I get to 2100-2200? I've noticed that some of you can hit 2150MHz+, but how?
> Btw, does flashing BIOSes change fan speeds or other things?


Just because you may have a BIOS allowing a higher power limit doesn't mean you have a die that will overclock that high. Also, as previously mentioned by someone else, that 150% power limit number is only as good as the base power limit. In other words, it's 150% of what?


----------



## Avant Garde

This is my latest score in Firestrike 1.1 :



In MSI Afterburner: +100 GPU Core & 50% Fan Speed



GPU : *EVGA GTX1080 FTW*


----------



## Sheyster

Quote:


> Originally Posted by *Avant Garde*
> 
> This is my latest score in Firestrike 1.1 :
> GPU : *EVGA GTX1080 FTW*


Gotta love the load temps with the FTW! I'm seeing similar results with mine.


----------



## uberwootage

Quote:


> Originally Posted by *versions*
> 
> I'd buy EVGA because of their official position on warranty for cards that have had their coolers removed. While I've heard that people have gotten away with returning cards to other manufacturers, to my knowledge no one else openly supports it.
> So far it appears that the Founders Edition is the best overclocker, even though I'm pretty sure NVIDIA said they were not binned. This is especially on water because the FE cooler isn't the best. Here's a forum post on that:
> http://www.overclock.net/t/1604713/cb-gtx-1080-which-partner-card-is-the-best/50#post_25337561


Yeah, they are not calling it binning. It's the "premium quality components" line, aka "we're binning, but not going to say it" so companies don't get mad that they are getting the leftovers.

Quote:


> Originally Posted by *nexxusty*
> 
> Please stoppppp. You're killing me here.
> 
> I want to flash a bios so badly. My patience is diminishing.


Just flash it. I flashed back to do some more testing, but running on an Nvidia Founders Edition it's running fine watercooled. If you're on stock air, the higher voltage might just make you thermal throttle more than stock and cause an even bigger dip in performance.

This is not the BIOS we are looking for. We want something with the TDP unlocked and at least 1.15V (for the watercooled guys) so we can really run this card.


----------



## Avant Garde

The best on my GPU so far :





Just to see what this FTW is capable of with 100% Fan









Afterburner : +110 GPU Core


----------



## xer0h0ur

Right now it seems as if water cooling these cards only serves to give you a quiet setup and low operating temperatures. Other than that, there is no reason to water cool. Partially regretting putting an EK block on the 1080 right now.


----------



## ssgwright

what bios has 150% power? can someone hook me up?


----------



## CannedBullets

I love my GTX 1080 FTW; it just came in today. Runs smoothly: Rise of the Tomb Raider at 1920x1080 on ultra settings at 60 FPS.


----------



## uberwootage

Quote:


> Originally Posted by *xer0h0ur*
> 
> Right now it seems as if water cooling these cards is only serving a purpose of having a quiet setup and operating at low temperatures. Other than that there is no reason to water cool. Partially regretting putting an EK block on the 1080 right now.


Yeah, I was going to put an EK block on mine, but it's a waste of $125. The $60 EVGA Hybrid I have works amazingly.

Yeah, it runs cooler, but the big point is it eliminates thermal throttling. When the chosen one comes out, aka a good BIOS, then we will see what these cards can do, and I'll replace my EVGA with an EK.


----------



## MrTOOSHORT

It's just a straight line on the graph, whatever I do with my EK'd 1080.


----------



## MerkageTurk

I am really upset. Look what the courier (MyHermes) did to my Amazon parcel. Why didn't Amazon put it in one of their cardboard boxes?

It's a £619 item; even a £1 item comes with cardboard.

(photos of the damaged parcel were attached here)

The courier's driver attempted to open the package; the seal was broken and the package was torn.


----------



## superkyle1721

Quote:


> Originally Posted by *MerkageTurk*
> 
> I am really upset. Look what the courier (MyHermes) did to my Amazon parcel. Why didn't Amazon put it in one of their cardboard boxes?
>
> It's a £619 item; even a £1 item comes with cardboard.
>
> (photos of the damaged parcel were attached here)
>
> The courier's driver attempted to open the package; the seal was broken and the package was torn.


Not that I condone this if it's unwarranted, but Amazon is really good when you have an issue. If you contact them and show them the photos, they will likely offer a replacement or give you a discount on the purchase. It's worth the 5 minutes it will take to contact them.

Sent from my iPhone using Tapatalk


----------



## MerkageTurk

I did; they said they will contact the courier on my behalf and that I will not face this in the future.


----------



## xer0h0ur

Not even my package and I am triggered by those photos.


----------



## lyang238

Looks like the sweet spot for my 1080 Sea Hawk is 2164MHz. Stays around 40-47C while gaming.

2164.jpg 1153k .jpg file


----------



## uberwootage

Quote:


> Originally Posted by *MerkageTurk*
> 
> I am really upset. Look what the courier (MyHermes) did to my Amazon parcel. Why didn't Amazon put it in one of their cardboard boxes?
>
> It's a £619 item; even a £1 item comes with cardboard.
>
> (photos of the damaged parcel were attached here)
>
> The courier's driver attempted to open the package; the seal was broken and the package was torn.


I would be on the phone with them, telling them they have 5 minutes to fix this problem before I lose my ****. If you don't have Prime, make them give you Prime. Make them send you out a replacement and have them pack it right.


----------



## Naked Snake

So is it safe to run an FE @ 100% fan speed for 8 hours of gaming, or will the fan explode? Noise won't be a problem now that I'm using a wireless headset and I'm far away from the case, and it's my only way to get 2.1GHz without thermal throttling.


----------



## boredgunner

Quote:


> Originally Posted by *Naked Snake*
> 
> So is it safe to run an FE @ 100% fan speed for 8 hours of gaming, or will the fan explode? Noise won't be a problem now that I'm using a wireless headset and I'm far away from the case, and it's my only way to get 2.1GHz without thermal throttling.


Last time I checked, reference NVIDIA cards use reliable fans made by Delta. Might not be Delta anymore but it should be fine.


----------



## aberrero

Quote:


> Originally Posted by *Naked Snake*
> 
> So is it safe to run an FE @ 100% fan speed for 8 hours of gaming, or will the fan explode? Noise won't be a problem now that I'm using a wireless headset and I'm far away from the case, and it's my only way to get 2.1GHz without thermal throttling.


Wireless headset?

Have you considered hitting the card with a 120mm fan mounted next to it? I'm sure it will be fine, but the fan definitely won't last as long.


----------



## Shadowdane

Was that sold by Amazon or a 3rd-party seller?

I had one item I ordered that I hadn't realized was actually from a 3rd party; it was packaged very poorly and got damaged in shipping. Amazon of course let me return it, and I left a scathing review of the 3rd-party seller.

After that I avoid anything that isn't sold by Amazon or at least Fulfilled by Amazon. At least then I know it will be packaged well.


----------



## Naked Snake

Quote:


> Originally Posted by *aberrero*
> 
> Wireless headset?
> 
> Have you considered hitting the card with a 120mm fan mounted next to it? I'm sure it will be fine, but the fan will not last as long for sure.





There is actually a side fan, and I have one inside the case, but the temps are 68C @ 100% fan speed; that's the only way I can use the card @ 2152MHz. I know I should get a water cooler, but that's not going to happen anytime soon.


----------



## uberwootage

Quote:


> Originally Posted by *Naked Snake*
> 
> So is it safe to run an FE @ 100% fan speed for 8 hours of gaming, or will the fan explode? Noise won't be a problem now that I'm using a wireless headset and I'm far away from the case, and it's my only way to get 2.1GHz without thermal throttling.


It will be fine. It's not very common for them to fail, and they are built to run in server environments at 100%. My advice: when you clean your heatsink every few months, put a drop of oil on the bearing every other cleaning. At 100% you will be sucking up more dust.

https://www.amazon.com/EVGA-Hybrid-GeForce-Cooling-400-HY-0996-B1/dp/B00ZQ4PFX2

$67, well worth it. I picked it up when it was $60 and it works amazingly. Simple install, cheap and effective. My loads never hit over 50C, and in games it's mostly around 48C. If you worry about your fan, just get that and save a bunch of money versus the official 1080 kit. PM me if you get it and need help installing or wanna see any pics. It's a direct, no-mod, bolt-in upgrade.


----------



## Naked Snake

Quote:


> Originally Posted by *uberwootage*
> 
> It will be fine. It's not very common for them to fail, and they are built to run in server environments at 100%. My advice: when you clean your heatsink every few months, put a drop of oil on the bearing every other cleaning. At 100% you will be sucking up more dust.
>
> https://www.amazon.com/EVGA-Hybrid-GeForce-Cooling-400-HY-0996-B1/dp/B00ZQ4PFX2
>
> $67, well worth it. I picked it up when it was $60 and it works amazingly. Simple install, cheap and effective. My loads never hit over 50C, and in games it's mostly around 48C. If you worry about your fan, just get that and save a bunch of money versus the official 1080 kit. PM me if you get it and need help installing or wanna see any pics. It's a direct, no-mod, bolt-in upgrade.


Thanks for your input. I really want to do that, but it's not possible anytime soon. My grandma will travel to the USA and come back to my country around December this year, so she might be able to bring me the hybrid kit; until then I'm out of luck.


----------



## nizmoz

Received my EVGA GTX 1080 FTW yesterday! Love it so far!









Some pics compared to my old 970 GTX SSC.


----------



## Avant Garde

On my FTW, white is more like some kind of light blue.


----------



## MerkageTurk

It was sold and despatched by Amazon.

I also have Amazon Prime.

I believe the driver opened the package and wrapped it back up in their own packaging.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> @Jpmboy
> 
> It's a preapplied EK block on a Gaming X PCB.


nice(er)!
Quote:


> Originally Posted by *Gary2015*
> 
> Can it overclock higher than FE?


if the card can, he sure can.








Quote:


> Originally Posted by *ssgwright*
> 
> can someone hook me up with an EVGA FE bios? I want the full functionality of precision.


Quote:


> Originally Posted by *ssgwright*
> 
> Sweet, I used an old SN from my old 780 and it worked!


Now this is good to know! +1


----------



## alawadhi3000

Quote:


> Originally Posted by *Naked Snake*
> 
> Thanks for your input. I really want to do that, but it's not possible anytime soon. My grandma will travel to the USA and come back to my country around December this year, so she might be able to bring me the hybrid kit; until then I'm out of luck.


Amazon can ship that part internationally.


----------



## ssgwright

So far I'm seeing the best performance with the Seahawk EK BIOS... has anyone else been testing these and have a favorite?


----------



## Twinnuke

1080 is in. Running at 2000MHz. Haven't tried to go any further. Temps top out at 75C with FurMark.


----------



## NBAasDOGG

A few questions here.

Is there any BIOS out there which is best for overclocking? And if I flash my Gigabyte GTX 1080 Xtreme Gaming to the FE BIOS or another AIB card's, will it mess up my fan settings from the BIOS as well?


----------



## Naked Snake

Quote:


> Originally Posted by *alawadhi3000*
> 
> Amazon can ship that part internationally.




Hahaha, sadly out of luck like I said; my country sucks.


----------



## Antsu

Quote:


> Originally Posted by *NBAasDOGG*
> 
> A few questions here.
>
> Is there any BIOS out there which is best for overclocking? And if I flash my Gigabyte GTX 1080 Xtreme Gaming to the FE BIOS or another AIB card's, will it mess up my fan settings from the BIOS as well?


Your BIOS has a max power limit far higher than the FE or similar models, and with the power limit being the limiting factor in most cases, you would probably actually lose performance by flashing.


----------



## Twinnuke

Gigabyte GTX G1. Did I get a dud?

Can't even do +100 core / +250 memory. Red dots, artifacting.


----------



## xer0h0ur

With so many different default clock speeds I can't even judge that. What is the resulting actual clock speed observed while benching or gaming?


----------



## Twinnuke

Gaming. Benching crashes almost instantly. Stock clock on the G1 is 1696 base / 1835 boost, and memory is 10010.
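Side note on those memory numbers: the 10010 figure is the effective data rate, not the clock the sensors tab reports. A minimal Python sketch of the conversion, assuming the multipliers tools commonly quote (4x for GDDR5, 8x for GDDR5X):

```python
# Effective memory "clock" vs. the base clock sensors report.
# GDDR5 is commonly quoted at 4x its base clock and GDDR5X at 8x,
# so tools report base_clock * multiplier as the effective rate.

MULTIPLIER = {"GDDR5": 4, "GDDR5X": 8}

def effective_mhz(base, mem_type="GDDR5X"):
    """Effective data rate in MHz from the base clock."""
    return base * MULTIPLIER[mem_type]

def base_mhz(effective, mem_type="GDDR5X"):
    """Base clock in MHz from the effective data rate."""
    return effective / MULTIPLIER[mem_type]

print(base_mhz(10010))         # the G1's 10010 effective is ~1251 MHz base
print(effective_mhz(1251.25))  # and back again to 10010 effective
```

So a +250 "memory" offset in one tool and in another may not be talking about the same number; check which of the two figures your overclocking tool is showing before comparing results.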


----------



## xer0h0ur

Dang so you're saying you can't even reach 2000MHz core clock?


----------



## Twinnuke

Doesn't seem like it. I've put the power limit at 108% and +100% voltage. Still drivers crashing, red dots and such. Terrible.


----------



## xer0h0ur

Give it a try using Afterburner 4.3.0 Beta 4. Just crank the power limit and core voltage then give it a whirl again.


----------



## Twinnuke

The highest I can go is 100 and 108, and it's already at that. Trying MSI Afterburner now; gonna run a bench.


----------



## stocksux

Officially a member of the 1080 club! Asus Strix 1080 OC with an EK waterblock and backplate


----------



## versions

You're not looking at the listed boost frequency in GPU-Z, are you? What it lists there isn't what you're actually getting; look at the sensors tab or use some other program to monitor the actual clock speeds.
Quote:


> Originally Posted by *stocksux*
> 
> Officially a member of the 1080 club! Asus Strix 1080 OC with an EK waterblock and backplate


Please report back with what overclocks you're getting. I haven't seen a lot of overclocks on non-FE cards under water, and I'm interested to see whether it can reach FE frequencies or not.


----------



## stocksux

After playing around with it for about 20 minutes or so, I've come up with 2126MHz core and roughly 11.5k effective memory. I'll pull benchmark results later and post an exact memory clock.


----------



## Twinnuke

Quote:


> Originally Posted by *versions*
> 
> You're not looking at the listed boost frequency in GPU-Z, are you? What it lists there isn't what you're actually getting; look at the sensors tab or use some other program to monitor the actual clock speeds.
> Please report back with what overclocks you're getting. I haven't seen a lot of overclocks on non-FE cards under water, and I'm interested to see whether it can reach FE frequencies or not.


I can barely break 2000 according to AB; hit about 2003. Tried to go higher and it crashed at 2050.


----------



## Raisingx

Is it normal to have buzz/coil whine even with fps locked to 60 or GPU usage above 40%? (I made sure to check where the noise came from, using a water cooling tube as a stethoscope.)

I returned my MSI 1080 Gaming X; what are the odds of getting another one with the same problem?

PSU = EVGA Supernova G2 1300W


----------



## Twinnuke

So I can sit at 2000 with no voltage modifier, but when I push it even 5MHz above, no voltage change will help. Crash, crash, crash after about 25 seconds in FurMark.


----------



## xer0h0ur

Quote:


> Originally Posted by *Twinnuke*
> 
> I can barely break 2000 according to AB; hit about 2003. Tried to go higher and it crashed at 2050.


Yeah that sounds like you struck out on the silicon lottery unless your BIOS is particularly weak in terms of base power limit.


----------



## Twinnuke

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah that sounds like you struck out on the silicon lottery unless your BIOS is particularly weak in terms of base power limit.


Not really sure how I can even check the base power limit, but 108% is the max on the G1 when you try to OC it.


----------



## xer0h0ur

Quote:


> Originally Posted by *ffodi*
> 
> 150% is "just a number"... the important thing is to know the base (100%) power target in watts. E.g., if 100% is 180W, then 150% would be 270W, but if 100% is 270W, then the max will be slightly over 400W.
>
> I think this Waterforce BIOS has the same limits as the Xtreme Gaming, which has a default PT of 250W and a max of 375W (yeah, that is 150%). The Xtreme Gaming BIOS is already available on TPU.
>
> I did some research on the PT of different cards (checked the different BIOSes on TPU with the help of a hex editor). These are the default PTs (hope these are the right values and I did not make any mistake):
>
> Asus Strix GTX1080 *198W*
> Gainward Phoenix GS GTX1080 *200W*
> Gigabyte G1 Gaming GTX1080 *200W*
> Gigabyte Xtreme Gaming GTX1080 *250W*
> MSI Gaming X GTX1080 *270W*
> Palit SuperJetstream GTX1080 *200W*
> Zotac AMP Extreme GTX1080 *320W*
>
> +1 Ref, 1080 FE *180W*
>
> The MSI Gaming has a max PT of ~300W, the Zotac and Xtreme Gaming cards have it around 375W, and the rest of the cards have a max PT below 300W.


Quoted for posterity.

I keep saying power limit; I mean power target.
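To make the quoted point concrete, here is a quick Python sketch of that arithmetic, using the default power targets from the quoted post (the example slider percentages are illustrative, not guaranteed maximums for every BIOS):

```python
# "150% of what?" -- the slider percentage only means something
# relative to the BIOS's base (100%) power target in watts.
# Default PTs below come from the quoted TPU BIOS comparison.

DEFAULT_PT_WATTS = {
    "1080 FE": 180,
    "Asus Strix": 198,
    "Gigabyte G1 Gaming": 200,
    "Gigabyte Xtreme Gaming": 250,
    "MSI Gaming X": 270,
    "Zotac AMP Extreme": 320,
}

def board_power_watts(card, slider_percent):
    """Watts the BIOS will allow at a given power-slider setting."""
    return DEFAULT_PT_WATTS[card] * slider_percent / 100

# The same slider math gives very different ceilings per card:
print(board_power_watts("1080 FE", 120))                 # 216.0 W
print(board_power_watts("Gigabyte Xtreme Gaming", 150))  # 375.0 W
```

So a card showing a "lower" max percentage can still allow far more board power than one showing 150%, which is why comparing slider numbers across BIOSes is meaningless without the base wattage.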


----------



## Twinnuke

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quoted for posterity
> 
> I keep saying power limit, mean to say power target


Is it possible to volt-mod a Giga yet? Maybe flash to an EK BIOS? Not sure if the VRMs could handle that.


----------



## xer0h0ur

I haven't attempted any BIOS flashes yet. I am waiting for the braver people to flesh out which BIOS is the best for FE cards so I can then try it.


----------



## alawadhi3000

Quote:


> Originally Posted by *Naked Snake*
> 
> 
> 
> Hahaha, sadly out of luck like I said; my country sucks.


That's weird; it ships to Bahrain, so I'm not sure why they won't ship to other countries.



You can use BorderLinx or other freight forwarding services then.


----------



## Cool Mike

I was very lucky to grab a Gigabyte xtreme premium pack from Newegg yesterday. I will receive it Monday. Can't wait.


----------



## Pendulum

My EVGA 1080 had some pretty nasty coil whine. I managed to bring it down to an acceptable level after stressing the card for a week.

I wasn't quite sure what was causing it since this is a fresh build...until I moved houses. I suppose my old house had some electrical issues because the coil whine is 95% gone now.


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> so far I'm seeing the best performance with the seahawk EK bios... has anyone else been testing these and have a fav?


Where did you get the Seahawk EK BIOS?

I got some nice results with the Gigabyte Waterforce if you wanna try that out.


----------



## fat4l

Maybe you should go with the 1080 volt mods?

https://xdevs.com/guide/pascal_oc/

It's very easy to remove all power limits...


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> Maybe you should go with the 1080 volt mods?
>
> https://xdevs.com/guide/pascal_oc/
>
> It's very easy to remove all power limits...


I would, but I just don't have any pots, and my soldering iron here sucks. I'll end up ordering the pots and just doing it at work; I've got the caps and resistors there, so I just need the pots.

And my multimeter here "should" be calibrated, but I have never calibrated it. I just do a hillbilly calibration: measure some stuff on my work's calibrated Flukes, write down what I get, go home, and adjust mine to those values. Maybe sometime during the week I'll start on it; it looks pretty simple, I just have to find the time. And BIOS flashing is fun, lol.

Tested a few more BIOSes, and they all suck. So far the FE and the MSI Seahawk, along with the Gigabyte Waterforce, are giving the best results.

Palit GameRock: sucks bad, one of the worst BIOSes I have tested.

HerculeZ Twin X2: about the same as the EVGAs, nothing great.

Asus Strix OC: again, same old, same old.

I'll test more. The top 3 I listed are all great and stable; I'm using the Waterforce as a daily driver now. If anyone else wants me to test some out, just let me know and I'll try to get some tests in and let you know how it is.


----------



## Cornerer

Quote:


> Originally Posted by *uberwootage*
> 
> I'll test more. The top 3 i listed are all great and stable. Using the Waterforce as a daily driver now. If anyone else wants me to test some out just let me know and i'll try to get some test on it and let you know how it is.


May I ask for the Sea Hawk EK X? Pretty sure lots of us are interested in that one since you mentioned the Sea Hawk.


----------



## FattysGoneWild

Quote:


> Originally Posted by *CannedBullets*
> 
> I love my GTX 1080 FTW, it just came in today. Runs smoothly, runs Rise of the Tomb Raider at 1920 x 1080 on ultra settings at 60 FPS


You can do the same at 1440p. DO IT! You are not exploiting the full power of that card @1080p.


----------



## NBAasDOGG

Can someone quickly give me a tutorial how to flash bios on Pascal?
Thanks, can't wait to brick my card
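Since this keeps coming up: the sequence people post for Pascal usually boils down to three nvflash calls — back up, unlock, flash. Here's a hedged sketch as a Python wrapper; the ROM filenames are placeholders, and the `--save` / `--protectoff` / `-6` flags are the commonly cited ones, so verify them against your own nvflash version before doing this for real (and keep that backup somewhere safe).

```python
import shutil
import subprocess

def flash_sequence(new_rom="new.rom", backup="backup.rom", dry_run=True):
    """Sketch of the commonly posted Pascal flash steps (filenames are placeholders)."""
    steps = [
        ["nvflash", "--save", backup],  # 1. back up the stock BIOS first, always
        ["nvflash", "--protectoff"],    # 2. lift the EEPROM write protection
        ["nvflash", "-6", new_rom],     # 3. flash; -6 overrides the subsystem-ID mismatch prompt
    ]
    if dry_run or shutil.which("nvflash") is None:
        # just show the commands instead of running them
        return [" ".join(s) for s in steps]
    for s in steps:
        subprocess.run(s, check=True)
    return [" ".join(s) for s in steps]

print(flash_sequence())
```

Run it with `dry_run=True` first to sanity-check the commands; flashing the wrong ROM or skipping the backup is how cards get bricked.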


----------



## Snabeltorsk

Here is Sea Hawk

Msi.SeaHawk.zip 149k .zip file


----------



## CannedBullets

Quote:


> Originally Posted by *FattysGoneWild*
> 
> You can do the same at 1440p. DO IT! You are not exploiting the full power of that card @1080p.


I wish I had a 1440p monitor but right now I'm on a 23 inch 1920 x 1080 monitor.


----------



## FattysGoneWild

Quote:


> Originally Posted by *CannedBullets*
> 
> I wish I had a 1440p monitor but right now I'm on a 23 inch 1920 x 1080 monitor.


C'mon. You have that nice system and can't swing a 1440p monitor? Why would you buy a 1080 for only 1080p? I did the same myself, but updated to a 1440p 144Hz G-Sync monitor ASAP. It's quite a jump from 1080p and totally worth it. Do that card justice!


----------



## CannedBullets

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Common. You have that nice system and cant swing a 1440p monitor? Why would you buy a 1080 for only 1080p? I done the same myself. But, updated to a 1440p 144Hz G-Synch monitor ASAP. Its quite a jump from 1080p and totally worth it. Do that card justice!


I've had the monitor since 2013 when I started building PCs and it's never really felt urgent to upgrade to a 1440p monitor. Maybe in the next two card generations, when GPUs can run 1440p more easily. Right now 1080p still seems to be the norm so I feel covered.


----------



## H3avyM3tal

Whats a good oc for the memory? I'm doing +250 on my ftw (+50 on the core since its boosting above 2000 on its own)?


----------



## kx11

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Whats a good oc for the memory? I'm doing +250 on my ftw (+50 on the core since its boosting above 2000 on its own)?


go nuts


----------



## Sourcesys

Quote:


> Originally Posted by *kx11*
> 
> go nuts


What does the new voltage slider even do now? It's just % now, right? What does it modify? It seems to have no effect on the voltage whatsoever; the card is still locked at 1.093v.


----------



## emett

Quote:


> Originally Posted by *CannedBullets*
> 
> I've had the monitor since 2013 when I started building PCs and it never really felt urgent for me to upgrade to a 1440 monitor right now. Maybe in the next two card generations when the GPUs can run 1440 more easily. Right now 1080 still seems to be the norm so I feel covered.


Dude!


----------



## Works4me

What's an acceptable curve everyone's using? I have 2 MSI GTX 1080 GAMING X cards and I want to fully utilize them.
I've got the cores on both doing 2060 and the mem at 10864 without using the curve.


----------



## H3avyM3tal

Whats this curve thing?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Whats this curve thing?


This video will help:


----------



## jedimasterben

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Whats a good oc for the memory? I'm doing +250 on my ftw (+50 on the core since its boosting above 2000 on its own)?


Most are only capable of +450-500 before error correction kicks in and performance actually goes down at higher frequencies. Be sure to bench thoroughly to make sure average scores aren't decreasing. The VRAM on my FE is stable at +540 with no artifacting, but performance is several percent lower than at +500, which is where it begins its downward spiral.
Quote:


> Originally Posted by *kx11*
> 
> go nuts


Are you sure you're gaining any extra performance at that clock?
Quote:


> Originally Posted by *Works4me*
> 
> What's an acceptable curve everyone's using ? i have 2 MSI GTX 1080 GAMING X and i want to fully utilize them .
> I got the cores on both doing 2060 and the mem on 10864 without using the curve .


Meh, the curve doesn't seem to work right for me, everything is actually even less stable, voltage and clock speeds still bounce around wildly.
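To make the "bench thoroughly" point concrete: you want to keep the memory offset whose *average score* is highest, not the highest offset that merely looks stable. A minimal sketch — the offsets and scores below are made-up illustrations, not real measurements:

```python
def best_offset(results):
    """Pick the memory offset with the best average benchmark score.

    GDDR5X error correction can make a higher (stable-looking) offset
    score worse, so the top scorer is what matters, not the top clock.
    results: list of (offset_mhz, avg_score) pairs from repeated benches.
    """
    return max(results, key=lambda r: r[1])[0]

runs = [(0, 24100), (250, 24600), (500, 24950), (540, 24700)]  # hypothetical numbers
print(best_offset(runs))  # -> 500, even though +540 ran without artifacts
```

The takeaway matches the post above: past the knee, "stable" and "faster" stop being the same thing.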


----------



## Spiriva

Is it the normal MSI Sea Hawk (not the Sea Hawk EK version) bios that's currently the "top bios" to use for best performance?


----------



## kx11

Never used a custom bios with the 1080. Do they turn off GPU Boost 3.0? What's the real gain in them?!


----------



## Antsu

Quote:


> Originally Posted by *Spiriva*
> 
> Is it the normal MSi Sea Hawk (not Sea Hawk EK version) bios that currently is the "top bios" to use for best preformance ?


On my G1 gaming the normal sea hawk bios did absolutely nothing.


----------



## Sourcesys

Quote:


> Originally Posted by *kx11*
> 
> never used a custom bios with 1080 , do they turn off GPU Boost 3 ? what's the real gain in them ?!!


Mostly power target and voltage unlock


----------



## ssgwright

i even tried k-boost and the clocks will still throttle... no matter how mild an overclock i run


----------



## Gary2015

Do I need a three or four slot HB SLI bridge for the Asus Rampage V Edition 10 in x16 lanes?


----------



## PU skunk

Price gouging is through the roof right now and getting worse.
Anybody know where to get a FE 1080 for retail?


----------



## Gary2015

Quote:


> Originally Posted by *PU skunk*
> 
> Price gouging is through the roof right now and getting worse.
> Anybody know where to get a FE 1080 for retail?


Here...

http://www.geforce.com/hardware/10series/geforce-gtx-1080?ClickID=cfzawp4p4se7ksv7iw7ziefeafvvaiezensn&ClickID=by6d6e6snnsqsvf6vgdqdzksqyyugmfdggyg


----------



## Mad Pistol

Quote:


> Originally Posted by *Gary2015*
> 
> Here...
> 
> http://www.geforce.com/hardware/10series/geforce-gtx-1080?ClickID=cfzawp4p4se7ksv7iw7ziefeafvvaiezensn&ClickID=by6d6e6snnsqsvf6vgdqdzksqyyugmfdggyg


They don't list it either, but they have a 3-year warranty on the card, so you're actually getting a pretty decent value when you order direct from Nvidia.


----------



## uberwootage

Quote:


> Originally Posted by *PU skunk*
> 
> Price gouging is through the roof right now and getting worse.
> Anybody know where to get a FE 1080 for retail?


Quote:


> Originally Posted by *Twinnuke*
> 
> So I can sit at 2000 with No voltage modifier but when I push it even 5 mhz above no voltage change will help. Crash crash crash after about 25 seconds in furmark.


Microcenter. Call them and have them check.


----------



## JaredC01

Just got my rig back together after installing the EK block and backplate for my G1 Gaming 1080. Been playing with Afterburner and Heaven 4.0 for about 45 minutes now...

Temps are stable at 44*C with a room temp of 25*C running Heaven (been looping for about 30 minutes), absolute MAXIMUM overclock is 2126MHz stable. I've backed it off a tad down to 2114MHz and it's running like a dream.



Off to test something other than Heaven...


----------



## KillerBee33

Quote:


> Originally Posted by *JaredC01*
> 
> Just got my rig back together after installing the EK block and backplate for my G1 Gaming 1080. Been playing with Afterburner and Heaven 4.0 for about 45 minutes now...
> 
> Temps are stable at 44*C with a room temp of 25*C running Heaven (been looping for about 30 minutes), absolute MAXIMUM overclock is 2126MHz stable. I've backed it off a tad down to 2114MHz and it's running like a dream.
> Off to test something other than Heaven...


Get it to 57 Degrees and see if anything changes....mine goes smooth to that point then Pwr+VRel throttles it down 50MHz and then some more.


----------



## jedimasterben

Got tired of waiting for a BIOS modding tool and did the hard mod on xDevs.







Just soldered a small wire over resistors RS1, RS2, and RS3, did not add resistors to any capacitors, I just kept it easy.

Before (120% power limit, +170 core, +500 VRAM), Vcore bouncing between 0.981v and 1.031v with clock speed between 1965MHz and 2025MHz.


After (same OC settings as above), Vcore stable at 1.062v and clock speed stable at 2062MHz.
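For anyone wondering why shorting RS1–RS3 lifts the limit: the controller estimates current from the voltage drop across those shunt resistors, so lowering their effective resistance makes it under-read power. A rough back-of-envelope, assuming 5 mΩ shunts (a commonly cited value for these boards, not confirmed for every PCB) and a bridge path of similar resistance:

```python
def parallel(r1, r2):
    """Effective resistance of a shunt with a bridge path in parallel."""
    return r1 * r2 / (r1 + r2)

r_shunt = 0.005   # 5 mOhm stock current-sense shunt (assumed value)
r_bridge = 0.005  # wire/liquid-metal bridge of comparable resistance (assumed)

# how much more real power flows for the same reported reading
scale = r_shunt / parallel(r_shunt, r_bridge)
print(scale)  # -> 2.0, i.e. ~2x the power before the limiter trips
```

A soldered wire is far lower resistance than the shunt itself, which drives the effective reading toward zero — effectively removing the limit entirely, as the before/after numbers in the post suggest.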


----------



## Gary2015

Quote:


> Originally Posted by *Mad Pistol*
> 
> They don't list it either, but they have a 3-year warranty on the card, so you're actually getting a pretty decent value when you order direct from Nvidia.


I thought the retail price for the FE was $699.


----------



## JaredC01

Quote:


> Originally Posted by *KillerBee33*
> 
> Get it to 57 Degrees and see if anything changes....mine goes smooth to that point then Pwr+VRel throttles it down 50MHz and then some more.


I can't get it to 57 degrees, it levels out at 45... Physically won't go any higher than that.


----------



## Cornerer

Quote:


> Originally Posted by *jedimasterben*
> 
> Most are only capable of +450-500 before error correction kicks in and performance actually goes down with higher frequencies. Be sure to bench thoroughly to make sure average scores aren't decreasing. My VRAM on my FE is stable at +540 with no artifacting, but performance is several percent lower than with it at +500, which is where it begins it's downward spiral.


Very true indeed. Same applied to my previous HIS HD 6950 "n" years ago. It was overclocked at 912/1335 core/memory and you could go more than 100MHz higher for both while witnessing the tailing off performance ...


----------



## KillerBee33

Can a few Founders Edition owners confirm this? And also, what brand is that FE?


----------



## Gary2015


Yes. mine says Zotac FE.


----------



## xer0h0ur

Quote:


> Originally Posted by *jedimasterben*
> 
> Got tired of waiting for a BIOS modding tool and did the hard mod on xDevs.
> 
> 
> 
> 
> 
> 
> 
> Just soldered a small wire over resistors RS1, RS2, and RS3, did not add resistors to any capacitors, I just kept it easy.
> 
> Before (120% power limit, +170 core, +500 VRAM), Vcore bouncing between 0.981v and 1.031v with clock speed between 1965MHz and 2025MHz.
> 
> 
> After (same OC settings as above), Vcore stable at 1.062v and clock speed stable at 2062MHz.


Nice results. I wish I would have known about this easy power limit mod when I was installing the EK block on my card. Now I really don't feel like going through the hassle of draining the loop to disconnect it and do the mod.


----------



## fat4l

Quote:


> Originally Posted by *jedimasterben*
> 
> Got tired of waiting for a BIOS modding tool and did the hard mod on xDevs.
> 
> 
> 
> 
> 
> 
> 
> Just soldered a small wire over resistors RS1, RS2, and RS3, did not add resistors to any capacitors, I just kept it easy.
> 
> Before (120% power limit, +170 core, +500 VRAM), Vcore bouncing between 0.981v and 1.031v with clock speed between 1965MHz and 2025MHz.
> 
> 
> After (same OC settings as above), Vcore stable at 1.062v and clock speed stable at 2062MHz.


Nice man! Finally someone brave nuff









Well to be honest, you dont even need to use soldering. The only thing you really need is Coollaboratory Liquid Ultra/pro paste.
Look here :
http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/3/

So, basically what you did is ....short these 3 resistors yeah?


----------



## Spiriva

Quote:


> Originally Posted by *KillerBee33*
> 
> Can few Founder Edition owners confirm this? And also what brand is that FE .


Mine says "EVGA (3842)"


----------



## uberwootage

Quote:


> Originally Posted by *KillerBee33*
> 
> Can few Founder Edition owners confirm this? And also what brand is that FE .


That's the Nvidia-branded bios. The .01 bios is the one used for all the Founders cards; they just change the ID, but the bios is the same.

Every Founders card, including Nvidia's: 86.04.17.00.01

Another bios that was only on "some" Nvidia FE cards: 86.04.11.00.0C


----------



## Spiriva

Quote:


> Originally Posted by *uberwootage*
> 
> Thats the Nvidia branded bios. the 01 bios is the one used for all the founders. They will just change the id. But the bios is the same.
> 
> Every founder including Nvidia. 86.04.17.00.01
> 
> Another bios that was only on Nvidia FE cards "some of them" 86.04.11.00.0C




Evga FE 1080.

Both my cards do 2200mhz, both with this bios


----------



## nexxusty

Quote:


> Originally Posted by *Twinnuke*
> 
> 
> 
> 1080 is in. Running at 2000MHZ. Haven't tried to go any farther. Temps top out at 75 with furmark.


God... STOP USING FURMARK.

Ya hear? EVERYONE. It's not 2006, we're not testing GTX 280's.... Stop using Furmark.


----------



## fat4l

For more hard/volt mods.... See this vid













Or The guide here again: https://xdevs.com/guide/pascal_oc/

Full HD pics:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/images/front_full.jpg
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/images/back_full.jpg

Power limit mod-very easy, use CLP/U. You can remove it easy and RMA if needed!











More info regarding power limits:
http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/3/


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> Can few Founder Edition owners confirm this? And also what brand is that FE .


The one you're looking at is the direct-from-Nvidia model, either from the Nvidia store or at Best Buy.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Nice results. I wish I would have known about this easy power limit mod when I was installing the EK block on my card. Now I really don't feel like going through the hassle of draining the loop to disconnect it and do the mod.


Yeah, that's a ton of work, I wouldn't do that, either. I have an EVGA CLC installed on mine, so removing that took a whopping four screws lol, and then the 9000 tiny screws holding the backplate and heatspreader on








Quote:


> Originally Posted by *fat4l*
> 
> Nice man! Finally someone brave nuff
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well to be honest, you dont even need to use soldering. The only thing you really need is Coollaboratory Liquid Ultra/pro paste.
> Look here :
> http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/3/
> 
> So, basically what you did is ....short these 3 resistors yeah?


Well, not soldering actually requires having CLU laying around, which I do not lol. I actually wanted to completely remove them, but my soldering iron tip isn't small enough and after trying with an air reflow tool for like 10 minutes I just said screw it and just soldered the wire lol.

And yes, it is just the three resistors, RS1, RS2, and RS3


----------



## nexxusty

Quote:


> Originally Posted by *jedimasterben*
> 
> The one you're looking at is the direct from Nvidia model, either from teh Nvidia store or at Best Buy.
> Yeah, that's a ton of work, I wouldn't do that, either. I have an EVGA CLC installed on mine, so removing that took a whopping four screws lol, and then the 9000 tiny screws holding the backplate and heatspreader on
> 
> 
> 
> 
> 
> 
> 
> 
> Well, not soldering actually requires having CLU laying around, which I do not lol. I actually wanted to completely remove them, but my soldering iron tip isn't small enough and after trying with an air reflow tool for like 10 minutes I just said screw it and just soldered the wire lol.
> 
> And yes, it is just the three resistors, RS1, RS2, and RS3


I'll be trying this today.

Off to get a Kraken G10 I guess. I just hope I can fit it on with the FE's stock VRM/RAM cooling solution.

Can anyone with an FE with an AIO tell me if the GPU die is flush with the stock VRM/RAM plate? If not I doubt the AIO will fit within the cutout for the die.

Any takers?


----------



## uberwootage

Quote:


> Originally Posted by *Spiriva*
> 
> 
> 
> Evga FE 1080.
> 
> Both my cards do 2200mhz, both with this bios


00-80 is the SC bios


----------



## PU skunk

Quote:


> Originally Posted by *uberwootage*
> 
> Microcenter. Call them and have them check.


$650 on their store site. I will have to settle for that, thx.


----------



## jedimasterben

Quote:


> Originally Posted by *nexxusty*
> 
> I'll be trying this today.
> 
> Off to get a Kraken G10 I guess. I just hope I can fit it on with the FE's stock VRM/RAM cooling solution.
> 
> Can anyone with an FE with an AIO tell me if the GPU die is flush with the stock VRM/RAM plate? If not I doubt the AIO will fit within the cutout for the die.
> 
> Any takers?


I actually meant to look at that when I had the block off today, but I didn't think about it, sorry!


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> The one you're looking at is the direct from Nvidia model, either from teh Nvidia store or at Best Buy.


I was under the impression Subvendor meant CHIP MANUFACTURER, not bios... Mine is an MSI FE with no MSI info in GPU-Z. Not saying it's a bad thing — I've had a very good and long experience with the NVIDIA subvendor (chip or bios).


----------



## uberwootage

Quote:


> Originally Posted by *PU skunk*
> 
> $650 on their store site. I will have to settle for that, thx.


My store had 10 and they sold them in 4 hours.

Not bad. FE's are the best, you just gotta cool them. I paid $699 for mine and no complaints, 2.2GHz


----------



## Spiriva

Quote:


> Originally Posted by *uberwootage*
> 
> 00-80 is the SC bios


Nice of EVGA to give me the SC bios on my cards, because i got the regular version not the SC


----------



## Inglewood78

Quote:


> Originally Posted by *PU skunk*
> 
> Price gouging is through the roof right now and getting worse.
> Anybody know where to get a FE 1080 for retail?


Are you in the US? FEs are actually quite easy to get at retail. The Nvidia store currently has them in stock, and they come back in stock often at other etailers too.

http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/


----------



## zGunBLADEz

Guys stop using other bioses to try squeeze more mhz out of your cards. In reality your loosing performance...
My zotac FE bios still the best bios so far of all the ones i have tried at the same mhz ...


----------



## nexxusty

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Guys stop using other bioses to try squeeze more mhz out of your cards. In reality your *losing* performance...
> My zotac FE bios still the best bios so far of all the ones i have tried at the same mhz ...


Fixed that for you. ;-)

Seahawk BIOS works fine on FE's. No performance loss AFAIK.


----------



## zGunBLADEz

Quote:


> Originally Posted by *nexxusty*
> 
> Fixed that for you. ;-)
> 
> Seahawk BIOS works fine on FE's. No performance loss AFAIK.


Which one? There's 2


----------



## nexxusty

Quote:


> Originally Posted by *jedimasterben*
> 
> I actually meant to look at that when I had the block off today, but I didn't think about it, sorry!


Np man. I will find out soon enough. Off to get a G10 bracket. Hoping I don't have to hack it apart... lol.


----------



## versions

Quote:


> Originally Posted by *uberwootage*
> 
> 00-80 is the SC bios


It's the regular EVGA FE BIOS. I have the same.


----------



## uberwootage

Quote:


> Originally Posted by *versions*
> 
> It's the regular EVGA FE BIOS. I have the same.


Its the same bios they use on the sc. Just downclocked.

Quote:


> Originally Posted by *zGunBLADEz*
> 
> which one? Theres 2


The bios for the EK I tested sucks. Lost 10MHz and I saw some throttling in WoW. The Corsair version ran much better when I tried it.

So far the best bios I've used is the Waterforce. It beats the stock FE bios ("all of them") on my card


----------



## Neon Lights

Anyone tried using this BIOS from here: http://forum.hwbot.org/showthread.php?t=159025 ? (You just have to drag the OC BIOS onto the shortcut and press y in the command prompt two times)

1080strixXOC.zip 1466k .zip file


Got the tip for it from here:

https://www.reddit.com/r/4siq81/any_news_on_evga_1080_ftw_bios_mods/

I would be grateful for anyone with a FE who tries it! I am just a little afraid because the card does not have two BIOSes.


----------



## SweWiking

Quote:


> Originally Posted by *uberwootage*
> 
> Its the same bios they use on the sc. Just downclocked.
> the bios for the ek i tested sucks. Lost 10mhz and i seen some throttling in wow. The corsair version when i ran it ran much better.
> 
> So far the best bios i used is the Waterforce. beats the stock fe bios "all of them" on my card


Do you have a download link for the waterforce bios ?


----------



## Snabeltorsk

https://www.techpowerup.com/vgabios/184613/184613


----------



## kx11




----------



## ssgwright

Quote:


> Originally Posted by *Neon Lights*
> 
> Anyone tried using this BIOS from here: http://forum.hwbot.org/showthread.php?t=159025 ? (You just have to pull the OC BIOS to the shortcut and click y in the command prompt two times)
> 
> 1080strixXOC.zip 1466k .zip file
> 
> 
> Got the tip for it from here:
> 
> __
> https://www.reddit.com/r/4siq81/any_news_on_evga_1080_ftw_bios_mods/
> 
> I would be grateful for anyone with a FE who tries it! I am just a little afraid because the card does not have two BIOSes.


drag and drop didn't work for me due to the subvendor mismatch, but I did try the bios. It worked ok but it's not the best one for me.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Neon Lights*
> 
> Anyone tried using this BIOS from here: http://forum.hwbot.org/showthread.php?t=159025 ? (You just have to pull the OC BIOS to the shortcut and click y in the command prompt two times)
> 
> 1080strixXOC.zip 1466k .zip file
> 
> 
> Got the tip for it from here:
> 
> __
> https://www.reddit.com/r/4siq81/any_news_on_evga_1080_ftw_bios_mods/
> 
> I would be grateful for anyone with a FE who tries it! I am just a little afraid because the card does not have two BIOSes.


I tried both the Strix and Strix XOC from that thread on a Sea Hawk EK X; both worked the first time. I tried them again today and this time the monitor wouldn't come on, though the pc booted up. Nothing worked to get the monitor going with those bioses.

Had to stick an old card in to flash the 1080 back to the stock bios to get the system up and running again.

Don't think the voltage over 1.1v really helped me when I did test those bioses.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Neon Lights*
> 
> Anyone tried using this BIOS from here: http://forum.hwbot.org/showthread.php?t=159025 ? (You just have to pull the OC BIOS to the shortcut and click y in the command prompt two times)
> 
> 1080strixXOC.zip 1466k .zip file
> 
> 
> Got the tip for it from here:
> 
> __
> https://www.reddit.com/r/4siq81/any_news_on_evga_1080_ftw_bios_mods/
> 
> I would be grateful for anyone with a FE who tries it! I am just a little afraid because the card does not have two BIOSes.


I did. You're wasting your time..


----------



## zGunBLADEz

Quote:


> Originally Posted by *nexxusty*
> 
> Fixed that for you. ;-)
> 
> Seahawk BIOS works fine on FE's. No performance loss AFAIK.


It seems ok, don't like the 105% power limiter tho.
But it's close to that 25k graphics score, so it seems ok for 2101/590+. Still, my Founders bios nets me 25,200


----------



## ssgwright

i want a bios where when you're not gaming it throttles, but when you game (no matter what game) the clocks and volts max out. no darn throttling.


----------



## zGunBLADEz

Quote:


> Originally Posted by *ssgwright*
> 
> i want a bios where when your not gaming it throttles but when you game (no matter what game) the clocks and volts max out. no darn throttling.


You have to do that yourself using the curve. I have mine at 2.1GHz at 1.000V and it doesn't throttle or anything, it stays there. Original Founders bios.. on water btw



Every bios I have tried takes different settings — you have to play with the curve till it sticks...

If it doesn't stick it's running out of power, try lower
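The "play with the curve till it sticks" step is basically flattening the V/F curve: drag one voltage point up to your target clock and level everything to its right, so the card sits at a single voltage/frequency pair instead of bouncing. A sketch of that transformation — the curve points are illustrative, not measured from a real card:

```python
def flatten(curve, lock_mv, lock_mhz):
    """Flatten a V/F curve at (lock_mv, lock_mhz).

    Points below the lock voltage keep their stock clocks; the lock point
    and everything above it are clamped to the target clock, which is what
    pins the card to one voltage/frequency pair.
    curve: list of (millivolts, mhz) pairs, ascending by voltage.
    """
    return [(mv, mhz) if mv < lock_mv else (mv, lock_mhz) for mv, mhz in curve]

stock = [(800, 1700), (900, 1900), (1000, 2000), (1093, 2050)]  # hypothetical points
print(flatten(stock, 1000, 2100))
# -> [(800, 1700), (900, 1900), (1000, 2100), (1093, 2100)]
```

If the card still bounces after flattening, it's hitting the power limit at that pair — which is exactly the "drop the overclock a lil bit" advice.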


----------



## ssgwright

How? No matter what I set it at, it throttles. For example, I love the original Black Ops multiplayer and the card never maxes out with that game... granted I never drop below 144fps, but still...


----------



## jedimasterben

Quote:


> Originally Posted by *zGunBLADEz*
> 
> You have to do that yourself using the curve i have mine a 2.1GHz a 1.00mV it dont throttle or nothing it stays there. original founder bios.. on water btw
> 
> 
> 
> In all the bios i have tried it takes different settings you have to play with the curve till it sticks...
> 
> If it doesnt stick is running out of power try lower


Damn. I wanna see that Afterburner log


----------



## Neon Lights

Sorry for wasting someone's time. I just checked this thread on Hardwareluxx (German forum): http://www.hardwareluxx.de/community/f305/nvidia-geforce-gtx-1080-gp104-sammelthread-faq-1118466-337.html — basically, while you get up to 1.24V VCore and more GPU clock, the actual performance on any card other than a 1080 Strix is lower than with the standard BIOS. So it seems best not to use this BIOS unless you have a 1080 Strix.
The monitor output not working is because the 1080 Strix (as you may know) has two HDMI and two DisplayPort connectors instead of one HDMI and three DisplayPorts, so I think the first DisplayPort does not work, but the others should.

I personally am using a 1080 FE under water with the Shunt mod (put Liquid Metal thermal paste on all three shunt resistors, in order to bridge them) and am getting stable 2126 ("+220" in AB)/5508 ("+500" in AB).


----------



## Antsu

Quote:


> Originally Posted by *ssgwright*
> 
> how no matter what I set it at it throttles, for example I love the original black ops multiplayer and it never maxes out with that game... granted i never drop below 144fps but still...


AFAIK no bios right now allows you to go over the power limit as much as the card would like. At least on my Gigabyte G1 it doesn't matter what bios I run, I am always limited by power. This makes me think not even a bios editor will fix it, and you will indeed need to do a hard mod.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Neon Lights*
> 
> Sorry for wasting someone's time. I just checked on Hardwareluxx (German forum) in this thread: http://www.hardwareluxx.de/community/f305/nvidia-geforce-gtx-1080-gp104-sammelthread-faq-1118466-337.html and what basically is the case is that while you get up to 1.24V VCore and more GPU Clock, the actual performance, if you are not using a 1080 Strix, is actually lower than with the standard BIOS, so it seems to be the best to not use the BIOS unless you have a 1080 Strix.
> The monitor output not working is because of the 1080 Strix (as you may know) having two HDMI and two DisplayPort connectors instead of just one HDMI and three DisplayPorts, so I think the first DisplayPort does not work, but the others should.


I think the DP port is the culprit — I believe I'm using a different one now than I was a couple days ago.

The bios didn't do much for me anyways, but still might check it out again later.

Thanks.


----------



## zGunBLADEz

Quote:


> Originally Posted by *ssgwright*
> 
> how no matter what I set it at it throttles, for example I love the original black ops multiplayer and it never maxes out with that game... granted i never drop below 144fps but still...


Yesterday I did 6 hrs of nonstop Doom gameplay at 4k and the card didn't even flinch with those settings....

Like I said, play with the curve till you get it to stick... If it bounces, drop the overclock a lil bit


----------



## Antsu

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Yesterday i did 6 hrs nonstop doom gameplay at 4k the card didnt even flinch with those settings....
> 
> Like i said play with the curve till you get it to stick... If it bounces drop the overclock a lil bit


Yeah, but you are using 1.00V... Most people here want to use at least the "full" 1.093V, for which your solution unfortunately does not work.


----------



## Antsu

oops...


----------



## zGunBLADEz

Quote:


> Originally Posted by *Antsu*
> 
> Yeah, but you are using 1.00V... Most people here want to use atleast the "full" 1.093V, for which your solution does not work unfortunately.


You're not accomplishing much with more volts — this gpu don't work like that.. The extreme overclockers proved that to us.. The multiple phases and extra power connectors prove it too...

Same as a 980ti, feeding it volts is not going to give you more mhz.. hell, I can do 1500+ with 1.16V XD
You know how much voltage it required for that extra 60mhz I managed to squeeze out of my 980ti?
ALL THE WAY UP TO 1.31V on a 425-watt bios, and it wasn't even 24/7 stable. Bench stable, yes.

Everything we have learned from die shrinks says it will take more and more stupidly ridiculous amounts of voltage that we can't provide, or that the chip won't survive without the appropriate cooling...

and after all pascal is maxwell


----------



## ssgwright

anyone know which bios has the 150% power limit instead of the 120%?
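For reference, that percentage slider is applied to the BIOS's base board-power figure — 180 W is the published GTX 1080 FE spec, while vendor cards ship with different base targets, so the same percentage means different watts on different BIOSes:

```python
def max_board_power(base_watts, limit_pct):
    """The power limit slider/BIOS cap is just a percentage of base board power."""
    return base_watts * limit_pct / 100

print(max_board_power(180, 120))  # -> 216.0 W, the stock FE ceiling
print(max_board_power(180, 150))  # -> 270.0 W with a 150% limit bios
```

That extra ~54 W of headroom is the whole point of cross-flashing — the clocks only stick if the limiter stops clipping them first.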


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ssgwright*
> 
> anyone know which bios has the 150% power limit instead of the 120%?


This one:

*https://www.techpowerup.com/vgabios/184169/gigabyte-gtx1080-8192-160609*

I used it today on my card.


----------



## ssgwright

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> This one:
> 
> *https://www.techpowerup.com/vgabios/184169/gigabyte-gtx1080-8192-160609*
> 
> I used it today on my card.


what do you think of it?


----------



## MrTOOSHORT

Just another bios. I don't know too much about pascal to really say.


----------



## ssgwright

i know... I even enable k-boost and it still throttled on me lol


----------



## Antsu

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Your not accomplishing too much with more volts in this case this gpu dont work like that.. The extreme overclockers prove that to us.. Multiple phases more power connectors prove us that...
> 
> Same as a 980ti feeding it volts are not going to give you more mhz.. hell i can do 1500+ with 1.16mV XD
> you know how much volts it required for that extra 60mhz i managed to squeeze out of my 980ti?
> ALL WAY UP TO 1.31mV 425watts bios and it wasn't even 24/7 stable, bench stable yes
> 
> Everything we have learn in die shrinks it will require more and more and more stupid ridiculous amounts of voltages that we cant provide or the chip will take without the appropriate cooling...
> 
> and after all pascal is maxwell


You are right. I have had similar experiences myself on the 980 Ti and now on the 1080. However, at least in my case I do not throttle lower than 1.00V anyway, so this would net me 0% performance gain at best.

Which was my original point: this doesn't help the people who insist on pushing the last MHz — which is what this conversation was originally about, btw...


----------



## uberwootage

Quote:


> Originally Posted by *zGunBLADEz*
> 
> which one? Theres 2


Quote:


> Originally Posted by *zGunBLADEz*
> 
> You're not accomplishing much with more volts; this GPU doesn't work like that.. The extreme overclockers prove that to us.. Multiple phases and more power connectors prove that to us...
> 
> Same as a 980 Ti, feeding it volts is not going to give you more MHz.. hell, I can do 1500+ with 1.16 V XD
> You know how much voltage it required for that extra 60 MHz I managed to squeeze out of my 980 Ti?
> ALL THE WAY UP TO 1.31 V on a 425 W BIOS, and it wasn't even 24/7 stable; bench stable, yes
> 
> Everything we have learned from die shrinks says it will require more and more stupidly ridiculous amounts of voltage that we can't provide, or that the chip won't take without the appropriate cooling...
> 
> and after all, Pascal is Maxwell


At the end of the day it's the person's card. If someone wants to, they can. You have no say over it unless you're sending out PayPal donations to buy people's GPUs. If someone wants to run more volts, they can. Personally I want to run a solid 1.1 V to my card. Why? Because I want to, and it's not going to damage my card. It's the same reason I have twin turbos on my Camaro. Do I need an 800 hp car when the speed limit is 70? Not at all; I've only even been to the track two times. But I wanted to top out the 6.2L the best I could. So I'll run 1.1 V to it to squeeze out every last MHz I can, because at the end of the day it's my card and I'll do with it as I please.

We can provide up to 1.25 V; it's just a matter of a BIOS. Those with water cooling can cool 1.25 V no problem. So by your logic, why overclock a 980 Ti? It's an old hunk of junk if you've got to overclock it to make it relevant; just put it in the trash where it belongs, because there is absolutely no point in attempting to get every last drop of performance out of your card.

At the end of the day it's not your card, it's someone else's. Anyone who's been overclocking the 1080s and working with them knows what to expect, and if they want to crank up the volts, it's their card; they can.


----------



## Visceral

1080 Hybrid update.

I installed the EVGA hybrid kit on my 1080FE.

The bad:
The install instructions for it *suck*. They suck, suck, suck. They skip a step or two you'll have to guess at, and the worst part was that I followed the instructions through every stage, including removal of some of the smallest screws I've ever seen, got to the end of the guide, and started looking for the part of the guide that tells you which screws go back where, etc.

There isn't one.

If you take the screws out, and there are a ton, write down where they went. The ****ty instructions just STOP and don't even bother to address it.
Also, the screws for the crappy plastic back plate are ******* impossible to get back in. I ended up leaving the plastic backplate OFF.

The good:
Wow. Holy **** cool. At max load I haven't seen it go over 42C. It reminds me of my beloved Fury Xs.
The overclocking: I haven't tried to increase my OC yet, but I'm pretty sure I can. The OC is now ROCK SOLID. It never varies like it did before. It hits 2126 and just stays there. I'll try pushing it higher.

TLDR: ****TY instructions. Great cooler.


----------



## PU skunk

Quote:


> Originally Posted by *uberwootage*
> 
> Not bad; FEs are the best, you just gotta cool them. I paid $699 for mine and no complaints, 2.2 GHz


So I hear. I plan to get an aftermarket cooler for sure. Last I checked there weren't any yet.


----------



## PU skunk

.
Quote:


> Originally Posted by *Inglewood78*
> 
> Are you in the US? FEs are actually quite easy to get at retail. The Nvidia store currently has them in stock, and they come into stock often at other etailers too.
> 
> http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/


You're right; I was thinking of the $600 standard edition Nvidia was supposed to offer.


----------



## ssgwright

Quote:


> Originally Posted by *PU skunk*
> 
> So I hear. I plan to get an aftermarket cooler for sure. Last I checked there weren't any yet.


Quote:


> Originally Posted by *uberwootage*
> 
> My store had 10, they sold them in 4 hours.
> 
> Not bad; FEs are the best, you just gotta cool them. I paid $699 for mine and no complaints, 2.2 GHz


2.2 huh... on air? Obviously... let's see some benches...


----------



## Rhadamanthis

@visceral

Quote:


> Originally Posted by *Visceral*
> 
> 1080 Hybrid update.
> 
> I installed the EVGA hybrid kit on my 1080FE.
> 
> The bad:
> The install instructions for it *suck* They suck, suck suck. They miss a step or two you'll have to guess and the worst part was I followed the instructions for every stage, including removal of some of the smallest screws I've ever seen, got to the end of the guide and started looking for the portion of the guide that tells you which screws go back where, etc.
> 
> There isn't one.
> 
> If you take the screws out, and there are ton, write down where they went. The ****ty instructions just STOP and don't even bother to address it.
> Also, the screws for the crappy plastic back plate are ******* impossible to get back in. I ended up leaving the plastic backplate OFF.
> 
> The good:
> Wow. Holy **** cool. At max load I haven't seen it go over 42c. It reminds me of my beloved Fury X's.
> The overclocking: Haven't tried to increase my oc yet but I'm pretty sure I can. The OC is now ROCK SOLID. It never varies like it did before. It hits 2126 and just stays there. I'll try getting it higher.
> 
> TLDR: ****TY instructions. Great cooler.


Since you have mounted the official GTX 1080 Hybrid kit on a 1080, could you write detailed instructions?


----------



## Clockster

So I'm pulling the trigger on an MSI GTX 1080 SEA HAWK X today.
I've been fairly underwhelmed by the non-reference 1080 cards I've owned, including the "almighty" Gigabyte GTX 1080 Xtreme Gaming.
Hopefully the Sea Hawk does the job... better lol


----------



## Derko1

Quote:


> Originally Posted by *Clockster*
> 
> So I'm pulling the trigger on a MSI GTX 1080 SEA HAWK X today.
> I've been fairly underwhelmed by the non ref 1080 cards I've owned, including the "almighty" Gigabyte GTX1080 Xtreme gaming.
> Hopefully the Sea hawk does the job...better lol


I just got this same card last week, and while my temps are super low (38-42C under load and 29-30C idle), I'm not able to get anything above 2.1 stable. I think I hit my max at 2088 core and 5547 on mem. I have gotten the voltage as high as 1.093 to do 2.1 GHz, but my score drops by about 400 points in 3DMark. So it seems like I've hit a wall at 2088.

What have you been able to get with the other cards you've had? I'm a little disappointed honestly, but I love the temps and will probably get a second one later down the road.

My 3DMark scores with those clocks...

Fire Strike 18,245: http://www.3dmark.com/fs/9387713
Fire Strike Extreme 10,579: http://www.3dmark.com/fs/9387741
Fire Strike Ultra 5,704: http://www.3dmark.com/fs/9387762


----------



## zGunBLADEz

Quote:


> Originally Posted by *uberwootage*
> 
> At the end of the day it's the person's card. If someone wants to, they can. You have no say over it unless you're sending out PayPal donations to buy people's GPUs. If someone wants to run more volts, they can. Personally I want to run a solid 1.1 V to my card. Why? Because I want to, and it's not going to damage my card. So I'll run 1.1 V to it to squeeze out every last MHz I can, because at the end of the day it's my card and I'll do with it as I please.
> 
> We can provide up to 1.25 V; it's just a matter of a BIOS. Those with water cooling can cool 1.25 V no problem. So by your logic, why overclock a 980 Ti? It's an old hunk of junk if you've got to overclock it to make it relevant; just put it in the trash where it belongs, because there is absolutely no point in attempting to get every last drop of performance out of your card.
> 
> At the end of the day it's not your card, it's someone else's. Anyone who's been overclocking the 1080s and working with them knows what to expect, and if they want to crank up the volts, it's their card; they can.


Took out the irrelevant car analogy, to begin with.

OK, let me first start by saying the 980 Ti is a better card than the 1080... especially in the overclocking department.

Who's saying or mandating here how to run their cards??
If you want to feed the card 1.5 V, go ahead, be my guest. It isn't going to make a difference..

To keep it simple, I'm stating the obvious: Pascal is like Maxwell, end of story...
More power and more phases don't do anything for the card; not even a fixed BIOS like the XOC, which doesn't report power, does anything for it..

But if you want to feed it more volts and try other BIOSes that decrease performance or don't give you more juice, well, that's on you..
I'm reporting MY FINDINGS for the new fellow OCers out there, so they don't come and say "hey guys, I'm running an uber 2.2 GHz+ but my performance is lower, what's going on"....

But like you said, if you sleep better at night feeding it more volts, hey, be my guest XD


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Took out the irrelevant car analogy, to begin with.
> 
> OK, let me first start by saying the 980 Ti is a better card than the 1080... especially in the overclocking department.
> 
> Who's saying or mandating here how to run their cards??
> If you want to feed the card 1.5 V, go ahead, be my guest. It isn't going to make a difference..
> 
> To keep it simple, I'm stating the obvious: Pascal is like Maxwell, end of story...
> More power and more phases don't do anything for the card; not even a fixed BIOS like the XOC, which doesn't report power, does anything for it..
> 
> But if you want to feed it more volts and try other BIOSes that decrease performance or don't give you more juice, well, that's on you..
> But like you said, if you sleep better at night feeding it more volts, hey, be my guest XD


Can't speak for the 980 Ti owners, but 980s were better-than-GREAT overclockers; 1127 MHz @ 1.212 V to 1544 MHz @ 1.250 V is a huge difference in performance, even going by Firestrike GPU scores: 13,000 to 16,800. EVGA's KINGPIN team stated that the 980 Ti overclocks better than the 1080, but the 1080 has much better performance regardless of the 980 Ti's overclocking abilities.
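Just to put numbers on that: the clock gain in my case was a bit larger than the benchmark gain, as a quick check shows (only the figures from this post, nothing else assumed):

```python
# Compare the 980's clock gain with its Firestrike GPU-score gain,
# using the numbers quoted above: 1127 -> 1544 MHz, 13,000 -> 16,800 score.
stock_mhz, oc_mhz = 1127, 1544
stock_score, oc_score = 13000, 16800

clock_gain = (oc_mhz / stock_mhz - 1) * 100      # percent clock increase
score_gain = (oc_score / stock_score - 1) * 100  # percent score increase

print(f"clock +{clock_gain:.1f}%, score +{score_gain:.1f}%")
```

So a ~37% clock bump bought a ~29% score bump here; scaling slightly below linear, which is about what you'd expect once memory bandwidth and other limits come into play.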


----------



## pez

Quote:


> Originally Posted by *nexxusty*
> 
> God... STOP USING FURMARK.
> 
> Ya hear? EVERYONE. It's not 2006, we're not testing GTX 280's.... Stop using Furmark.


While I don't agree with you all the time, I do now. Please. For the love of the GPUs.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> This one:
> 
> *https://www.techpowerup.com/vgabios/184169/gigabyte-gtx1080-8192-160609*
> 
> I used it today on my card.


Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Just another bios. I don't know too much about pascal to really say.


Did you notice if the power limit was tripping less with this BIOS?


----------



## GraveDigger7878

So even under water, it sounds like most people are not getting over 2.1 GHz. I guess that is good given the stated base clock of the FE card. It looks like my Titan is kicking the bucket, so I will have to get a 1080. I wonder if the EVGA Classified will bring anything more to the table.


----------



## KillerBee33

Quote:


> Originally Posted by *GraveDigger7878*
> 
> So even under water it sounds like most people are not getting over 2.1ghz. I guess that is good given the stated base clock of the FE card. It looks like my Titan is kicking the bucket so I will have to get a 1080. I wonder if the EVGA classified will bring anything more to the table


2200 will most likely be the top gaming OC with a BIOS mod, which is roughly 35% over stock; that's the same general number the 9 series gets.
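For reference, that ballpark checks out against the rated clocks, assuming "stock" means the GTX 1080 FE base clock of 1607 MHz (my reading; measured against the 1733 MHz boost spec the gain would be smaller):

```python
# Sanity check of the "~35% over stock" estimate for a 2200 MHz gaming OC,
# taking "stock" as the GTX 1080 FE base clock of 1607 MHz.
base_mhz = 1607
oc_mhz = 2200

gain = (oc_mhz / base_mhz - 1) * 100  # percent over base clock
print(f"{gain:.0f}% over base clock")
```

That lands at about 37% over base, so the ~35% figure is in the right neighborhood.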


----------



## GraveDigger7878

Thanks killerbee, that is exactly what I was looking for


----------



## yusuket520

I got a Gigabyte G1 with a waterblock. With the EVGA BIOS I got 2164-2177 stable with the power limit at 120%, whereas the G1 BIOS only allows 108% power limit.


----------



## uberwootage

Quote:


> Originally Posted by *GraveDigger7878*
> 
> So even under water it sounds like most people are not getting over 2.1ghz. I guess that is good given the stated base clock of the FE card. It looks like my Titan is kicking the bucket so I will have to get a 1080. I wonder if the EVGA classified will bring anything more to the table


It won't bring anything. At most it might be on par with what the FEs are pulling with better cooling, but it looks like the GPUs are binned and Nvidia is keeping most of them for their own cards. I would like to see if the Classified brings more voltage than the 1.093; if so, it should help out. But I think the FEs are binned. Everything I've seen, they are pulling higher clocks in general, and volt for volt every one that I've seen is clocking better. I had a Strix here I sold to a friend, and the FE was faster.

Which brings me to this: I was about to get an EK block, but it's a total waste. I have an EVGA Hybrid cooler I paid $60 for; load temps overclocked and looping 3DMark don't hit over 50C. The EK should perform better, but not $90 better. We'll see. After we get more volts I might go with an EK block, but right now it looks like going cheap on an FE is the way to go.

I ran my Strix for a while. I tested pretty much every BIOS and the card was 100 MHz slower than the FE. Might not seem like much, but it's pretty crappy when you consider the improvements in the card and power circuitry over reference. I ended up selling it to my buddy at cost. Nice card, just a downgrade from the FE I have. And I water cool; he only runs air, so it's a better choice for him, and on air it's faster than the FE, which runs pretty warm when overclocked.


----------



## KillerBee33

Quote:


> Originally Posted by *GraveDigger7878*
> 
> Thanks killerbee, that is exactly what I was looking for


Was gonna sell the 1080, but I did some number crunching








Makes sense to keep it for another 6-8 months.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> It won't bring anything. At most it might be on par with what the FEs are pulling with better cooling, but it looks like the GPUs are binned and Nvidia is keeping most of them for their own cards. I would like to see if the Classified brings more voltage than the 1.093; if so, it should help out. But I think the FEs are binned. Everything I've seen, they are pulling higher clocks in general, and volt for volt every one that I've seen is clocking better. I had a Strix here I sold to a friend, and the FE was faster.
> 
> Which brings me to this: I was about to get an EK block, but it's a total waste. I have an EVGA Hybrid cooler I paid $60 for; load temps overclocked and looping 3DMark don't hit over 50C. The EK should perform better, but not $90 better. We'll see. After we get more volts I might go with an EK block, but right now it looks like going cheap on an FE is the way to go.
> 
> I ran my Strix for a while. I tested pretty much every BIOS and the card was 100 MHz slower than the FE. Might not seem like much, but it's pretty crappy when you consider the improvements in the card and power circuitry over reference. I ended up selling it to my buddy at cost. Nice card, just a downgrade from the FE I have. And I water cool; he only runs air, so it's a better choice for him, and on air it's faster than the FE, which runs pretty warm when overclocked.


We can still do the volt mod








I would do it, but I'm not sure if it will yield any better OC... under water (EK)

Btw, is there any temp sensor for VRMs on FE card ?


----------



## Joshwaa

Has anyone pulled apart an FTW 1080 yet? If so, how was the TIM job, and do you think it will be worth replacing, or is heat not the limiting factor, just the power %?


----------



## Snabeltorsk

Would really like a BIOS editor to be able to raise the voltage and power limit.


----------



## Sourcesys

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Would really like a Bios Editor to be able to raise voltage and power limit.


Just program one, then









Or do the Coollaboratory Liquid Pro hardmod by shorting the resistors


----------



## metal409

Quote:


> Originally Posted by *KillerBee33*
> 
> Can few Founder Edition owners confirm this? And also what brand is that FE .


Here is my MSI FE.


----------



## KillerBee33

Quote:


> Originally Posted by *metal409*
> 
> Here is my MSI FE.


Yep, I dug around a bit; Subvendor is the CHIP manufacturer, not the BIOS vendor like some state








Always had good luck with NVIDIA chips


----------



## JaredC01

Ended up flashing the Gigabyte Xtreme BIOS onto my G1 Gaming card under full water (EK w/ backplate) and did a bit more tweaking.

I had done some prelim tests yesterday, sitting comfortably at 2114 MHz in Heaven, but further testing showed issues holding that number. Flashed the Xtreme BIOS for a higher power limit (150%, up from 108%), upped the voltage a bit (50% overage for 1.075 V total core voltage), and ran some more tests. Now showing 100% rock solid at 2114 MHz, zero throttling for any reason.

For giggles, here's the new setup...


Here was the old...


----------



## Visceral

Re: the 1080 Hybrid kit. The only advice I can give is pay attention to where you pull the screws from. Even then, I was unable to get the plastic back plate back on (but seriously, I don't care).

My 1080 Hybrid FE now runs at 43C under load at 2139 MHz.


----------



## KillerBee33

Quote:


> Originally Posted by *Visceral*
> 
> Re: the 1080 Hybrid kit. The only advice I can give is pay attention to where you pull the screws from. Even then, I was unable to get the plastic back plate back on (but seriously, I don't care)
> 
> My 1080 Hybrid FE now runs at 43C under load at 2139 MHz.


It's doable with a bit of Dremeling







putting the shroud back on top of it, and the plexiglass too


----------



## Methodical

Just waiting for the Ti to complete my build. I'm usually not the patient type.


----------



## Snabeltorsk

Quote:


> Originally Posted by *Sourcesys*
> 
> Just program one, then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or do the Coollaboratory Liquid Pro hardmod by shorting the resistors


Nah, I want to be able to raise the voltage in software


----------



## jedimasterben

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Nah i want to be able to raise voltage in software


Doesn't work that way without removing the power limit, which is only available by bridging the three resistors on the PCB









Honestly, it's a stupid simple mod that is very hard to screw up if you're worried about that sort of thing.


----------



## Snabeltorsk

Quote:


> Originally Posted by *jedimasterben*
> 
> Doesn't work that way without removing the power limit, which is only available by bridging the three resistors on the PCB
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Honestly, it's a stupid simple mod that is very hard to screw up if you're worried about that sort of thing.


I'm not limited by PL, I'm limited by voltage.
Core 2126 MHz @ 1.093 V, and I never hit the 120% power limit, though I could use more voltage.
So bridging RS1/2/3 doesn't help my voltage now, does it?


----------



## Neon Lights

Quote:


> Originally Posted by *Snabeltorsk*
> 
> I'm not limited by PL, I'm limited by voltage.
> Core 2126 MHz @ 1.093 V, and I never hit the 120% power limit, though I could use more voltage.
> So bridging RS1/2/3 doesn't help my voltage now, does it?


Same for me. I have the shunt mod; I just need an FE BIOS with a 1.24 V max Vcore (any higher and the GPU gets problems)!


----------



## jedimasterben

Quote:


> Originally Posted by *Snabeltorsk*
> 
> I'm not limited by PL, I'm limited by voltage.
> Core 2126 MHz @ 1.093 V, and I never hit the 120% power limit, though I could use more voltage.
> So bridging RS1/2/3 doesn't help my voltage now, does it?


For some reason I was thinking you were on an FE board









There are volt mods you can do on the xDevs article, but that is taking it to an extreme.







What is your highest power limit you're hitting now?


----------



## Snabeltorsk

109
Quote:


> Originally Posted by *jedimasterben*
> 
> For some reason I was thinking you were on a FE board
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There are volt mods you can do on the xDevs article, but that is taking it to an extreme.
> 
> 
> 
> 
> 
> 
> 
> What is your highest power limit you're hitting now?


PL 109 when benching/gaming at 4K. But of course it's likely I'll need a higher PL as well when/if I can raise the voltage a bit higher than 1.093 V.
I'm on an MSI Sea Hawk EK.


----------



## nexxusty

Quote:


> Originally Posted by *pez*
> 
> While I don't agree with you all the time, I do now. Please. For the love of the GPUs.


Hehe, I do have my moments pez. ;-)

Quote:


> Originally Posted by *KillerBee33*
> 
> Yep, I dug around a bit; Subvendor is the CHIP manufacturer, not the BIOS vendor like some state
> 
> 
> 
> 
> 
> 
> 
> 
> Always had good luck with NVIDIA chips


Nvidia makes all the CHIPs. Subvendor is the name of the subvendor who made the BIOS.

Subvendor literally means a vendor under another vendor, Nvidia being the top vendor.

Not sure where you got this info, but it's incorrect.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Hehe, I do have my moments pez. ;-)
> Nvidia makes all the CHIP's. Subvendor is the name of the Subvendor who made the BIOS.
> 
> Subvendor literally means a vendor under another vendor. Nvidia being the top vendor.
> 
> Not sure where you got this info, however it's incorrect.


Got any info other than yourself?


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Got any info other than yourself?


No need.

Naw I don't. I'm telling you tho boy!! Hehe.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> No need.
> 
> Naw I don't. I'm telling you tho boy!! Hehe.


I knew you wouldn't. Just because you feel like saying something doesn't mean you should.


----------



## Snabeltorsk

Quote:


> Originally Posted by *KillerBee33*
> 
> I knew you wouldnt. Just because you feel like saying something doesn't mean you should.


Nvidia makes all the chips


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> I knew you wouldnt. Just because you feel like saying something doesn't mean you should.


I don't like people spreading misinformation.

You were doing that.

I don't need sources. I don't talk crap. I know what I am talking about.

Subvendor = BIOS made by the subvendor. Period.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> I don't like people spreading misinformation.
> 
> You were doing that.
> 
> I don't need sources. I don't talk crap. I know what I am talking about.
> 
> Subvendor= BIOS made by Subvendor. Period.


You're pretty loud for someone with no info.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> You pretty loud for someone with no info.


Sure am.

I am rarely wrong, however. I've been doing this 20+ years.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Sure am.
> 
> I am rarely wrong however. Been doing this 20+ years.


Congratulations!!! But screaming doesn't make it a fact. Thanks anyway; asking you for information is a headache.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Congratulations!!! but Screaming that doesn't make it a fact. But thanx anyway, askin you for information is a headache.


Not if you take what I say at face value...

Anyway, I don't want to start an argument or bad blood, bro. Just keep what I said in mind. That's all.


----------



## Inglewood78

Nvidia makes all the GPU chips.

Founders Edition = made by Nvidia; the vendor (Asus, EVGA, etc.) puts it in their own branded box with their own BIOS.

Custom cards (Xtreme, Strix) = Nvidia GPU chip, custom vendor PCB, custom BIOS.


----------



## nexxusty

Quote:


> Originally Posted by *Inglewood78*
> 
> Nvidia makes all GPU chips.
> 
> Founder's Edition = Made by Nvidia, vendor (Asus, EVGA, etc) puts it in their own branded box and bios
> 
> Custom cards (Xtreme, Strix) = Nvidia GPU chip, custom vendor PCB, custom bios.


Thank you.
Quote:


> Originally Posted by *Snabeltorsk*
> 
> Nvidia makes all the chip,s


You too. Heh.


----------



## kx11

So, is anyone interested in the Galax HOF 1080 BIOS?! I'm not an expert, but I can extract it and upload it here, and then someone can experiment with it.


----------



## Snabeltorsk

Quote:


> Originally Posted by *kx11*
> 
> so is anyone interested in Galax HOF 1080 Bios ?!! i'm not an expert but i can extract it and upload it here then someone will experiment with it


Sure thing, I'll try it.


----------



## dante020

Does anyone know the maximum resolution and refresh rate you can get out of the HDMI port while using a passive HDMI-to-DVI adapter? I have a few DVI-only 1440p60 monitors and would like to avoid using my flaky Accell DP to dual-link DVI active adapter.

EDIT: I know that you couldn't do this in the past; however, I was thinking HDMI 2.0b might change things.


----------



## uberwootage

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Sure Thing, ill try it


Let me know how it runs. I'm stuck at work.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Can't speak for the 980 Ti owners, but 980s were better-than-GREAT overclockers; 1127 MHz @ 1.212 V to 1544 MHz @ 1.250 V is a huge difference in performance, even going by Firestrike GPU scores: 13,000 to 16,800. EVGA's KINGPIN team stated that the 980 Ti overclocks better than the 1080, but the 1080 has much better performance regardless of the 980 Ti's overclocking abilities.


Of course it has better performance; it should...

But you can't beat a 50% overclock vs the barely 30-35% that we are getting on the 1080 on average.

Not only that, a 980 Ti with a 50% overclock is a beast, while with the 1080 you don't actually get the gains the higher you go, because it scales badly.


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Of course it has better performance; it should...
> 
> But you can't beat a 50% overclock vs the barely 30-35% that we are getting on the 1080 on average.
> 
> Not only that, a 980 Ti with a 50% overclock is a beast, while with the 1080 you don't actually get the gains the higher you go, because it scales badly.


Have you noticed a big difference in performance past 1506 on the 980 Ti?
We should really wait for BIOS tools before jumping ship.







Personally, a 35% increase is more than satisfying. Anything over that makes less sense and probably isn't game stable.








Had 1544 for 24/7 on the 980; did benchmarking @ 1588, but not for gaming.


----------



## boredgunner

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Of course it has better performance; it should...
> 
> But you can't beat a 50% overclock vs the barely 30-35% that we are getting on the 1080 on average.
> 
> Not only that, a 980 Ti with a 50% overclock is a beast, while with the 1080 you don't actually get the gains the higher you go, because it scales badly.


I actually got some decent gains by overclocking my GTX 1080 to 2063 MHz (typical; it will probably be higher in the winter). I had an MSI GTX 980 Ti Lightning previously, which I ran at 1488. A decent upgrade, but only because I was able to sell the 980 Ti and some other stuff that I don't use.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Have you noticed a big difference in Performance after 1506 on 980Ti?
> We should really wait for BIOS tools before jumping the ship
> 
> 
> 
> 
> 
> 
> 
> Personally 35% increase is more than satisfying . Anything over makes less sense and probably not game stable
> 
> 
> 
> 
> 
> 
> 
> 
> Had 1544 for 24/7 on 980 did benchmarking @ 1588 but not for gaming .


I went through three 980s in a vain attempt to get a 1600 MHz clocker. No dice.

My ACX 2.0 was the best of the bunch at 1568 MHz core / 8000 MHz memory. Nothing I did would get me 1600 MHz. Not a waterblock, not voltmods... nothing. Lol.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> I went through 3 980's in a vain attempt to try to get a 1600mhz clocker. No dice.
> 
> My ACX 2.0 was the best of the bunch at 1568mhz/8000mhz. Nothing I did would get me 1600mhz. Not a waterblock, voltmods... nothing. Lol.


Got mine to 1595 @ 1.312 V, but it would crash in benching 2 out of 4 times.







1582 was solid @ 1.275 V, but not gaming solid. The most I got out of the 980 is a 17,000 GPU score in Firestrike.







Oh, and that's on a 65.2 ASIC.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Have you noticed a big difference in Performance after 1506 on 980Ti?
> We should really wait for BIOS tools before jumping the ship
> 
> 
> 
> 
> 
> 
> 
> Personally 35% increase is more than satisfying . Anything over makes less sense and probably not game stable
> 
> 
> 
> 
> 
> 
> 
> 
> Had 1544 for 24/7 on 980 did benchmarking @ 1588 but not for gaming .


I'm not jumping ship; I already sold my 980 Ti and got a 1080.. My 1080 is beyond warranty/RMA now XD




Now she can fit an EK VGA Supremacy block, no problem XD


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Got mine to 1595 @ 1.312V but it would crash in beching 2 out of 4 times
> 
> 
> 
> 
> 
> 
> 
> 1582 was Solid @ 1.275V but not gaming Solid


So close... lol. Mine was the same at 1600 MHz. It would game or bench for about 2-5 minutes, then crash no matter what.

I loved my 980. Best card I'd owned until the 1080 rolled along. I didn't really miss out on the 980 Ti, as I was on 1080p then.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> So close... lol. Mine was the same at 1600mhz. It would game or bench for about 2-5 minutes then crash no matter what.
> 
> I loved my 980. Best card I've owned until the 1080 rolled along. I didn't really miss out on the 980ti as I used 1080p then.


Still have that little beast







You can see what I did to it in my sig; not the best work, but it worked.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Still have that little beast
> 
> 
> 
> 
> 
> 
> 
> you can see what i did to it in my Sig , not the best work but it worked


That looks really cool.

Nice job dude.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> That looks really cool.
> 
> Nice job dude.


If/once the BIOS tool is out, I'll try to do a better job with the 1080. I'm just not sure if the EVGA AIO is the way to go; I may take that 980 apart and put it back together as stock just to grab the bracket from the pump. My idea was an H90.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Still have that little beast
> 
> 
> 
> 
> 
> 
> 
> you can see what i did to it in my Sig , not the best work but it worked


Same stuff I did with my 1080. I'm not using the glass, though; the plate and everything managed to fit flush no problem, so I can have the GPU/CPU on the same loop instead of an AIO..
In the beginning I had the EVGA Hybrid kit on it, then decided to go back to custom water. Long story short, I had to hard mod the card XD


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Same stuff i did with my 1080 im not using the glass tho but the plate and everything managed to flush no problems so i can have gpu/cpu on same loop instead of an aio..
> In the beginning i had the evga hybrid kit on it then decided to go back to custom water. Long story short i had to hard mod the card XD


I might do a custom wall-mount case with a custom loop sometime; as of now, I'm done spending money on things I have no time to enjoy.


----------



## PU skunk

Quote:


> Originally Posted by *KillerBee33*
> 
> I might do a custom Wall Mount Case with custom loop sometime.


I would like to see that.


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Im not jumping ships i already sold my 980ti and got a 1080.. My 1080 is beyond warranty/rma now XD
> 
> 
> 
> 
> Now she can fit a EK VGA Supremacy block no problems XD


Yeah, this looks like a bigger job than slapping on an AIO unit. Did you do anything to the shroud or the plexiglass piece?


----------



## zGunBLADEz

No, I managed to cover the block entirely with the shroud.. The only thing it doesn't have is the glass, as I'm using that space to route the elbows of the block out.

I made a cover for it, then decided not to, so I'm using the GPU shroud itself to feed air to my rad as well...


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> No, i manage to cover the block entirely with the shroud.. The only thing it dont have is the glass as im using that to route the elbows of the block out.
> 
> I made a cover for it then i decided not to so im using the gpu itself to feed air to my rad as well...


Cool. How are the temps with that?


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Cool. How are the temps with that?


85F ambient (1200 RPM fans).
Doom 4K, 6-hour log; the middle line is the GPU. The highest one is the package core temp (highest core temperature) and the bottom one is the TCASE.


Spoiler: Warning: Spoiler!
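Side note: a log like that doesn't need special software; `nvidia-smi` can print temperatures as CSV on an interval, and a few lines of Python will pull the peak out of a capture. A rough sketch (the flags are standard nvidia-smi query options; the sample text below is made up, not the actual Doom log):

```python
import csv
from io import StringIO

# Sample of what `nvidia-smi --query-gpu=timestamp,temperature.gpu
# --format=csv,noheader -l 5` prints while logging (values are made up).
SAMPLE = """\
2016/07/20 18:01:05.000, 44
2016/07/20 18:01:10.000, 46
2016/07/20 18:01:15.000, 45
"""

def max_gpu_temp(log_text):
    """Return the highest GPU temperature (deg C) seen in a CSV log."""
    rows = csv.reader(StringIO(log_text))
    return max(int(temp.strip()) for _ts, temp in rows)

print(max_gpu_temp(SAMPLE))  # -> 46
```

Redirect the live output to a file during a gaming session and feed the file's contents to the same function.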


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 85F Ambient (1200rpms fans)
> Doom 4k 6hrs log, middle one is the gpu. Highest one is the package core temp (Highest core temperature) and bottom one is the TCASE
> 
> 
> Spoiler: Warning: Spoiler!


Nice. AIO units aren't as impressive compared to that.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Nice. AIO units are not as impressive against that


The 1080 behaved quite nicely on the hybrid kit when I had it on.


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> the 1080 behave quite nice on the hybrid kit when i had on it..


Mine starts throttling at 57 degrees with VRel, and around 65 degrees it's Pwr & VRel. Keeping it under that on the FE is close to impossible.


----------



## uberwootage

Quote:


> Originally Posted by *zGunBLADEz*
> 
> the 1080 behave quite nice on the hybrid kit when i had on it..


Yeah, in games I don't see higher than 48C with 2164 on the core. Definitely worth the $60 I paid for the cooler on Amazon.


----------



## KillerBee33

Quote:


> Originally Posted by *uberwootage*
> 
> Yeah in games i dont see higher then 48c with 2.164 on the core. Deff. worth the $60 i payed for the cooler on amazon.


Yep, I keep checking; they are back to $67 from $99 two weeks ago.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> Mine starts at 57 degrees VRel and around 65 degrees Pwr & VRel keeping it under that on FE is close to impossible .


Then you have a bad mount on that 1080. I have a post here from when I tested the 1080 on a 95F day on the hybrid. That's before I started to lock clocks on the curve.

Here http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/970#post_25251216


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Then you have a bad mount on that 1080.. I have a post here when i tested the 1080 on a 95F degree day on the hybrid.
> 
> Here http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/970#post_25251216


No mount just yet, all stock. I tried asking if anyone got the 1080 kit from EVGA and if there is a difference, but no answer yet. With the stock reference cooler and an aggressive fan profile it gets to 72 max. I'll look into an AIO again when BIOS tools are out. Tried the CURVE and haven't noticed anything different.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KillerBee33*
> 
> No mount just yet, all stock i tried askin if anyone got the 1080 KIT from EVGA and if there is a difference but no answer yet .Stock Reference cooler with agressive Fan profile it gets to 72 max . I'll look into aio again when bios tools are out .Tried the CURVE and havent noticed anything different .


Well, there are the tests from an EVGA 980 Ti kit on a 1080 on a 95F day.

Of course I have high-performance LOUD fans on it (2200 RPM) in push/pull.


----------



## uberwootage

Quote:


> Originally Posted by *KillerBee33*
> 
> Yeap i keep checking they are back to 67$ from 99$ two weeks ago .


Yeah, I saw it and I was like, buy. Can't beat it.


----------



## zGunBLADEz

Quote:


> Originally Posted by *uberwootage*
> 
> Yeah i seen it and i was like buy. Cant beat it.


I grabbed it for $39, can't beat that... XD
It was marked as used, but when I received it the darn thing was brand new: still had the thermal paste on it and nothing missing.

It was silent too, to my surprise; on my first kit, the one I used on the 980 Ti, the darn pump was a bit loud.


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> well theres the tests from a evga 980 ti kit on a 1080 on a 95f degree day..
> 
> Of course i have high perf LOUD fans on it (2200rpms) push/pull..


I got a bunch of Thermaltake Riings on sale, $9 each; I'll use those again, but I really want to try and fit an H90 140mm in push. It may or may not make a big difference; I just got a nice 140mm LED fan.


----------



## KillerBee33

Quote:


> Originally Posted by *zGunBLADEz*
> 
> I grab it from $39 bucks cant beat that... XD
> Was marked as used but when i received the darn thing was brand new still have the thermal paste on it and nothing missing..
> 
> Was silent too for my surprise, my first kit the one i used on the 980ti the darn pump was abit loud


Are you using it now? If not, I'd like to buy it just for that pump bracket.


----------



## H3avyM3tal

I have this OC on my card. Temps never pass 73 when benching. However, the frequency drops: it starts at 2112 and settles down around 2060. Why is that? I was under the impression that if temps are OK, it won't throttle. Is it because it's unstable? (I game for hours with this OC.)


----------



## DOOOLY

I just exchanged my MSI Aero for an ASUS FE because of a screw that had a warranty-void sticker; then when I opened up the ASUS FE, I found a sticker on its screw too. I have an EK block coming and I don't want to void my warranty. Some people say I will be fine, some say not; can anyone clarify this?


----------



## Joshwaa

Quote:


> Originally Posted by *H3avyM3tal*
> 
> 
> 
> I have this oc on my card. Temps never pass 73 when benching. However, frequency does. It starts at 2112 and settles down around 2060. Why is that? I was under the impression that if temps are ok, the it won't throttle? Is that because it's unstable (I game for hours with this oc).


What does it say your power % is getting up to when your clocks start dropping?


----------



## partypoison25

Quote:


> Originally Posted by *DOOOLY*
> 
> I just exchanged my MSI Aero for a asus FE because of a screw that had a void warranty sticker then when i open up the asus FE i find a sticker on the screw. I have a EK block coming and i don't want to void my warranty. I hear from some people I will be fine some say not, can anyone clarify this ?


Does not void the warranty. I have that straight from MSI themselves.


----------



## ChaosBlades

Quote:


> Originally Posted by *DOOOLY*
> 
> I just exchanged my MSI Aero for a asus FE because of a screw that had a void warranty sticker then when i open up the asus FE i find a sticker on the screw. I have a EK block coming and i don't want to void my warranty. I hear from some people I will be fine some say not, can anyone clarify this ?


EVGA allows water blocks; with any other company you will be less likely to get an RMA, or be flat-out denied. I have had many EVGA cards and put water blocks on them all. A few times I had to RMA and had zero problems.

Edit: If you do need to RMA, you just need to put the original EVGA cooler back on the card.
Quote:


> Originally Posted by *partypoison25*
> 
> Does not void warrenty. Have that straight from MSI themselves


If it didn't void the warranty the sticker would not be there.


----------



## Benjiw

Quote:


> Originally Posted by *ChaosBlades*
> 
> EVGA allows water-blocks, any other company and you will be less likely to get an RMA or flat out be denied. I have had many EVGA cards and put water blocks on them all. A few times I had to RMA and had zero problems.
> If it didn't void the warranty the sticker would not be there.


Another vote for EVGA for their customer support, but also I've heard that MSI will still honour the warranty even if the sticker has been tampered with before so I don't think that the poster before you is wrong.


----------



## DOOOLY

What about ASUS?


----------



## MrTOOSHORT

I have an MSI Sea Hawk 1080 EK X. It has a warranty sticker on the screw too, but in the accessories there is some thermal paste from EK. Of course my temps weren't the best, so I opened her up and redid the paste. Now, if removing that screw voids the warranty, then why the extra thermal paste?


----------



## nexxusty

Putting an H110i on my 1080 tomorrow. I'm expecting load temps to be 55C max. Probably less.


----------



## lyang238

Quote:


> Originally Posted by *nexxusty*
> 
> Putting an H110i on my 1080 tomorrow. I'm expecting load temps to be 55c max. Probably less.


Seems a bit overkill. I think you would be OK with an H90 or even an H55. My Sea Hawk X is doing 50C max in an InWin 805.


----------



## aberrero

The H90 is already almost overkill for an overvolted 290X, which uses over 200W more than a 1080.

I know people on this site aren't concerned with practicality, but I still say that watercooling the 1080 really doesn't make a whole lot of sense. You are sacrificing VRM cooling for a pretty meaningless drop in core temps. Any good aftermarket cooler can keep temps below 60-65C easily. These GPUs can run at up to nearly 120C, or at least 80C without throttling. Why introduce pump noise and risk the VRMs overheating?


----------



## fat4l

Guys, does the FE have any temp sensor for the VRMs?


----------



## H3avyM3tal

Quote:


> Originally Posted by *Joshwaa*
> 
> What does it say your power % is getting up to when your clocks start dropping?


Valley shows 2113 (scoring 4000 on the Extreme HD preset), but AB shows


----------



## Snabeltorsk

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Valley shows 2113 (scoring 4000 on extreme hd preset), but AB shows


The power limit kicks in, so your card is throttling.
Raise the power limit further in AB if you can.
The score is fine, so if you're happy, I'm happy.


----------



## H3avyM3tal

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Powerlimit kicks in so youre card is throttling.
> raise the powerlimit further in AB if you can.
> Score is fine so if youre happy im happy


Heh, the power limit is at 130 already. A shame lol.

But I'm fine with it.

However, why are the readings in AB wrong? I can't be getting that score with 1500 MHz.


----------



## Snabeltorsk

Here's my AB during a Valley Extreme HD run:
41C max temp.
No VRel, Pwr or VOp.
I can clock it higher, but then VRel kicks in when I go over 45 degrees Celsius, so I keep it at 2100.
Gotta love those temps though. And I'm going to change the paste to Coollaboratory soon.


----------



## H3avyM3tal

Everything looks nominal. You have a go.

Think maybe I can up the vcore some more, or am I done with this core?


----------



## Snabeltorsk

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Everything looks nominal. You have a go.
> 
> Think maybe I can up the vcore some more, or am I done with this core?


Weird that it says 1500 in AB and 2113 in Valley; try again.


----------



## ChaosBlades

Quote:


> Originally Posted by *Benjiw*
> 
> Another vote for EVGA for their customer support, but also I've heard that MSI will still honour the warranty even if the sticker has been tampered with before so I don't think that the poster before you is wrong.


I'm just saying I would not count on it. EVGA has no such sticker. Just to be clear, EVGA has a known track record and has been documented stating it is fine numerous times; other companies cannot say that. Just keep your stock cooler and re-attach it if you need to RMA.


----------



## pez

Quote:


> Originally Posted by *aberrero*
> 
> The H90 is already almost overkill for an overvolted 290x, which uses over 200w more than a 1080.
> 
> I know people on this site aren't concerned with practicality, but I still say that watercooling the 1080 really doesn't make a whole lot of sense. You are sacrificing VRM cooling for a pretty meaningless drop in core temps. Any good after market cooler can keep temps below 60-65C easily. These GPUs can run at up to nearly 120C, or at least 80C without throttling. Why introduce pump noise and risk the VRMs overheating?


Well that's for AIO kits.

People that are doing custom loops aren't going to run into that issue. I find two benefits to WC'ing the 1080 that are specific to my use case so far. My cards in SLI work up some heat, and therefore some fan noise. GTA V @ 4K puts both GPUs at a constant 95-100% usage. The top GPU gets to 78-79C with small blips to 80C, with the bottom card sticking to 71-72C with small blips to 73C. Fans, respectively, getting to 80% and 65% max. Also, as has been said before, maintaining certain temps helps hold boost in a steadier range. My cards boost anywhere from the stock boost of 1835 all the way to 1946 at certain times. So for me, in the case of a custom loop, there would be an improvement in noise, decreased heat, and a more consistent boost clock.


----------



## CannedBullets

Heads up for EVGA GTX 1080 FTW owners: definitely set a custom fan curve with EVGA PrecisionX to get better temps. The default max fan speed is 47%, which is definitely not enough.
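Since curves come up so often: under the hood a fan curve is just linear interpolation between (temperature, fan %) points. A quick sketch of the idea in Python (the points below are illustrative, not PrecisionX's defaults):

```python
def fan_speed(temp_c, curve):
    """Linearly interpolate fan % for a temperature from a sorted
    list of (temp_c, fan_pct) points; clamp outside the range."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# Example curve: quiet below 40C, ramping hard to 100% by 70C.
CURVE = [(30, 30), (40, 30), (65, 75), (70, 100)]
print(fan_speed(52, CURVE))  # about halfway up the 30-75% ramp
```

Every fan-control tool does some variant of this; the only real knobs are where you put the points.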


----------



## KickAssCop

If you had the option of a Gigabyte G1, Zotac AMP Extreme, EVGA FTW, or ASUS Strix, which one would you buy?


----------



## Snabeltorsk

Quote:


> Originally Posted by *KickAssCop*
> 
> If you had the option of Gigabyte G1, Zotac Amp Extreme, EVGA FTW or ASUS Strix, which one will you buy?


If you were going to keep the cooler, I would say Zotac.


----------



## pfinch

Here I am!
Zotac AMP! Extreme 1080

Do you guys think it's a good sample? 2113 MHz // 70% fan // max temp 53°C ... without throttling.

The score was run via TeamViewer (@work), so maybe it's more like 5750+.

http://www.3dmark.com/3dm/13413202?


----------



## escalibur

Quote:


> Originally Posted by *KickAssCop*
> 
> If you had the option of Gigabyte G1, Zotac Amp Extreme, EVGA FTW or ASUS Strix, which one will you buy?


Zotac all the way! The Gigabyte Xtreme and Zotac AMP! Extreme are possibly the best Pascal air coolers we've seen so far. Palit is not far behind (Palit's biggest con is a much shorter warranty).


----------



## CannedBullets

Quote:


> Originally Posted by *KickAssCop*
> 
> If you had the option of Gigabyte G1, Zotac Amp Extreme, EVGA FTW or ASUS Strix, which one will you buy?


I'd say the EVGA FTW. The stock fan settings aren't really good for cooling, which is why you need to use EVGA PrecisionX to set up a custom fan curve; once you have one set up, the temps are a lot better than at stock settings. I did that and my temperatures went from 77C to 72C in Rise of the Tomb Raider on max settings.

I don't know why EVGA set the default max fan speed to 47%; it should be at least 70-75%.


----------



## OmegaNemesis28

Long story short, Amazon finally freakin' delivered my 1080 FTW preorder, over a month later. It pissed me off that EVGA was shipping new cards to new customers for a while. During the wait I got a 1070 to tide me over. Now that I have the 1080, I did some benches on my machine if anyone is interested. I see an okay 13-or-so percent improvement over the 1070 at 1080p on average. It's the higher resolutions that make the difference stand out: 20 percent or more at 4K. I'm loving DOOM right now with DSR.

Definitely a crazy upgrade from my 6990 dinosaur. I'm actually suspicious of my cheap RAM; I think it needs a good overclock, as it's probably the slowest piece in my system.


----------



## zGunBLADEz

Quote:


> Originally Posted by *aberrero*
> 
> The H90 is already almost overkill for an overvolted 290x, which uses over 200w more than a 1080.
> 
> I know people on this site aren't concerned with practicality, but I still say that watercooling the 1080 really doesn't make a whole lot of sense. You are sacrificing VRM cooling for a pretty meaningless drop in core temps. Any good after market cooler can keep temps below 60-65C easily. These GPUs can run at up to nearly 120C, or at least 80C without throttling. Why introduce pump noise and risk the VRMs overheating?


You have to understand too that you're taking the major heat source, the core, out of the card; the VRMs and memory can be passively cooled just fine.
The heatsink on the card (not the PCB/VRM plate) just cools the die, same as on the hybrid, but with the AIO at least it's not throwing that heat inside the card, so your VRMs/memory are running much cooler...


----------



## metal409

Using an MSI FE 1080 with EK block. Haven't played with clocks too much, but here's a quick valley run.


----------



## jedimasterben

Quote:


> Originally Posted by *metal409*
> 
> Using an MSI FE 1080 with EK block. Haven't played with clocks too much, but here's a quick valley run.


Holy dicks, 2139 with 1.00v. You win.


----------



## AllGamer

Quote:


> Originally Posted by *KillerBee33*
> 
> Can few Founder Edition owners confirm this? And also what brand is that FE .


Says NVIDIA (10DE); it's an MSI Founders Edition.


Does anyone have a list of all the subvendor IDs? I did a search and couldn't find anything on the interweb.


----------



## Clockster

Quote:


> Originally Posted by *Derko1*
> 
> I just got this same card last week and while my temps are super low, 38-42C on load and 29-30C on idle. I'm not able to get anything above 2.1 stable. I think I hit my max at 2088 core and 5547 on mem. I have gotten the voltage to go as high as 1.093 to do 2.1ghz, but my score drops by like 400 points on 3dmarks. So it seems like I've hit a wall with mine at 2088.
> 
> What have you been able to get with the other cards you've had? I'm a little disappointed honestly, but I love the temps and will probably get a second one later down the road.
> 
> My 3d marks scores with those clocks...
> 
> Fire Strike 18,245 : http://www.3dmark.com/fs/9387713
> Fire Strike Extreme 10,579 : http://www.3dmark.com/fs/9387741
> Fire Strike Ultra 5704: http://www.3dmark.com/fs/9387762


My core is sitting at +150
My mem is sitting at +400

Seems stable but haven't had much time to play with it.
Will do some proper benching and clocking this weekend.

http://www.3dmark.com/spy/99517


----------



## jedimasterben

Was playing some Just Cause 3 last night. After doing the power limit mod, I can set the voltage to 1.081v and it will not fluctuate unless the GPU clock falls, like during a loading screen. Successfully got to 2101 MHz, but haven't yet tried to go higher; will be doing that this week.


----------



## Sourcesys

Quote:


> Originally Posted by *metal409*
> 
> Using an MSI FE 1080 with EK block. Haven't played with clocks too much, but here's a quick valley run.


Nice, but why only 1.00v?


----------



## uberwootage

Has anyone tried out the HOF BIOS?


----------



## Alwrath

Quote:


> Originally Posted by *CannedBullets*
> 
> Heads up for EVGA GTX 1080 FTW owners. Definitely set a custom fan curve with EVGA PrecisionX to get better temps. Default max fan speed is 47% which is definitely not enough.


That is the way it's been for every graphics card ever released. Always, always, always set up a custom fan curve; even if the temps are passable, you can lower them further this way, giving your GPU a longer life, overclocked or not.


----------



## metal409

Quote:


> Originally Posted by *Sourcesys*
> 
> Nice, but why only 1.00v?


I haven't touched the voltage at all; it defaulted to that on its own.


----------



## H3avyM3tal

Currently my 1080 is hooked up to the PSU with one cable that splits into two PCIe 8-pins. Should I hook it up with two different cables so that it uses two rails instead of one?


----------



## uberwootage

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Currently my 1080 is hooked up to the pse with one cable that splits into 2 pcie8. Should I hook it up with two different cables so that it uses 2 rails instead of 1?


Depends on how hard you're hitting that 12V rail. If you've got a ton of stuff hooked up, then yeah. If not, it's fine unless you have a pretty weak PSU.
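For a rough sense of the numbers: per the PCIe spec the slot can supply up to 75 W and each 8-pin up to 150 W, so you can sanity-check how much a single daisy-chained cable is being asked to carry. A small sketch (the 180 W figure is the FE's stock TDP; actual draw varies with the power limit):

```python
PCIE_SLOT_W = 75    # PCIe CEM spec: max power from the x16 slot
EIGHT_PIN_W = 150   # PCIe CEM spec: max power per 8-pin connector

def connector_budget(n_eight_pin):
    """Max spec power for a card with the given number of 8-pin plugs."""
    return PCIE_SLOT_W + n_eight_pin * EIGHT_PIN_W

CARD_TDP_W = 180  # reference GTX 1080 TDP

# With one 8-pin, the cable side carries roughly TDP minus the slot's share.
print(connector_budget(1))           # 225 W total budget
print(CARD_TDP_W - PCIE_SLOT_W)      # ~105 W over the cable at TDP
```

So a single quality cable feeding two 8-pins is usually fine at stock; splitting across two cables mostly matters on cheap PSUs or heavy overclocks.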


----------



## THEROTHERHAMKID

I have a G1 1080.
Any better BIOS I can install for a better OC?


----------



## fat4l

Guys, please tell me: what's a good (better than average) or an average boost for a 1080 FE on air at ~70C?


----------



## fewness

Quote:


> Originally Posted by *jedimasterben*
> 
> Was playing some Just Cause 3 last night. After doing the power limit mod, I can set voltage to 1.081v and it will not fluctuate unless the GPU clock falls like during a loading screen. Successfully got to 2101MHz, but haven't yet tried to go higher, will be doing that this week.


How do I mod the power limit please?


----------



## Inglewood78

You are looking for a stable 2100 core clock. If you can hit that, then you are above average.


----------



## OmegaNemesis28

Quote:


> Originally Posted by *Alwrath*
> 
> That is the way its been for every graphics card ever released since forever. Always always always set up a custom fan curve, even if the temps are passable you can always lower your temps by doing this giving your gpu a longer life, overclocked or not.


If you set a custom fan curve, do you always have to run the app?


----------



## jedimasterben

Quote:


> Originally Posted by *fewness*
> 
> How do I mod the power limit please?


You need to short resistors RS1, RS2, and RS3 on the PCB, per this article: https://xdevs.com/guide/pascal_oc/


----------



## partypoison25

Quote:


> Originally Posted by *ChaosBlades*
> 
> EVGA allows water-blocks, any other company and you will be less likely to get an RMA or flat out be denied. I have had many EVGA cards and put water blocks on them all. A few times I had to RMA and had zero problems.
> 
> Edit: IF you do need to RMA you just need to put the original EVGA cooler back on the card.
> If it didn't void the warranty the sticker would not be there.


It's just there to warn off less experienced users. As long as you do not physically damage the card when taking it apart, and put the stock cooler back on, MSI will honor the warranty with the seal broken.


----------



## fewness

Quote:


> Originally Posted by *jedimasterben*
> 
> You need to short resistors RS1, RS2, and RS3 on the PCB, per this article: https://xdevs.com/guide/pascal_oc/


I'll give it a try then.


----------



## THEROTHERHAMKID

Best BIOS for the G1 1080, please?


----------



## Joshwaa

So far with my 1080 FTW I can get to 2147 core for about 7 seconds, then it drops to 2101 and then to 2088 and stays there the entire time. I am only getting to 91% power and temps stay below 67C. Any ideas why it would be pulling the core clock down? I have also tried going up on the volts, but it does not seem to matter.


----------



## Shadowdane

Quote:


> Originally Posted by *metal409*
> 
> I haven't touched the voltage at all, it defaulted to that on it's own.


You likely got a very high ASIC quality... at this point there's no way to check that, though. Previously, higher-ASIC chips ran at lower voltages. I'm kinda surprised GPU-Z hasn't added support for the GTX 10 models... or did NVIDIA make it no longer readable at all?

Have two 1080s here, both run at different voltages.
Card 1: 1.03v
Card 2: 1.05v


----------



## GreedyMuffin

Is a waterblock necessary on the GTX 1080 for serious overclocking, as it was on the 980 Ti?

Wondering if I should get myself a 1080 (I need another card for folding, minimum a 1070, so a few dollars more for more performance would be worth it), but if I need to purchase a waterblock as well, it will be too expensive, at least for now.

Thanks!

What is your favorite model of 1080?


----------



## xer0h0ur

Quote:


> Originally Posted by *Shadowdane*
> 
> You likely got a very high ASIC quality... at this point no way to check that though. Previously higher ASIC ran at lower voltages. I'm kinda surprised GPUZ hasn't updated support for the GTX 10 models.. or did Nvidia no longer make this readable at all.
> 
> Have two 1080s here, both run at different voltages.
> Card 1: 1.03v
> Card 2: 1.05v


GPU-Z actually just updated the other day, but the update removed ASIC readings altogether.


----------



## uberwootage

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Waterblock nesasarry on GTX 1080 for some overclocking as it was on the 980Ti?
> 
> Wondering if I should get myself a 1080 (I need another card for folding, min. 1070, so a couple of $ to get more performance would be worth it). but if I need to purchase a waterblock it will be to expensive, at least for now.
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What is your favorite modell of the 1080?


No, not required. Get the Zotac AMP Extreme and you will be happy with the cooler. FEs clock higher, but to get to the 2.1 GHz zone on an FE you're going to need a water block. I'm using a $60 980 Ti hybrid cooler from Amazon. At 2177 on the core I load at 50C, and games only put me at 48C.

For overclocking, I'd pick an FE with a water block. For normal cards with no mods: the Sea Hawk with the Corsair cooler, or the Zotac AMP Extreme.


----------



## Sourcesys

Quote:


> Originally Posted by *xer0h0ur*
> 
> GPU-Z actually just updated the other day but it removed ASIC readings altogether from it.


They removed ASIC from GPU-Z only for 10xx cards because the formula for ASIC has changed; the 90%+ ASIC values displayed by older GPU-Z versions are meaningless. Generally, ASIC is pretty meaningless altogether.


----------



## Snabeltorsk

Quote:


> Originally Posted by *uberwootage*
> 
> Anyone try out the hof bios?


The guy that said he had one hasn't posted it here yet.


----------



## uberwootage

Quote:


> Originally Posted by *Snabeltorsk*
> 
> The guy that said he had one hasnt listed it here yet.


Damn. I wanna try it lol.


----------



## xer0h0ur

Quote:


> Originally Posted by *uberwootage*
> 
> No not required. Get the zotac amp extreme and you will be happy with the cooler. However FE's clock higher but to get to the 2.1ghz zone on a fe your going to need a water block. Im using a $60 980ti hybrid cooler from amazon. At 2,177 on the core I load at 50c and games only put me at 48c.
> 
> Over clocking I pick a fe with a water block. Normal cards with no mods. Seahawk with the corsair cooler. Zotac amp extreme


I've seen plenty of people in this thread talking about doing 2100+ on air with their FE.


----------



## uberwootage

Quote:


> Originally Posted by *xer0h0ur*
> 
> I've seen plenty of people in this thread talking about doing 2100+ on air with their FE.


Most throttle unless they run at 100% fan. If you want to run a solid 2.1 GHz, you need better cooling.


----------



## KillerBee33

Quote:


> Originally Posted by *uberwootage*
> 
> Most throttle unless they run at 100% fan. If you wanna run a solid 2.1 ghz you need better cooling


I've tested the FE on air with a fan profile of 30% up to 40 degrees, 75% @ 65 degrees for the next mark, and 100% @ 70 degrees. I haven't seen higher than 72. It starts throttling @ 57 degrees but won't go lower than 2088 MHz.


----------



## Inglewood78

Quote:


> Originally Posted by *xer0h0ur*
> 
> I've seen plenty of people in this thread talking about doing 2100+ on air with their FE.


It's very hard to sustain a 2100+ overclock after an hour of gaming on air. All cards throttle at some point over 50-60C, which is a hard temp to stay under unless you are on water.


----------



## ikjadoon

Quote:


> Originally Posted by *Joshwaa*
> 
> So far with my 1080 FTW I can get to 2147 core for about 7 seconds *then it drops to 2101 and then to 2088* and will stay there for the entire time. I am only getting to 91% power and *temps stay below 67C*. Any Ideas why it would be pulling the core clock down? I have also tried going up on the volts but it does not seem to matter.


Quote:


> Originally Posted by *KillerBee33*
> 
> I've tested FE on air with fan profile 30% to 40 Degrees next mark 75% @ 65 Degrees and 100% @ 70 degrees , havent seen higher than 72 , *starts throttling @ 57 degrees but wont go lower than 2088MHZ*


I think this is GPU Boost 3.0 working as intended, actually. GPU Boost 3.0 has "optimizations" for water-cooling. So, if the GPU knows it's being water-cooled (i.e., lower than 50C load), then it boosts even more.
Quote:


> *NVIDIA GPU Boost 3.0
> *
> Dynamically maximizes clock speeds based on workload and allows enthusiast-class controls such as temperature target and fan controls, extra over-voltage headroom, and *optimizations for water-cooling solutions*. This allows gamers to push performance to new levels.


It might be 57C, but I've heard 50C from a few people. I hope Anandtech confirms/denies this tomorrow when their full GTX 1080 review launches.
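The step-downs people report (2147 to 2101 to 2088, and so on) fit a model where GPU Boost drops one ~13 MHz bin each time a temperature threshold is crossed. A toy sketch of that behavior; the threshold temperatures here are guesses pieced together from reports in this thread, not published NVIDIA numbers:

```python
BIN_MHZ = 13  # Pascal adjusts clocks in roughly 13 MHz steps

def boosted_clock(max_boost_mhz, temp_c, thresholds=(37, 54, 63, 71)):
    """Drop one clock bin for each temperature threshold crossed.
    The thresholds are assumptions for illustration only."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

# e.g. a card that boosts to 2114 when cold:
print(boosted_clock(2114, 35))  # no thresholds crossed -> 2114
print(boosted_clock(2114, 60))  # two thresholds crossed -> 2088
```

Which would also explain why water-cooled cards that stay under the first threshold seem to hold their top bin indefinitely.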


----------



## Phinix

Quote:


> Originally Posted by *ikjadoon*
> 
> I hope Anandtech confirms/denies this tomorrow when their full GTX 1080 review launches.


How do you know this? I can't seem to find any post or comment talking about another review from them. I assume it's a very detailed review of the chip itself?


----------



## Joshwaa

Thanks for the info. Seems to be just what is going on. If EK would just hurry up with the FTW waterblock I would be all set.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> Most throttle unless they run at 100% fan. If you wanna run a solid 2.1 ghz you need better cooling


You, sir, seem to know stuff.
It's really hot in here now... 30C ambient lol.
If I do 100% fan, max power and temp target, and max volts, I'm getting a max boost of 2150 and then sitting at 2105. Card temp goes to 71C.

Not sure if this is a good card?
I'm going to custom-watercool it.


----------



## xer0h0ur

If you're hitting 2150 stable at any point, then you got a good card. I can't push much past 2114 stable and I am waterblocked. As is, the 2100 MHz mark is generally considered above average already. The outstanding cards are hitting 2200 MHz, but there really aren't many doing that.


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> You sir seem to know stuff.
> Its really hot in here now...30C ambient lol
> If i do 100% fan, max power and temp target, max volts im getting max boost 2150 and then sitting at 2105. Card temp goes to 71C.
> 
> Not sure if this is a good card?
> Im going to custom watercool it.


You got a good card. To me it's pointless putting on an EK block for $125. The temps are pretty close to what you get with a $60 AIO GTX 980 Ti hybrid. I'm riding that until a better BIOS comes out. But I load at 48C in games, 50C max.


----------



## xer0h0ur

Quote:


> Originally Posted by *uberwootage*
> 
> You got a good card. To me its pointless putting on a ek block for $125. The temps are pretty close to what you get with a $60 aio gtx 980ti hybrid. Im riding that until a better bios comes out. But i load at 48c in games 50c max.


LOL, wrong person to tell to avoid waterblocks. I don't think I've ever not put a waterblock on my video cards.

While I even somewhat regret putting a waterblock on my card, at least in the end I have a silent rig, running my radiator fans at barely a shade over 1000 RPM, and my 1080 doesn't even go past 41C at load.


----------



## uberwootage

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, wrong person to tell to avoid waterblocks. I don't think they have ever not put a waterblock on their video cards.
> 
> While I even somewhat regret putting a waterblock on my card at least in the end I have a silent rig running my radiator fans at barely a shade over 1000 RPM and my 1080 doesn't even go past 41C at load.


Yeah, to me the 8C drop is not worth the extra $65. I'll slap an EK on mine as soon as the modded BIOSes roll out; until then I'll ride this AIO on it.

Update: the new WoW patch is not hitting my 1080 hard at all. In cities and whatnot I can hold 2177, but in BGs I'm dropping to 1300 lol, 20% load. Playing WoW in a BG at 36C lol. Time for K-Boost.


----------



## Inglewood78

I think for return on investment, you can't beat a $70 AIO kit that fits the 1080 FE. I'm running an NZXT G10 with a Corsair H105 and I flatline at about 48C overclocked, under load.

If I didn't have this already, I would have bought the AIO from Amazon.


----------



## nexxusty

Quote:


> Originally Posted by *Inglewood78*
> 
> I think for return on investment, you can't beat a $70 AIO kit that fits the 1080 FE. I'm running a NZXT G10 with Corsair H105 and I flatline at about 48C overclocked, underload.
> 
> If i didn't have this already, I would have bought the AIO from amazon.


Hmm. Great news for me then. Putting an H110i on my FE tonight.

48c sounds great. Heh. Any extra heatsinks installed on RAM or VRM?


----------



## Twinnuke

Put my 1080 at a 50% power limit and a 60C temp limit. Playing Black Desert at 45 FPS maxed out with a severely underclocked card while I work. haha


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> You got a good card. To me its pointless putting on a ek block for $125. The temps are pretty close to what you get with a $60 aio gtx 980ti hybrid. Im riding that until a better bios comes out. But i load at 48c in games 50c max.










Well, I'm watercooling already, so not plugging this card into the loop would be a sin.

Pretty much what @xer0h0ur said.

This card is also hitting the power limit HARD, 130%+.

I'm also thinking of doing a hard volt mod. I'm not sure if, even with a BIOS tool, we will be able to push as much voltage as we need for watercooling.

See the 1060 vmodded to 2200MHz easy, ON AIR!


----------



## ChaosBlades

Quote:


> Originally Posted by *nexxusty*
> 
> Hmm. Great news for me then. Putting an H110i on my FE tonight.
> 
> 48c sounds great. Heh. Any extra heatsinks installed on RAM or VRM?


He is using this so you will need to get that or the black version. It supports the 1080 FE
https://www.nzxt.com/products/kraken-g10-white

Edit: Not sure on the difference but it only supports the H110 and not the H110i according to the compatibility list.


----------



## GreedyMuffin

Ended up ordering an ASUS 1080 Strix. Cooler and quieter compared to the MSI Gaming X.

Thanks for all y'all's help!


----------



## uberwootage

Quote:


> Originally Posted by *nexxusty*
> 
> Hmm. Great news for me then. Putting an H110i on my FE tonight.
> 
> 48c sounds great. Heh. Any extra heatsinks installed on RAM or VRM?


Stock cooler. Pop off the heatsink cover, bolt on the block. I've got the fan set to 60%.


----------



## nexxusty

Quote:


> Originally Posted by *ChaosBlades*
> 
> He is using this so you will need to get that or the black version. It supports the 1080 FE
> https://www.nzxt.com/products/kraken-g10-white
> 
> Edit: Not sure on the difference but it only supports the H110 and not the H110i according to the compatibility list.


Yeah, it's not Asetek-style now that I think of it. Oh well.

I have two Antec 1250s lying around. Good thing.

Thanks for the tip, sir! Oh, I have a Kraken G10 bracket; picked it up on Monday.


----------



## uberwootage

https://www.amazon.com/EVGA-Hybrid-GeForce-Cooling-400-HY-0996-B1/dp/B00ZQ4PFX2

Direct bolt on no cutting needed. Thats what im using.


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> https://www.amazon.com/EVGA-Hybrid-GeForce-Cooling-400-HY-0996-B1/dp/B00ZQ4PFX2
> 
> Direct bolt on no cutting needed. Thats what im using.


If I can't fit the G10 on with the VRM cooling solution I will cut the G10 bracket so it's identical to the EVGA one and use it that way.

I'm doing the shunt mod at the same time, not comfortable removing the power limit and not having heatsinks on the VRM.


----------



## uberwootage

Quote:


> Originally Posted by *nexxusty*
> 
> If I can't fit the G10 on with the VRM cooling solution I will cut the G10 bracket so it's identical to the EVGA one and use it that way.
> 
> I'm doing the shunt mod at the same time, not comfortable removing the power limit and not having heatsinks on the VRM.


Yeah, I've got the Gigabyte Waterforce BIOS on my FE and I can hold 2,177 solid in benches without a TDP mod.

That's why I'm keeping the fan at 60%: let the stock heatsink soak up the heat and let the fan cool it as much as it can. Works great.


----------



## wsarahan

Hi guys, how are you?

Can you help me with a 1080 SLI issue?

When I enter a game, the two cards sometimes don't boost equally. Right now I'm playing Hitman and one card is boosting to 2089 while the second card is at stock, 1759...

After some time, or when the GPU usage gets more intense, the card goes to 2089 like the other one, but I lose some FPS until that happens.

I noticed that the second card only raises its clock when usage passes somewhere around 50%. With the old series, like the 980 Ti, when I launched a game both cards always ran at the same clock no matter the usage.

I'm using AB 4.3 beta 4.

Is this normal with the new Boost 3.0 and Pascal? Any way to make both cards always boost the same?

I'm using the max performance profile in the Nvidia control panel.


----------



## ikjadoon

Quote:


> Originally Posted by *Phinix*
> 
> How do you know this? I cant seem to find any post or comment talking about another review from them, I assume it's a very detailed review of the chip itself?


THE EDITOR FROM ANANDTECH REPLIED. Go to this 10TB hard drive review, go to page 3 of the comments, and you'll find this. Ryan Smith is the video card reviewer at Anandtech.



And the image he posted?



OH HOLY MOTHER OF GOODNESS, *IT'S CONFIRMED*. Our anecdotal testing was CORRECT. Pascal GPUs DO begin to reduce maximum boost at even 39C! Expect this graph to be fully explained in the next 16 hours with an article on Anandtech's front page.
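If it helps to picture the behavior: Pascal appears to shed boost in small fixed-size bins as the temperature crosses breakpoints. Here's a toy Python model of that; the ~13MHz bin size matches what owners report, but the 39C breakpoint and 5C spacing are just assumptions for illustration, not NVIDIA's actual table.

```python
# Toy model of temperature-based boost binning. The bin size (~13 MHz)
# matches observed Pascal behavior; the breakpoints are assumptions.
BIN_MHZ = 13

def boost_clock(max_boost_mhz, temp_c, first_breakpoint_c=39, step_c=5):
    """Drop one bin at the first breakpoint, then one more per
    `step_c` degrees beyond it."""
    if temp_c < first_breakpoint_c:
        return max_boost_mhz
    bins_dropped = 1 + (temp_c - first_breakpoint_c) // step_c
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boost_clock(2100, 35))  # 2100 -- below the first breakpoint
print(boost_clock(2100, 39))  # 2087 -- one bin gone
print(boost_clock(2100, 50))  # 2061 -- three bins gone
```

If the chart holds up, keeping the die under that first breakpoint is what water buys you.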


----------



## AllGamer

Yet more reason to go watercooling.

I'm sure many of us here already suspected the temperature-to-speed-drop issue; we just never really did a detailed analysis to measure it.


----------



## ssgwright

we need a darn bios editor!!!!


----------



## Vellinious

Pretty much says what a lot of us already knew from Maxwell....the cooler you keep it, the higher it'll boost.


----------



## nexxusty

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys how are you?
> 
> Can you help me with some 1080 SLI issue?
> 
> When i enter a game the both cards sometimes do not boost equal, now i`m playing Hitman and one card is bossting to 2089 and the second card is at stock, 1759...
> 
> After sometimes or when the GPu usage is more intense the card goes to 2089 like the other one, but i loose some FPS untill this happens
> 
> I was looking and realized the the second card only goes up in the clock when the usage passes someting about 50%, but in the old series like 980TI when i launched the game the cards was always with the same clock no matter what usage
> 
> I`m using AB 4.3 beta 4
> 
> Is it normal in the new Boost 3.0 and Pascal? Any way to make both cards always boots the same?
> 
> I`m using max performance profile at Nvidia control pannel


All cards use different voltages. Likely one of your cards CAN'T boost as high in SLI.

Your only option is to cool them both better.


----------



## wsarahan

Quote:


> Originally Posted by *nexxusty*
> 
> All cards use different voltages. Likely one of your cards CAN'T boost as high in SLi.
> 
> You're only option is to cool them both better.


The problem is that one card sometimes does not boost at all. I can raise the voltage to 100% and the card stays at its default clock; only after 50% GPU usage does the card start to boost normally.


----------



## uberwootage

Quote:


> Originally Posted by *ikjadoon*
> 
> THE EDITOR FROM ANANDTECH REPLIED. Go to this 10TB hard drive review, go to page 3 of the comments, and you'll find this. Ryan Smith is the video card reviewer at Anandtech.
> 
> 
> 
> And the image he posted?
> 
> 
> 
> OH HOLY MOTHER OF GOODNESS, *IT'S CONFIRMED*. Our anecdotal testing was CORRECT. Pascal GPUs DO begin to reduce maximum boost at even 39C! Expect this graph to be fully explained in the next 16 hours with an article on Anandtech's front page.


Well, I can say for a fact my FE does not reduce max boost at 39C. The highest it's been is 50C and the boost has always stayed at 2,177.


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> https://www.amazon.com/EVGA-Hybrid-GeForce-Cooling-400-HY-0996-B1/dp/B00ZQ4PFX2
> 
> Direct bolt on no cutting needed. Thats what im using.
> Yeah i got the gigabyte waterforce bios on my FE and i can hold 2,177 solid in benches without a tdp mod.
> 
> yeah thats why im keeping the fan at 60%. let the stock heatsink soak upp the heat and let the fan cool it as much as it can. works great.
> Well i can say for a fact my FE does not reduce max boost at 39c. Hightest its been is 50c and the boost has always stayed at 2,177.


Of course it doesn't.

No idea why they said 39c. That's BS.


----------



## ikjadoon

Quote:


> Originally Posted by *uberwootage*
> 
> Well i can say for a fact my FE does not reduce max boost at 39c. Hightest its been is 50c and the boost has always stayed at 2,177.


Quote:


> Originally Posted by *nexxusty*
> 
> Of course it doesn't.
> 
> No idea why they said 39c. That's BS.


True; I had never seen it as low as 39C (see my original comment to Ryan). I'm curious: maybe this behavior is enabled/disabled or even a configurable value in certain BIOSes.


----------



## uberwootage

Quote:


> Originally Posted by *nexxusty*
> 
> Of course it doesn't.
> 
> No idea why they said 39c. That's BS.


Well we know how right tomshardware can be at times. Some of the people who write those reviews have zero idea what they are talking about.


----------



## xer0h0ur

Quote:


> Originally Posted by *ikjadoon*
> 
> THE EDITOR FROM ANANDTECH REPLIED. Go to this 10TB hard drive review, go to page 3 of the comments, and you'll find this. Ryan Smith is the video card reviewer at Anandtech.
> 
> 
> 
> And the image he posted?
> 
> 
> 
> OH HOLY MOTHER OF GOODNESS, *IT'S CONFIRMED*. Our anecdotal testing was CORRECT. Pascal GPUs DO begin to reduce maximum boost at even 39C! Expect this graph to be fully explained in the next 16 hours with an article on Anandtech's front page.


What the....what levels of ******ation cause an engineer to be like "I think we should begin the boost clock throttling at 39C. Seems like a perfectly legit temperature to start slowing down the GPU"....NOT


----------



## fewness

Does anyone know if I can simply short those three, instead of soldering resistors on top of them? What could go wrong? I mean, the voltage is limited anyway, right?

Can I just use pencil to draw a line on each of them?


----------



## xer0h0ur

There is no need to put a resistor on them. You just need to either solder a wire to short it end to end or use some CLU/P on them if you don't want to leave traces of a mod in terms of soldering.


----------



## Inglewood78

Quote:


> Originally Posted by *nexxusty*
> 
> Hmm. Great news for me then. Putting an H110i on my FE tonight.
> 
> 48c sounds great. Heh. Any extra heatsinks installed on RAM or VRM?


I bought some RAM sinks off Amazon, but even with the thermal tape they kept falling off....

I don't have anything on it now, just fans blowing across it. No instability so far...


----------



## fewness

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is no need to put a resistor on them. You just need to either solder a wire to short it end to end or use some CLU/P on them if you don't want to leave traces of a mod in terms of soldering.


That was fast. What is CLU/P?


----------



## Inglewood78

Quote:


> Originally Posted by *ikjadoon*
> 
> THE EDITOR FROM ANANDTECH REPLIED. Go to this 10TB hard drive review, go to page 3 of the comments, and you'll find this. Ryan Smith is the video card reviewer at Anandtech.
> 
> 
> 
> And the image he posted?
> 
> 
> 
> OH HOLY MOTHER OF GOODNESS, *IT'S CONFIRMED*. Our anecdotal testing was CORRECT. Pascal GPUs DO begin to reduce maximum boost at even 39C! Expect this graph to be fully explained in the next 16 hours with an article on Anandtech's front page.


I idle at 29C and have never seen it drop when the temps increase, so 39C for the first reduction does seem a little aggressive.


----------



## xer0h0ur

Coollaboratory liquid ultra or coollaboratory liquid pro


----------



## Sourcesys

Quote:


> Originally Posted by *fewness*
> 
> 
> 
> 
> 
> 
> 
> 
> That was fast. What is CLU/P?


liquid metal


----------



## fewness

Quote:


> Originally Posted by *xer0h0ur*
> 
> Coollaboratory liquid ultra or coollaboratory liquid pro


Great. Thank you!


----------



## Sourcesys

Quote:


> Originally Posted by *ikjadoon*
> 
> And the image he posted?
> 
> 
> 
> OH HOLY MOTHER OF GOODNESS, *IT'S CONFIRMED*. Our anecdotal testing was CORRECT. Pascal GPUs DO begin to reduce maximum boost at even 39C! Expect this graph to be fully explained in the next 16 hours with an article on Anandtech's front page.


This IS really ******ed, no other word can describe that way of engineering.


----------



## ikjadoon

Quote:


> Originally Posted by *uberwootage*
> 
> Well we know how right tomshardware can be at times. Some of the people who write those reviews have zero idea what they are talking about.


Quote:


> Originally Posted by *xer0h0ur*
> 
> What the....what levels of ******ation cause an engineer to be like "I think we should begin the boost clock throttling at 39C. Seems like a perfectly legit temperature limit to start slowing down the GPU....NOT


Quote:


> Originally Posted by *Inglewood78*
> 
> I idle at 29C and never seen it drop when the temps increase so the 39C for the first reduction does seem a little aggressive.


Quote:


> Originally Posted by *Sourcesys*
> 
> This IS really ******ed, no other word can describe that way of engineering.


Yeesh. I shouldn't have posted it yet. It's not Tom's Hardware--it's Anandtech. We'll all figure out exactly how they got those results tomorrow.

Let me throw out a reasonable situation that actually means Nvidia further optimized for water-cooling (and not that they're "******ed", for Pete's sake).

Maybe in GPU Boost 2.0, that GPU would've maxed out at 1820MHz--on air or water. Running it at higher voltages/frequencies may have increased the rate of voltage/heat-induced degradation. However, if NVIDIA can confirm that the GPU die is cool enough and that heat-related degradation is reduced, why not ramp it up? So, in GPU Boost 3.0, they've allowed water-cooled systems to go higher.

This is *not* unprecedented. How many motherboards offer a fix for cold bugs for LN2 users? Plenty of the high-end ones. Are you complaining, "WHAT? WHY DON'T THEY...UM...Umm...um...why don't they offer that for air users, too?! How DARE they throttle us? How DARE they? Everyone, boycott Gigabyte, ASUS, MSI, and EVGA. They throttle for air users, but not for LN2 users!"

Put down your shoddy pitchforks. Or at least stow them away for tomorrow, you boobs.


----------



## fat4l

Quote:


> Originally Posted by *fewness*
> 
> 
> 
> Does anyone know if I can simply short those three, instead of solder resistors on top of them? What could go wrong? I mean the voltage is limited anyway, right?
> 
> Can I just use pencil to draw a line on each of them?


It's these 3 ....

http://postimg.org/image/a3gm1hd9f/full/

Use CLU over them. See general mod here:
http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/3/


----------



## VSG

That drop in boost clocks is ASIC dependent btw, so don't take that chart for granted. Having said that, GPU Boost 3.0 is more trouble than benefit. Variable Vcore tied to GPU Boost and outside manual control is just another step towards the greenlight wall.


----------



## looniam

Quote:


> Originally Posted by *geggeg*
> 
> *That drop in boost clocks is ASIC dependent* btw, so don't take that chart for granted. Having said that, GPU Boost 3.0 is more trouble than benefit. Variable Vcore tied to GPU Boost and outside manual control is just another step towards the greenlight wall.


I always suspected that, since most (though not all) Maxwell cards show that behavior, and never at the same temps; anywhere from 62C to 74C, and it took a BIOS mod to relieve it.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> If you're hitting 2150 stable at any point then you got a good card. I can't push much past 2114 stable and I am waterblocked. As is the 2100 MHz mark is generally speaking considered to be above average already. The outstanding cards are hitting 2200MHz but there really aren't many doing that.


I wonder if I'm doing this all right? Or maybe wrong? I've only been playing with this card for a few hours...

Did you max the voltage/power limit/temp limit? That's what I did in Afterburner. Then I kept increasing clocks until the driver crashed...
Correct?


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> I wonder if I'm doing all good ? Or maybe wrong ? I've been playing with this card for only a few hours ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you max voltage/power limit/temp limit to the max ? That's what I did in afterburner. Then I was increaseing clocks until driver crash...
> Correct ?


I literally did that. I cranked the power limit and core voltage then started bumping the GPU clock 25MHz at a time. Something like 6 benchmarks later I figured out I couldn't push +250MHz on the core clock and I was perfectly stable at +225 so I just left it there. So my current settings are:

Core Voltage: +100%
Power Limit: 120%
Core Clock: +225
Mem Clock: +575

Load temp of 41C using those settings and a resulting GPU boost clock of 2114MHz.
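The routine we're both describing is basically a linear search for the highest stable offset. A quick sketch of it; `is_stable` is a stand-in for an actual benchmark run, and the +250MHz crash point is made up to mirror my card:

```python
# "Bump until it crashes, then back off" as a loop. `is_stable` stands
# in for running your benchmark at a given core offset; here it just
# simulates a card that falls over at +250 MHz.
def is_stable(offset_mhz, crash_point=250):
    return offset_mhz < crash_point

def find_max_offset(step=25, limit=500):
    """Raise the offset by `step` while the benchmark keeps passing."""
    offset = 0
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

print(find_max_offset())  # 225 -- last stable step below the fake crash point
```

Smaller steps near the top find the exact wall, but 25MHz steps get you there in a handful of runs.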


----------



## ikjadoon

Quote:


> Originally Posted by *geggeg*
> 
> That drop in boost clocks is ASIC dependent btw, so don't take that chart for granted. Having said that, GPU Boost 3.0 is more trouble than benefit. Variable Vcore tied to GPU Boost and outside manual control is just another step towards the greenlight wall.


That would explain why some people see it and some don't.

Hmm...you're talking about GPU Boost in general, then? I'm not sure I follow. This is what Intel does with CPUs: besides your maximum overclock and maximum Vcore, it creates its own "voltage / frequency" curve at lower CPU loads. That hasn't inhibited CPU overclocking--I think the vast majority of Intel overclockers run with all C-states and EIST enabled these days.

And, wait, doesn't GPU Boost 3.0 allow you to tweak that, with the new V/F curve? Isn't that manual control?

But, otherwise, yes. I was sad when Nvidia started Greenlight. But, I don't know--I give them a bit of credit here. Intel could've done that years ago, but they never did: you can run 3V through a modern Intel processor, even on Intel-designed motherboards. I do think GPUs might be more fragile to high voltage / high temperatures than CPUs. But, for those of us with good cooling--yup, it's a bummer.
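On the curve point, here's roughly what Boost 3.0's per-point control amounts to compared to the old single global offset. The voltage/clock pairs below are invented for illustration, not from a real card:

```python
# GPU Boost 3.0 exposes a frequency offset per voltage point instead of
# one global offset. Toy curve: voltage (mV) -> clock (MHz); values are
# illustrative only.
vf_curve = {800: 1700, 900: 1850, 1000: 1975, 1062: 2050}

def apply_offsets(curve, offsets):
    """Apply per-point MHz offsets, defaulting to 0 where none is
    given -- roughly what Afterburner's curve editor lets you do."""
    return {v: f + offsets.get(v, 0) for v, f in curve.items()}

# Push only the upper points, leave the low-load points alone:
tuned = apply_offsets(vf_curve, {1000: 50, 1062: 25})
print(tuned)  # {800: 1700, 900: 1850, 1000: 2025, 1062: 2075}
```

That per-point freedom is the "manual control" I meant, even if the card still walks the curve on its own.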


----------



## VSG

Intel, or more specifically motherboard makers, allow for fixed manual voltage without any changes as well. This is not yet possible with BIOSes that support GPU Boost 3.0. That's what I was referring to: you can set voltage states, but they will still fluctuate, and this will affect other parameters.


----------



## nexxusty

Quote:


> Originally Posted by *fewness*
> 
> 
> 
> Does anyone know if I can simply short those three, instead of solder resistors on top of them? What could go wrong? I mean the voltage is limited anyway, right?
> 
> Can I just use pencil to draw a line on each of them?


Use solder only, ideally. If you can't do that, use wire.

Using anything else is bad practice and bad advice.


----------



## nexxusty

Quote:


> Originally Posted by *ikjadoon*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeesh. I shouldn't have posted it yet. It's not Tom's Hardware--it's Anandtech. We'll all figure out exactly how they got those results tomorrow.
> 
> Let me throw out a reasonable situation that actually means Nvidia further optimized for water-cooling (and not that they're "******ed", for Pete's sake).
> 
> Maybe in GPU Boost 2.0, that GPU would've maxed out at 1820MHz--on air or water. Running it higher voltages/frequencies may have increased the rate of voltage/heat-induced degradation. However, if NVIDIA can confirm that the GPU die is cool enough and that heat-related degradation is reduced, why not ramp it up? So, in GPU Boost 3.0, they've allowed water-cooled systems to go higher.
> 
> This is *not* unprecedented. How many motherboards offer a fix for cold bugs for LN2 users? Plenty of the high-end ones. Are you complaining, "WHAT? WHY DON'T THEY...UM...Umm...um...why don't they offer that for air users, too?! How DARE they throttle us? How DARE they? Everyone, boycott Gigabyte, ASUS, MSI, and EVGA. They throttle for air users, but not for LN2 users!"
> 
> Put down your shoddy pitchforks. Or at least stow them away for tomorrow, you boobs.


I don't trust Anand for anything but benchmarks. Even then....

I'm still calling BS. My pitchfork is down for the time being; however, a 39C throttle is not happening. It would have been reported.


----------



## ArakniD

Flow rate in the WC loop, whether in parallel or series, makes SFA difference IMHO.
Quote:


> Originally Posted by *nexxusty*
> 
> User solder only ideally. If you can't do that, use wire.
> 
> Using anything else is bad practice and bad advice.


Doing that mod doesn't remove or change the power limits. I've done it to my Gigabyte; all it did was decrease the reported power usage, but it still hit the limit and reported as power-limited in GPU-Z.

Only a custom BIOS removed that limit... the Asus Strix OC BIOS, to be precise.


----------



## ssgwright

So I just flashed my Zotac FE with the "waterforce" BIOS and got a black screen... I've spent the last two hours trying to flash it back. (I'm on water, so I had to drain the loop, disconnect the card, and move it to another slot.)


----------



## KickAssCop

How are the Gigabyte G1 Gaming cards this time around? Any good news or bad news about them?
They are currently available for pre-order on Amazon for 650 bucks, which is about the maximum I am willing to pay.


----------



## GreedyMuffin

Damn. The MSI Gaming X and ASUS Strix are not in stock, and they don't know when they will be either...

Wondering if I should go for an FE, then a WB later down the road when I can afford it?

Thoughts? Do the custom cards OC better?


----------



## uberwootage

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Damn. MSI gaming X and ASUS STRIX is not in store, and they don't know when it will be either...
> 
> Wondering if I should go for a FE, then a WB later down the road when I can afford it?
> 
> Thoughts? Does the custom cards OC better?


No, FEs are the top clockers on water.

Quote:


> Originally Posted by *ssgwright*
> 
> so I just flashed my Zotac FE with the "waterforce" bios and got a black screen... I've spent the last two hours trying to flash it back.. (i"m on water so I had to drain and disconnect and move to another slot)


That's odd. That BIOS is my daily driver on an Nvidia FE. I wonder why it works on mine but not yours. I have seen nvflash freeze up a few times mid-flash, and I had to reflash before I rebooted.


----------



## wsarahan

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys how are you?
> 
> Can you help me with some 1080 SLI issue?
> 
> When i enter a game the both cards sometimes do not boost equal, now i`m playing Hitman and one card is bossting to 2089 and the second card is at stock, 1759...
> 
> After sometimes or when the GPu usage is more intense the card goes to 2089 like the other one, but i loose some FPS untill this happens
> 
> I was looking and realized the the second card only goes up in the clock when the usage passes someting about 50%, but in the old series like 980TI when i launched the game the cards was always with the same clock no matter what usage
> 
> I`m using AB 4.3 beta 4
> 
> Is it normal in the new Boost 3.0 and Pascal? Any way to make both cards always boots the same?
> 
> I`m using max performance profile at Nvidia control pannel


Anyone?


----------



## jedimasterben

Quote:


> Originally Posted by *ArakniD*
> 
> Doing that mod doesn't remove or change the power limits. I've done it to my Gigabyte. All it did was decrease the reported power usage.. but it still limited and reported limited to GPUz
> 
> Only a custom bios removed that limit.. Asus Strixx OC bios to be precise.


That's not how it works. Shorting the resistors tells the _hardware_ power limiter on the PCB that the card is using less power. As long as the card is in 3D mode and is not limited by temperature or voltage limits in the BIOS, then the card will continue to boost up and up.
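In other words, the limiter never sees the real draw. A quick back-of-the-envelope in Python of why lowering the shunt's effective resistance fools it (the shunt value and rail voltage below are illustrative, not measured from a 1080 PCB):

```python
# The limiter measures the voltage drop across the shunt and divides by
# the NOMINAL resistance, so lowering the real resistance makes it
# under-read current (and therefore power). Values are illustrative.
R_NOMINAL = 0.005  # ohms -- what the controller assumes

def reported_power(actual_power_w, rail_v, r_effective):
    actual_current = actual_power_w / rail_v
    v_sense = actual_current * r_effective   # real drop across the shunt
    reported_current = v_sense / R_NOMINAL   # the controller's math
    return reported_current * rail_v

print(reported_power(200, 12, 0.005))   # ~200 W with an unmodified shunt
print(reported_power(200, 12, 0.0025))  # ~100 W once the mod halves the resistance
```

Which is also why GPU-Z's power readout becomes meaningless after the mod: it's scaled down by the same ratio.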


----------



## uberwootage

Quote:


> Originally Posted by *wsarahan*
> 
> Anyone?


What kind of cards? What are the temps like on them? What's the PSU?


----------



## Jorginto

Quote:


> Originally Posted by *wsarahan*
> 
> Anyone?


ULPS? Did you turn it off?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ssgwright*
> 
> so I just flashed my Zotac FE with the "waterforce" bios and got a black screen... I've spent the last two hours trying to flash it back.. (i"m on water so I had to drain and disconnect and move to another slot)


Could have possibly saved yourself a headache if you'd used a different DisplayPort.

See here:

*http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/3050#post_25356232*

Luckily I didn't have to drain my loop. Got the PC to boot from another card in the lower pci-E slot so I could flash back. I tried the bios again later on, the Asus XOC in that post, with the DP cable in another spot and it worked without issue.


----------



## KillerBee33

Can someone suggest this for GPU or ther is a better option?
http://www.newegg.com/Product/Product.aspx?Item=N82E16835100007


----------



## pez

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys how are you?
> 
> Can you help me with some 1080 SLI issue?
> 
> When i enter a game the both cards sometimes do not boost equal, now i`m playing Hitman and one card is bossting to 2089 and the second card is at stock, 1759...
> 
> After sometimes or when the GPu usage is more intense the card goes to 2089 like the other one, but i loose some FPS untill this happens
> 
> I was looking and realized the the second card only goes up in the clock when the usage passes someting about 50%, but in the old series like 980TI when i launched the game the cards was always with the same clock no matter what usage
> 
> I`m using AB 4.3 beta 4
> 
> Is it normal in the new Boost 3.0 and Pascal? Any way to make both cards always boots the same?
> 
> I`m using max performance profile at Nvidia control pannel


I'm a bit late as usual, but my two cards don't always boost the same either. From what I've seen when I can be bothered to look up at my OSD, they stay within 50MHz of each other. The only time I see one of them drop lower is when a game is using the second card at 70% compared to the first card at 95%+. However, I have not seen any performance drops from this.

What game(s) specifically?
What GPUs do you have exactly?
Are the BIOSes the same in GPU-Z, and do you have the cards linked in MSI AB?


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> Can someone suggest this for GPU or ther is a better option?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835100007


I use Gelid Extreme, performs better than AS5. You wouldn't want AS5 on a GPU, though, if you want Arctic Silver you want to use Ceramique instead, as it is not capacitive, which can cause issues.


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> I use Gelid Extreme, performs better than AS5. You wouldn't want AS5 on a GPU, though, if you want Arctic Silver you want to use Ceramique instead, as it is not capacitive, which can cause issues.


http://www.newegg.com/Product/Product.aspx?Item=N82E16835426020&cm_re=Gelid_Extreme-_-35-426-020-_-Product
?


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835426020&cm_re=Gelid_Extreme-_-35-426-020-_-Product
> ?


That's the stuff. There is also Kryonaut Grizzly or something like that that also performs exceptionally well, but I've never used it before, mostly because I don't change hardware all that often


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> That's the stuff. There is also Kryonaut Grizzly or something like that that also performs exceptionally well, but I've never used it before, mostly because I don't change hardware all that often


Never had to use any at all; I'm taking the hybrid kit off the 980 and putting it back to stock with the factory heatsink.


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> never had to use any at all , taking a hybrid kit off the 980 and putting it to stock with Factory Heatsink


Gotcha. Pretty much anything will work, then, but always good to go with some of the best


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> Gotcha. Pretty much anything will work, then, but always good to go with some of the best


Thank you. I want to try and slap an H90 on the 1080 FE; I'll take that EVGA Hybrid kit apart for the pump bracket.


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> Thank you. Want to try and slap an H90 on the 1080 FE will take that EVGA Hybrid kit apart for the pump bracket


I really doubt the H90 will give any better performance than the EVGA CLC. I'm using one currently and before I stuck it into the side of my new Corsair 240 it was hitting a whopping 53C at OC'd load with two Gentle Typhoon AP-14 in push/pull, so you'd pretty much need something with a double rad to do any better.


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> I really doubt the H90 will give any better performance than the EVGA CLC. I'm using one currently and before I stuck it into the side of my new Corsair 240 it was hitting a whopping 53C at OC'd load with two Gentle Typhoon AP-14 in push/pull, so you'd pretty much need something with a double rad to do any better.


My 980 gets up to 60C but never higher @ 1544+4005 on the EVGA AIO. You don't think a 140mm will give the 1080 a better chance?


----------



## fewness

Quote:


> Originally Posted by *fat4l*
> 
> It's these 3 ....
> 
> http://postimg.org/image/a3gm1hd9f/full/
> 
> Use CLU over them. See general mod here:
> http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/3/


OK, now I'm confused... which three should I short, then? If I get the wrong ones, will it kill the card?


----------



## fat4l

Quote:


> Originally Posted by *fewness*
> 
> OK, now I'm confused... which 3 should I short then? If I get the wrong ones, will it kill the card?


Check this post here :

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/3000_30#post_25355541

_"You can find current shunts which are marked *RS1, RS2, RS3* on PCB with black "_

This is one of them:


----------



## jedimasterben

Quote:


> Originally Posted by *KillerBee33*
> 
> My 980 gets up to 60 but never higher @ 1544/+4005 on the EVGA AIO; you don't think a 140mm will give the 1080 a better chance?


Probably not, more than likely you'd get better performance by changing the fans or changing air flow in your case


----------



## ssgwright

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Could have possibly saved you a headache if you'd used a different DisplayPort.
> 
> See here:
> 
> *http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/3050#post_25356232*
> 
> Luckily I didn't have to drain my loop. Got the PC to boot from another card in the lower pci-E slot so I could flash back. I tried the bios again later on, the Asus XOC in that post, with the DP cable in another spot and it worked without issue.


I remembered that about the DisplayPorts; I tried and none of them worked. Then I used my son's 970 in a lower slot to try and flash, but the only way I could get into Windows was with the NV driver installed, and with it installed I couldn't flash, and when I tried to disable the driver (or the 970 in Device Manager) I got a black screen. So after about 2 hrs I gave up (trying to flash blind after hitting disable in Device Manager lol), still didn't get the famous nvflash "beeps", so I shut her down and drained, switched cards, booted, and lo and behold the blind flash I did before the drain worked??? wth... no beeps but it flashed, so I drained for nothing. Sorry, I'm just rambling writing this, it's 6am here.


----------



## ssgwright

Quote:


> Originally Posted by *jedimasterben*
> 
> I use Gelid Extreme, performs better than AS5. You wouldn't want AS5 on a GPU, though, if you want Arctic Silver you want to use Ceramique instead, as it is not capacitive, which can cause issues.


on the core it's no issue, I wouldn't use it for the ram or anything else though


----------



## KillerBee33

Quote:


> Originally Posted by *jedimasterben*
> 
> Probably not, more than likely you'd get better performance by changing the fans or changing air flow in your case


Tried everything else and this is the best i've got,

This setup keeps 6700 @ 4.7 and 1.32V @ 74 Max


----------



## Joshwaa

Quote:


> Originally Posted by *KillerBee33*
> 
> Can someone suggest this for GPU, or is there a better option?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835100007


I have used Gelid eXtreme for about the past 2 years. Used Antec Formula 7 before that. They are about the same to me.


----------



## uberwootage

Quote:


> Originally Posted by *KillerBee33*
> 
> Can someone suggest this for GPU, or is there a better option?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835100007


I used to love AS5; now I'm using NT-H1


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> I remembered that about the display ports, I tried and none of them worked. Then I used my sons 970 in a lower slot to try and flash but the only way I could get into windows was with the nv driver installed and with it installed I couldnt flash, and when I tried to disable the driver (or 970 in device manager) I got a black screen. So after about 2 hrs I gave up (trying to flash blind after hitting disable in the device manager lol) still didn't get the famous nvflash "beeps" so I shut her down and drained, switched cards booted and low and behold the blind flash I did before the drain worked??? wth... no beeps but it flashed so I drained for nothing. sorry I'm just rambling writing this, it's 6am here.


Enable your onboard graphics when you're flashing. After a bad flash, plug the HDMI into the onboard, then boot and flash back.

Still trying to figure out what went wrong. That was the same BIOS I'd been running for a few days now without issue. The only difference is you have a Zotac FE and I have an NVIDIA FE, but they are the same card, so it must have been an nvflash issue


----------



## Shadowdane

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys, how are you?
> 
> Can you help me with a 1080 SLI issue?
> 
> When I enter a game, the two cards sometimes don't boost equally. Right now I'm playing Hitman and one card is boosting to 2089 while the second is at stock, 1759...
> 
> After some time, or when the GPU usage gets more intense, the card goes to 2089 like the other one, but I lose some FPS until that happens.
> 
> I noticed the second card only raises its clock when usage passes about 50%, but with the old series, like the 980 Ti, when I launched a game the cards always ran the same clock no matter the usage.
> 
> I'm using AB 4.3 beta 4.
> 
> Is this normal with the new Boost 3.0 and Pascal? Any way to make both cards always boost the same?
> 
> I'm using the max performance profile in the NVIDIA control panel


I get the same thing here sometimes if GPU usage is below 50% on both cards: the clock speeds on the 2nd card (the lower one) will drop maybe 150-200 MHz below the primary card.

With GPU Boost 3.0 the clock speeds can vary based on GPU load; if you're not at high load, clocks will drop. I've especially seen this if I turn on an fps limiter and the GPUs don't need to work as hard. I saw this a lot in Fallout 4, as I'm using a 60 fps limiter since weird stuff happens in Bethesda's games at high fps.

For games where my GPU load is only at 50% or lower on each, I just crank up the MSAA or other settings, since you clearly have GPU power to spare. For example I cranked GTA5 up to 8xMSAA, which looks amazing btw. Now both cards run at 2012-2038 MHz pretty much all the time and load in the 70-95% range on each, depending on the area I'm in.


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> Enable your onboard when your flashing. Bad flash plug the HDMI into your onboard then boot and flash back.
> 
> Still trying to figure out what went wrong. That was the same bios I been running for a few days now without issue. The only diff. is you have a zotac fe and I have a nvidia fe but they are the same card so it must of been a nvflash issue


I don't have an on-board vid option on my x99


----------



## Sheyster

Quote:


> Originally Posted by *wsarahan*
> 
> Anyone?


Download EVGA PrecisionXOC and try the K-boost feature.

Quote:


> Originally Posted by *Sourcesys*
> 
> SLI is ****.


No wonder you have 0 rep with this kind of response.


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> I don't have an on-board vid option on my x99


That sucks. You should pick up a cheap GT 710 as a backup for bad flashes; you can get one for about $10


----------



## Sourcesys

Quote:


> Originally Posted by *Sheyster*
> 
> No wonder you have 0 rep with this kind of response.


Maybe it's because I have less than 30 posts?

Sorry for pointing out the obvious tho


----------



## xer0h0ur

Quote:


> Originally Posted by *ssgwright*
> 
> so I just flashed my Zotac FE with the "waterforce" bios and got a black screen... I've spent the last two hours trying to flash it back.. (i"m on water so I had to drain and disconnect and move to another slot)


Exactly why I have just been letting people figure out what works and what doesn't in terms of the BIOS options out there. I am waterblocked in an mATX system with only one usable PCI-E slot right now, so I would be in a world of hurt if my 1080 black screened. I would have to buy a 16x PCI-E miner extension cable to hook up a 2nd video card to the other PCI-E slot.


----------



## Joshwaa

Anyone else notice that the PhysX box is not checked in GPU-Z even though it is installed?


----------



## boredgunner

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone else notice that the PhysX box is not checked in GPU-Z even though it is installed?


I'll check when I get home, but I've had zero issues running GPU PhysX games with the GTX 1080. In fact the gap between it and my previous 1488 MHz GTX 980 Ti tends to be biggest in PhysX games.


----------



## KillerBee33

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone else notice that the PhysX box is not checked in GPU-Z even though it is installed?


Did you install the driver through Windows Update? Try reinstalling, or load up Borderlands if you have it; it'll crash if PhysX isn't installed.

Forget the red OUTLINE, just the PhysX box


----------



## Joshwaa

Quote:


> Originally Posted by *KillerBee33*
> 
> Did you install Driver thru WINDOWS UPDATES? Try reinstalling or Load up Borderlands if you have it, it'll crash if PhysX isn't installed.
> 
> Forget the red OUTLINE just the PhysX box


I do not use Windows Update; I used NVIDIA's drivers. Seemed odd. I even tried the clean install function a 2nd time and still no PhysX box checked. I do not have a game that uses it, so I cannot test that way.


----------



## GreedyMuffin

Wondering if I should pick up an EVGA 1080 FE demo unit for $80 less than the normal price?

Same warranty etc.


----------



## KillerBee33

Quote:


> Originally Posted by *Joshwaa*
> 
> I do not use Windows Updates. Used nVidia drivers. Seemed odd. I even tried using the clean install function a 2nd time and still no phsyx box checked. I do not have a game that uses it so I can not test that way.


Weird indeed. Try this and reinstall again
http://www.wagnardmobile.com/forums/viewtopic.php?f=5&t=281


----------



## nexxusty

Quote:


> Originally Posted by *pez*
> 
> I'm a bit late as usual, but my two cards don't always boost the same, but from what I've seen when I can be bothered to look up at my OSD is that they're within 50Mhz of each other. The only time I see one of them drop lower is when I'm in a game that is using the second card at 70% compared to the first card at 95+%. However, I have not seen any performance drops from this.
> 
> What game(s) specifically?
> What GPUs do you have exactly?
> Are the BIOS' the same in GPU-z and do you have them linked in MSI AB?


I already answered this. Basically the same answer as you too.

He keeps asking this question, 3rd time by my count, with no mention of people responding. My guess is he doesn't like the answers? I couldn't care less tbh.

I'd avoid this guy....


----------



## Joshwaa

Quote:


> Originally Posted by *KillerBee33*
> 
> Weird indeed. Try this and reinstall again
> http://www.wagnardmobile.com/forums/viewtopic.php?f=5&t=281


I will give it a try. That is a newer version than what I have been using.


----------



## wsarahan

Quote:


> Originally Posted by *pez*
> 
> I'm a bit late as usual, but my two cards don't always boost the same, but from what I've seen when I can be bothered to look up at my OSD is that they're within 50Mhz of each other. The only time I see one of them drop lower is when I'm in a game that is using the second card at 70% compared to the first card at 95+%. However, I have not seen any performance drops from this.
> 
> What game(s) specifically?
> What GPUs do you have exactly?
> Are the BIOS' the same in GPU-z and do you have them linked in MSI AB?


Any game; in benchmarks the cards boost normally.

The BIOSes are the same, just checked here.

My GPUs are EVGA 1080 SC ACX 3.0


----------



## wsarahan

Quote:


> Originally Posted by *Shadowdane*
> 
> I get the same thing here sometimes if my GPU Usage % is below 50% on both cards.. the clockspeeds on the 2nd card (lower one) will drop maybe 150-200Mhz below the primary card.
> 
> WIth GPU Boost 3.0 the clockspeeds can vary based on GPU load, if your not at high load clocks will lower. I've especially seen this if I turn on a fps limiter and the GPUs don't need to work as hard. I was seeing this a lot in Fallout 4 as I'm using a 60fps limiter as weird stuff happens in Bethesda's games with high fps.
> 
> For games where my GPU load is only at 50% or lower on each I just crank up the MSAA or other settings as you clearly have more GPU power to spare. For example I cranked GTA5 up to 8xMSAA, which looks amazing btw. Now both cards run at 2012-2038Mhz pretty much all the time and load in the 70-95% range on each depending on the area I'm in.


Thanks, yes this happens here.

With the Heaven benchmark, for example, the cards boost equally because both stay at 99%.

In Hitman, COD... sometimes the second card does not boost at all, staying at the default clock; just the primary card boosts.

What does K-Boost in EVGA Precision do?

Thanks again


----------



## nexxusty

Quote:


> Originally Posted by *wsarahan*
> 
> Thanks, Yes this happens here
> 
> With Heaven banchmark for example the cards boost equal because stays both at 99%
> 
> At Hitman, COD.... sometimes the second card do not boost at all staying with the default clock, just the primary acrd boosts
> 
> About the KBOOST at EVGA precison what it does?
> 
> Thanks again


Ugh.... is this really that hard to figure out?

Benchmarks = using both GPUs at 99% at all times.

Games = not always using 99% of both GPUs = clock throttling.

Simple. Nothing erroneous happening here.


----------



## ikjadoon

Quote:


> Originally Posted by *nexxusty*
> 
> I don't trust Anand for anything but benchmarks. Even then....
> 
> I'm still calling BS. My pitchfork is down for the time being however a 39c throttle is not happening. It would have been reported.


Well, it's kind of a benchmark.

In a way. It's a graph. We can agree it's a graph for sure.

I've now noticed a few people in the GTX 1070 thread reporting 40C now, after they've done more careful testing (looking at temperature ramp-up instead of just the last 30 seconds).
Quote:


> Originally Posted by *marik123*
> 
> I just did more testing tonight and my GPU stays at 2100mhz boost if temp < 40c, then drop 12.5mhz per 3-5c increase. The lowest I seen in boost is 2037.5mhz when temperature reached 65C during Heaven 4.0.


That's on an ASUS STRIX GTX 1070, too.
Quote:


> Originally Posted by *geggeg*
> 
> Intel, or more specifically motherboard makers, allow for fixed manual voltage without any changes as well. This is not yet possible with BIOS that support GPU Boost 3.0. That's what I was referring to, in that you can set voltage states but they will still fluctuate and this will affect other parameters.


Er, in a way. _Technically_, any CPU will throttle at TjMax; I'm unsure if that can be turned off in the motherboard. It's just that NVIDIA is far more aggressive: they aren't waiting until TjMax, they start at 40-50C.

I see your point, though--I'm being pedantic.

Well, Anandtech wasn't certain what was the cause either:
Quote:


> For what it's worth, the GTX 1080 gets up to 68C relatively quickly, so GPU performance stabilizes rather soon. But this does mean that GTX 1080's performance is more temperature dependent than GTX 980's. *Throwing a GTX 1080 under water could very well net you a few percent performance increase* by avoiding the compensation effect, along with any performance gained from avoiding the card's 83C temperature throttle.
> 
> In any case, I believe this to be compensation for the effects of higher temperatures on the GPU, backing off on voltages/clockspeeds due to potential issues. What those issues are I'm not sure; it could be that *16nm FinFET doesn't like high voltages at higher temperatures* (*NVIDIA takes several steps to minimize GPU degradation*), or something else entirely.


I don't understand why NVIDIA would limit performance unless they had actual degradation concerns. And if they were just trying to stop us from overclocking, AMD would've allowed overvolting to appeal to the OC crowd, but AFAIK they have limited it as well.

I do think it's an actual degradation concern, but I don't keep up with long-term overclockers and their GPUs either, haha, so I've no idea what it takes to degrade these 7.2B-transistor chips.
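marik123's observation reads like a simple step function. Here's a toy model, purely illustrative and built only from his quoted numbers — the uniform 5 °C step is my assumption from his "3-5c" range, chosen because it reproduces the 2037.5 MHz he saw at 65C:

```python
import math

def boost_clock(temp_c, max_clock=2100.0, knee_c=40, step_c=5, step_mhz=12.5):
    """Toy model of GPU Boost 3.0 temperature bins: full boost below the
    knee, then one 12.5 MHz bin dropped for every 5 C above it.
    Numbers come from one user's observations, not any NVIDIA spec."""
    if temp_c < knee_c:
        return max_clock
    bins = math.ceil((temp_c - knee_c) / step_c)
    return max_clock - bins * step_mhz

# Matches the reported figures: 2100 MHz below 40 C, 2037.5 MHz at 65 C.
```

In this model, shaving 15C off the core under water claws back three or four 12.5 MHz bins, which lines up with the modest but real gains people here report from blocks.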


----------



## wsarahan

Quote:


> Originally Posted by *nexxusty*
> 
> Ugh.... is this really that hard to figure out?
> 
> Benchmarks = Using both GPU's at 99% at all times.
> 
> Games = Not always using 99% of both GPU's = clock throttling.
> 
> Simple. Nothing erroneous happening here.


I know, my friend.

Anyway, I was just trying to say that in this generation my boost is not equal, compared with what I had on my 980 Ti SLI for example.

The clocks go up and down a lot of the time, and even without the 144 fps limit the second card does not boost sometimes.

But it's OK, I got what you said.

Thanks


----------



## nexxusty

Quote:


> Originally Posted by *wsarahan*
> 
> I know my friend
> 
> Anyway, i was just trying to say that in this generatiion my boost is not equal from what i had at my 980TI SLi for example
> 
> The clocks go up and down a lot of times, even without the limit of 144 fps the second card do not boost sometimes
> 
> But it`s ok i got what you said
> 
> Thanks


Apologies for my demeanor. I was under the impression you weren't listening to us.

I know man... I feel you. We ALL hate GPU Boost 3.0. I mean actual hate too... I don't hate anything really... GPU Boost 3.0 however
... lol.
Quote:


> Originally Posted by *ikjadoon*
> 
> Well, it's kind of a benchmark.
> 
> 
> 
> 
> 
> 
> 
> In a way. It's a graph. We can agree it's a graph for sure.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've now noticed a few people in the GTX 1070 thread reporting 40C now, after they've done more careful testing (looking at temperature ramp-up instead of just the last 30 seconds).
> That's on an ASUS STRIX GTX 1070, too.
> Er, in a way. _Technically_, any CPU will throttle at the TjMax. I'm unsure if that option can be turned off in the motherboard. It's just that NVIDIA is far more aggressive: they aren't waiting until TjMax, but even 40-50C.
> 
> I see your point, though--I'm being pedantic.
> 
> Well, Anandtech wasn't certain what was the cause either:
> I don't understand why NVIDIA would limit performance unless they had actual degradation concerns. Or, if they were just trying to stop us from overclocking, AMD would've allowed overvolting to appeal to the OC crowd--but AFAIK, they have limited it as well.
> 
> I do think...it's an actual degradation concern, but I don't keep up with long-term overclockers and their GPUs, either, haha, so I've no idea what it takes to degrade these 7.2B transistor chips.


Excess voltage degrades chips; heat and time do too, to a far lesser extent.

Nothing else. So.... if we stay at 1.1 V and keep temps under 50C there will be no degradation. I can guarantee it. No matter what MHz the core is at.


----------



## xer0h0ur

It's not the transistor count they are worried about so much as the new manufacturing node. I suppose they are worried about the long-term effects of overclocking these FinFET GPUs.


----------



## wsarahan

Quote:


> Originally Posted by *nexxusty*
> 
> Apologies for my demeanor. I was under the impression you weren't listening to us.
> 
> I know man... I feel you. We ALL hate GPU Boost 3.0. I mean actual hate too... I don't hate anything really... GPU Boost 3.0 however
> ... lol.
> Excess voltage degrades chips, heat, and time to an extreme lesser extent.
> 
> Nothing else. So.... if we stay at 1.1v and keep temps under 50c there will be no degradation. I can guarantee it. No matter what mhz the core is at.


No problem man

I was testing some games here. I think Hitman's SLI support is crap; every other game sooner or later gets the GPUs boosting equally, it can take some time but it happens. Hitman does whatever it wants, even with the SLI profile.

Problem solved


----------



## uberwootage

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Wondering if I should pick up an EVGA 1080 FE demo unit for $80 less than the normal price?
> 
> Same warranty etc.


Up to you. FEs on water are GTX 1080s in beast mode.


----------



## nexxusty

Quote:


> Originally Posted by *wsarahan*
> 
> No problem man
> 
> I was testing some games here, i think the Hitman SLI is a crap, all other games one time or another make the gpu boost equal, can take some time but happens, the Hitmam even SLI profile do when he want
> 
> Problem solved


I've heard there can be issues with SLi and Hitman.

Maybe try locking the clocks lower on both cards, or lock both GPU clocks to the value the one GPU eventually drops to. Might stop fps fluctuations.

Worth a shot.


----------



## ikjadoon

Quote:


> Originally Posted by *nexxusty*
> 
> Excess voltage degrades chips, heat, and time to an extreme lesser extent.
> 
> Nothing else. So.... if we stay at 1.1v and keep temps under 50c there will be no degradation. I can guarantee it. No matter what mhz the core is at.


OK. So, then...in fact...Nvidia's reasoning makes sense here? If the temperatures are low enough to avoid degradation...then they ramp up the voltage & clock speeds because, measurably, the possibility of degradation is lower.

In other words,

high volts / high clocks / low temp - less degradation
high volts / high clocks / high temp - more degradation

So, in effect, NVIDIA is "giving more performance", right? Or no? From what I can tell, Boost 2.0 didn't give you higher volts / higher frequencies at 35C vs 65C, right?
Quote:


> Originally Posted by *xer0h0ur*
> 
> Its not the transistor count they are worried about so much as the new manufacturing node process. I suppose they are worried about long term effects of overclocking these finfet GPUs.


Err, true. I was maybe getting at the same idea, as 16nm FF is what allowed that high transistor count, but I agree. I don't know if/how degradation changes with FF.
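If degradation really is the motive, the usual back-of-the-envelope is the Arrhenius temperature factor from Black's electromigration equation. A sketch of relative lifetime at two core temperatures; the 0.7 eV activation energy is a generic textbook placeholder, not anything NVIDIA or TSMC has published for 16nm FinFET:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def relative_mttf(cool_c, hot_c, activation_ev=0.7):
    """Ratio of mean time to failure at cool_c vs hot_c, using only the
    Arrhenius factor of Black's equation, MTTF ~ exp(Ea / (k*T)).
    Current density is held constant; Ea = 0.7 eV is a placeholder."""
    t_cool = cool_c + 273.15  # convert to Kelvin
    t_hot = hot_c + 273.15
    return math.exp((activation_ev / BOLTZMANN_EV) * (1.0 / t_cool - 1.0 / t_hot))
```

With these assumptions a core held at 50C lasts roughly an order of magnitude longer than the same core at 85C, which at least makes a temperature-gated boost curve look self-consistent, whatever NVIDIA's actual reasoning.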


----------



## nexxusty

Quote:


> Originally Posted by *ikjadoon*
> 
> OK. So, then...in fact...Nvidia's reasoning makes sense here? If the temperatures are low enough to avoid degradation...then they ramp up the voltage & clock speeds because, measurably, the possibility of degradation is lower.
> 
> In other words,
> 
> *high volts* / high clocks / low temp - less degradation
> high volts / high clocks / high temp - more degradation
> 
> So, in effect, NVIDIA is "giving more performance", right? Or no? From what I can tell, Boost 2.0 didn't give you higher volts / higher frequencies at 35C vs 65C, right?
> Err, true. I was maybe getting at the same idea, as 16nm FF is what allowed that high transistor count, but I agree. I don't know if/how degradation changes with FF.


I can't comment on voltage above 1.1v at this time. I doubt even 1.20v would degrade any card providing said card had sufficient cooling however.

I would agree with them giving more performance, 2.0 definitely did not have such a wide spectrum of conditions. It's currently not working out for the enthusiasts though.

If there is any merit to 39c throttling we are effed. 39c is impossible to hold on air/water under load. Without a doubt.


----------



## xer0h0ur

Quote:


> Originally Posted by *nexxusty*
> 
> I can't comment on voltage above 1.1v at this time. I doubt even 1.20v would degrade any card providing said card had sufficient cooling however.
> 
> I would agree with them giving more performance, 2.0 definitely did not have such a wide spectrum of conditions. It's currently not working out for the enthusiasts though.
> 
> If there is any merit to 39c throttling we are effed. 39c is impossible to hold on air/water under load. Without a doubt.


If I wasn't dealing with so many bills right now I would say "challenge accepted" but it would of course involve a lot of radiator real estate to maintain sub 39C temps at load.


----------



## nexxusty

Quote:


> Originally Posted by *Artah*
> 
> what do you guys use to speed up the voltage increment testing to find the stable uncore voltages? I want to run it at 3.6/3.7 only for now.


I've always just used anything AVX-based: RealBench, HandBrake or IBT.

I've taken to testing a potential OC, be it CPU, cache or RAM, with 20 passes of IBT on high, looking for symmetry in the GFLOPS numbers the test shows you.

It shows instability for me very quickly. I can usually bang out a stable OC in 2-3 hours with how fast these new CPUs finish.
Quote:


> Originally Posted by *xer0h0ur*
> 
> If I wasn't dealing with so many bills right now I would say "challenge accepted" but it would of course involve a lot of radiator real estate to maintain sub 39C temps at load.


Haha. Diminishing returns will force you to use so many inches of rad space it would be nuts.

I've seen a Q6600 at 4 GHz @ 1.5 V with a massive pump and a truck radiator; it loaded at 38C or so, I think. He had it set up really nicely too, quick disconnects going to tubing from the basement. It was wild.


----------



## kx11

Galax HOF 1080 bios

https://www.dropbox.com/s/7l444osxkg2m3vm/GP104-HOF1080.rom?dl=0

now some shots


----------



## uberwootage

Quote:


> Originally Posted by *kx11*
> 
> Galax HOF 1080 bios
> 
> https://www.dropbox.com/s/7l444osxkg2m3vm/GP104-HOF1080.rom?dl=0
> 
> now some shots


Quote:


> Originally Posted by *nexxusty*
> 
> Apologies for my demeanor. I was under the impression you weren't listening to us.
> 
> I know man... I feel you. We ALL hate GPU Boost 3.0. I mean actual hate too... I don't hate anything really... GPU Boost 3.0 however
> ... lol.
> Excess voltage degrades chips, heat, and time to an extreme lesser extent.
> 
> Nothing else. So.... if we stay at 1.1v and keep temps under 50c there will be no degradation. I can guarantee it. No matter what mhz the core is at.


Too sexy


----------



## ArakniD

Quote:


> Originally Posted by *jedimasterben*
> 
> That's not how it works. Shorting the resistors tells the _hardware_ power limiter on the PCB that the card is using less power. As long as the card is in 3D mode and is not limited by temperature or voltage limits in the BIOS, then the card will continue to boost up and up.


Well, I did it and I still got power limited. Couldn't push past 1900 and 1.06 V... Gigabyte BIOS.

It was like the limit never lifted.
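For anyone following the shunt-mod discussion: the power limiter's view is simple current-sense arithmetic. It measures the voltage drop across a shunt of assumed resistance, so adding a parallel conductive path (the CLU-over-the-shunt trick) lowers the real resistance and makes the same current read smaller. A toy sketch; the 5 mΩ figure is hypothetical, not the 1080's actual shunt value:

```python
def reported_power(actual_watts, r_nominal_ohm, r_effective_ohm):
    """The controller computes I = V_drop / r_nominal, but the real drop is
    produced by r_effective, so reported power scales by the ratio.
    Simplified: ignores sense-line offsets and per-rail splits."""
    return actual_watts * (r_effective_ohm / r_nominal_ohm)

# Halving the effective shunt resistance halves what the limiter sees:
# a real 250 W draw reads as 125 W, so the card keeps boosting.
```

One mundane consequence: if the bridging is incomplete, the effective resistance barely changes and the reported power barely drops, so the limit would appear never to lift.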


----------



## raidflex

Anyone come from a GTX 780 Ti SLI setup to a 1080? I am really considering going to a 1080 just to get off SLI, even if the performance difference isn't huge. On top of that, my 780 Tis probably draw close to 600 watts overclocked, so the heat dump in my WC loop would be significantly less.

Also trying to get an idea of what kind of clocks people are getting on water; from what I have read, not much of a difference from air. I would water-cool the card anyway, as I already have a full custom setup.

I was thinking about the EVGA FTW 1080 because of the extra 8-pin power connector, though I'm not sure that will help the OC. The heatsink/fan means nothing to me since I will be water-cooling it anyway.


----------



## Joshwaa

Quote:


> Originally Posted by *raidflex*
> 
> Anyone come from an GTX 780 TI SLI setup to a 1080? I am really considering going to a 1080 just to get off SLI, even if the performance difference isn't huge. On top of that my 780 Ti's probably draw close to 600 watts while overclocked, so heat dump in my WC loop would be significantly less.
> 
> Also trying to get an idea of what kind of clocks people are getting on water, from what I have read already not much of a difference from air. I would WC the card anyways as I already have a full custom setup.
> 
> I was thinking about the EVGA FTW 1080 because of the extra 8-pin power connector, not sure if that will help the OC though. The heatsink/fan means nothing to me since I will be WC it anyways.


Water cooling will help to sustain higher clocks due to the 50C downclock point I am noticing. However you would be in the same boat I am in waiting for EK to make a block for the card.


----------



## boredgunner

Quote:


> Originally Posted by *Joshwaa*
> 
> However you would be in the same boat I am in waiting for EK to make a block for the card.


Unless...

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127952

In stock at the time of this post.


----------



## nexxusty

Quote:


> Originally Posted by *ArakniD*
> 
> Well I did it and I still got power limited. Couldn't push pass 1900 and 1.06v .. gigabyte bios.
> 
> It was like the limit never lifted.


Then you applied the mod incorrectly.


----------



## Joshwaa

That's the one I was considering to make things easier, but I have been sticking with EVGA since an MSI RMA problem I had to go through.


----------



## xer0h0ur

Quote:


> Originally Posted by *raidflex*
> 
> Anyone come from an GTX 780 TI SLI setup to a 1080? I am really considering going to a 1080 just to get off SLI, even if the performance difference isn't huge. On top of that my 780 Ti's probably draw close to 600 watts while overclocked, so heat dump in my WC loop would be significantly less.
> 
> Also trying to get an idea of what kind of clocks people are getting on water, from what I have read already not much of a difference from air. I would WC the card anyways as I already have a full custom setup.
> 
> I was thinking about the EVGA FTW 1080 because of the extra 8-pin power connector, not sure if that will help the OC though. The heatsink/fan means nothing to me since I will be WC it anyways.


Join the club, I dumped my fire breathing dragons in favor of the 1080 for the same reasons. 295X2 + 290X may have been more power when tri-fire worked properly but all that heat, power consumption and needing an extra radiator just was an annoyance. Frankly multi-gpu setups aren't good enough and aren't supported properly enough yet. Was the same for my GTX 690. So for the foreseeable future I am sticking to single powerful GPUs.


----------



## marc0053

Just tried the HOF BIOS on my Gigabyte G1 and now the GPU won't POST past error 62 on the ASUS RE10 motherboard and the Z170 MOCF.
Maybe just a bad flash on my part, but be careful, guys, when you flash a non-reference GPU BIOS.


----------



## nexxusty

Quote:


> Originally Posted by *marc0053*
> 
> Just tried the HOF bios on my gigabyte G1 and now the gpu won't post past error 62 on the ASUS RE10 motherboard and Z170 MOCF.
> Maybe just a bad flash on my part but be careful guys when you flash a non reference gpu bios.


Obviously.

I think you're all a bit naive for flashing BIOSes. None of you has any idea what you're actually doing to the card.

Learn how to solder and watercool, not how to flash BIOSes from completely different cards with different PCBs, for god's sake.

This is not the first time I've said this.


----------



## uberwootage

Testing the HOF BIOS right now. Let's see how it does.


----------



## kx11

Crap, my high-bandwidth SLI bridge is nowhere to be found.

This Gigabyte X99 Ultra Gaming mobo is awesome, but it doesn't include a DDR4 auto-OC test.


----------



## ikjadoon

Quote:


> Originally Posted by *nexxusty*
> 
> I can't comment on voltage above 1.1v at this time. I doubt even 1.20v would degrade any card providing said card had sufficient cooling however.
> 
> I would agree with them giving more performance, 2.0 definitely did not have such a wide spectrum of conditions. It's currently not working out for the enthusiasts though.
> 
> If there is any merit to 39c throttling we are effed. 39c is impossible to hold on air/water under load. Without a doubt.


Err, for enthusiasts, wait: isn't this a good thing? We get higher clocks if we run cooler. You don't have to get to 39C; every 3-5C apparently helps.

Maybe I'm thinking about this in a weird way. In my mind, it's like if Intel let your 4.4 GHz i7-5930K boost to 4.8 GHz when you kept the temperature below 50C, and over 50C you went back to your normal OC. I mean, like a Turbo Boost on top of your overclock.

It's just that NVIDIA's choice of 39C as the start point is... curiously low.
Quote:


> Originally Posted by *xer0h0ur*
> 
> If I wasn't dealing with so many bills right now I would say "challenge accepted" but it would of course involve a lot of radiator real estate to maintain sub 39C temps at load.


True. But, I think 39C is only for the highest bins. Seems like every 3-5C you can drop, you gain another 13MHz.
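That 3-5C-per-13MHz anecdote is easy to turn into a back-of-the-envelope estimate. A toy sketch, with the bin size, temperature step, and 39C ceiling taken from this thread's anecdotes rather than any NVIDIA spec:

```python
# Toy model of GPU Boost 3.0 thermal bins as described in this thread.
# Assumptions (forum anecdotes, not NVIDIA documentation):
#   - below 39C the card holds its top boost bin
#   - each ~4C above that costs one ~13MHz bin
BIN_MHZ = 13     # one boost bin
STEP_C = 4       # temperature drop per bin (midpoint of the 3-5C anecdote)
CEILING_C = 39   # temperature below which no bins are lost

def boost_penalty_mhz(temp_c: float) -> int:
    """Estimated MHz lost to thermal binning at a given GPU temperature."""
    if temp_c <= CEILING_C:
        return 0
    bins_lost = int((temp_c - CEILING_C) // STEP_C) + 1
    return bins_lost * BIN_MHZ

# e.g. what a card at 63C on air gives up versus one held at 39C on water:
print(boost_penalty_mhz(63) - boost_penalty_mhz(39))
```

By this rough model, shaving ~24C off your load temperature is worth on the order of 90MHz of boost, which matches the "radiator real estate" math above.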


----------



## Vellinious

As I said earlier....it just reinforces what a lot of us already knew. The cooler you keep them, the higher they'll clock.

Pretty easy to figure out. Maxwell was the same way: keep a 980 Ti cooler than a 30C peak and your chances of reaching 1600 increase exponentially.


----------



## Joshwaa

Has anyone redone the TIM (thermal paste) on the EVGA 1080 FTW to see if it helps? I was considering doing this while I wait for the waterblock to arrive. I know it helped a lot on my 780 Ti, but I think they have started using better paste since then. Could be wrong, but I heard it's Shin-Etsu now.


----------



## GreedyMuffin

I did that on my mother's 980 Ti (she's folding) and the temps dropped by a good 5°C at least.

Went for that 1080 FE from EVGA (demo model). Will buy myself a block later on.


----------



## Snabeltorsk

Changing the paste on my MSI tomorrow to CLP or CLU; I have both at home, just gonna decide which one to use.


----------



## kx11

The 2nd GPU always runs hotter, like 10C hotter than GPU 1. I'm guessing the space between the cards is the reason.


----------



## xer0h0ur

Ugh, yeah, that's a pretty safe bet. It's got no breathing room there.


----------



## GreedyMuffin

I used Gelid Extreme. I like that one the best. But personal preference etc after all.


----------



## xer0h0ur

Been using Prolimatech PK-3 for years now


----------



## looniam

Quote:


> Originally Posted by *nexxusty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *marc0053*
> 
> Just tried the HOF bios on my gigabyte G1 and now the gpu won't post past error 62 on the ASUS RE10 motherboard and Z170 MOCF.
> Maybe just a bad flash on my part but be careful guys when you flash a non reference gpu bios.
> 
> 
> 
> Obviously.
> 
> I think you're all a bit naive for flashing BIOSES. *None of you have any idea what you're actually doing to the card.*
> 
> Learn how to solder and watercool. Not flash BIOSES from completely different cards from different PCB's for god sakes.
> 
> This is not the first time I've said this.


----------



## nexxusty

Quote:


> Originally Posted by *looniam*


I'm betting that 1080 wasn't flashed to a BIOS different from stock.

Everyone says performance is lessened with BIOS flashes.

So again... flashing different 1080 BIOSes is not the best idea.


----------



## xer0h0ur

*attempting not to laugh*
*laughs hysterically*


----------



## looniam

Quote:


> Originally Posted by *nexxusty*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> 
> 
> 
> I'm betting that 1080 wasn't flashed to a different BIOS from stock.
> 
> *Everyone says performance is lessened with BIOS flashes.*
> 
> So again.... flashing different 1080 BIOSES is not the best idea.

everyone?

i know a few 1080 owners that are passing around bios like a $2 hooker at a biker rally - some work better some don't; so please don't speak for _everyone._

E; damned typos.


----------



## ikjadoon

Quote:


> Originally Posted by *Vellinious*
> 
> As I said earlier....it just reinforces what a lot of us already knew. The cooler you keep them, the higher they'll clock.
> 
> Pretty easy to figure out. Maxwell was the same way. Keep a 980ti cooler than 30c peak, your chances of reaching 1600 increase exponentially.


Err, true. The mantra is always the same. I'm kind of harping on the point, as some people seemed incredulous this was happening. Though Maxwell was more consistent, at least according to the testing over at AnandTech. Who knows... Cheers,


----------



## nexxusty

Quote:


> Originally Posted by *looniam*
> 
> everyone?
> 
> i know a few 1080 owners that are passing around bios like a $2 hooker at a biker rally - some work better some don't; so please don't speak for _everyone._
> 
> E; damned typos.


Ugh. There is a reason some work better and some don't.

Blindly flashing BIOSes with different timings, etc., just doesn't seem like a bright idea to me. Or anyone else.


----------



## uberwootage

Well, I can say for a fact that on my FE the Gigabyte Waterforce BIOS is the best-performing BIOS; it beats stock by a good bit. In Firestrike I can consistently pull 25k-26k on the GPU depending on whether I max it out or just run my daily-driver clock, and I have tested pretty much every BIOS. And really, I don't care; it's just a GPU, and if it dies I'll buy another one. If people don't like flashing, that's fine, but there have been no problems that haven't been fixed by flashing. Have I seen drops in performance from other BIOSes? Yeah, the Inno was crap and the ASUS XOC was the biggest turd ever. However, in 3DMark as well as in games (WoW is the only game I play), the Waterforce BIOS has pulled higher stable clocks and more frames than the stock BIOS. I will say cards are picky about them: people running the EVGA FE BIOS report it's nice, but on my card it sucks; people report the Waterforce showed no improvement on their card, while I see a 20MHz higher stable clock.

Do I advise people to flash their cards? Only if you can afford to lose $700. If my card burned out today I would go to Microcenter and buy a Seahawk as soon as I got my shoes on. But if you want every last MHz out of your card, then flash away. To date I have flashed every BIOS out there for these cards. Got one that I haven't flashed? PM it to me and I'll flash it. At the end of the day I gained 20MHz; it's not a lot, but it's free, and I'm willing to do it, so I'm going to. It just makes the card that much faster.

Side note: the Galaxy HOF BIOS will brick your card, so don't flash it; I just tested it.

If you do decide to join me in zero-f's-given flashing, this is how you recover if you have an iGPU, and what I did on my 6700K:

Flash and brick.

Win 10 reboot, holding Shift + F8.

Go through all the steps for Safe Mode; pick Safe Mode with Command Prompt.

Reboot.

Run nvflash as normal.

Reboot, delete that BIOS, and tell others.

Anyway, I'm back on the Waterforce. It's running great and is the best BIOS I have used, followed by the Seahawk ("Corsair") version and then the stock NVIDIA FE BIOS; on the 0C version and the 01 version, for some reason, I see some performance dips.

But time to play some WoW and enjoy all the bugs.
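For anyone following that recovery route, here's a minimal shell sketch of the reflash step itself, run from the Safe Mode command prompt. The ROM filename is an assumption, and the nvflash flags shown (`--protectoff`, `-6`) are the commonly used ones for disabling EEPROM write protect and overriding a subsystem-ID mismatch; this is a sketch, not a tested procedure:

```shell
#!/bin/sh
# Hypothetical recovery sketch: reflash a bricked card from the Safe Mode
# command prompt after booting on the iGPU. ROM filename is an assumption;
# save a stock backup with `nvflash --save` BEFORE ever experimenting.
ROM="${ROM:-stock_backup.rom}"

reflash() {
    nvflash --protectoff    # disable EEPROM write protection
    nvflash -6 "$ROM"       # -6 overrides the PCI subsystem ID mismatch check
}

if command -v nvflash >/dev/null 2>&1; then
    reflash && echo "Reflash done; reboot and verify clocks in GPU-Z."
else
    echo "nvflash not found; run this from the folder containing nvflash"
fi
```

The key point is the `-6` override: a BIOS from another vendor's card carries a different subsystem ID, so nvflash refuses it without that flag.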


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> Well i can say for a fact on my fe the gigabyte waterforce bios is the best preforming bios. Beats stock big a good bit. Firestrike i can consistently pull 25k -26k on the gpu depending on if i max it out or just run my daily driver clock. And i have tested pretty much every bios. And really i dont care its just a gpu if it dies i'll buy another one. If people dont like flashing its fine. But there have been no problems that have not been fixed bu flashing. Have i seen drops in performance from other bios? Yeah the inno was crap and the Asus XOC was the biggest turd ever. However in 3dmark as well as in games "wow the only game i play." the waterforce bios has pulled higher stable clocks as well as pulled more frames then the stock bios. I will say it is picky about them. People running the EVGA FE bios report its nice. On my card it sucks. People report the waterforce did not show an improvement on there card. I see a 20mhz higher stable clock
> 
> Do i advice people to flash there cards? Only if you can afford to lose $700. If my card burned out today i would go to microcenter and buy a seahawk soon as i get my shoes on. But if you want every last mhz out of your card then flash away. To date i have flashed every bios out for them. Got one that i have not flashed? PM me it and i'll flash it. End of the day i gained 20mhz its not a lot but its free. And im willing to do it so im going to. It just makes it that much faster.
> 
> Side note The Galaxy HOF bios will brick your card so dont flash i just tested it.
> 
> If you do decide to join me on the zero f's given flashing this is how you recover if you have a igpu and what i did for my 6700k
> 
> flash and brick
> 
> Win 10 respoot holding shift and f8.
> 
> Go through all the stuff for safe mode. pick safe mode with command prompt
> 
> reboot.
> 
> nvflash as normal.
> 
> reboot and delete that bios and tell others.
> 
> Anyways im back on the waterforce. Its running great and is the best bios i have used. Then comes the Seahawk "Corsair" version and then the stock Nvidia FE bios the 0C version the 01 version for some reason i see some performance dips.
> 
> But time to play some wow and enjoy all the bugs.


Bro!

Marc just told us he bricked with the HOF BIOS.

I have to tell you guys... with or without performance increases, my point has been proven.

Not the best idea to blindly flash BIOSes.


----------



## looniam

Quote:


> Originally Posted by *nexxusty*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> everyone?
> 
> i know a few 1080 owners that are passing around bios like a $2 hooker at a biker rally - some work better some don't; so please don't speak for _everyone._
> 
> E; damned typos.
> 
> 
> 
> 
> 
> Ugh. There is a reason some work better and some don't.
> 
> Blindly flashing BIOSES with different timings, etc. Just doesn't seem like a bright idea for me. Or anyone else.

i agree w/you for the most part but being an "i told you so" or calling someone out isn't very helpful _since the reports can stand on their own_. i'll just leave it at that and i assure you i have no desire to get into an argument about it.

we good?


----------



## Ludus

Hi,
May I ask how you got a flat frequency?

I have a 1080 with a waterblock too, but the frequency is still going up and down due to the power limit (already set to 120%).

Thank you


----------



## nexxusty

Quote:


> Originally Posted by *looniam*
> 
> i agree w/you for the most part but being an "i told you so" or calling someone out isn't very helpful _since the reports can stand on their own_. i'll just leave at that and i assure you i have to no desire to get in an argument about it.
> 
> we good?


Hahaha. Yeah we're good man. No desire at all. I love all (most) of the boys here.

I liked that "I told you so" comment. I can't really say anything to that.

Lol.


----------



## jedimasterben

Quote:


> Originally Posted by *Ludus*
> 
> Hi,
> May I ask how you got a flat frequency?
> 
> I have a 1080 with a waterblock too, but the frequency is still going up and down due to the power limit (already set to 120%).
> 
> Thank you


You will need to short the three resistors on the PCB as per the Xdevs article.


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> Well i can say for a fact on my fe the gigabyte waterforce bios is the best preforming bios. Beats stock big a good bit. Firestrike i can consistently pull 25k -26k on the gpu depending on if i max it out or just run my daily driver clock. And i have tested pretty much every bios. And really i dont care its just a gpu if it dies i'll buy another one. If people dont like flashing its fine. But there have been no problems that have not been fixed bu flashing. Have i seen drops in performance from other bios? Yeah the inno was crap and the Asus XOC was the biggest turd ever. However in 3dmark as well as in games "wow the only game i play." the waterforce bios has pulled higher stable clocks as well as pulled more frames then the stock bios. I will say it is picky about them. People running the EVGA FE bios report its nice. On my card it sucks. People report the waterforce did not show an improvement on there card. I see a 20mhz higher stable clock
> 
> Do i advice people to flash there cards? Only if you can afford to lose $700. If my card burned out today i would go to microcenter and buy a seahawk soon as i get my shoes on. But if you want every last mhz out of your card then flash away. To date i have flashed every bios out for them. Got one that i have not flashed? PM me it and i'll flash it. End of the day i gained 20mhz its not a lot but its free. And im willing to do it so im going to. It just makes it that much faster.
> 
> Side note The Galaxy HOF bios will brick your card so dont flash i just tested it.
> 
> If you do decide to join me on the zero f's given flashing this is how you recover if you have a igpu and what i did for my 6700k
> 
> flash and brick
> 
> Win 10 respoot holding shift and f8.
> 
> Go through all the stuff for safe mode. pick safe mode with command prompt
> 
> reboot.
> 
> nvflash as normal.
> 
> reboot and delete that bios and tell others.
> 
> Anyways im back on the waterforce. Its running great and is the best bios i have used. Then comes the Seahawk "Corsair" version and then the stock Nvidia FE bios the 0C version the 01 version for some reason i see some performance dips.
> 
> But time to play some wow and enjoy all the bugs.


You're running an FE too, right? For some reason the Waterforce bricked mine as well. Honestly, the best BIOS so far for me is my stock Zotac FE BIOS. It pulled my highest Firestrike Extreme score so far, 5858, and it's totally stable at the settings I used for that score; I've been gaming on it for a week.


----------



## Joshwaa

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I used Gelid Extreme. I like that one the best. But personal preference etc after all.


I went ahead and did it with Gelid Extreme. Not much of a difference. Tested Heaven 4.0 with an 80% fan: with the stock paste it would bounce between 60C and 61C; now with the Gelid it stays steady at 59C. So for most people, probably not worth the time.


----------



## kx11

I moved the 2nd GPU down to the 3rd slot (4th if you count the mini black one), but now the top GPU won't be recognized as a display adapter; the second one is the only recognized display adapter.

The top runs @ x16 and the 2nd @ x8; no way of swapping the PCIe speed AFAIK.


----------



## Joshwaa

So I ran Firestrike Extreme and here is what I got. It will do till I get it water cooled.

http://www.3dmark.com/3dm/13464214


----------



## kx11

Put them back as they were and things are normal now. I guess I need a mobo with all slots running @ x16, like the ASUS Deluxe II (which died on me).


----------



## JaredC01

Quote:


> Originally Posted by *boredgunner*
> 
> Unless...
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127952
> 
> In stock at the time of this post.


By the way, this is showing in stock right meow...


----------



## metal409

http://www.3dmark.com/fs/9428869

2202 is looking like the absolute max I can get out of the core.


----------



## Inglewood78

Quote:


> Originally Posted by *metal409*
> 
> http://www.3dmark.com/fs/9428869
> 
> 2202 is looking like the absolute max I can get out of the core.


Wow! that's a great overclock. Did you customize the voltage curve?


----------



## metal409

Quote:


> Originally Posted by *Inglewood78*
> 
> Wow! that's a great overclock. Did you customize the voltage curve?


Yeah, I had to tweak the curve to get it to clock that high stable. Looking through the logs, it was bouncing off the power limit a bit.


----------



## Inglewood78

Quote:


> Originally Posted by *metal409*
> 
> Yeah, I had to tweak the curve to get it to clock that high stable. Looking through the logs, it was bouncing off the power limit a bit.


Mine is 2150 stable with no custom curve. It may go higher with a custom curve, but I'm too lazy to do that right now, haha.


----------



## xer0h0ur

Quote:


> Originally Posted by *metal409*
> 
> http://www.3dmark.com/fs/9428869
> 
> 2202 is looking like the absolute max I can get out of the core.


That is a silicon lottery winner. You're one of the lucky few.


----------



## uberwootage

Quote:


> Originally Posted by *nexxusty*
> 
> Bro!
> 
> Marc just told us he bricked with the HOF BIOS.
> 
> I have to tell you guys.... with or without performance increases my point has been proven.
> 
> Not the best idea to blindly flash BIOSES.


Proven? No, not really. There has not been one bricked GTX 1080 that hasn't been flashed back and is fine. And the performance gain is pretty good with the right BIOS.

And you know what? It's just a GPU. If it dies, I'll just go buy another one. I can take a $700 loss. I can take both of them going out and eat a $1400 loss. I'll just go buy two more, and I won't do what I did to break them again. And if that does happen, it's my own fault. I know the risk of BIOS flashing, and that's why I'm not telling everyone to flash this or that BIOS. If they ask what I'm running, I tell them, and if they want to try it, they can. Everyone here knows the risk. If you don't, and you have a 1080, then you have money to throw away on another one.

Side note: the HOF brick was not a full brick. It could have been a bad flash; I'll test it again tomorrow. Until then I advise that people do not flash it. I don't know of any other bricks from it, but I would not test it unless you know how to blind flash with a USB or have a backup card to use to recover the 1080.

The Waterforce: while it has been working for a week on two of my cards, it has bricked a card, a Founders Edition, even though my own cards are Founders Editions. Background on that: it is working on my NVIDIA Founders Editions, but it bricked a Zotac Founders Edition. I have been using this BIOS heavily since I flashed it and I have not seen any issues, but it has bricked a card. I don't know why, so I will also advise that people do not flash this BIOS.

ssgwright and I have tested a bunch of BIOSes; some are better than others, so if you wanna test something out, ask him or me and we can give you info before you risk your card.

Also, if you can't see that one of us has flashed a BIOS, it's best not to try it. Just let me know and I'll test it, or ssgwright might. We have been running through the BIOSes pretty hard, and as of right now, of all the BIOSes we tested, those are the only two that have bricked cards. Both of those GTX 1080s were fixed and are running.

So if you have a BIOS that's new and hasn't been tested, PM it to me and I'll run it.


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> Proven? No not really. There has not been one bricked gtx 1080 that has not been flashed back and is fine. And the performance gain is pretty good with the right bios.
> 
> And you know what? Its just a gpu. If it dies i'll just go buy another one. I cant take a $700 loss. I can take both of them going out and take a $1400 loss. I'll just go buy two more and i wont do what i did to break them again. And if that does happen its my own fault. I know the risk of bios flashing. And thats why im not telling everyone to flash this or that bios. IF they ask what im running i tell them and if they want to try it they can. Everyone here knows the risk. If you dont and you have a 1080 then you have money to throw away and buy another one.
> 
> Side note. The HOF brick was not a full brick. Could of been a bad flash i'll test it again tomorrow. Until then i advise that people do not flash it. I dont know of any other bricks from it but i would not test it unless you know how to blind flash with a usb or you have a back up card to use to recover the 1080.
> 
> The Waterforce why it has been working for a week on two of my cards. It has bricked a card. A founder edition even tho my cards are founder editions. Background on that. It is working on my nvidia found editions however it bricked a Zotac founder edition. I have have been useing this bios heavily since i flashed it and i have not seen any issues however it has bricked a card. Do not know why but i will also that people do not flash this bios.
> 
> ssgwright and my self has tested a bunch of bios some are better then others so if you wanna test something out ask him as well as my self and we can give you info on them before you risk your card.
> 
> Also if you cant see that one of us have flashed a bios its best to not try it. Just let me know and i'll test it. or ssgwright might test it. We have been running through the bios pretty hard. Bust as of right now with all the bios we tested those are the only two that have bricked cards. Both of the GTX 1080's were fixed and are running
> 
> So if you have a bios thats new and has not been tested. pm it to me and i'll run it.


You must have missed my point then.


----------



## kx11

What the crap is this all about??

So the top GPU (PCI slot 1) is the 2nd GPU according to the NV control panel?! The mobo BIOS menu clearly states that PCI slot 1 is the main display adapter, but it's GPU 2 to the NV control panel??


----------



## xer0h0ur

Quote:


> Originally Posted by *nexxusty*
> 
> You must have missed my point then.


I really fail to see the purpose of hassling one of the few guys who are willing to experiment for the rest of us on his cards to find out which BIOS is giving the best performance on FE cards. I for one am grateful and patiently waiting to see the end result so I can flash my BIOS to the best one.


----------



## Aph0ticShield

Can anyone tell me why my EVGA 1080 FTW has the subvendor listed as 0000? I RMA'd the first one because of this, now the second one arrived, and it is identical.


----------



## kx11

Highest OC I could get so far is 2114, the same clocks I got with the Strix cards, but this time the score is much better in FS Extreme.


----------



## axiumone

Quote:


> Originally Posted by *kx11*
> 
> what the crap is this all about ??
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> so the top GPU ( PCI slot_1 ) is the 2nd GPU according to NV control panel ?!! the mobo Bios menu clearly states that the PCI_slot_1 is the main display adapter ?!! but it's GPU 2 to NV control panel ??


Same for me. I have to connect the displays to the second card in order to get a proper boot.


----------



## Joshwaa

Quote:


> Originally Posted by *Aph0ticShield*
> 
> Can anyone tell me why my EVGA 1080 FTW has the subvendor listed as 0000? I RMA'd the first one because of this, now the second one arrived, and it is identical.


I had some weird readings in GPU-Z, such as PhysX not being checked and PCIe only showing up as v2, not v3. Tried uninstalling drivers and a clean install; didn't work. I used DDU 16.1 (I think that's the version), the one that was released today. Now everything shows up as it should. Might be worth a try. Also, a newer version of the video driver is out than the one you're running.


----------



## nexxusty

Quote:


> Originally Posted by *xer0h0ur*
> 
> I really fail to see the purpose of hassling one of the few guys who are willing to experiment for the rest of us on his cards to find out which BIOS is giving the best performance on FE cards. I for one am grateful and patiently waiting to see the end result so I can flash my BIOS to the best one.


I'm not hassling anyone man... seriously....


----------



## dante`afk

How do I go about getting 1.2v or 1.25v? I have tested the Strix BIOS and the Inno3D BIOS, but the max GPU-Z showed is 1.15.

How can I lock the voltage to force the GPU to reach it?
Which BIOS is the go-to BIOS people use here for OC?


----------



## Aph0ticShield

Quote:


> Originally Posted by *Joshwaa*
> 
> I had some weird readings in GPUz. Such as Pshyx not being checked and pci-e only showing up as v2 not v3. Tried uninstalling drivers and clean install didn't work. I used the program DDU 16.1 (I think thats the version). The one that was released today. Now everything shows up as is should. Might be worth at try. Also a newer version of vid driver is out than the one your running.


I don't think it's that. I used DDU before, and it's showing up as SUBSYS_00000000 in device manager as well. Other people have reported the same issue, even after flashing the bios on the card.

EVGA got back to me, and when they found out my second card was like that, they said that it was a problem with their RMA department, and they would ship a new one out to me immediately. I'm guessing this is a known issue - especially if they are willing to send me 3 with no questions asked.


----------



## Sheyster

Quote:


> Originally Posted by *nexxusty*
> 
> Obviously.
> 
> I think you're all a bit naive for flashing BIOSES. None of you have any idea what you're actually doing to the card.
> 
> Learn how to solder and watercool. Not flash BIOSES from completely different cards from different PCB's for god sakes.
> 
> This is not the first time I've said this.


Hard modding is not for everyone.

This said, I agree that it's dumb to randomly flash BIOS X on Card Y. Wait for a modded version of YOUR card's actual BIOS.


----------



## fewness

OK, I shorted those 3 capacitors, and now my cards are locked @ 135MHz under all conditions, desktop or loaded...

I now have the two most useless GTX 1080s in the world...


----------



## Sheyster

Quote:


> Originally Posted by *fewness*
> 
> Ok, I shorted those 3 capacitors, now my cards are locked @ 135MHz for all conditions, desktop or loaded....
> 
> I now have two most useless GTX 1080s in the world....


Wow...


----------



## darkphantom

Quote:


> Originally Posted by *MrDerrikk*
> 
> Not that I know of, I had to preorder my FTW from B&H and apparently they'll get stock in on the 16th


Forgot to post back in here, but I got my FTW two weeks ago, and I'm loving it!

Is there a guide to OCing with the EVGA Precision app, or is it just a matter of turning the power limit up to 120% and incrementing the clock slowly?
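For what it's worth, the usual approach really is that simple: max the power limit, then walk the core offset up until the stress test fails and back off a step. A toy sketch of that loop, where `is_stable` stands in for actually running Heaven or Firestrike at each step:

```python
# Toy sketch of "raise the power limit, then increment slowly".
# is_stable() is a stand-in for a real stress test (Heaven, Firestrike, etc.).
def find_max_stable_offset(is_stable, start=0, step=25, limit=300):
    """Walk the core-clock offset up in `step` MHz increments until the
    stress test fails (or `limit` is reached), then back off one step."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# Simulated card that starts artifacting above a +150MHz offset:
print(find_max_stable_offset(lambda mhz: mhz <= 150))
```

In practice people use a coarse step first (25MHz), then repeat with a fine step (5-13MHz, one boost bin) around the failure point.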


----------



## Bdonedge

Sold my G1 1070 and got a G1 1080, so I'm officially a part of the club.


----------



## Naked Snake

So is there a guide with pictures on how to put the EVGA 980 Ti hybrid kit onto the 1080? I might be able to grab one from a guy with a dead 980 Ti.


----------



## Bdonedge

How are y'all using Corsair AIO coolers on these? Do they sell adapters?

Under a benchmark stress test my G1 will get to 71C in a small, not well ventilated room, so I'd like to switch the cooling method to an AIO.


----------



## Ludus

Quote:


> Originally Posted by *metal409*
> 
> Yeah, I had to tweak the curve to get it to clock that high stable. Looking through the logs, it was bouncing off the power limit a bit.


How did you tweak the voltage curve?
Have you done the three-resistor mod?
Quote:


> Originally Posted by *jedimasterben*
> 
> You will need to short the three resistors on the PCB as per the Xdevs article.


Thank you, I will do it ASAP.


----------



## Aph0ticShield

Quote:


> Originally Posted by *darkphantom*
> 
> Forgot to post back in here, but I got my FTW two weeks ago
> 
> 
> 
> 
> 
> 
> 
> loving it!
> 
> Is there a guide to OC using the EVGA precision app? or is it just churn up the power to 120% and increment slowly?


130% is the limit on the FTW. Use the slave BIOS.


----------



## Lays

I'll be joining the club in a few days when my MSI Seahawk EK Gaming X shows up
(what a mouthful...)


----------



## ChevChelios

Quote:


> Originally Posted by *Bdonedge*
> 
> Sold my G1 1070 and got a G1 1080 so I'm officially apart of the club


do you get coil whine anywhere on your G1 1080 ?

particularly in the Unigine Heaven 4.0 benchmark/loop ?


----------



## pez

Quote:


> Originally Posted by *wsarahan*
> 
> '
> 
> Any game, in benchmarks the cards boost normal
> 
> The bios are the same, just checkjed here
> 
> My GPU`s is 1080 SC EVGA ACX 3.0


Quote:


> Originally Posted by *wsarahan*
> 
> Thanks, Yes this happens here
> 
> With Heaven banchmark for example the cards boost equal because stays both at 99%
> 
> At Hitman, COD.... sometimes the second card do not boost at all staying with the default clock, just the primary acrd boosts
> 
> About the KBOOST at EVGA precison what it does?
> 
> Thanks again


Yep, that sounds about normal. It's been covered by the others as well.

If you're seeing a lot of drop-off in usage of your second card, it's going to be either the game not having a good SLI profile, or possibly the game telling your system it doesn't need that second card that badly.


----------



## Works4me

A quick and weird question for you:
I've got a pair of MSI GTX 1080 Gaming X cards in SLI, and I also have an EVGA FTW coming in 2 weeks (give or take). My question is: should I keep the red dragons and sell the FTW, or maybe sell the MSIs and get another FTW?


----------



## MrDerrikk

Quote:


> Originally Posted by *darkphantom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrDerrikk*
> 
> Not that I know of, I had to preorder my FTW from B&H and apparently they'll get stock in on the 16th
> 
> 
> 
> Forgot to post back in here, but I got my FTW two weeks ago
> 
> 
> 
> 
> 
> 
> 
> loving it!
> 
> Is there a guide to OC using the EVGA precision app? or is it just churn up the power to 120% and increment slowly?

You lucky duck, I only just got confirmation that mine was being shipped yesterday...


----------



## Snabeltorsk

Quote:


> Originally Posted by *kx11*
> 
> highest OC i could get so far is 2114 , same clocks i got with Strix cards but this time the score is much better in FS Extreme


That's a very average OC, I must say. Very disappointing for HOF cards.


----------



## pez

Quote:


> Originally Posted by *Works4me*
> 
> A quick and weird question for you :
> I got a pair of MSI GTX 1080 GAMING X in SLI , I also have an EVGA FTW Coming in 2 weeks ( give or take ) , my question is should i keep the red dragons and sell the FTW ? or maybe sell the MSI's and get another FTW ?


From what I can tell, the MSI cards are going to be quieter and run cooler. They're probably the same, but I have seen reports from reviews that the EVGA fan profile is a little passive by default: you'll see reviews saying it runs a little warmer, but quieter. From the bit that I've seen, the FTWs generally aren't OCing as well as the Gaming X. As much as I haven't been a fan of MSI for the way they priced their cards, if you've already got them and you're happy, I don't see why you wouldn't keep them, unless you're more attracted to the aesthetics of the EVGA card.


----------



## Jpmboy

Quote:


> Originally Posted by *fewness*
> 
> OK, now i'm confused...which 3 should I short then... if I got the wrong ones, will it kill the card?


the three marked in the picture you posted are correct for a 1080 reference.


----------



## Bdonedge

Quote:


> Originally Posted by *ChevChelios*
> 
> do you get coil whine anywhere on your G1 1080 ?
> 
> particularly in the Unigine Heaven 4.0 benchmark/loop ?


I don't have Unigine, but during Firestrike I get a little bit of coil whine. Nothing like my MSI 970, but it's definitely there.


----------



## metal409

Quote:


> Originally Posted by *Ludus*
> 
> How did you tweak the voltage curve ?
> Have you done the three resistor mod ?


Sorry, I might have misunderstood before. I tweaked the voltage/frequency curve in Afterburner; just using the sliders like normal made my card stick to only 1.00v.
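The usual curve-editor trick is to raise the frequency at your target voltage point and flatten every higher-voltage point to the same frequency, so boost stops climbing the voltage rail. A toy sketch of that idea; the curve points here are made up for illustration, not from any real card:

```python
# Toy sketch of flattening a voltage/frequency curve at a target voltage,
# the way people do it in Afterburner's curve editor. Points are invented.
def flatten_curve(points, target_v, target_mhz):
    """Set the target voltage point to target_mhz and cap every
    higher-voltage point at the same frequency, so the card holds
    target_v instead of boosting further up the rail."""
    return [
        (v, target_mhz if v >= target_v else mhz)
        for v, mhz in points
    ]

curve = [(0.80, 1700), (0.90, 1850), (1.00, 1950), (1.05, 2000), (1.093, 2050)]
print(flatten_curve(curve, 1.00, 2100))
```

The flattened shape is why a curve OC can hold a fixed clock at 1.00v where the plain sliders just shift the whole curve and let boost wander.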


----------



## marc0053

Just an update on the HOF bad flash on my Gigabyte GTX 1080 G1.
I followed this guide to create a bootable DOS USB stick and was able to reflash my G1 BIOS:
http://www.overclock.net/t/593427/how-to-unbrick-your-bricked-graphics-card-fix-a-failed-bios-flash/0_20
I've been flashing GPUs for 5+ years with no problem, and this is the first time the system would freeze at startup with error code 62.
Usually I can get the system up and running with a blank monitor, and I was always able to boot into safe mode and reflash the card.
This time was a bit more difficult: I could boot with the 6700K's iGPU, but it would not recognize the GTX 1080, so I couldn't flash.
I hope this is helpful for someone down the road.
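
For anyone following the same recovery path, the DOS-side sequence looks roughly like this. This is only a sketch: `G1.rom` is a placeholder filename for whatever BIOS image you put on the stick, and the exact flags depend on your nvflash version.

```
C:\> nvflash --save backup.rom    :: back up whatever BIOS is on the card first
C:\> nvflash --protectoff         :: disable the EEPROM write protection
C:\> nvflash -6 G1.rom            :: flash, confirming past the ID-mismatch warning
```

If the card isn't detected at all (as in my case with the iGPU active), booting with the broken card as the only GPU from the DOS stick is usually what makes nvflash see it.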


----------



## SauronTheGreat

Hey everyone, I am new here. I just bought the G1 Gaming card; these are my benchmarks. Can someone tell me why the club's submission form only lists two models, reference and Founders Edition? I guess it needs to be updated. I had a 980 Ti G1 Gaming before on air; this card consumes less power, its chip runs cooler, and it gives almost 10 to 15 FPS more at 4K in Witcher 3 with all settings maxed compared to my old 980 Ti G1 Gaming.


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> Hey everyone, I am new here. I just bought the G1 Gaming card; these are my benchmarks. Can someone tell me why the club's submission form only lists two models, reference and Founders Edition? I guess it needs to be updated. I had a 980 Ti G1 Gaming before on air; this card consumes less power, its chip runs cooler, and it gives almost 10 to 15 FPS more at 4K in Witcher 3 with all settings maxed compared to my old 980 Ti G1 Gaming.


How are you getting 13.5 on a 6700K @ 4.5 GHz? It should be around 14.5.


----------



## SauronTheGreat

lol killerbee you are everywhere

13.5 what?


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> lol killerbee you are everywhere
>
> 13.5 what?


Miswrote: 13,700 CPU score in Fire Strike.


----------



## SauronTheGreat

Quote:


> Originally Posted by *KillerBee33*
> 
> Miswrote: 13,700 CPU score in Fire Strike.


Hmmm, I have no idea. Could it be because I haven't set the XMP RAM profile in my BIOS?


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> hmmm i have no idea, can it be because i have not set the XMP ram profile on my bios ?


Running stock, you mean? Your profile says @ 4.5 GHz. I was just wondering; it seems a bit low.


----------



## SauronTheGreat

Quote:


> Originally Posted by *KillerBee33*
> 
> Running stock you mean? Your Prof. says @ 4.5 , was just wondering, seems a bit low


No, my CPU is running at 4.5 GHz @ 1.35 V, but my RAM is at 3000 MHz and 1.35 V with timings on auto. That's what I meant by an XMP profile: the timings are not set, if that's what XMP means.


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> no, my cpu is working at 4.5ghz @ 1.35V but my rams are working on 3000Mhz and 1.35v their timings are on auto, thats what i meant by an xmp profile, their timings are not set ,if that what xmp means


Something is not right there, then; your Fire Strike score with that hardware should be a lot higher, even at stock.


----------



## Whitechap3l

Hello,
Does anyone have benchmarks for the MSI Sea Hawk EK X?
I've been waiting for 3 weeks and have to wait 3 more to receive mine... Now I'm thinking of just buying an MSI/EVGA FE and putting it under water.

Thanks for the replies


----------



## kx11

Quote:


> Originally Posted by *Snabeltorsk*
> 
> That's a very average OC, I must say. Very disappointing for HOF cards.


I'm using the flexible SLI bridge, so maybe that's why the results are a bit low.

There's a high-performance button on the back of the card that seems to enable a high-OC profile, but all I see it doing is turning the fans to 100%.


----------



## Bdonedge

Quote:


> Originally Posted by *KillerBee33*
> 
> Something is not right there then , your Firestrike with that hardware should be a lot higher even on stock.


I agree. I have a 6700K at stock with 2666 MHz RAM and a G1, and my stock score is higher than his.


----------



## KillerBee33

Quote:


> Originally Posted by *Bdonedge*
> 
> I agree - I have a 6700k at stock with 2666mhz ram and a G1 and my stock score is higher than his


If I remember correctly, a stock 6700K runs a 13,600 score, and a 1080 should be around 22,000 stock.


----------



## SauronTheGreat

Quote:


> Originally Posted by *KillerBee33*
> 
> If i remember correctly 6700 stock runs 13,600 score and 1080 should be around 22000 stock


Quote:


> Originally Posted by *Bdonedge*
> 
> I agree - I have a 6700k at stock with 2666mhz ram and a G1 and my stock score is higher than his


What is the issue then? I am so confused and worried. Please help.


----------



## fat4l

Quote:


> Originally Posted by *Jpmboy*
> 
> the three marked in the picture you posted are correct for a 1080 reference.


Which ones? Post your picture. It's getting confusing...


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> what is the issue then ? i am soo confused and worried ? please help


Try reinstalling the GPU driver and check your overclock; the 6700K is definitely clocked wrong with that score. Try default BIOS settings and run Fire Strike again.


----------



## kx11

When I restart, before the motherboard BIOS image shows up, the GPU BIOS/model/brand shows up for 2 seconds!

That started happening after I extracted the BIOS. Any idea if this is normal?


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> when i restart the OS before the Bios image shows up the GPU bios\model\brand shows up for 2 seconds !!!
> 
> that started happening after i extracted the bios , any idea if this is normal ??


I had a GALAX 970 and also tried the GALAX OC software, and that's when things like that started happening.
If you ever tried their software, it may be the problem.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> I had a GALAX 970 and also tried the GALAX OC software, and that's when things like that started happening.
> If you ever tried their software, it may be the problem.


I tried their XtremeTuner+ software, which seems to be a good one, with GPU/memory voltage control and such. I'll do more testing; I think it's the mobo.


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> i tried their software XtremeTuner+ which seems to be a good one with GPU\memory voltage control stuff , i'll do more testing i think it's the Mobo


Well, if the mobo turns out to be fine, that XtremeTuner is the next thing to look into.

You'll see what I'm talking about when you try to remove it; make sure to check the running processes after you try to remove it.


----------



## uberwootage

Quote:


> Originally Posted by *marc0053*
> 
> Just an update for the HOF bad flash on my gigabyte GTX 1080 G1.
> I followed this guide to create a bootable USB with DOS command in safe mode and was able to reflash to my G1 bios again.
> http://www.overclock.net/t/593427/how-to-unbrick-your-bricked-graphics-card-fix-a-failed-bios-flash/0_20
> I've flashed gpus over the last 5 years+ no problem and this is the 1st time it would freeze at startup with error code 62.
> Usually I got the system up and running but the monitor was blank but always was able to boot in safe mode and reflash the card.
> This time around was a bit more difficult as I could boot up with the 6700k igpu but it would not recognize the GTX 1080 so couldn't flash.
> I hope this is helpful for someone down the road.


Could have saved you some hassle if you'd read what happened when I flashed it. I saw the same thing, so this BIOS is a no-go for anything but HOFs.


----------



## marc0053

Quote:


> Originally Posted by *uberwootage*
> 
> Could have saved you some hassle if you'd read what happened when I flashed it. I saw the same thing, so this BIOS is a no-go for anything but HOFs.


I flashed the HOF bios about 5 minutes after kx11 posted it


----------



## SauronTheGreat

Quote:


> Originally Posted by *KillerBee33*
> 
> Try reinstalling GPU Driver and see your Overclock abilities , 6700k definetly clocked wrong with that score , Try default BIOS settings and run firestrike again .


I did what you asked. Now my graphics score is fine at 22,400, but at stock mobo BIOS settings my physics score is only 12,800... Is my CPU faulty? Please help me, man, I am so upset at the moment.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Well, if the mobo turns out to be fine, that XtremeTuner is the next thing to look into.
>
> You'll see what I'm talking about when you try to remove it; make sure to check the running processes after you try to remove it.


I installed another pair of 1080s (soon to be sold), with GPU1 in PCIe slot 3 and GPU2 in PCIe slot 1, and the issue is still there.

I think I'll return it and get another Deluxe II mobo to take advantage of those 4×16 PCIe slots.


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> I did what you asked now my graphics score is fine and 22,400 but at stock bios mobo settings my physics score is only 12,800 ...is my CPU faulty ? Please help me man, I am so upset at the moment


12.8, huh... What motherboard? And it is a 6700*K*?


----------



## SauronTheGreat

Quote:


> Originally Posted by *KillerBee33*
> 
> 12,8 huh....what Motherboard? 6700 *K* ???


I have an Asus Maximus VIII Extreme, and yes, it's a K.


----------



## GraveDigger7878

So the HOF cards are getting about the same overclocks, around 2.1 GHz? Hmmmmm....


----------



## xer0h0ur

Just because they use better components or aren't using the reference design doesn't mean the card will have a die capable of hitting high-end overclocks. As it is, 2100+ is already above average; so far it's very rare to get a card that can hit 2200 MHz.


----------



## kx11

Quote:


> Originally Posted by *GraveDigger7878*
> 
> So the HOF cards are getting about the same overclocks to 2.1? Hmmmmm....


They're definitely clocking high; I can run a benchmark with +90 on the core clock, but the graphs read 2156 MHz maximum for some reason.

My mobo is a dud anyway.


----------



## KillerBee33

Quote:


> Originally Posted by *KillerBee33*
> 
> 12,8 huh....what Motherboard? 6700 *K* ???


Is the CPU cooled at all? See if ASUS has a new BIOS out; if not, try reflashing the same one. Also try with the XMP profile on.


----------



## SauronTheGreat

Quote:


> Originally Posted by *KillerBee33*
> 
> Is the CPU cooled at all ? See if ASUS has new BIOS out , if not try flashing it with same , also try XMP profile ON.


Man, my CPU temps are excellent. With the XMP profile on and at 4.5 GHz my score was 13.8, and my CPU cooler keeps the CPU below 62°C... and my mobo BIOS is the latest.


----------



## KillerBee33

Quote:


> Originally Posted by *SauronTheGreat*
> 
> man my cpu temps are excellent
> with the xmp profile on and at 4.5ghz my my score was 13.8 and my cpu cooler keeps the cpu almost below 62C ... and my mobo bios is the latest


PM


----------



## fat4l

Hmmm... is this result OK?
I mean the scores: 11.9K graphics.

http://www.3dmark.com/3dm/13486654?

I need to put it under water and remove the TDP limits...
Well, I see there's mixed info about those 3 resistors for TDP removal.

Can anyone *WHO DID THE TDP LIMIT MOD* mark the 3 resistors (RS1/RS2/RS3) that need to be shorted, please?
Or do we need to put 10 Ω resistors on top of the 3 caps (C263/C271/C278)?
Thank you

Here is the pcb you can use:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/images/front_full.jpg

Now, on https://xdevs.com/guide/pascal_oc/#step3 , they say :
_"Note that earlier version of this guide incorrectly mentioned need to short RS1, RS2, RS3. This is wrong, and will cause card clock to lock at 135MHz. Do not short shunt resistors themselves, but add resistors like shown on photo below. Sorry for confusion."_
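
That warning is worth unpacking with some arithmetic. The card reads power from the voltage drop across each shunt (V = I·R); if the effective sense resistance changes but the controller still assumes the stock value, the reported power scales by the resistance ratio. A quick Python sketch of that relationship; the `reported_power` helper is hypothetical, and the 5 mΩ shunt and 120 W rail draw are made-up illustrative numbers, not measured values:

```python
def reported_power(actual_power_w, r_stock_ohm, r_effective_ohm):
    """Power the controller reports when the sense resistance has changed
    but it still assumes the stock shunt value (V = I * R)."""
    return actual_power_w * (r_effective_ohm / r_stock_ohm)

R_STOCK = 0.005  # assumed 5 mOhm shunt, for illustration only
ACTUAL = 120.0   # assumed true rail draw in watts

print(reported_power(ACTUAL, R_STOCK, R_STOCK))      # stock: reads 120.0
print(reported_power(ACTUAL, R_STOCK, R_STOCK / 2))  # resistance roughly halved: reads 60.0
print(reported_power(ACTUAL, R_STOCK, 0.0))          # hard short: reads 0.0
```

A halved reading doubles the effective power headroom, which is the point of the CLU trick; a near-zero reading looks like a sensor fault, which is consistent with cards falling back to 135 MHz 2D clocks after the shunts were shorted outright.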


----------



## Snabeltorsk

The TDP-limit resistors are the RS1/RS2/RS3 ones.


----------



## xer0h0ur

Oh, that is so foul. I would have shorted those three resistors had I seen that guide before adding my EK block. I guess it was good after all that I didn't end up doing that mod. So added resistors are necessary, then.


----------



## fat4l

Quote:


> Originally Posted by *Snabeltorsk*
> 
> TDP Limit resistors are the RS1,2.3 ones.


Quote:


> Originally Posted by *fewness*
> 
> Ok, I shorted those 3 capacitors, now my cards are locked @ 135MHz for all conditions, desktop or loaded....
> 
> I now have two most useless GTX 1080s in the world....


So what about the guy above, then?

@fewness, what exactly did you do?


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh that is so foul. I would have shorted those three resistors had I seen that guide before adding my EK block. I guess it was good after all that I didn't end up doing that mod. So resistors are necessary then.


Well... there are caps and there are resistors...
One picture in the guide shows the resistors, one picture shows the caps... Well...


----------



## Snabeltorsk

I think most people use CLU/CLP on RS1/RS2/RS3; then you can remove it if you like.


----------



## nexxusty

Quote:


> Originally Posted by *fat4l*
> 
> Hmmm.. Is this result ok ?
> I mean the scores ?
> 11.9K graphics
> 
> http://www.3dmark.com/3dm/13486654?
> 
> I need to put it under water + remove tdp limits...
> Well, I see there's mixed info about those 3 resistors for TDP removal.
> 
> Can anyone *WHO DID TDP LIMIT MOD*, mark those 3 resistors(RS1/RS2/RS3) that need to be shorted please ?
> Or do we need to put 10 Ohm resistors on top of the 3 caps(C263/C271/C278) ?
> Thank you
> 
> Here is the pcb you can use:
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/images/front_full.jpg
> 
> Now, on https://xdevs.com/guide/pascal_oc/#step3 , they say :
> _"Note that earlier version of this guide incorrectly mentioned need to short RS1, RS2, RS3. This is wrong, and will cause card clock to lock at 135MHz. Do not short shunt resistors themselves, but add resistors like shown on photo below. Sorry for confusion."_


Ugh... conflicting info. @Jedimasterben, you removed your power limit. Did you add resistors, or short the shunts?

I remember him saying he ran a wire from one end of the shunt resistors to the other to short them, and he says his power limit is removed.

Then we have the guy a page back saying he IS stuck at 135 MHz with the shunt mod applied.

IDFK.


----------



## fat4l

Quote:


> Originally Posted by *nexxusty*
> 
> Ugh... conflicting info. @Jedimasterben you removed your power limit. Did you add resistors or shunt?
> 
> I remember him saying he used a wire from one end of the resistors to the other to shunt them. He says his power limit is removed.
> 
> Then we have buddy a page back saying he IS stuck at 135mhz with the shunt mod applied.
> 
> IDFK.


Exactly, that's how I feel...

Did @fewness short the caps (C263/C271/C278) or the resistors?


----------



## nexxusty

Quote:


> Originally Posted by *fat4l*
> 
> Exactly, that's how I feel...
>
> Did @fewness short the caps (C263/C271/C278) or the resistors?


From what I know, shorting the caps would be the safer way.

That part was never revised in the guide. I say we don't attempt the shunt short until it's PROVEN otherwise.


----------



## fat4l

Check the comments on this YouTube vid:

_"I'm aware of the issue and all the 10xx card videos will get annotations pointing out that problem. The CLU method still works (Der8auer told me it works and has used it on a few cards already). However, soldering the shunts drops the resistance so low that the card notices there is something wrong with the power-in readings and gets stuck in 2D clocks."_


----------



## nexxusty

Quote:


> Originally Posted by *fat4l*
> 
> Check the comments on this YouTube vid:
>
> _"I'm aware of the issue and all the 10xx card videos will get annotations pointing out that problem. The CLU method still works (Der8auer told me it works and has used it on a few cards already). However, soldering the shunts drops the resistance so low that the card notices there is something wrong with the power-in readings and gets stuck in 2D clocks."_


Haha, solder drops the resistance too low.

I can see that, yeah... I have CLU lying around... I dunno about this. It doesn't seem to drip or run though, does it? It can also be cleaned off more easily.

Hmm..

Quote:


> Originally Posted by *fewness*
> 
> Ok, I shorted those 3 capacitors, now my cards are locked @ 135MHz for all conditions, desktop or loaded....
> 
> I now have two most useless GTX 1080s in the world....


You seeing this?


----------



## bobrob

Guys, something terrible and unexpected happened while removing the nuts without the right tools:

I nicked a resistor (I think?!) and it just fell off; it seems to be RAM-related. Can someone please advise how best to approach this?

I have a micro-soldering station, but this thing is really tiny. I might apply some flux and solder to the board, then press the resistor down with pliers and warm it up, hoping it'll flow right...

What would you guys suggest? Will I fry everything if I power it up without the resistor?

Thanks in advance for your assistance....


----------



## Sourcesys

It's a capacitor; just solder it back.


----------



## ssgwright

Has anyone tried the power mod yet (besides the guy that's stuck at 135 MHz)?


----------



## bobrob

Quote:


> Originally Posted by *Sourcesys*
> 
> It's a capacitor; just solder it back.


I like your confidence, man. I tried, but that thing is so small...

Any tips on how to approach it?

Should I put some solder on the board first?

----------



## looniam

ICYMI just an FYI:

GeForce Hot Fix driver version 368.95
Fixed DPC latency bug on Pascal GPUs.
http://nvidia.custhelp.com/app/answers/detail/a_id/4202


----------



## Sourcesys

Quote:


> Originally Posted by *bobrob*
> 
> I like your confidence man, I tried but that thing is so small...
> 
> Any tips in how to approach it?
> 
> Should I put some solder on the board first ?


Use a magnifier.

Seriously, use a magnifier and a lot of flux, or ask someone who is good at soldering. And DO NOT put it in your PC before fixing it.


----------



## bobrob

Quote:


> Originally Posted by *Sourcesys*
> 
> It's a capacitor; just solder it back.


Quote:


> Originally Posted by *Sourcesys*
> 
> Use a magnifier.
>
> Seriously, use a magnifier and a lot of flux, or ask someone who is good at soldering. And DO NOT put it in your PC before fixing it.


It seems I am unable to get a proper mechanical joint. I will talk to a friend of mine and see if he has better tools; he should also have some experience working on this kind of thing.

Cheers :/


----------



## xer0h0ur

I also used needle nose pliers to remove those hex screws and I managed to scratch the black PCB coating over some circuitry paths. I just checked it to make sure I didn't sever any connection, which thank god I didn't, and then used a black sharpie to cover the scratches. I would not want to have to do that again. Next time I will just pony up the dollerinos for the tool.


----------



## jase78

You will basically need to pre-tin the board with fresh, thin 60/40 leaded solder. Then clean, re-flux, and heat each joint, and it should work. Or just have your friend do it.


----------



## ilgello

Quote:


> Originally Posted by *xer0h0ur*
> 
> I also used needle nose pliers to remove those hex screws and I managed to scratch the black PCB coating over some circuitry paths. I just checked it to make sure I didn't sever any connection, which thank god I didn't, and then used a black sharpie to cover the scratches. I would not want to have to do that again. Next time I will just pony up the dollerinos for the tool.


Scratched mine a bit too, as I could not find the tool locally...


----------



## Sourcesys

Quote:


> Originally Posted by *Sourcesys*
> 
> Use a magnifier.
>
> Seriously, use a magnifier and a lot of flux, or ask someone who is good at soldering.


Quote:


> Originally Posted by *bobrob*
> 
> It seems I am unable to get a proper mechanichal joint, I will talk to a friend of mine and see if he had better tools, he should also have some exp working on this.
> 
> Cheers :/


No worries man, it's not too hard; let your friend do it. I used to solder a lot of PCBs for a living. Once you understand how flux and solder work together, you don't even need to touch the components anymore.


----------



## fishyswaz

Quote:


> Originally Posted by *bobrob*
> 
> It seems I am unable to get a proper mechanichal joint, I will talk to a friend of mine and see if he had better tools, he should also have some exp working on this.
> 
> Cheers :/


You might have pulled up the pads that attach the cap to the traces; have a good look and see if this is the case. It's all totally fixable. You can use a scalpel really lightly to expose the two traces adjacent to where the pads were and resolder (or "sodder", depending on where you're from) the cap. That's the worst-case scenario though; see if the pads are gone first.

If you're not confident soldering, either pay someone to do it or practice on something already broken/cheap. Tips: always lightly tin the iron's tip and clean it before soldering, flux pens are good, and take your time.


----------



## bfedorov11

The hotfix for the DPC latency issue is out. People are still reporting the problem, though.

https://forums.geforce.com/default/topic/951723/geforce-drivers/announcing-geforce-hotfix-driver-368-95/


----------



## GreedyMuffin

So the 1080 FE cooler is a pain in the @ss to remove?


----------



## Sourcesys

Quote:


> Originally Posted by *fishyswaz*
> 
> You might have pulled up the pads that attach the cap to the traces; have a good look and see if this is the case. It's all totally fixable. You can use a scalpel really lightly to expose the two traces adjacent to where the pads were and resolder (or "sodder", depending on where you're from) the cap. That's the worst-case scenario though; see if the pads are gone first.
>
> If you're not confident soldering, either pay someone to do it or practice on something already broken/cheap. Tips: always lightly tin the iron's tip and clean it before soldering, flux pens are good, and take your time.


I wouldn't use a scalpel; it's too risky that he'll cut a trace.


----------



## fishyswaz

Quote:


> Originally Posted by *Sourcesys*
> 
> I wouldn't use a scalpel; it's too risky that he'll cut a trace.


Nah, you just stroke the tip gently sideways, in the same direction as the trace, toward the missing pad, to remove the PCB coating over a small piece of trace; there's no real danger unless you push down to cut. I used to do it all the time on arcade and console PCBs when pads had been lifted. If you're that worried about it, use a fibreglass pen, although it's way messier.

But that's the worst-case scenario; if the pads aren't lifted then really, really don't try it. TBH, if he's not confident repairing it regardless, I'd pay someone to do it. It would take less than 10 minutes to fix.


----------



## Sourcesys

Quote:


> Originally Posted by *fishyswaz*
> 
> Nah, you just stroke the tip gently sideways to remove the PCB coating over a small piece of trace; there's no real danger unless you push down to cut. I used to do it all the time on arcade and console PCBs when pads had been lifted. If you're that worried about it, use a fibreglass pen, although it's way messier.
>
> But that's the worst-case scenario; if the pads aren't lifted then really, really don't try it. TBH, if he's not confident repairing it regardless, I'd pay someone to do it. It would take less than 10 minutes to fix.


For beginners, tweezers are better IMO, but be careful not to flick the component away. Everything else: agreed 100%.


----------



## uberwootage

Quote:


> Originally Posted by *Sourcesys*
> 
> I wouldn't use a scalpel; it's too risky that he'll cut a trace.


He's not cutting, just scratching; it's a side-to-side motion. As long as he's careful, it's fine. I do it all the time at work.


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> has anyone tried the power mod yet (besides the guy that's stuck at 135mhz)?


I won't. Better to wait for a BIOS TDP mod; messing with the shunts can introduce some issues.


----------



## Jpmboy

Quote:


> Originally Posted by *fat4l*
> 
> Hmmm.. Is this result ok ?
> I mean the scores ?
> 11.9K graphics
> 
> http://www.3dmark.com/3dm/13486654?
> 
> I need to put it under water + remove tdp limits...
> Well, I see there's mixed info about those 3 resistors for TDP removal.
> 
> Can anyone *WHO DID TDP LIMIT MOD*, mark those 3 resistors(RS1/RS2/RS3) that need to be shorted please ?
> Or do we need to put 10 Ohm resistors on top of the 3 caps(C263/C271/C278) ?
> Thank you
> 
> Here is the pcb you can use:
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/images/front_full.jpg
> 
> Now, on https://xdevs.com/guide/pascal_oc/#step3 , they say :
> _"Note that earlier version of this guide incorrectly mentioned need to short RS1, RS2, RS3. This is wrong, and will cause card clock to lock at 135MHz. Do not short shunt resistors themselves, but add resistors like shown on photo below. Sorry for confusion."_


http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/280_20#post_25362015


----------



## Kold

I have successfully installed the G10 and H55 on my EVGA GTX 1080 FTW. I had to use a copper shim from eBay between the GPU die and the H55 cooler, but other than that it's a perfect fit. In my computer room (80°F) it reaches a max of 55°C in more demanding games like Witcher 3. This particular card gets up to 2120 MHz and 5500 on the memory. And I'm not sure if the monitor can affect temps, but I'm using a 1440p G-Sync 144 Hz. Oh, and I'm using a 1200 RPM fan; the stock H55 fan would probably drop temps another 2-4°C.


----------



## confed

Just got my 1080 installed today. I was able to grab the Zotac AMP! 1080 through Jet. I am going to start with their OC utility and see what I can do with the card. Has anyone had a good experience with Firestorm and their Zotac card?


----------



## Maintenance Bot

Firestorm utility is ok and works, but I use Afterburner on my gpu.


----------



## uberwootage

Quote:


> Originally Posted by *Kold*
> 
> I have successfully installed the G10 and H55 on my EVGA GTX 1080 FTW. I had to use a copper shim from eBay between the GPU die and H55 cooler, but other than that it's a perfect fit. In my computer room (80F) it reaches a max of 55C in more demanding games like Witcher 3. This particular card gets up to 2120MHz and 5500 on the memory. And I'm not sure if the monitor can effect temps, but I'm using a 1440p gsync 144hz. Oh, and I'm using a 1200RPM fan.. the stock H55 fan would probably drop temps another 2-4C.


My god laying that on carpet like that.


----------



## Kold

Lol, it's not like I was sliding it across the carpet! But, now that you mention it, probably not one of my brightest moments.


----------



## xer0h0ur

Never mind the carpet. I was looking at the rat turds. Looks like he has bigger problems.


----------



## nexxusty

Quote:


> Originally Posted by *bobrob*
> 
> Guys something terrible and expected happened while removing the nuts without the right tools:
> 
> I nicked a resistor (I think?!) and it just fell off, seems something Ram related, can please someone advise how to best approach this ?
> 
> I have a microsolder station, but this thing is really tiny, I might apply some flux and solder on the board then with some pliers press the resistor down and warm it up hoping it'll flow right...
> 
> What would you guys suggest ? Will i fry everything powering it up without resistor ?
> 
> Thanks in advance for your assistance....
> 
> hostare immagini


Aww man. That sucks. Bigtime.

You have a reflow station? That's good. If the resistor is still there, flux its ass and carefully heat up the area, then carefully remove it with a toothpick.

Get solder paste. Clean up the area/contacts with flux and electronics cleaner, and put paste on both contacts where the resistor was. After that, tin the contacts and re-apply flux. Then put the resistor back on with the toothpick (it sticks there because of the paste) and apply heat.

Done.

I used to fix 360s this way when people who shouldn't be doing this stuff tried to take the X-clamps off the 360 mobo. It always resulted in ripped-off caps.

Good luck.


----------



## Kold

Quote:


> Originally Posted by *xer0h0ur*
> 
> Never mind the carpet. I was looking at the rat turds. Looks like he has bigger problems.


Lol, guess I'll be looking around when I get home. There's no way a rat could get into that room, but there is a door opening directly to the outside right next to it. We'll see in a few hours.


----------



## uberwootage

Quote:


> Originally Posted by *nexxusty*
> 
> Aww man. That sucks. Bigtime.
> 
> You have a reflow station? That's good. If the resistor is still there, flux it's ass and carefully heat up the area, then remove it with a toothpick carefully.
> 
> Get Solder paste. Clean up the area/contacts with Flux and electronics cleaner, put paste on both contacts where the resistor was. After that tin the contacts and re-apply Flux. Then put the resistor back on with the toothpick (sticks there because of the paste) and apply heat.
> 
> Done.
> 
> I used to fix 360's this way when people who shouldn't be doing this stuff tried to take the Xclamps off the 360 mobo. Always resulted in ripped off caps.
> 
> Good luck.


He can always use conductive epoxy. We use it at my work.


----------



## Jpmboy

Quote:


> Originally Posted by *Kold*
> 
> Lol, guess I'll be looking around when I get home. There's no way a rat could get into that room, but there is a door opening directly to the outside right next to it. We'll see in a few hours.


too small for rat turds.

Quote:


> Originally Posted by *uberwootage*
> 
> He can always use conductive epoxy. We use it at my work.


^^ This. Much easier to do a small repair with.


----------



## uberwootage

Quote:


> Originally Posted by *Jpmboy*
> 
> too small for rat turds.
>
> ^^ This. Much easier to do a small repair with.


He could use a windshield-defogger repair kit: two-part conductive epoxy from auto parts stores. I have used that for quick fixes on stuff.


----------



## Kold

So forgive me, but what's the goal for people messing with the traces? To stop the card from downclocking in 13 MHz increments from temps? Or is there more to gain from it?


----------



## Derpinheimer

Quote:


> Originally Posted by *Kold*
> 
> So forgive me, but what's the goal for people messing with traces? To stop the card from down clocking 13mhz increments from temps? Or is there more to gain from it?


I think it effectively disables the power limit, in turn yielding higher voltages and clocks?

BTW, anything new on getting >1.093 volts?


----------



## xer0h0ur

LOL Kold I was just playin. More than likely it was lint balls or something.


----------



## kx11

Yeah, and they said HOF cards are weak.

6116 in 3DMark FS Ultra:

http://www.3dmark.com/fs/9446682

XtremeTuner Plus is a crazy tool for OC, but somehow it doesn't control the fans at all. Thankfully there's a button on the back of the card that turns the fans straight to 100%.


----------



## AMDATI

Yeah, the problem with putting something on carpet is that if the carpet has a stored electric charge, it will transfer to the board on contact. It might do nothing, but I certainly wouldn't take that risk.

It seems like a huge chunk of the last few pages of this thread is people damaging and breaking their expensive 1080s... it's like a disaster zone in here: people with too much money taking risks for little to no gain.

I mean, geez, you've got the fastest consumer graphics card in the world... chill out and quit messing with it! $700 down the drain for what? An attempt at gaining 5 FPS? An attempt at running 5°C cooler when you're already 30°C below the threshold? So silly.


----------



## Sheyster

Quote:


> Originally Posted by *pez*
> 
> From what I can tell, the MSI cards are going to be quieter and run cooler. They're probably the same, but I have seen reports from reviews that the EVGA fan profile by default is a little passive.


I've been running my FTW at 2114 MHz, with the default fan curve. The highest temp I've seen is 74 deg C after 2 hours of BF4.


----------



## Bdonedge

Out of curiosity has anyone else gotten a BSOD using the newest drivers? (Not the hotfix)

I got a BSOD earlier, and after a ton of digging it turns out it's an error caused by the way the driver interacts with the kernel.


----------



## jorpe

Picked up a 1080 classy today, what a beast. Also, gigantic card. Validation submitted.


----------



## H3avyM3tal

Can anyone help me figure out what these stripes are?


----------



## moustang

Quote:


> Originally Posted by *Sheyster*
> 
> I've been running my FTW at 2114 MHz, with the default fan curve. The highest temp I've seen is 74 deg C after 2 hours of BF4.


So it's definitely hotter than the MSI.

My Gaming X never exceeded 65C at 2113 MHz when it was on air. I never altered the fan curve, and it was probably the quietest air-cooled card I've ever owned.

And now that it's cooled with an NZXT Kraken X41 and a G10 bracket, it's even cooler and quieter.


----------



## ChaosBlades

Quote:


> Originally Posted by *H3avyM3tal*
> 
> Can anyone help me figure out what these stripes are?


I am on a crappy work monitor and the image is low quality but it looks like some Ambient Occlusion errors. Try disabling or changing SSAO, HBAO+, etc in graphics settings.


----------



## pfinch

Hey guys,

I have observed something strange with my Zotac AMP Extreme 1080.
The GPU score of all benchmarks differs strongly from run to run.

On almost every new benchmark run my GPU score changes strongly (*NO CAPS/throttle*). The CPU score is always the same.

Time Spy: 6900 to 7500
Fire Strike Ultra: 5500 to 5750

Same AB settings: 100% volt / 120% PT / curved and fixed voltage tested
Same clocks: 2088 core // 5550 mem
Same max temp: 56°C
Same conditions
*NO CAPS/throttle*

Even if I lower the core and/or mem clock, I don't see it in the GPU score... sometimes the score even drops lower.

With this behavior it is impossible to find the max OC... sadly.


----------



## H3avyM3tal

Quote:


> Originally Posted by *ChaosBlades*
> 
> I am on a crappy work monitor and the image is low quality but it looks like some Ambient Occlusion errors. Try disabling or changing SSAO, HBAO+, etc in graphics settings.


So only lowering image quality can fix this? Seems kinda counter-productive, lol.


----------



## ChaosBlades

Quote:


> Originally Posted by *H3avyM3tal*
> 
> So only lowering image quality can fix this? Seems kinda counter-productive, lol.


Well, did you try it? I could be wrong. Also make sure it isn't enabled in the Nvidia control panel too; there might be a conflict between the game's implementation and Nvidia's.


----------



## H3avyM3tal

No, sorry. I'll be home later today. I'll try then and post the results.


----------



## Menno

I am finally receiving my Nvidia HB bridge today. Had to use Borderlinx to ship it to me; it seems Nvidia only wants to sell it through their own store. Testing tonight at 5K and 3440x1440. Well, it looks cooler than 2 brown ribbon cables anyway. What is an average overclock for air FE in SLI? I can't get it above 1976 MHz @ 85C (temp target); initial boost is about 2050 MHz, but it steadily throttles with temps. Won't dip under 1976 though. This is with 120% power, an 85C temp target (92C seems a bit too much for SLI FE) and +175 on both cores.
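The stepping described above can be pictured with a toy model. Everything below is invented for illustration (NVIDIA does not publish the real GPU Boost 3.0 thresholds); it only shows why observed clocks fall in fixed 13 MHz bins as temperature climbs:

```python
BIN_MHZ = 13  # Pascal boost clocks move in ~13 MHz steps

def throttled_clock(start_mhz, temp_c, ramp_start_c=60):
    """Toy model: drop one 13 MHz bin per 5 C above ramp_start_c.
    The real thresholds are not public; this only illustrates why
    clocks step down in 13 MHz increments instead of smoothly."""
    if temp_c <= ramp_start_c:
        return start_mhz
    bins = (temp_c - ramp_start_c) // 5
    return start_mhz - bins * BIN_MHZ

print(throttled_clock(2050, 85))  # -> 1985, in the same ballpark as 1976
```

Whatever the exact thresholds are, the settled clock is always the starting boost minus some whole number of 13 MHz bins, which is why the card parks at a value like 1976 and won't dip below it.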


----------



## ChaosBlades

Quote:


> Originally Posted by *pfinch*
> 
> Hey guys,
> 
> I have observed something strange with my Zotac AMP Extreme 1080.
> The GPU-Score of all benchmarks differs strongly (more or less) from run to run.
> 
> At almost every new benchmark run my GPU-Score changes strongly (*NO CAPS/Throttle*). CPU Score is always the same.
> 
> Time Spy: 6900 to 7500
> Fire Strike ULTRA: 5500 to 5750
> 
> Same AB Settings: 100% volt / 120% PT / curved and fixed voltage tested
> Same clocks: 2088 Core // 5550 Mem
> Same max Temp: 56°C
> Same conditions
> *NO CAPS/Throttle*
> 
> Even if I lower the Core and/or Mem clock i don't recognize this at the GPU-Score... Sometimes the Score even drops lower.
> ---
> With this behavior , it is impossible to find the max OC ... sadly


The only thing that comes to mind is something in the background on your system messing with the results. Make sure you close all programs in the task tray and all that and rerun the tests. Even anti-virus.


----------



## Kold

Quote:


> Originally Posted by *pfinch*
> 
> Hey guys,
> 
> I have observed something strange with my Zotac AMP Extreme 1080.
> The GPU-Score of all benchmarks differs strongly (more or less) from run to run.
> 
> At almost every new benchmark run my GPU-Score changes strongly (*NO CAPS/Throttle*). CPU Score is always the same.
> 
> Time Spy: 6900 to 7500
> Fire Strike ULTRA: 5500 to 5750
> 
> Same AB Settings: 100% volt / 120% PT / curved and fixed voltage tested
> Same clocks: 2088 Core // 5550 Mem
> Same max Temp: 56°C
> Same conditions
> *NO CAPS/Throttle*
> 
> Even if I lower the Core and/or Mem clock i don't recognize this at the GPU-Score... Sometimes the Score even drops lower.
> ---
> With this behavior , it is impossible to find the max OC ... sadly


Those fluctuations are normal. The Time Spy one is slightly out of the norm; I have a feeling it could be your memory OC. Dial it back by 100 and see if that helps. I know with this new memory, going too high with an OC will yield lower results.
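For anyone seeing this kind of spread, it helps to quantify it across several runs before judging an overclock. A quick sketch; the scores below are hypothetical, just picked from the ranges pfinch reported:

```python
from statistics import mean, stdev

def score_spread(scores):
    """Return mean, stdev, and coefficient of variation (%) for a set of runs."""
    m, s = mean(scores), stdev(scores)
    return m, s, 100 * s / m

# Hypothetical run-to-run scores, spanning the reported ranges
time_spy = [6900, 7150, 7480, 7500, 7020]
fs_ultra = [5500, 5600, 5750, 5650, 5550]

for name, runs in [("Time Spy", time_spy), ("FS Ultra", fs_ultra)]:
    m, s, cv = score_spread(runs)
    print(f"{name}: mean={m:.0f} stdev={s:.0f} CV={cv:.1f}%")
```

If the coefficient of variation at one clock is bigger than the score gain from the next clock step, single runs can't tell the two settings apart; only the mean of several runs can.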


----------



## pfinch

Quote:


> Originally Posted by *ChaosBlades*
> 
> The only thing that comes to mind is something in the background on your system messing with the results. Make sure you close all programs in the task tray and all that and rerun the tests. Even anti-virus.


Thank you for your response









I already checked all background programs, and I didn't even notice this with my GTX 980 Ti...








Quote:


> Originally Posted by *Kold*
> 
> Those fluctuations are normal. The Time Spy one is slightly out of the norm, I have a feeling it could be your memory OC. Dial it back by 100 and see if that helps. I know with this new memory, going too high with an OC will yield lower results.


Even at stock mem (5400) I got this.

Maybe I will try the new DPC driver later.


----------



## Joshwaa

Tried the new hotfix drivers. Seems I had the problem when the GPU was at idle clocks. Went from around 500 down to around 50 in DPC latency.


----------



## deejaykristoff

Quote:


> Originally Posted by *Kold*
> 
> I have successfully installed the G10 and H55 on my EVGA GTX 1080 FTW. I had to use a copper shim from eBay between the GPU die and H55 cooler, but other than that it's a perfect fit. In my computer room (80F) it reaches a max of 55C in more demanding games like Witcher 3. This particular card gets up to 2120MHz and 5500 on the memory. And I'm not sure if the monitor can effect temps, but I'm using a 1440p gsync 144hz. Oh, and I'm using a 1200RPM fan.. the stock H55 fan would probably drop temps another 2-4C.


Yes, the fact that you're using 1440p increases the temps by 5 to 10° because the card has much more work to do.


----------



## jorpe

What does the BIOS switch on the side of my Classified do, function-wise? Is it just a second and third BIOS, or are there more features available that way?


----------



## jedimasterben

Quote:


> Originally Posted by *ssgwright*
> 
> has anyone tried the power mod yet (besides the guy that's stuck at 135mhz)?


Yep, shorting the resistors works great for me. The power limit now hovers at 70-80% instead of 115-120%.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> So the 1080 FE cooler is a pain in the @ss to remove?


It isn't, so long as you have the proper tools, including the correct size nut driver to remove the nut standoffs on the rear of the card.
Quote:


> Originally Posted by *nexxusty*
> 
> Ugh... conflicting info. @Jedimasterben you removed your power limit. Did you add resistors or shunt?
> 
> I remember him saying he used a wire from one end of the resistors to the other to shunt them. He says his power limit is removed.
> 
> Then we have buddy a page back saying he IS stuck at 135mhz with the shunt mod applied.
> 
> IDFK.


I only soldered over the shunt resistors; I did not add resistors to the capacitors from the second part of the xDevs article. Shorting the shunts is all we need for normal overclocking. Adding resistors to the capacitors reduces the power limit by another 3x or so, which is needed only for extreme OC runs.








Quote:


> Originally Posted by *fat4l*
> 
> Check comments on this youtube vid
> 
> 
> 
> 
> _"I'm aware of the issue and all the 10xx card videos will get annotations pointing out that problem. The CLU method still works (Der8auer told me it works and used it on a few cards already). However soldering the shunts drops the resistance so low that the card notices that there is something wrong with the power-in readings and gets stuck in 2D clocks."_


Didn't happen for me, card performs like normal.
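The shunt-mod behavior discussed above comes down to simple arithmetic: the card estimates current from the voltage drop across each shunt resistor, so lowering the effective shunt resistance makes the controller under-read power. A minimal sketch, assuming an illustrative 5 mOhm shunt (the actual values on the 1080 PCB may differ):

```python
def reported_power(true_power, r_true, r_assumed):
    """Power the controller reports when the real shunt resistance
    differs from the value it assumes; the current reading (and thus
    the power reading) scales with r_true / r_assumed."""
    return true_power * (r_true / r_assumed)

R_STOCK = 0.005  # assumed stock shunt value, 5 mOhm (illustrative)

# Stacking a second 5 mOhm path on top halves the effective resistance
r_paralleled = 1 / (1 / R_STOCK + 1 / R_STOCK)

# Card really draws 250 W, but the controller still assumes R_STOCK
print(reported_power(250, r_paralleled, R_STOCK))  # -> 125.0
```

Halving the reading roughly matches a power-limit readout falling from ~115-120% to ~70-80% after the mod. Solder straight over the shunt instead and the effective resistance approaches zero, the reading becomes implausibly low, and that would explain why some cards fall back to 2D clocks.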


----------



## Sheyster

Quote:


> Originally Posted by *moustang*
> 
> So it's definitely hotter than the MSI.
> 
> My Gaming X never exceeded 65C at 2113mhz when it was on air. Never altered the fan curve, and it was probably the most quiet air cooled card I've ever owned.
> 
> And now that it's cooled with a NZXT Kraken X41 cooler and G10 bracket it's even cooler and more quiet.


Perhaps it is cooler with the default fan curve. You can't hear the FTW at all; EVGA was indeed very conservative with this card noise-wise. Easy enough to dial up the fans with a set/locked speed or a user defined curve in AB or PX-OC. Since the fans never went above 40% for me, even locking the fans at 50% would probably lower the load temp significantly.

What is the power limit on the MSI card? On the secondary BIOS, the FTW has a +130% limit, which should be ~280w.
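A user-defined curve like the one mentioned above is just a set of (temperature, fan %) points that the software interpolates between. A minimal sketch, with made-up curve points:

```python
# Hypothetical user-defined fan curve: (temp C, fan %) points, like
# those you would set in Afterburner or Precision X-OC
CURVE = [(30, 30), (50, 40), (65, 55), (75, 75), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate fan duty between the curve points;
    clamp to the first/last point outside the curve's range."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_percent(74))  # -> 73.0, just below the 74 C load temp reported
```

The alternative (a set/locked speed) is just a curve with one point; the curve form trades a little noise at load for keeping the card quiet at idle.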


----------



## fat4l

Quote:


> Originally Posted by *jedimasterben*
> 
> Didn't happen for me, card performs like normal.


Great. It would be cool if @fewness could step in and tell us what he did with his card that it's stuck at 2D clocks.

I'm quoting a post from another thread:
Quote:


> Originally Posted by *marc0053*
> 
> Yes I did apply CLU on the 3 power resistors of the G1 card following this guide
> http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/
> 
> in the end it looked like this on my card


----------



## ssgwright

Quote:


> Originally Posted by *fat4l*
> 
> Great. It would be cool if @fewness could step in and tell us what he did with his card that it's stuck at 2D clocks.
> 
> I'm quoting a post from another thread:


what was the result of him shunting them?


----------



## fat4l

Quote:


> Originally Posted by *ssgwright*
> 
> what was the result of him shunting them?


Quote:


> Originally Posted by *marc0053*
> 
> I had major throttling before the mod and after the mod it stopped. Did not help my overclock tho.


----------



## fat4l

Not sure what to do now...
Titan X is coming







...Should I keep 1080? It's still in 14-days return period.


----------



## G woodlogger

You have to consult your purse and your monitor


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> Not sure what to do now ...
> Titan X is coming
> 
> 
> 
> 
> 
> 
> 
> ...Should I keep 1080? It's still in 14-days return period.


Pascal TITAN does not look like a WOW Factor







at most 25% over the 1080.


----------



## Jpmboy

Quote:


> Originally Posted by *nexxusty*
> 
> Not this....
> 
> Using epoxy that conducts in place of solder is idiocy. Sorry.
> 
> It's also not a repair at all. It's a bandaid. Solder is a repair.
> 
> You're usually pretty good at giving advice, don't tell people to use conductive epoxy just because they might want an easier "fix".
> 
> Bad practice man. Super mega bigtime.


For what he is doing, and without a reflow kit, epoxy will likely result in a better outcome.
Quote:


> Originally Posted by *fat4l*
> 
> Not sure what to do now ...
> Titan X is coming
> 
> 
> 
> 
> 
> 
> 
> ...Should I keep 1080? It's still in 14-days return period.


August 2nd supposedly. Direct from NVIDIA (my two TXs are NV direct).
Gonna be $1200.
I'm surely getting one... maybe 2.


----------



## fat4l

Quote:


> Originally Posted by *KillerBee33*
> 
> Pascal TITAN does not look like a WOW Factor
> 
> 
> 
> 
> 
> 
> 
> at most 25% over the 1080.


Mhm... now I'm thinking: for this price you can get 2x 1080, right? SLI looks <3 and will give you more performance than a single Titan X(p).


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> mhm...Now im thinking ...for this price you can get 2x 1080 right ..SLi loooks <3 and will give you more performance than a single Titan X(p)


LOL **NOW YOU THINKING**


----------



## H3avyM3tal

So I checked, and unless I'm missing it, there is no option to change AO in the game. I'm noticing these stripes in other places now... Halp.


----------



## MrTOOSHORT

A new Titan X coming out in a couple weeks? Guess the 1080 is going back. Last day to return it.

Would have kept it if the big daddy came out early next spring.


----------



## mouacyk

Kinda excited to see if it might drag anything else with it.


----------



## Kold

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> New Titan-X coming out in a couple weeks? guess the 1080 is going back. Last day to return it.
> 
> Would have kept it if the big daddy came out early next spring.


If you really want to pay $1200 before possible tax. We really need AMD to bring some competition or the next Titan could end up being $1500.


----------



## mouacyk

And good luck with exclusive sales from NVIDIA. Chances are they'll be constantly sold out with eBay re-listing around $1700.


----------



## KillerBee33

Quote:


> Originally Posted by *Kold*
> 
> If you really want to pay $1200 before possible tax. We really need AMD to bring some competition or the next Titan could end up being $1500.


A lot of people grabbed 2 1080s, and not for MSRP. I don't see a bad thing in saving the SLI headache and some money with a single GPU.
We don't even know its performance yet; maybe $1200 is well worth it.


----------



## mouacyk

As usual power target will be severely gimped.


----------



## MrTOOSHORT

Still have my Titan X that I can wait on until I get the new card. The 1080 is a great card; I just want a card that will be top dog for at least the next 6 months, mostly for benching purposes.


----------



## Kold

I'll save my money and go with a 1080 Ti here in a few months. IF it's anything like last generation, I'll only be losing 3-6FPS, but saving tons of money.


----------



## escalibur

I was so happy today when I got my package, and then this happened...

https://www.reddit.com/r/4u29vw/ordered_zotac_1080_amp_extreme_waited_for_20_days/


----------



## moustang

Quote:


> Originally Posted by *KillerBee33*
> 
> A lot of people grabbed 2 1080s and not for MSRP , don't see a bad thing saving a SLI headache and money on a single GPU .
> We don't even know its performance yet , maybe 1200 is well worth it


We know that the specs are not equal to 2 GTX 1080s.

In fact we know that the specs aren't even equal to 2 GTX 1070s.

And based off the previous two generations, two 1070s in SLI will be faster than the Titan of this generation. And cheaper as well. In fact the Titan Z was the first single GPU card that could beat my old 4GB 770 SLI setup. And having run SLI for the past 3 years I can assure you that your "SLI headache" is mostly a fantasy repeated by people who don't have SLI and has little basis in reality. In reality the "headache" usually consisted of waiting a couple of days for a proper SLI profile to be released for brand new games.
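The spec comparison above can be put in rough numbers using the published CUDA core counts and reference boost clocks. Paper math only (real SLI scaling is worse than 2x, and real boost clocks run higher than reference):

```python
def tflops(cores, boost_mhz):
    # Theoretical FP32 throughput: 2 ops per CUDA core per clock (FMA)
    return 2 * cores * boost_mhz * 1e6 / 1e12

gtx1070 = tflops(1920, 1683)   # reference boost clocks
gtx1080 = tflops(2560, 1733)
titan_xp = tflops(3584, 1531)

print(f"GTX 1080:       {gtx1080:.2f} TFLOPS")
print(f"Titan X Pascal: {titan_xp:.2f} TFLOPS "
      f"(+{100 * (titan_xp / gtx1080 - 1):.0f}% vs 1080)")
print(f"2x GTX 1070:    {2 * gtx1070:.2f} TFLOPS on paper")
```

On paper the new Titan lands roughly a quarter ahead of a single 1080, while two 1070s exceed it, which is the comparison being made in the thread.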


----------



## x7007

Hotfix is out if anyone noticed

http://nvidia.custhelp.com/app/answers/detail/a_id/4202

• Fixed DPC latency bug on Pascal GPUs.

https://forums.geforce.com/default/topic/951723/geforce-drivers/announcing-geforce-hotfix-driver-368-95/

fixed it for me

https://postimg.org/image/w10ozvkgh/

https://postimg.org/image/ldlftwks7/


----------



## ChaosBlades

Quote:


> Originally Posted by *x7007*
> 
> Hotfix is out if anyone noticed
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4202
> 
> • Fixed DPC latency bug on Pascal GPUs.
> 
> https://forums.geforce.com/default/topic/951723/geforce-drivers/announcing-geforce-hotfix-driver-368-95/
> 
> fixed it for me
> 
> https://postimg.org/image/w10ozvkgh/
> 
> https://postimg.org/image/ldlftwks7/


Good to know. I just picked up an MSI 1080 Seahawk EK just a few hours ago.


----------



## KillerBee33

Quote:


> Originally Posted by *moustang*
> 
> We know that the specs are not equal to 2 GTX 1080s.
> 
> In fact we know that the specs aren't even equal to 2 GTX 1070s.
> 
> And based off the previous two generations, two 1070s in SLI will be faster than the Titan of this generation. And cheaper as well. In fact the Titan Z was the first single GPU card that could beat my old 4GB 770 SLI setup. And having run SLI for the past 3 years I can assure you that your "SLI headache" is mostly a fantasy repeated by people who don't have SLI and has little basis in reality. In reality the "headache" usually consisted of waiting a couple of days for a proper SLI profile to be released for brand new games.


One man's headache is another man's happiness. You can stay as Real as you like, but if you're trying to prove something here, good luck to you, sir. I was simply stating the obvious, "not to jump to conclusions". I also posted a few pages back that the Pascal Titan does not LOOK so impressive. Humm, go figure.


----------



## Naked Snake

Oh c'mon now... SLI is still a headache. When it works it's great, but when it doesn't work it sucks, plain and simple.

Give me a SLI profile for Batman: Arkham Knight, for example. Yeah, didn't think so, and I have been a SLI user since the dinosaur age.

Anyway, anyone using EVGA XOC 6.0.3? Is it better than MSI Afterburner 4.0.3 Beta while using a custom voltage curve?


----------



## Mr moff

Hi guys, I've just installed 2 MSI 1080 Founders Editions in SLI, and after I enable Afterburner voltage adjustment I can't move the slider. Using EVGA Precision X it's the same.
Any ideas?


----------



## fewness

Quote:


> Originally Posted by *fat4l*
> 
> Great. It would be cool if @fewness could step in and tell us what he did with his card that it's stuck at 2D clocks.


I found an interesting way to save them: just flash that Strix XOC BIOS, which has no power table to begin with, and the cards come back to 2088 as default.

But seriously, so far I tried:

1. no mod, original BIOS
2. no mod, StrixXOC BIOS
3. shorted capacitors, original BIOS
4. shorted capacitors, StrixXOC BIOS
5. 10 Ohm resistor on top, original BIOS
6. 10 Ohm resistor on top, StrixXOC BIOS

#2 was crazy unstable for me, couldn't bench anything with that. #3 is 135 MHz dead... everyone knows that. #5 gave the same overclocking ability as #1: boost is gone, voltage remains ~1.05, but a lower score even with the apparently stable frequency. #4 and #6 both lead to higher overclocking ability, ~2190 in 3DMark and over 2200 in Valley, with no boost, stable frequency and stable voltage of 1.175 to 1.2. But still, the scores were lower than what can be achieved in #1, the untouched original configuration.

My cards are under water, never exceeded 45C in any conditions.

Why would GTX 1080 at a stable frequency throughout the run give lower scores than when the same max frequency is only achieved during part of the run because of boost?

But, whatever, if there are no other tricks invented before August 2, I guess buying a pair of Titan XP is the only choice moving forward....


----------



## KillerBee33

Quote:


> Originally Posted by *Mr moff*
> 
> Hi guys, I've just installed 2 msi 1080 founders editions in sli and after I enable afterburner voltage adjustment I can't move the slider. Using Evga precision X is the same.
> Any ideas?


----------



## fat4l

Quote:


> Originally Posted by *fewness*
> 
> I found a interesting way to save them: just flash with that StrixXOC BIOS, which has no power table to begin with, then the cards come back to 2088 as default
> 
> But serious, so far I tried:
> 
> 1. no mod, original BIOS
> 2. no mod, StrixXOC BIOS
> 3. shorted capacitors, original BIOS
> 4. shorted capacitors, StrixXOC BIOS
> 5. 10 Ohm resistor on top, original BIOS
> 6. 10 Ohm resistor on top, StrixXOC BIOS
> 
> #2 was crazy unstable for me, couldn't bench anything with that. #3 is 135MHz dead...everyone knows that. #5 gave the same overclocking ability as 1, boost is gone, voltage remain ~1.05, but lower score even with the apparently stable frequency. #4 and #6 both lead to higher overclocking ability, ~2190 in 3DMark and over 2200 in Valley, no boost, stable frequency and stable voltage 1.175 to 1.2, But still, the scores were lower than what can be achieved in #1, the untouched original configuration.
> 
> My cards are under water, never exceeded 45C in any conditions.
> 
> Why would GTX 1080 at a stable frequency throughout the run give lower scores than when the same max frequency is only achieved during part of the run because of boost?
> 
> But, whatever, if there are no other tricks invented before August 2, I guess buying a pair of Titan XP is the only choice moving forward....


Thanks mate. Just to make sure, are we talking about caps or resistors you shorted? Can you show a pic of the PCB?
How did you short them? CLU or wire?
AFAIK if you short just 2 of the 3 resistors (RS1-3), you should be on the safe side and your clocks should be fine. Maybe you wanna try?

Also, der8auer will be posting a 1080 volt mod vid tomorrow on YouTube.

Regarding the Strix XOC BIOS, it's been proven that it only adds "visual" MHz and volts; the real performance is not there.


----------



## Mr moff

Quote:


> Originally Posted by *KillerBee33*


Thanks mate, yeah I tried that but it's still the same. It's got me puzzled.


----------



## kx11

I think there's a line in the .ini that you delete or something.


----------



## NBAasDOGG

A question for FTW owners: I noticed there is a BIOS switch next to the 2x 8-pin. Which one to choose? Should the switch face the 2x 8-pin or the FTW RGB LEDs?


----------



## x7007

Does anyone know where I can get the iTunes software for Inno3d card ? It came with a disc but I don't have CD-ROM. Can't find in the site .


----------



## Sheyster

Quote:


> Originally Posted by *NBAasDOGG*
> 
> A question to FTW owner. I realized that there is a bios switch next to the 2x8pin. Which one to choose? Should the switch face the 2x8pin or the FTW RGB Leds?


If you want to use the secondary (130%) BIOS, the switch should be in the position closest to the 8-pin connectors. The default factory position should be the other position, farther away from the power connectors.


----------



## NBAasDOGG

Quote:


> Originally Posted by *Sheyster*
> 
> If you want to use the secondary (130%) BIOS, the switch should be in the position closest to the 8-pin connectors. The default factory position should be the other position, farther away from the power connectors.


You sir, you sir = boss. Thanks for the quick answer


----------



## Avant Garde

Can you report your OC results with BIOS switched to "Slave Mode" on FTW card?


----------



## xTesla1856

Anyone have experience with the Zotac AMP Extreme card? Is it worth getting over the other designs?


----------



## uberwootage

Quote:


> Originally Posted by *xTesla1856*
> 
> Anyone have experience with the Zotac AMP Extreme card? Is it worth getting over the other designs?


JayzTwoCents did a review on it.

Basically: great cooler, but it overclocks like crap compared to FEs. But unless you're going water on it, I would go with a Zotac. My first pick is the Galaxy HOF, then the Zotac AMP Extreme.


----------



## Avant Garde

Quote:


> Originally Posted by *xTesla1856*
> 
> Anyone have experience with the Zotac AMP Extreme card? Is it worth getting over the other designs?


Nah, not really. They are all pretty much a few % from each other. Choose the best looking and less bulky card.


----------



## pez

So, random question and a long shot: anyone here running SLI with 2 Xtreme Gaming 1080s? Assuming you got the premium pack that I think they all came with, I'm wondering if anyone would want to sell one of those SLI bridges.


----------



## Eorzean

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> New Titan-X coming out in a couple weeks? guess the 1080 is going back. Last day to return it.
> 
> Would have kept it if the big daddy came out early next spring.


What store did you buy it from where you can return it after using it, out of curiosity?

I ordered my 1080 on the same day the TX was announced and couldn't cancel it. I was originally going to refuse the shipment, get a refund and get the TX, but decided in the end that I'm just going to keep the 1080, which is fine on 1440p. I'm just going to wait for Volta to jump on the 4K bandwagon. Hopefully by then there'll be some moderately priced 4K HDR monitors.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Eorzean*
> 
> What store did you buy it from where you can return it after using it, out of curiosity?
> 
> I ordered my 1080 on the same day the TX was announced and couldn't cancel it. I was originally going to refuse the shipment, get a refund and get the TX, but decided in the end that I'm just going to keep the 1080, which is fine on 1440p. I'm just going to wait for Volta to jump on the 4K bandwagon. Hopefully by then there'll be some moderately priced 4K HDR monitors.


Memoryexpress.

It was a problem for a few minutes, but they decided it was fine to return. They said they will change their policy on returning used water-cooled cards in the future.


----------



## pantsoftime

Installed an XSPC waterblock on an NVIDIA GTX 1080 FE today. Temps stay below 42 under full load on a fairly restrained custom loop (1 x 240mm EK and 1 x 140mm XSPC rads). No complaints about the block - easy install and good instructions. I got the backplate but for anyone who doesn't care about looks, the backplate may not be a great deal. One note for anyone considering buying an NVIDIA built FE from Best Buy or direct - there is no warranty sticker on the screws / cooler.

Best I can get so far is 2100 / 11000. Getting a graphics score of 11583 in Firestrike Extreme, but hitting the power limit pretty frequently. Waiting for a BIOS tweaker to push it harder.


----------



## nexxusty

Quote:


> Originally Posted by *Eorzean*
> 
> What store did you buy it from where you can return it after using it, out of curiosity?
> 
> I ordered my 1080 on the same day the TX was announced and couldn't cancel it. I was originally going to refuse the shipment, get a refund and get the TX, but decided in the end that I'm just going to keep the 1080, which is fine on 1440p. I'm just going to wait for Volta to jump on the 4K bandwagon. Hopefully by then there'll be some moderately priced 4K HDR monitors.


Yep. Volta = 4K. Couldn't have said it better myself. Smart choice.


----------



## Sourcesys

Quote:


> Originally Posted by *nexxusty*
> 
> Aww man. That sucks. Bigtime.
> 
> You have a reflow station? That's good. If the *resistor* is still there, flux it's ass and carefully heat up the area, then remove it with a toothpick carefully.
> 
> Get Solder paste. Clean up the area/contacts with Flux and electronics cleaner, put paste on both contacts where the *resistor* was. After that tin the contacts and re-apply Flux. Then put the *resistor* back on with the toothpick (sticks there because of the paste) and apply heat.


Why do you call it a resistor, man? It's not a resistor, it's a capacitor.

Secondly, he is COMPLETELY right. As someone who worked for YEARS with PCBs, microcontrollers and micro SMDs, I can assure you that silver epoxy is the easiest way to attach such small parts without a problem; it's the bulletproof way of doing this kind of thing. What's your problem?

Your passive-aggressive way of stating your opinion is annoying. Relax.


----------



## Bdonedge

What GPU score on fire strike are y'all getting with stock settings?


----------



## xer0h0ur

Quote:


> Originally Posted by *Bdonedge*
> 
> What GPU score on fire strike are y'all getting with stock settings?


Stock? *Shudders* We don't utter that phrase round these parts.

Do you mean graphics score? Or the overall Firestrike score?


----------



## Bdonedge

Quote:


> Originally Posted by *xer0h0ur*
> 
> Stock? *Shudders* We don't utter that phrase round these parts.
> 
> Do you mean graphics score? Or the overall Firestrike score?


Graphics score - sorry I should have specified


----------



## boredgunner

Just tried The Talos Principle for the first time. To my delight, it has a built in benchmark tool. To my dismay, the results:


----------



## Raisingx

Need help with overclocking my MSI 1080. I can get it stable while running The Witcher 3 @ 2101 MHz core, but if I go into the inventory, for example, the game hangs when the clocks drop to ~1350 MHz. The same thing happens sometimes in Overwatch when I go back to the main menu and there's a slight core drop while loading.

How do I avoid it crashing while dropping from maximum to medium clocks?


----------



## Bdonedge

Quote:


> Originally Posted by *boredgunner*
> 
> Just tried The Talos Principle for the first time. To my delight, it has a built in benchmark tool. To my dismay, the results:


What resolution is that?

I loved that game btw


----------



## boredgunner

Quote:


> Originally Posted by *Bdonedge*
> 
> What resolution is that?
> 
> I loved that game btw


2560 x 1440

I had G-SYNC on though so the DX11 result might actually be higher without it.


----------



## Bdonedge

Hey, are y'all using DisplayPort or DVI?


----------



## outofmyheadyo

Anyone tried this with their card ?


----------



## GTRtank

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 
> 
> 
> 
> Anyone tried this with their card ?


Haha, I was just about to post this! Seems pretty sweet, especially if you are under water, to see what it will boost to! Obviously it will hit that voltage wall, but it will be interesting to see what kind of numbers you can get.


----------



## metal409

Quote:


> Originally Posted by *Raisingx*
> 
> Need help with overclocking my MSI 1080, I can get it stable while running the witcher 3 @ 2101mhz core but if I go into the inventory for example, the game hangs when the clocks drop to 1350~mhz, same thing happens sometimes in overwatch when I go back into the main menu when there's a slight core drop while loading..
> 
> How do I avoid it crashing while dropping from maximum to medium clocks ?


I have the same issue with The Witcher 3. Hop into the menu to assign skills, change inventory or look through a vendor, and 90% of the time it will crash to desktop. It happens even with no overclock for me. Annoying.


----------



## jedimasterben

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 
> 
> 
> 
> Anyone tried this with their card ?


http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/3000_40#post_25355441


----------



## uberwootage

Thing about the hardware tdp mod is that its not needed.

This is a founder edition running a modded bios. Game is maxed. 8X MSAA, CMMA,


----------



## xer0h0ur

Quote:


> Originally Posted by *Sourcesys*
> 
> Why do you call it a resistor, man? Its not a resistor, its a capacitor.
> 
> Secondly he is COMPLETELY right, as someone who worked for YEARS with PCBs, microcontrollers and micro SMDs , I can ensure you that silver epoxy is the easiest way to solder such small parts without a problem, its the bulletproof way of doing this kind of thing. Whats your problem?
> 
> Your passive aggressive way of telling your opinion on things is annoying, relax.


Arizonian gave me two infractions for calling him out on his behavior. A+ modding, bro: give a card owner two infractions and let an agitator who doesn't even own a card skate away scot-free. I feel like I am in grammar school all over again.


----------



## xer0h0ur

Quote:


> Originally Posted by *Bdonedge*
> 
> Graphics score - sorry I should have specified


My graphics score was 21377 on my first Fire Strike run, and after overclocking I ended up at 23798. I have a feeling my RAM is holding back my system performance, as it's not letting me run at 2133 MHz when I use the XMP profiles; it keeps forcing 1600 MHz. I am waiting on Mushkin to RMA my previous, much lower-timing 1600 MHz Redline RAM.


----------



## uberwootage

The limiting factor with the 1080s is the low voltage they provide. At the 1.093 V max I get, I can pull 2,215 MHz on the core before stability issues show up. It's stable, but I'll still game at 2,190 just to be sure I won't have any chance of a crash.

The default voltage I've seen on some cards tops out at 1.062 to 1.093 V, depending on which card you have. The big limiting factor here is the voltage. But to be honest, anything over 1.093 V needs to be on water; it's too much for the air coolers, because the card starts kicking out heat past 1.062 V.


----------



## boredgunner

Quote:


> Originally Posted by *Bdonedge*
> 
> Hey are yall using displayport or DVI


DisplayPort to take full advantage of my monitor. Need that bandwidth.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> The limiting factor with the 1080's is the low voltage they provide. So far at the max i get 1.093v i can pull 2,215mhz on the core before stability issues give me problems. Its stable but i'll still game at 2,190 just to be sure i wont have any chance of a crash.
> 
> The default voltage i seen on some cards top out at 1.062 to 1.093 depending on what card you have. Thew big limiting factor here is the voltage. But to be honest anything over 1.093v you need to be on water its to much for the air coolers because it starts kicking out heat after 1.062v


Well, I'll be doing the hardware volt mod to get it to 1.25 V, plus the TDP mod...

Can you tell me what BIOS is the best for the FE? Upload it if you can.

Or... maybe the best two, so I can try both.
Thanks


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> Well I'll be doing had volt mod to get it to 1.25V + TDP mod...
> 
> Can you tell me, what bios is the best for FE? Upload it if you can
> 
> 
> 
> 
> 
> 
> 
> Or.....maybe the best 2 so I can try both..
> Thanks


I've got a big list of them. It really all depends. If you are doing the hardware mods, go with the NVIDIA BIOS, 86.04.17.00.01. The only improvement you will get from other BIOSes is higher TDP limits, but since you're doing hardware mods you won't need those.

I'll be volt-modding my card soon. I'm getting two Titan Xs, so I'm going to hot-rod this 1080. I should have my pots on Monday; then I'll just bring a Fluke home from work and do it all. The Fluke I have hasn't been calibrated in a few years, so I don't use it for anything I need an accurate measurement on. The pots I'm using are single-turn, 5 x 5 x 2.7 mm, so they're not too bulky. It will be a nice clean look.


----------



## metal409

Quote:


> Originally Posted by *uberwootage*
> 
> Thing about the hardware tdp mod is that its not needed.
> 
> This is a founder edition running a modded bios. Game is maxed. 8X MSAA, CMMA,


Are you saying that WoW isn't stressing the card, or that your TDP reading is low because of the modded BIOS?

Here is what The Witcher 3 looks like for me if I run a high clock; it keeps bouncing off the power limit.


----------



## KickAssCop

Is that power mod worth it? Where do I order liquid metal? Links?


----------



## uberwootage

Quote:


> Originally Posted by *metal409*
> 
> Are you saying that wow isn't stressing the card or your tdp is low because of a modded bios?
> 
> Here is what the witcher 3 looks like for me if I run a high clock, it keeps bouncing off the power limit.


My TDP is 300 W.

WoW is the only game I play. 3DMark will load the card, but I still don't throttle and my TDP never maxes out.

But yeah, I can't get WoW to max out a single 1080 at all, even at 5K.


----------



## metal409

Quote:


> Originally Posted by *uberwootage*
> 
> My TDP is 300w
> 
> Wow is the only game i play 3d mark will load it but i still dont throttle and my TDP never maxes out.
> 
> But yeah i cant get wow to max out a single 1080 at all. Even at 5k.


Gotcha. What BIOS are you on now? I kinda lost track, lol.


----------



## uberwootage

Quote:


> Originally Posted by *metal409*
> 
> Gotcha. What bios are you on now? I kinda lost track, lol.


I tested a modded FE BIOS, based on 86.04.11.00.0C.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> I got a big list of them. It really all depends. If you are doing the hardware mods go with the Nvidia bios. 86.04.17.00.01. Only improvement you will get off other bio's are higher tdp's but since your doing hardware mods you wont need those.
> 
> I'll be volt modding my card soon. Im getting 2 titan x's so im going to hot rod this 1080. I should have my pots on Monday then im just bringing a fluke home from work and i'll do it all. The fluke i have has not been calibrated in a few years so i do not use it for anything i need a accurate measurement on. The pots im using are also single turn 5 x 5 x 2.7 mm so its not to bulky. So it will be a nice clean look


Nice, nice!
I'm already on that BIOS; it's the default BIOS.
So you are saying some of the other BIOSes you tested helped you gain some MHz only by increasing the TDP?
Quote:


> Originally Posted by *KickAssCop*
> 
> That power mod worth it. Where do I order liquid metal. Links?


It depends on where you are from.








Simply search for:
Coollaboratory Liquid pro
Coollaboratory Liquid ultra
Thermal Grizzly Conductonaut


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> Nice nice!
> I'm already on that bios -default bios.
> So you are saying some of the other bioses you tested helped you gain soem mhz only by increasing TDP ?
> It depends where you are from


Over stock? Yeah, some are better than others. This is the best one I have used so far. It has a few tweaks, as well as the TDP bumped up to 300 W, so there is no throttling at all. On the stock BIOS I would throttle from the TDP and then the voltage; for some reason it did not go over 1.063 V, so I was limited to 2.18 as my max stable clock. With the voltage at 1.093 V I can get to 2.21 stable.

Still messing around with it. So far it seems good; it's been stable, with solid gains.


----------



## ssgwright

tested the liquid metal on that one resistor... no change here... it didn't do anything lol


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> tested the liquid metal on that one resistor... no change here... it didn't do anything lol


There are 3 that you need to do.


----------



## ssgwright

i thought if you did all three it locks the card at 2d clocks?


----------



## fat4l

Quote:


> Originally Posted by *ssgwright*
> 
> i thought if you did all three it locks the card at 2d clocks?


yes that's what will happen(afaik).
Do 2 resistors and report back


----------



## ssgwright

in the previous vid he only did the one and it worked.


----------



## bobrob

Thanks for the replies and the support, guys. No luck doing this by myself; I did not want to fiddle too much with it, as I did not want to ruin the pads...

Tomorrow my friend will come take a look. If he fails, I was thinking of using some Coollaboratory Ultra or something like the Technicqll R-082 electronic glue (2 g), as pointed out...


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> i thought if you did all three it locks the card at 2d clocks?


It could if you apply too much. This is the thing about a mod like this: there is another circuit in there that detects it, and if it's triggered, as a safety measure it locks the card to 2D clocks so that it does not damage itself. I haven't done it, but from what I've been seeing, people did all 3; if you short too much, it will lock down to 2D clocks. Which means there is a threshold the sense circuit must measure; if it doesn't, it thinks there is a problem and downclocks to save itself.

I haven't really looked at this circuit. A bypass should be pretty simple, but I would have to look at it. I just ignored it and went with a 300 W TDP in the BIOS.

Quote:


> Originally Posted by *ssgwright*
> 
> in the previous vid he only did the one and it worked.


What BIOS are you running? You might be running a BIOS with a higher TDP, in which case you really won't see much of a gain from that mod.


----------



## fat4l

Exactly. If you do all 3 and you put too much CLU on, it will lock the card at 2D clocks.
Therefore do just 2 resistors, with a fair amount of CLU, and you should be good to go!
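For readers wondering why shorting the sense components raises the power headroom at all: the card estimates board power from the voltage drop across low-value shunt resistors, so lowering the effective resistance makes it under-report draw. A minimal numeric sketch (all component values assumed for illustration, not taken from the 1080 PCB):

```python
def reported_power(rail_v, true_current, r_shunt, r_mod=None):
    """Power the controller *thinks* is being drawn.

    The controller measures the voltage drop across the shunt and
    divides by the shunt's nominal resistance. If a parallel path
    (liquid metal / low-value resistor mod) lowers the real
    resistance, the drop shrinks and power is under-reported.
    """
    r_eff = r_shunt if r_mod is None else (r_shunt * r_mod) / (r_shunt + r_mod)
    v_drop = true_current * r_eff          # actual drop across modded shunt
    return rail_v * (v_drop / r_shunt)     # controller still assumes r_shunt

# 240 W really drawn on a 12 V rail, assumed 5 milliohm shunt:
print(reported_power(12.0, 20.0, 0.005))            # unmodded: reads 240 W
print(reported_power(12.0, 20.0, 0.005, 0.005))     # equal parallel R halves the reading: 120 W
```

Under-report too far (short the shunt outright) and the sanity check described above presumably trips, which would match the 2D-clock lockout people are seeing.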


----------



## ssgwright

my stock Zotac FE bios


----------



## ssgwright

So this is weird... I flashed to the MSI EK BIOS and now the TDP while running FurMark hits 97% max (it used to bounce up to 120%), but the card is still throttling?


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> so this is weird... I flashed to the MSI EK bios and now the TDP while running furmark hits max 97% (used to bounce up to 120) but the card is still throttling???


Needs more voltage


----------



## ssgwright

i have the volts maxed... it won't boost up... it's throttling.


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> i have the volts maxed... it won't boost up... it's throttling.


I've seen that with the MSI EK BIOS. I found the Corsair Sea Hawk BIOS works better on FEs.


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> I seen that with the msi ek bios. I found the corsair seahawk bios to work better on fe's


aren't they both the same?


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> aren't they both the same?


No, the Corsair one is a reference card; the EK is a custom.


----------



## ssgwright

Hook me up! And how did you mod the BIOS to a 300 W TDP?


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> hook me up! and how did you mod the bios to 300 tdp?


I'm testing it for a guy, since only a few people are testing BIOSes out. I wasn't told I could upload it, so I'll ask if I can give it out.


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> Testing it for a guy since only a few people are testing bios out. i'll ask if i can give it out i was not told i could upload it so i'll ask.


please do, I've been dying to push this thing!


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> please do, I've been dying to push this thing!


I'm waiting on the next one. Right now this one only hits 1.093 V, which means I can only get to just over 2.2 GHz. I need more volts to really push it.


----------



## ssgwright

Quote:


> Originally Posted by *uberwootage*
> 
> Im waiting on the next one. Right now this only hits 1.093v what means i can only get to just over 2.2ghz. Need more volts to really push it.


1.093 is fine with me, as long as I can push the power limit.


----------



## GreedyMuffin

Yeah. I'd rather BIOS-flash my card than physically mod it.

Getting my 1080 FE tomorrow. If it overclocks like a turd, I'll return it. I will test it in my 4770 rig first; that way I don't need to drain my loop.

Either a Strix or an FTW from EVGA. What do you guys think?

I've set my limit at a minimum of 2050 on stock voltage, or else it goes back.


----------



## uberwootage

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yeah. I'd rather bios flash my card instead of physical modding it.
> 
> Getting my 1080 FE tomorrow. If it overclocks like a turd, I'll return it. Will test it in my 4770 rig first, that way I don't need to drain my loop.
> 
> Either a Strix or a FTW from eVGA. What do you guys think?
> 
> I've put my limit on min 2050 on stock voltage, or else I'll trash it.


FEs outclock everything once they're put under water. The HOF might clock higher, but I haven't seen the clocks on those.


----------



## metal409

Quote:


> Originally Posted by *ssgwright*
> 
> Quote:
> 
> 
> 
> Originally Posted by *uberwootage*
> 
> Im waiting on the next one. Right now this only hits 1.093v what means i can only get to just over 2.2ghz. Need more volts to really push it.
> 
> 
> 
> 1.093 is fine with me as long as i can push the power limit
Click to expand...

In the same boat as you. Would love to get my hands on a modded bios.


----------



## GreedyMuffin

Yeah. Gotta test it before I purchase an EK block for it.

I've got 45 days before I need to return it (Norway, Komplett.no FTW!!).

I can run the fan at 100% just to test; I hope it will do around the same on water as with that fan speed.


----------



## uberwootage

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yeah. Gotta test it before I purchase a EK block for it.
> 
> I got 45 days before I need to return it (Norway, Komplett.no FTW!!).
> 
> I can run the fan on 100% just to test, it will do around the same on water with that fanspeed on I hope.


On water, your load temps will be about what the idle temps are with the fan at 100%. The card loves water cooling.


----------



## GreedyMuffin

Nah. I've only got an XE240 and an XTX360. I would love an extra 480.


----------



## ssgwright

hook me up uber!


----------



## jedimasterben

Quote:


> Originally Posted by *ssgwright*
> 
> i thought if you did all three it locks the card at 2d clocks?


Nope. Not sure what causes that.


----------



## ssgwright

Quote:


> Originally Posted by *jedimasterben*
> 
> Nope. Not sure what causes that.


have you tried it? does it work to do all three?


----------



## ssgwright

screw it im doing it, be back in a minute


----------



## jedimasterben

Quote:


> Originally Posted by *ssgwright*
> 
> have you tried it? does it work to do all three?


http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/3000_40#post_25355441


----------



## Neitzluber

Quote:


> Originally Posted by *KillerBee33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kold*
> 
> If you really want to pay $1200 before possible tax. We really need AMD to bring some competition or the next Titan could end up being $1500.
> 
> 
> 
> A lot of people grabbed 2 1080s and not for MSRP , don't see a bad thing saving a SLI headache and money on a single GPU .
> We don't even know its performance yet , maybe 1200 is well worth it
Click to expand...

Titan X (Pascal) at 11 TFLOPS is nowhere near the 15-16 TFLOPS you would get by running two GTX 1080s in SLI, and yet it is twice the price...
SLI isn't much of a headache these days.
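Those TFLOPS figures follow from the usual back-of-the-envelope formula of 2 FLOPs (one fused multiply-add) per CUDA core per cycle. A quick sketch using the published core counts and clocks; real game scaling in SLI will of course fall well short of the raw sum:

```python
def fp32_tflops(cuda_cores, clock_ghz):
    # peak FP32: 2 FLOPs (one FMA) per core per clock cycle
    return 2 * cuda_cores * clock_ghz / 1000

titan_xp = fp32_tflops(3584, 1.531)        # Titan X (Pascal) at boost clock
sli_1080 = 2 * fp32_tflops(2560, 1.607)    # two GTX 1080s at base clock
print(round(titan_xp, 1), round(sli_1080, 1))  # ~11.0 and ~16.5
```

At the 2.1 GHz overclocks discussed in this thread, each 1080 lands closer to 10.7 TFLOPS, widening the gap further.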


----------



## ssgwright

Well, maybe I'm doing something wrong, but I just did all three and there's no difference...


----------



## ssgwright

here's a pic... sorry it's all blurry I just took it real quick and threw the block back on:


----------



## uberwootage

Quote:


> Originally Posted by *ssgwright*
> 
> here's a pic... sorry it's all blurry I just took it real quick and threw the block back on:


Comparing against the guide (and that migraine-inducing pic, lol), C726 is different. Chances are it's nothing; I know the value of that cap in the guide, but not the one on your card.

It's probably nothing, but as a rule of thumb, if one component is different, do not try anything on it. It's too hard to tell what else was changed and whether any internal traces are different. Chances are they're not, but you have to be 100% sure, because manufacturers don't change component package styles at random; your board would have required a totally separate program for the pick-and-place machine.

My advice is to get some high-res images of your PCB and start comparing them. Next time I take my block off I'll check mine; if it's like yours, I'll measure the cap and see what it is.


----------



## looniam

Quote:


> Originally Posted by *ssgwright*
> 
> here's a pic... sorry it's all blurry I just took it real quick and threw the block back on:
> 
> 
> Spoiler: Warning: Spoiler!


between your application and that picture, imagine your hand shakes a bit.


----------



## ssgwright

Quote:


> Originally Posted by *looniam*
> 
> between your application and that picture, imagine your hand shakes a bit.


lol, the application is actually pretty good, but as you can see in the pic, I had the card on my lap, balancing the block on my leg, all while snapping a quick photo... I just didn't let it focus; I snapped it real quick.


----------



## looniam

Oh sorry, I was expecting it to look more "bead-like".


Spoiler: Warning: Spoiler!







but still, maybe have a beer.


----------



## xTesla1856

Guys, I need your help: I'm really torn between 3 cards and I don't know which one to get:

- MSI Gaming X
- EVGA FTW
- Zotac AMP Extreme

Help me make up my mind, and maybe tell me about your experience (temps, noise, coil whine). Thanks a million!


----------



## Kold

They'll all perform similarly. Go with EVGA for the excellent customer service and forums.


----------



## boredgunner

Quote:


> Originally Posted by *Kold*
> 
> They'll all perform similarly. Go with EVGA for the excellent customer service and forums.


This is all true. Those three have some of the best cooling of any GTX 1080. Overclocks are similar on all of them, although the EVGA FTW seems to average a bit less than the others. While EVGA does have customer service going for it (I've dealt with them and agree it's very good), I'd go with the Zotac AMP Extreme given the opportunity: probably the best cooler of the three (even though the other two are really good), and I have slightly more faith in its overclocking capabilities.

The downside to the Zotac is the ugly backplate, but I'd just paint it all black. The MSI Gaming X looks the same as many other Twin Frozr cards; the EVGA FTW is probably the best-looking GPU of all time.


----------



## fjodsk

Hey guys, I'm new to this forum and quite a noob. I have a budget of $800, so I'm most likely gonna go with the GTX 1080 Xtreme. Are there any BIOS-level tweaks that can remove the 1.093 V cap? I'm not going to go much above that; I just want that 2100 MHz in case I lose the silicon lottery. Preferably no hardware mods; I suck and will probably screw it up.

Thanks guys,


----------



## fewness

Quote:


> Originally Posted by *fat4l*
> 
> Thanks mate. Just to make sure are we talking about caps or resistors you shorted? Can you show a pic of the pcb ?
> How did you short them? Clu or wire?
> Afaik if you short just 2 resistors (rs1-3) out of 3 then you should be on the safe side and your clocks should be fine. Maybe you wanna try?
> 
> Also der8auer will be posting a 1080 volt mod vid tomorrow on youtube.
> 
> Regarding strix OC bios, its been proven that it only adds "visual" mhz and volts, but real performance is not there.


I'm too lazy to pull the cards out and take pictures now, but it's the capacitors; right now I have 10-ohm resistors soldered on top of them. I didn't have CLU.

I will wait until someone actually shows a proven performance improvement before doing anything on the cards again, if at all.

On the "visual" side, I'm now suspecting it's the Ctrl+F OC curve's fault. It's the curve method that gave me a "visual" improvement in MHz and volts, but lower scores...


----------



## fat4l

Quote:


> Originally Posted by *fewness*
> 
> I'm too lazy to pull cards out and take pictures now, but it's the capacitors, right now I have 10ohm resistors soldered on top of them. I didn't have CLU.
> 
> I will wait till someone actually show a proven performance improvement before doing anything, if at all, on the cards again.
> 
> On the "visual" side, I'm now suspecting it's the Ctrl-F oc-curve's fault. It's the curve method that gave me "visual" improvement on mhz and volts, but lower scores....


Aha. I will do the mod. I will put CLU on the 2 resistors, not the capacitors.
Let's see.


----------



## Joshwaa

My EVGA GTX 1080 never hits the power limit; it's the voltage limit I always hit. I saw above that someone was running a modded FE BIOS? Does that mean we now have an editor, or someone who can mod them? I would gladly put up a bounty for a mod on the FTW BIOS.


----------



## ssgwright

Quote:


> Originally Posted by *Joshwaa*
> 
> With my EVGA GTX 1080 I never hits the Power Limit. It is the voltage limit I always hit. I saw above where someone was running a modded FE Bios? Does that mean we now have an editor or someone who can mod them. I would gladly put up a bounty for a mod on the FTW Bios.


he's being stingy and won't give it up lol


----------



## fjodsk

Damn, that would be awesome! Are there any good BIOSes right now?


----------



## ChaosBlades

Quote:


> Originally Posted by *Joshwaa*
> 
> With my EVGA GTX 1080 I never hits the Power Limit. It is the voltage limit I always hit. I saw above where someone was running a modded FE Bios? Does that mean we now have an editor or someone who can mod them. I would gladly put up a bounty for a mod on the FTW Bios.


I hit the 2,100 MHz limit before I hit either on my 1080 Sea Hawk EK X. It took the fun out of overclocking, to be honest. At full load, 2.1 GHz core and 5.2 GHz memory, I was only hitting about 73% power limit, and that's still with a 0 voltage offset. No matter the voltage offset, I can't get past 2,126 MHz, and even that wasn't fully tested stable.


----------



## Kold

Quote:


> Originally Posted by *ChaosBlades*
> 
> I hit the 2.100Ghz limit before I hit either on my 1080 SeaHawk EK X. Took the fun out of overclocking to be honest. At full load 2.1Ghz Core and 5.2Ghz Memory I was only hitting like 73% Power limit and I still have a 0 voltage offset. No matter what voltage offset I can't get past 2.126Ghz and even that wasn't fully tested stable.


Yup, 2126 MHz is my max OC as well. Increasing the voltage only lets me hold higher clock rates once it starts to downclock from heat.


----------



## ChaosBlades

Quote:


> Originally Posted by *Kold*
> 
> Yup 2126MHz is my max OC as well. Increasing the voltage only allows me to have higher clock rates once it starts to down clock from heat.


It is so weird, I hope at some point there is some kind of technical explanation for this.

I don't have that problem







I think 43°C was the highest I saw, since, you know... waterblock.


----------



## Joshwaa

2139 MHz at 1.081 V is what mine is limited to. If I try to push it any higher, it will crash the driver.


----------



## Joshwaa

Sad day... EK pushed the date for the FTW and Classy waterblocks back to late August.


----------



## nexxusty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Arizonian gave me two infractions for calling him out on his behavior. A+ modding bro, give a card owner two infractions and let an agitator who doesn't even own a card skate away scot free. I feel like I am in grammar school all over again.


I own a GTX 1080.....


----------



## GreedyMuffin

Hi!

Can anyone with a 1080 OCed to around 2100 MHz test vanilla Time Spy? (I guess that's where most 1080s end up.)

I got this result with a 1500/1989 MHz 980 Ti and a 5960X at 4.5 GHz.

Cheers and thanks!









Also, I get around 41°C when folding on my 980 Ti. Will the temp be lower on the 1080 under water? It has a lower TDP, but a smaller die is harder to cool; dunno how big a difference that makes.

Also, which waterblock is the highest performing one for the 1080 FE?


----------






## ssgwright

i score around 7800 with mine


----------



## MrTOOSHORT

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Hi!
> 
> Can anyone with a 1080 OCed to around 2100mhz test vanilla Timespy? (I guess there's where most 1080s hit).
> 
> I got this result with a 1500/1989 mhz 980Ti/4500 5960X.
> 
> Cheers and thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I get around 41°C when folding on my 980Ti, will the temp be lower on the 1080 under water? It got a lower TDP, but is a smaller die - harder to cool? Dunno how big of a difference that makes.
> 
> Also, which waterblock is the highest performing one for the 1080 FE?


At max clocks on both my Titan X and my former 1080 (1586 MHz and 2152 MHz respectively), the GPU scores in Time Spy differed by 1500 points: 6800 vs. 8300.


----------



## pantsoftime

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Hi!
> 
> Can anyone with a 1080 OCed to around 2100mhz test vanilla Timespy? (I guess there's where most 1080s hit).


Came in at 7849 earlier today for the graphics score. GPU around 2100 and CPU is 4930K @ 4.7


----------



## xer0h0ur

Quote:


> Originally Posted by *nexxusty*
> 
> I own a GTX 1080.....


My mistake then, didn't see your name on the list.


----------



## fat4l

Quote:


> Originally Posted by *ssgwright*
> 
> here's a pic... sorry it's all blurry I just took it real quick and threw the block back on:


maybe you should put more paste there


----------



## fat4l

Quote:


> Originally Posted by *fewness*
> 
> I'm too lazy to pull cards out and take pictures now, but it's the capacitors, right now I have 10ohm resistors soldered on top of them. I didn't have CLU.
> 
> I will wait till someone actually show a proven performance improvement before doing anything, if at all, on the cards again.
> 
> On the "visual" side, I'm now suspecting it's the Ctrl-F oc-curve's fault. It's the curve method that gave me "visual" improvement on mhz and volts, but lower scores....


Which type of resistors did you use? Thanks.


----------



## flexus

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yeah. Gotta test it before I purchase a EK block for it.
> 
> I got 45 days before I need to return it (Norway, Komplett.no FTW!!).
> 
> I can run the fan on 100% just to test, it will do around the same on water with that fanspeed on I hope.


I'm still waiting for my MSI Sea Hawk EK X from Komplett. It has not been in stock yet, and the expected date has been moved a couple of times.
So I'm starting to get impatient, as I have disassembled my loop, hehe.


----------



## GreedyMuffin

Yeah. Seems like that is the case with most 1080s. That's why I purchased a demo unit for a little less (500 NOK, approx. 60 USD).

Will pick up the card within an hour, gotta eat breakfast first. It's waiting for me.


----------



## flexus

Nice








Yes, I guess it is the vendor, as it has not been in stock in Norway yet.

I thought I would drop the DIY route this time with this card, as I would have ended up around the same price buying a waterblock, backplate and an FE card.


----------



## GreedyMuffin

For me it's a little bit cheaper, plus I can run it on air if I upgrade. (Slap it into a folding machine etc).


----------



## jedimasterben

Quote:


> Originally Posted by *ssgwright*
> 
> here's a pic... sorry it's all blurry I just took it real quick and threw the block back on:


When you get super blurry pics like that, the easy way to fix it is to get more light to the camera sensor: turn on every light around you or move to a better-lit room, and the pics will turn out much better.








Quote:


> Originally Posted by *fewness*
> 
> I'm too lazy to pull cards out and take pictures now, but it's the capacitors, right now I have 10ohm resistors soldered on top of them. I didn't have CLU.
> 
> I will wait till someone actually show a proven performance improvement before doing anything, if at all, on the cards again.
> 
> On the "visual" side, I'm now suspecting it's the Ctrl-F oc-curve's fault. It's the curve method that gave me "visual" improvement on mhz and volts, but lower scores....


So you only added 10-ohm resistors onto the capacitors per the xDevs article, and did not short resistors RS1, RS2, and RS3? If that is the case, then I don't think it will work properly. The resistors are the primary part of the mod; adding the resistors on top of the capacitors is more of a 'take it the extra mile' thing.


----------



## fat4l

Quote:


> Originally Posted by *jedimasterben*
> 
> When you get super blurry pics like that, the easy way to fix it is to get more light to the camera sensor, so turn on every light around you or move to a better lit room, pics will turn out much better
> 
> 
> 
> 
> 
> 
> 
> 
> So you only added 10ohm resistors onto the capacitors according to the xDevs article and did not short resistors RS1, RS2, and RS3? If that is the case, then I don't think it will work properly. The resistors are the primary part of the mod and adding the resistors on top of the capacitors is to take it 'the extra mile' sort of thing.


Tin(the guy who made the guide), at kingpincooling, said that:
_"No need to do anything with shunts RS1,RS2,RS3.

Only solder low-value resistors to three caps C278, C271, C263 like here:"_

http://forum.kingpincooling.com/showthread.php?t=3879&page=5


----------



## jedimasterben

Quote:


> Originally Posted by *fat4l*
> 
> Tin(the guy who made the guide), at kingpincooling, said that:
> _"No need to do anything with shunts RS1,RS2,RS3.
> 
> Only solder low-value resistors to three caps C278, C271, C263 like here:"_
> 
> http://forum.kingpincooling.com/showthread.php?t=3879&page=5


Very interesting. Thanks


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> Tin(the guy who made the guide), at kingpincooling, said that:
> _"No need to do anything with shunts RS1,RS2,RS3.
> 
> Only solder low-value resistors to three caps C278, C271, C263 like here:"_
> 
> http://forum.kingpincooling.com/showthread.php?t=3879&page=5


Does he know what the phase angle is after he adds the resistors? Because the total impedance and current will take on a phase angle between 0 and 90 degrees when you do that.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> Does he know what the phase angle is after he adds the resistors? Because the total impedance and current will have a phase angle from 0-90 deg when you do that.


And what does that mean and what does it cause?


----------



## nexxusty

Quote:


> Originally Posted by *uberwootage*
> 
> Does he know what the phase angle is after he adds the resistors? Because the total impedance and current will have a phase angle from 0-90 deg when you do that.


Ummm. TiN?

He's an engineer at EVGA.

As in... you don't need to question his work. Ever.


----------



## yenclas

I have this pcb (Palit 1080 Gamerock):

https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/images/front.jpg

Can I "solder" resistors with Coollaboratory Liquid Pro to unlock the power limit? Which ones?

Thank you very much


----------



## fireyfire

Quote:


> Originally Posted by *fat4l*
> 
> And what does that mean and what does it cause?


Adding resistors to a reactive circuit (an AC or pulsed-DC signal with capacitors and/or inductors) will change the phase angle. E.g. in purely inductive circuits voltage leads current by 90 degrees, and in purely capacitive circuits current leads voltage by 90 degrees. Adding resistors to capacitors can significantly change the impedance of the circuit, since impedance is a calculated value depending on current flow, voltage and signal frequency.

Long story short: if you add resistors you will shift the phase angle and change the circuit's impedance.

Sorry if there is missing information. I just woke up and my eyes are blurry
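For concreteness, the impedance magnitude and phase angle of a series RC path can be computed directly. The component values below are purely illustrative, not the actual values on the card:

```python
import cmath
import math

def series_rc(r_ohm: float, c_farad: float, freq_hz: float):
    """Impedance magnitude (ohms) and phase angle (degrees) of R in series with C."""
    omega = 2 * math.pi * freq_hz
    z = complex(r_ohm, -1.0 / (omega * c_farad))  # Z = R - j/(omega*C)
    return abs(z), math.degrees(cmath.phase(z))

# A pure capacitor sits at -90 deg (current leads voltage); adding series
# resistance pulls the angle back toward 0 deg (purely resistive).
mag, ang = series_rc(10.0, 1e-6, 100e3)
print(f"|Z| = {mag:.2f} ohm, phase = {ang:.1f} deg")
```

So a 10 ohm resistor in series with a 1 uF cap at 100 kHz is dominated by the resistance, with the phase angle only about 9 degrees off purely resistive.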


----------



## GreedyMuffin

What is an OK card?

Like 2100 on stock voltage, is that OK, or is it bad?

What about memory?


----------



## boredgunner

Quote:


> Originally Posted by *GreedyMuffin*
> 
> What is an OK card?
> 
> Like 2100 on stock voltage, is that OK, or is it bad?
> 
> What about memory?


2100 on stock voltage is really good. +500 memory (~11000 MHz) seems to be the standard memory overclock most go for and achieve.
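As a rough sanity check on that ~11000 MHz figure, here is the arithmetic, assuming the overclocking offset applies to the ~5005 MHz double-data-rate clock the tools display (GDDR5X then transfers at twice that):

```python
def effective_gddr5x_mhz(offset_mhz: float, displayed_base_mhz: float = 5005.0) -> float:
    """Effective GDDR5X transfer rate for a given memory overclock offset.

    Assumes the offset adds to the displayed ~5005 MHz clock, which GDDR5X
    doubles again for its effective transfer rate.
    """
    return 2 * (displayed_base_mhz + offset_mhz)

print(effective_gddr5x_mhz(0))    # 10010.0 -> the stock "10 GHz" spec
print(effective_gddr5x_mhz(500))  # 11010.0 -> the ~11000 MHz people quote
```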


----------



## nexxusty

Quote:


> Originally Posted by *GreedyMuffin*
> 
> What is an OK card?
> 
> Like 2100 on stock voltage, is that OK, or is it bad?
> 
> What about memory?


2100mhz is a great card. Anything 2.1ghz and over is the best you're going to get.

RAM would be about +500 or so.
Quote:


> Originally Posted by *boredgunner*
> 
> 2100 on stock voltage is really good. +500 memory (~11000 MHz) seems to be the standard memory overclock most go for and achieve.


Haha, bored, you beat me by a few seconds!


----------



## fat4l

Quote:


> Originally Posted by *nexxusty*
> 
> 2100mhz is a great card. Anything 2.1ghz and over is the best you're going to get.
> 
> RAM would be about +500 or so.
> Haha, bored, you beat me by a few seconds!


There's a guy in another thread saying if a card hits 2130-2150 it's below average lol....


----------



## nexxusty

Quote:


> Originally Posted by *fat4l*
> 
> There's a guy in another thread saying if a card hits 2130-2150 it's below average lol....


He's a fool.

Lol.

*edit*

K, I gotta know. Who is it?


----------



## KillerBee33

When you guys say HIT 2100, do you mean hit it once, or...?
I can hit 2215 in Firestrike but it crashes when the Physics test loads in


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> When you guys say HIT 2100 do you mean Hit it once or?
> I can hit 2215 in firestrike but it crashes when Physics test logs in


For me... "hit" just means a certain mhz you can obtain.

Not necessarily stable. But doesn't lock up immediately.

Essentially what you said in your first sentence. For instance I can't hit anything over 2100mhz. Card throttles or hits the power limit/voltage limit.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> For me... "hit" just means a certain mhz you can obtain.
> 
> Not necessarily stable. But doesn't lock up immediately.
> 
> Essentially what you said in your first sentence. For instance I can't hit anything over 2100mhz. Card throttles or hits the power limit/voltage limit.


Mine starts up @ 2215, throttles down to 2188 then 2154 I think, but I figured out a way to hold the highest 1.093 V as soon as the card starts boosting.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Mine starts up @ 2215, throttles down to 2188 then 2154 I think, but I figured out a way to hold the highest 1.093 V as soon as the card starts boosting.


I still haven't screwed with the curve yet. I really need to figure that out. That's a good card right there, it seems, though.


----------



## fat4l

Quote:


> Originally Posted by *KillerBee33*
> 
> Mine starts up @ 2215, throttles down to 2188 then 2154 I think, but I figured out a way to hold the highest 1.093 V as soon as the card starts boosting.


And you have it under air on the stock FE cooler?


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> and you have it under air @stock FE cooler ?


Uhummm, still thinking if i'm keeping it so , no AIO yet.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> I still haven't screwed with the curve yet. I really need to figure that out. That's good card right there it seems though.


If you're able to try, I can guide you


----------



## fat4l

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhummm, still thinking if i'm keeping it so , no AIO yet.


That's a nice card there!


----------



## Jpmboy

Quote:


> Originally Posted by *uberwootage*
> 
> Thing about the hardware TDP mod is that it's not needed.
> This is a Founders Edition running a modded bios. Game is maxed: 8X MSAA, CMMA,


post the bios - the OP will add it to Post#1. No worries, you have no liability if someone downloads it.
Quote:


> Originally Posted by *uberwootage*
> 
> Tested a modded FE bios. Based off 86.04.11.00.0C


So... you hex modded the bios? post it up!
Quote:


> Originally Posted by *ssgwright*
> 
> tested the liquid metal on that one resistor... no change here... it didn't do anything lol


you need a "lump" of CLU on the resistor - a painted coating will not do anything

Quote:


> Originally Posted by *ssgwright*
> 
> here's a pic... sorry it's all blurry I just took it real quick and threw the block back on:


Needs a lot more CLU.
Quote:


> Originally Posted by *looniam*
> 
> oh sorry, i was expecting it to look more "bead like"
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> but still, maybe have a beer.


A shot or two of single malt does it for me.









Quote:


> Originally Posted by *uberwootage*
> 
> Does he know what the phase angle is after he adds the resistors? Because the total impedance and current will have a phase angle from 0-90 deg when you do that.


So I met with one of the guys working with TiN and Kingpin on the card mods (HWBOT LN2 Party this weekend). A straight resistor short can trip the BIOS/driver fault and lock the card in the P8 state. A driver flush and refill can help... until the next high-current/high-TDP fault trip. As I understand it, the only way to do this so that the mod behaves properly and doesn't bork the BIOS/driver interface/handshake is 3x 10 ohm resistors.
Basically, the best thing to do (unless your 1080 was purchased with casino cash, since a hard mod will void the warranty) is to wait for the new releases with a filled-out power section (note the unfilled PCB locations for additional VRMs and chokes). But if you must squeeze the 1080... word among the suicide-run guys is that the 10 ohm resistors are the only real solution.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> If you're able to try, I can guide you


That'd be awesome bro.

I should have my system up in a few hours. Just dealing with a concave 5930k IHS first.

Appreciated.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> That'd be awesome bro.
> 
> I should have my system up in a few hours. Just dealing with a concave 5930k IHS first.
> 
> Appreciated.


NP, but I notice with the CURVE you get a lower score at higher clocks, even at the stable highest voltage. Let me know when


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> NP, but I notice with the CURVE you get a lower score at higher clocks, even at the stable highest voltage. Let me know when


Hmph. OK, cool. I'll be testing for this as well then.

Talk soon.


----------



## MrTOOSHORT

The curve never worked so great with Timespy, but it worked pretty good for Heaven.


----------



## fat4l

Quote:


> Originally Posted by *Jpmboy*
> 
> post the bios - the OP will add it to Post#1. No worries, you have no liability if someone downloads it.
> So... you hex modded the bios? post it up!
> you need a "lump" of CLU on the resistor - a painted coating will not do anything
> Needs a lot more CLU.
> A shot or two of single malt does it for me.
> 
> 
> 
> 
> 
> 
> 
> 
> So I met with one of the guys working with TiN and Kingpin on the card mods (HWBOT LN2 Party this weekend). A straight resistor short can trip the BIOS/driver fault and lock the card in the P8 state. A driver flush and refill can help... until the next high-current/high-TDP fault trip. As I understand it, the only way to do this so that the mod behaves properly and doesn't bork the BIOS/driver interface/handshake is 3x 10 ohm resistors.
> Basically, the best thing to do (unless your 1080 was purchased with casino cash, since a hard mod will void the warranty) is to wait for the new releases with a filled-out power section (note the unfilled PCB locations for additional VRMs and chokes). But if you must squeeze the 1080... word among the suicide-run guys is that the 10 ohm resistors are the only real solution.


Any idea what's the max safe 24/7 voltage for a 1080 under water? 40C temps...


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> Any idea whats the max safe 24/7 voltage for 1080 under water? 40C temps...


Is there a BIOS with max voltage I didn't know about?

Max so far is 1.093 V and that's nothing, lol, my friend's 980 G1 runs 1.32 V 24/7


----------



## fat4l

Quote:


> Originally Posted by *KillerBee33*
> 
> Is there a BIOS with max voltage I didn't know about?
> 
> Max so far is 1.093 V and that's nothing, lol, my friend's 980 G1 runs 1.32 V 24/7


I'm talking about v-mod


----------



## nexxusty

Quote:


> Originally Posted by *fat4l*
> 
> I'm talking about v-mod


I wouldn't touch a VMOD for 24/7 without a TEC or a water chiller.

Full cover waterblock as well. Even then.... because I don't truly KNOW I'd still be worried about the long term.

These cards have enough voltage for now. 1.1 V is the max they'll go with software, and that seems to get 2200 MHz with proper cooling on most cards once the power limit is removed.


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> I'm talking about v-mod


Ehh not touching that , gonna wait for BIOS Tools and if not just sell it


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> I'm waiting on the next one. Right now this only hits 1.093 V, which means I can only get to just over 2.2 GHz. Need more volts to really push it.


C'mon maaan... we need this bios! Upload pls


----------



## KillerBee33

Some1 posted this...

Not sure if i should congratulate him on Photoshop Skillz or leave 1080 , hard mod it and stop chasing Pascals


----------



## kx11

yeah i might pick this up , no TitanX though


----------



## Jpmboy

Quote:


> Originally Posted by *fat4l*
> 
> Any idea whats the *max safe 24/7 voltage for 1080* under water? 40C temps...


Quote:


> Originally Posted by *fat4l*
> 
> I'm talking about v-mod


yeah, this is not known... anyone doing trim pots or adding a new power section is not running ambient water. Sorry.
Then again... this is Overclock.net, not Safevoltage.net.


----------



## Derpinheimer

Quote:


> Originally Posted by *boredgunner*
> 
> 2100 on stock voltage is really good. +500 memory (~11000 MHz) seems the standard memory overclock most go for and achieve.


Just got an EVGA FTW; it does 2100 on stock voltage.. but only 2113 with +100 :/ (at least, it insta-crashes in BF4 with settings maxed and +200% resolution scale)


----------



## Noufel

hi people

I got my Palit GameRock PE 1080. Is there any chance that it will be bottlenecked by my 4790k at 1080p resolution? I have a 144hz 1080p monitor.

I know it's not a gpu for this kind of resolution, but I'm one of those that prefer higher fps over higher resolution


----------



## uberwootage

Quote:


> Originally Posted by *Noufel*
> 
> hi people
> 
> 
> 
> 
> 
> 
> 
> 
> i got my Palit GameRock PE 1080, is there any chance that it will be bottlenecked by my 4790k on 1080p resolution, i have a 144hz 1080p monitor
> 
> 
> 
> 
> 
> 
> 
> i know it's not a gpu for this kind of resplutions but i'm one of those that prefere higher fps than higher resolution


No, you're fine. I ran a 1080 on an i5 4690K for a week till I finished my Skylake build. 4.5 GHz because it was a turd; 4.6 needed 1.4 V


----------



## RavenousSix

So got my ASUS GTX 1080 Strix yesterday and I got lucky on silicon lottery. Can hit 2100Mhz GPU stable.

23,042 Firestrike Graphics Score, not bad:

http://www.3dmark.com/3dm/13610562

EDIT:

Made a mistake when installing the card, fan noise fixed, card is fine, I edited the post to reflect that.


----------



## GreedyMuffin

Thanks for ya help!

I need a WB ASAP.

65°C at 100% fan speed.

It will downclock to 2088 after some time due to temps. So instead I was testing mem.

+500, +550, +600, +650, +700.. by now I was thinking *woah*. Tested another run in Valley at +750 and stopped; will test +800 tomorrow.
No sign of artifacts etc yet.

Is it dangerous to play with such a high mem speed? I don't want to ruin something.


----------



## dentnu

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Thanks for ya help!
> 
> I need a WB ASAP.
> 
> 65°C at 100% fan speed.
> 
> It will downclock to 2088 after some time due to temps. So instead I was testing mem.
> 
> +500, +550, +600, +650, +700.. by now I was thinking *woah*. Tested another run in Valley at +750 and stopped; will test +800 tomorrow.
> No sign of artifacts etc yet.
> 
> Is it dangerous to play with such a high mem speed? I don't want to ruin something.


I could also push my RAM really high, but after +525 I start seeing lower fps due to the memory ECC kicking in. You need to test and make sure you are not losing frames or getting lower benchmark scores at anything over +500. Yes, I know you think it's stable since you don't see any artifacts or driver crashing, but trust me, ECC is kicking in and you are for sure getting lower FPS and benchmark scores.
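A simple way to check for that instead of eyeballing it: record (offset, score) pairs from repeated benchmark runs and pick the offset where the score actually peaks, rather than the highest offset that doesn't crash. The numbers below are made up for illustration:

```python
# Sketch: find where GDDR5X error-correction starts eating performance.
# Feed it (memory_offset_mhz, benchmark_score) pairs collected by hand;
# the best offset is where the score peaks, not the highest "stable" offset.

def best_memory_offset(results):
    """results: list of (offset, score) pairs. Returns the offset with the top score."""
    return max(results, key=lambda pair: pair[1])[0]

# Illustrative data only: score climbs with offset, then ECC retries cost
# more than the extra bandwidth gains.
runs = [(0, 21000), (250, 21400), (500, 21750), (525, 21600), (600, 21300)]
print(best_memory_offset(runs))  # 500
```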


----------



## GreedyMuffin

I've been thinking it could be that as well.

But as far as I remember, I didn't get a lower score. But I will double check tomorrow!


----------



## ssgwright

sad... no posts in 6 hrs....


----------



## chronicfx

Quote:


> Originally Posted by *ssgwright*
> 
> sad... no posts in 6 hrs....


Drop and give me 50...posts!


----------



## ssgwright

Quote:


> Originally Posted by *chronicfx*
> 
> Drop and give me 50...posts!


lol if we had a bios editor this place would be hopping!


----------



## kx11

so HOF cards can run at 1.3 V

but you have to use XtremeTuner+ for that


----------



## carlhil2

Is the Seahawk with the EK block a good GPU? I see that my local MC has one left in stock. It's between this one and the Classified. I decided to only get the new Titan X if nVIDIA lets them be sold elsewhere too. I want at least 2100......


----------



## MrTOOSHORT

Quote:


> Originally Posted by *carlhil2*
> 
> Is the Seahawk with the EK block a good gpu? I see that my local MC has one left in stock. it's between this one and the Classified. I decided to only get the new Titan X if nVIDIA let them be sold elsewhere also.I want at least 2100......


It's pretty good, look:



The stock bios is good, but I did better with the MSI FE bios from the TechPowerUp database.


----------



## carlhil2

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It's pretty good, look:
> 
> 
> 
> The stock bios is good, but I did better with the MSI FE bios from techpowerup data base.


Lol, what's up Kid? I forgot that you were pushing one. If the Classifieds don't hit MC by the weekend, I will snatch one of the Seahawks up. Should hold me til something else greater drops that you don't have to go through nVIDIA to cop.... thanks for the info. If it's good enough for you, it's good enough for me, you're that dude....


----------



## 337drew

Quote:


> Originally Posted by *Aph0ticShield*
> 
> 130% is the limit on FTW. Use the Slave bios.


EVGA Precision doesn't allow me a 130% power limit with the slave bios on my FTW. How did you go about this? I see no difference between bioses from a Precision perspective. Is there something else I should be using?

thx


----------



## schoolofmonkey

You know what's funny.
I came from an EVGA GTX 980 Ti Hybrid to an Asus GTX 1080 Strix. I overclocked the Strix to 2050 MHz and temps hit 70c; my brain went "oh no, that's too hot". After owning a Hybrid I forgot 70c is actually really good for an air-cooled card.

I'm really impressed with the noticeable "out of the box" performance difference compared to the GTX 980 Ti Hybrid


----------



## SauronTheGreat

Can someone tell me how important AA is at 4K? In The Witcher 3 I turn off both AA and AA for HairWorks and barely see any difference, but with my 1080 G1 in factory OC mode there is a 5 or 7 fps gain with those settings off... Are there any other settings which make no difference at 4K in The Witcher 3 and should be lowered or turned off for better frame rates?


----------



## KickAssCop

Ordered 2 STRIX. Good choice?


----------



## kx11

Quote:


> Originally Posted by *KickAssCop*
> 
> Ordered 2 STRIX. Good choice?


amazing choice


----------



## pez

Quote:


> Originally Posted by *SauronTheGreat*
> 
> Can someone tell me how important AA is at 4K? In The Witcher 3 I turn off both AA and AA for HairWorks and barely see any difference, but with my 1080 G1 in factory OC mode there is a 5 or 7 fps gain with those settings off... Are there any other settings which make no difference at 4K in The Witcher 3 and should be lowered or turned off for better frame rates?


With most games in 4K, I either turn AA completely off, down to 2x, or just use FXAA. Those are the only ones I like to compromise on, at least.


----------



## schoolofmonkey

Quote:


> Originally Posted by *KickAssCop*
> 
> Ordered 2 STRIX. Good choice?


Quote:


> Originally Posted by *kx11*
> 
> amazing choice


Second this one, heck of a lot better than the Matrix cards they had at one point.


----------



## KickAssCop

Do I need an HB bridge for 1440p? I think not. Hoping to grab a ribbon from somewhere.


----------



## SauronTheGreat

Quote:


> Originally Posted by *pez*
> 
> With most games in 4K, I either turn AA completely off, down to 2x, or just use FXAA. Those are the only ones I like to compromise on, at least.


But still, how much difference do you see with AA on vs. off at 4K? When I used to play at 1080p the difference between AA on and off was a lot.


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> so HOF cards can run on 1.3 voltage
> 
> but you have to use XtremeTuner+ for that


Hmmm, any results?
I'm only worried about whether increasing voltage brings any +MHz.
We can vmod the card, but if I do so I wanna be sure I get some neat MHz out of it.
Der8auer modded a 1060 to 1.25 V and could run it @ 2200 MHz, while at stock he could do 2025 MHz.

Quote:


> Originally Posted by *uberwootage*
> 
> Testing it for a guy since only a few people are testing the bios out. I wasn't told I could upload it, so I'll ask if I can give it out.


Maybe @uberwootage could spread some love and give us the "modded" tdp bios ?


----------



## KickAssCop

Quote:


> Originally Posted by *KillerBee33*
> 
> Some1 posted this...
> 
> Not sure if i should congratulate him on Photoshop Skillz or leave 1080 , hard mod it and stop chasing Pascals


Any info on price?


----------



## KillerBee33

Quote:


> Originally Posted by *KickAssCop*
> 
> Any info on price?


Heh, looks fake to me, just felt like posting it

I might be wrong though


----------



## KickAssCop

Looks pretty authentic to be honest. And I just ordered the 1080s.


----------



## pez

Quote:


> Originally Posted by *SauronTheGreat*
> 
> but still how much difference do you see while running AA on and off in 4k ? although when i used to play in 1080p the difference was a lot while playing with AA on and off


I tested specifically for it in Crysis 3 and Fallout 4 and could not tell a difference. I did see a small difference in Fallout 4, but I had to look for it way too hard for it to be worth the impact it had on performance. GTA V has 4 or 5 (maybe more) types of AA and I have 1 or 2 of them off or at the lowest (2x or FXAA). Those I have yet to notice either.

I stopped noticing the difference AA made back when i started to use 1440p, so I'm pretty pleased with how little I need it on 4K. I notice texture quality decreases much more than AA disabling/decreasing.


----------



## skline00

KickAssCop: The GTX 1080 is a VERY fast card. I'm running it at 3440 x 1440 and I get very high frame rates on all games.


----------



## toncij

What is the most silent 1080 and 1070 so far? MSI? Gainward? Zotac?


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Some1 posted this...
> 
> Not sure if i should congratulate him on Photoshop Skillz or leave 1080 , hard mod it and stop chasing Pascals


It's fake. It'd be GDDR5X, not regular GDDR5, in such a card.

And a 1080 Ti will probably never happen.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> It's fake. It'd be GDDR5X, not regular GDDR5, in such a card.
> 
> And a 1080 Ti will probably never happen.


Yeap, first thing that caught my eye.


----------



## Joshwaa

Quote:


> Originally Posted by *toncij*
> 
> What is the most silent 1080 and 1070 so far? MSI? Gainward? Zotac?


The EVGA FTW is silent if you leave the stock fan profile. No whine from mine either.


----------



## ViTosS

Hey, can someone tell me what's wrong with my build for running The Witcher 3 with the GPU usage variation like this:




If you guys want to compare, there is a video in the description showing the same pass in the game but with an i7 5960X, and the GPU usage is solid 98~99%.

Any help is welcome, thank you!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *toncij*
> 
> What is the most silent 1080 and 1070 so far? MSI? Gainward? Zotac?


MSI Sea Hawk EK X, both 1080 and 1070.


----------



## pez

Quote:


> Originally Posted by *ViTosS*
> 
> Hey, can someone tell me what's wrong with my build for running The Witcher 3 with the GPU usage variation like this:
> 
> 
> 
> 
> If you guys want to compare, there is a video in the description showing the same pass in the game but with an i7 5960X, and the GPU usage is solid 98~99%.
> 
> Any help is welcome, thank you!


I don't have the game, but judging by usage, it looks like Witcher 3 may scale better with more cores? You're getting more usage on fewer cores from what I saw, and the video you referenced is getting less usage per core but using more cores (assuming they really are running the game and not just the OSD-related stuff). It doesn't look like clock speed does much in that other video. I know I'm kinda saying yes to the question in the video, but someone with more knowledge and experience with the game can correct me.


----------



## emett

Quote:


> Originally Posted by *KickAssCop*
> 
> Do I need HB bridge for 1440p? I think not. Hoping to grab a ribbon from somewhere.


Only with PCI-E 3.0 x8, not with x16 SLI. No gains.


----------



## moustang

Quote:


> Originally Posted by *ViTosS*
> 
> Hey, can someone tell me what's wrong with my build for running The Witcher 3 with the GPU usage variation like this:
> 
> 
> 
> 
> If you guys want to compare, there is a video in the description showing the same pass in the game but with an i7 5960X, and the GPU usage is solid 98~99%.
> 
> Any help is welcome, thank you!


It's probably a RAM bottleneck. If that's the rig in your signature then you've got two slow 4GB RAM sticks, and it's probably running into a bus-speed bottleneck when CPU usage is high.


----------



## Sheyster

Quote:


> Originally Posted by *Joshwaa*
> 
> The EVGA FTW is silent if you leave the stock fan profile. No whine from mine either.


Mine is silent and stable at 2114 MHz, using the stock curve. For those who don't know, the secondary EVGA FTW BIOS supports a +130% power target. Base TDP is 215 W. I get no throttling whatsoever in BF4 with the secondary BIOS at 2114 MHz. Using the primary BIOS (+120%) I got some throttling.
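Those power-target percentages translate to watt ceilings like this, using the 215 W base TDP quoted above:

```python
def power_ceiling_w(base_tdp_w: float, target_pct: float) -> float:
    """Board power limit in watts for a given power-target percentage."""
    return base_tdp_w * target_pct / 100

print(power_ceiling_w(215, 120))  # 258.0 -> W on the primary BIOS
print(power_ceiling_w(215, 130))  # 279.5 -> W on the secondary (slave) BIOS
```

That extra ~21 W of headroom is presumably why the secondary BIOS avoids power-limit throttling at the same clocks.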


----------



## Noufel

Quote:


> Originally Posted by *toncij*
> 
> What is the most silent 1080 and 1070 so far? MSI? Gainward? Zotac?


The Palit GameRock: very silent, good temps and amazing clocks


----------



## ViTosS

Quote:


> Originally Posted by *moustang*
> 
> It's probably a RAM bottleneck. If that's the rig in your signature than you've got two slow 4GB RAM sticks, and it's probably running into a bus speed bottleneck when CPU usage is high.


Oh my bad I forgot to update my sig, I have 4x4GB Corsair Vengeance Pro 1600Mhz now.


----------



## kx11

Quote:


> Originally Posted by *fat4l*
> 
> Hmmm, any results?
> I'm only worried about whether increasing voltage brings any +MHz.
> We can vmod the card, but if I do so I wanna be sure I get some neat MHz out of it.
> Der8auer modded a 1060 to 1.25 V and could run it @ 2200 MHz, while at stock he could do 2025 MHz.


well yeah, I can hit 2209 MHz for 3 to 10 seconds during a benchmark, but then it goes down to 2179 MHz


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> well yeah i can hit 2209mhz for almost 3 to 10 seconds during a benchmark but it goes down later to 2179mhz


what voltage is that ?


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> well yeah i can hit 2209mhz for almost 3 to 10 seconds during a benchmark but it goes down later to 2179mhz



Starts up 2227 but still throttles down to 2202


----------



## KillerBee33

Double Post


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Starts up 2224 and throttles down to 2202 , ran this for about 6 minutes no crash.


stock bios ?!


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> stock bios ?!


Uhumm, highest I got was 2272 for about 4 minutes


----------



## kx11

Nice. I need to kick this card a bit more. Funny how OCing the memory past +505 always decreases performance in benchmarks.

Are you on air cooling?!


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> Nice. I need to kick this card a bit more. Funny how OCing the memory past +505 always decreases performance in benchmarks.
> 
> Are you on air cooling?!


Yeap, anything higher than +500 is killing the performance; got up to +650 and stopped. Stock ref. cooler


----------



## kx11

Quote:


> Originally Posted by *fat4l*
> 
> what voltage is that ?


well


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Yeap,,anything higher than 500 is killing the performance, got up to 650 and stopped. Stock Ref. cooler


It seems MSI AB has voltage support for your GPU but not for mine, since it cannot apply more than 1.09 V. That's why I can't go a lot higher with it, and I'm too lazy and not knowledgeable enough to edit MSI AB to add voltage support for my GPU


----------



## ikjadoon

Quote:


> Originally Posted by *Noufel*
> 
> hi people
> 
> 
> 
> 
> 
> 
> 
> 
> i got my Palit GameRock PE 1080, is there any chance that it will be bottlenecked by my 4790k on 1080p resolution, i have a 144hz 1080p monitor
> 
> 
> 
> 
> 
> 
> 
> i know it's not a gpu for this kind of resplutions but i'm one of those that prefere higher fps than higher resolution


The 8 threads will help. An i5, though, will get bottlenecked severely if 144FPS is your aim.


----------



## kx11

delete


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> it seems that MSI AB have voltage support for your GPU but it doesn't for mine , that's why i can't go a lot higher with it and i'm too lazy and non-knowledgeable to edit MSI AB to add my GPU voltage support


There's an option in Options.
Try changing from standard MSI to "I forgot what it's called", the last of the 4 options, then restart AB.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> There's an option in Options
> Try changing from Standart MSI to "i forgot what it's called" last out of 4 options, restart AB.


i went through all of them , couldn't go higher than 100+


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> well


Ohoh. It's nice to have you here sir!

What is the scaling from stock 1.05v to 1.3v?
How much extra mhz did you get?


----------



## fat4l

Quote:


> Originally Posted by *KillerBee33*
> 
> 
> Starts up 2227 but still throttles down to 2202


Did you do the TDP mod? How was the card acting before?


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> Did you do tdp mod? How was the card acting before?


No mods at all,untouched stock.


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> i went through all of them , couldn't go higher than 100+


----------



## kx11

Quote:


> Originally Posted by *fat4l*
> 
> Ohoh. Its nice to have you here sir!
> 
> What is the scaling from stock 1.05v to 1.3v?
> How much extra mhz did you get?


actually it goes down to 0.8000 and all the way up to 1.3000

I could push 2209 MHz for 10 seconds during a benchmark, then it goes down to 2180 MHz and stays there until the bench is over. I couldn't do that without 1.3000 V applied, otherwise I'd see green dots all over the screen


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*


Hum... OK, try this: revert to stock and only raise Power and Voltage to max, then press Ctrl+F. When the curve opens, hold Ctrl and raise only the last pin on the right to 2300 or close to that. Apply in the main AB window and run again.

EDIT: Keep holding Ctrl when raising the curve


----------



## toncij

Quote:


> Originally Posted by *Sheyster*
> 
> Mine is silent and stable at 2114 MHz, using the stock curve. For those who don't know, the secondary EVGA FTW BIOS supports +130% power target. Base TDP is 215w. I get no throttling whatsoever in BF4 with the secondary BIOS at 2114 Mhz. Using the primary BIOS (+120%) I got some throttling.


What BIOS, please?


----------



## Jpmboy

Not sure anyone would be interested... nvidia sure has crippled *double precision FLOPS* since the OG TX:

6950X with 1 GTX 1080FE or 2 TitanX SLI.


vs an ANCIENT Tesla C-class:


----------



## toncij

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> MSI Sea Hawk EK X, both 1080 and 1070.


Well, if I choose water, I'd go custom.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Hum... OK, try this: revert to stock and only raise Power and Voltage to max, then press Ctrl+F. When the curve opens, hold Ctrl and raise only the last pin on the right to 2300 or close to that. Apply in the main AB window and run again.
> 
> EDIT: Keep holding Ctrl when raising the curve


that worked and I got a steady 2202 MHz at 1.09 V, but it didn't last long before crashing in the Heaven benchmark


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> that worked and i got a steady 2202mhz on a 1.09v , it didn't last long before crashing in Heaven benchmark


Try Memory +500, not more than that.
Edit: Having a steady 1.093 V does not seem to have a huge effect, but it does help above 2100 MHz. I run 2150 in Firestrike; will try more later today, but keep in mind this is on AIR at around 80 degrees.

http://www.3dmark.com/3dm/13621380


----------



## kx11

Kinda weird how HWiNFO64 reports the maximum voltage applied as 1.094 V while Xtreme Tuner+ reports 1.300 V.


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> kinda weird how HWinfo64 is reporting the maximum voltage applied was 1.094 while XtremeTuner+ is reporting 1.300v


Heh, I have no good reason to say it, but I don't trust Xtreme Tuner.








I usually go by GPU-Z and keep it up to date. I also tried the GALAX OC Tool a few times and it gave me a headache.
Generally, try not to run more than one monitoring tool at a time.


----------



## GreedyMuffin

Buying a waterblock for my GPU. The FE cooler is not good enough.

1. Is a backplate necessary to keep the card from sagging? I absolutely hate sagging; I feel so bad for the card's PCB.

2. Which waterblock do you like the most?

Thinking of going with EK because I like them the best.

Cheers!


----------



## fireyfire

Hi guys, I just want to tell you all about my experience overclocking the memory on my Zotac GTX 1080 AMP! Edition card. Note: I placed heatsinks on all of the RAM modules.
Link to heatsinks used:
*https://www.amazon.com/gp/product/B00637X42A/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1*
They were secured with generic thermal tape.

I used a synthetic benchmark, the MSI Kombustor Lakes of Titan test (I chose it because it is short and utilizes almost all of the VRAM).

This card is a "hybrid" card: I used an NZXT G10 bracket and an NZXT Kraken X61 cooler. The memory/VRM fan is silent, so I left it at full speed.

Here are the OC results:
Core is always at 2190 MHz

Memory was stable up to +825 MHz

+0 Memory Clock | Score: 41902 | 698 FPS

+100 Memory Clock | Score: 41544| 692 FPS

+200 Memory Clock | Score: 42325 | 705 FPS

+500 Memory Clock | Score: 42679 | 711 FPS

+600 Memory Clock | Score: 42864 | 714 FPS

+650 Memory Clock | Score: 42450 | 707 FPS

+700 Memory Clock | Score: 43164 | 719 FPS

+750 Memory Clock | Score: 42566 | 709 FPS

+800 Memory Clock | Score: 43139 | 718 FPS

+825 Memory Clock | Score: 43463 | 724 FPS (Stable in CS:GO, GTA V and Garrys mod Vsync off)

+850 Memory Clock | Unstable
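Restating that table as percent change over the +0 baseline makes the scaling easier to judge: noisy run to run, but broadly positive up to +825. A quick sketch using the scores above:

```python
# fireyfire's Kombustor scores, keyed by Afterburner memory offset (MHz).
results = {0: 41902, 100: 41544, 200: 42325, 500: 42679, 600: 42864,
           650: 42450, 700: 43164, 750: 42566, 800: 43139, 825: 43463}

baseline = results[0]
for offset, score in sorted(results.items()):
    gain = (score - baseline) / baseline * 100
    print(f"+{offset:>3} MHz: {score} ({gain:+.2f}%)")
# Even the best case (+825) is only ~3.7% over stock memory clocks,
# and a few intermediate steps actually score below their neighbors.
```

Run-to-run variance in a short benchmark is likely on the order of those dips, which is worth keeping in mind before blaming a specific offset.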


----------



## xer0h0ur

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Buying an WB block for my GPU. The FE cooler is not good enough.
> 
> 1. Is a backplate necessary in order to have the card not sag? I absolutely hate sagging, I feel so bad for the card's PCB.
> 
> 2. Which WB do you like the most?
> 
> Thinking of going with EK because I like them the best.
> 
> Cheers!


Not at all, you should be able to use any of the currently available waterblocks without needing the backplate. In reality on these Pascal cards so far, the backplate is purely there for aesthetics. The EK backplate wasn't available when I ordered my nickel plated block so I just reused the FE backplate on it for the sake of not having an exposed PCB.


----------



## juniordnz

Look what Mail Santa brought me yesterday.










Spoiler: Warning: Spoiler!


----------



## GreedyMuffin

Quote:


> Originally Posted by *xer0h0ur*
> 
> Not at all, you should be able to use any of the currently available waterblocks without needing the backplate. In reality on these Pascal cards so far, the backplate is purely there for aesthetics. The EK backplate wasn't available when I ordered my nickel plated block so I just reused the FE backplate on it for the sake of not having an exposed PCB.


Thanks! That will save me almost 60 bucks.

How did you mount the FE backplate, if I may ask?









I think I'll order the same block as you got.

Do you have any pics of your rig/loop?


----------



## kx11

Just recorded a video playing Witcher 3 at 4K with clocks ranging from 2165 down to 2139 MHz.

Playing at 2202 MHz is possible, but it produces texture pop-in and weird red dots every 3 seconds.


----------



## juniordnz

Quote:


> Originally Posted by *xer0h0ur*
> 
> Not at all, you should be able to use any of the currently available waterblocks without needing the backplate. In reality on these Pascal cards so far, the backplate is purely there for aesthetics. The EK backplate wasn't available when I ordered my nickel plated block so I just reused the FE backplate on it for the sake of not having an exposed PCB.


Just saw a review of the Gaming X 1080, and the part of the PCB where the VRMs sit gets VERY hot, like 82°C. Wouldn't a backplate with good-quality thermal pads help keep it cool?


----------



## GreedyMuffin

Quote:


> Originally Posted by *juniordnz*
> 
> Just saw a review of the Gaming X 1080 and the part where the VRMs are situated gets VERY hot. Like 82ºC. Wouldn't a backplate with good quality thermal pads help keeping it cool?


Probably not to that degree since it's watercooled and not aircooled.


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Probably not to that degree since it's watercooled and not aircooled.


Yeah, I forgot that, sorry. The waterblock will cover the VRMs as well...

Anyway, for air-cooled cards it's still a valid point, isn't it?


----------



## xer0h0ur

Quote:


> Originally Posted by *juniordnz*
> 
> Just saw a review of the Gaming X 1080 and the part where the VRMs are situated gets VERY hot. Like 82ºC. Wouldn't a backplate with good quality thermal pads help keeping it cool?


You know I probably should have specified that what I said was only with regard to the FE cards using the reference design. There just isn't anything significant there on the backside of the PCB that needs cooling.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> Thanks! That will save me almost 60 bucks.
> 
> How did you mount the FE backplate If i may ask?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think I'll order the same block as you got.
> 
> Do you have any pics of your rig/loop?


My water cooling loop is nothing fancy; it's in a case that doesn't have a window, so I didn't have to pretty things up, per se. Click on the 1080 club in my sig and that should lead to 3 pictures I had put up before in this thread.

As for how to use the FE backplate with an EK block, just re-use the hexagonal screws that you remove when taking off the FE air cooler. By the way, if you're going to use needle-nose pliers to remove those hexagonal screws, take it freakin' slow and steady. It's beyond easy to break off a component, scratch the PCB or sever a trace if you're spinning pliers willy-nilly while unscrewing them. Looking at my photo again, it appears I re-used 9 of those hexagonal screws. Either way, you can see the finished product in that photo to see what I mean.


----------



## uberwootage

Quote:


> Originally Posted by *fireyfire*
> 
> Hi Guys, I just want to yell all of you about my experience with memory overclocking the Zotac GTX 1080 AMP! Edition card that I have. Warning! I have placed heatsinks on all of the RAM modules.
> Link to heatsinks used:
> *https://www.amazon.com/gp/product/B00637X42A/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1*
> They were secured with a generic thermal tape.
> 
> I used an artificial benchmark (I chose it because it was short and it utilized almost all of the VRAM) It is the MSI Kombustor Lakes of Titan test.
> 
> This card is a "Hybrid card" I used an NZXT G10 Bracket and an NZXT Kraken x61 Cooler, The Memory/VRM fan is silent, so I left it at full speed.
> 
> Here are the OC results:
> Core is always at 2190 MHz
> 
> Memory was stable up to +825 MHz
> 
> +0 Memory Clock | Score: 41902 | 698 FPS
> 
> +100 Memory Clock | Score: 41544| 692 FPS
> 
> +200 Memory Clock | Score: 42325 | 705 FPS
> 
> +500 Memory Clock | Score: 42679 | 711 FPS
> 
> +600 Memory Clock | Score: 42864 | 714 FPS
> 
> +650 Memory Clock | Score: 42450 | 707 FPS
> 
> +700 Memory Clock | Score: 43164 | 719 FPS
> 
> +750 Memory Clock | Score: 42566 | 709 FPS
> 
> +800 Memory Clock | Score: 43139 | 718 FPS
> 
> +825 Memory Clock | Score: 43463 | 724 FPS (Stable in CS:GO, GTA V and Garrys mod Vsync off)
> 
> +850 Memory Clock | Unstable


Great work. Any other games you have tested?


----------



## Barterlos

Hi guys, my 1080 FE overclocks like a turd: only 2025 MHz at 1.093 V stable :/ Tested in Unigine Valley and the Heaven benchmark. I didn't win the silicon lottery, I guess.


----------



## carlhil2

Anyone attempt GTAV @4k with the 1080 getting acceptable frames @highest settings minus AA?


----------



## Barterlos

Quote:


> Originally Posted by *carlhil2*
> 
> Anyone attempt GTAV @4k with the 1080 getting acceptable frames @highest settings minus AA?


Yes, GTA V runs very well at 4K with very high settings, high grass, FXAA, 60 fps. The GTX 1080 only has a hard time in heavy grassy areas, but even there it can maintain 60 fps; elsewhere it actually downclocks in the city because GPU load is so low, even at 4K.







Crazy; at 1080p this GPU is probably consuming less power than a PS4.


----------



## carlhil2

Quote:


> Originally Posted by *Barterlos*
> 
> yes, GTA V runs very very good at 4k, with very high settings, high grass, fxaa, 60fps, gtx 1080 is having hard time only in heavy grassy areas but still can maintain 60fps, other than that is downclocking in city cuz very low on gpu even at 4k
> 
> 
> 
> 
> 
> 
> 
> crazy, prolly at 1080p this gpu is consuming less power than ps4


+1, Thanks for the feedback...


----------



## GreedyMuffin

Quote:


> Originally Posted by *xer0h0ur*
> 
> You know I probably should have specified that what I said was only with regard to the FE cards using the reference design. There just isn't anything significant there on the backside of the PCB that needs cooling.
> My water cooling loop is nothing fancy, its in a case that doesn't have a window so I didn't have to pretty things up per say. Click on the 1080 club in my sig and that should lead to 3 pictures I had put up before in this thread.
> 
> As for how to use the FE backplate with an EK block, just re-use the hexagonal screws that you're going to remove when removing the FE air cooler. By the way, if you're going to use needle nose pliers to remove those hexagonal screw, take it freakin slow and steady. Its beyond easy to break off a component, scratch the PCB or sever a trace on the PCB if you're spinning pliers willy nilly while unscrewing them. So after looking at my photo again it appears I had re-used 9 of those hexagonal screws. Either way you can see my finished product in that photo to see what I mean.


I borrowed a 4 mm socket from my father, so I've got that covered. I'm not using needle-nose pliers, not on a GPU that costs 885 USD here in Norway.

I saw the pictures; it looks very neat. I will definitely do the same since I already have the backplate. The screws you used, were those instead of the EK screws, or doesn't the block require all of the holes/screws? (If you know what I mean







).

Looks very good by the way. Thanks for the info. Rep!


----------



## xer0h0ur

My memory is a bit fuzzy these days, but I am nearly certain I did not leave any of the waterblock's screw locations unused. So it should be 9 hex screws, the 4 screws by the GPU die, and that one screw next to the PCI bracket.


----------



## GreedyMuffin

Quote:


> Originally Posted by *xer0h0ur*
> 
> My memory is a bit fuzzy these days but I am nearly certain that I did not leave any of the waterblock's locations for screws unused. So it should be 9 hex screws, the 4 screws by the GPU die and that one screw next to the PCI bracket.


Thanks!

If this is correct, it will be a joy to install.

Win-win: you don't need to purchase another backplate, and you can re-use your old FE backplate.

Will hopefully receive it by Friday. I can post a few pictures if others are interested.

Once again, thank you!


----------



## xer0h0ur

Photos are as always welcomed. I like seeing what other people do on their rigs. Sometimes it serves as inspiration and other times it sparks ideas.


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> kinda weird how HWinfo64 is reporting the maximum voltage applied was 1.094 while XtremeTuner+ is reporting 1.300v


Try using MSI Afterburner's hardware monitor for that... maybe it really is pushing only 1.09 V max.

We need to see some results with 1.2 V on water... 2.3 GHz might be possible.


----------



## juniordnz

Can we expect a huge improvement in overclocking capability once we get BIOS-tweaking software like we did with Maxwell?

I ask because, as it is now, most cards perform on par with each other, be it an EVGA with 6 power phases, a G1 with 8, an MSI with 10 or an FTW with 12...


----------



## xer0h0ur

Until we can reliably and consistently push the power limit and the GPU voltage, this really is a waiting game.


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> Try to use msi afterburner hardware monitor fot that.... Maybe its rly pushing only 1.09v max.
> 
> We need to see some results with 1.2v on water...2.3k is maybe possible


Just ordered the EVGA AIO; hopefully BIOS tools will be out by the time it's installed.


----------



## Barterlos

Quote:


> Originally Posted by *xer0h0ur*
> 
> Photos are as always welcomed. I like seeing what other people do on their rigs. Sometimes it serves as inspiration and other times it sparks ideas.


here you go







my beast!!







lol little bit of dust, need to clean that asap







btw that freezer cooling is amazing







4790K at 4.6 GHz and 1.208 V, and only 64°C max during Battlefield 4 64-player MP.


----------



## moustang

Quote:


> Originally Posted by *juniordnz*
> 
> Just saw a review of the Gaming X 1080 and the part where the VRMs are situated gets VERY hot. Like 82ºC. Wouldn't a backplate with good quality thermal pads help keeping it cool?


Do you have a link to that review?

I ask because, of all the third-party air-cooled AIB cards, the Gaming X has the largest and thickest heatsink. If it's getting that hot on the Gaming X, then almost every other card runs significantly hotter.


----------



## juniordnz

Quote:


> Originally Posted by *moustang*
> 
> Do you have a link to that review?
> 
> I ask because of all of the 3rd party, air cooled AIBs the Gaming X is the one with the largest and thickest heatsink. If it's getting that hot on the Gaming X then almost every other card is significantly hotter.


Sure, here it is.

See how the temperature is very high right where the VRM is located on the PCB? That may be because MSI was more focused on making a fashionable backplate than one that could also work as a cooling mechanism. The backplate is full of holes right where most of the heat is. If it had thermal pads in those locations and a solid plate above them instead, it would do a much better job of cooling the VRMs and everything else hot in that region (GPU and memory).


----------



## fireyfire

I am going to try more extreme memory OCing in different games; I just need some game suggestions.


----------



## fireyfire

What kind of rad/pump combo comes with one of those? Is it a Corsair H55? I still get into the 50s and 60s with a 140 mm AIO on mine...


----------



## cstarkey42

Just got my EVGA GTX 1080 FTW today. I can't get Precision XOC to keep a clock setting of +147 (it resets to +125), but at +125/+176 it hit 2138 MHz in BF4 at only 58 degrees on air, according to HWiNFO64. Its ASIC is 98.6 according to GPU-Z. This card is amazing!


----------



## cstarkey42

Sorry, I'm sure I'm one of the few on here who doesn't know but how do I switch bios on the FTW?

Thanks.

nm, google is my friend. Is it still the switch on the card?


----------



## xer0h0ur

Nothing can correctly read the ASIC value on Pascal GPUs right now. Don't know that anything will. Either way if your card can hit 2100 or more then you got yourself a good one.


----------



## juniordnz

Quote:


> Originally Posted by *cstarkey42*
> 
> Just got my EVGA gtx 1080 ftw today. I can't get Precision XOC to keep a setting of 147 for the clock (it resets to 125) but at +125/+176 it hit 2138 in BF4 at only 58 degrees on air, according to HWINFO64. It's ASIC is 98.6 according to GPU-Z. This card is amazing!


Wait, how did you get an ASIC reading if GPU-Z does not support ASIC on Pascal cards?

Or is it just mine that reads "ASIC not supported on this card"?


----------



## fireyfire

Earlier I posted about memory OC with heatsinks on the memory; here are the results with different benchmarks.
All of these are run at 3840x2160 resolution (4K).
Core clock speed: 2177 MHz
GTA V Max settings 0x MSAA:

+200 Memory 75.6 FPS
+600 Memory 78.2 FPS
+700 Memory 77.8 FPS
+800 Memory 79.3 FPS

CS:GO Max settings 8x MSAA No motion blur or FXAA:

+200 Memory 188.65
+600 Memory 186.53
+700 Memory 181.67
+800 Memory 173.58

(does not scale well with Memory OC)

Minecraft (Just for fun) Minimum settings, in a void world staring into nothing:

+200 Memory 6072 FPS
+600 Memory 6190 FPS
+700 Memory 6274 FPS
+800 Memory 6743 FPS

(actually scaled well with Memory OC!!!!!!)

-fire
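Those per-game numbers are more telling as percent change from the +200 run to the +800 run; GTA V and Minecraft gain, while CS:GO actually regresses with more memory offset. A quick sketch using the figures above:

```python
# fireyfire's game results: FPS at +200 and +800 memory offset (MHz),
# restated as percent change. CS:GO loses performance as the offset rises,
# consistent with GDDR5X error correction eating into real throughput.
fps = {
    "GTA V":     {200: 75.6,   800: 79.3},
    "CS:GO":     {200: 188.65, 800: 173.58},
    "Minecraft": {200: 6072,   800: 6743},
}

for game, runs in fps.items():
    delta = (runs[800] - runs[200]) / runs[200] * 100
    print(f"{game:<10} +200 -> +800: {delta:+.1f}%")
```

The lesson for tuning: a higher stable-looking offset isn't automatically faster, so check real frame rates at each step rather than just chasing the biggest number.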


----------



## Jpmboy

Quote:


> Originally Posted by *fireyfire*
> 
> Hi Guys, I just want to yell all of you about my experience with memory overclocking the Zotac GTX 1080 AMP! Edition card that I have. Warning! I have placed heatsinks on all of the RAM modules.
> Link to heatsinks used:
> *https://www.amazon.com/gp/product/B00637X42A/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1*
> They were secured with a generic thermal tape.
> 
> I used an artificial benchmark (I chose it because it was short *and it utilized almost all of the VRAM)* It is the MSI Kombustor Lakes of Titan test.
> 
> This card is a "Hybrid card" I used an NZXT G10 Bracket and an NZXT Kraken x61 Cooler, The Memory/VRM fan is silent, so I left it at full speed.
> 
> Here are the OC results:
> Core is always at 2190 MHz
> 
> Memory was stable up to +825 MHz
> 
> +0 Memory Clock | Score: 41902 | 698 FPS
> 
> +100 Memory Clock | Score: 41544| 692 FPS
> 
> +200 Memory Clock | Score: 42325 | 705 FPS
> 
> +500 Memory Clock | Score: 42679 | 711 FPS
> 
> +600 Memory Clock | Score: 42864 | 714 FPS
> 
> +650 Memory Clock | Score: 42450 | 707 FPS
> 
> +700 Memory Clock | Score: 43164 | 719 FPS
> 
> +750 Memory Clock | Score: 42566 | 709 FPS
> 
> +800 Memory Clock | Score: 43139 | 718 FPS
> 
> +825 Memory Clock | Score: 43463 | 724 FPS (Stable in CS:GO, GTA V and Garrys mod Vsync off)
> 
> +850 Memory Clock | Unstable


*Kombustor Lakes of Titan at 1440p uses only 545 MB of VRAM when I run it.*


----------



## SAFX

Just bagged an EVGA 1080 SC from newegg, $649, thank you auto-notify, God's greatest creation









So long 295x2, it was fun


----------



## cstarkey42

Quote:


> Originally Posted by *juniordnz*
> 
> Wait, how did you get an ASIC read if GPU-Z doest not support ASIC for Pascal cards?
> 
> Or is it just mine that reads "ASIC not supported on this card"?


I didn't realize it wasn't supported with Pascal. It did give me a value but it sounds like I shouldn't read too much into it. Still, I'm happy with the card I got so far.


----------



## juniordnz

Quote:


> Originally Posted by *cstarkey42*
> 
> I didn't realize it wasn't supported with Pascal. It did give me a value but it sounds like I shouldn't read too much into it. Still, I'm happy with the card I got so far.


You got a hell of a card... I was in love with the FTW, but the difference in price got me the MSI Armor. Still a great choice IMO. But the FTW is just pure awesomeness.


----------



## KillerBee33

369 Developer Driver if anyone wants to try
https://developer.nvidia.com/opengl-driver


----------



## CallsignVega

Man the cooler on this Gigabyte XTREME 1080 that I just got in rocks. Makes the FTW cooler look amateurish.


----------



## looniam

Quote:


> Originally Posted by *KillerBee33*
> 
> 369 Developer Driver if anyone wants to try
> https://developer.nvidia.com/opengl-driver


*word on the street* is it doesn't have the DPC latency fix.


----------



## carlhil2

Quote:


> Originally Posted by *fireyfire*
> 
> Earlier I posted about Memory OC with heatsinks on the memory, here are the results with different benchmarks
> All of these are run at 3840X2160 Resoloution (4K)
> Core clock speed: 2177 MHz
> GTA V Max settings 0x MSAA:
> 
> +200 Memory 75.6 FPS
> +600 Memory 78.2 FPS
> +700 Memory 77.8 FPS
> +800 Memory 79.3 FPS
> 
> CS:GO Max settings 8x MSAA No motion blur or FXAA:
> 
> +200 Memory 188.65
> +600 Memory 186.53
> +700 Memory 181.67
> +800 Memory 173.58
> 
> (does not scale well with Memory OC)
> 
> Minecraft (Just for fun) Minimum settings, in a void world staring into nothing:
> 
> +200 Memory 6072 FPS
> +600 Memory 6190 FPS
> +700 Memory 6274 FPS
> +800 Memory 6743 FPS
> 
> (actually scaled well with Memory OC!!!!!!)
> 
> -fire


Nice GTA V results, thanks for sharing...


----------



## nexxusty

Quote:


> Originally Posted by *Barterlos*
> 
> here you go
> 
> 
> 
> 
> 
> 
> 
> my beast!!
> 
> 
> 
> 
> 
> 
> 
> lol little bit of dust, need to clean that asap
> 
> 
> 
> 
> 
> 
> 
> btw that freezer cooling is amazing
> 
> 
> 
> 
> 
> 
> 
> 4790k 4.6ghz at 1.208v and only 64c max during battlefield4 mp 64players


Sounds like a 5ghz water-cooled chip.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Man the cooler on this Gigabyte XTREME 1080 that I just got in rocks. Makes the FTW cooler look amateurish.


Can't beat the looks of the FTW though.


----------



## Sheyster

Quote:


> Originally Posted by *CallsignVega*
> 
> Man the cooler on this Gigabyte XTREME 1080 that I just got in rocks. Makes the FTW cooler look amateurish.


What power target limit does that card's BIOS have?


----------



## x7007

Did anyone see these settings in Nvidia Inspector?

Maximum Frame Allowed
No info on this.
OpenGL only. Min value - 1, Max value - 9.

Maximum GPU Power
No info on this.
OpenGL only.

Memory Allocation Policy
This defines how workstation feature resource allocation is performed.
• As Needed (default): Resources for workstation features are allocated as needed resulting in the minimum amount of resource consumption. Feature activation or deactivation often causes mode-sets.
• Moderate pre-allocation: Resources for the first workstation feature activated are statically allocated at system boot and persist thereafter. This will use more GPU and system memory, but will prevent mode-sets when activating or deactivating a single feature. Invocation of additional workstation features will still cause mode-sets.
• Aggressive pre-allocation: Resources for all workstation features are statically allocated at system boot and persist thereafter. This will use the most GPU and system memory, but will prevent mode-sets when activating or deactivating all workstation features.

Nvidia Quality Upscaling
No info on this.
The unfriendly name for this option is 0x10444444, but the driver DLLs don't contain it. This means the option may only be available in debug drivers, or may have been removed entirely.

Enable Overlay
Enable overlay allows the use of OpenGL overlay planes in programs

Frame Rate Limiter 2 Control
Additional options for Frame Rate Limiter v2

Frame Rate Monitor
Some settings for NVIDIA GPS (GPU performance scale). Seems to be notebook-only.

Frame Rate Monitor Control
Additional options for Frame Rate Limiter

PowerThrottle
No info on this.
But, seems this setting
0x00000001 SET_POWER_THROTTLE_FOR_PCIe_COMPLIANCE_ON
limit power consumption from PCI-E slot to 75 watt.

SILK Smoothness
Silk reduces stutters in games caused by variable CPU or GPU workloads by smoothing out animation and presentation cadence using animation prediction and post render smoothing buffer.
• Off - Silk is disabled.
• Low - Moderate smoothing is enabled and most microstutter is eliminated.
• Medium - Many stutters and hitches are removed in typical games.
• High - More smoothing is applied and may result in observable input lag.
• Ultra - Maximum smoothing is applied and most stutters and hitches in games are eliminated. Lag may be unacceptable in some games.

Note: Selecting High or Ultra settings for silk can increase noticeable lag when playing, and may not be appropriate for first person shooters or competitive gaming.

Vsync - Behavior Flags
Flags for altering how the driver interprets VSYNC
0x00000001 - IGNORE_FLIPINTERVAL_MULTIPLE - Ignore flip interval when it is greater than 1. Usually, it is used on CPL half refresh rates.

What I'm interested in are the following: should we enable these settings, or at least the frame-rate ones?

Frame Rate Monitor
Some settings for nvidia gps(gpu performance scale). Seems, notebook only.

Frame Rate Monitor Control
Additional options for Frame Rate Limiter

Frame Rate Limiter 2 Control
Additional options for Frame Rate Limiter v2

SILK Smoothness
Silk reduces stutters in games caused by variable CPU or GPU workloads by smoothing out animation and presentation cadence using animation prediction and post render smoothing buffer.
• Off - Silk is disabled.
• Low - Moderate smoothing is enabled and most microstutter is eliminated.
• Medium - Many stutters and hitches are removed in typical games.
• High - More smoothing is applied and may result in observable input lag.
• Ultra - Maximum smoothing is applied and most stutters and hitches in games are eliminated. Lag may be unacceptable in some games.

Thanks to Guzz in guru3d forums http://forums.guru3d.com/showpost.php?p=5312177&postcount=3
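The PCIe-compliance note in that list is easier to reason about with the actual power budget: the PCIe spec allows up to 75 W from the slot, 75 W from a 6-pin connector, and 150 W from an 8-pin. A quick sketch (connector counts for the example cards are assumptions based on their spec sheets):

```python
# PCIe power budget per the spec: slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
# Drawing more than 75 W through the slot is what the
# SET_POWER_THROTTLE_FOR_PCIe_COMPLIANCE_ON flag appears to guard against.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(eight_pins=1))  # 225 W: single-8-pin reference 1080 FE
print(board_budget(eight_pins=2))  # 375 W: dual-8-pin cards like the FTW
```

That 225 W ceiling against the FE's 180 W TDP is why raising the power target only goes so far on the reference board, while dual-8-pin cards have far more in-spec headroom.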


----------



## Chatassys

I know we have all been waiting for a BIOS editor for a while now, but there is little information about it.
I am not sure how long it took for the BIOS editor to appear last time, but I don't think it took that long.
Do we know if anyone, maybe the creator of the previous BIOS editors, is actually working on a new version?
I would LOVE to support/donate to the cause, and I am sure a lot of people in here can spare a few bucks to support this development as well.


----------



## nexxusty

Quote:


> Originally Posted by *x7007*
> 
> Did anyone saw this settings in Nvidia Inspector ?
> 
> Maximum Frame Allowed
> No info on this.
> OpenGL only. Min value - 1, Max value - 9.
> 
> Maximum GPU Power
> No info on this.
> OpenGL only.
> 
> Memory Allocation Policy
> This defines how workstation feature resource allocation is performed.
> • As Needed (default): Resources for workstation features are allocated as needed resulting in the minimum amount of resource consumption. Feature activation or deactivation often causes mode-sets.
> • Moderate pre-allocation: Resources for the first workstation feature activated are statically allocated at system boot and persist thereafter. This will use more GPU and system memory, but will prevent mode-sets when activating or deactivating a single feature. Invocation of additional workstation features will still cause mode-sets.
> • Aggressive pre-allocation: Resources for all workstation features are statically allocated at system boot and persist thereafter. This will use the most GPU and system memory, but will prevent mode-sets when activating or deactivating all workstation features.
> 
> Nvidia Quality Upscaling
> No info on this.
> Unfriendly name for this option is 0x10444444, but driver dll's doesn't contain it. This mean, that this option may be available only on debug drivers or removed at all.
> 
> Enable Overlay
> Enable overlay allows the use of OpenGL overlay planes in programs
> 
> Frame Rate Limiter 2 Control
> Additional options for Frame Rate Limiter v2
> 
> Frame Rate Monitor
> Some settings for nvidia gps(gpu performance scale). Seems, notebook only.
> 
> Frame Rate Monitor Control
> Additional options for Frame Rate Limiter
> 
> PowerThrottle
> No info on this.
> But, seems this setting
> 0x00000001 SET_POWER_THROTTLE_FOR_PCIe_COMPLIANCE_ON
> limit power consumption from PCI-E slot to 75 watt.
> 
> SILK Smoothness
> Silk reduces stutters in games caused by variable CPU or GPU workloads by smoothing out animation and presentation cadence using animation prediction and post render smoothing buffer.
> • Off - Silk is disabled.
> • Low - Moderate smoothing is enabled and most microstutter is eliminated.
> • Medium - Many stutters and hitches are removed in typical games.
> • High - More smoothing is applied and may result in observable input lag.
> • Ultra - Maximum smoothing is applied and most stutters and hitches in games are eliminated. Lag may be unacceptable in some games.
> 
> Note: Selecting High or Ultra settings for silk can increase noticeable lag when playing, and may not be appropriate for first person shooters or competitive gaming.
> 
> Vsync - Behavior Flags
> Flags for altering how the driver interprets VSYNC
> 0x00000001 - IGNORE_FLIPINTERVAL_MULTIPLE - Ignore flip interval when it is greater than 1. Usually, it is used on CPL half refresh rates.
> 
> What I'm interested in are
> 
> Should we enable those settings or at least the Frame Rate
> 
> Frame Rate Monitor
> Some settings for nvidia gps(gpu performance scale). Seems, notebook only.
> 
> Frame Rate Monitor Control
> Additional options for Frame Rate Limiter
> 
> Frame Rate Limiter 2 Control
> Additional options for Frame Rate Limiter v2
> 
> SILK Smoothness
> Silk reduces stutters in games caused by variable CPU or GPU workloads by smoothing out animation and presentation cadence using animation prediction and post render smoothing buffer.
> • Off - Silk is disabled.
> • Low - Moderate smoothing is enabled and most microstutter is eliminated.
> • Medium - Many stutters and hitches are removed in typical games.
> • High - More smoothing is applied and may result in observable input lag.
> • Ultra - Maximum smoothing is applied and most stutters and hitches in games are eliminated. Lag may be unacceptable in some games.
> 
> Thanks to Guzz in guru3d forums http://forums.guru3d.com/showpost.php?p=5312177&postcount=3


Silk Smoothness eh... lol. Sounds like triple buffering to me.

Cool find though.


----------



## Josse

Hey guys. First-time poster from Norway here.

It seems like I really have lost the silicon lottery on a non-OC Strix :-(

I am only able to increase the base clock by 160, up to 1767 MHz (yielding a boost clock of 1894 MHz), before I get driver crashes in Firestrike; I can push Heaven a few MHz higher before I start to see crashes there as well.

This is in a well-ventilated case on air, and GPU temps are around 72°C when the driver crashes.
Increasing the voltage does not seem to matter, and I see the driver crashes whether I am overclocking the GPU memory or not.

This is on an [email protected] GHz.

Are there any tricks I can try to increase stability at higher frequencies?

Links to: Firestrike, GPU-Z


----------



## nexxusty

Quote:


> Originally Posted by *Josse*
> 
> Hey guys. First time poster from Norway here.
> 
> It seems like I really have lost the silicone lottery on a non-OC Strix :-(
> 
> I am only able to increase the base clock by 160, up to 1767 MHz (yielding a boost clock of 1894 MHz) before I am getting driver crashes in Firestrike, I can push Heaven a few MHz higher before I am starting to see crashes there as well.
> 
> This is in a well ventilated case on air, and GPU temps is around 72C when the driver crashes.
> Increasing the voltage does not seem to matter, and I am seeing the driver crashes whether I am overclocking the GPU memory or not.
> 
> This is on an [email protected] GHz.
> 
> Is there any tricks I can try in order to increase stability at higher frequencies?
> 
> Links to: Firestrike, GPU-Z


You haven't even mentioned increasing fan speed. That will help.

A lot. Keep the GPU under 70°C. Ideally... 60°C.


----------



## Whitechap3l

Quote:


> Originally Posted by *Josse*
> 
> Hey guys. First time poster from Norway here.
> 
> It seems like I really have lost the silicon lottery on a non-OC Strix :-(
> 
> I am only able to increase the base clock by 160, up to 1767 MHz (yielding a boost clock of 1894 MHz) before I get driver crashes in Firestrike. I can push Heaven a few MHz higher before I start to see crashes there as well.
> 
> This is in a well-ventilated case on air, and GPU temps are around 72C when the driver crashes.
> Increasing the voltage does not seem to matter, and I am seeing the driver crashes whether I am overclocking the GPU memory or not.
> 
> This is on an [email protected] GHz.
> 
> Are there any tricks I can try in order to increase stability at higher frequencies?
> 
> Links to: Firestrike, GPU-Z


Well, mine is coming today; I hope my non-OC is better ;P
But I don't understand why you have such problems... The non-OC Strix is the same card as the OC one, just with lower clocks? So I guess you should be able to get at least around 2050 out of the card with air cooling.
Mine is going under water next week...


----------



## x7007

So we have all the sync options.

What I want to know is what the best way to use them is, and what to do if the game doesn't support one, won't work with it, or worse, crashes instantly or after a while.

Some people say Maximum Pre-Rendered Frames = 1 is a must for the lowest input lag with vsync; we can all agree on that. When should I change it otherwise? Any game-specific settings or issues?

Should vsync be On in NVCP? If so, should I frame-limit using Nvidia Inspector, or RivaTuner/MSI Afterburner? And if I limit, what number should I choose for a 60 Hz TV screen? I usually play in 3D too, so I know that must be accounted for, since 3D causes input lag by itself. I usually use TriDef and sometimes Nvidia 3DTV, and using 3DTV forces vsync on always.

When can I use Fast Sync? In some specific games, or any game? Every game I tried it in had stuttering every couple of seconds, e.g. Darksiders 2, Bladestorm: Nightmare, The Technomancer, The Evil Within Complete Edition, but it really seems to work fine for Skyrim for some reason.

Adaptive vsync doesn't seem to work, or works badly, in some games. For example, I'm trying to get no tearing + no stuttering in Bladestorm: Nightmare: vsync On causes input lag, adaptive vsync doesn't seem to work well, and Fast Sync causes stuttering.
I tried frame-limiting to 58.6 and it didn't work well, or I just don't remember, but I just can't find the right settings.
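For reference, the frame-limit arithmetic here is simple enough to sketch. The idea behind capping slightly below refresh is to keep the render queue from filling, which is where vsync input lag comes from; the exact offset (here 1.4, giving the 58.6 tried above) is illustrative, not a recommendation:

```python
# Rough frame-limit math for a fixed-refresh display.
# Numbers are illustrative only.

def frame_budget_ms(refresh_hz: float) -> float:
    """Frame time budget at a given refresh rate, in milliseconds."""
    return 1000.0 / refresh_hz

def cap_below_refresh(refresh_hz: float, offset_hz: float = 1.4) -> float:
    """A limiter value just under the refresh rate (e.g. for RTSS/Inspector)."""
    return refresh_hz - offset_hz

print(round(frame_budget_ms(60), 3))                 # 16.667 ms per frame at 60 Hz
print(round(cap_below_refresh(60, offset_hz=1.4), 1))  # 58.6, the cap tried above
```

With G-Sync out of the picture (60 Hz TV), a cap a hair under refresh is the usual compromise between tearing and queue-induced lag.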


----------



## nexxusty

Quote:


> Originally Posted by *x7007*
> 
> So we have all the sync options.
> 
> What I want to know is what the best way to use them is, and what to do if the game doesn't support one, won't work with it, or worse, crashes instantly or after a while.
> 
> Some people say Maximum Pre-Rendered Frames = 1 is a must for the lowest input lag with vsync; we can all agree on that. When should I change it otherwise? Any game-specific settings or issues?
> 
> Should vsync be On in NVCP? If so, should I frame-limit using Nvidia Inspector, or RivaTuner/MSI Afterburner? And if I limit, what number should I choose for a 60 Hz TV screen? I usually play in 3D too, so I know that must be accounted for, since 3D causes input lag by itself. I usually use TriDef and sometimes Nvidia 3DTV, and using 3DTV forces vsync on always.
> 
> When can I use Fast Sync? In some specific games, or any game? Every game I tried it in had stuttering every couple of seconds, e.g. Darksiders 2, Bladestorm: Nightmare, The Technomancer, The Evil Within Complete Edition, but it really seems to work fine for Skyrim for some reason.
> 
> Adaptive vsync doesn't seem to work, or works badly, in some games. For example, I'm trying to get no tearing + no stuttering in Bladestorm: Nightmare: vsync On causes input lag, adaptive vsync doesn't seem to work well, and Fast Sync causes stuttering.
> I tried frame-limiting to 58.6 and it didn't work well, or I just don't remember, but I just can't find the right settings.


Gsync.


----------



## pez

Welp, with all this talk of G-sync, and ultrawide being the one thing besides G-sync that I haven't tried at this point, I pulled the trigger on an X34 Predator. PrimeNow has it in my area, so I'll get to take it home from work today and test it out. Here's to hoping G-sync and ultrawide are as great as I'm hearing they are.


----------



## schoolofmonkey

Quote:


> Originally Posted by *Josse*
> 
> Hey guys. First time poster from Norway here.
> 
> It seems like I really have lost the silicon lottery on a non-OC Strix :-(
> 
> I am only able to increase the base clock by 160, up to 1767 MHz (yielding a boost clock of 1894 MHz) before I get driver crashes in Firestrike. I can push Heaven a few MHz higher before I start to see crashes there as well.


I've got the same card. Mine boosts to 1911MHz out of the box with no overclock. Funny thing is, even if I try to overclock, it still only boosts to 1938MHz no matter how much extra clock or power I add, so I didn't bother..lol.
Guess it's a BIOS-set limit, because the temps don't even reach 70C with an overclock.

Honestly, going any higher for [email protected] gaming is pointless anyway; this card kills everything at that res.


----------



## toncij

Which of these multi-mode cards has a hardware (or other kind of) switch that doesn't need Windows and a special tool? Asus needs GPU Tweak II, so it's out of the question... The EVGA FTW can switch BIOSes, so can I make the OC one primary?


----------



## outofmyheadyo

If you had to choose between the Gigabyte G1 Gaming 1080 and the Gainward Phoenix 1080, which one would you go for? The Phoenix is 60eur cheaper and quieter, I believe. Could I hope for drastic OC improvements over the Phoenix? Personally I don't think so.
Not sure if it's gonna be air or water.


----------



## Barterlos

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I've got the same card. Mine boosts to 1911MHz out of the box with no overclock. Funny thing is, even if I try to overclock, it still only boosts to 1938MHz no matter how much extra clock or power I add, so I didn't bother..lol.
> Guess it's a BIOS-set limit, because the temps don't even reach 70C with an overclock.
> 
> Honestly, going any higher for [email protected] gaming is pointless anyway; this card kills everything at that res.


Holy smoke, I thought my FE was bad, because I can only hit 2025MHz stable; a little bit more and the drivers crash. Guys, did you realize that the boost bins are somehow fixed? Like, I set 2030MHz at 1.093V in MSI Afterburner and it's hitting 2038MHz.
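For what it's worth, that "fixed bins" behaviour is usually explained by GPU Boost stepping in roughly 13MHz increments, so a requested clock gets snapped to the nearest bin rather than applied exactly. A hypothetical sketch of that snapping (the 13MHz step and the 2038MHz anchor are assumptions for illustration, not anything read from the card):

```python
# Illustrative only: snap a requested core clock to a discrete boost bin.
# Pascal's GPU Boost is widely reported to step in ~13 MHz increments,
# which would explain asking for 2030 MHz and seeing 2038 MHz applied.

BIN_MHZ = 13  # assumed bin size

def snap_to_bin(requested_mhz: float, anchor_mhz: int = 2038) -> int:
    """Snap to the nearest bin on a ladder passing through anchor_mhz."""
    steps = round((requested_mhz - anchor_mhz) / BIN_MHZ)
    return int(anchor_mhz + steps * BIN_MHZ)

# A request between two bins lands on whichever bin is closer:
print(snap_to_bin(2040))  # -> 2038
print(snap_to_bin(2046))  # -> 2051
```

So an offset that doesn't land exactly on a bin just gets rounded to the nearest one, which is why small extra offsets often change nothing.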


----------



## Barterlos

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If you had to choose between the Gigabyte G1 Gaming 1080 and the Gainward Phoenix 1080, which one would you go for? The Phoenix is 50eur cheaper and quieter, I believe. Could I hope for drastic OC improvements over the Phoenix? Personally I don't think so.
> Not sure if it's gonna be air or water.


I heard that the G1 Gaming is pretty loud, so I'd say go for the Phoenix, but overclocking depends on the silicon lottery no matter which brand you choose.


----------



## pez

If you don't plan on going SLI with the G1, then it's perfectly fine. It will never exceed 75C and 60% fan should you have good airflow. When I was running a single G1, it never exceeded 54% fan speed in any gaming situation, and the noise was never greater than my case fans at a very moderate RPM (~1000RPM).


----------



## ChevChelios

My G1 is fine overall, but my next card (likely an 1180 Volta) will be either an MSI with Twin Frozr or a Gainward Phoenix with that double-fan cooler of theirs... or maybe a Zotac AMP, also with double fans (I'm going to go with bigger double fans over smaller triple fans).

I feel like those might be slight upgrades over the Gigabyte G1.

I'm definitely passing on Palit and Asus, and EVGA customs aren't really sold around here much.


----------



## outofmyheadyo

I'll go with the Gainward; quiet operation is the most important thing for me. 1000RPM case fans piss me off already.


----------



## KickAssCop

I haven't used the 1080 G1, but I had the 970 G1 cards and find it really sad that Gigabyte has not evolved the cooler design for the G1 line. All other brands have, and moved to larger fans, including the Asus Strix. I was faced with a choice between Gigabyte, Asus and EVGA and opted for the Strix. In your case I would go with the Gainward.


----------



## GreedyMuffin

I always go for the card that has waterblock support from release, i.e. reference design.


----------



## CallsignVega

Quote:


> Originally Posted by *boredgunner*
> 
> Can't beat the looks of the FTW though.


Looks are a distant third priority for me. As far as cooling and noise go, the XTREME clobbers the EVGA coolers.

Just to give an example with same clocks:

FTW = 65C and quite audible.
XTREME = 55C and 1/3rd the noise.

Quote:


> Originally Posted by *Sheyster*
> 
> What power target limit does that card's BIOS have?


150.


----------



## Barterlos

I read somewhere that EVGA didn't do a good job cooling the VRMs this time around; they have the hottest VRM section of any brand. But I don't know if that's 100% true.


----------



## Benjiw

Quote:


> Originally Posted by *CallsignVega*
> 
> Looks are a distant third priority for me. As far as cooling and noise go, the XTREME clobbers the EVGA coolers.
> 
> Just to give an example with same clocks:
> 
> FTW = 65C and quite audible.
> XTREME = 55C and 1/3rd the noise.
> 150.


Stick a waterblock on it and enjoy lower temps and silence...


----------



## GreedyMuffin

Quote:


> Originally Posted by *Benjiw*
> 
> Stick a waterblock on it and enjoy lower temps and silence...


^This!


----------



## schoolofmonkey

Quote:


> Originally Posted by *Barterlos*
> 
> Holy smoke, I thought my FE was bad, because I can only hit 2025MHz stable; a little bit more and the drivers crash. Guys, did you realize that the boost bins are somehow fixed? Like, I set 2030MHz at 1.093V in MSI Afterburner and it's hitting 2038MHz.


I just got it to 2000MHz; temps hit 67C during the Time Spy run. Made a 330-point difference.








Didn't touch the VRAM though.


----------



## KillerBee33

@CallsignVega

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1
or
https://www.amazon.com/EVGA-Hybrid-GeForce-Cooling-400-HY-0996-B1/dp/B00ZQ4PFX2/ref=sr_1_1?ie=UTF8&qid=1469618796&sr=8-1&keywords=evga+hybrid+kit


----------



## juniordnz

Do "L"-shaped 8-pin PCIe cables exist?

I ask because these cards have much wider PCBs than the Maxwell generation; don't know if anyone else noticed and is having the same problem as me. At least my MSI does.

I have a compact case (Corsair 300R) and I can't get the side panel with my 2 140mm fans in place because of the width of this card. If I remove both fans then I can get it in place, but it still puts a lot of pressure on the 8+6-pin connector.

Any thoughts?


----------



## Barterlos

Quote:


> Originally Posted by *juniordnz*
> 
> Do "L"-shaped 8-pin PCIe cables exist?
> 
> I ask because these cards have much wider PCBs than the Maxwell generation; don't know if anyone else noticed and is having the same problem as me. At least my MSI does.
> 
> I have a compact case (Corsair 300R) and I can't get the side panel with my 2 140mm fans in place because of the width of this card. If I remove both fans then I can get it in place, but it still puts a lot of pressure on the 8+6-pin connector.
> 
> Any thoughts?


I've never seen an "L" 8-pin cable for a GPU.


----------



## Whitechap3l

Quote:


> Originally Posted by *Barterlos*
> 
> I've never seen an "L" 8-pin cable for a GPU.


Never saw one either


----------



## MrTOOSHORT

Quote:


> Originally Posted by *juniordnz*
> 
> Do "L"-shaped 8-pin PCIe cables exist?
> 
> I ask because these cards have much wider PCBs than the Maxwell generation; don't know if anyone else noticed and is having the same problem as me. At least my MSI does.
> 
> I have a compact case (Corsair 300R) and I can't get the side panel with my 2 140mm fans in place because of the width of this card. If I remove both fans then I can get it in place, but it still puts a lot of pressure on the 8+6-pin connector.
> 
> Any thoughts?


Heard something earlier from EVGA, but it might be for their cards only. Not sure:

*https://www.techpowerup.com/forums/threads/evga-unveils-unique-l-shaped-vga-power-adapter.223105/*


----------



## Whitechap3l

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Heard something earlier from EVGA, but it might be for their cards only. Not sure:
> 
> *https://www.techpowerup.com/forums/threads/evga-unveils-unique-l-shaped-vga-power-adapter.223105/*


Brilliant idea from EVGA, btw. Should have been out for ages >.>
I think you can use it on other cards too, but first it may look stupid when you're using another card and have the EVGA logo on it, and second there are only two 8-pins, and it didn't look like you can split them into, say, one 8-pin and one 6-pin.


----------



## juniordnz

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Heard something earlier from EVGA, but it might be for their cards only. Not sure:
> 
> *https://www.techpowerup.com/forums/threads/evga-unveils-unique-l-shaped-vga-power-adapter.223105/*


That could work, if we could figure out which pins to snap to make it an 8+6. But I'm afraid it will cost me a kidney to get one of those trendy, shiny adapters from EVGA. I was looking forward to a normal L connector.

It's impossible that no one ever thought about doing this before.


----------



## moustang

Quote:


> Originally Posted by *juniordnz*
> 
> Sure, here it is.
> 
> *See how the temperature is very high right where the VRM is located on the PCB. That may be because MSI was more focused on making a fashion backplate than one that could also work as a cooling mechanism. The backplate is full of holes where most of the heat is. If it had thermal pads in those locations and a solid plate above instead, it would do a much better job of cooling those VRMs and everything hot in that region* (GPU and MEM).


I see a glaring flaw here. The heat at their M2 measuring point is NOT where the VRM is located. That's the GPU and VRAM. Their M4 measuring point is where the VRM is located, on the left side of their thermal pic where there are no holes in the backplate.

Here's the MSI GTX 1080 Gaming X PCB:










The VRM is on the right side of the SFC chokes. You'll notice that above and to the right of them are the power connectors.

Now let's turn the card over....










Now you see the power connectors on the bottom-right, which means the VRM are located under the dragon logo, where the backplate is solid.

This means their M4 measuring point is actually the VRM, which came in at just 59C.

Their M1 measuring point is directly under the GPU, and their M2 measuring point is an empty space between the GPU and VRAM. You can tell this by looking at the screw pattern in the backplate in their thermal pics. You can clearly see the 4 screw holes around the slots cut into the backplate. Again looking at the PCB from the front you can see that those 4 holes are surrounding the GPU and where the heatsink mounts to the card:










Their M2 measuring point is really the gap between the heatsink on the GPU and heat spreader on the VRAM. There is no VRM there.

Their M2 measuring point is just below and slightly to the right of the top-right screw hole for the heatsink in this pic. It's on the opposite side of the chokes from the VRM.


----------



## Barterlos

But M1 and M2 are pretty close to each other, and the temp difference is big. Very strange; maybe this heat comes from the VRM area, and you know heat spreads like butter.


----------



## boredgunner

Quote:


> Originally Posted by *pez*
> 
> Welp, with all this talk of G-sync, and ultrawide being the one thing besides G-sync that I haven't tried at this point, I pulled the trigger on an X34 Predator. PrimeNow has it in my area, so I'll get to take it home from work today and test it out. Here's to hoping G-sync and ultrawide are as great as I'm hearing they are.


No ULMB though. G-SYNC is an excellent crutch but ULMB (and the like) are the goal for LCD.


----------



## moustang

Quote:


> Originally Posted by *Barterlos*
> 
> But M1 and M2 are pretty close to each other, and the temp difference is big. Very strange; maybe this heat comes from the VRM area


It's not even remotely close to the VRM and the VRM is covered by an entirely separate heatsink. It's coming from a gap in the cooling between the GPU heatsink and the heatspreader on the VRAM.

If you look at the picture of the backplate that I posted you'll see the screws at the feet, back of the jaw, and top of the horn of the dragon logo. Those are the screws holding the VRM heatsink on. If you look at the picture of the bare PCB that I posted you can see those three screws are right next to the VRM.



----------



## juniordnz

I found it pretty odd too. I saw the review, looked at the naked PCB, and it got me thinking that the VRM wouldn't be there. M2 is most likely right below the 3 VRAM modules next to the GPU.

Anyway, 80 degrees is a bit hot, and I bet that if the backplate were solid over there with a nice thermal pad underneath, those temps would be much lower.

That's why I'm getting a solid backplate from EK and will make it touch the PCB with good-quality thermal pads almost everywhere on the back of the card. I have 145mm x 145mm of area to cover with the Arctic thermal pad.


----------



## Barterlos

Quote:


> Originally Posted by *moustang*
> 
> It's not even remotely close to the VRM and the VRM is covered by an entirely separate heatsink. It's coming from a gap in the cooling between the GPU heatsink and the heatspreader on the VRAM. It's simply detecting some radiant heat coming off the GPU.
> 
> If you look at the picture of the backplate that I posted you'll see the screws at the feet, back of the jaw, and top of the horn of the dragon logo. Those are the screws holding the VRM heatsink on. If you look at the picture of the bare PCB that I posted you can see those three screws are right next to the VRM.


That 80C is produced by something other than the GPU, maybe by the G5X memory modules.


----------



## Jpmboy

Amazing what a die shrink and more on-die controls do... very dense power section and lots of "open space" around the core and RAM ICs. That is a pretty PCB.


----------



## fat4l

Guys, how many volts/amps can the FE VRMs handle?


----------



## KillerBee33

Did more than a few runs to make sure this is right.








1.93V http://www.3dmark.com/3dm/13621380
Stock voltage 1.65 http://www.3dmark.com/3dm/13638712
Looks to me like adding voltage makes less sense.


----------



## KillerBee33

Double Post


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Did more than few runs to make sure this is right
> 
> 
> 
> 
> 
> 
> 
> 
> 1.93V http://www.3dmark.com/3dm/13621380
> Stock Voltage 1.65 http://www.3dmark.com/3dm/13638712
> looks to me like adding Voltage makes less sense .


Would you be so kind as to post a pic of your sensor tabs set at maximum value in GPU-Z?


----------



## yungtiger

I read through the posts and currently I'm debating between the Gigabyte G1 and Asus Strix. They're the only two I can find so my choices are unfortunately limited to these two. In regards to strictly performance out of the box I've heard these two are essentially the same through various online reviews. However I keep on reading about possible coil whine/noise issues for the G1 while the Strix has possible BIOS locking and temp issues? I've never owned either brand of these cards and was wondering if there's an opinion on which one is more preferred by the community or has less QC issues/better RMA customer service. (My last few cards have been EVGA or MSI.)


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Would you be so kind and post a pic of your sensor tabs set at maximum value on GPU-Z?


When I'm home I'll get the stock-voltage GPU-Z shot, but the tweaked 1.93 is here.

This was a 6 or 8 minute run, I think, and stable at 1.93.


----------



## juniordnz

Quote:


> Originally Posted by *yungtiger*
> 
> I read through the posts and currently I'm debating between the Gigabyte G1 and Asus Strix. They're the only two I can find so my choices are unfortunately limited to these two. In regards to strictly performance out of the box I've heard these two are essentially the same through various online reviews. However I keep on reading about possible coil whine/noise issues for the G1 while the Strix has possible BIOS locking and temp issues? I've never owned either brand of these cards and was wondering if there's an opinion on which one is more preferred by the community or has less QC issues/better RMA customer service. (My last few cards have been EVGA or MSI.)


I'd go for the STRIX. More power phases and *possibly* better overclocking and stability.
Quote:


> Originally Posted by *KillerBee33*
> 
> When home i'll get the Stock Vol. GPUZ but tweaked 1.93 is here


Geez, what a relief... when you said 1.93V I imagined the card under some crazy LN2 container in a lab in Siberia. That's 1.093V.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> geez, what a relief...when you said 1.93V I imagined the card under some crazy LN2 container in a lab in siberia. That's 1.093V


HEHE sorry, my mistake.








And yes, stock on this card is 1.065V.


----------



## GreedyMuffin

Yesterday my 1080 went up to 2100MHz at 1050mV; today it won't go over 1000mV?

What the heck?


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> HEHE sorry my mistake
> 
> 
> 
> 
> 
> 
> 
> 
> and yes stock on this card is 1.0650


How did you get it to 1.093V? Mine won't go higher than 1.062V. I imagined all these cards came with the same max TDP. Any way to keep it at a stable voltage under load? Maybe the high-performance power setting?

Great results, though...


----------



## fireyfire

Quote:


> Originally Posted by *KillerBee33*
> 
> 369 Developer Driver if anyone wants to try
> https://developer.nvidia.com/opengl-driver


I am running this driver right now and can confirm that there is no DPC fix, sadly :/ I am having trouble playing back some of the 4K videos I have recorded with ShadowPlay (stuttering and Windows Media Center crashing).


----------



## AlienPrime173

mine just came in the other day


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> How did you get it to 1.093V? Mine won't go higher than 1.062V. I imagined all this cards came with the same TDP max. Anyway to keep it at a stable voltage under load? Maybe high performance setting?
> 
> Great results though...


In Afterburner, reset, then:
1 - Max voltage and power
2 - Memory +500
3 - Apply
4 - Ctrl+F
5 - In the curve screen, keep holding Ctrl and raise ONLY the last pin on the right to around 2250
6 - Apply and check voltage








The curve should look like this.
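In other words, the edit leaves every point on the V/F curve alone except the rightmost (max-voltage) one. A toy sketch of what that does to the table (the point values here are made up for illustration, not real Afterburner data):

```python
# Made-up (voltage mV, frequency MHz) pairs standing in for the
# Afterburner V/F curve; only the last point gets raised, which is
# what Ctrl-dragging the rightmost pin in step 5 does.

curve = [(800, 1700), (900, 1850), (1000, 1950), (1093, 2050)]

def raise_last_point(points, new_freq_mhz):
    """Return the curve with only the final (max-voltage) point raised."""
    *head, (v_last, _) = points
    return head + [(v_last, new_freq_mhz)]

tuned = raise_last_point(curve, 2250)
print(tuned[-1])                  # -> (1093, 2250)
print(tuned[:-1] == curve[:-1])   # earlier points untouched -> True
```

Raising only the last pin is what pushes the card to request its top voltage bin under load without dragging the rest of the curve up.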


----------



## nexxusty

Quote:


> Originally Posted by *pez*
> 
> Welp, all this talk of G-sync and ultrawide being the one thing besides g-sync that I haven't tried at this point, I pull the trigger on a X34 Predator. PrimeNow has it in my area, so I'll get to take it home from work today and test it out. Here's to hoping g-sync and ultrawide are as great as I'm hearing they are
> 
> 
> 
> 
> 
> 
> 
> .


Oh pezzy my boy you're going to love it.

I cannot live without GSYNC. Not possible now. Lol.


----------



## nexxusty

Quote:


> Originally Posted by *AlienPrime173*
> 
> mine just came in the other day


Why.... WHY did you pay $530 for an EVGA Classified?

Want one for a good price? I have one barely used.... that board is not worth more than $250 on a good day. Paying $500+?

Naw dude.. naw.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> In Afterburner, reset, then:
> 1 - Max voltage and power
> 2 - Memory +500
> 3 - Apply
> 4 - Ctrl+F
> 5 - In the curve screen, keep holding Ctrl and raise ONLY the last pin on the right to around 2250
> 6 - Apply and check voltage
> 
> 
> 
> 
> 
> 
> 
> 
> The curve should look like this.


Thanks a lot, buddy. Will try as soon as I get home. So 1093mV seems to be the current voltage limit on these cards, right? Hopefully BIOS modding will get us further. Let's pray'n'wait.


----------



## CallsignVega

The Gigabyte XTREME I have only clocks to 2050MHz. What a shame, since it's such a good cooler. I may return it to Newegg, because that is technically under the stock boost of 2063MHz. Unless someone would still want the card for cost; I'll post in the for-sale section.


----------



## raidflex

Any waterblocks out yet for 1080 FTW?


----------



## nexxusty

Quote:


> Originally Posted by *raidflex*
> 
> Any waterblocks out yet for 1080 FTW?


Nope.

Should have bought an FE...


----------



## -terabyte-

Quote:


> Originally Posted by *raidflex*
> 
> Any waterblocks out yet for 1080 FTW?


I remember reading they were delayed to the end of August, but I'm not 100% sure. Take it with a grain of salt


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Thanks a lot, buddy. Will try as soon as I get home. So 1093mV seems to be the current voltage limit on these cards, right? Hopefully BIOS modding will get us further. Let's pray'n'wait.


That was my whole point today when I posted the two Firestrike runs:
stock voltage 1.065 scores better than 1.093.


----------



## raidflex

Quote:


> Originally Posted by *-terabyte-*
> 
> I remember reading they were delayed to the end of August, but I'm not 100% sure. Take it with a grain of salt


The FE didn't make any sense when I could get a better version for cheaper. Hopefully they will be out by the end of August; I figured that two months after release they would have been out already.

I am just happy to get off of SLI and back to a single card.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *juniordnz*
> 
> Thanks a lot, buddy. Will try as soon as I get home. So 1093mV seems to be the current voltage limit on these cards, right? Hopefully BIOS modding will get us further. Let's pray'n'wait.


Quote:


> Originally Posted by *KillerBee33*
> 
> In Afterburner, reset, then:
> 1 - Max voltage and power
> 2 - Memory +500
> 3 - Apply
> 4 - Ctrl+F
> 5 - In the curve screen, keep holding Ctrl and raise ONLY the last pin on the right to around 2250
> 6 - Apply and check voltage
> 
> 
> 
> 
> 
> 
> 
> 
> The curve should look like this.


Will this work OK on a G1 1080, please?


----------



## nexxusty

Quote:


> Originally Posted by *raidflex*
> 
> The FE didn't make any sense when I can get a better version for cheaper. Hopefully they will be out by the end of August, I figured two months after release they would have been out already.
> 
> I am just happy to get off of SLI and back to a single card.


There aren't better versions than the FE. They clock highest on wottur.

Paradigms of the past don't apply with the 1080 series. AIBs are not the way to go this time around.


----------



## ChaosAD

Just sold my MSI 390X and found a good deal on an Inno3D GTX 1080 iChill X3. What do you think of this card? Is it a good choice, or should I pick something else?


----------



## raidflex

Quote:


> Originally Posted by *nexxusty*
> 
> There aren't better versions than the FE. They clock highest on wottur.
> 
> Paradigms of the past don't apply with the 1080 series. AIB's are not the way to go this time around.


That would imply that NVIDIA is binning the FEs for better clocks; I find that hard to believe.


----------



## KillerBee33

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> This work OK on g1 1080 please?


Not sure about the Gigabyte, but it worked on a Galax HOF.


----------



## nexxusty

Quote:


> Originally Posted by *raidflex*
> 
> That would imply that NVIDIA is binning the FE's for better clocks, I find this hard to believe.


Well, I'm only going on what other forum members say. That is, however, how a group arrives at a consensus, no?

From what I've read, the overwhelming majority say the FE is best on wottur, followed by the Galax HOF.

I also wouldn't find NVIDIA binning FEs hard to believe at all. It's their flagship product.


----------



## raidflex

Quote:


> Originally Posted by *nexxusty*
> 
> Well, I'm only going on what other forum members say. That is, however, how a group arrives at a consensus, no?
> 
> From what I've read, the overwhelming majority say the FE is best on wottur, followed by the Galax HOF.
> 
> I also wouldn't find NVIDIA binning FEs hard to believe at all. It's their flagship product.


I could see binning the Titan, but the FE seems a stretch; I guess it's possible, though. At least with the FTW I know it is binned. Hopefully the second 8-pin will help with OC, but that remains to be seen.


----------



## nexxusty

Quote:


> Originally Posted by *raidflex*
> 
> I could see binning the Titan, but the FE seems a stretch but I guess its possible. At least with the FTW I know it is binned. Hopefully the second 8-pin will help with OC, but that is still to be seen so far.


Agreed there. Still a ton of speculation at this point.

Theoretically the extra power pins should allow for a better OC, combined with more phases than the FE.

We shall all see soon enough.


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> Just bagged an EVGA 1080 SC from newegg, $649, thank you auto-notify, God's greatest creation
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So long 295x2, it was fun


LOL, so long multi-GPU headaches and hello single-GPU microstutter. Hopefully you don't have the same problems with your 1080 that I am having.


----------



## ralphi59

Hi all,
I made a voltage curve with only 2 points:
one at 0.881V (+225),
one at 0.893V (+250).
With this I get 1898MHz, and less with the Boost 3.0 steps.
The power limit rarely goes higher than 80%.
Temp 69°C, fan 52% in the Valley benchmark.
I use EVGA Precision XOC 6.0.3.
Good night from France.


----------



## toncij

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, so long multi-GPU headaches and hello single-GPU microstutter. Hopefully you don't have the same problems with your 1080 that I am having.


Single-GPU stutter is highly unlikely. Are you sure the card is the problem?


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> Single-GPU stutter is highly unlikely. Are you sure the card is a problem?


Lol, pardon me?

I think you need to re-evaluate your statement. I can name hundreds of games that will stutter no matter what you do, single GPU or multi-GPU.


----------



## boredgunner

Quote:


> Originally Posted by *nexxusty*
> 
> I can name hundreds of games that will stutter no matter what you do. Single gpu or Multi GPU.


Every other Unreal Engine 3 game.


----------



## nexxusty

Quote:


> Originally Posted by *boredgunner*
> 
> Every other Unreal Engine 3 game.


Bored... you rock. I had that exact piece of ammo locked and loaded if needed.

That's at least 1000 games right there. Lol.

They all have issues. Except UT3 itself.


----------



## CallsignVega

Quote:


> Originally Posted by *nexxusty*
> 
> There aren't better versions than the FE. They clock highest on wottur.
> 
> Paradigms of the past don't apply with the 1080 series. AIB's are not the way to go this time around.


I've had three AIB cards now and none of them clocked as high as my FE. Although that FE cooler sucks so much you are pretty much required to at least put a hybrid kit on it. The point is fairly moot for me though as I'll be getting two Titan-XP's.


----------



## raidflex

Quote:


> Originally Posted by *nexxusty*
> 
> Bored... you rock. I had that exact piece of ammo locked and loaded.
> 
> That's at least 1000 games right there. Lol.
> 
> They all have issues. Except UT3 itself.


Still, SLI adds a lot more complication. SLI in general hasn't really improved much over the years, and I still find that many games compatible with SLI are poorly optimized for it. BF4 seemed to be the only game, at least of those I've played recently, that scaled very well and ran very smooth with SLI.


----------



## toncij

Quote:


> Originally Posted by *nexxusty*
> 
> Lol, pardon me?
> 
> I think you need to re-evaluate your statement. I can name hundreds of games that will stutter no matter what you do. Single gpu or Multi GPU.


Really? Are you sure? I usually run FCAT and other tests on many games but never experience this. What kind of stutter? Can you reproduce it?


----------



## nexxusty

Quote:


> Originally Posted by *raidflex*
> 
> Still SLI throws in much more complication. SLI in general has not really improved much over the years and I still find many games that are compatible with SLI are poorly optimized for it. BF4 seemed to be the only game, at least that I have played recently that scaled very well and ran very smooth with SLI.


SLi is a joke now. Benchmarks and epeen only.

There are people who can't wait for single card 4k who try to justify it. They're crazy. 2K is where it's at.

You're greedy if you "think" you need more for gaming. IMO.


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> Really? Are you sure? I usually run FCAT and other tests on many games but never experience this. What kind of stutter? Can you reproduce it?


Any UT3 engine game. Especially with PhysX.

I'm not saying it's a hardware issue. This is and always will be a driver/crap coding issue.

Streaming textures, to be exact. We have 2400 MB/s SSDs, 16 GB of RAM and 8 GB video cards as standard now. Hoping texture streaming will be gone for good.

Doesn't seem like it though. Even 2016 games still stream.


----------



## raidflex

Quote:


> Originally Posted by *nexxusty*
> 
> SLi is a joke now. Benchmarks and epeen only.
> 
> There are people who can't wait for single card 4k who try to justify it. They're crazy. 2K is where it's at.
> 
> You're greedy if you "think" you need more for gaming. IMO.


Yeah, I wouldn't go to 4K even if I could now. I wouldn't replace my 34in 3440x1440 ultrawide; I like the good balance between resolution and performance. Volta may be the first generation that is actually suitable for 4K on a single card.


----------



## nexxusty

Quote:


> Originally Posted by *raidflex*
> 
> Yeah I wouldn't go to 4k, even if I could now. I wouldn't replace my 34in 3440x1440 ultra wide, I like the good balance between resolution and performance. Volta maybe the first generation that is actually suitable for 4k with a single card.


My thoughts almost exactly.

Specifically your thoughts on Volta. I've said before a few times, Volta (Hopefully) = 4k.

I'll need more than 60fps though. As long as it can hold 90ish in most games at 4k, I'll bite. GSYNC and 144hz are a part of me now.

I will not leave them behind. Heh. Easily the biggest GSYNC fan boy you've ever seen <--.


----------



## toncij

You're probably a 144Hz fan, rather than a G-Sync one.


----------



## xer0h0ur

Quote:


> Originally Posted by *toncij*
> 
> Single-GPU stutter is highly unlikely. Are you sure the card is a problem?


Oh trust me. 20-30 millisecond frametime spikes in CSGO aren't a figment of my imagination. I don't even notice frametime spikes or microstutter in anything else I have tried. Dying Light is fine, Tomb Raider is fine, Thief is fine. It's just CSGO microstuttering for me every 20 to 40 seconds. Problem is that's the only game I have any real passion for. So here I sit, patiently awaiting driver improvements and a legitimate DPC latency fix instead of this hotfix that only kinda sorta solved the issue.

I ditched a 295X2 and 290X because I believed the FCAT results on the 1080's reviews.
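Spikes like these are easy to pick out of a frametime log (captured with FCAT, PresentMon, or similar); a minimal sketch of a detector, with the window and threshold chosen arbitrarily:

```python
from statistics import median

def find_spikes(frametimes_ms, window=30, factor=1.5):
    """Flag frames whose frametime exceeds `factor` times the median of
    the preceding `window` frames -- a crude microstutter detector."""
    spikes = []
    for i, ft in enumerate(frametimes_ms):
        history = frametimes_ms[max(0, i - window):i]
        if history and ft > factor * median(history):
            spikes.append(i)
    return spikes

# A steady ~60 FPS trace (16.7 ms) with two injected 30 ms spikes,
# like the CSGO hitches described above.
trace = [16.7] * 100
trace[40] = trace[80] = 30.0
print(find_spikes(trace))  # → [40, 80]
```

Using a rolling median rather than the average keeps one spike from masking the next, so a 30 ms frame stands out even in an otherwise smooth run.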


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> You're probably a 144Hz fan, rather than a G-Sync one.


144hz first for sure.

You have that right. Lol!
Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh trust me. 20-30 millisecond frametime spikes in CSGO aren't a figment of my imagination. I don't even notice frametime spikes or microstutter in anything else I have tried. Dying Light is fine, Tombraider is fine, Thief is fine. Its just CSGO microstuttering for me every 20 to 40 seconds. Problem is that is the only game I have any real passion for. So here I sit, patiently awaiting driver improvements and a legitimate DPC latency fix instead of this hotfix that only kinda sorta solved the issue.
> 
> I ditched a 295X2 and 290X because I believed the FCAT results on the 1080's reviews.


The man speaks the truth. Every game he mentioned that works well, does.

All good engines with no discernible frame time issues. Especially Dying Light and Tomb Raider (ROTR too). Always butter smooth, never a stutter.

If a game stutters, I don't play it. I've beaten Tomb Raider and almost finished Dying Light. If you knew me personally, you'd know just from that that those games have great engines.


----------



## xer0h0ur

Man, who isn't? I bought a 1440p 144Hz FreeSync monitor and I had no intention of ever even using FreeSync on it. I got it because it was a 1440p 144Hz monitor that I could tweak to give me solid FPS performance.


----------



## raidflex

All I want is an OLED 4k Gsync 100Hz monitor at 30in+ and I will be happy.


----------



## nexxusty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Man who isn't. I bought a 1440p 144Hz Freesync monitor and I had no intention of ever even using Freesync on it. I got it because it was a 1440p 144Hz monitor that I could tweak to give me solid FPS performance.


If I had to choose, I would always choose 144hz over GSYNC.

Those 4k 60Hz GSYNC monitors are JUNK. Anything 60hz/fps is junk.....

Quote:


> Originally Posted by *raidflex*
> 
> All I want is an OLED 4k Gsync 100Hz monitor at 30in+ and I will be happy.


Hehe, holding out for OLED are we? Such a slow technology. They're having a hard time making larger screens that are reliable.

AFAIK anyway.


----------



## raidflex

Quote:


> Originally Posted by *nexxusty*
> 
> If I had to choose, I would always choose 144hz over GSYNC.
> 
> Those 4k 60Hz GSYNC monitors are JUNK. Anything 60hz/fps is junk.....
> Hehe, holding out for OLED are we? Such a slow technology. They're having a hard time making larger screens that are reliable.
> 
> AFAIK anyway.


While I agree 144Hz is nice, FPS can vary, and that is where G-Sync comes in. But I would still want a monitor with at least 100Hz. I have been waiting for OLED for 10 years; it sucks that development and manufacturing have been so slow. Nothing beats the contrast of an OLED panel though.


----------



## ikjadoon

Quote:


> Originally Posted by *nexxusty*
> 
> If I had to choose, I would always choose 144hz over GSYNC.
> 
> Those 4k 60Hz GSYNC monitors are JUNK. Anything 60hz/fps is junk.....
> Hehe, holding out for OLED are we? Such a slow technology. They're having a hard time making larger screens that are reliable.
> 
> AFAIK anyway.


Well, OLED is maybe kind of coming this year. The Dell UP3017Q is the 30" 4K OLED, possibly with 120Hz (I've heard conflicting info). But, it's been delayed until Q3/Q4 this year.







And, well, haha, it's also $5,000.


----------



## KillerBee33

You can set any Screen Size Virtually


----------



## toncij

Quote:


> Originally Posted by *xer0h0ur*
> 
> Man who isn't. I bought a 1440p 144Hz Freesync monitor and I had no intention of ever even using Freesync on it. I got it because it was a 1440p 144Hz monitor that I could tweak to give me solid FPS performance.


I set game details at a level where I get 120 or 144 as a minimum. That way I have the best experience and don't bother with anySync.








Quote:


> Originally Posted by *raidflex*
> 
> All I want is an OLED 4k Gsync 100Hz monitor at 30in+ and I will be happy.


Quote:


> Originally Posted by *ikjadoon*
> 
> Well, OLED is maybe kind of coming this year. The Dell UP3017Q is the 30" 4K OLED, possibly with 120Hz (I've heard conflicting info). But, it's been delayed until Q3/Q4 this year.
> 
> 
> 
> 
> 
> 
> 
> And, well, haha, it's also $5,000.


Unfortunately, that Dell is unusable. DP 1.2, or USB-C, and we don't have a USB-C GPU.









Oh, and that stutter, are you sure it's not related to your drive? I've been using SSDs for years now, always the fastest, so I may have missed any stutter during the streaming part...


----------



## ikjadoon

Quote:


> Originally Posted by *toncij*
> 
> Unfortunatelly, that Dell is unusable. DP1.2 - or USB-C and we don't have a USB-C GPU.


Uh, what? lol, just get a DisplayPort to type-C adapter, dude.







If you can buy a $5,000 monitor, I'm sure a $30 cable won't hurt.

http://www.monoprice.com/product?p_id=12908

Actually, it's confirmed 4K @ 120Hz..


----------



## fat4l

Ok guys....I'm getting ready....Slowly









Cooling:
EK 1080 Fullcover Waterblock - Plexi Nickel
EK 1080 Backplate - Nickel
EK Plexi Terminal

Paste:
Thermal Grizzly Conductonaut - Liquid Metal, 73 W/mK
Thermal Grizzly Kryonaut - 12.5 W/mK

Pads:
Alphacool Eisschicht thermal pad - 17 W/mK, 0.5mm - Sarcon XR-m
Alphacool Eisschicht thermal pad - 17 W/mK, 1.0mm - Sarcon XR-m

I'm hoping to get <40C on load and very low VRM temps as well!


----------



## raidflex

Quote:


> Originally Posted by *fat4l*
> 
> Ok guys....I'm getting ready....Slowly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cooling:
> EK 1080 Fullcover Waterblock - Plexi Nickel
> EK 1080 Backplate - Nickel
> EK Plexi Terminal
> 
> Paste:
> Thermal Grizzly Conductonaut - Liquid Metal 73W/mK
> Thermal Grizzly Kryonaut - 12,5 W/mk
> 
> Pads:
> Alphacool Eisschicht thermal pad - 17W/mK 0,5mm - Sarcon XR-m
> Alphacool Eisschicht thermal pad - 17W/mK 1.0mm - Sarcon XR-m
> 
> I'm hoping to get <40C on load and very low VRM temps as well!


Curious what temps you end up with. I am hoping to get below 40C also.


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> I put game details at such level that I get 120 or 144 as a minimum. That way I have the best experience and don't bother about anySync.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unfortunatelly, that Dell is unusable. DP1.2 - or USB-C and we don't have a USB-C GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh and that stutter, are you sure it's not related to your drive? I'm using SSDs for years now, always the fastest so I may have missed the stutter during streaming part...


Me?

I have a Samsung SM951 512GB... I've also owned many, many SSDs. Mechanical HDDs have no place in a modern gaming system.








Quote:


> Originally Posted by *fat4l*
> 
> Ok guys....I'm getting ready....Slowly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cooling:
> EK 1080 Fullcover Waterblock - Plexi Nickel
> EK 1080 Backplate - Nickel
> EK Plexi Terminal
> 
> Paste:
> Thermal Grizzly Conductonaut - Liquid Metal 73W/mK
> Thermal Grizzly Kryonaut - 12,5 W/mk
> 
> Pads:
> Alphacool Eisschicht thermal pad - 17W/mK 0,5mm - Sarcon XR-m
> Alphacool Eisschicht thermal pad - 17W/mK 1.0mm - Sarcon XR-m
> 
> I'm hoping to get <40C on load and very low VRM temps as well!


Oh, nice bro!

Let's see what you get. Interested.


----------



## CallsignVega

Quote:


> Originally Posted by *ikjadoon*
> 
> Uh, what? lol, just get a DisplayPort to type-C adapter, dude.
> 
> 
> 
> 
> 
> 
> 
> If you can buy a $5,000 monitor, I'm sure a $30 cable won't hurt.
> 
> http://www.monoprice.com/product?p_id=12908
> 
> Actually, it's confirmed 4K @ 120Hz..


That will only work at 4K 30Hz. Can't even do 4K at 60 Hz let alone 120 Hz.
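The arithmetic behind this dispute is easy to check: at 8 bits per channel, 4K@120Hz needs more than DP 1.2's effective bandwidth but fits (barely) within DP 1.3's. A rough sketch counting active pixels only (real links also carry blanking overhead, so the margin is even tighter):

```python
def required_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed active-pixel data rate in Gbit/s (ignores blanking)."""
    return width * height * hz * bits_per_pixel / 1e9

# Published effective link rates, 4 lanes, after 8b/10b coding
DP12_EFFECTIVE_GBPS = 17.28  # HBR2
DP13_EFFECTIVE_GBPS = 25.92  # HBR3

need = required_gbps(3840, 2160, 120)
print(round(need, 1))               # → 23.9
print(need <= DP12_EFFECTIVE_GBPS)  # → False: DP 1.2 falls short
print(need <= DP13_EFFECTIVE_GBPS)  # → True, though blanking makes it tight
```

So no passive adapter can conjure 4K@120 out of a DP 1.2 source; the source itself has to drive HBR3 (or compress/split the signal).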


----------



## sherlock

Quote:


> Originally Posted by *CallsignVega*
> 
> That will only work at 4K 30Hz. Can't even do 4K at 60 Hz let alone 120 Hz.


The review that said it only works at 4K 30Hz used a 12" MacBook, which can only do 4K 30Hz because of Core M iGPU limitations.

Or just get this one, which is confirmed USB 3.1 Type-C to DP 1.2 and half the price:

https://www.amazon.com/Adapter-VicTsing-DisplayPort-ChromeBook-AiO-Black/dp/B0191PRS72/ref=sr_1_2?ie=UTF8&qid=1469648538&sr=8-2&keywords=usb+c+to+display+port+1.2


----------



## Rhadamanthis

Guys, for members that have a Founders Edition: are the stock thermal pads 1mm thick for the memory and VRM?


----------



## ikjadoon

Quote:


> Originally Posted by *CallsignVega*
> 
> That will only work at 4K 30Hz. Can't even do 4K at 60 Hz let alone 120 Hz.


Err, you're talking about the _literal_ cable I linked?
Quote:


> Originally Posted by *VESA*
> Like USB, DisplayPort uses a packetized data structure and differential AC-Coupled signal "lanes" that carry high speed data with an embedded clock. *This allows the same electrical circuits and cables to carry either SuperSpeed USB data, at up to 10 Gbps per lane, or DisplayPort, at up to 8.1 Gbps per lane, as defined in the new DisplayPort 1.3 Standard*. *Early implementations of DisplayPort Alt Mode USB Type-C devices will likely use existing DisplayPort 1.2a capabilities that support up to 5.4 Gbps per lane. Using 5.4 Gbps across all four high-speed lanes will support up to 4K (4096 x 2160) display resolutions at a 60Hz frame rate with up to 30-bit color.*


However, it's crazy to think that an appropriate cabling/GPU setup will not be available. Every thread about this OLED monitor seems to think, "Oh, wow. Dell screwed the pooch on that $5,000 monitor." Give them a _little_ credit.

....or, have you discovered something that Dell hasn't even imagined yet?


----------



## Jpmboy

^^ 4K @ 120Hz does not yet have a GPU transcoder that can push that signal... through a cable that does not yet exist, AFAIK.

Quote:


> Originally Posted by *fat4l*
> 
> Ok guys....I'm getting ready....Slowly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cooling:
> EK 1080 Fullcover Waterblock - Plexi Nickel
> EK 1080 Backplate - Nickel
> EK Plexi Terminal
> 
> Paste:
> Thermal Grizzly Conductonaut - Liquid Metal 73W/mK
> Thermal Grizzly Kryonaut - 12,5 W/mk
> Pads:
> Alphacool Eisschicht thermal pad - 17W/mK 0,5mm - Sarcon XR-m
> Alphacool Eisschicht thermal pad - 17W/mK 1.0mm - Sarcon XR-m
> I'm hoping to get <40C on load and very low VRM temps as well!


Top-line (ex-)Fuji pads (careful, they are more like a putty than a pad). Where are you putting the liquid metal? Just an FYI - the block mount tolerances make using a liquid metal a real challenge, since it only really works well when the mount surfaces are perfect. The Kryonaut on the GPU is more tolerant of imperfections. I'd try that first and, if the temps are not to your liking, go metal.
My experience using an EK uniblock on the 1080 with Kryonaut (or Gelid Extreme, PK-3, etc.) is that GPU temp will not be a problem, and there is no clock benefit between 30C and 45C. Hook up a water chiller and cool it to 5-10C and it will run higher clocks... for us "ambient" players, anyway.


----------



## toncij

Quote:


> Originally Posted by *ikjadoon*
> 
> Uh, what? lol, just get a DisplayPort to type-C adapter, dude.
> 
> 
> 
> 
> 
> 
> 
> If you can buy a $5,000 monitor, I'm sure a $30 cable won't hurt.
> 
> http://www.monoprice.com/product?p_id=12908
> 
> Actually, it's confirmed 4K @ 120Hz..


Sorry to inform you, but no cable will make your DP 1.2 run 4K@120Hz. No cable.


----------



## ikjadoon

Quote:


> Originally Posted by *toncij*
> 
> Sorry to inform you, but no cable will make your DP 1.2 run 4K@120Hz. No cable.


Yes, maybe *that* exact cable will not work. Read the quote I pulled from VESA; that cable (if it is limited to 5Gbps) is the *early implementation*.

Do you all seriously think Dell is launching a $5,000 monitor that won't be compatible with the GTX 1080? Am I seriously hearing that? Let me know. I think I fell into crazy land, lol.


----------



## toncij

Quote:


> Originally Posted by *ikjadoon*
> 
> Yes, maybe *that* exact cable will not work. Read the quote I pulled from VESA; that cable (if it is limited to 5Gbps) is the *early implementation*.
> 
> Do you all seriously think Dell is launching a $5,000 monitor that won't be compatible with the GTX 1080? Am I seriously hearing that? Let me know. I think I fell into crazy land, lol.


Let's hope you're right and we'll plug it in.


----------



## nexxusty

Quote:


> Originally Posted by *ikjadoon*
> 
> Yes, maybe *that* exact cable will not work. Read the quote I pulled from VESA; that cable (if it is limited to 5Gbps) is the *early implementation*.
> 
> Do you all seriously think Dell is launching a $5,000 monitor that won't be compatible with the GTX 1080? Am I seriously hearing that? Let me know. I think I fell into crazy land, lol.


DP 1.3 has been out for awhile with 1.4 on the horizon.

AFAIK the GTX 1080 has DP 1.2a certified ports; they're apparently DP 1.3/1.4 "Ready", whatever the hell that means.

So atm...Yes. You are hearing that.


----------



## ikjadoon

Quote:


> Originally Posted by *toncij*
> 
> Let's hope you're right and we'll plug it in.


LOL. Yeah, right.









You all should all call Dell and inform them of their huge engineering mistake. They completely forgot about cables and DP 1.3. Man, Dell should consult OCN members before they make monitors. We can teach them a thing or two!

Quote:


> Originally Posted by *nexxusty*
> 
> DP 1.3 has been out for awhile with 1.4 on the horizon.
> 
> AFAIK the GTX 1080 has DP 1.2a ports certified ports, they're apparently DP 1.3/1.4 "Ready" whatever the hell that means.
> 
> So atm...Yes. You are hearing that.


Why is "at the moment" relevant for an unreleased product?









You're right; it's only certified for DP1.2a. I think DP 1.3/1.4 are only "Ready"....because I haven't seen a single DP1.3/1.4 monitor hit stores yet. Have you?


----------



## nexxusty

Quote:


> Originally Posted by *ikjadoon*
> 
> LOL. Yeah, right.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You all should all call Dell and inform them of their huge engineering mistake. They completely forgot about cables and DP 1.3. Man, Dell should consult OCN members before they make monitors. We can teach them a thing or two!
> 
> Why is "at the moment" relevant for an unreleased product?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're right; it's only certified for DP1.2a. I think DP 1.3/1.4 are only "Ready"....because I haven't seen a single DP1.3/1.4 monitor hit stores yet. Have you?


Nope. "Ready" could very well mean exactly this.

I wouldn't discount it.

It's relevant, at least to me because I can't touch it.


----------



## ikjadoon

Quote:


> Originally Posted by *nexxusty*
> 
> Nope. "Ready" could very well mean exactly this.
> 
> I wouldn't discount it.


Sure, sure.







Let's all wait for it to release. We don't know anything until it gets in our hands. I completely agree.

The only reason I started this discussion is because of this comment:
Quote:


> Originally Posted by *toncij*
> 
> Unfortunatelly, that Dell is unusable. DP1.2 - or USB-C and we don't have a USB-C GPU.


Unusable? At worst, you'll have to use two of the "DP 1.2a" ports from a GTX 1080. Nowhere _near_ "unusable".

If that's the bar... everything could be unusable. Nobody buy a graphics card now: we don't know if it will support any games released after today! Nobody buy a car; maybe tomorrow's gasoline will suddenly be incompatible. And definitely don't buy anything that uses a wall outlet: for all we know, new electrical standards could be released tomorrow and we'd all be screwed.

All I'm saying....let's give Dell slightly more credit than "unusable"....before they even release it.


----------



## toncij

That display has only DP 1.2. Not sure if it can work that way.
Quote:


> Originally Posted by *ikjadoon*


We're not sure whether the Dell monitor's USB-C can communicate with the DP 1.3/1.4-ready ports on the 1080. That's all. Until we see it, we won't know. If we ever do. That monitor is long past its announcement date.


----------



## dubldwn

Well after ripping on FE for weeks I got one. Couldn't wait anymore but it sounds like you fellas are saying they have some of the higher clocks.

Got a 4mm socket (with a whole set) from Harbor Freight for $5.

I had to use a weird jeweler screwdriver on those tiny screws.

EK block installed. I did breathe on the acrylic, but luckily it didn't crack. Benching tonight.

First all-stock run: 1911 MHz / 40C.


----------



## ikjadoon

Quote:


> Originally Posted by *toncij*
> 
> That display has only a DP1.2. Not sure if it can work that way
> We're not sure if Dell monitor's USB-C can communicate with DP 1.3/1.4 ports on 1080. That's all. Until we see, we won't know. If we ever do. That monitor is long past announcement date.


Did you read the link I sent? It has DP 1.2, but also a type-C port. It would be crazy..._right?_....crazy if Dell released a monitor that didn't have an input that worked. If it's 4K @ 120Hz...and it has a type-C port....I think it's _very_ safe to assume it has DP 1.3 support. I didn't know we were all so worried about these things. Maybe OCN users are far more anxious than I expected.

And it's not just this thread....every thread about this monitor has people up in arms, nearly with pitchforks, that "OMG. It only has DP 1.2. How could Dell have been so stupid?"

It'd be like releasing the 5K monitors with just a VGA port. That's not...nobody does that. No company is that asinine. Right? Have I been missing this huge trend of monitors that don't have appropriate inputs? Are you all getting screwed over by Dell's monitor inputs? I just...I don't know. Maybe I've been unaware.

---

Until we see it, sure. Just like we won't know if Battlefield 1 will work on Pascal GPUs, right? It's in alpha and who knows... maybe EA/DICE will go crazy and forget to add support for Pascal GPUs. We won't ever know until the release date. Everybody, if you want to play BF1, don't buy a computer yet. You need to wait until the game is released and we can confirm it will be compatible with DDR4, Pascal, and, heck, even Windows 10.

/s

I'm just having a laugh. I get it. Type-C is new, people are worried, we've never had to worry about this thing before. Sure. Sure. I just think it's hilarious that it's Dell you guys are worried about, on their $5,000 OLED monitor. Is this really where we'll see type-C trip up? I expect it'll happen....just not on this monitor, of all type-C devices that will come into existence.

Agreed. It's way late now. But, actually, most of the OLED devices announced at CES were "delayed".







I think just Samsung's was on time, but they're lucky enough to make their own panels, haha.

I've heard rumblings of Q3 or Q4 this year, but who knows? Dell can write whatever they want....


----------



## karelbastos

People, what do you think?

I'm running 2x GTX 1080 ZOTAC FE.

Both OC'd at:

MAX VOLTAGE
MAX POWER LIMIT
+165 CORE
+500 VRAM
100% FAN

I'm getting 2050 MHz, 2025 MHz most of the time, because the temps go up to 62-67C.

Is that a good result for the FE version?

Today I bought two EVGA HYBRID coolers to mount on my two 1080s:

https://www.amazon.com/gp/product/B00ZQ4PFX2/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

I saw other users using this cooler on a GTX 1080, and it fits fine.

Maybe with the water cooler I can reach 2100-2150 MHz?

Or is that impossible on an FE 1080?

Thanks...


----------



## xer0h0ur

I have already seen one of my friends rage hard over his 4K Dell monitor's problems. I wouldn't touch a Dell monitor with a 10 foot pole. Especially not something new that Dell is first to market with.


----------



## GreedyMuffin

Hoping for 2150 on my FE. If it can do 2100/2125 on air, maybe 25 extra will be possible.


----------



## toncij

Quote:


> Originally Posted by *ikjadoon*
> 
> ...


You need to Netflix & chill dude.


----------



## xer0h0ur

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Hoping for 2150 on my FE. If it can do 2100/2125 on air, maybe 25 extra will be possible.


Just make sure you're chasing performance instead of MHz because a few people have observed and reported that despite being able to push higher clocks with stability, their Firestrike scores went down.


----------



## CallsignVega

Quote:


> Originally Posted by *sherlock*
> 
> The review that said it only work at 4K 30Hz is using a 12" Macbook that only can only do 4K 30Hz because Core M iGP limitations.
> 
> or just get this which is confirmed USB 3.1 C to DP 1.2 and half the price
> 
> https://www.amazon.com/Adapter-VicTsing-DisplayPort-ChromeBook-AiO-Black/dp/B0191PRS72/ref=sr_1_2?ie=UTF8&qid=1469648538&sr=8-2&keywords=usb+c+to+display+port+1.2


That will NOT work with the new Dell at 120 Hz 4K. The whole reason Dell went with Thunderbolt 3 is that there are no DP 1.3/1.4 TCon chips out there yet. That's why we've seen zero DP 1.3/1.4 monitors.


----------



## ikjadoon

Quote:


> Originally Posted by *toncij*
> 
> You need to Netflix & chill dude.


Bahaha, true. I just wanted some chuckles.

I try to make my own comedy show,







Netflix's stand-up specials ain't got nothing on OCN,









---

These things are not dropping in price anytime soon, I see.







Maybe Black Friday sales on the GTX 1070.


----------



## toncij

Quote:


> Originally Posted by *CallsignVega*
> 
> we've seen zero DP 1.3/1.4 monitors.


Actually, Asus showed a 4K@120Hz monitor just days ago...


----------



## juniordnz

Is Firestrike still a good tool to assess overclock stability? Getting a lot of driver crashes on it...


----------



## Snabeltorsk

Quote:


> Originally Posted by *juniordnz*
> 
> Is Firestrike still a good tool to assess overclock stability? Getting a lot of driver crashes on it...


Run this to check for stability.

http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/360_30

http://gdl.square-enix.com/ffxiv/inst/ffxiv-heavensward-bench.zip


----------



## karelbastos

In my case:

If I go above +500 on the RAM I see no gains, only a drop in performance.

+500 for VRAM is the best for me.

And +165 for core, with the stock FE fan and the latest drivers.

More than +165 and the drivers crash.


----------



## FattysGoneWild

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have already seen one of my friends rage hard over his 4K Dell monitor's problems. I wouldn't touch a Dell monitor with a 10 foot pole. Especially not something new that Dell is first to market with.


Well, let me tell you, the other two big players, Asus and Acer, are far from angels themselves. They also don't offer advance replacement along with free shipping both ways. Dell offers those perks free of charge, included in the 3-year warranty.


----------



## CallsignVega

Quote:


> Originally Posted by *toncij*
> 
> Actually, Asus showed a 4K@120Hz monitor just days ago...


Showing a prototype monitor means nothing. Asus takes 1-2 years to release monitors after showing prototypes.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Showing a prototype monitor means nothing. Asus takes 1-2 year to release monitors after prototypes.


Didn't the MG279Q and PG279Q both come out under a year after they were first shown?


----------



## juniordnz

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Run this to check for stability.
> 
> http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/360_30
> 
> http://gdl.square-enix.com/ffxiv/inst/ffxiv-heavensward-bench.zip


I can handle FFXIV on the maximum preset on a loop and still crash miserably in Firestrike.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> I can handle FFXIV with maximum preset on a loop and still crash miserably on firestrike


Try all the same settings but Stock Voltage


----------



## axiumone

Quote:


> Originally Posted by *boredgunner*
> 
> Didn't the MG279Q and PG279Q both come out under a year after it was first shown?


I think the PG278Q was in development hell for a very long time. I'm cautiously optimistic that we may start seeing 4K@120Hz at the end of the year, and certainly in 2017.


----------



## Baasha

Ran into a weird issue with the 1080s today.

Tried to play BF4. I had to take a call so I was AFK and the server booted me for being AFK for too long. No biggie I thought. But, now the desktop, everything on it, windows in Chrome etc., all moved at a snail's pace - almost like a slideshow. I would try to bring up the task manager but switching windows took a LONG time.

I couldn't even restart the computer - pressing the Windows icon brought the thing up and I pressed Restart but Origin failed to close and everything just moved really slowly.

I had to restart manually and everything's normal. It definitely has something to do with the power states or something with the GPUs - I guess changing the window from the game to the desktop did something where the cards were at 0% usage and everything moved extremely slowly.

Any idea what caused this or how to avoid this in the future?


----------



## axiumone

Quote:


> Originally Posted by *Baasha*
> 
> Ran into a weird issue with the 1080s today.
> 
> Tried to play BF4. I had to take a call so I was AFK and the server booted me for being AFK for too long. No biggie I thought. But, now the desktop, everything on it, windows in Chrome etc., all moved at a snail's pace - almost like a slideshow. I would try to bring up the task manager but switching windows took a LONG time.
> 
> I couldn't even restart the computer - pressing the Windows icon brought the thing up and I pressed Restart but Origin failed to close and everything just moved really slowly.
> 
> I had to restart manually and everything's normal. It definitely has something to do with the power states or something with the GPUs - I guess changing the window from the game to the desktop did something where the cards were at 0% usage and everything moved extremely slowly.
> 
> Any idea what caused this or how to avoid this in the future?


You're in sli right? Single display, multiple or surround?


----------



## uberwootage

Quote:


> Originally Posted by *Baasha*
> 
> Ran into a weird issue with the 1080s today.
> 
> Tried to play BF4. I had to take a call so I was AFK and the server booted me for being AFK for too long. No biggie I thought. But, now the desktop, everything on it, windows in Chrome etc., all moved at a snail's pace - almost like a slideshow. I would try to bring up the task manager but switching windows took a LONG time.
> 
> I couldn't even restart the computer - pressing the Windows icon brought the thing up and I pressed Restart but Origin failed to close and everything just moved really slowly.
> 
> I had to restart manually and everything's normal. It definitely has something to do with the power states or something with the GPUs - I guess changing the window from the game to the desktop did something where the cards were at 0% usage and everything moved extremely slowly.
> 
> Any idea what caused this or how to avoid this in the future?


On the driver that hotfixes the DPC issue I am seeing bad lag on startup. After a minute it goes away, but it's very slow for about 2 minutes. Then it's fine.


----------



## fat4l

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Run this to check for stability.
> 
> http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/360_30
> 
> http://gdl.square-enix.com/ffxiv/inst/ffxiv-heavensward-bench.zip


Can anyone else confirm this?


----------



## ssgwright

Quote:


> Originally Posted by *Baasha*
> 
> Ran into a weird issue with the 1080s today.
> 
> Tried to play BF4. I had to take a call so I was AFK and the server booted me for being AFK for too long. No biggie I thought. But, now the desktop, everything on it, windows in Chrome etc., all moved at a snail's pace - almost like a slideshow. I would try to bring up the task manager but switching windows took a LONG time.
> 
> I couldn't even restart the computer - pressing the Windows icon brought the thing up and I pressed Restart but Origin failed to close and everything just moved really slowly.
> 
> I had to restart manually and everything's normal. It definitely has something to do with the power states or something with the GPUs - I guess changing the window from the game to the desktop did something where the cards were at 0% usage and everything moved extremely slowly.
> 
> Any idea what caused this or how to avoid this in the future?


Noticed this as well. When I game, I'll stop and go do something, then come back. I don't notice anything in Windows or on the net, but when I run a 3D program the card won't boost normally... half at best. Restart and everything's fine again.


----------



## JaredC01

Quote:


> Originally Posted by *fat4l*
> 
> Can anyone else confirm this?


Tried every setting I could on it, and could never get the program to pull the card out of utilization throttling.


----------



## juniordnz

Quote:


> Originally Posted by *fat4l*
> 
> Can anyone else confirm this?


It's just too light to use as a stability test. You could pass it 30 times and still fail miserably within 4 seconds of Heaven or Firestrike.

The Firestrike Ultra stress test is killing me right now.


----------



## KickAssCop

So how many of you who purchased the 1080s are going to get a Titan X?


----------



## xer0h0ur

Quote:


> Originally Posted by *KickAssCop*
> 
> So how many of you who purchased the 1080s are going to get a Titan X?


HBM2 Vega / Volta or bust. I am not giving these stepping stone GPUs any more attention.


----------



## Kriant

I have a bit of a dilemma:

Will it be possible to fit an HB bridge between an FE 1080 and an FTW 1080? (on an ASUS RVE with an 80mm bridge)


----------



## KickAssCop

Can't see why not.


----------



## fat4l

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ 4K @ 120Hz does not yet have a gpu transcoder that can push that signal.. thru a cable that does not yet exist AFAIK.
> top line (ex-) Fuji pads (careful, they are more like a putty than a pad). Where are you putting the Liquid metal? JUST an FYI - the block mount tolerances make using a Liq metal a real challenge since they only really work well when the mount surfaces are perfect. The kryonaut on the GPU is more "imperfection" tolerant. I'd try that first and if the temps are not to your liking, go metal.
> My experience using an EK uniblock on the 1080 with kryonaut (or gelid ex, PK-3 etc) is that the GPU temp will not be a problem, and there is no clock benefit between 30C and 45C. Hook up a water chiller and cool it to 5-10C and it will run higher clocks.. for us "ambient" players.


Hi mate,

I used Fujipoly pads on my Asus Ares 3 card, which was a 295X2 with a custom PCB and waterblock. They improved my VRM temps by up to 35°C under heavy OC and volts. It's really good stuff and I really recommend it!
I also used CLU on both of the cores and it dropped temps by ~10-15°C. I ended up at about 40-50°C on the Ares 3, which has an EK waterblock.
I will use it with the 1080 as well.

I will test how the normal paste spreads first, of course.


----------



## Rhadamanthis

Quote:


> Originally Posted by *Rhadamanthis*
> 
> Guys for member that have a founders edition, stock thermal pad are thickness 1mm for the memory and vrm?


Can anyone tell me the size?


----------



## pez

Quote:


> Originally Posted by *boredgunner*
> 
> No ULMB though. G-SYNC is an excellent crutch but ULMB (and the like) are the goal for LCD.


I'll have to look into what exactly that does... another thing I'm sure I'll be told I'm a pleb for not having in a few months.
Quote:


> Originally Posted by *nexxusty*
> 
> Oh pezzy my boy you're going to love it.
> 
> I cannot live without GSYNC. Not possible now. Lol.


I highly enjoyed it yesterday. I do miss the density of 4K in a 27 inch screen, but I do have to admit even Win10's scaling is a bit atrocious. Chrome as well. Games however were not an issue, and SLI did it justice.

CS:GO was rather lovely at 100Hz and ridiculously smooth. The same went for GTA V (playing a survival) and Overwatch as well. Overwatch was the one game that felt weird with G-sync. I had to max it out and do 200% scaling to get it below 100 FPS, and then I felt like I was getting mouse acceleration or some sort of input lag. No other game felt this way. I thought it was because it was going from about 90-110 constantly, but GTA V did this as well in some parts and didn't have this feeling. I'll have to play with it a bit more tonight.

However, I did decide to return the 4K panel. I'm still inside of my return window. It's a great panel, actually, and I'll miss it. The IPS panel on it is definitely a bit better than the X34's, but G-sync and ultrawide are enough to convince me I can live without it. I look forward to ultrawide 4K.

G-sync really is something special. It's a hard thing to describe to someone who knows nothing about monitors (or at least isn't familiar with variable refresh rates). I maxed GTA V out completely, and during the stunt races yesterday I did see drops to 50FPS. They were noticeable, but there was no screen tearing, and nothing like what you see when dropping below 60FPS on a 'normal' monitor. It's hard to explain the way you experience it, but it's something you'd almost have to look for to notice. I'm debating whether it's worth turning down settings to maintain 60FPS, or if I'm happy enough with how G-sync is doing that I can live with it.
Quote:


> Originally Posted by *nexxusty*
> 
> SLi is a joke now. Benchmarks and epeen only.
> 
> There are people who can't wait for single card 4k who try to justify it. They're crazy. 2K is where it's at.
> 
> You're greedy if you "think" you need more for gaming. IMO.


I dunno. I like my SLI setup. It's running very well with no issues so far. Minus the fact that Doom still doesn't support it and gets some nasty frame drops with newer patches... I find that most of the games I actually need the extra power in support SLI.


----------



## nexxusty

Quote:


> Originally Posted by *pez*
> 
> I'll have to look into what exactly that does....another thing I'm sure I'll be told I'm a pleb for not having in a few months
> 
> 
> 
> 
> 
> 
> 
> .
> I highly enjoyed it yesterday. I do miss the density of 4K in a 27 inch screen, but I do have to admit even Win10's scaling is a bit atrocious. Chrome as well. Games however were not an issue, and SLI did it justice.
> 
> CS:GO was rather lovely at 100hz and ridiculously smooth. The same went for GTA V (playing a survival) and Overwatch as well. Overwatch was the one game that felt weird with G-sync. I had to max it out and do 200% scaling to get it below 100 FPS, and then it I felt like I was getting mouse acceleration or some sort of input lag. No other game felt this way. I thought it was because ti was going from about 90-110 constantly, but GTA V did this as well in some parts and didn't have this feeling. I'll have to play with it a bit more tonight.
> 
> However, I did decide to return the 4K panel. I'm still inside of my return window. It's a great panel, actually, and I'll miss it. The IPS panel on it is definitely a bit better than the X34, but G-sync and ultrawide are enough to convince me I can live without it
> 
> 
> 
> 
> 
> 
> 
> . I look forward to ultrawide 4K
> 
> 
> 
> 
> 
> 
> 
> .
> 
> G-sync really is something special. It's a hard thing to describe to someone who knows nothing about monitors (or at least familiar with variable refresh rates and stuff). I maxed GTA V out completely and during the stunt races yesterday I did see drops to 50FPS, and they were noticeable, but no screen tearing, and nothing like what you see when dropping below 60FPS on a 'normal' monitor. It's hard to explain the way you experience it, but it's something you'd almost have to look for to notice it. I'm debating whether it's worth turning down settings to maintain 60FPS or if I'm happy enough with how g-sync is doing that I can live with it.
> I dunno. I like my SLI setup. It's running very well with no issues so far
> 
> 
> 
> 
> 
> 
> 
> . Minus the fact that Doom still doesn't support and gets some nasty frame drops with newer patches...I find that most of the games I actually need the extra power in supports SLI
> 
> 
> 
> 
> 
> 
> 
> .


You sound just like me, Hehe.

I've been waiting for GSYNC ever since GLQuake on Voodoo cards. Never used vsync, ever, tearing got worse and worse though.

Glad you're happy with your decision though, not easy buying these $800+ panels and being happy with them. So easy to find a fault at that price.

I was lucky to get the last Dell 27" Gsync 1440p monitor on sale recently. It's a TN, however I don't care. After calibration it's the best monitor I've ever used by a WIIIDE margin. I drooled when I first loaded up DooM.

Still do. 1080p is the new 720p.


----------



## Whitechap3l

Quote:


> Originally Posted by *nexxusty*
> 
> You sound just like me, Hehe.
> 
> I've been waiting for GSYNC ever since GLQuake on Voodoo cards. Never used vsync, ever, tearing got worse and worse though.
> 
> Glad you're happy with your decision though, not easy buying these $800+ panels and being happy with them. So easy to find a fault at that price.
> 
> I was lucky to get the last Dell 27" Gsync 1440p monitor on sale recently. It's a TN, however I don't care. After calibration it's the best monitor I've ever used by a WIIIDE margin. I drooled when I first loaded up DooM.
> 
> Still do. 1080p is the new 720p.


Dell S2716DG ?


----------



## nexxusty

Quote:


> Originally Posted by *Whitechap3l*
> 
> Dell S2716DG ?


Yep. You got it.


----------



## pez

Quote:


> Originally Posted by *nexxusty*
> 
> You sound just like me, Hehe.
> 
> I've been waiting for GSYNC ever since GLQuake on Voodoo cards. Never used vsync, ever, tearing got worse and worse though.
> 
> Glad you're happy with your decision though, not easy buying these $800+ panels and being happy with them. So easy to find a fault at that price.
> 
> I was lucky to get the last Dell 27" Gsync 1440p monitor on sale recently. It's a TN, however I don't care. After calibration it's the best monitor I've ever used by a WIIIDE margin. I drooled when I first loaded up DooM.
> 
> Still do. 1080p is the new 720p.


I actually considered that panel quite a bit, but my Dell IPS spoiled me so much that I couldn't do TN again. I knew I'd eventually have to pay the price to meet what I wanted in a 'new' panel, and that was, at minimum: IPS, G-sync, >=27", and >=1440p. The only 'issues' I have with it so far are that I don't have more variation in the ambient LEDs and that the OSD is less-than-ideal.


----------



## nexxusty

Quote:


> Originally Posted by *pez*
> 
> I actually considered that panel quite a bit, but my Dell IPS spoiled me so much that I couldn't do TN again
> 
> 
> 
> 
> 
> 
> 
> . I knew I'd eventually have to pay the price to meet what I wanted in a 'new' panel. And that was by minimum; IPS, g-sync, >=27", and >=1440p. The only 'issues' I have with it so far is that I don't have more variation in the ambient LEDs and that the OSD is less-than-ideal.


It's a good TN, but it's a TN nonetheless.

I like IPS colour reproduction, obviously. However panel uniformity really messes with my OCD side. IPS glow drives me nuts.

Not playing the panel lottery until I get a decent IPS without glow, so for me.... Slightly less accurate colours and viewing angle are sacrificed for panel uniformity.

Those two things can be dealt with in my head. A non uniform panel can't. So.... I choose TN for now.

OLED can't come quick enough.


----------



## pez

Yeah, an OLED 4K ultrawide with 200Hz, G-sync and a curve... will be affordable one day.


----------



## nexxusty

Quote:


> Originally Posted by *pez*
> 
> Yeah, an OLED, 4K Ultrawide with 200hz, g-sync and a curve...will be affordable one day
> 
> 
> 
> 
> 
> 
> 
> .


We'll both be all over those when they are.

As you said.... one day... Hehe.


----------



## toncij

Quote:


> Originally Posted by *CallsignVega*
> 
> Showing a prototype monitor means nothing. Asus takes 1-2 year to release monitors after prototypes.


Well, true, it may be a custom tcon for development... but still...


----------



## Whitechap3l

I have an Asus PB278Q (2560x1440, IPS panel, 60Hz).
My birthday is at the end of August and I hope to get some cash in for a new one. But gosh, when you want 2K, IPS, 144Hz and maybe G-sync, you spend well over 700-800 Euro, and the panels are as inconsistent as nexxusty said earlier...
Maybe just the Dell with the TN panel...

Hard times >.>


----------



## nexxusty

Quote:


> Originally Posted by *Whitechap3l*
> 
> I have an Asus PB278Q ( 2560*1440, IPL Panel, 60Hz )
> I have birthday end of August and I hope to get some cash in for a new one. But gosh when you want a 2k, IPL, 144Hz and maybe G-sync you spend well about 700 - 800 Euro and the Panel is so inconsistent as nexxusty said earlier...
> Maybe just the Dell with TN Tanel...
> 
> Hard times >.>


Right?

Not much one can do, it seems. There are always drawbacks to choosing a specific panel technology... never the best of both worlds.

OLED is close though. Best panel uniformity, no clouding. The only issue I've seen with OLED is colour temperature uniformity. My mom's S6 has it; I noticed it immediately when I went over to set the phone up for her. My Note 5 doesn't exhibit the issue. I got lucky.

She doesn't even see it. Lol. Ignorance really is bliss. I have wished many times I was a blind idiot....


----------



## Whitechap3l

Quote:


> Originally Posted by *nexxusty*
> 
> Right?
> 
> Not much one can do it seems. There's always drawbacks choosing a specific Panel Technology.... never best of both worlds.
> 
> OLED is close though. Best panel uniformity, no clouding. The only issues I've seen with OLED is colour temperature uniformity.
> 
> My mom's S6 has it. Noticed it immediately when I went over to set it up for her. My Note 5 doesn't exhibit this issue. I got lucky.
> 
> I noticed it immediately on my mom's phone. She doesn't even see it. Lol.


Yeah, you are right, sure...
But for OLED with the same specs (2K, 144Hz, maybe G-sync), I guess you'd have to put some serious money on the table.


----------



## nexxusty

Quote:


> Originally Posted by *Whitechap3l*
> 
> Yeah you are right sure..
> But OLed with same specs ( 2k, 144Hz, maybe Gsync ) I guess u have to put some money on the table


Oh no I only mean from what I've seen from phone OLED screens.

I'm sure the issues are exacerbated with even bigger panels. I suspect we'll all be using OLED monitors in 2 years time.


----------



## Whitechap3l

Quote:


> Originally Posted by *nexxusty*
> 
> Oh no I only mean from what I've seen from phone OLED screens.
> 
> I'm sure the issues are exacerbated with even bigger panels. I suspect we'll all be using OLED monitors in 2 years time.


https://www.rockpapershotgun.com/2016/01/14/ces-2016-oled-dell-razer-oculus-rift/

Maybe in one year ;D


----------



## Snabeltorsk

Quote:


> Originally Posted by *JaredC01*
> 
> Tried every setting I could on it, and could never get the program to pull the card out of utilization throttling.


Quote:


> Originally Posted by *juniordnz*
> 
> just too light to use as a stability test. could go pass it 30 times and fail miserably with 4 seconds of heaven or firestrike.
> 
> Firestrike Ultra Stresstest is killing me right now


You have to make a manual/custom preset for 4K (3840x2160).
Try that and come back.


----------



## PasK1234Xw

It's finally here, boys and girls, and it's only $30 more than the FE:

EVGA GTX 1080 FTW HYBRID

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6288-KR


----------



## tin0

So I have been fiddling with a reference GTX 1080 and different BIOSes. At the moment the best BIOS I've found, in terms of highest stock boost clocks, is the MSI GeForce GTX 1080 SEA HAWK X one. It boosts up to 1936MHz on my card without changing anything in Afterburner, so completely stock.

One thing all reference-based BIOS versions (from all brands I could find) have in common is that the power limit is the same (either the slider goes up to 120%, or it goes up to 105%, but it's the same limit in total wattage), and this power limit is holding back every reference card at the moment.

So from my experience, adding any voltage to reference cards is actually quite useless; you will only hit the power limit earlier and the card will downclock. The best setting I found so far is simply to set the maximum power limit and find the highest boost clock for your card, without adding any voltage.

*We need a higher power limit BIOS!*

(I know there is a 'hard' mod, but I'm hoping we can free ourselves from the power limit via BIOS instead of such drastic measures.)

I've attached the MSI SEA HAWK X BIOS for anyone who wants to try it, since this BIOS is hard to find online at the moment. Of course this is at your own risk, but I can confirm it works fine on any reference 1080.

MSI_1080_SEAHAWK_X.191.zip 149k .zip file
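tin0's point above — that added voltage just makes the card hit the power limit sooner, so it downclocks instead of going faster — can be sketched with a toy model. This is illustrative only: GPU Boost's real algorithm is not public, the wattages below are invented, and the linear power-vs-clock assumption is a crude simplification.

```python
# Toy model of power-limit throttling (illustrative only: GPU Boost's real
# algorithm is not public, and the wattages here are invented).

BIN_MHZ = 13  # Pascal boost clocks are commonly said to move in ~13 MHz steps


def throttle(clock_mhz, board_power_w, power_limit_w):
    """Drop one boost bin at a time until estimated draw fits under the limit.

    Assumes draw scales linearly with clock -- a simplification, since real
    draw also depends on voltage and workload.
    """
    while board_power_w > power_limit_w and clock_mhz > BIN_MHZ:
        new_clock = clock_mhz - BIN_MHZ
        board_power_w *= new_clock / clock_mhz  # scale estimated draw down
        clock_mhz = new_clock
    return clock_mhz


# A card boosting to 2088 MHz while drawing 200 W against a 180 W cap
# ends up sustaining well below its headline boost clock.
print(throttle(2088, 200.0, 180.0))
```

Raising the cap (what a higher-power-limit BIOS would do) raises `power_limit_w` in this sketch, which is why the power limit, rather than extra voltage, is what governs sustained clocks on reference cards.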


----------



## Whitechap3l

Quote:


> Originally Posted by *tin0*
> 
> So I have been fiddling with a reference GTX 1080 and different BIOSses. At the moment I found the best BIOS to use is the MSI GeForce GTX 1080 SEA HAWK X. This one boosts up to 1936MHz on my card without changing anything in afterburner, so completely stock.
> 
> One thing all reference based BIOS versions (from all brands I could find) have in common is that the powerlimit is the same (either slider goes up to 120%, or goes up to 105% but same limit in total wattage), and this power limit is holding back any reference card at the moment.
> 
> So from my experience, adding any voltage to reference cards is actually quite useless, you will only hit the Power Limit earlier and the card will downclock. Best setting I found so far is simply set maximum powerlimit and find the highest boost clock for your card, without adding any voltage.
> 
> *We need a higher power limit!*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've attached the MSI SEA HAWK X BIOS for anyone wanting to try, since I found this BIOS is hard to find online at the moment. Of course this is at your own risk, but I can confirm this one works fine with any reference 1080.
> 
> MSI_1080_SEAHAWK_X.191.zip 149k .zip file


So, "best" BIOS... I mean, if the card can only hit a certain clock, doesn't it make no difference which BIOS you use? The Sea Hawk BIOS will probably give me roughly the same clock as an FE BIOS, or a Strix one, or whatever. Or?


----------



## tin0

Sorry, I wasn't really clear. I mean 'best BIOS in terms of highest stock boost clock on reference cards'.


----------



## Whitechap3l

Ah okay.

No, I was just wondering.


----------



## ROKUGAN

Hi, I've been reading this thread and just wanted to post my experience with the ZOTAC 1080 cards, as I had both non-reference versions and was able to compare them. For some reason there are very few reviews out there on the Zotac cards:

https://www.reddit.com/r/4pb1wz/zotac_gtx_1080_amp_max_temperatures/d5sw0jo

Very pleased with the OC capabilities of both versions, and truly impressed with the Extreme. Getting 2077MHz out of the box without touching a thing made me laugh.
Probably sheer luck in the chip lottery, but my experience with the Zotac PCB has been great so far, especially after reading about people with Gigabytes and Strixes not getting past 2GHz. And those 2x power connectors give me some peace of mind (even though only one is allegedly needed).


----------



## CapKrunch

My local shop finally got some Gigabyte GeForce GTX 1080 G1 Gaming cards in stock and I'm tempted to buy one. Should I go for it, or hold on to my GTX 970 until the next generation?


----------



## outofmyheadyo

Why are you asking us, if you are happy with gtx 970 keep it, if not buy a 1080.


----------



## Whitechap3l

Quote:


> Originally Posted by *CapKrunch*
> 
> my local shop finally got some Gigabyte GeForce GTX 1080 G1 Gaming Graphics Card in stock right now and I'm tempting to buy it. Should I go for it or hold to my gtx 970 until next generation?


And then what? You wait for the next generation?
Seriously, these questions are kind of stupid in my eyes... when you have the money and want the newest (best) card out there, go for it.
If you keep waiting, it's basically a never-ending process.


----------



## outofmyheadyo

But unless you have a 1440p 144Hz monitor or 4K, I don't see much point in the 1080; for 60Hz 1440p, and even 1080p 144Hz, the 1070 is the much more reasonable choice.


----------



## juniordnz

Just found out that 2077MHz is my MSI 1080 Armor's maximum overclock. Be it at stock voltage, or with maximum power limit and added voltage in Afterburner, it doesn't make any difference except heat.

At that clock I noticed, at least in the Firestrike Ultra stress test, that it goes as high as 2088 for a second, then briefly to 2077, again briefly to 2062, then a little longer at 2050, and finally settles at 2037. All at the same voltage.

I already tried setting the last three voltage points on the boost curve to my max OC (as I did successfully with Maxwell), but the same behaviour shows.

I'm getting tired of it already and am almost ready to let OC go for now and wait for a BIOS editor. Software OC doesn't allow full control and optimization, so it ends up being an endless guessing game when trying to find a stable max overclock.

(With Maxwell I could hit 1600MHz stable without any perfcap or fluctuations at all. Hopefully it won't take long for Pascal BIOS tweaking.)
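For what it's worth, those stepwise drops line up with GPU Boost's clock bins: Pascal is commonly said to move in roughly 13 MHz steps (an approximation, not an official NVIDIA figure), so reported clocks tend to sit near a grid of `top - n*13`. A small sketch, using the clocks from the run above:

```python
# Snap reported clocks to a ~13 MHz boost-bin grid. The 13 MHz step is a
# commonly cited approximation for Pascal, not an official NVIDIA number.

TOP = 2088  # highest clock observed in the run described above
BIN = 13


def nearest_bin(clock_mhz):
    """Return the grid clock (TOP - n*BIN) closest to a reported reading."""
    steps = round((TOP - clock_mhz) / BIN)
    return TOP - steps * BIN


for reported in (2088, 2077, 2062, 2050, 2037):
    print(reported, "->", nearest_bin(reported))
```

The binned values come out as 2088, 2075, 2062, 2049, 2036 — one bin apart each — consistent with the card shedding a single boost bin at a time rather than anything being wrong with the overclock itself.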


----------



## juniordnz

Double post, sorry.


----------



## CapKrunch

Yeah, I'm gaming on a 1440p 144Hz monitor. That's why I'm thinking about grabbing that GTX 1080. I'll just shut up and get me one anyway.

Sent from my LG-D850 using Tapatalk


----------



## Barterlos

Quote:


> Originally Posted by *juniordnz*
> 
> Just found out that 2077mhz is my MSI 1080 Armor maximum overclock. Be it with stock voltage or maximum power limit and voltage add with afterburner, doesn't make any difference except heat.
> 
> At that clock I noticed, at least with firestrike ultra stress test, that it goes as high as 2088 for a second, than briefly to 2077, again briefly to 2062, then a litlle longer to 2050 and settles at 2037. All with the same voltage.
> 
> Already tried setting the last 3 voltages in boost curve to my max OC (as I did succesfully with Maxwell) but the same behaviour shows.
> 
> Getting tired already and I'm almost letting OC go for now and wait for BIOS Editor. Software OC doesn't allow for full control and optimization ending up being just an endless trying game when trying to get stable max overclock.
> 
> (With maxwell I could hit 1600mhz stable without any perfcap, any fluctuations at all. Hopefully won't take long for a Pascal BIOS Tweaking)


Yeah, same for me. It's so hard and so time-consuming to get maximum OC performance from the 10xx series. GPU Boost 3.0 complicates OC potential; having no direct control over voltage screws everything up.


----------



## GreedyMuffin

Have you tried to keep the fan running at 100% to see if the speed still drops?


----------



## Jpmboy

Quote:


> Originally Posted by *fat4l*
> 
> Can anyone else confirm this?


Confirm that it is a reasonable benchmark/stress test? Yes; it is "medium hard" at best.
Quote:


> Originally Posted by *KickAssCop*
> 
> So how many of you who purchased the 1080s are going to get a Titan X?


Me. I think Vega is also, Mr. T... it's the Titan thing. Each one has been the best card of its generation (OG Titan, Titan X) but also the most freakin' expensive. I'd give the 295X2 a nod also; mine has been running under a Koolance WB since launch.
Quote:


> Originally Posted by *fat4l*
> 
> Hi mate
> 
> 
> 
> 
> 
> 
> 
> 
> I used fujipoly pads on my asus ares 3 card which was 295x2 with custom pcb and waterblock. Improved my vrm temps by up yo 35C under heavy oc and volts.
> Its rly good to use it and I rly recommend it!
> I also used CLU on both of the cores and it dropped temps by ~10-15C. I ended up with about 40-50C on ares 3 wich has an EK waterblock.
> I will use it with 1080 as well
> 
> 
> 
> 
> 
> 
> 
> I wil ltest how the normal paste is spreading first ofc ..


yeah man, I only use Fuji 17s. Best pads. CLU or CLP, let us know what core temps you see. With Gelid or Grizzly, I see max core temps of 38-40C.


----------



## AlienPrime173

Quote:


> Originally Posted by *nexxusty*
> 
> Why.... WHY did you pay $530 for an EVGA Classified?
> 
> Want one for a good price? I have one barely used.... that board is not worth more than $250 on a good day. Paying $500+?
> 
> Naw dude.. naw.


Whattt!!?? You do!? That's $250 USD though, which translates to over $500 CAD after the "FU Canada Tax".

If not, I'll happily buy that X99 Classy from you for $250 CAD.

Can always use another one for idk what haha


----------



## Joshwaa

Anyone with a 1080 have the Acer X34 or Z35 Predator? I am by no means a monitor aficionado so would like some input before purchasing. Heck I am still using an Asus VG248QE.


----------



## AlienPrime173

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone with a 1080 have the Acer X34 or Z35 Predator? I am by no means a monitor aficionado so would like some input before purchasing. Heck I am still using an Asus VG248QE.


I had one of the Z35s... You really get what you pay for; it's better to just save up and get a good monitor. On my 980 Ti it would occasionally flicker, so I asked Acer about it. They said they have a flicker tolerance of up to 0.010%, so for every two hours you use the monitor it may flicker 1-2 times, and that's deemed "OK" by Acer... I was pretty upset about it. I thought Acer monitors would be different, but it's the same junk they always have been.

I ended up RMAing it and got stuck with a 15% restocking fee :/

You can find a million people with the same issue lol

http://community.acer.com/t5/Predator-Monitors/z35-screen-flickering-than-gos-black/td-p/420872

community.acer.com/t5/Predator-Monitors/Predator-X34-flickering-and-instability/td-p/392636

Google: https://www.google.ca/search?q=acer+z35+flicker&ie=utf-8&oe=utf-8&gws_rd=cr&ei=DAyaV_GhCKK0jwSX_rKIAw


----------



## Joshwaa

Thanks very much for the input. Are there any super wide curved monitors that are good?


----------



## pez

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone with a 1080 have the Acer X34 or Z35 Predator? I am by no means a monitor aficionado so would like some input before purchasing. Heck I am still using an Asus VG248QE.


I just got one in yesterday. What exactly were you looking for? If you want, I can disable my second GPU and do some testing for you, so long as I have the game.

I also have to disagree on a single 1080 for 4K unless you have something like G-sync. Way too many compromises have to be made for a single card to push that res ATM. I'd be seriously upset to spend $650 on a GPU and another 300-400 on a monitor to make compromises. Standard 1440p with or without G-sync seems to be the sweet spot.


----------



## AlienPrime173

Quote:


> Originally Posted by *Joshwaa*
> 
> Thanks very much for the input. Are there any super wide curved monitors that are good?


http://www.ncix.com/detail/samsung-s29e790c-29in-ultrawide-curved-c8-108669.htm

I personally ended up getting a standard 16:9 2160p u28E590D because it was cheap. But there are lots of options


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Have you tried to keep the fan running at 100% to see if the speed still drops?


Yes, I always run 100% fan speed when over 40°C. Noise doesn't bother me when gaming (closed headphones). Still getting those clock drops no matter what; voltage stays constant, though.


----------



## Joshwaa

Quote:


> Originally Posted by *pez*
> 
> I just got one in yesterday. What exactly were you looking for? If you want, I can disable my second GPU and do some testing for you so long as I have the game
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I also have to disagree on a single 1080 for 4K unless you have something like G-sync. Way too many compromises have to be made for a single card to push that res ATM. I'd be seriously upset to spend $650 on a GPU and another 300-400 on a monitor to make compromises. Standard 1440p with or without G-sync seems to be the sweet spot.


I was more just wondering if it was good for gaming and if it had any issues. Also if a single 1080 would push it ok.


----------



## Joshwaa

Quote:


> Originally Posted by *AlienPrime173*
> 
> http://www.ncix.com/detail/samsung-s29e790c-29in-ultrawide-curved-c8-108669.htm
> 
> I personally ended up getting a standard 16:9 2160p u28E590D because it was cheap. But there are lots of options


I was actually looking at this one also. Samsung LS34E790CNS/ZA


----------



## Kielon

Quote:


> Originally Posted by *Joshwaa*
> 
> I was more just wondering if it was good for gaming and if it had any issues. Also if a single 1080 would push it ok.


Take a look at this 1080/ultrawide review: http://techgage.com/article/nvidia-geforce-gtx-1080-review-a-look-at-4k-ultra-wide-gaming/ I can confirm that a single 1080 is more than capable of driving the X34 at or close to its 100Hz G-sync limit.


----------



## Sazexa

Hi there, everyone. I just recently picked up two 1080s and am having a bit of an issue, and was wondering if anyone else has had a similar experience. I'm running the latest NVIDIA driver as of yesterday (7/27/2016). I think it's 389.89? I don't remember exactly. I'm running an Intel i7-6950X at stock clocks, 4x8GB DDR4 (XMP isn't even enabled at the moment), and two GTX 1080 FEs at stock clocks. I tested each card individually and together in another machine and they ran fine.

The issue is in certain games, I have a massive coloration issue. But it seems to be ONLY in full screen and ONLY in SLI. There is also screen flicker. The screen gets a very over-saturated look, and has a hue shift in colors.

It will do this (though seemingly with different color shifts at different aspect ratios) no matter the resolution, as long as SLI is enabled and the game is in full screen. Some affected games are Doom and Battlefield 3. I haven't tested many others yet.

If a game is in full screen, and I press alt+tab to open up a window in the foreground, the game then shifts to normal colors. But as soon as I click back into the game and it comes up full screen, the colors get all messed up again. Disabling SLI fixes the issue, as well as running in a window.

And not all games experience this. Also, I noticed that when changing the API in Doom from OpenGL 4.5 to Vulkan, it ran flawlessly with SLI in full screen.


----------



## fireyfire

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh trust me. 20-30 millisecond frametime spikes in CSGO aren't a figment of my imagination. I don't even notice frametime spikes or microstutter in anything else I have tried. Dying Light is fine, Tombraider is fine, Thief is fine. Its just CSGO microstuttering for me every 20 to 40 seconds. Problem is that is the only game I have any real passion for. So here I sit, patiently awaiting driver improvements and a legitimate DPC latency fix instead of this hotfix that only kinda sorta solved the issue.
> 
> I ditched a 295X2 and 290X because I believed the FCAT results on the 1080's reviews.


I have the same exact issue with CS:GO and I am using the beta 369.0 Drivers.


----------



## xer0h0ur

Quote:


> Originally Posted by *fireyfire*
> 
> I have the same exact issue with CS:GO and I am using the beta 369.0 Drivers.


Take a look at this madness:

[frametime graph]

That is the sort of frametime spiking I am getting in CSGO with the hotfix driver.

BTW, word on the street is that the 369.0 driver does not have the DPC latency hotfix in it.
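If anyone wants to quantify hitches like these instead of eyeballing a graph, one quick way is to log per-frame times (PresentMon can export them to CSV) and flag the outliers. A minimal sketch, assuming you already have the frametimes extracted into a list of milliseconds:

```python
# Flag frametime spikes: frames sitting well above the run's median.
# Assumes frametimes are already extracted (e.g. from a PresentMon CSV).

from statistics import median


def find_spikes(frametimes_ms, threshold_ms=20.0):
    """Return (frame_index, frametime) pairs more than threshold_ms above the median."""
    base = median(frametimes_ms)
    return [(i, t) for i, t in enumerate(frametimes_ms) if t - base > threshold_ms]


# ~7 ms frames (about 144 FPS) with two ~30 ms hitches, like the CSGO stutter above
times = [7.0] * 10 + [30.0] + [7.0] * 10 + [31.5]
print(find_spikes(times))
```

Using the median rather than the mean as the baseline keeps a couple of huge spikes from dragging the baseline up and hiding smaller ones.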


----------



## nexxusty

Quote:


> Originally Posted by *AlienPrime173*
> 
> whattt!!?? you do!? Thats $250 USD thought which translates to over $500 CAD after the "FU Canada Tax"
> 
> If not, ill happily buy that X99 classy from you for $250 CAD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can always use another one for idk what haha


250 usd is 329 Canadian. I'd let it go for 300 Canadian.

227 US. If you're down, PM me.
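For anyone checking the math, a quick sketch (the ~1.316 USD-to-CAD rate is an assumption, roughly the mid-2016 rate, not a quoted figure):

```python
# Sanity-check the currency conversion above at an assumed exchange rate.
USD_TO_CAD = 1.316  # assumed mid-2016 rate

def usd_to_cad(usd: float, rate: float = USD_TO_CAD) -> int:
    """Convert a USD amount to CAD, rounded to the nearest dollar."""
    return round(usd * rate)

print(usd_to_cad(250))          # ~329 CAD, matching the figure quoted
print(round(300 / USD_TO_CAD))  # ~228 USD for the 300 CAD asking price
```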


----------



## outofmyheadyo

Does the backplate actually do anything on the 1080s other than looking pretty? Tempted to order one since the nickel one looks mighty fine.


----------



## xer0h0ur

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Does backplate actually do anything on the 1080s other than lookin pretty? Tempted to order one since the nickel one looks mighty fine.


Perhaps on the non-reference designs, if they put anything on the backside that needs cooling, but on the FE cards it's just for aesthetics.


----------



## AlienPrime173

Quote:


> Originally Posted by *nexxusty*
> 
> 250 usd is 329 Canadian. I'd let it go for 300 Canadian.
> 
> 227 US. If you're down, PM me.


why arent you using it? i dont have any spare 2011-3 chips laying around to use it. thats the only issue haha


----------



## nexxusty

Quote:


> Originally Posted by *AlienPrime173*
> 
> why arent you using it? i dont have any spare 2011-3 chips laying around to use it. thats the only issue haha


I had enough of it. Wanted a board with a 32gb/s m.2 slot so I didn't have to use my PCI-E to m.2 adapter.

Other than that, no reason. Nice stable board otherwise.


----------



## RMXO

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone with a 1080 have the Acer X34 or Z35 Predator? I am by no means a monitor aficionado so would like some input before purchasing. Heck I am still using an Asus VG248QE.


I have the Acer X34 Predator and it's an awesome monitor, no regrets, but you'll need SLI to get the best out of it.


----------



## uberwootage

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Does backplate actually do anything on the 1080s other than lookin pretty? Tempted to order one since the nickel one looks mighty fine.


On the reference card there are two chips that the backplate cools. They're under the big part, not the smaller one on the back.


----------



## deejaykristoff

I don't understand the obsession. Yesterday we had cards that, with luck in the silicon lottery, reached 1500/1600 MHz; today cards reach 2000/2100 MHz or a little more, and it's still not enough?! It's been shown that 50 or 100 MHz more has no real impact on fps. This generation is different from the previous one because the clocks are already high by default. In my opinion you're wasting your time: no gain from having 8 or 16-pin power, no gain from more phases. I think having a good cooling solution for stable clocks is more important.


----------



## nexxusty

Quote:


> Originally Posted by *deejaykristoff*
> 
> I don't understand the obsession. Yesterday we had cards that, with luck in the silicon lottery, reached 1500/1600 MHz; today cards reach 2000/2100 MHz or a little more, and it's still not enough?! It's been shown that 50 or 100 MHz more has no real impact on fps. This generation is different from the previous one because the clocks are already high by default. In my opinion you're wasting your time: no gain from having 8 or 16-pin power, no gain from more phases...


It's just fun and games....


----------



## GreedyMuffin

Quote:


> Originally Posted by *deejaykristoff*
> 
> I don't understand the obsession. Yesterday we had cards that, with luck in the silicon lottery, reached 1500/1600 MHz; today cards reach 2000/2100 MHz or a little more, and it's still not enough?! It's been shown that 50 or 100 MHz more has no real impact on fps. This generation is different from the previous one because the clocks are already high by default. In my opinion you're wasting your time: no gain from having 8 or 16-pin power, no gain from more phases. I think having a good cooling solution for stable clocks is more important.


Ever heard of bragging rights? Duh.


----------



## xer0h0ur

Quote:


> Originally Posted by *deejaykristoff*
> 
> I don't understand the obsession. Yesterday we had cards that, with luck in the silicon lottery, reached 1500/1600 MHz; today cards reach 2000/2100 MHz or a little more, and it's still not enough?! It's been shown that 50 or 100 MHz more has no real impact on fps. This generation is different from the previous one because the clocks are already high by default. In my opinion you're wasting your time: no gain from having 8 or 16-pin power, no gain from more phases. I think having a good cooling solution for stable clocks is more important.


You're on the wrong forum to be preaching modesty.


----------



## boredgunner

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're on the wrong forum to be preaching modesty.


Yup, it's in the name.

Also overclock percentage means a lot to us. The GTX Titan X was able to almost reach a 50% overclock on average, same for AIB GTX 980 Ti's compared to reference model, and the GTX 780 Ti could often break that 50% barrier. We want more of this.

Also overclocking my GTX 1080 from the stock ~1940 MHz boost and 10000 MHz VRAM to 2063 MHz typical boost and 11016 MHz RAM made a detectable difference in some games.

Although I never actually complained about GTX 1080 overclocking until this post. Intel CPUs however...
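The percentage framing above works out like this (a quick sketch using only the clocks quoted in the post):

```python
# Compute the overclock percentages for the clocks mentioned above
# (stock ~1940 MHz boost / 10000 MHz VRAM, overclocked to 2063 / 11016 MHz).
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a percentage over stock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

core = oc_percent(1940, 2063)    # ~6.3% core gain
vram = oc_percent(10000, 11016)  # ~10.2% VRAM gain
print(f"core: {core:.1f}%, vram: {vram:.1f}%")
```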


----------



## kx11

can anyone confirm the source of this ??


----------



## nexxusty

Quote:


> Originally Posted by *kx11*
> 
> can anyone confirm the source of this ??


Jesus Christ....

*edit*

Seems like a Galaxy HOF.


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> can anyone confirm the source of this ??


I think that's:

http://videocardz.com/60923/galax-overclocks-gtx-1080-to-2-2-ghz-on-air-2-5-ghz-with-ln2


----------



## kx11

Quote:


> Originally Posted by *nexxusty*
> 
> Jesus Christ....
> 
> *edit*
> 
> Seems like a Galaxy HOF.


it seems like a reference card using the XOC voltage tool to allow more OC headroom


----------



## GreedyMuffin

Quote:


> Originally Posted by *kx11*
> 
> it seems like a reference card using the XOC voltage tool to allow more OC headroom


Is that possible to use on my FE? Wut?!


----------



## kx11

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Is that possible to use on my FE? Wut?!


if you got LN2 hooked on yours then maybe you can


----------



## toncij

Quote:


> Originally Posted by *boredgunner*
> 
> Yup, it's in the name.
> 
> Also overclock percentage means a lot to us. The GTX Titan X was able to almost reach a 50% overclock on average, same for AIB GTX 980 Ti's compared to reference model, and the GTX 780 Ti could often break that 50% barrier. We want more of this.
> 
> Also overclocking my GTX 1080 from the stock ~1940 MHz boost and 10000 MHz VRAM to 2063 MHz typical boost and 11016 MHz RAM made a detectable difference in some games.
> 
> Although I never actually complained about GTX 1080 overclocking until this post. Intel CPUs however...


I wonder where the new Titan X will stop with the clock. I seriously doubt it will reach 2 GHz...


----------



## nexxusty

Quote:


> Originally Posted by *kx11*
> 
> it seems like a reference card using the XOC voltage tool to allow more OC headroom


Subvendor ID in GPU-Z is "Galaxy". We know FEs can't use the HOF BIOS...

So again, that's an HOF. This is most likely why the HOF BIOS doesn't POST on FEs: different power delivery circuitry.


----------



## GreedyMuffin

Quote:


> Originally Posted by *nexxusty*
> 
> Subvendor ID in GPU-Z is "Galaxy". We know FE's cant use the HOF BIOS.....
> 
> So again, that's an HOF. This would most likely be why the HOF BIOS doesn't POST on FE's, different power delivery circuitry.


That's what I thought. Never mind me, I just got confused when I read that post.


----------



## boredgunner

Quote:


> Originally Posted by *toncij*
> 
> I wonder where the new Titan X will stop with the clock. I seriously doubt it will reach 2 GHz...


I think with water cooling it'll reach the same speeds as GTX 1080s, so 2-2.1 GHz on average.


----------



## pez

Quote:


> Originally Posted by *Joshwaa*
> 
> I was more just wondering if it was good for gaming and if it had any issues. Also if a single 1080 would push it ok.


Quote:


> Originally Posted by *Kielon*
> 
> Take a look at this 1080/ultra-wide review: http://techgage.com/article/nvidia-geforce-gtx-1080-review-a-look-at-4k-ultra-wide-gaming/ I can confirm that single 1080 is more than capable to drive X34 at or close to 100Hz G-sync limit.


Quote:


> Originally Posted by *RMXO*
> 
> I have the Acer X34 Predator and it's an awesome monitor, no regrets, but you'll need SLI to get the best out of it.


Yeah, I agree for longevity that SLI'd 1080s just gives a little extra breathing room, but I feel a single 1080 could do the X34 justice. G-sync however would be a must. You will get frame drops below 60 if maxing all settings, so you will either have to make some compromises and cut some AA and shader details, or if you can handle the drops (which G-sync handles *very* well), you'll be set.

I, however, would not recommend a panel of that resolution without G-Sync. A high refresh rate panel without the benefit of G-Sync would just make for an extremely unpleasant experience.


----------



## nexxusty

Quote:


> Originally Posted by *GreedyMuffin*
> 
> That's what I thought, Never mind me, I just got confused when I read that post.


Heh I almost shat my pants because I thought it was for FE.


----------



## yungtiger

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone with a 1080 have the Acer X34 or Z35 Predator? I am by no means a monitor aficionado so would like some input before purchasing. Heck I am still using an Asus VG248QE.


I have the X34 but I'm waiting on my 1080 to arrive. The Gigabyte 1080 I ordered last week got pushed to next week (yeah, a fun two-week wait after I called to ask where my package was), so I switched my order to the STRIX and it'll be here on Monday. What are you asking?


----------



## outofmyheadyo

So how is 1080 treating you 2560x1440 guys? Joining the club tomorrow =)


----------



## toncij

Quote:


> Originally Posted by *pez*
> 
> Yeah, I agree for longevity that SLI'd 1080s just gives a little extra breathing room, but I feel a single 1080 could do the X34 justice. G-sync however would be a must. You will get frame drops below 60 if maxing all settings, so you will either have to make some compromises and cut some AA and shader details, or if you can handle the drops (which G-sync handles *very* well), you'll be set.
> 
> I however, would not recommend a panel of that resolution that does not have G-sync. A high refresh rate panel without the benefit of G-sync would just make for an extremely unpleasant experience otherwise.


Well, in many games you can't catch 100Hz easily with a single 1080. I wouldn't count on it - unless you're into reducing image details.


----------



## xer0h0ur

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So how is 1080 treating you 2560x1440 guys? Joining the club tomorrow =)


Excluding CSGO, pretty damn well.


----------



## nexxusty

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So how is 1080 treating you 2560x1440 guys? Joining the club tomorrow =)


Very well.

Perfect combination.


----------



## juniordnz

Boy, that Firestrike Ultra stress test really gives new meaning to the word "stable".

Has anyone gone through all 20 loops with no crashes and a final result above 97%? It's very humbling...


----------



## GreedyMuffin

Quote:


> Originally Posted by *juniordnz*
> 
> Boy, that Firestrike Ultra stress test really gives new meaning to the word "stable".
> 
> Has anyone gone through all 20 loops with no crashes and a final result above 97%? It's very humbling...


I folded 24/7 for two weeks.


----------



## TWiST2k

Did you get it from the brief stock on Amazon? Mine will be here tomorrow as well, 1080 FTW!

Started doing my homework today and was pretty bummed to find out there are no BIOS mod tools for Pascal yet. I had my 970 FTW and 980 Ti Classified all tricked out in the BIOS, and now I am going to have to run software for OC, boooo!!!


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> Did you get it from the brief stock on amazon? Mine will be here tomorrow as well 1080 FTW!
> 
> Started doing my homework today and was pretty bummed to find out there is no bios mod tools yet for Pascal, I had my 970 FTW and 980 Ti Classified all tricked out in the bios and now I am going to have to run software for OC, boooo!!!


I know that feeling... every time I see Afterburner running on my rig I die a little.


----------



## immortalkings

Quote:


> Originally Posted by *pez*
> 
> Yeah, I agree for longevity that SLI'd 1080s just gives a little extra breathing room, but I feel a single 1080 could do the X34 justice. G-sync however would be a must. You will get frame drops below 60 if maxing all settings, so you will either have to make some compromises and cut some AA and shader details, or if you can handle the drops (which G-sync handles *very* well), you'll be set.
> 
> I however, would not recommend a panel of that resolution that does not have G-sync. A high refresh rate panel without the benefit of G-sync would just make for an extremely unpleasant experience otherwise.


Do you guys turn OFF V-sync in the NVIDIA Control Panel? It ends up turning back ON after a restart when using G-Sync. I'm getting stuttering in some games when it's ON and fps drops below 60. Is there a way to turn it OFF permanently?


----------



## jorpe

Anyone else having problems getting the GPU boost clock to work? In some games it never even boosts above 1721, which is the base for my card the way I have it clocked.


----------



## juniordnz

Quote:


> Originally Posted by *jorpe*
> 
> Anyone else having problems getting the GPU boost clock to work? In some games it never even boosts above 1721, which is the base for my card the way I have it clocked.


Have you tried setting max performance in power mode for those specific games where the clock doesn't go up?


----------



## ChaosBlades

Can one of you with a 400+ memory overclock that you say is "stable" run the 3DMark Time Spy stress test and tell me if you can pass it... I bet you can't.

Because I thought mine was stable too, until I ran these 3DMark stress tests and couldn't pass them. I haven't nailed down the actual memory overclock I'm getting yet, but it is going to be nowhere near the number I thought was stable.


----------



## boredgunner

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So how is 1080 treating you 2560x1440 guys? Joining the club tomorrow =)


GTX 1080 has an easy time with 2560 x 1440. Expect 80+ FPS on the highest settings in most games. Haven't tried modded GTA V yet though, but I don't expect much issue.


----------



## jorpe

Quote:


> Originally Posted by *juniordnz*
> 
> Have you tried setting max performance in power mode for those specific games where the clock doesnt go up?


Where do I specify that? I've got a 1080 Classy and am using the Precision X OC software.


----------



## boredgunner

Quote:


> Originally Posted by *jorpe*
> 
> Where do I specify that? Ive got a 1080 classy and am using PrecisionX OC software.


NVIDIA Control Panel or NVIDIA Inspector (same thing basically, Inspector has some extra features largely irrelevant for modern games).


----------



## juniordnz

Quote:


> Originally Posted by *jorpe*
> 
> Where do I specify that? Ive got a 1080 classy and am using PrecisionX OC software.


NVIDIA Control Panel > Manage 3D Settings > Program Settings > Power management mode > Prefer maximum performance
Quote:


> Originally Posted by *ChaosBlades*
> 
> Can one of you with a 400+ memory overclock that you say is "stable" run the 3DMark Time Spy stress test and tell me if you can pass it... I bet you can't.
> 
> Because I thought mine was stable too, until I ran these 3DMark stress tests and couldn't pass them. I haven't nailed down the actual memory overclock I'm getting yet, but it is going to be nowhere near the number I thought was stable.


I used the Firestrike Ultra stress test to assess stability, and it was very hard on my OC too... but I'm now seeing snowflakes when playing R6S. Guess I'm not stable after all.

This 1080 Armor is proving to be a bad overclocker.


----------



## TWiST2k

Quote:


> Originally Posted by *jorpe*
> 
> Where do I specify that? Ive got a 1080 classy and am using PrecisionX OC software.


Man, I have a 980 Ti Classy and I was pretty happy with it, but the $749 tag on the 1080 Classy turned me off to it; my 1080 FTW will be here tomorrow. What kind of speeds are you getting with the Classy?


----------



## FoamyV

Any idea when the STRIX OC will be back in stock? Do stores get shipments monthly? Should I be on the lookout as we hit August?


----------



## Whitechap3l

Quote:


> Originally Posted by *FoamyV*
> 
> Any idea when the STRIX OC will be back in stock? Do stores get shipments monthly? Should i be on the lookout as we hit August?


Just get the "normal" one...
Non-OC, OC and Advanced editions are just marketing...
You can easily get the non-OC version to the same clocks as the OC version, and you save some money as well.


----------



## ChevChelios

Quote:


> Anyone else having problems getting the GPU boost clock to work? In some games it never even boosts above 1721, which is the base for my card the way I have it clocked.


Yes, in games like HotS and parts of WoW I am at ~1700 MHz instead of my usual out-of-the-box 1900+. Changing from Optimal Power to Max Perf doesn't seem to make a difference; I presume that's because those are old, graphics-light games that can't stress the GPU enough to push 1900 MHz.

In Overwatch and any other semi graphics-intensive game I'm always at 1900+ though.

@*G1 1080 owners*
http://www.gigabyte.com/products/product-page.aspx?pid=5915#bios
Has anyone tried this F2_beta BIOS? Any OC difference? How does it modify the fan curve?


----------



## Reckit

Quote:


> Originally Posted by *Whitechap3l*
> 
> Just get the "normal" one...
> Non-OC, OC and Advanced editions are just marketing...
> You can easily get the non-OC version to the same clocks as the OC version, and you save some money as well.


I wouldn't be too sure. I have a non-OC Strix and I can't clock it past 1900; the boost only hits 1985, not the 2100 people are getting with the OC version. Temps are not the issue, it doesn't go above 65°C when benchmarking.

Guess I lost out a little.


----------



## shalafi

Quote:


> Originally Posted by *boredgunner*
> 
> GTX 1080 has an easy time with 2560 x 1440. Expect 80+ FPS on the highest settings in most games. Haven't tried modded GTA V yet though, but I don't expect much issue.


Try ARK for laughs.


----------



## tin0

Quote:


> Originally Posted by *Reckit*
> 
> I wouldn't be too sure. I have a non-OC Strix and I can't clock it past 1900; the boost only hits 1985, not the 2100 people are getting with the OC version. Temps are not the issue, it doesn't go above 65°C when benchmarking.
> 
> Guess I lost out a little.


Try flashing the OC BIOS to yours and see where it goes


----------



## outofmyheadyo

Can I flash my Gainward Phoenix 1080 with the Gainward Phoenix GLH BIOS? It should be the same card with a different BIOS.


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> I wouldn't be too sure, I have a non oc strix and I can't clock it past 1900, the boost on hits 1985. Not the 2100 people are getting with the oc version. Temps are not the issue, it doesn't go above 65 C when bench marking.
> 
> Guess I lost out a little.


I guess bad luck in the lottery... I'd read so many comments that it is basically the same card. Got mine yesterday and hit 2050, and I only had a few minutes to test. Next week it will be under water and I'll see what I get.


----------



## Whitechap3l

Quote:


> Originally Posted by *tin0*
> 
> Try flashing the OC BIOS to yours and see where it goes


Yeah, that's the next option.

I am pretty sure all three of those cards (OC, Advanced, non-OC) are the same, with minor BIOS changes.


----------



## outofmyheadyo

But the question is: do manufacturers bin their cards or not?


----------



## grimboso

Quote:


> Originally Posted by *Joshwaa*
> 
> Anyone with a 1080 have the Acer X34 or Z35 Predator? I am by no means a monitor aficionado so would like some input before purchasing. Heck I am still using an Asus VG248QE.


I got a X34 to go with my 1080 FTW.

I can honestly say it's the best monitor I have ever had the pleasure of playing on. I've had ROG Swifts, Dell UltraSharps and some Korean monitor; the X34 is really good. I am using the ICC profile and calibration from TFT Central, which works like a charm.

All of the games that I play, I can play on Ultra (without AA) and still keep a steady 100 fps. According to the reviews I've seen that use ultra-wide monitors, almost all games will run at 70-80 fps or more, most at 100+. Can't say that I need to run SLI at the moment, but I might get a second 1080 after Christmas or on Black Friday.


----------



## juniordnz

Quote:


> Originally Posted by *outofmyheadyo*
> 
> But the question is do manufacturers bin their cards or not?


I believe they do. Otherwise MSI wouldn't have four different cards with exactly the same PCB and hardware (Armor, Gaming, Gaming X and Gaming Z), with the Gaming Z coming out of the box with the best clocks.


----------



## GreedyMuffin

The MSI Z didn't achieve very good results IMHO. *Only* 2075 with max voltage.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73116-msi-gtx-1080-gaming-z-8gb-review-18.html

It's not bad, just not very good.


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> The MSI Z didn't achieve very good results IMHO. *Only* 2075 with max voltage.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73116-msi-gtx-1080-gaming-z-8gb-review-18.html
> 
> It's not bad, just not very good.


Exactly what I get with my Armor. Same hardware, same everything, just different cooling, backplate, shiny LEDs...

Not stable though: it opens at 2075 and stabilizes at 2050 or 2025 depending on the game. Lots of VRel and VOp perfcaps, and occasionally Pwr.

I'm pretty disappointed, if you ask me. But maybe I just had too high expectations from the premium-card reviews I read before buying a basic one.


----------



## PasK1234Xw

Quote:


> Originally Posted by *immortalkings*
> 
> do you guys turn OFF the vsync on Nvidia Control Panel? it ends up turning ON after restart when using Gsync. i'm getting Stuttering in some games when getting a below 60FPS when its turned ON.. is there a way to turn it OFF permanently?


Just turn it off per game profile. G-Sync with V-sync on is garbage.


----------



## ChevChelios

Quote:


> Originally Posted by *immortalkings*
> 
> do you guys turn OFF the vsync on Nvidia Control Panel? *it ends up turning ON after restart when using Gsync*. i'm getting Stuttering in some games when getting a below 60FPS when its turned ON.. is there a way to turn it OFF permanently?


I think that's a bug in the current drivers... it turns itself back on.

However, it only affects things when you go above your G-Sync range (whether that's 60, 100 or 144). Falling below 60 means you're still in the G-Sync range, and V-sync isn't affecting anything even if it's on.

But maybe that's just because sub-60 isn't smooth by itself, even with G-Sync.
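The range behaviour described above can be sketched as a toy model (a sketch under assumptions: a fixed 30 Hz lower bound for the panel, and V-sync only engaging once fps exceeds the top of the G-Sync range):

```python
# Toy model: inside the panel's refresh range G-Sync handles pacing and the
# V-sync toggle is irrelevant; the toggle only matters above the range.
def sync_mode(fps: float, gsync_max: float, vsync_on: bool,
              gsync_min: float = 30.0) -> str:
    if gsync_min <= fps <= gsync_max:
        return "g-sync"                      # in range: V-sync setting irrelevant
    if fps > gsync_max:
        return "v-sync cap" if vsync_on else "tearing possible"
    return "below range"                     # sub-minimum: stutter territory

print(sync_mode(55, 100, vsync_on=True))   # -> g-sync (below 60 is still in range)
print(sync_mode(120, 100, vsync_on=True))  # -> v-sync cap
```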


----------



## pez

Quote:


> Originally Posted by *toncij*
> 
> Well, in many games you can't catch 100Hz easily with a single 1080. I wouldn't count on it - unless you're into reducing image details.


That's what I said.
Quote:


> Originally Posted by *immortalkings*
> 
> do you guys turn OFF the vsync on Nvidia Control Panel? it ends up turning ON after restart when using Gsync. i'm getting Stuttering in some games when getting a below 60FPS when its turned ON.. is there a way to turn it OFF permanently?


I actually did notice that V-sync found its way back on after a reboot, so I'm not sure. There are still some things I need to iron out, apparently. Fallout 4 is currently 'tilting' me into a black hole with its optimization: the game dips below 60 frames but is only using 70% max on both GPUs. I'm even noticing stutters in Crysis 3 that I wasn't before.
Quote:


> Originally Posted by *grimboso*
> 
> I got a X34 to go with my 1080 FTW.
> 
> I can honestly say it's the best monitor I have ever had the pleasure of playing on. I've had ROG Swifts, Dell Ultrasharps and some korea monitor. the X34 is really good. I am using the icc-profile and calibration from tftcentral which works like a charm.
> 
> All of the games that I play, I can play in Ultra (without AA) and still keep a steady 100 fps. According to the reviews I've seen that uses Ultra-wide monitors almost all games will run 70-80 fps or more, and most being 100+. Can't say that I need to run SLI at the moment, but might get a 2nd 1080-card after christmas or on black friday.


What's an ICC profile? I saw the TFT Central calibration and think I'm going to try that as well.

Edit:
Google was my friend. I'll give those two things a try later. This new monitor has really shown me how out of the loop I am with this stuff.


----------



## S-Line

My local Microcenter has the EVGA GTX 1080 Classified Gaming ACX 3.0 in stock right now. I'm thinking about pulling the trigger. It would look sick in my rig, which has a black and red theme. Decisions, decisions.


----------



## Reckit

Quote:


> Originally Posted by *tin0*
> 
> Try flashing the OC BIOS to yours and see where it goes


Already done; it doesn't make the slightest bit of difference.


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> Already Done doesn't make the slightest bit of difference


Then I guess it's bad luck.


----------



## Barterlos

I'm here to post about my overclocking experience, based on The Witcher 3. Overclocking the 1080 past 1.9 GHz gives no real benefit, maybe 0.5 fps at most, but it makes the GPU more power hungry for no reason: 20-30 watts more with no substantial performance improvement. I don't know if my conclusion is correct, but I think GP104 works best at 1.9 GHz with a 190/200 W power target.
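As a rough sketch of that efficiency argument: a tiny fps gain for a big power jump tanks perf-per-watt. The wattage deltas come from the post above, but the ~60 fps baseline is an assumption for illustration, not a measurement.

```python
# Compare efficiency at the 190 W sweet spot vs. pushing past 1.9 GHz
# (+0.5 fps for ~30 W more, per the estimates above; 60 fps baseline assumed).
def fps_per_watt(fps: float, watts: float) -> float:
    return fps / watts

at_1900 = fps_per_watt(60.0, 190.0)    # sweet spot
past_1900 = fps_per_watt(60.5, 220.0)  # overclocked past 1.9 GHz
print(f"{at_1900:.3f} vs {past_1900:.3f} fps/W")
```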


----------



## boredgunner

Quote:


> Originally Posted by *shalafi*
> 
> Try ARK for laughs.


lol, does that game still run as badly as it did a year ago? If so then yeah, it'll be a good laugh and a glaring exception to what I said. Last time I checked nothing could run that game smoothly.


----------



## Joshwaa

Is ARK a good game? I am looking for a new game to play. Currently I play World of Tanks, 7 Days to Die, StarCraft, and World of Warships.


----------



## Sazexa

Quote:


> Originally Posted by *Sazexa*
> 
> Hi there everyone. I just recently picked up two 1080's and am having a bit of an issue, and was wondering if anyone else has had a similar experience. I'm running the latest NVIDIA driver as of yesterday (7/27/2016). I think it's 389.89? I don't remember exactly. I'm running an Intel i7-6950X at stock clocks, 4x8 DDR4 (XMP isn't even enabled at the moment), and two GTX 1080 FEs at stock clocks. I tested each card individually and together on another machine and they ran fine.
> 
> The issue is that in certain games I have a massive coloration issue, but it seems to happen ONLY in full screen and ONLY in SLI. There is also screen flicker. The screen gets a very over-saturated look and has a hue shift in colors.
> 
> It will do this (though with seemingly different color shifts at different aspect ratios) at any resolution, as long as SLI is enabled and the game is in full screen. Affected games include Doom and Battlefield 3; I haven't tested many others yet.
> 
> If a game is in full screen and I press alt+tab to open a window in the foreground, the game shifts back to normal colors. But as soon as I click back into the game and it comes up full screen, the colors get all messed up again. Disabling SLI fixes the issue, as does running in a window.
> 
> Not all games experience this. Also, I noticed that when changing the API in Doom from OpenGL 4.5 to Vulkan, it ran flawlessly with SLI in full screen.


I hate to be "that guy" and quote my own post, but there's been a lot of talk and no one responded. Just trying to ask again if anyone has had similar issues.


----------



## KillerBee33

Quote:


> Originally Posted by *Joshwaa*
> 
> Is ARK a good game. I am looking for a new game to play. Currently I play: World of Tanks, 7Days to Die, Starcraft, and World of Warships.


This should be EPIC








http://www.no-mans-sky.com/


----------



## shalafi

Quote:


> Originally Posted by *boredgunner*
> 
> lol does that game still run as badly as it did a year ago? If so then yeah, it'll be a good laugh and a glaring exception to what I said. Last time I checked nothing can run that game smoothly.


980 Ti Hybrid @ 1500 MHz, 1440p with High settings, ALL of the post-processing options disabled (ambient occlusion, etc.): it runs in the 30-40 fps range and sometimes dips below 30. And with ambient occlusion and distance field shadowing off, it looks like crap.


----------



## boredgunner

Quote:


> Originally Posted by *shalafi*
> 
> 980Ti Hybrid @1500Mhz, 1440p with High settings, ALL of the postprocessing options disabled (ambient occlusion, etc.). runs in the 30-40fps range and sometimes dips below 30. And with the ambient occlusion and distance field shadowing off, it looks like crap.


Yeah I think the game is FUBAR to be honest.


----------



## ChevChelios

ARK runs memetically bad


----------



## Jpmboy

Quote:


> Originally Posted by *outofmyheadyo*
> 
> But the question is do manufacturers bin their cards or not?


They bin the GPU core, not the card, and put the better cores in the higher base clock SKUs and/or their "flagship" custom PCB design.


----------



## Al plants Corn

Is the Zotac Amp! Edition worth it for $640? Would it be as quiet as the Gaming X? Much overclock potential?


----------



## fat4l

Quote:


> Originally Posted by *Jpmboy*
> 
> they bin the gpu core, not the card, and put the better cores in the higher base clock SKUs and/or their "flagship" custom PCB design.


I wouldn't really believe this. I've seen ASUS Matrix duds and many others...
What they do is bin for the speed they want to sell it at. If the core can do it, they put it on the card. Beyond that they don't guarantee anything.

So if a core can do 2000 and they want to sell it at 2000 MHz, then it's all good.
You can get two cards that both do a stable 2000 MHz, but one maxes out at 2050 and the other at 2150... that's where the difference is.


----------



## fat4l

Quote:


> Originally Posted by *immortalkings*
> 
> do you guys turn OFF the vsync on Nvidia Control Panel? it ends up turning ON after restart when using Gsync. i'm getting Stuttering in some games when getting a below 60FPS when its turned ON.. is there a way to turn it OFF permanently?


What about using V-sync "Fast" instead of "On"?

That should be much better, but we need more info about this option as it's something new.


----------



## outofmyheadyo

If someone happens to have a Gainward GTX 1080 GLH or GS BIOS, could you upload it here so I can flash it to my regular Phoenix?


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If someone happens to have Gainward GTX 1080 GLH or GS bios could you upload it here so I could flash it to my regular phoenix ?


Is the BIOS you need on here?

https://www.techpowerup.com/


----------



## jase78

Quote:


> Originally Posted by *Reckit*
> 
> I wouldn't be too sure, I have a non oc strix and I can't clock it past 1900, the boost only hits 1985. Not the 2100 people are getting with the oc version. Temps are not the issue, it doesn't go above 65 C when bench marking.
> 
> Guess I lost out a little.


I'm beginning to wonder if Asus quickly checked how far their finished Strix cards could go, named the ones that hit higher clocks "OC", and vice versa, because I've noticed a lot of non-OC owners complaining that they can't even hit 2000.
My Strix OC easily hits 2088.


----------



## juniordnz

I believe that's a double-shot strategy from the companies:

1- They can make cards that suit all needs (budget versus pure performance)
2- They can make use of "bad" and "good" chips in different segments

See how MSI has, like, four different tiers within the same model?


----------



## nexxusty

Quote:


> Originally Posted by *Joshwaa*
> 
> Is ARK a good game. I am looking for a new game to play. Currently I play: World of Tanks, 7Days to Die, Starcraft, and World of Warships.


Quote:


> Originally Posted by *juniordnz*
> 
> I know that feeling...everytime I see afterburner running on my rig I die a little


No. It's the worst coded game ever.

Pathetic even.


----------



## ikjadoon

Quote:


> Originally Posted by *jase78*
> 
> I'm beginning to wonder if Asus quickly checked how far their finished Strix cards could go, named the ones that hit higher clocks "OC", and vice versa, because I've noticed a lot of non-OC owners complaining that they can't even hit 2000.
> My Strix OC easily hits 2088.


I mentioned this in another thread; I don't know why we're beating around the bush. There are only a few possible scenarios:

NVIDIA knowingly gave crappy GPU dies to AIB partners (they say "no")
NVIDIA knows more about overclocking than AIB partners (possibly)
14nm yields are so low/weak that without aggressive binning, AIB partners won't have enough differentiated cards.
The idea that we can buy an average card and have it overclock just as well as a "highly binned" card... that's only true if there is a glut of cards. There used to be so many "strong GPU dies" that EVGA had no choice but to redirect some "as good as Classified" dies to "SC" because more people buy "SC" cards.

That's not true, I think, with the GTX 1080. And I have a sneaking suspicion about #2.


----------



## outofmyheadyo

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> Are the bios you need on here?
> 
> https://www.techpowerup.com/


Nope, they don't have the GLH BIOS, only the GS. I was hoping maybe someone here would have the GLH.


----------



## Jpmboy

Quote:


> Originally Posted by *fat4l*
> 
> I wouldn't really believe this. I've seen Asus Matrix duds and many others...
> What they do is bin for the speed they want to sell at. If the core can do it, they put it on the card. Beyond that they don't guarantee anything.
> 
> So if a core can do 2000 and they want to sell it at 2000 MHz, then it's all good.
> You can get two cards that are both stable at 2000 MHz; one maxes out at 2050, the other at 2150. That's where the difference is.


Just tellin' ya how it's done with SKUs that have different base clocks... it predicts nothing regarding max frequency. Lol, like ASIC.


----------



## uberwootage

Quote:


> Originally Posted by *ikjadoon*
> 
> I mentioned this in another thread; I don't know why we're beating around the bush. There are only a few possible scenarios:
> 
> NVIDIA knowingly gave crappy GPU dies to AIB partners (they say "no")
> NVIDIA knows more about overclocking than AIB partners (possibly)
> 14nm yields are so low/weak that without aggressive binning, AIB partners won't have enough differentiated cards.
> The idea that we can buy an average card and have it overclock just as well as a "highly binned" card... that's only true if there is a glut of cards. There used to be so many "strong GPU dies" that EVGA had no choice but to redirect some "as good as Classified" dies to "SC" because more people buy "SC" cards.
> 
> That's not true, I think, with the GTX 1080. And I have a sneaking suspicion about #2.


They said "highest quality components". They won't call it binning, but they're binning and keeping the best GPUs for themselves. That's why the BIOS voltage is set so low. People would lose their minds if you could put 1.2 V into an FE and get to 2.3-2.35 GHz while all the AIB cards are stuck at lower clocks. They won't call it binning, but they are.


----------



## jase78

Quote:


> Originally Posted by *uberwootage*
> 
> They said "highest quality components". They won't call it binning, but they're binning and keeping the best GPUs for themselves. That's why the BIOS voltage is set so low. People would lose their minds if you could put 1.2 V into an FE and get to 2.3-2.35 GHz while all the AIB cards are stuck at lower clocks. They won't call it binning, but they are.


If this really is true, man, that's some seriously shady business! How would that benefit them in the long run, though? Why would they do that?


----------



## ikjadoon

Quote:


> Originally Posted by *uberwootage*
> 
> They said "highest quality components". They won't call it binning, but they're binning and keeping the best GPUs for themselves. That's why the BIOS voltage is set so low. People would lose their minds if you could put 1.2 V into an FE and get to 2.3-2.35 GHz while all the AIB cards are stuck at lower clocks. They won't call it binning, but they are.


Wait, has that been shown? FE cards are allowed higher voltages than AIB cards? Maybe I missed that.

Man, this thread would've been great if the OP could set up a table for overclocks on the first post.


----------



## uberwootage

Quote:


> Originally Posted by *ikjadoon*
> 
> Wait, has that been shown? FE cards are allowed higher voltages than AIB cards? Maybe I missed that.
> 
> Man, this thread would've been great if the OP could set up a table for overclocks on the first post.


So I said *if* FEs could get more volts. Even at 1.065 V they outclock AIBs at 1.093 V.

Right now I've got a 2.2 GHz stable FE. FEs are coming in at 2.1-2.2 GHz, with most falling in the 2160 range. Just gotta put a better cooler on them. Water-cooled FE vs. water-cooled AIB, the FE will win, unless it's a Galax HOF that can hit 1.3 V stock on the core.

It's been confirmed that FEs clock higher. Even JayzTwoCents said this in his Zotac Amp Extreme review.
Quote:


> Originally Posted by *jase78*
> 
> If this really is true, man, that's some seriously shady business! How would that benefit them in the long run, though? Why would they do that?


Not shady at all. EVGA, Asus and Zotac did not chip in on the development cost, so they get what they're offered. What's shady is EVGA marking up the same GPU on their SC as on the FTW, duping people into thinking they'll get higher clocks. Nvidia made the chips; they can keep what they want and sell what they want. You can't force them to use the bottom-of-the-barrel GPUs on their cards just so the FTW or Extreme on the side of your card means something. I'm sure Nvidia offered them higher-end GPUs, but they said no due to the cost-per-unit increase. The Strix OC is trailing the FEs but still able to get close to or hit 2.1, which is higher than most AIB cards, so maybe Asus paid the premium and got the better GPUs for that card. Still not on the level of the FEs for the most part, but better than most.

I paid a premium for an FE card, close to $740 out the door. That premium over the AIB cards got me a GPU that's stable at 2.2 GHz. It's out now and more than a few people have said it: you want a 1080 that overclocks well? Get an FE and put a water block on it. If they are limiting the voltage to keep clocks low, that's not the main issue with the FE; the card runs hot. Put 1.1 V through it with the stock cooler and you will throttle hard. Nothing shady about it. Everyone knew they would be binning their GPUs for the FEs. Nvidia is not going to sell a GTX 1080 for the life of the series and have it clock like trash. The point of the FE was to make a card that would hold its performance for years... I know they all do that, but they picked their highest-binned GPUs so they could keep the volts down for the card's life so it lasts. They said they used premium components so it will last. People just bought into the "omg 100 power phases and 12 8-pin plugs" hype, when in reality at 2.2 GHz my card is only using around 180 W. My BIOS is modded and the TDP set to 300, but there is no way I will ever come close to that. A lot of people bought into the marketing hype thinking they would get higher overclocks, and they got nothing from those AIB cards.

So far it looks like if you want an AIB card that can out-clock an FE, you will need a Galax HOF, as that is the best GTX 1080 out and will end up being the best.


----------



## dubldwn

Tested my EVGA FE with the EK block last night.

With +13 increments I got to 2100.

At 2113 I started to see a small amount of bouncing off the power limit, pushing me back to 2100.

At 2126 this increased quite a bit. Before that I was VRel-limited.

Using the OG Precision X; not into Precision X OC.


----------



## ikjadoon

Quote:


> Originally Posted by *uberwootage*
> 
> So I said *if* FEs could get more volts. Even at 1.065 V they outclock AIBs at 1.093 V.
> 
> Right now I've got a 2.2 GHz stable FE. FEs are coming in at 2.1-2.2 GHz, with most falling in the 2160 range. Just gotta put a better cooler on them. Water-cooled FE vs. water-cooled AIB, the FE will win, unless it's a Galax HOF that can hit 1.3 V stock on the core.
> 
> It's been confirmed that FEs clock higher. Even JayzTwoCents said this in his Zotac Amp Extreme review.


Ohhh, yes! You're right. Sorry; I don't know what I was reading. Seems like Galax is the only company willing to give up NVIDIA's warranty plan, then.

...hmm... so, is that it, then? Are we calling it? NVIDIA is just binning the GP104 dies and lying about it?

What about that decidedly crazy, but possible, idea: NVIDIA just knows how to make better PCBs and better-overclocking cards?


----------



## GreedyMuffin

Quote:


> Originally Posted by *uberwootage*
> 
> So I said *if* FEs could get more volts. Even at 1.065 V they outclock AIBs at 1.093 V.
> 
> Right now I've got a 2.2 GHz stable FE. FEs are coming in at 2.1-2.2 GHz, with most falling in the 2160 range. Just gotta put a better cooler on them. Water-cooled FE vs. water-cooled AIB, the FE will win, unless it's a Galax HOF that can hit 1.3 V stock on the core.
> 
> It's been confirmed that FEs clock higher. Even JayzTwoCents said this in his Zotac Amp Extreme review.


If I can manage 2100 on air with the FE at stock volts, I believe I can hit 2200 with a voltage increase on water. I'm so looking forward to getting my water block!!


----------



## arrow0309

Hi, I've just ordered an Asus Strix 1080, the "non-OC" version. Pretty high price, ~£712 at PCNation UK; these new 1080 GPUs are very hard to find.
What can you tell me about it? Does it overclock easily enough to reach the "OC" version's clocks?


----------



## dubldwn

Quote:


> Originally Posted by *GreedyMuffin*
> 
> If I can manage 2100 on air with the FE at stock volts. I'm so looking forward to getting my water block!!


Be sure to post what you get. What are you limited by right now?


----------



## GreedyMuffin

Quote:


> Originally Posted by *dubldwn*
> 
> Be sure to post what you get. What are you limited by right now?


Temp, it seems like. Still going up to 73°C or so with the fans running at 100%. The FE cooler is not good, and I suspect the paste job on mine was bad.


----------



## dubldwn

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Temp, it seems like. Still going up to 73°C or so with the fans running at 100%. The FE cooler is not good, and I suspect the paste job on mine was bad.


Have you looked at what %TDP you're at while gaming? Are you at 100% load at 2100?


----------



## GreedyMuffin

Quote:


> Originally Posted by *dubldwn*
> 
> Have you looked at what %TDP you're at while gaming? Are you at 100% load at 2100?


No gaming. It's in the i7-4770 rig (my folding/media PC). I've been folding and stress-testing it. I'll look at TDP when I get it into my main rig.


----------



## mouacyk

Quote:


> Originally Posted by *fat4l*
> 
> What about to use V-sync - "FAST" ? not "on".
> 
> That should be buch better but we need more info about this option as it's something new..


http://www.overclock.net/t/1601321/fast-sync-howto#post_25330158


----------



## frankth3frizz

My Strix OC came yesterday, and today my Dell G-Sync monitor came. So excited









Does anyone know if there's a convenient way to switch Nvidia Surround profiles on and off, like a work mode and a game mode?


----------



## S-Line

Can I join the club?! Picked it up a few hours ago from Microcenter.


----------



## Cool Mike

Picked up my Classified at Microcenter near Atlanta yesterday. Drove 1.5 hours. Good drive though. I am very fortunate, they had two in stock and I immediately jumped on it.

I do a lot of benchmarking and the Classified is my best 1080 yet! Looks like I have at least a top-10% GPU. I have benched 3 other name-brand 1080 cards and could never get over 2025-2040.

Very Happy with my stable overclock. Beautiful card









GPU Boost speed: 2,138
Sustained Boost Speed (15 minutes running Valley at 4K): 2,114
GPU temps never exceed 68C (74F room temp)
Memory speed (Effective): 11,000
My GPUZ boost clock is at 1985
*Settings as follows:*
Power Target maxed at 122%
Temp target at 92C
GPU voltage maxed out at 100%
GPU positive offset at 125
Memory positive offset at 498
Fan Profile: Aggressive (Auto)
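For anyone puzzling over how the +498 memory offset lines up with the "11,000 effective" figure: a rough sketch, assuming the tool applies the offset to the double-data-rate clock (5005 MHz on a stock GTX 1080, which is how Afterburner and Precision report it):

```python
# Back-of-envelope for the GDDR5X numbers quoted above. Assumes the offset is
# applied to the double-data-rate clock as most overclocking tools report it;
# the "effective" figure marketing quotes is twice that again.
STOCK_DDR_CLOCK_MHZ = 5005  # stock GTX 1080 memory clock as most tools show it

def effective_mem_speed(offset_mhz: int) -> int:
    """Return the effective MT/s figure for a given tool offset in MHz."""
    return (STOCK_DDR_CLOCK_MHZ + offset_mhz) * 2

print(effective_mem_speed(498))  # prints 11006, i.e. the "11,000 effective" above
print(effective_mem_speed(0))    # prints 10010, the stock "10,000 effective"
```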


----------



## dubldwn

Wait, so the Classy has a max power target of 122%? Is that 122% of 180 W (~220 W) or 122% of something else?


----------



## juniordnz

I never understood the point of leaving holes in the backplate instead of making it solid and using thermal pads to cool the chips on the back.

Wouldn't it be better that way? Or does air do the job better? I always imagined that using a solid backplate as a heatsink would keep the card cooler.


----------



## mouacyk

Quote:


> Originally Posted by *juniordnz*
> 
> I never understood the point of leaving holes in the backplate instead of making it solid and using thermal pads to cool the chips on the back.
> 
> Wouldn't it be better that way? Or does air do the job better? I always imagined that using a solid backplate as a heatsink would keep the card cooler.


Me neither. I've always liked MSI's full backplates; they protect the entire back of the PCB.


----------



## ROKUGAN

Quote:


> Originally Posted by *Al plants Corn*
> 
> Is the Zotac Amp! Edition worth it for $640? Would it be as quiet as the Gaming X? Much overclock potential?


See my post a few pages back:

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/3930#post_25387243

The AMP starts at 2100, then settles at 2050-2063 MHz fully stable with +150, but temps go over 80C with [email protected]% (replacing the thermal paste shaved off about 3-5C).

The AMP Extreme starts over 2150 (I've seen it at 2186), then settles fully stable at 2114-2126 MHz with +60, with temps under 70C without maxing the fans.

Memory will go up to 11200 in both cases; I keep it at 11000. They basically perform the same, but with a 50 MHz and 15C difference (at 4K Ultra settings and room temp over 30C, not like those 20C-room open benches from the reviews, lol).


----------



## pantsoftime

So whatever happened to that guy with the magical custom BIOS that removed the power limit with no other downsides?

I've been playing a lot with memory overclocking, and I'm wondering if someone who is running +500 memory could try +490 and see if your scores go up. I'm seeing a noticeable improvement in Fire Strike Extreme and the Heavensward bench at +490 (vs. +500). There seem to be highs and lows for the memory, and sometimes it takes 5-10 MHz increments/decrements to really dial in the best performance. I saw some posts about this earlier in the thread, but I'm not sure anyone has really figured out an optimal algorithm yet.
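The dial-in process described above is just a one-dimensional search over offsets. A toy sketch of automating it; `set_mem_offset` and `run_benchmark` are hypothetical stand-ins you would wire to whatever scriptable tool and benchmark you actually use:

```python
# Sketch of the offset-sweep idea above: step the memory offset through a
# range and keep the best-scoring setting. set_mem_offset() and
# run_benchmark() are hypothetical stand-ins for your own tooling.
from typing import Callable, Tuple

def sweep_mem_offset(set_mem_offset: Callable[[int], None],
                     run_benchmark: Callable[[], float],
                     lo: int = 450, hi: int = 550, step: int = 5) -> Tuple[int, float]:
    """Try each offset in [lo, hi] and return (best_offset, best_score)."""
    best_offset, best_score = lo, float("-inf")
    for offset in range(lo, hi + 1, step):
        set_mem_offset(offset)          # apply the candidate offset
        score = run_benchmark()         # run the benchmark and read its score
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score
```

Running a second, finer pass (step of 1-2 MHz) around the winner mirrors the "5-10 MHz increments" approach from the post.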


----------



## ROKUGAN

Quote:


> Originally Posted by *Cool Mike*
> 
> Picked up my Classified at Microcenter near Atlanta yesterday. Drove 1.5 hours. Good drive though. I am very fortunate, they had two in stock and I immediately jumped on it.
> 
> I do a lot of benchmarking and the Classified is my best 1080 yet! Looks like I have at least a top-10% GPU. I have benched 3 other name-brand 1080 cards and could never get over 2025-2040.
> 
> Very Happy with my stable overclock. Beautiful card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU Boost speed: 2,138
> Sustained Boost Speed (15 minutes running Valley at 4K): 2,114
> GPU temps never exceed 68C (74F room temp)
> Memory speed (Effective): 11,000
> My GPUZ boost clock is at 1985
> *Settings as follows:*
> Power Target maxed at 122%
> Temp target at 92C
> GPU voltage maxed out at 100%
> GPU positive offset at 125
> Memory positive offset at 498
> Fan Profile: Aggressive (Auto)


Sexy-looking card, me likes those red accents!









Congrats, you're actually getting pretty much the same results as my Zotac behemoth (the Amp Extreme)


----------



## uberwootage

Quote:


> Originally Posted by *Cool Mike*
> 
> Picked up my Classified at Microcenter near Atlanta yesterday. Drove 1.5 hours. Good drive though. I am very fortunate, they had two in stock and I immediately jumped on it.
> 
> I do a lot of benchmarking and the Classified is my best 1080 yet! Looks like I have at least a top-10% GPU. I have benched 3 other name-brand 1080 cards and could never get over 2025-2040.
> 
> Very Happy with my stable overclock. Beautiful card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU Boost speed: 2,138
> Sustained Boost Speed (15 minutes running Valley at 4K): 2,114
> GPU temps never exceed 68C (74F room temp)
> Memory speed (Effective): 11,000
> My GPUZ boost clock is at 1985
> *Settings as follows:*
> Power Target maxed at 122%
> Temp target at 92C
> GPU voltage maxed out at 100%
> GPU positive offset at 125
> Memory positive offset at 498
> Fan Profile: Aggressive (Auto)


If you can, toss up the BIOS. What's the max voltage in the BIOS? 1.093?


----------



## juniordnz

Anyone experiencing micro white dots (snowflakes) after overclocking when playing Rainbow Six Siege?

My overclock seemed stable after numerous Fire Strike Ultra stress tests with no artifacts, but in R6S I keep getting those annoying snowflakes


----------



## JaredC01

A HUGE thank you to KPCT for his guide over at xDevs ( https://xdevs.com/guide/pascal_oc/ )!

Did the 10 Ohm SMT resistor mod; worked like a charm on my G1 Gaming card! Max power usage in Heaven is only hitting 73% as seen in the screenshots. So far it seems stable at 2139 MHz; gonna try a few more benches and see if I can bump it up any further.

For the record, on the G1 Gaming, the capacitors that need the resistors are in the lower-right corner if you're looking at the GPU die side with the PCI bridge facing you.

No throttling from power usage...


Two of the three resistors added in place...
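For anyone wondering why parallel resistors on the sense lines stop the power throttling: the added resistor forms a divider with the series resistance already in each sense line, so the monitoring IC sees only a fraction of the true shunt voltage and reports a lower power draw. A back-of-envelope sketch; the 20 Ohm series value below is a made-up placeholder, not a measured value from the G1 PCB:

```python
# Back-of-envelope for why a 10 Ohm parallel resistor lowers reported power:
# the sense line's series resistor and the added parallel resistor form a
# voltage divider ahead of the monitoring IC. R_SERIES_HYPOTHETICAL is an
# assumed placeholder, not a measured board value.
R_PARALLEL = 10.0             # the added SMT resistor (Ohms)
R_SERIES_HYPOTHETICAL = 20.0  # assumed series resistor in the sense line (Ohms)

def reported_fraction(r_series: float, r_parallel: float) -> float:
    """Fraction of the real sense voltage the monitoring IC still sees."""
    return r_parallel / (r_series + r_parallel)

frac = reported_fraction(R_SERIES_HYPOTHETICAL, R_PARALLEL)
print(round(frac, 3))  # prints 0.333
# With these assumed values, a true 220 W draw would be reported as roughly
# 220 * frac ~ 73 W worth of "power", which is why the limiter never trips.
```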


----------



## juniordnz

1080 FTW owners, how is it treating you? Does it overclock well? I've seen in some other places that it isn't overclocking too well...

I've got an opportunity to return my Armor and get an FTW for a few bucks more...


----------



## uberwootage

Quote:


> Originally Posted by *JaredC01*
> 
> A HUGE thank you to KPCT for his guide over at xDevs ( https://xdevs.com/guide/pascal_oc/ )!
> 
> Did the 10 Ohm SMT resistor mod, worked like a charm on my G1 Gaming card! Max power usage in Heaven is only hitting 73 as seen in the screenshots. So far seems stable at 2139 MHz, gonna try a few more bench's and see if I can bump it up any further.
> 
> For record, on the G1 Gaming, the capacitors that need the resistors are on the lower right corner, if looking at the GPU die side, with the PCI bridge facing you.
> 
> No throttling from power usage...
> 
> 
> Two of the three resistors added in place...


Wish you luck with that IC. Hope it can handle it. I already explained why that mod will kill your card; it's just a matter of time.

FE modded right, with no resistor botch job.



If you want a card that overclocks, go with an FE and water-cool it. AIBs are way, way too hit-or-miss. Looks like this Classy on here gets great clocks; I've seen another one that can't break 2 GHz. Same with the Strix OC I had that I sold to a friend: the card would not go over 2010 MHz at all. Nothing could get it any higher. I'm just going to start a database for 1080 max clocks: benchmark-stable clocks, game-stable clocks, and max OC. We need to get some numbers in one spot. I can't include my FE because it was hardware-modded last night, so it would skew the results.


----------



## JaredC01

Quote:


> Originally Posted by *uberwootage*
> 
> Wish you luck with that IC. Hope it can handle it. I already explained why that mod will kill your card; it's just a matter of time.
> 
> FE modded right, with no resistor botch job.
> 
> 
> 
> If you want a card that overclocks, go with an FE and water-cool it. AIBs are way, way too hit-or-miss. Looks like this Classy on here gets great clocks; I've seen another one that can't break 2 GHz. Same with the Strix OC I had that I sold to a friend: the card would not go over 2010 MHz at all. Nothing could get it any higher. I'm just going to start a database for 1080 max clocks: benchmark-stable clocks, game-stable clocks, and max OC. We need to get some numbers in one spot. I can't include my FE because it was hardware-modded last night, so it would skew the results.


My 1080 G1 is under water, max temp of 45°C. Where did you mention anything about the mod? I must have missed it...


----------



## Snabeltorsk

Quote:


> Originally Posted by *Cool Mike*
> 
> Picked up my Classified at Microcenter near Atlanta yesterday. Drove 1.5 hours. Good drive though. I am very fortunate, they had two in stock and I immediately jumped on it.
> 
> I do a lot of benchmarking and the Classified is my best 1080 yet! Looks like I have at least a top-10% GPU. I have benched 3 other name-brand 1080 cards and could never get over 2025-2040.
> 
> Very Happy with my stable overclock. Beautiful card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU Boost speed: 2,138
> Sustained Boost Speed (15 minutes running Valley at 4K): 2,114
> GPU temps never exceed 68C (74F room temp)
> Memory speed (Effective): 11,000
> My GPUZ boost clock is at 1985
> *Settings as follows:*
> Power Target maxed at 122%
> Temp target at 92C
> GPU voltage maxed out at 100%
> GPU positive offset at 125
> Memory positive offset at 498
> Fan Profile: Aggressive (Auto)


If you can upload the BIOS here, that would be great.


----------



## Jpmboy

Quote:


> Originally Posted by *uberwootage*
> 
> Wish you luck with that IC. Hope it can handle it. I already explained why that mod will kill your card; it's just a matter of time.
> 
> FE modded right, with no resistor botch job.
> 
> 
> 
> If you want a card that overclocks, go with an FE and water-cool it. AIBs are way, way too hit-or-miss. Looks like this Classy on here gets great clocks; I've seen another one that can't break 2 GHz. Same with the Strix OC I had that I sold to a friend: the card would not go over 2010 MHz at all. Nothing could get it any higher. I'm just going to start a database for 1080 max clocks: benchmark-stable clocks, game-stable clocks, and max OC. We need to get some numbers in one spot. I can't include my FE because it was hardware-modded last night, so it would skew the results.


Run that clock through Time Spy and post the result HERE


----------



## Joshwaa

I have the FTW and like it. I haven't played with the OC that much, but it will do 2147 MHz. The cooler is pretty good. Once I get a water block for it I'll really know how I like it.


----------



## uberwootage

Quote:


> Originally Posted by *juniordnz*
> 
> 1080 FTW owners, how is it treating you? Does it overclock well? I've seen in some other places that it isn't overclocking too well...
> 
> I've got an opportunity to return my Armor and get an FTW for a few bucks more...


Quote:


> Originally Posted by *Jpmboy*
> 
> Run that clock thru Time spy and post the result HERE


I'll run it on Monday; I left it at work. I'm doing some reverse engineering on a circuit in it. Time Spy is stable at 2.2 GHz. Right now I'm just working on some filtering to try to clean up the power a bit more. Should have it finished Monday or Tuesday, then back to some testing. Just building it up so when I get my new Titan X I can get pretty much full price for this 1080 and recover some money.

I will add that there is a way Nvidia can tell if you have put more than 1.25 V through the card. I won't go into details, since people could circumvent it, and it looks like it's there for warranty reasons. Go over 1.25 V and there is no way to reverse it after it trips, other than replacing a part, and they will be able to tell. Unless you are doing a hardware volt mod you do not need to worry about this, as you can't reach that voltage without one. And if you do a volt mod your warranty is gone anyway, so it doesn't really matter.


----------



## JaredC01

Quote:


> Originally Posted by *uberwootage*
> 
> Quote:
> 
> 
> 
> Originally Posted by *juniordnz*
> 
> 1080FTW owners, how is it treating you? Does it overclock well? Seen on some other places that it is not overclocking too well...
> 
> Got an oportunity to return my Armor and get a FTW for a few bucks more...
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> Run that clock thru Time spy and post the result HERE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll run it on Monday. I left it at work. Doing some reverse engineering on a circuit in it. time spy is stable at 2.2ghz. Right now im just working on some filtering to try and clean up the power a bit more. Should have it finished Monday or Tuesday and back to some testing. Just building it up so when i get my new Titan X i can pretty much get full price on this 1080 and get some money back.
> 
> I will add there is a way Nvidia can tell if you have used more then 1.25v on the card. I will not go into details on this as people can circumvent this and it looks like it is placed for warranty issues. Go over 1.25v there is no way to reverse it after it trips other then replacing a part and they will be able to tell. Unless you are doing a hardware volt mod you do not need to worry about this as you are not able to provide that voltage without one. And if you do a volt mod your warranty is gone anyways so it dont really matter.

You still never said how the resistor mod can damage the card, and after looking through your post history I see nothing mentioned about the resistors...


----------



## kx11

So there's a new ROG Strix 1080 called "Advanced"?!


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> My BIOS is modded and the TDP set to 300, but there is no way I will ever come close to that.


Can you please share this modded BIOS? Thanks


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> 1080 FTW owners, how is it treating you? Does it overclock well? I've seen in some other places that it isn't overclocking too well...
> 
> I've got an opportunity to return my Armor and get an FTW for a few bucks more...


I literally just installed my 1080 FTW in my rig. Should I use Afterburner or Precision X? I heard Precision X isn't the best piece of software out there, lol.


----------



## Cool Mike

*Here is the BIOS for the 1080 Classified.* Extracted with GPU-Z. It has *not* been tested; use at your own risk. The obvious: change the extension to .rom

When voltage is set to 100%, GPU-Z and Precision XOC read 1.075 V

EVGAClassified1080.csv 251k .csv file


----------



## TWiST2k

Thanks for the upload, man! I upgraded to my 1080 FTW from a 980 Ti Classified; at that $749 price point, I just couldn't do it this time. Let us know how it performs and clocks, I am curious to see!

I just tried using Precision X and it is such a dismal letdown haha.


----------



## kx11

MSI Afterburner is still the best OC tool out there


----------



## TWiST2k

Thanks man, I did end up installing Afterburner and ditching Inprecision-X, but my 1080 FTW is freezing and crashing when I try to run the Ashes of the Singularity benchmark in DX12, and I'm not sure why. I really hope there isn't anything wrong with my card; I am going to be super bummed.

I've been doing some overclock testing with my 1080 FTW and I am hitting all sorts of limits: VRel, Thrm, Pwr. I am seeing spikes of 127% or so TDP, and I have the core voltage and power limits set to max, so +100 core voltage and 130% power limit. I have the core clock at +52; it starts off strong at 2075 and then over time and temp it drops down to 2050 at 1093 mV and 68C, with the fans on full. This is all according to Afterburner.

I am really hoping we get some BIOS editing tools, because that would be amazing.

I am going to continue to mess with it and see what I can get. When I first installed it, it was running at 2113 with Precision X, but it kept locking up.

Oh, it just dropped while I was writing this to 2037, 1081 mV, still 67-68C.


----------



## ucode

Quote:


> Originally Posted by *uberwootage*
> 
> They said "highest quality components" they won't call it binning but they're binning and keeping the GPUs for then self's. That's why the bios voltage is set so low. People will lose there mind if you could put 1.2v to a fe and get to 2.3-2.35ghz and all the AIB cards are stuck at the lower clocks. They won't call it binning but they are.


FWIW my FE card at 1.2 V will only do about 2.2 GHz; even at that, I'm not sure it's worth it TBH.

@TWiST2k would you post a P1080 for that please, cheers.


----------



## outofmyheadyo

My Gainward GTX 1080 Phoenix with the Phoenix GLH BIOS only seems to boost to 1976 MHz over an extended session; the fan stays at 35% at 68C and 1.0620 V.
Really hoping we get a BIOS editor soon.


----------



## outofmyheadyo

This card is HUGE


----------



## TWiST2k

Quote:


> Originally Posted by *ucode*
> 
> FWIW my FE card at 1.2V will only do about 2.2GHz at that, not sure it's worth it TBH.
> 
> @TWiST2k would you post a P1080 for that please, cheers.


I am clearly out of the loop, man. What is a P1080? haha


----------



## ucode

Never mind. For some reason I had it in my head you were stressing with Furmark.


----------



## GreedyMuffin

My 1080 is now under water.

I didn't even bother to leak test. Used the exact same tubing as before; the only difference is the fittings, which I had to move from the 980 Ti to the 1080.

Drained 50% of the water and refilled it. Booted as always! :-D

Used the FE backplate as well. Looking forward to seeing the temps!


----------



## TWiST2k

Quote:


> Originally Posted by *ucode*
> 
> Never mind. For some reason I had it in my head you were stressing with Furmark.


I was, but I started reading this thread from the beginning and there was a lot of anti-Furmark sentiment, so I removed that part of my reply lol. I can mess with it and send you whatever you're looking for.

I am using 3DMark now and have settled at an actual stable 2062 at +65 core and +546 memory, with the fans at 100% and core and power maxed. I have hit 2113 and sat there for a bit at lower temps; I believe if I had this thing on water I could hit 2113 no issue. Is there any way I could use an AIO cooler of some sort with this FTW card?


----------



## arrow0309

Has anyone flashed the 1080 Strix Oc bios over the regular, "non Oc" 1080 Strix version?


----------



## GreedyMuffin

I feel the GPU is getting hot: 44-45°C when OCed while running Valley. I don't like this. :/

Before I needed an offset of 233 to reach 2100. Now I only need 196 :-D

TDP while running Valley is 95-100%. Perhaps the stock cooler at 100% fan speed drew some power?


----------



## metal409

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I feel the GPU is getting hot: 44-45°C when OCed while running Valley. I don't like this. :/
> 
> Before I needed an offset of 233 to reach 2100. Now I only need 196 :-D
> 
> TDP while running Valley is 95-100%. Perhaps the stock cooler at 100% fan speed drew some power?


Which water block did you go with? Your temps don't seem terrible, but warmer than what I see on my card with an EK block. Could be ambient temp?


----------



## GreedyMuffin

Quote:


> Originally Posted by *metal409*
> 
> Which water block did you go with? Your temps don't seem terrible, but warmer than what I see on my card with an EK block. Could be ambient temp?


Went with the EK one. Might have a little too much thermal paste as I tend to overdo it on GPUs. Ambient is not hot at all, 25-26°C perhaps?

XTX360 (push-pull) + XE240 (push) with Eloops running at 700RPM. D5 pump along with my Evo sup. CPU block for my OCed 5960X.


----------



## GreedyMuffin

Goddammit, it's actually hotter than my 980Ti was. By 1-2°C..

I don't want to remove the block and reapply as I don't have enough coolant. Well, well. It's not that important. It's only a few degrees' difference.


----------



## SauronTheGreat

can someone tell me what is anti aliasing called in GTAV Online ? because i cannot find it in graphics or advanced graphics


----------



## ucode

@TWiST2k no need to go out of your way. I think I have a pretty good idea of where P1080 (Preset 1920x1080 Furmark bench) sits with the GTX 1080 when running with the normal TDP limit: somewhere around 120fps. I removed power limiting on my FE card and managed 133fps with >300W draw. That was a software approach that had some issues with another clock, but it seems cross-flashing the Strix VBIOS posted earlier (hwbot link) can do the same while also allowing a little more voltage, up to 1.2V on this card; however I lose some fan speed, from 4k to 3.6k RPM. Still need to look into it a bit more when time permits.


----------



## libremaster

Joining the club with a MSI 1080 Gaming (Not X)

Will start to see how far it can go. Not having high expectations given what I've been reading here


----------



## boredgunner

So I finally got around to using 3DMark Time Spy stress test and as many of you would expect, I failed but just barely (96.6%, minimum passing grade is 97%). Somewhat surprising since this overclock performed better in every benchmark and game, including minimum frame rate, although I was alerted to instability last night after a display driver crash during *Cryostasis: Sleep of Reason*.

So there it is folks, Cryostasis is still the best game for testing GPU overclocks 7 years later. The tech demo benchmark showed no signs of instability either, only playing the game did (and it crashed in under half an hour of playing).

I'm guessing it's my VRAM. Lowered it to +450 (10908 MHz).


----------



## GreedyMuffin

49°C when gaming.. that sucks.. a lot..


----------



## ShaZam508

Just received my MSI GTX 1080 Gaming X in the mail today! It's been awhile since I've been on these boards but it feels great to be back! I was forced to sell the build in my sig due to financial hardship back in 2012 and I've been gaming on an AlienWare Alpha since last year on a 1080P display. Can't wait to receive the rest of my parts so I can finally play anything I want @ max settings!


----------



## GreedyMuffin

This is embarrassing..

My fans on the XE240 rad are not on at all. Must have disconnected them when I changed the power cable from the PSU to the GPU (from one cable with two 8-pins to one 8-pin).


----------



## uberwootage

Quote:


> Originally Posted by *Cool Mike*
> 
> *Here is the Bios for the 1080 Classified.* Used GPUZ to extract. Has *not* been tested. Use at your own risk. The obvious: Change the extension to .rom
> 
> When voltage is set to 100% GPUZ and Precision XOC reads 1.075V
> 
> EVGAClassified1080.csv 251k .csv file


Ok cool i'll test it out later today.

1.075v is pretty disappointing. That puts it into the middle of the pack. So it looks like Galax and Gigabyte are the only ones pushing the volts to the cards. But I'll test it and see how it runs.


----------



## Madness11

Hey boys







ill join in this club







Card is in very nice condition. I use an aggressive profile (85% fans) and reach a max of 63°C. Overclock starts at 2076; after a few minutes it drops to 2063, and I'm fine with that.


----------



## Gabkicks

My evga GTX 1080 FTW came in this week







It gets to 2012MHz core out of the box, but then downclocks to 1987MHz, maybe because of heat. I haven't tried much overclocking yet.


----------



## pantsoftime

Quote:


> Originally Posted by *Madness11*
> 
> 
> Hey boys
> 
> 
> 
> 
> 
> 
> 
> ill join in this club
> 
> 
> 
> 
> 
> 
> 
> Card is in very nice condition. I use an aggressive profile (85% fans) and reach a max of 63°C. Overclock starts at 2076; after a few minutes it drops to 2063, and I'm fine with that.


Welcome aboard!


----------



## fat4l

Quote:


> Originally Posted by *GreedyMuffin*
> 
> This is embarrassing..
> 
> My fans on the XE240 rad are not on at all. Must have disconnected them when I changed the power cable from the PSU to the GPU (from one cable with two 8-pins to one 8-pin).


haha... so, what are the temps now?
When I used normal paste on my Ares III, I didn't like the temps so I applied liquid metal. Temps went down by 15°C









@uberwootage... I see you don't want to share the TDP-modded bios. What a shame


----------



## SAFX

Looking for benchmarks, *SC* vs *FTW*, could not find anything on youtube, what's FTW performance advantage over SC (stock)? 5%, 10% ??

*UPDATE*
Found this, doesn't benchmark SC, but close enough for my needs:
http://www.babeltechreviews.com/evga-gtx-1080-review/5/


----------



## Jpmboy

Quote:


> Originally Posted by *boredgunner*
> 
> So I finally got around to using 3DMark Time Spy stress test and as many of you would expect, I failed but just barely (96.6%, minimum passing grade is 97%). Somewhat surprising since this overclock performed better in every benchmark and game, including minimum frame rate, although I was alerted to instability last night after a display driver crash during *Cryostasis: Sleep of Reason*.
> So there it is folks, Cryostasis is still the best game for testing GPU overclocks 7 years later. The tech demo benchmark showed no signs of instability either, only playing the game did (and it crashed in under half an hour of playing).
> I'm guessing it's my VRAM. Lowered it to +450 (10908 MHz).


If the clocks are stable in the games you use - they are stable. Time Spy stable may not predict stability in anything but Time Spy. It does use the DX12 API more heavily than most games will.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> This is embarrassing..
> 
> My fans on the XE240 rad are not on at all. Must have disconnected them when I changed the power cable from the PSU to the GPU (from one cable with two 8-pins to one 8-pin).


lol - do you have a water temp probe? the delta between core and coolant is what you need to know regarding the block mount efficiency.








Quote:


> Originally Posted by *fat4l*
> 
> @uberwootage... I see you don't want to share the TDP-modded bios. What a shame


No benches posted, no bios posted... though I don't have a dog in this fight (sold my 1080), this is really looking like:


----------



## zGunBLADEz

Quote:


> Originally Posted by *uberwootage*
> 
> I'll run it on Monday. I left it at work. Doing some reverse engineering on a circuit in it. time spy is stable at 2.2ghz. Right now im just working on some filtering to try and clean up the power a bit more. Should have it finished Monday or Tuesday and back to some testing. Just building it up so when i get my new Titan X i can pretty much get full price on this 1080 and get some money back.
> 
> I will add there is a way Nvidia can tell if you have used more then 1.25v on the card. I will not go into details on this as people can circumvent this and it looks like it is placed for warranty issues. Go over 1.25v there is no way to reverse it after it trips other then replacing a part and they will be able to tell. Unless you are doing a hardware volt mod you do not need to worry about this as you are not able to provide that voltage without one. And if you do a volt mod your warranty is gone anyways so it dont really matter.


^^ don't feed the troll boys XD

If the pros out there haven't managed to do anything, is this guy, who only posts in this topic and nowhere else, going to be a great help? XD

Let him reverse engineer XD


----------



## GreedyMuffin

Need to get a temp probe. Probably a bad install as I've used too much thermal paste, but I thought it didn't matter as I could just squeeze it out the sides.









Running 2126 at 1.043V instead of 2100/2088 at 1.050V. Might do 2150.

Why did you sell your 1080?

I sorta know you, so I guess it's the new Titan XP you'll be getting?









Temp is 40-41'C instead. Will test gaming later.


----------



## sanick

Can any GTX 1080 STRIX owners/enthusiasts confirm that this setup would work... ?

I want to send video to my UHD TV via one HDMI port and have the second HDMI going out to my 7.1 receiver (which I can't pass through as it's not UHD compliant). I'd love to be able to do this without needing to go through a splitter or anything that may degrade the image or add lag.

It looks like the Strix cards are the only 1080's with two HDMI ports?


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> haha .sooo...what are the temps now ?
> When I used normal paste on my ares III,, I didnt like the temps so I aplied liquid metal. Temps went down by 15C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @uberwootage...I see you dont want to share the TDP modded bios. What a shame


Yeah, people keep asking, and I said I did not make the bios. I'm not uploading it without permission from the person who made it. They said no, so it's not being passed out. He made it, not me; it's not mine to pass around.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> Yeah, people keep asking, and I said I did not make the bios. I'm not uploading it without permission from the person who made it. They said no, so it's not being passed out. He made it, not me; it's not mine to pass around.


So tell us who made it so we can get in touch with the person or ask him if you can make it public ....


----------



## Jpmboy

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Need to get a temp probe. Probably a bad install as I've used too much thermal paste, but I thought it didn't matter as I could just squeeze it out the sides.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running 2126 at 1.043V instead of 2100/2088 at 1.050V. Might do 2150.
> 
> Why did you sell your 1080?
> 
> I sorta know you, so I guess it's the new Titan XP you'll be getting?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Temp is 40-41'C instead. Will test gaming later.


Hey bud, I sold the 1080 in advance of the Pascal TitanX. 40% more cuda cores than the 1080 but a lower base frequency. As with the OG Titan and Maxwell TitanX (TXM), the Pascal TitanX (TXP), like the 1080, is going to need a bios editor for it to be as much fun.


----------



## GreedyMuffin

Awesome! Just as I thought.

The Titan XP is going to cost 1800-2000 in Norway. A 1080 is already 900 USD.

Will be awesome to see results!


----------



## ssgwright

Quote:


> Originally Posted by *Jpmboy*
> 
> No benches posted, no bios posted... tho I don;t have a dog in this fight (sold my 1080), this is really looking like:


lol thinking the same thing


----------



## uberwootage

Quote:


> Originally Posted by *fat4l*
> 
> So tell us who made it so we can get in touch with the person or ask him if you can make it public ....


Absolutely not. Not after the people who can't seem to grasp the concept of waiting on here. They spam my inbox with "give me it" after I said I'm not going to until he tells me I can. So I can only guess that he will get 10x the amount of spam I have gotten over it, and I'm not going to do that to him and make him mad. If he wants to release it he will; if not, someone else will end up figuring it out and doing so.

Also, I couldn't care less about the people saying "Prove it, post it up". People have seen the screenshots of the TDP. I said when I get my card back (it's at my work, I've been doing some stuff with it and need the equipment there) I'll post some. Haters gonna hate that TDP at 60% at 2.2GHz.


----------



## toncij

Quote:


> Originally Posted by *SAFX*
> 
> Looking for benchmarks, *SC* vs *FTW*, could not find anything on youtube, what's FTW performance advantage over SC (stock)? 5%, 10% ??
> 
> *UPDATE*
> Found this, doesn't benchmark SC, but close enough for my needs:
> http://www.babeltechreviews.com/evga-gtx-1080-review/5/


Judging by most tests and then FireStrike Ultra, a TitanX @ 1.5GHz at 5050 points is very close to most 1080s at ~5400. The 1080 is barely 7% faster, which is a bit "meh".

Not sure why Nvidia decided to sell the new TitanX in only a few select markets where they have their web shop.


----------



## metal409

Quote:


> Originally Posted by *uberwootage*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> So tell us who made it so we can get in touch with the person or ask him if you can make it public ....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Absolutely not. Not after the people who can't seem to grasp the concept of waiting on here. They spam my inbox with "give me it" after I said I'm not going to until he tells me I can. So I can only guess that he will get 10x the amount of spam I have gotten over it, and I'm not going to do that to him and make him mad. If he wants to release it he will; if not, someone else will end up figuring it out and doing so.
> 
> Also, I couldn't care less about the people saying "Prove it, post it up". People have seen the screenshots of the TDP. I said when I get my card back (it's at my work, I've been doing some stuff with it and need the equipment there) I'll post some. Haters gonna hate that TDP at 60% at 2.2GHz.
Click to expand...

As one of the people who sent you a single PM inquiring about the bios and asking to have my name passed along to the creator, I thought I was being reasonably polite. I guess not.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> Need to get a temp probe. Probably a bad install as I've used too much thermal paste, but I thought it didn't matter as I could just squeeze it out the sides.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running 2126 at 1.043V instead of 2100/2088 at 1.050V. Might do 2150.
> 
> Why did you sell your 1080?
> 
> I sorta know you, so I guess it's the new Titan XP you'll be getting?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Temp is 40-41'C instead. Will test gaming later.


Temps seem much better. I normally see between 35-37 while gaming, so not far off from you. My 980Ti also ran a couple degrees cooler than this 1080, which was also on an EK block.


----------



## fat4l

Quote:


> Originally Posted by *uberwootage*
> 
> Absolutely not. Not after the people who can't seem to grasp the concept of waiting on here. They spam my inbox with "give me it" after I said I'm not going to until he tells me I can. So I can only guess that he will get 10x the amount of spam I have gotten over it, and I'm not going to do that to him and make him mad. If he wants to release it he will; if not, someone else will end up figuring it out and doing so.
> 
> Also, I couldn't care less about the people saying "Prove it, post it up". People have seen the screenshots of the TDP. I said when I get my card back (it's at my work, I've been doing some stuff with it and need the equipment there) I'll post some. Haters gonna hate that TDP at 60% at 2.2GHz.


Haters ? ugh... I think you are in the wrong thread then...








Quote:


> Originally Posted by *metal409*
> 
> As one of the people who sent you a single PM inquiring about the bios and asking to have my name passed along to the creator, I thought I was being reasonably polite. I guess not.


<= Pretty much this.

You @uberwootage are acting like it's Area 51 and no one can know about anything. I thought this forum was about spreading love and helping each other, but I guess not...
And if it's really that top secret, then there's a more polite way of saying so, rather than "haters gonna hate".
Yes, you are the winner sir, cuz you have the modded bios, you are waaaay above us and can laugh at all of us haters.. L0L


----------



## Jpmboy

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Awesome! Just as I thought.
> 
> The Titan XP is going to cost 1800-2000 in Norway. A 1080 is alreadt 900USD.
> 
> Will be awesome to see results!


yeah - gonna try to grab two on Tuesday.


----------



## GreedyMuffin

Quote:


> Originally Posted by *metal409*
> 
> As one of the people who sent you a single PM inquiring about the bios and asking to have my name passed along to the creator, I thought I was being reasonably polite. I guess not.
> Temps seem much better. I normally see between 35-37 while gaming, so not far off from you. My 980Ti also ran a couple degrees cooler than this 1080, which was also on an EK block.


35-37°C?!?!

Do you have some sort of chiller or something?

Yeah, perhaps the smaller die of the 1080 is making it harder to cool? I'm just thinking out loud.

My temps are a bit high, 44°C max after folding at 2152 for some hours. Will play some R6S to check for further stability and temps.


----------



## SAFX

Quote:


> Originally Posted by *toncij*
> 
> Judging by most tests and then FireStrike Ultra, TitanX @ 1,5GHz being at 5050 points is very close to most 1080s at 5400ish. 1080 is barely 7% faster, which is a bit "meh".
> 
> Not sure why Nvidia decided not to sell the new TitanX but in only few select markets where they have their web shop.


I see your point, thx for the insight


----------



## kx11

finally got them white DDR4 sticks by GALAX


----------



## fat4l

Quote:


> Originally Posted by *kx11*
> 
> finally got them white DDR4 sticks by GALAX


wow they looookkkkkk nice !!!!!


----------



## metal409

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *metal409*
> 
> As one of the people who sent you a single PM inquiring about the bios and asking to have my name passed along to the creator, I thought I was being reasonably polite. I guess not.
> Temps seem much better. I normally see between 35-37 while gaming, so not far off from you. My 980Ti also ran a couple degrees cooler than this 1080, which was also on an EK block.
> 
> 
> 
> 35-37°C?!?!
> 
> Do you have some sort of chiller or something?
> 
> Yeah, perhaps the smaller die of the 1080 is making it harder to cool? I'm just thinking out loud.
> 
> My temps are a bit high, 44°C max after folding at 2152 for some hours. Will play some R6S to check for further stability and temps.
Click to expand...

No, no chiller. Just run dual 360 radiators with gentle typhoon ap-45 fans. But my home office is in the basement and the ambient temp is roughly 21-22C.


----------



## MrTOOSHORT

kx11, sweet set up!









didn't know Galax had custom ram like that.

thanks for posting.


----------



## xer0h0ur

Quote:


> Originally Posted by *GreedyMuffin*
> 
> 35-37°C?!?!
> 
> Do you have some sort of chiller or something?
> 
> Yeah, perhaps the smaller die of the 1080 is making it harder to cool? I'm just thinking out loud.
> 
> My temps are a bit high, 44°C max after folding at 2152 for some hours. Will play some R6S to check for further stability and temps.


Man you guys are spoiled on load temps. I see low to mid 40's and grin. Coming from a guy who had fire breathing dragons before this 1080. More commonly known as the 295X2 and 290X.


----------



## xer0h0ur

I don't get the hostility towards uberwootage, his hands are tied. He can talk about it but can't distribute the BIOS. *shrug* Being a whiny child about that doesn't change anything. Although if he can't distribute it he probably would have been better off never having mentioned anything about it to begin with.


----------



## toncij

Quote:


> Originally Posted by *kx11*
> 
> finally got them white DDR4 sticks by GALAX


That's some hot ram design there!

Quote:


> Originally Posted by *metal409*
> 
> No, no chiller. Just run dual 360 radiators with gentle typhoon ap-45 fans. But my home office is in the basement and the ambient temp is roughly 21-22C.


Dual 360?







Worth it?

Also, what tubing are you using?

I'm thinking of moving from an H115i on both the 1080 and 5960X to a custom loop with an EK 360 kit. Not sure if I'll gain anything, but while the TitanX was at about 65°C at 1.5GHz loaded (+112mV), my 5960X at 4.5 jumps to 80°C... which I find disturbingly hot for an AIO.


----------



## kx11

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> kx11, sweet set up!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> didn't know Galax had custom ram like that.
> 
> thanks for posting.


I forgot to mention it's the 4000MHz model; I'll push this up as far as I can


----------



## toncij

Quote:


> Originally Posted by *kx11*
> 
> i forgot to mention it's the 4000mhz model , i'll knock this up as far as i can


Link to a model?


----------



## kx11

Quote:


> Originally Posted by *toncij*
> 
> Link to a model?


from a news site
http://www.thinkcomputers.org/galax-shows-off-4000-mhz-hall-of-fame-ddr4-memory/

if you want to buy them , they're not cheap
http://www.buytome.com/goods/item/detail/id/534924602129


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Man you guys are spoiled on load temps. I see low to mid 40's and grin. Coming from a guy who had fire breathing dragons before this 1080. More commonly known as the 295X2 and 290X.


I can relate, .... still got the dragon in my rig, at least until next Tuesday when FedEx arrives


----------



## zGunBLADEz

This card runs pretty cool on water

Heaven 4k everything max


I have an EK Supremacy universal VGA block on two Alphacool 240 rads with 1200RPM fans on them. Ambient is 75°F.


----------



## arrow0309

Quote:


> Originally Posted by *kx11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *toncij*
> 
> Link to a model?
> 
> 
> 
> from a news site
> http://www.thinkcomputers.org/galax-shows-off-4000-mhz-hall-of-fame-ddr4-memory/
> 
> if you want to buy them , they're not cheap
> http://www.buytome.com/goods/item/detail/id/534924602129
Click to expand...

Nice rig, congrats!

Maybe you should buy these wb for your 1080 HoF's (sooner or later)









http://www.buytome.com/goods/item/detail/id/535405699030


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *kx11*
> 
> finally got them white DDR4 sticks by GALAX


sweeeeeeeeet


----------



## kx11

Quote:


> Originally Posted by *arrow0309*
> 
> Nice rig, congrats!
> 
> Maybe you should buy these wb for your 1080 HoF's (sooner or later)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.buytome.com/goods/item/detail/id/535405699030


maybe later , they do need them in benchmarks


----------



## GreedyMuffin

I did let my room cool down. 37°C when folding.









Weird thing is that under 40°C the card is only at 103x-104x mV instead of the full 1050mV.


----------



## Antsu

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I did let my room cool down. 37°C when folding.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Weird thing is that under 40°C the card is only at 103x-104x mV instead of the full 1050mV.


Are you sure the 1050mV step runs +13MHz higher than the lower steps? If yes, try bumping up the core a notch.
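For background on that stepping: Pascal boost clocks move in roughly 13MHz bins, and each point on the voltage/frequency curve snaps to one of those bins. A toy sketch of the quantization (the base clock and step here are illustrative assumptions, not values read from any card):

```python
# Toy model of Pascal's boost bins: requested clocks snap down to ~13 MHz steps.
# BASE_MHZ and STEP_MHZ are illustrative assumptions, not values from a real card.
BASE_MHZ = 1607   # hypothetical base clock
STEP_MHZ = 13     # approximate boost-bin granularity on Pascal

def snap_to_bin(requested_mhz: int) -> int:
    """Round a requested clock down to the nearest bin at or above base."""
    bins = (requested_mhz - BASE_MHZ) // STEP_MHZ
    return BASE_MHZ + bins * STEP_MHZ

# An offset that doesn't cross a bin boundary lands on the same clock,
# which is why nudging the core "a notch" at one voltage point can matter.
print(snap_to_bin(2100))  # → 2088
print(snap_to_bin(2101))  # → 2101
```

In this toy model a "2100" request actually runs at 2088, much like the 2100/2088 pairs people report from monitoring tools earlier in the thread.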


----------



## wangle0485

Count me in!

Something strange happened today though: the bios somehow reverted to reference clocks, 1607/2500, and would only go back to Super Jetstream clocks after flashing the bios. Has anyone else had something like this happen?


----------



## KillerBee33

Does anyone know what nvflash to use to flash my FE bios back?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KillerBee33*
> 
> Does anyone know what nvflash to use to flash my FE bios back?


Go here:
*
http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x*
Quote:


> New in version 5.287: Pascal Support. They changed a lot in the code and I had to find a new bypass point. You will see the following errors while flashing (that have been highlighted). You can ignore them and continue as normal.


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Go here:
> *
> http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x*


What comands do i use? Tried same as usual
cd c:/nvflash
nvflash gp104.rom DID NOT WORK


----------



## wangle0485

nvflash -6 your.rom


----------



## MrTOOSHORT

Put the nvflash folder (with your rom inside) in the C: directory.

Then open cmd and type "cd c:\nvflash", press enter.

Then "nvflash -6 xxxx.rom", press enter.

Type "y" to continue when prompted to flash the card.

After it's done, type "exit", then reboot.


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Put the nvflash folder (with your rom inside) in the C: directory.
> 
> Then open cmd and type "cd c:\nvflash", press enter.
> 
> Then "nvflash -6 xxxx.rom", press enter.
> 
> Type "y" to continue when prompted to flash the card.
> 
> After it's done, type "exit", then reboot.



No go


----------



## ssgwright

Quote:


> Originally Posted by *KillerBee33*
> 
> 
> No go


first type "nvflash --protectoff"


----------



## KillerBee33

Quote:


> Originally Posted by *ssgwright*
> 
> first type "nvflash --protectoff"


cd c:/nvflash
nvflash --protectoff gp104.rom ?


----------



## fat4l

Guys, can you list the best 120mm AIOs compatible with the 1080?
My friend is looking for one and would like something thicker than the EVGA AIO.
Btw, is the EVGA 980Ti AIO compatible with the 1080?
Thx


----------



## MrTOOSHORT

Just nvflash --protectoff

then follow what I wrote earlier.


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Just nvflash --protectoff
> 
> then follow what I wrote earlier.


You got a full command? This is confusing... where and when does the "nvflash --protectoff" go?


----------



## MrTOOSHORT

cd c:\nvflash, press enter

nvflash --protectoff, press enter

nvflash -6 xxx.rom, press enter

enter y when prompted.

type exit and reboot.
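Putting that whole sequence together, the cmd session looks roughly like this (a sketch only; `backup_gp104.rom` and `your_new_bios.rom` are placeholder names, and dumping the current BIOS first with `--save` gives you something to flash back if it goes wrong):

```shell
:: Run cmd as Administrator, with nvflash extracted to C:\nvflash
cd /d C:\nvflash

:: Dump the BIOS currently on the card before touching anything
nvflash --save backup_gp104.rom

:: Disable EEPROM write protection, then flash with cert checks bypassed (-6)
nvflash --protectoff
nvflash -6 your_new_bios.rom

:: Answer "y" when prompted, ignore the highlighted cert errors, then reboot
```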


----------



## toncij

Quote:


> Originally Posted by *uberwootage*
> 
> They said "highest quality components"; they won't call it binning, but they're binning and keeping the GPUs for themselves. That's why the bios voltage is set so low. People would lose their minds if you could put 1.2v to an FE and get to 2.3-2.35GHz while all the AIB cards are stuck at lower clocks. They won't call it binning but they are.


But it's so hard. Tried 4 FEs and some AIBs. None could touch 2.2 and none was worth switching from [email protected]

That's a bit disappointing.

I'm getting FTWs next week. Will try how these do.


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> cd c:\nvflash, press enter
> 
> nvflash --protectoff, press enter
> 
> nvflash -6 xxx.rom, press enter
> 
> enter y when prompted.
> 
> type exit and reboot.


Same thing


----------



## MrTOOSHORT

Copy the two files in this nvflash folder to the nvflash folder in C:

nvflash.zip 1167k .zip file


now try it again.


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Copy the two files in this nvflash folder to the nvflash folder in C:
> 
> nvflash.zip 1167k .zip file
> 
> 
> now try it again.


Flashed, but no changes at all. I used this thing to begin with, and now the fan on my FE reads "0" and the stock clock is 1709

GTX_1080_GAMING_X_8G_602-V336-03S_vbios_v1.1.zip 3119k .zip file


----------



## ssgwright

ya, TooShort got you... your nvflash needs to be updated


----------



## MrTOOSHORT

I used the nvflash files from the OP of this thread to flash an MSI Seahawk with about 5 different BIOSes without issue. I'm on Windows 10.
*
http://forum.hwbot.org/showthread.php?t=159025*

It's in the 1080strixXOC.rar file.


----------



## KillerBee33

Quote:


> Originally Posted by *ssgwright*
> 
> ya, TooShort got you... your nvflash needs to be updated


First I flashed the FE with this

GTX_1080_GAMING_X_8G_602-V336-03S_vbios_v1.1.zip 3119k .zip file

Subvendor changed from NVIDIA to MSI, and no matter what I do it just won't flash back

Heh, the right image is what the FE came like


----------



## TWiST2k

Does anyone know if the Corsair HG10 or something similar can be modded to fit on the Evga 1080 FTW cards? I am not quite prepared to go custom loop, just another expensive hobby I am not ready to commit to, but I do own a couple corsair coolers and could put them to good use.


----------



## Cool Mike

Looking forward to hearing back from you after you test the Classified bios


----------



## Jpmboy

Quote:


> Originally Posted by *kx11*
> 
> finally got them white DDR4 sticks by GALAX
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *kx11*
> 
> from a news site
> http://www.thinkcomputers.org/galax-shows-off-4000-mhz-hall-of-fame-ddr4-memory/
> 
> if you want to buy them , they're not cheap
> http://www.buytome.com/goods/item/detail/id/534924602129


Beautiful! So how are those 2 z170 kits working on the x99? What speed and timings are you running? Can you post up some info *HERE*


----------



## ucode

Quote:


> Originally Posted by *toncij*
> 
> But it's so hard. Tried 4 FEs and some AIBs. None could touch 2.2 and none was worth switching from [email protected]
> 
> That's a bit disappointing.
> 
> I'm getting FTWs next week. Will try how these do.


FWIW 2.2 looks doable on my FE, but it would need better-than-stock cooling, and an extra 40% in dynamic power for a 10% increase in clocks seems so... except for benches.


----------



## fat4l

Quote:


> Originally Posted by *ucode*
> 
> FWIW 2.2 looks doable on my FE but would need to improve stock cooling and an extra 40% in dynamic power for 10% increase in clocks seems so... except for benches.


1.19v ? hmmm how ?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Antsu*
> 
> Are you sure the 1050mV stepping has a step +13Mhz higher than lower steps? if yes, try bumping up the core a notch.


Correct. Now when I woke up the temp was on 42¨C instead and the voltage was 1043mv instead of 103xmv. I can't keep it under 40¤'C when gaming, at least not with my 5960X running along it and my fans at an low RPM. But as long as the core dosen't clock down I'm happy.

2151 mhz has folded for the last 6-7 hours (1-2 hours gaming before that, and 3-4 hours of folding before that again). So I'm pretty confident that my card is stable on stock voltage with those speeds.

I'm very, very happy with 2151 on stock volts!


----------



## ucode

@fat4l As mentioned in post #4080, by cross-flashing the VBIOS linked by overclocker Dancop, also linked earlier in this forum. But don't worry, you don't have to go digging back too far: MrTOOSHORT re-posted the link just a few posts back, #4153


----------



## toncij

Quote:


> Originally Posted by *ucode*
> 
> FWIW 2.2 looks doable on my FE but would need to improve stock cooling and an extra 40% in dynamic power for 10% increase in clocks seems so... except for benches.


This looks much higher than 2.2


----------



## toncij

So is it general knowledge now that FEs are better chips than AIBs?


----------



## ucode

Quote:


> Originally Posted by *toncij*
> 
> This looks much higher than 2.2


Unlike 2.2, that's nowhere near stable with load; just for show


----------



## TWiST2k

Quote:


> Originally Posted by *ucode*
> 
> @TWiST2k no need to go out of your way. I think I got a pretty good idea of where P1080 (Preset 1920x1080 furmark bench) sits with the GTX-1080 when running with normal TDP limit. Somewhere around 120fps. I removed power limiting on my FE card and managed 133fps with >300W draw. It was a software approach that had some issues with another clock but seems cross flashing strix VBIOS posted earlier (hwbot link) can do the same while also allowing for a little more voltage, up to 1.2V on this card, however I lose some fan speed from 4k to 3.6k. Still need to look into it a bit more when time permits.


Sorry for the delay, here you go bro!


----------



## NBAasDOGG

Hello gentlemen (and women),

I want to share my results from the past 5 weeks for those still doubting which 1080 to buy. I purchased a total of three GTX 1080s (AMP!, FTW and Xtreme Gaming) and tested all of them in detail, so it's time to share the most important results. Let's start with GPU temperatures.

Temperatures, with 100% fan speed during a full 3DMark Fire Strike Extreme run (room temp 27°C):

-Zotac GTX1080 AMP!: hits 79-81°C
-Gigabyte GTX1080 Xtreme Gaming: hits 55°C
-EVGA GTX1080 FTW: hits 64°C

Noise at 100% fan speed:
- The Zotac at 100% fan speed is clearly the quietest card I have ever heard. It's almost like the FTW and Xtreme Gaming at 40%. The sad part is that the low fan speed, even at 100%, always resulted in high temps. EVEN HIGHER THAN THE FE!!
- I think we can all agree that the Xtreme Gaming heatsink and fans are the most advanced on the market. Can you believe that this GPU is very quiet and does not even hit 50°C during gaming (100% fan speed)? If you want a quiet, high-performance cooling solution, the Xtreme Gaming is the best choice. This thing delivers water-cooling performance from a heatsink and fans. Very impressive, isn't it?








- The FTW is good, but not the best cooling on the market. It hits around 60°C in games, and its fans are the noisiest of the three. This card always requires a custom fan curve, since the auto fan will not go higher than ~40-50%, resulting in high temps!

Overclocking: all of these cards can maintain a solid 2000MHz out of the box if the fan speed is set to 100%. However, the Zotac does dip to around 1950MHz when temps reach 75°C. The Xtreme Gaming is one of the best and maintains a solid 2038MHz out of the box for 3+ hours of gaming.
I concluded a few important things during overclocking that I think most of you have not realized yet. Max overclocking using the volt/clock curve gives less performance than the slider. In other words, 2100MHz via the curve performs worse than 2100MHz via the slider! I see many people in this forum showing off their awesome high clocks (2150MHz+) using the curve, but did you even compare that with your max stable slider OC? Go figure.








I'll give you an example of the curve/slider comparison. With 100% voltage and 130% power, I overclocked the 1080 FTW to a solid 2177MHz by increasing the slider to +100 and finishing off with the curve to hit 2177MHz (it starts at 2200 and dips to a solid 2177). Using Heaven 4.0, I compared the results to 2114MHz using only the slider (+110 on the slider).

- +100/curve to 2177Mhz : 118.1 fps
- +110 slider to 2114Mhz : 120.7 fps
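The gap is easy to sanity-check: despite a 63MHz higher core clock, the curve run scores lower, so its frames-per-MHz efficiency is worse too. A tiny sketch using just the two Heaven numbers quoted above (the efficiency metric is my own crude one, not anything the benchmark reports):

```python
def fps_per_mhz(fps, mhz):
    """Crude efficiency metric: frames per second per core MHz."""
    return fps / mhz

# The two Heaven 4.0 results quoted above:
curve = fps_per_mhz(118.1, 2177)   # +100 offset finished with the curve
slider = fps_per_mhz(120.7, 2114)  # +110 on the slider only

# The slider run wins outright AND per MHz, which hints that the curve
# run is silently throttling between monitoring samples.
print(f"curve : {curve:.5f} fps/MHz")
print(f"slider: {slider:.5f} fps/MHz")
assert slider > curve
```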

In fact, these results are also reproducible in Firestrike and Valley. So I finally decided NOT to use the curve, since the slider usually results in higher performance gains. The max overclocks of the three cards are shown below. Max overclock means the max stable clock that settles for a long period of time. (So please stop sharing your max peak if you cannot maintain it.)

-Zotac GTX1080 AMP!: 2038MHz core / 11200MHz mem
-Gigabyte GTX1080 Xtreme Gaming: 2063MHz core / 11400MHz mem
-EVGA GTX1080 FTW: 2126MHz core / 10900MHz mem

For some reason, the Zotac was one of the worst overclockers. This card is huge, but it failed instantly when overclocked to 2050MHz. I'm not sure, but I think temperature is a drawback for the Zotac, since this thing hits 78°C during gaming with 100% fan speed. On the other hand, I think the silicon lottery plays an important role in overclocking, since the Xtreme Gaming also sucked at overclocking. I expected much more than 2063MHz from the Gigabyte, since its temps were usually below 50°C during gaming. Fortunately, the FTW was a great overclocker, hitting a solid 2126MHz and settling there forever when temps were below 65°C. The FTW also performed the best in Firestrike, Heaven and Valley when overclocked.
Again, all overclocks were achieved using the sliders with 100% voltage and max power (120% Zotac, 150% Gigabyte and 130% EVGA).

Some pics to share















*(Personal opinion: please stop sharing your max peak OC or curve results, since the scores are always lower than those of a lower-clocked slider OC)*


----------



## pantsoftime

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hello gentelman (and woman),
> *(Personal opinion: Please stop sharing your max peak OC or using curve since scores are always lower than lower clocked slider OC)*


From what I've seen, it's not the curve that causes this, it's the voltage/power limits. If you push too hard the card hits the limits earlier, resulting in power or vrel throttling. This isn't always apparent in monitoring applications' clock-speed charts either, since it's happening multiple times per second. For example, I have seen on my watercooled FE that keeping the voltage slider lower will frequently result in better scores with both the custom curve and the slider. Studying the behavior, the card boosts to higher voltage bins earlier and ends up hitting the power limit sooner. Striking a balance can be very difficult because the power limits on these cards are just plain too low.

When you were doing your testing were you watching for throttling activity by looking at the power limit or voltage limit graphs in something like afterburner? This is an important factor in maximizing your scores.
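If you log your monitoring data to a file, you can also tally this offline instead of eyeballing the graphs. A rough sketch of the idea, assuming you've parsed each sample into a (clock, perfcap-reason) pair; the `'pwr'`/`'vrel'`/`'none'` labels here are hypothetical stand-ins for whatever your monitoring tool actually exports:

```python
from collections import Counter

def throttle_summary(samples):
    """Tally perfcap reasons across monitoring samples.

    `samples` is a list of (clock_mhz, reason) tuples; returns the
    fraction of samples spent in each throttle state.
    """
    counts = Counter(reason for _, reason in samples)
    total = len(samples)
    return {reason: n / total for reason, n in counts.items()}

# Hypothetical 1-second samples from a benchmark run:
log = [(2177, 'pwr'), (2164, 'pwr'), (2177, 'none'),
       (2164, 'vrel'), (2177, 'pwr'), (2177, 'none')]
share = throttle_summary(log)
print(share)
```

If a large share of samples report a power perfcap, you're power-limited no matter how high the peak clock reads.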


----------



## fat4l

Quote:


> Originally Posted by *ucode*
> 
> @fat4l As mentioned in post #4080 by cross flashing VBIOS linked by overclocker Dancop and also linked earlier on in this forum. But don't worry you don't have to go digging back too far as MrTOOSHORT re-posted the link just a few posts back #4153


oooo hi. Well, people are already saying that this BIOS lowers your performance. The MHz and voltage numbers look high, but the real performance goes down. Try to compare.


----------



## Bdonedge

My 1080 has been giving me nothing but problems lately.

Yesterday I noticed my GPU was running at 99% in a game with a core clock of 1015MHz. I thought to myself "***? That's not right at all", and noticed the GPU power % was only at 44% (my temps were fine, below 64°C).

How can a GPU run at 100% load while only drawing 44% of its power? That makes absolutely zero sense to me.
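For what it's worth, the power % reading is relative to the board's TDP, so it does translate into a sensible wattage even when the core is stuck at a low clock. A back-of-the-envelope conversion (assuming the 180W reference TDP of a GTX 1080; substitute your own card's board power):

```python
def board_power_watts(power_pct, tdp_watts=180.0):
    """Convert a 'GPU power %' reading into watts of board power."""
    return power_pct / 100.0 * tdp_watts

# 44% power on a 180 W board is only ~79 W, which is consistent with
# the core being stuck in a low boost state rather than a sensor bug.
draw = board_power_watts(44)
print(f"{draw:.0f} W")
```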


----------



## pantsoftime

Quote:


> Originally Posted by *Bdonedge*
> 
> My 1080 has been giving me nothing but problems lately
> 
> Yesterday i noticed my gpu was running at 99% in a game with a core clock of 1015mhz - I thought to myself "***? That's not right at all" And noticed the GPU power % was only at 44%. (my temps were fine - below 64)
> 
> How can a GPU run at 100% while only taking in 44% of power? That makes absolutely zero sense to me.


That's happened to me a few times as well. There's something that can happen that causes the drivers to not trigger the boost routine properly. Sometimes running in fullscreen mode can help, or rebooting. I haven't figured out a great way to fix it myself either, but it definitely comes and goes depending on some sort of conditions that I haven't quite put a finger on yet.


----------



## ssgwright

Same here... before I game I always check first to ensure it's boosting properly. If not, I reboot and it's fine again.


----------



## Bdonedge

Quote:


> Originally Posted by *pantsoftime*
> 
> That's happened to me a few times as well. There's something that can happen that causes the drivers to not trigger the boost routine properly. Sometimes running in fullscreen mode can help, or rebooting. I haven't figured out a great way to fix it myself either, but it definitely comes and goes depending on some sort of conditions that I haven't quite put a finger on yet.


That's interesting to hear. Out of curiosity, what version of the 1080 do you have?

I have the Gigabyte G1.

I rebooted and it was fine, but I'll be honest, at first I thought the thing was crapping out on me.


----------



## Bdonedge

Quote:


> Originally Posted by *ssgwright*
> 
> same here... before I game I always check first to ensure it's boosting properly. If not I reboot and it's fine again.


That's insane; I hadn't heard anyone else mention it before. Glad I brought it up. What kind of 1080 do you have?
I think it causes inconsistency in the efficiency % of my 3DMark stress tests.


----------



## outofmyheadyo

If the Afterburner voltage adjustment doesn't work, are there any other ways to pump up the volts? My card seems to run at 1.0500V max (Gainward Phoenix).


----------



## pantsoftime

Quote:


> Originally Posted by *Bdonedge*
> 
> That's interesting to hear. Out of curiosity what Version of the 1080 do you have?
> 
> I have the Gigabyte G1.
> 
> I rebooted and it was fine - but I'll be honest at first I thought the thing was crapping out on me.


I have an NVIDIA branded FE. I think it's most likely a driver issue. I figured I'd give it a couple more driver revs before I make too much noise about it. Now that I'm thinking about it I actually haven't seen it happen in the past week or so. It used to happen to me when I was using Precision XOC for overclocking. Now I'm using afterburner. I wonder if that's a factor... do you use XOC?


----------



## ssgwright

I have a Zotac FE... I think it may (possibly) be an overclock issue. I know that with my 980 Ti and Titan X, if the driver crashed and recovered, sometimes the card wouldn't boost properly afterward. So it's possible that if you push the card too much, the driver recovers after a crash and then doesn't boost properly again until a restart? I don't know, I'm just throwing a theory out there.


----------



## juniordnz

Temp throttling is huge on those cards! Even when GPU-Z is not showing a thermal perfcap, the card downclocks itself as the temp goes up. It seems the mid-50s is the temperature to aim for. Watercooling has never made so much sense as on these Pascal chips.

I wonder if this throttle can be bypassed via BIOS modding, though, or whether it should be read as a sign that these chips are more sensitive to heat and we should be more careful with them.
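The behavior commonly reported for GPU Boost 3.0 is exactly this: the card sheds roughly one ~13MHz clock bin every few degrees above a fairly low threshold, long before any thermal perfcap flag appears. A toy model of that binning (the 13MHz/5°C/40°C constants are rough community estimates, not an NVIDIA spec):

```python
def boost_clock_estimate(base_boost_mhz, temp_c,
                         threshold_c=40.0, bin_mhz=13.0, step_c=5.0):
    """Estimate the effective boost clock under GPU Boost 3.0-style
    temperature binning: drop one ~13 MHz bin per ~5 degC above ~40 degC.
    All constants are rough community estimates, not an NVIDIA spec."""
    if temp_c <= threshold_c:
        return base_boost_mhz
    bins_lost = int((temp_c - threshold_c) // step_c) + 1
    return base_boost_mhz - bins_lost * bin_mhz

# A card that boosts to 2088 MHz when cool sits several bins lower
# in the high 60s, by this model:
print(boost_clock_estimate(2088, 38))  # cool: full boost
print(boost_clock_estimate(2088, 68))  # warm: several bins lower
```

Which is why keeping the core in the mid-50s or below on water tends to buy a few bins back even with identical settings.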


----------



## Bdonedge

Quote:


> Originally Posted by *pantsoftime*
> 
> I have an NVIDIA branded FE. I think it's most likely a driver issue. I figured I'd give it a couple more driver revs before I make too much noise about it. Now that I'm thinking about it I actually haven't seen it happen in the past week or so. It used to happen to me when I was using Precision XOC for overclocking. Now I'm using afterburner. I wonder if that's a factor... do you use XOC?


I'm actually not overclocking since I upgraded to the 1080; I don't even have any overclocking software installed. Gigabyte recommends their utility, but it gave me some BSODs, so I don't trust it.


----------



## Bdonedge

Quote:


> Originally Posted by *ssgwright*
> 
> I have a Zotac FE... I think it may be an overclock issue (possibly) I know that with my 980ti and TitanX if the driver crashes and recovered sometimes the card wouldn't boot properly afterward. So it's possible that if you push the card too much maybe the driver recovers after a crash and then doesn't boost again (properly) until a restart? I don't know I'm just throwing a theory out there.


It's very possible. The drivers have had a bunch of issues that took a while to surface, so maybe it's "mini" crashing or something. I don't even know what that means, but what you're saying makes sense.


----------



## Cool Mike

A correction from an earlier post: after spending some quality time with my EVGA Classified 1080, max core voltage is 1.095V. I initially said it was 1.075V.

After spending some time on my overclock: core 2139MHz and 11080MHz (eff.) on the memory, with 2114MHz sustained for 1 hour running Black Ops III at 4K maxed settings. Since this is at 4K, slightly better results can be had at 1440p or 1080p. Temps were in the 65-68°C range. Looks like I got a top-10% overclocker. Glad I waited for the Classified.


----------



## duckweedpb7

Been testing the FTW today. It seems my best Firestrike results are with the core at +85 (2101MHz) and the memory at 11000MHz. Played BF4 for a few hours and it hovered between 2088 and 2101MHz depending on temperatures. Overall not bad.


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> A correction from an earlier post. After spending some quality time with my EVGA Classified 1080, max core voltage is 1.095. I initially said it was 1.075.
> 
> After spending some time on my overclock: Core 1139 MHz and 11080 (Eff.) on the memory. 1114 Sustained for 1 hour running Black OPs III at 4K maxed settings. This is at 4K, so slightly better results can be had at 1440 or 1080. Temps were in the 65-68C range. Looks like I got a top 10% over clocker. Glad I waited for the Classified.


the Classified has an EVBOT port... that's how you can increase voltage on the card.


----------



## Kriant

Question: will there ever be flexible HB bridges?
I've paired up a 1080 FE with a hydro cooler kit (from EVGA) and a 1080 FTW, and after getting the HB bridge, there's no way I can connect the cards with it due to the height difference. I'm using a 144Hz 1440p monitor, so I'm not sure whether I'd see great benefits from the HB bridge, but right now all I can use is flexible bridges.


----------



## kx11

I can confirm two 1080s can run GTA V at 4K/85fps with maxed settings, FXAA off + MSAA x2.


----------



## Cool Mike

Wish I could utilize the EVBOT port. I understand the box is difficult to find, and expensive if you do find one?


----------



## Cool Mike

Duck, you have a nice FTW overclocker there. I owned one for a few days, and the best sustained speed I could achieve was around 2040MHz on the core.


----------



## TK421

Quote:


> Originally Posted by *Jpmboy*
> 
> the Classified has an EVBOT port... that's how you can increase voltage on the card.


but with Pascal, increasing the voltage is not so useful unless you're running below-ambient cooling, right?

On another note, I have a Zotac 1080 AMP edition repasted with Thermal Grizzly Kryonaut; it seems to hit 78°C at around 220W power draw. Normal or not? The card is set at +60/+500 with the power/temp limits maxed, but the voltage offset is left at 0%...

Just wondering why this card is running really hot.


----------



## shadow85

Guys, what should I do? I need a bit more power for 4K gaming on max details. Currently I have 2x EVGA GTX 980 Ti Hybrids. Should I:

a) Sell them both and grab 2x GTX 1080

or

b) Keep them both and buy a 3rd GTX 980 Ti Hybrid. (I haven't had any major issues with 2way SLi yet)

If I do option a) I need to spend about $1000-1200 AUD after selling both of the Hybrids to get 2x GTX 1080,
if I do option b) I only need to spend another ~$750 AUD to get a new Hybrid.


----------



## toncij

With two 980 Tis you're looking at a 25% speed increase at best if you move to 1080s. Since 980 Tis are worth next to nothing at the moment, you won't be able to cover even a single 1080 with the sale. I don't think it's worth paying $900 to get 25% more performance.


----------



## ChevChelios

Quote:


> Originally Posted by *shadow85*
> 
> guys, what should I do, I need a bit more power for 4K gaming on max details. Currently I have 2x EVGA GTX 980 Ti Hybrids, should I
> 
> a) Sell them both and grab 2x GTX 1080
> 
> or
> 
> b) Keep them both and buy a 3rd GTX 980 Ti Hybrid. (I haven't had any major issues with 2way SLi yet)
> 
> If I do option a) I need to spend about $1000-1200 AUD after selling both of the Hybrids to get 2x GTX 1080,
> if I do option b) I only need to spend another ~$750 AUD to get a new Hybrid.


*never touch 3-way SLI*

There is no cheap way to get more power for you at this point.

2x 1080 will be faster than 2x 980 Ti, but not by much, and it will cost to upgrade.

2x new Titan X will be much faster, but cost $2400+ (not counting selling the 980 Tis).

I'd say wait for a 1080 Ti at sub-$1000, or later, Volta GV104/GV102.


----------



## KickAssCop

What is the size of SLi bridge that comes with Gigabyte Xtreme Gaming 1080?
40, 60 or something else.


----------



## karlahoin

I'll be joining tomorrow with a couple of 1080s. Sort of... It has been a really long wait after selling my 980 TI (for a very good amount) two months ago. Never expected the 1080s to be so difficult to get. Having no GPU/IGP, it was not an easy wait.

I had 7 orders at the same time at a point and still managed to reach 2 months. Amazon UK/DE alone had 5 and two months after, had shipped nothing. When I found an AMP Extreme in stock at Overclockers UK, I immediately went for it, and then cancelled all the other orders. Ironically, at about the same time, one of the stores shipped one of the orders, so tomorrow kinda by accident I'll be receiving:

-1 x Palit GameRock 1080 (standard, not the Premium), with G-Panel (from Mindfactory in Germany)
-1 x Zotac AMP Extreme 1080 (from Overclockers UK)

I really liked my Zotac 980 TI so I'm going to keep the Extreme unless someone thinks that is a tremendously bad idea. The Palit seems very interesting, I like the large 3-slot cooler and they did well in reviews, fairly cool and fast, double BIOS, etc. The G-Panel hardware is also interesting, but I can definitely live without it.

The Palit is about 30 Eur cheaper, but I suspect it will also retain less value when sold used; plus it has a 2-year warranty, while the Zotac has 5 years (at least on paper).

It would be nice to test both cards, and there is a 2 week window for that, but I prefer to just ship one of them back sealed/unopened so that someone else might enjoy gaming with untouched gear.

So... open the Zotac? The Palit? Flip coin?


----------



## gerbil80

Quote:


> Originally Posted by *karlahoin*
> 
> I'll be joining tomorrow with a couple of 1080s. Sort of... It has been a really long wait after selling my 980 TI (for a very good amount) two months ago. Never expected the 1080s to be so difficult to get. Having no GPU/IGP, it was not an easy wait.
> 
> I had 7 orders at the same time at a point and still managed to reach 2 months. Amazon UK/DE alone had 5 and two months after, had shipped nothing. When I found an AMP Extreme in stock at Overclockers UK, I immediately went for it, and then cancelled all the other orders. Ironically, at about the same time, one of the stores shipped one of the orders, so tomorrow I'll be receiving:
> 
> -1 x Palit GameRock 1080 (standard, not the Premium), with G-Panel (from Mindfactory in Germany)
> -1 x Zotac AMP Extreme 1080 (from Overclockers UK)
> 
> I really liked my Zotac 980 TI so I'm going to keep the Extreme unless someone thinks that is a tremendously bad idea. The Palit seems very interesting, I like the large 3-slot cooler and they did well in reviews, fairly cool and fast, double BIOS, etc. The G-Panel hardware is also interesting, but I can definitely live without it. The Palit is about 30 Eur cheaper, but I suspect it would also retain less value while used, plus it has 2 year warranty, while Zotac has 5 years (at least on paper).
> 
> It would be nice to test both cards, and there is a 2 week window for that, but I prefer to just ship one of them back sealed/unopened so that someone else might enjoy gaming with untouched gear.
> 
> So... open the Zotac? The Palit? Flip coin?


I believe the Zotac AMP Extreme is the more desirable one if you're staying on air. Plus you're getting a 5-year warranty as opposed to 2 years, which is a nice plus.


----------



## Antsu

Quote:


> Originally Posted by *gerbil80*
> 
> I believe that the Zotac Amp Extreme is the more desirable if staying on air. Plus your getting 5 years warranty as opposed to 2 years which is a nice plus.


Wasn't there talk about Zotac reaching almost 80C at 100%? Or was it another model?


----------



## NBAasDOGG

Quote:


> Originally Posted by *KickAssCop*
> 
> What is the size of SLi bridge that comes with Gigabyte Xtreme Gaming 1080?
> 40, 60 or something else.


60, one slot between the cards


----------



## NBAasDOGG

Quote:


> Originally Posted by *Antsu*
> 
> Wasn't there talk about Zotac reaching almost 80C at 100%? Or was it another model?


I posted that indeed. It's the Zotac AMP! hitting 80C at 100% fan. Zotac AMP Extreme is fine.


----------



## karlahoin

Quote:


> Originally Posted by *Antsu*
> 
> Wasn't there talk about Zotac reaching almost 80C at 100%? Or was it another model?


That is the Zotac 1080 AMP (two fans, 300mm). The 1080 AMP Extreme uses a larger cooler with three fans (325mm), and so far I haven't seen any reports of overheating. It seems to have temperatures similar to other cards.


----------



## toncij

With a 980 Ti you're in for an expensive move up the ladder...
I'm still rethinking whether moving from a Titan X to a 1080 gave me anything at all...


----------



## karlahoin

Quote:


> Originally Posted by *toncij*
> 
> With a 980Ti you're in for an expensive move up the ladder...
> Im still rethinking if moving from Titan X to 1080 gave me anything at all...


Yeah and not even considering just SLI upgrades.

I don't think I would have sold my Zotac 980 Ti if I had known how things would turn out. It was pretty fast at 1.5GHz. But over two months ago the 1080 seemed interesting and I expected them to overclock better. With the tiny overclocks, the distance to the 1080 will be minimal, definitely not worth 250-300 Eur of upgrade; it would have been a mere 100 or so if the 1080s were at MSRP. I didn't expect them to OC so low and sell so high.









I did have some issues with noise and heat however, so at least that should improve.


----------



## toncij

Quote:


> Originally Posted by *karlahoin*
> 
> Yeah and not even considering just SLI upgrades.
> 
> I don't think I would have sold my Zotac 980TI if I knew how things would turn out. It was pretty fast at 1.5 GHz. But over two months ago the 1080 seemed interesting and I expected them to overclock better. With the tiny overclocks, the distance to the 1080 will be minimal, definitely not worth 250-300 Eur of upgrade. Would have been a mere 100 or so if the 1080s were at MSRP. Didn't expect them to OC so low and sell so high.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did have some issues with noise and heat however, so at least that should improve.


I haven't sold my Titan Xes; both are still happily running at 1465MHz, very cool (under 65°C) with an AIO (H115i) on each.
The distance to the 1080 FTWs is a mere 6%. I'm not sure I could gain anything from the FTWs under water, since those chips hit an upper limit before sub-zero.

I'm thinking of just selling the FTWs instead, since I can actually use the 12GB.

I can't buy a new Titan X because Nvidia doesn't have a shop for all European countries, only a few.


----------



## juniordnz

I'm currently thinking about returning my MSI 1080 Armor for an EVGA 1080 FTW. What do you guys think about it?

I've been thinking: if we're probably not going to have the same flexibility Maxwell BIOS Tweaker gave us in terms of overclocking, what's the point of investing in cards with "higher overclock capability" if we're all stuck at the same 2000-2100MHz on air cooling?

Bear in mind that it would not be a "20 buck upgrade", since these cards cost like 1200 USD here in Brazil.

Thanks in advance


----------



## karelbastos

Quote:


> Originally Posted by *juniordnz*
> 
> Currently thinking about returning my MSI 1080 Armor for a EVGA 1080 FTW, what do you guys think about it?
> 
> I've been thinking, if we're probably not going to have the same flexibility maxwell bios tweaker gave us in terms of overclocking, what's the point of investing in cards with "higher overclock capability" if we're all stuck in the same 2000-2100mhz with aircooling?
> 
> Bear in mind that it would no be a "20 buck upgrade" since these cards here in Brazil costs like 1200usd.
> 
> Thanks in advance


Hi Junior,

I'm from Brazil too. Sao Paulo.

I have two Zotac FE 1080s in SLI.

I was able to reach a stable 2025MHz at 100% fan, max temp 67-70°C.

I bought two EVGA Hybrid 980 Ti kits, just to watercool the two GPUs and see if I can reach a higher stable overclock.

How much (BRL R$) did you pay for your card?


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> Wished I could utilize the evbot port. I understand the box is difficult to find and if you do find one, expensive?


Yeah, expensive, but only because they are rare. They can be found for sale at times.
Quote:


> Originally Posted by *TK421*
> 
> but with pascal, increase voltage not so useful if not running lower than ambient cooling right?
> On the other note, I got a zotac 1080 AMP edition repasted with Grizzly Kryonaut, seems to hit 78c on around 220w power draw, normal or not? The card is set at +60/+500 with power/temp limit maxed, but voltage offset left at 0%....
> Just wondering why this card is running really hot


Can't really say that at this point. The EVBOT (with the firmware update) will provide access to more than just NVVDD (core voltage).


----------



## juniordnz

Quote:


> Originally Posted by *karelbastos*
> 
> Hi Junior
> 
> I'm from brasil too. Sao Paulo.
> 
> I have two ZOTAC FE 1080.. SLI
> 
> And i was able to reach stable 2025 mhz on FAN 100%, max temp 67 - 70 C
> 
> I brought two EVGA hybrid 980TI kit, just to watercooler the two GPU and see if i can reach any more stable overclock.
> 
> How much BRL R$ you paid on your vga ?


I paid 3759 BRL (12x 313 on the credit card). It was very cheap considering prices here, wasn't it? I couldn't find any other 1080 at that price (12x sem juros).

Where did you buy those EVGA Hybrid kits? Can you PM me that info? I was thinking about hooking up an H110i on mine to keep temps low. (Temp throttling is huge on these cards.)


----------



## tin0

I've still got a Corsair H80 lying around, so I just ordered the Corsair Hydro HG10 bracket and will mod it to fit my reference 1080. Let's see what lower temps do for this baby.


----------



## karelbastos

Quote:


> Originally Posted by *juniordnz*
> 
> I paid 3759brl (12x313 credit card). It was very cheap considering prices here, wasn't it? I couldn't find any other 1080 with that price (12x sem juros).
> 
> Where did you bought those EVGA Hybrid kits? Can you PM me that info? Was currently thinking about hooking up a H110i on mine to keep temps low. (Temp throttle is huge in these cards)


I bought the two kits on Amazon:

https://www.amazon.com/gp/product/B00ZQ4PFX2/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

I paid 66 USD each last week (not the 99 USD shown).

These EVGA kits work on the 1080 FE too.

Now I will wait for them to arrive and see the temps go down on my SLI...

I paid 3500 BRL for my Zotac 1080 FE here in Brazil.

I bought these EVGA Hybrid kits because I saw other users running them in this thread:

http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


----------



## TK421

Is 78°C normal for a GTX 1080 using 230W of power?

I am getting 78°C on my GTX 1080 AMP at 120% power and a 92°C thermal target, with no added voltage. This is with max fans and Kryonaut thermal paste.
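One way to judge whether 78°C at ~230W is reasonable is to compute the cooler's effective thermal resistance: degrees above ambient per watt dissipated. A rough sketch, assuming a ~28°C room (as reported elsewhere in the thread; real numbers vary a lot with case airflow):

```python
def thermal_resistance(gpu_temp_c, ambient_c, power_w):
    """Effective cooler thermal resistance in degC per watt:
    temperature rise over ambient divided by power dissipated."""
    return (gpu_temp_c - ambient_c) / power_w

# 78 degC at 230 W in a 28 degC room:
r = thermal_resistance(78, 28, 230)
print(f"{r:.3f} degC/W")
assert 0.1 < r < 0.4  # roughly the range of a decent air cooler
```

So the cooler is doing about 0.2°C/W, which is ordinary air-cooler territory; at 230W there simply isn't much headroom below 78°C without lowering ambient or moving to water.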


----------



## raidflex

Received my 1080 FTW; it's much smoother than my 780 Ti SLI setup. Frametimes are very consistent and my FPS is no worse, and generally better. It seems to boost to right under 2GHz on the core automatically. I haven't tried overclocking it yet; I'm going to wait for the EK water block to come out first.


----------



## juniordnz

Quote:


> Originally Posted by *tin0*
> 
> Still got a Corsair H80 laying around so I just ordered the Corsair Hydro HG10 bracket and will mod it to fit my reference 1080. Let's see what lower temps do with this baby


Very interested in those results, please reply back when it's done. Good luck with the mod!








Quote:


> Originally Posted by *karelbastos*
> 
> I brought two kits on AMAZON
> https://www.amazon.com/gp/product/B00ZQ4PFX2/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
> I paid last week 66 USD each, ( NOT PRICE SHOW 99 USD EACH )
> These evga KITS works on 1080 FE too.
> Now i will wait for it arrive and see the temps going down on my SLI...
> I PAID 3500 R$ BRL on my ZOTAC 1080 FE here on brasil.
> I brought this EVGA hybrid kit because i saw other users using on this post
> http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


Shipping directly to Brazil? How much BRL in final price?
Quote:


> Originally Posted by *TK421*
> 
> is 78c normal for GTX1080 using 230w of power?
> 
> I am getting 78c on my GTX1080 AMP on 120% power and 92c thermal target, no added voltage. This is on max fans and using Kryonaut thermal paste.


Unfortunately, yes. Especially if you live in a warm place. I get those temps while benchmarking in a 28°C room with 100% fan speed.


----------



## TK421

Quote:


> Originally Posted by *juniordnz*
> 
> Very interested in those results, please reply back when it's done. Good luck with the mod!
> 
> 
> 
> 
> 
> 
> 
> 
> Shipping directly to Brazil? How much BRL in final price?
> Unfortunatly, yes. Expecally if you live in a warm place. I get those temps while benchmarking in 28ºC room temp with 100% fan speed.


goddamn









My AIO cooler's legs aren't long enough to reach the screws without applying too much pressure on the core.


----------



## karelbastos

Quote:


> Originally Posted by *juniordnz*
> 
> Very interested in those results, please reply back when it's done. Good luck with the mod!
> 
> 
> 
> 
> 
> 
> 
> 
> Shipping directly to Brazil? How much BRL in final price?
> Unfortunatly, yes. Expecally if you live in a warm place. I get those temps while benchmarking in 28ºC room temp with 100% fan speed.


Not directly.

The package is received in NY and then shipped to me by DHL. Maybe I'll receive the package by the weekend.

But Amazon ships directly to Brazil too.










Now I'm thinking about how I will fit three closed-loop water coolers in my case ^^

1 - CPU: Intel water cooler BXTS13X
2 - GPU: EVGA Hybrid
3 - GPU: EVGA Hybrid

lol, maybe I will leave my case open ^^


----------



## aylan1196

Hi, for anyone thinking of the EVGA Hybrid, here are pics of my build:
Case: Phanteks Evolv tempered glass
CPU cooler: H360X2 Prestige
Two EVGA 1080 FEs with Hybrid coolers
The main problem I faced was fitting all of those in this small case, but I managed to do it. Now for the pics:

Max temps: 45°C
Max overclock: from 1980MHz on the stock cooler to a stable 2102MHz


----------



## KillerBee33

Quote:


> Originally Posted by *aylan1196*
> 
> hi for any one thinking of evga hybrid here are pics I'll post of my build :
> Case phanteks evolve tempered glass
> CPU cooler h360x2 prestige
> Two evga 1080 fe with hybrid cooler
> The main problems I faced is fitting all those in this small case
> 
> 
> 
> 
> 
> 
> 
> but I managed to do it now for the pics :
> 
> Max temps 45
> Max overclock from 1980 on stock cooler to 2102 stable


Do the Hybrid kits come with LEDs?


----------



## aylan1196

Yes, they come with LEDs.


----------



## KillerBee33

Quote:


> Originally Posted by *aylan1196*
> 
> Yes they come with LED


There's nothing about that on EVGA's site; mine is coming tomorrow.


----------



## aylan1196

You will have to fit the adapter from the EVGA shroud to the GPU in the white slot. The other adapters are for the pump and the fan, but I plugged the fan into my mobo and swapped the EVGA fan for a Noctua 120mm 2000 RPM.


----------



## KillerBee33

Quote:


> Originally Posted by *aylan1196*
> 
> You will have to fit the adapter from the evga shroud to the gpu in the white slot the other adapters are for the pump and the fan but I plugged the fan to my mobo and changed from evga fan to noctua 120 2000 rpm


Had a single Thermaltake Riing 120 on a 980 with the EVGA Hybrid kit; it worked fine @ 1200RPM.


----------



## KickAssCop

Has anyone purchased the Gigabyte Xtreme Gaming Waterforce 1080 cards? I just ordered two after cancelling my Strix. Any reviews? I can't find any using a simple Google search. Thanks.


----------



## TK421

Be the first one to rip the cooler off the card and show us the insides and PCB.


----------



## arrow0309

Quote:


> Originally Posted by *KickAssCop*
> 
> Anyone purchased the Gigabyte Xtreme WaterForce Gaming 1080 cards? I just ordered two after cancelling my STRIX. Any reviews? Can't find any with a simple Google search. Thanks.


May I ask why you cancelled the Strix?


----------



## jase78

I'm pretty satisfied with my Strix 1080, other than problems with their ****ty GPU Tweak II causing constant errors in Windows. It generally starts out around 1125 or so and ends up at 1088 after gaming for a while. Temp-wise, mid 60s.


----------



## escalibur

Those temps....


----------



## Maligx

I installed a Kuhler 620 on my 1080 FTW. With the stock heatsink/fans of the ACX 3.0 I couldn't do +100 stable, so the best I could do was around 2050 with temps around 74C testing Overwatch, and 64C idle temps. After installing my 620, idle temps are around 37C and I was able to hit +100, so far at 2100+ with load temps of around 50C. The bracket I used was from some guy named dwood back when I used the same Kuhler on my 680.

I do have a question though: was I correct in keeping the backplate and frontplate on the board? I also have a fan attached to the bracket blowing air onto the board. Should I be good to go? Do I need to worry about the VRAM overheating? I'm not even sure how to check the VRAM temps. So yeah, so far it's worth replacing the stock fan/HS with an AIO solution.


----------



## juniordnz

Got a new personal record at firestrike. Never got 25k graphics before!

Stock/OC:
http://www.3dmark.com/compare/fs/9502784/fs/9598917#

Wanna know how?


Spoiler: Warning: Spoiler!







LOL

The thermal throttle is STRONG in this one...


----------



## karelbastos

Brazilian way










^^


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Got a new personal record at firestrike. Never got 25k graphics before!
> 
> Stock/OC:
> http://www.3dmark.com/compare/fs/9502784/fs/9598917#
> 
> Wanna know how?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> LOL
> 
> The thermal throttle is STRONG in this one...


Flashed my FE with MSI's GamingX OC BIOS and can't get 25,000 anymore; with the FE stock BIOS I recorded 25,278:
http://www.3dmark.com/3dm/13638712


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Flashed my FE with MSI's GamingX OC BIOS and can't get 25,000 anymore; with the FE stock BIOS I recorded 25,278:
> http://www.3dmark.com/3dm/13638712


That's nice! I'll try the FE BIOS when I have the time. Just got back to my stock Armor BIOS. It seems it performs better than the GamingX; maybe it's just my impression, idk.

Nice OC btw!


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> That's nice! I'll try the FE BIOS when I have the time. Just got back to my stock Armor BIOS. It seems it performs better than the GamingX; maybe it's just my impression, idk.
> 
> Nice OC btw!


Gonna try one more thing and then RMA that FE to MSI. After the flash, the subvendor changed to MSI and it won't flash to anything else at all; haven't seen such a thing with a BIOS before.








Talked to MSI support, email & phone; those dudes don't seem very bright.


----------



## criminal

Just picked one of these up: https://www.amazon.com/gp/product/B01IIGVL3W/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

Cheapest I have ever seen for a full-cover water block, so I wanted to share.


----------



## jase78

Quote:


> Originally Posted by *juniordnz*
> 
> Got a new personal record at firestrike. Never got 25k graphics before!
> 
> Stock/OC:
> http://www.3dmark.com/compare/fs/9502784/fs/9598917#
> 
> Wanna know how?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> LOL
> 
> The thermal throttle is STRONG in this one...


Like your methods. Lol. I can get to 24997 or some crap. Will try your method tonight.


----------



## arrow0309

Quote:


> Originally Posted by *jase78*
> 
> I'm pretty satisfied with my Strix 1080, other than problems with their ****ty GPU Tweak II causing constant errors in Windows. It generally starts out around 1125 or so and ends up at 1088 after gaming for a while. Temp-wise, mid 60s.


I'm afraid I didn't understand what kind of problems you had.
Nice temps, however; I really don't get how they were getting such high temps (78) in the Russian YouTube review








Are you using the default fan speed for those temps or a modified fan curve?
Also wanted to know: is there any seal on the back of the Strix?


----------



## Jpmboy

Quote:


> Originally Posted by *criminal*
> 
> Just picked one of these up: https://www.amazon.com/gp/product/B01IIGVL3W/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> Cheapest I have ever seen for a full-cover water block, so I wanted to share.


Good find!!


----------



## kx11

Quote:


> Originally Posted by *KickAssCop*
> 
> What is the size of SLi bridge that comes with Gigabyte Xtreme Gaming 1080?
> 40, 60 or something else.


3 slots so maybe 60


----------



## Jpmboy

Quote:


> Originally Posted by *kx11*
> 
> 3 slots so maybe 60


each slot is 40.6mm.
https://www.ekwb.com/shop/ek-fc-terminal-triple-serial-plexi


----------



## Goroshi

Hey guys new here,

Just bought a 1080 FE today, as there wasn't much other choice in my local store and everything pointed towards the FE being "binned".
I've been playing around with it a bit and it seems I have managed to get it stable at 2038, playing GTA V for a couple of hours with temps at around 72C average with an aggressive fan curve.
Is this an OK OC for the FE or is it on the lower end? Will a hybrid kit do me any good at all in increasing this?

Thanks


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Goroshi*
> 
> Hey guys new here,
> 
> Just bought a 1080 FE today, as there wasn't much other choice in my local store and everything pointed towards the FE being "binned".
> I've been playing around with it a bit and it seems I have managed to get it stable at 2038, playing GTA V for a couple of hours with temps at around 72C average with an aggressive fan curve.
> Is this an OK OC for the FE or is it on the lower end? Will a hybrid kit do me any good at all in increasing this?
> 
> Thanks


If you get over 2000MHz boost 24/7, it's good.









The FE isn't binned, in my opinion.


----------



## GreedyMuffin

It's not binned.

Everything over 2000 is OK, but not good imho.

Over 2100 is good, imo.


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> It's not binned.
> 
> Everything over 2000 is OK, but not good imho.
> 
> Over 2100 is good, imo.


Have you tried overclocking your FE on air before putting it under water? I'm curious how much the max OC changes with the lower temps.

Imo: <2000 bad, 2000-2100 average, >2100 silicon lottery winners


----------



## TK421

Quote:


> Originally Posted by *Jpmboy*
> 
> each slot is 40.6mm.
> https://www.ekwb.com/shop/ek-fc-terminal-triple-serial-plexi


Any flexible HB sli bridge?


----------



## Sazexa

The build is nearing completion! Here are the two NVIDIA 1080 FEs. Waiting to order some waterblocks.


----------



## karelbastos

Today I discovered that one of my GTX 1080s (I have 2 in SLI) is not allowing my other GTX 1080 to reach higher clocks

oO

Example:

One GTX 1080, tested alone, can reach 2115MHz (stable) at +215 core clock with temps of 49-55C

And 2088MHz (stable) at +210 core clock at 62-70C

My second 1080, with the same OC settings, goes up to 2115 at the start, but the driver crashes after 42C

I can't go further than +165 on the core to get stable performance.

So, if I use the two in SLI, I need to OC both at +165, or each one separately. (Is this the correct way?)

Is this a chip lottery question?

Thanks


----------



## Jpmboy

Quote:


> Originally Posted by *TK421*
> 
> Any flexible HB sli bridge?


not that I know of. The HB bridge is really only relevant at 4K60.
Why do you think you need it? http://www.overclock.net/t/1603647/tpu-nvidia-geforce-gtx-1080-sli-is-the-sli-hb-bridge-essential/0_20
Quote:


> Originally Posted by *karelbastos*
> 
> Today i discovered that one of my GTX 1080 ( have 2 SLI ) is not allowing my other gtx 1080 to reach more clock
> 
> oO
> 
> Example
> 
> One GTX 1080, tested alone, can reach 2215Mhz at +210 core clock with temps of 49-55C
> 
> And 2088Mhz at +210 core clock at 62-70C
> 
> My second 1080 can't go further than +165 on the core before the drivers go down ^^
> 
> So, if I use the two in SLI, I need to OC both at +165, or each one separately. (Is this the correct way?)
> 
> Is this a chip lottery question?
> 
> Thanks


In SLI it is always the case that the less capable card rules the max stable clocks. Running them at different clocks (asynchronous) may work in some games and not in others. The only thing to do is make sure that the higher-performing card is in slot 1 (closest to the CPU), then find settings that both are happy with.


----------



## karelbastos

Quote:


> Originally Posted by *Jpmboy*
> 
> not that I know of. The HB bridge is really only relevant at 4K60.
> In SLI it is always the case that the less capable card rules the max stable clocks. Running them at different clocks (asynchronous) may work in some games and not in others. The only thing to do is make sure that the higher-performing card is in slot 1 (closest to the CPU), then find settings that both are happy with.


Yes...

I tried it now and it doesn't work...

If I configure separate OCs in Afterburner, in the Heaven bench I still get the lower clock from the first GPU.

I will now try swapping GPU slots 1 ---> 2 to see if anything changes.


----------



## Jpmboy

Quote:


> Originally Posted by *karelbastos*
> 
> Yes...
> 
> I tried it now and it doesn't work...
> 
> If I configure separate OCs in Afterburner, in the Heaven bench I still get the lower clock from the first GPU.
> 
> I will now try swapping GPU slots 1 ---> 2 to see if anything changes.


wait before moving your cards around - what? you are getting a lower FPS with two cards than with one?
sorry, misread your post.

1) Are they identical cards (same SKU)?
2) What SLI bridge? (use two ribbons, or one with both SLI links engaged)
3) Set both cards to the clocks you know the weaker one is okay with. Check stability, not frequency. On Pascal, clock speed does not always mean better performance (there can be a very large amount of error correction before the driver will crash).
4) Find the best clocks for productivity (eg, FPS in whatever you choose to use as a benchmark). I prefer a DX12 test.


----------



## karelbastos

Quote:


> Originally Posted by *Jpmboy*
> 
> wait before moving your cards around - what? you are getting a lower FPS with two cards than with one?
> sorry, misread your post.
> 
> 1) Are they identical cards (same SKU)?
> 2) What SLI bridge? (use two ribbons, or one with both SLI links engaged)
> 3) Set both cards to the clocks you know the weaker one is okay with. Check stability, not frequency. On Pascal, clock speed does not always mean better performance (there can be a very large amount of error correction before the driver will crash).
> 4) Find the best clocks for productivity (eg, FPS in whatever you choose to use as a benchmark). I prefer a DX12 test.


I moved the GPU that can run stable at +215 core clock to the first position.

Still the same in SLI: I can only reach +165 core clock, because the second card does not accept more than +165 on the core after 42C.

I run SLI at x16 and x8 PCI Express, with just one flex SLI bridge.

Two Zotac FE 1080s.

I have done some tests already with SLI and the best for me was +165 on both:

2000 - 2025, stable at 65C



Maybe after I install my two EVGA Hybrids this week, I will reach better results.

Or the other way is to sell the GPU that does not clock higher and try my luck with one more Zotac 1080. With the first I was lucky, with the second not


----------



## MrDerrikk

Quote:


> Originally Posted by *Sazexa*
> 
> The build is nearing completion! Here are the two NVIDIA 1080 FEs. Waiting to order some waterblocks.


I have a not-so-secret love affair with the looks of this case. Good to build in? Might have to use it for my next eventual build...


----------



## Sazexa

Quote:


> Originally Posted by *MrDerrikk*
> 
> I have a not-so-secret love affair with the looks of this case. Good to build in? Might have to use it for my next eventual build...


I like the case very much. It's great to build in if you're just doing air cooling, with a crap ton of space for whatever you'd like. If you are doing water cooling, limit yourself to a 240mm and a 360mm radiator. I did two 360mm radiators, and it required a bit of cheating to mount and was an extremely tight fit, as you can see from the top rad and front rad. Also, don't use an E-ATX motherboard lol. They claim the case can support E-ATX up to 264mm wide; I'm using a 272mm-wide E-ATX board and have to leave the right-side motherboard screws a little loose, as well as insulate the board with electrical tape, AND remove the side grommets from the case.

I've asked Phanteks to make an E-ATX version of the case. Realistically all they need to do is add another two inches of front-to-back depth, and another two inches in height. This would also give it another PCI slot, or maybe even two, fix E-ATX compatibility, and allow proper fitment of two 360mm radiators.

But, other than giving myself headaches, this is one of the best cases I've built in, and my favorite ATX case ever.


----------



## jase78

Quote:


> Originally Posted by *arrow0309*
> 
> I'm afraid I didn't understand what kind of problems did you have.
> Nice temps however, I really don't get how high (78) they were getting on the Russian youtube review
> 
> 
> 
> 
> 
> 
> 
> 
> Are you using the default fans speed for those temps or a modified fan curve?
> Also wanted to know, is there any seal on the back of the Strix?


GPU Tweak II is having a lot of problems running alongside some of their own software, AI Suite, lmao. I like to use AI Suite for fan control. Hopefully ASUS will eventually get it sorted out though.

I immediately changed to a custom fan curve. I've definitely not experienced 89C or whatever it was in that Russian video.

Yes, there is a little sticker on one of the heatsink screws. As far as warranty goes if you remove it, I'm not sure.


----------



## juniordnz

It's insane how much temperature matters with these cards. After doing my "MacGyver it's-not-wrong-if-it-works fan setup" I could keep much more stable clocks when playing R6S and benching. Memory clocks were sustained at +425, a clock that before would give me snowflakes that made my game look like Christmas.

And that's all at only 60-65°C. I can only imagine how much better it would perform on a waterblock or even an AIO.

Can't wait to hook it up with an H110i and make it run in the fifties.


----------



## TK421

Quote:


> Originally Posted by *juniordnz*
> 
> It's insane how much temperature matters with these cards. After doing my "MacGyver it's-not-wrong-if-it-works fan setup" I could keep much more stable clocks when playing R6S and benching. Memory clocks were sustained at +425, a clock that before would give me snowflakes that made my game look like Christmas.
> 
> And that's all at only 60-65°C. I can only imagine how much better it would perform on a waterblock or even an AIO.
> 
> Can't wait to hook it up with an H110i and make it run in the fifties.


temperature matters a lot on these cards, probably because of the smaller fabrication = fragile gpu cores


----------



## Joshwaa

Are these chips really that hot on stock volts? My 780 Ti with an EK WB running at 1300MHz and 1.3V never got above 41C. Really wish EK would hurry up with the FTW WB.


----------



## Derpinheimer

Quote:


> Originally Posted by *juniordnz*
> 
> It's insane how much temperature matters with these cards. After doing my "MacGyver it's-not-wrong-if-it-works fan setup" I could keep much more stable clocks when playing R6S and benching. Memory clocks were sustained at +425, a clock that before would give me snowflakes that made my game look like Christmas.
> 
> And that's all at only 60-65°C. I can only imagine how much better it would perform on a waterblock or even an AIO.
> 
> Can't wait to hook it up with an H110i and make it run in the fifties.


I have yet to see a single artifact on this card - it either runs or the program crashes (core clock) or system freezes (memory)
Quote:


> Originally Posted by *Joshwaa*
> 
> Are these chips really that hot on stock volts? My 780 Ti with an EK WB running at 1300MHz and 1.3V never got above 41C. Really wish EK would hurry up with the FTW WB.


They run very cool. If there is any concern about temps, it must be the temperature gradient.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> It's not binned.
> 
> Everything over 2000 is OK, but not good imho.
> 
> Over 2100 is good, imo.


I think that seems a bit pessimistic. Bad is more like <2050, average 2050-2125, and good >2125, if we are talking about peak clocks and not the stable value.

But it doesn't really matter anyway. I was playing around with Metro: LL.

Boost Clock/Memory Offset: FPS
1898/+0: 76
1898/+575: 80
2012/+0: 77
2012/+575: 82
2114/+0: 78
2114/+575: 83

I'm guessing the same thing is seen in synthetics? Memory scales very well and core clock not so much.
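For what it's worth, the relative gains in those numbers can be computed directly. A quick Python sketch over the posted data (the percentages are just arithmetic on the FPS figures above, nothing newly measured):

```python
# FPS from the Metro: LL runs posted above, keyed by (boost MHz, mem offset).
results = {
    (1898, 0): 76, (1898, 575): 80,
    (2012, 0): 77, (2012, 575): 82,
    (2114, 0): 78, (2114, 575): 83,
}

def pct_gain(a, b):
    """Percent FPS gain going from a to b."""
    return round((b / a - 1) * 100, 1)

# ~11.4% more core clock buys only ~2.6% FPS...
core_clock = round((2114 / 1898 - 1) * 100, 1)
core_fps = pct_gain(results[(1898, 0)], results[(2114, 0)])
# ...while +575 on the memory alone is worth ~5.3% FPS.
mem_fps = pct_gain(results[(1898, 0)], results[(1898, 575)])

print(core_clock, core_fps, mem_fps)  # 11.4 2.6 5.3
```

Which lines up with the impression that this title is memory-bandwidth-bound rather than core-bound.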


----------



## axiumone

Quote:


> Originally Posted by *Joshwaa*
> 
> Are these chips really that hot on stock volts. My 780TI with EK WB running at 1300Mhz and 1.3V never got above 41C. Really wish EK would hurry up the FTW WB.


I think you'll be waiting for a long time. As I recall, EK has no plans to release blocks for anything but the reference PCB on the EVGA side. EVGA decided to use another manufacturer for their Hydro Copper cards and they broke off the relationship.


----------



## schoolofmonkey

I've noticed on my Strix that temps play a major part in what boost you get.
If you set a custom curve to keep the card around 62C you'll have consistent maximum boost; anything over and the clocks start dropping.
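That matches how GPU Boost 3.0 is commonly described in reviews: the boost clock sheds ~13 MHz bins as core temperature rises. A toy model of that behaviour (the 40C starting point and 5C-per-bin step are illustrative assumptions, not measured thresholds):

```python
BIN_MHZ = 13  # boost-bin granularity commonly reported for Pascal

def boosted_clock(peak_mhz, temp_c, bin_start_c=40, bin_step_c=5):
    """Estimate the sustained clock after temperature-driven bin drops.

    bin_start_c / bin_step_c are assumed values for illustration only;
    the real thresholds vary per card and BIOS.
    """
    bins_lost = max(0, (temp_c - bin_start_c) // bin_step_c)
    return peak_mhz - BIN_MHZ * bins_lost

print(boosted_clock(2088, 62))  # 2036 - a few bins down from peak
print(boosted_clock(2088, 72))  # 2010 - hotter card, more bins lost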


----------



## ucode

@TWiST2k Thanks bro









Here's a run I did with a cross-flashed Strix VBIOS on my Galax 1080 FE.



Settings were 1.000V, 2012MHz GPU / 1375MHz mem. The GPU quickly dropped to 1999MHz at the start, then 1987, 1974, 1962, with 1974MHz being predominant during the run. Having stock air cooling and a reduced fan curve with the cross-flashed VBIOS naturally doesn't help. Power draw started at ~266W and, even though frequency went down, it appears to steadily increase with temperature, finishing off at ~280W. Not sure why the average frame rate is reported lower than the min! Did two runs, both very close: 8030 and 8031.


----------



## Kold

Can anyone tell me if this card uses the reference PCB and works with the Founders Edition EK 1080 block? Might buy it, if so..

EDIT: A link might help.. lol http://www.newegg.com/Product/Product.aspx?Item=N82E16814125880


----------



## TK421

Quote:


> Originally Posted by *Kold*
> 
> Can anyone tell me if this card uses the reference PCB and works with the Founders Edition EK 1080 block? Might buy it, if so..
> 
> EDIT: A link might help.. lol http://www.newegg.com/Product/Product.aspx?Item=N82E16814125880


Looks an awful lot like the custom-PCB G1 version: http://www.techspot.com/review/1190-gigabyte-geforce-gtx-1080-g1-gaming/


----------



## SweWiking

I've been away for a while so I've missed a lot of the thread, but I just wanted to know: is there a BIOS that seems better than the others that would be good to flash?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *SweWiking*
> 
> I've been away for a while so I've missed a lot of the thread, but I just wanted to know: is there a BIOS that seems better than the others that would be good to flash?


It's the same situation. When I had the Sea Hawk EK X, I used an MSI 1080 FE BIOS and that worked better for me than the stock one. Got better bench scores with the FE BIOS at the same clocks as with the Sea Hawk BIOS.


----------



## Reckit

Quote:


> Originally Posted by *jase78*
> 
> GPU Tweak II is having a lot of problems running alongside some of their own software, AI Suite, lmao. I like to use AI Suite for fan control. Hopefully ASUS will eventually get it sorted out though.
> 
> I immediately changed to a custom fan curve. I've definitely not experienced 89C or whatever it was in that Russian video.
> 
> Yes, there is a little sticker on one of the heatsink screws. As far as warranty goes if you remove it, I'm not sure.


Are you suggesting using a different tool, like MSI Afterburner? I have a 1080 Strix non-OC and I can't get above 1937 in the Firestrike bench. Temps do not play a part (it doesn't go above 60C) but the bench quits if I clock over that frequency.

It seems like everyone is getting higher clocks than me


----------



## KickAssCop

You lost the lottery!


----------



## GreedyMuffin

Peak doesn't matter imho. Max stable clock does.

I can maintain 2126 stable on stock voltage. With 2150 the clock will drop due to TDP, so I can't run that.

We need a fricking BIOS for these cards.


----------



## TWiST2k

I tried to rip the BIOS on my 1080 FTW with GPU-Z and I couldn't get it to go; is there a trick to ripping these?


----------



## pez

I've missed quite a few replies here in the last few days. Anyone running 1080 SLI on 1440p or 21:9 1440p and seeing under-utilization on their GPUs?

GTA V was using 50-60% with spikes to 70% usage on each card at 21:9. The issue here for me is that it wasn't doing this while pushing 60+ FPS, as I was seeing dips below that. It seems the only workaround I've found so far is to either scale the resolution higher or nearly max the settings out to require it to use more VRAM. The only correlation I am seeing so far is that if I'm using less VRAM, the GPU decides it doesn't need to run at full utilization. Maxing the game out completely causes 90-100% on each GPU.

Fallout 4 has the same issue, but I have not found a workaround yet. I might try resolution scaling to 'fix' it. Notably, I did not have this issue at 4K. This still falls in line with my theory that 1440p 21:9 is not pushing the cards to their limits VRAM-wise, and that this acts as a bottleneck.


----------



## juniordnz

Quote:


> Originally Posted by *TK421*
> 
> temperature matters a lot on these cards, probably because of the smaller fabrication = fragile gpu cores


Yeah, thought about that too... and if we're right, watercooling never made more sense than with this Pascal series.
Quote:


> Originally Posted by *Derpinheimer*
> 
> I have yet to see a single artifact on this card - it either runs or the program crashes (core clock) or system freezes (memory)


Me too, on synthetics. I don't even get driver crashes with the memory; it just freezes and fails the test. With the core clock it's the same, if I'm not running insanely high clocks. Maybe the freeze-versus-crash has something to do with temps. With lower temps I was much more stable and got fewer freezes. Maybe, just maybe, anything that doesn't crash your driver is possible; you just rely on temps.

But in Rainbow Six I got some minor snowflakes and periodic black screen flashes (like a half-second flash). That's memory, right?

With a 2088MHz peak and +425 mem I got a lot fewer snowflakes running at 60°C than at 75+++. Maybe when I watercool it I can get those values without artifacts, maybe even ramp up a little.

Could anyone tell me if there's any hope of keeping the card below 50 with a good AIO in a hot country? Room temp 28-30.


----------



## PasK1234Xw

Quote:


> Originally Posted by *Reckit*
> 
> Are you suggesting using a different tool, like MSI Afterburner? I have a 1080 Strix non-OC and I can't get above 1937 in the Firestrike bench. Temps do not play a part (it doesn't go above 60C) but the bench quits if I clock over that frequency.
> 
> It seems like everyone is getting higher clocks than me


Use DDU and reinstall the driver if you haven't already.
I've had times when a crash while testing an unstable OC would result in an unstable driver, giving driver crashes even with stable clocks. Reinstalling the driver fixed it.

1937 is really low. Increase voltage to +100 to ensure maximum voltage, just to be safe, and set the core to +100, increasing in +10 steps up to +150. Once you find a stable clock, start to lower the voltage.
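That stepping routine amounts to a simple schedule. A hypothetical sketch (the actual tuning is done by hand in Afterburner; this just enumerates the plan):

```python
def core_offsets(start=100, stop=150, step=10):
    """Core-clock offsets (MHz) to try in order, until one proves unstable."""
    return list(range(start, stop + 1, step))

print(core_offsets())  # [100, 110, 120, 130, 140, 150]
```

Once the highest stable offset in that list is found, the advice is to walk the voltage back down while re-testing at that clock.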


----------



## Bogga

Quote:


> Originally Posted by *pez*
> 
> I've missed quite a few replies here in the last few days. Anyone running 1080 SLI on 1440p or 21:9 1440p and seeing under-utilization on their GPUs?
> 
> GTA V was using 50-60% with spikes to 70% usage on each card at 21:9. The issue here for me is that it wasn't doing this while pushing 60+ FPS, as I was seeing dips below that. It seems the only workaround I've found so far is to either scale the resolution higher or nearly max the settings out to require it to use more VRAM. The only correlation I am seeing so far is that if I'm using less VRAM, the GPU decides it doesn't need to run at full utilization. Maxing the game out completely causes 90-100% on each GPU.
> 
> Fallout 4 has the same issue, but I have not found a workaround yet. I might try resolution scaling to 'fix' it. Notably, I did not have this issue at 4K. This still falls in line with my theory that 1440p 21:9 is not pushing the cards to their limits VRAM-wise, and that this acts as a bottleneck.


Haven't really tried different settings since the fps is quite alright (3440x1440 and 1080 SLI). With current settings I'm just below 5GB VRAM usage... might give it a go later on


----------



## tin0

Quote:


> Originally Posted by *juniordnz*
> 
> Yeah, thought about that too... and if we're right, watercooling never made more sense than with this Pascal series.
> Me too, on synthetics. I don't even get driver crashes with the memory; it just freezes and fails the test. With the core clock it's the same, if I'm not running insanely high clocks. Maybe the freeze-versus-crash has something to do with temps. With lower temps I was much more stable and got fewer freezes. Maybe, just maybe, anything that doesn't crash your driver is possible; you just rely on temps.
> 
> But in Rainbow Six I got some minor snowflakes and periodic black screen flashes (like a half-second flash). That's memory, right?
> 
> With a 2088MHz peak and +425 mem I got a lot fewer snowflakes running at 60°C than at 75+++. Maybe when I watercool it I can get those values without artifacts, maybe even ramp up a little.
> 
> Could anyone tell me if there's any hope of keeping the card below 50 with a good AIO in a hot country? Room temp 28-30.


Are you touching the voltage slider or just maxing the power limit? The Corsair HG10 is expected to arrive tomorrow or Thursday, so I hope I can get around to modding it this weekend.


----------



## Reckit

Thanks for the reply. I've tried everything you suggested. Nothing works; I think I am just unlucky. As soon as my card sees a peak above 2000 it stops the bench.


----------



## juniordnz

Quote:


> Originally Posted by *tin0*
> 
> Are you touching the voltage slider or just maxing the power limit? The Corsair HG10 is expected to arrive tomorrow or Thursday, so I hope I can get around to modding it this weekend.


Max voltage and TDP limit. I tried to play with the voltage bar but didn't get any results. The best overclock comes with everything maxed.

Please do post some temp reads before and after watercooling it. Good luck with the mod!


----------



## pez

Quote:


> Originally Posted by *Bogga*
> 
> Haven't really tried different settings since the fps is quite alright (3440x1440 and 1080 SLI). With current settings I'm just below 5GB VRAM usage... might give it a go later on


Quote:


> Originally Posted by *juniordnz*
> 
> Max voltage and TDP limit. I tried to play with the voltage bar but didn't get any results. The best overclock comes with everything maxed.
> 
> Please do post some temp reads before and after watercooling it. Good luck with the mod!


I'd greatly appreciate it.

If you can, get it to below 4GB of VRAM and see what your usage is on both GPUs; I'd be eternally grateful.


----------



## TK421

Zotac AMP 1080 (non-Extreme): recorded 240W max pull in HWiNFO.


----------



## KillerBee33

Anyone read Titan XP reviews yet? Seems like keeping the 1080 is the better choice.


----------



## Derpinheimer

Quote:


> Originally Posted by *juniordnz*
> 
> It's insane how much temperature matter with these cards. After doing my "mcgiver it´s-not-wrong-if-it-works fan setup" I could keep much more stable clocks when playing R6S and benching. Memory clocks were sustained at +425, a clock that before would give me snowflakes that make my game look like christmas.
> 
> And that's all only with 60-65°C. I can only imagine how much better it would perform on a waterblock or even an AIO.
> 
> Can't wait to hook it up with a H110i and make run at fifties.


Quote:


> Originally Posted by *GreedyMuffin*
> 
> Peak doesn't matter imho. Max stable clock does.
> 
> I can maintain 2126 stable on stock voltage. With 2150 the clock will drop due to TDP, so I can't run that.
> 
> We need a fricking BIOS for these cards.


What are you running to hit the TDP limit? The max I've got was like 118% with a 130% power limit setting.


----------



## juniordnz

Quote:


> Originally Posted by *Derpinheimer*
> 
> What are you running to hit the TDP limit? The max I've got was like 118% with a 130% power limit setting.


I've noticed temperature somehow has influence over the power perfcap.

Try the Firestrike Ultra stress test. I got a PWR perfcap with temps over 70.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Anyone read Titan XP reviews yet? Seems like keeping the 1080 is the better choice.


I should have 2 tomorrow, so I'll post up some quick air-cooled results not long after, before putting them under water.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> I should have 2 tomorrow, so I'll post up some quick air-cooled results not long after, before putting them under water.


I want to see single vs. SLI. EVGA Hybrid kit coming today; I want to see what the Titan brings before I take my 1080 apart and dremel it a bit.








Ehh, Nvidia sent me the Titan's page and the BUY NOW button is still not clickable.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> I want to see single vs. SLI. EVGA Hybrid kit coming today; I want to see what the Titan brings before I take my 1080 apart and dremel it a bit.


http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html

decadent is an appropriate term IMO.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html


Saw that; by that review it's not worth letting the 1080 go.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Saw that; by that review it's not worth letting the 1080 go.


A lot will depend on the resolution used and whether or not we can break into the BIOS. (I'm typing this on a rig with 2x Maxwell Titan Xs that have been running 1.274V since they launched.)

For gaming at anything less than 4K, a single 1080 should be plenty.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> A lot will depend on the resolution used and whether or not we can break into the BIOS. (I'm typing this on a rig with 2x Maxwell Titan Xs that have been running 1.274V since they launched.)
> 
> For gaming at anything less than 4K, a single 1080 should be plenty.


Heh, I see they're already testing it @ 5K. 1.274 ain't nothing for Maxwell; had my co-worker's 980 G1 @ 1.312V.


----------



## GreedyMuffin

I actually don't care about the new Titan X. It's a beast, but it uses more power, which was the main reason I went away from the 980 Ti (which is in another room, folding) and why I lowered my CPU's OC.


----------



## KillerBee33

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I actually don't care about the new Titan X. It's a beast, but it uses more power, which was the main reason I moved away from the 980 Ti (which is in another room, folding) and why I lowered my CPU's OC.


Hmm, I did notice about a 300-point drop in my CPU score after the 1080 was installed; not sure if that is related.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Heh, I see they're already testing it @ 5K. 1.274V ain't nothing for Maxwell; I had my co-worker's 980 G1 @ 1.312V.


Yeah, and I had my 2 980 Ti Kingpins at stupidly higher NVVDD than that... didn't help much though. The only Maxwell that really stretched its legs with higher voltage was the TX, in my experience. FYI: entering 1.312V in the BIOS mod only produced 1.265V under load due to vdroop, measured with a DMM off the power caps.
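For anyone curious why 1.312V entered in the BIOS can read as 1.265V on the DMM under load: a back-of-the-envelope sketch of vdroop as a simple series-resistance model. All numbers below are hypothetical, picked only to reproduce the ~47mV droop described above; real GPU power delivery uses a programmed load line and is more complex.

```python
# Minimal vdroop sketch: the core voltage under load sags below the set
# voltage because load current across the power-delivery resistance drops
# some of it (V_load = V_set - I * R).

def load_voltage(v_set: float, load_current_a: float, r_pdn_ohms: float) -> float:
    """Effective core voltage under load, per a simple series-resistance model."""
    return v_set - load_current_a * r_pdn_ohms

# Hypothetical numbers: 1.312 V set, ~235 A load, ~0.2 mOhm effective resistance
v = load_voltage(1.312, 235.0, 0.0002)
print(round(v, 3))  # ~1.265 V, i.e. the DMM reading despite the 1.312 V BIOS entry
```

At idle the same model returns the set voltage, which matches why droop only shows up under load.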


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Hmm, I did notice about a 300-point drop in my CPU score after the 1080 was installed; not sure if that is related.


If you are on a single card, change the PCIe slot to Gen 2; your CPU score will return to normal.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> Yeah, and I had my 2 980 Ti Kingpins at stupidly higher NVVDD than that... didn't help much though. The only Maxwell that really stretched its legs with higher voltage was the TX, in my experience. FYI: entering 1.312V in the BIOS mod only produced 1.265V under load due to vdroop, measured with a DMM off the power caps.


Well, my 980 was @ 1.275V running 1582MHz, but no matter how hard we tried with that G1 it would not get stable anywhere past 1506MHz. That's why I have my doubts about the ASIC theory; mine was 65.2 and the G1 was 78.6.


----------



## GreedyMuffin

My old 980 Ti did 1524 on stock voltage, and 1575 with a voltage increase. Never tried a modded vBIOS on that card though.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> if you are on a single card, change the PCIE slot to Gen 2 - your CPU score will return to normal.


Got time to explain? Do you mean in the motherboard BIOS settings or an actual slot?


----------



## fat4l

Quote:


> Originally Posted by *Jpmboy*
> 
> if you are on a single card, change the PCIE slot to Gen 2 - your CPU score will return to normal.


Interesting. Explain, please.








Any practical use other than benching?


----------



## juniordnz

My old 970 G1 did 1605MHz with 1.232V


----------



## Snabeltorsk

Has anyone tried the Classified BIOS on an Asus/Gigabyte/Palit or whatever, to see if it makes a difference?


----------



## TK421

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Has anyone tried the Classified BIOS on an Asus/Gigabyte/Palit or whatever, to see if it makes a difference?


I'm still wondering how cross-flashing even works without bricking the card in the first place. More so with cards that have a different voltage controller and power phases...


----------



## juniordnz

Just ordered some heatsinks and a custom adapter. On the weekend my 1080 will be rolling on an H80i. Let's see if one thick 120mm rad is enough to keep it around 50°C.

Wish me luck


----------



## Spieler4

Got an MSI GTX 1080 Sea Hawk yesterday

Temps around 50-55°C and no DPC issues when gaming. Phewww


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Got time to explain? You mean in MB bios settings or an actual slot?


Yes; set the PCIe x16 Gen3 slot to Gen 2 in the BIOS.
It's just empirical (and a technical answer would be a guess at this point) - setting to Gen2 (single card) has no negative effect on gfx scores but improves physics scores. The LN2 guys ferreted this out.
Quote:


> Originally Posted by *fat4l*
> 
> Interesting. Explain pls
> 
> 
> 
> 
> 
> 
> 
> 
> Any practical use other than benching ?


Probably not. I responded to his point about losing 300 points after putting the 1080 in.


----------



## juniordnz

Quote:


> Originally Posted by *Spieler4*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got a MSI GTX 1080 sea hawk yesterday
> 
> Temps around 50-55 C and No dpc issues when gaming Phewww


Guess one 120mm rad won't be enough here...

What's your room temp? Was hoping to keep it under 50 with a 280mm rad.


----------



## Spieler4

Quote:


> Originally Posted by *juniordnz*
> 
> Guess one 120mm rad won't be enough here...
> 
> What's your room temp? Was hoping to keep it under 50 with a 280mm rad.


My room temp is around 21°C.
The fan is set to 800 RPM; 1000-1200 RPM should put temps around 50°C, I guess. But I like it quiet.


----------



## fat4l

Guys, I'm about to start putting a waterblock on my 1080.
I'm gonna do the TDP mod and mod one of the resistors.

Will modding just one be enough to remove the TDP limits and prevent throttling?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> yes - set the PCIE x16 gen3 slot to Gen 2 in bios.
> It's just empirical (and a technical answer would be a guess at this point) - setting to Gen2 (single card) has no negative effect on gfx scores but improves physics scores. The LN2 guys ferreted this out.
> Probably not. I responded to his point about losing 300 points after putting the 1080 in.


Thank you. Will try it out tonight and post results. My 6700K got up to a 15100 score @ 4.7, just once; now it stays at around 14800.


----------



## KillerBee33

It's here, but I'm gonna wait for the Titan review before I open this


----------



## TK421

I stopped using hybrid kit, pump too noisy for my taste.

Now running the Zotac 1080 AMP (non-Extreme) with dual 2150 RPM Gentle Typhoons


----------



## KillerBee33

Quote:


> Originally Posted by *TK421*
> 
> I stopped using hybrid kit, pump too noisy for my taste.
> 
> Now running the Zotac 1080 AMP (non-Extreme) with dual 2150 RPM Gentle Typhoons


This thing has been running quiet for a long time; this is the idea for the 1080 now


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> This thing has been running quiet for a long time; this is the idea for the 1080 now


That's a nice, clean job!









How about temp results before and after? Max overclock before and after?

When you have some time I'd love to see that.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> That's a nice, clean job!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How about temp results before and after? Max overclock before and after?
> 
> When you have some time I'd love to see that.


Absolute max I got out of it; regularly 45-50°C gaming unless the AC is off







The max I've seen was 60°C


----------



## TK421

Quote:


> Originally Posted by *KillerBee33*
> 
> This thing has been running quiet for a long time; this is the idea for the 1080 now


I can still hear the pump buzzing since all my fans are at ~350 RPM on idle


----------



## KillerBee33

Quote:


> Originally Posted by *TK421*
> 
> I can still hear the pump buzzing since all my fans are 350~rpm on idle


Pump defective? The only time I heard pump noise was on my Kraken X31 at startup, but it went away with a new driver.


----------



## juniordnz

Just ordered a new 1080 FTW. My MSI Armor was getting too hot and, besides that, EVGA's superb customer care and three-year warranty got me.


----------



## boredgunner

Quote:


> Originally Posted by *juniordnz*
> 
> Just ordered a new 1080 FTW. My MSI Armor was getting too hot and, besides that, EVGA's superb customer care and three-year warranty got me.


Yeah next time I should exercise patience before buying a card with a heatsink as small as the MSI Armor. I have one too and depending on the game it will either sit in the mid 60s, low 70s, or low 80s. Not a problem since even in low 80s it mostly runs at 2050-2063 MHz, not to mention it's inaudible in my PC even at 100%, but for the money the FTW is better (and the FTW is the best looking card ever in my opinion).


----------



## metal409

Quote:


> Originally Posted by *toncij*
> 
> Dual 360?
> 
> 
> 
> 
> 
> 
> 
> Worth it?
> 
> Also, what tubing are you using?
> 
> I'm thinking of moving from an H115i on both the 1080 and 5960X to a custom loop with an EK 360 kit. Not sure if I'll get anything, but while the Titan X was at about 65°C loaded at 1.5GHz (+112mV), my 5960X at 4.5 jumps to 80°C... which I find disturbingly hot for an AIO.


Sorry, just saw this reply. Dual 360 may be overkill, but I figured I would max out what my case could run for the fun of it. Plus, it helped out when I ran crossfire 290x.







I am running PrimoChill Advanced LRT tubing, 1/2" ID / 3/4" OD; had a bunch left over.


----------



## ikjadoon

Quote:


> Originally Posted by *pez*
> 
> I've missed quite a few replies here in the last few days. Anyone running 1080 SLI on 1440p or 21:9 1440p and seeing under-utilization on their GPUs?
> 
> GTA V was using 50-60% with spikes to 70% usage on each card at 21:9. The issue here for me is that it wasn't doing this to push 60+ FPS as I was seeing dips below. It seems the only workaround I've found so far is do either scale the resolution higher or nearly max the settings out to require it to use more VRAM. The only correlation I am seeing so far is that if I'm using less VRAM, the GPU decides it doesn't need to run with full utilization. Maxing the game out completely causes 90-100% on each GPU.
> 
> Fallout 4 has the same issue, but I have not found a workaround yet. I might try to resolution scale it to 'fix' this. Alternatively, I did not have this issue with 4K. This still falls in line with my theory that 1440p 21:9 is not pushing the cards to their limits VRAM-wise and acting as a bottleneck.


Out of curiosity, what's the CPU utilization when you're at 50-60% GPU util?
Quote:


> Originally Posted by *TK421*
> 
> I can still hear the pump buzzing since all my fans are 350~rpm on idle


I think only the new Gen5 coolers from Asetek are truly quiet.


----------



## juniordnz

Quote:


> Originally Posted by *boredgunner*
> 
> Yeah next time I should exercise patience before buying a card with a heatsink as small as the MSI Armor. I have one too and depending on the game it will either sit in the mid 60s, low 70s, or low 80s. Not a problem since even in low 80s it mostly runs at 2050-2063 MHz, not to mention it's inaudible in my PC even at 100%, but for the money the FTW is better (and the FTW is the best looking card ever in my opinion).


Yeah, imagine here where room temps are most of the time 28-30°C. I was getting 78 in some situations with fans @100% speed.

Also, mine had only a 6-month warranty and no backplate, and I was going to end up buying a custom one from EK. So, in the end, I spent more with EVGA but got a better cooler, a nice backplate, more power phases, a whole lot better looking card, and a three-year warranty (which is extremely important if I want to sell it two years from now; it's much easier to sell something that still has a full year covered. Here in Brazil, EVGA sells a lot easier second-hand than any other brand).

As the name says, EVGA FTW.


----------



## boredgunner

Quote:


> Originally Posted by *juniordnz*
> 
> Yeah, imagine here where room temps are most of the time 28-30°C. I was getting 78 in some situations with fans @100% speed.
> 
> Also, mine had only a 6-month warranty and no backplate, and I was going to end up buying a custom one from EK. So, in the end, I spent more with EVGA but got a better cooler, a nice backplate, more power phases, a whole lot better looking card, and a three-year warranty (which is extremely important if I want to sell it two years from now; it's much easier to sell something that still has a full year covered. Here in Brazil, EVGA sells a lot easier second-hand than any other brand).
> 
> As the name says, EVGA FTW.


My room actually does get that hot this time of year. My fan is at 100% whenever the card is above 70°C, since I can't hear it at that speed anyway. But when I buy an HBM2 GP100 (my next GPU), I'll wait it out and probably get an EVGA KINGPIN (or whatever is best).


----------



## juniordnz

Quote:


> Originally Posted by *boredgunner*
> 
> My room actually does get that hot this time of year. My fan is at 100% whenever it's above 70c, since I can't hear it at that speed anyway. But when I buy an HBM 2 GP100 (my next GPU), I'll wait it out and get an EVGA KINGPIN probably (or whatever is best).


Me, I just got it because it was the cheapest of them all. But it turned out to be a disappointment. Bad cooling, bad overclocking, no backplate (PCB sagging), no warranty... when you buy hardware that expensive, warranty and customer service should be a concern.

Lucky for me, I bought from a local dealer who is very friendly and agreed to trade my card even after 15 days of use.


----------



## LolCakeLazors

The coil whine on my EVGA 1080 FTW is louder than the fans themselves. Anyone else get bad coil whine? Got my RMA confirmation yesterday and I'm going to ship it back to EVGA tomorrow.


----------



## juniordnz

Quote:


> Originally Posted by *LolCakeLazors*
> 
> The coil whine on my EVGA 1080 FTW is louder than the fans themselves. Anyone else get bad coil whine? Got my RMA confirmation yesterday and I'm going to ship it back to EVGA tomorrow.


You're the first one I've seen complaining about that with the FTW. Bad luck, I guess. Have you tried benching for a long period of time? May sound crazy, but I've seen people saying it sometimes helps get rid of the coil whine.


----------



## LolCakeLazors

Ran Unigine Heaven overnight, didn't help. Tried my old XFX 650W on it by only powering the GPU with it and the rest of the computer was hooked up to my 750W G2. Still had coil whine. Swapped out PCI-E cables on my 750W and that didn't work either. The PSU isn't even old, I RMAed it last year.






Some of the noise in the video is the case fans but the coil whine is there. It's really apparent when I exit Heaven.


----------



## looniam

^ have a +rep for already doing the troubleshooting i was going to suggest; not that i am an expert . . .

but i play one on TV


----------



## zlpw0ker

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Ran Unigine Heaven overnight, didn't help. Tried my old XFX 650W on it by only powering the GPU with it and the rest of the computer was hooked up to my 750W G2. Still had coil whine. Swapped out PCI-E cables on my 750W and that didn't work either. The PSU isn't even old, I RMAed it last year.
> 
> 
> 
> 
> 
> 
> Some of the noise in the video is the case fans but the coil whine is there. It's really apparent when I exit Heaven.


I hear the coil whine, but it's very minimal in my opinion. When you move the camera away from the open case I can't hear it. I would probably just ignore that noise unless you have noise-suppressing headphones like me. I more or less don't hear any fans when I have the headphones on, but I do hear the minimal coil whine, which can drive me insane sometimes xD.

I'm actually troubleshooting my H100i at the moment, because I could have sworn I heard coil whine from the pump, but I bought an NB 14-2 to replace the fans I have in the front of my case.


----------



## LolCakeLazors

Ah well I have open back headphones and once you hear it, it's pretty hard not to notice. Near idle, it has this distinct growling noise which is louder than all my fans. When it's on load, the growl combines with a screeching noise. I wouldn't really care about the noise if I wasn't going to water cool in the future. Coming from a 290X that had coil whine as well, I just wanted some change in my life


----------



## zlpw0ker

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Ah well I have open back headphones and once you hear it, it's pretty hard not to notice. Near idle, it has this distinct growling noise which is louder than all my fans. When it's on load, the growl combines with a screeching noise. I wouldn't really care about the noise if I wasn't going to water cool in the future. Coming from a 290X that had coil whine as well, I just wanted some change in my life


Ye, once you notice it, it won't EVER go away in your head. I know the feeling. I had a Fury X which had coil whine too and I had to send it back (I got my money back, btw). Second is my H100i, which I think has coil whine in the pump, but I'm not sure; since I heard it under load before, I'm gonna try a few benchmarks and games.

I'll try to hear it again, but ye, if you're gonna do a water loop I think the noise could be even clearer.


----------



## turtletrax

Ordered nickel/plexi heatkiller blocks and backplates for my Asus FE cards. Then I can see what they can really do. If they are as fast as they are gonna look, I will be pretty happy


----------



## wsarahan

Hi guys, how are you?

So I thought I was stable with my 1080 SLI till testing some heavy games like GTA V and The Witcher 3; after that I saw that I was not stable at all, and stabilizing the SLI OC was a pain in the ass.

Finally I got stable; not what I wanted, but it's what I got:

2076 to start, after some degrees 2063, and after 65°C 2050... and so on.

Memory at 5500 stable...

Is it still a good OC for SLI? If I pump +5 core or +5 vcore, or anything more, I get a crash in games.

Here is the Afterburner picture. If you guys have some advice I'll be thankful, but let me know if the OC is OK for an SLI rig. The cards are EVGA SC ACX 3.0.



Thanks guys


----------



## KillerBee33

http://www.nvidia.com/download/driverResults.aspx/105685/en-us
369.05 WHQL
I tried but it says no compatible Hardware...


----------



## TWiST2k

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.nvidia.com/download/driverResults.aspx/105685/en-us
> 369.05 WHQL
> I tried but it says no compatible Hardware...


Did you see? Under compatible hardware, it's for the Titan.


----------



## xer0h0ur

Quote:


> Originally Posted by *turtletrax*
> 
> Ordered nickel/plexi heatkiller blocks and backplates for my Asus FE cards. Then I can see what they can really do. If they are as fast as they are gonna look, I will be pretty happy


The only real benefit of water cooling a Pascal card right now is maintaining the overclock under load so it doesn't step down the voltage and clock speed. Well, that and relative system silence, assuming you're using an open loop instead of a hybrid cooler.


----------



## turtletrax

Yup, was the plan! I don't consider an overclock you can't maintain an overclock at all.


----------



## ikjadoon

Quote:


> Originally Posted by *xer0h0ur*
> 
> The only real benefit of water cooling a Pascal card right now is maintaining the overclock under load so it doesn't step down the voltage and clock speed. Well, that and relative system silence, assuming you're using an open loop instead of a hybrid cooler.


Though, I've seen some very promising Gen5 Asetek AIOs: these Gen5 120mm rads with dual fans are both quieter and cooler than a Noctua NH-U14S with a single Noctua A15 fan. But I think only Arctic Cooling and Cryorig have released Gen5 units.







Corsair has only updated the H80i to Gen5.

Not saying a single A15 is the quietest setup you could get, but it should set a relatively high (low?) bar for quiet.


----------



## KickAssCop

Quote:


> Originally Posted by *juniordnz*
> 
> You're the first one I've seen complaining about that with the FTW. Bad luck, I guess. Have you tried benching for a long period of time? May sound crazy, but I've seen people saying it sometimes helps get rid of the coil whine.


Tons of reports of coil whine on FTW cards on the EVGA forums. Also, the FTW runs hotter than most AIB cards. The best one out, in my opinion, is the Strix (after you tighten some screws, lol).
Bad coil-whine reports are mostly for the FTW and Gigabyte G1 cards; see some Newegg/Amazon reviews.


----------



## KickAssCop

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys how are you
> 
> So I thought I was stable with my 1080 SLI till testing some heavy games like GTA V and The Witcher 3; after that I saw that I was not stable at all, and stabilizing the SLI OC was a pain in the ass.
> 
> Finally I got stable; not what I wanted, but it's what I got:
> 
> 2076 to start, after some degrees 2063, and after 65°C 2050... and so on.
> 
> Memory at 5500 stable...
> 
> Is it still a good OC for SLI? If I pump +5 core or +5 vcore, or anything more, I get a crash in games.
> 
> Here is the Afterburner picture. If you guys have some advice I'll be thankful, but let me know if the OC is OK for an SLI rig. The cards are EVGA SC ACX 3.0.
> 
> 
> 
> Thanks guys


If you can maintain about 2025-2050 MHz stable in SLi over hours of gaming, that is quite an achievement with most air cooled cards.


----------



## Kold

Quote:


> Originally Posted by *KickAssCop*
> 
> Tons of reports of coil whine on FTW cards on the EVGA forums. Also, the FTW runs hotter than most AIB cards. The best one out, in my opinion, is the Strix (after you tighten some screws, lol).
> Bad coil-whine reports are mostly for the FTW and Gigabyte G1 cards; see some Newegg/Amazon reviews.


It's all luck of the draw. My FTW was boosting to 2009 out of the box. It stayed under 67°C in a 28°C room with the fan speed set to 60%. Oh, and fan speed under 65% was nearly inaudible. During load screens for Heaven and Valley (notoriously known to cause coil whine due to high FPS) it was so minimal I couldn't hear it unless I put my ear directly to the card.

Basically, don't go spreading misinformation just because you had a single bad experience or even just an opinion. All vendors will have a percentage of cards with excessive coil whine or higher-than-normal temps, possibly due to thermal paste application and other random issues.

http://www.newegg.com/Product/Product.aspx?Item=n82e16814487245

^ 140+ reviews; 97% of them are 5-star rated.


----------



## Randomocity

Quote:


> Originally Posted by *Kold*
> 
> It's all luck of the draw. My FTW was boosting to 2009 out of the box. It stayed under 67°C in a 28°C room with the fan speed set to 60%. Oh, and fan speed under 65% was nearly inaudible. During load screens for Heaven and Valley (notoriously known to cause coil whine due to high FPS) it was so minimal I couldn't hear it unless I put my ear directly to the card.
> 
> Basically, don't go spreading misinformation just because you had a single bad experience or even just an opinion. All vendors will have a percentage of cards with excessive coil whine or higher-than-normal temps, possibly due to thermal paste application and other random issues.
> 
> http://www.newegg.com/Product/Product.aspx?Item=n82e16814487245
> 
> ^ 140+ reviews.. 97% of those reviews are 5 star rated.


I couldn't agree more. Just got my FTW in today: 2012MHz out of the box and 2100 stable over an hour. Can't wait to put it under water when EK finally releases their FTW block. I'm quite sad that it was pushed back a month, but who am I to complain?


----------



## pez

Quote:


> Originally Posted by *ikjadoon*
> 
> Out of curiosity, what's the CPU utilization when you're at 50-60% GPU util?
> I think only the new Gen5 coolers from Asetek are truly quiet.


I have to check my first post about it for exact numbers, but IIRC, 20-40% normally with spikes/peaks at 60%.
Quote:


> Originally Posted by *TWiST2k*
> 
> Did you see under compatible hardware, its for the Titan.


Regardless, the release notes for the driver include fixes for the GTX 1070/1080. It only specifies that it adds support for the Titan X:
Quote:


> What's New in Version 369.05 WHQL
> Game Ready Drivers provide the best possible gaming experience for all major new
> releases, including Virtual Reality games. Prior to a new title launching, our driver team
> works up until the last minute to ensure every performance tweak and bug fix is
> included for the best gameplay on day one.
> 
> Gaming Technology
> Adds support for the new NVIDIA TITAN X, featuring the NVIDIA Pascal™
> architecture. Whatever you're doing, this ground-breaking TITAN X gives you the power
> to accomplish things you never thought possible.


----------



## Menno

Quote:


> Originally Posted by *KickAssCop*
> 
> If you can maintain about 2025-2050 MHz stable in SLi over hours of gaming, that is quite an achievement with most air cooled cards.


I can do 2050MHz stable in SLI on FEs with the card fans at 100% and the rest of the case fans at 100%. It boosts to 2100, but then temperature throttling kicks in. If I put these on water it can hit 2100.

With a normal profile it's about 1986-2012 in SLI. That extra 100-150MHz isn't worth it for me to buy all those blocks.


----------



## grimboso

Quote:


> Originally Posted by *Kold*
> 
> It's all luck of the draw. My FTW was boosting to 2009 out of the box. It stayed under 67°C in a 28°C room with the fan speed set to 60%. Oh, and fan speed under 65% was nearly inaudible. During load screens for Heaven and Valley (notoriously known to cause coil whine due to high FPS) it was so minimal I couldn't hear it unless I put my ear directly to the card.
> 
> Basically, don't go spreading misinformation just because you had a single bad experience or even just an opinion. All vendors will have a percentage of cards with excessive coil whine or higher-than-normal temps, possibly due to thermal paste application and other random issues.
> 
> http://www.newegg.com/Product/Product.aspx?Item=n82e16814487245
> 
> ^ 140+ reviews.. 97% of those reviews are 5 star rated.


I had some audible coil whine on my FTW. My friend did as well. It went away for both of us by doing some benches and whatnot. From reading forums, that seems to have been the case for several people. Can't say though that "audible" was very loud; you can barely hear it when the fan is still running low but FPS is high.
Quote:


> Originally Posted by *Randomocity*
> 
> I couldn't agree more. Just got my FTW in today, 2012 Mhz out of the box and 2100 stable over an hour. Can't wait to put it under water when ek finally releases their FTW block. I'm quite sad that it was pushed back a month, but who am I to really complain?


Mine boosted to 2025 out of the box. Some fiddling with the curve set me at roughly 2120 and a 3DMark graphics score of 24700. Really want to break that 25k barrier! (Will post screenshots in a little while; just went to bed.)

Running 100% fan does not seem to thermally throttle me, at least. Sitting at around 60°C with my 2120 OC.

One thing I did err on is that the slave BIOS seems to accept a lower offset. A 110 offset with the main BIOS was stable, while it was not with the slave. I also got better scores with the main BIOS over the slave BIOS; the same settings gave almost a 300-point difference in graphics score, in favor of the main BIOS. Anyone got any insight into what the difference is, other than power target (130) and a different fan profile?


----------



## x7007

The Windows 10 Anniversary Update fixes it even more.

you can get it from here

https://support.microsoft.com/en-us/help/12387/windows-10-update-history



This is even while browsing and surfing in Chrome... yay, fixed!
EDIT: It jumps again after a while, but it seems way more steady; no big jumps.


----------



## Spiriva

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.nvidia.com/download/driverResults.aspx/105685/en-us
> 369.05 WHQL
> I tried but it says no compatible Hardware...


Modded inf for Win10.

http://www10.zippyshare.com/v/75UsrdOE/file.html

Don't forget to disable driver signature enforcement.


----------



## KillerBee33

Quote:


> Originally Posted by *TWiST2k*
> 
> Did you see under compatible hardware, its for the Titan.


It says it adds support for the Titan X







Quote:


> Originally Posted by *Spiriva*
> 
> Modded inf for Win10.
> 
> http://www10.zippyshare.com/v/75UsrdOE/file.html
> 
> Don't forget to disable driver signature enforcement.


Got it for Win10. Didn't read Supported Products at first, but it does say Titan only.
Anyone else having 3DMark issues after the Win10 Anniversary Update? It just says NOT AVAILABLE under every test.


----------



## TWiST2k

Quote:


> Originally Posted by *KickAssCop*
> 
> Tons of report of coil whine on FTW cards on evga forums. Also FTW runs hotter than most AIB cards. Best one out in my opinion is the Strix (after you tighten some screws lol).
> Bad coil whine reports are for FTW and Gigabyte G1 cards mostly. See some egg/Amazon reviews.


I have an FTW and no coil whine, and fans at 100% keep me under 60°C or so, and I am in SoCal, so temps here are not pleasant right now lol.


----------



## TWiST2k

Quote:


> Originally Posted by *KillerBee33*
> 
> It says it adds support for the Titan X
> 
> 
> 
> 
> 
> 
> 
> 
> Got it for Win10. Didn't read Supported Products at first, but it does say Titan only.
> Anyone else having 3DMark issues after the Win10 Anniversary Update? It just says NOT AVAILABLE under every test.


Been running the Anniversary Update for a couple weeks now with no issues in 3DMark or anything else.
Quote:


> Originally Posted by *juniordnz*
> 
> Yeah, imagine here where room temps are most of the time 28-30°C. I was getting 78 in some situations with fans @100% speed.
> 
> Also, mine had 6 months warranty only, no backplate and was going to end up buying a custom one from EK. So, in the end, I spent more with EVGA, got a better cooler, a nice backplate, more powerphases, a whole lot better lonking card and 3 years warranty (which is extremely importante if I want to sell it 2 years from now. much easier to sell something that still has a full year covered. here in brazil evga sells a lot easier second hand than any other brand).
> 
> As the name says, EVGA FTW.


I am quoting the wrong post for you, but how do you plan to hook up an AIO cooler to the FTW? I would LOVE to use one of my Corsair units on my FTW card!!


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> Been running the anniversary update for a couple weeks now and no issues with 3dmark or anything else
> I am quoting the wrong post for you, but how do you plan to hookup a AIO cooler to the FTW? I would LOVE to use one of my corsair units on my FTW card!!


Just use an adapter like the Kraken G10 (only works with a few AIO coolers) or a custom-made one like the artisan sells here on the forum. I'll have one custom made for me from 2mm aluminum plate, since I live far, far away from the artisan; it shouldn't be expensive at all.

One good thing about the FTW is that its heatplate covers the entire PCB, leaving only the die exposed. That will keep everything else cool (especially if you blow some cold air onto it) while keeping the GPU as cold as a banker's heart.
Quote:


> Originally Posted by *grimboso*
> 
> Running 100% fan does not serm to Thermal throttle me atleast. Sitting in at around 60 with my 2120 OC.


Anything above 49ºC already causes the card to throttle.
Quote:


> Originally Posted by *Randomocity*
> 
> I couldn't agree more. Just got my FTW in today, 2012 Mhz out of the box and 2100 stable over an hour. Can't wait to put it under water when ek finally releases their FTW block. I'm quite sad that it was pushed back a month, but who am I to really complain?


Boy, I'm so happy to hear that. I was a little bit afraid because I heard from some reviewers that the FTW was not a good overclocker. But it seems it's only luck of the draw anyway...

Going to return my Armor and get a brand-new sexy FTW on Friday.


----------



## Spiriva

Quote:


> Originally Posted by *KillerBee33*
> 
> It says it adds support for the Titan X
> 
> 
> 
> 
> 
> 
> 
> 
> Got it for Win10. Didn't read Supported Products at first, but it does say Titan only.
> Anyone else having 3DMark issues after the Win10 Anniversary Update? It just says NOT AVAILABLE under every test.


If you use the modded INF you can install the driver on any GeForce card though; I got it installed with my 1080s.

3DMark seems to be working for me without any issues.


----------



## KillerBee33

Quote:


> Originally Posted by *Spiriva*
> 
> If you use the modded inf you can install the driver using any Geforce card tho, i got it installed with my 1080´s.
> 
> 3DMark seems to be working for me w/o any issues.


I guess I'll have to clean-install Windows now. It refuses to load, not from Steam, not from the EXE.


----------



## GreedyMuffin

'Anything above 49ºC already causes the card to throttle.'

I can confirm this.


----------



## grimboso

Quote:


> Originally Posted by *juniordnz*
> 
> .
> Anything above 49ºC already causes the card to throttle.


Quote:


> Originally Posted by *GreedyMuffin*
> 
> 'Anything above 49ºC already causes the card to throttle.'
> 
> I can confirm this.


How much should it throttle? When I ran loops of Heaven for an hour it stayed at max clock the entire time. Will check again to make sure I didn't misread.


----------



## raidflex

Quote:


> Originally Posted by *Randomocity*
> 
> I couldn't agree more. Just got my FTW in today, 2012 Mhz out of the box and 2100 stable over an hour. Can't wait to put it under water when ek finally releases their FTW block. I'm quite sad that it was pushed back a month, but who am I to really complain?


I had exactly the same clock speeds at stock; mine boosts to 2012 MHz too. I haven't even tried overclocking it yet; I'm also waiting for the waterblock from EK.


----------



## juniordnz

Quote:


> Originally Posted by *grimboso*
> 
> How much should it throttle? When I ran loops of Heaven for an hour it stayed at max clock the entire time. Will check again to make sure I didn't misread.


Maybe 13 MHz every 10°C or so after 49°C.

Check whether the core clock you see when you start the test (at your idle temp) is sustained past 40, 45, 50, 55°C and so on...
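If you'd rather log it than eyeball it, here's a rough sketch (the nvidia-smi query and the sample numbers are my own assumptions, not anything from the vendor):

```python
# Feed this (temp, clock) samples and it reports each temperature at which the
# core clock stepped down. Samples can be collected with something like:
#   nvidia-smi --query-gpu=temperature.gpu,clocks.sm --format=csv,noheader,nounits -l 1
# (assuming nvidia-smi is on your PATH; GPU-Z's log file works just as well)

def find_stepdowns(samples):
    """samples: list of (temp_c, clock_mhz); returns [(temp_c, old, new), ...]."""
    steps = []
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        if c1 < c0:  # clock dropped between consecutive samples
            steps.append((t1, c0, c1))
    return steps

# Made-up numbers shaped like the ~13 MHz bins discussed here:
log = [(38, 2088), (44, 2088), (50, 2075), (55, 2075), (61, 2062)]
print(find_stepdowns(log))  # [(50, 2088, 2075), (61, 2075, 2062)]
```

If the step-downs line up with temperature brackets rather than power draw, that points at temp throttling rather than a TDP cap.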


----------



## fat4l

Ok guys.
I asked der8auer about the shunt resistors with regard to removing the TDP throttling.

As we know, there are 3 resistors, RS1/2/3, and I asked him whether we need to mod all 3 or just the one he showed in the video.
His reply was:
_"The different resistors are for different voltage rails. So you usually have one for GPU current, one for memory current and one for PLL current. If you mod the one GPU shunt resistor (usually the biggest one) it should be fine."_

Vid here:


----------



## TK421

49°C seems quite low for a throttle point; why was this done deliberately?


----------



## juniordnz

Quote:


> Originally Posted by *TK421*
> 
> 49c seems quite low for throttle point, why is this done deliberately?


Maybe to force us into buying waterblocks or AIO kits for our cards (conspiracy theory).

But it does happen...


----------



## KillerBee33

Will be selling my MSI 1080 FE.


----------



## karlahoin

Finally got my Zotac 1080 AMP Extreme yesterday. It was a tight fit:


http://imgur.com/lAHf5


I've seen it at 2185 MHz for stretches, but it stabilizes at 2088 MHz while gaming (GTA V, Witcher 3 and Fallout 4, mostly). This was at stock.

I'm not familiar with overclocking Pascal yet, but I did a few quick tries just using the sliders in Afterburner. It seems to handle +20 MHz on the GPU and +50 MHz on the RAM, and that was it. Even the +20 didn't seem stable long-term. And even with that OC, I couldn't notice any difference in FPS... maybe +1 FPS in W3.

I'm fairly happy with it though; I never see it below 2 GHz while gaming and it runs pretty cool, 57-62°C while gaming (in a room with 22°C ambient).
The fans are a bit noisier than I expected, though.


----------



## wsarahan

Quote:


> Originally Posted by *KickAssCop*
> 
> If you can maintain about 2025-2050 MHz stable in SLi over hours of gaming, that is quite an achievement with most air cooled cards.


Thanks!

Yes, I can reach 2050/2063 90% of the time in all games.

Will improve the airflow in my Cosmos 2 to see if I can get temps lower.


----------



## grimboso

Quote:


> Originally Posted by *juniordnz*
> 
> Maybe 13mhz every 10°C or so after 49°C.
> 
> Check if your core clock when you start the test (with your idle temp) is sustained after 40, 45, 50, 55 and so on...




Here is the clock; even at temperatures of 58°C it does not clock down.
However, when running Firestrike it does, as you suggested, clock down by 12 MHz, to 2114.

Here is my latest Firestrike run: http://www.3dmark.com/fs/9617547. 24,734 graphics score. I guess that's pretty decent?

I haven't used Firestrike for long, but I get the feeling the standard deviation of the results is quite large. Here is another run with exactly the same settings:
http://www.3dmark.com/3dm/13833916?
Why the big difference in score?

Since the consensus seems to be that the FE BIOS is the best: would flashing the FE BIOS to my FTW card benefit my scores/capability?


----------



## juniordnz

Quote:


> Originally Posted by *grimboso*
> 
> 
> 
> Here is the clock, and even at tempratures at 58 it does not clock down.
> However when running firestrike it does, as you suggest, clock down by 12, to 2114.
> 
> Here is my latest Firestrike score: http://www.3dmark.com/fs/9617547. 24 734 graphic score. I guess that is pretty decent?
> 
> I have not used firestrike for long, but I get the feeling that the St.dev. for the results is quite large. Here is another with exact same settings:
> http://www.3dmark.com/3dm/13833916?
> Why the big differance in score?
> 
> As it seems to be the concensus that FE bios is the best. Would flashing FE Bios to my FTW card benefit my scores / capability?


Now that's a first...

Very nice to see those stable clocks at almost 60°C; I'm sure everyone will be as surprised as I am. Heaven is "light", though. I could get more stable clocks with it too, but not THAT stable.

I'm not sure it's the FE BIOS that's causing all that difference in overclocks. But if you try it out, please report your findings









And Firestrike does have some margin of error in its results. If you run 3 consecutive benchmarks you'll get 3 different results. Now imagine on a different PC.


----------



## grimboso

Quote:


> Originally Posted by *juniordnz*
> 
> Now that's a first...
> 
> Very nice to see those stable clocks at almost 60°C, I'm shure everyone will be as surprised as I am. Heaven is "light" though. I could get more stable clocks with it too, but not THAT stable.
> 
> I'm not shure if it's FE bios that's causing all that difference in overclocks. But if you try it out, please report your findings
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And firestrike does have some % of error in it's results. If you run 3 consecutive benchmarks you'll get 3 different results. Now imagine in a different PC.


I'm going to do some more tinkering with more demanding tests, and try some games. The 300-point score difference is only about 1%, so I guess something as small as getting a message on Steam can affect the score.
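For anyone sanity-checking their own runs, the back-of-envelope math looks like this (24434 is just an assumed second score, roughly 300 points below the first link's 24734; it's not a number taken from the actual 3DMark page):

```python
# Run-to-run variance between two benchmark scores, as a percentage.

def pct_diff(a, b):
    """Percent difference of score b relative to score a."""
    return abs(a - b) / a * 100

# First score from the Firestrike link above; second one approximated.
print(round(pct_diff(24734, 24434), 2))  # -> 1.21
```

Anything around 1-2% is well within normal Firestrike noise on the same hardware.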

Going to test the FE BIOS later today, I think. I've never flashed a BIOS before







Don't want to ruin anything!


----------



## zlpw0ker

I'm getting my 1080 Sea Hawk X on Friday.
Can't wait to install it.


----------



## nexxusty

Quote:


> Originally Posted by *fat4l*
> 
> Haters? Ugh... I think you are in the wrong thread then...
> 
> 
> 
> 
> 
> 
> 
> 
> <= Pretty much this.
> 
> You, @uberwootage, are acting like it's Area 51 and no one can know about anything. I thought this forum was about spreading love and helping each other, but I guess not...
> And if it's really that top secret, there's a politer way of saying so than "haters gonna hate".
> Yes, you are the winner, sir, because you have the modded BIOS; you are waaaay above us and can laugh at all of us haters... LOL


He thinks he's elite for some reason.

Uber, bro... don't post about it if you can't share it with us. The old "oh, I have a special BIOS that's amazing, but you can't have it, even though it's just a file".

Coming from a heavy Linux, Unix, BSD background... I do understand not releasing something until it's ready. I get that.

It works, however. You stated this multiple times. At this point the creator is just being a knob if he won't give it up. Period.


----------



## Derpinheimer

Quote:


> Originally Posted by *nexxusty*
> 
> He thinks he's elite for some reason.
> 
> Uber bro... don't post about it if you can't share it with us. The old "oh I have a special BIOS thats amazing but you can't have it. Even though it's just a file".
> 
> Being from a heavy Linux, Unix, BSD background.... I do understand not releasing something until it's ready. I get that.
> 
> It works however. You stated this multiple times. At this point, the creator is just being a knob if he won't give it up. Period.


What is it that he has? A bios with higher voltage, I hope?


----------



## metal409

Quote:


> Originally Posted by *Derpinheimer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nexxusty*
> 
> He thinks he's elite for some reason.
> 
> Uber bro... don't post about it if you can't share it with us. The old "oh I have a special BIOS thats amazing but you can't have it. Even though it's just a file".
> 
> Being from a heavy Linux, Unix, BSD background.... I do understand not releasing something until it's ready. I get that.
> 
> It works however. You stated this multiple times. At this point, the creator is just being a knob if he won't give it up. Period.
> 
> 
> 
> What is it that he has? A bios with higher voltage, I hope?
Click to expand...

Supposedly a modded BIOS with the TDP raised to 300 W.


----------



## juniordnz

Really, he's bragging about that? Sure, I'm gonna hate some random dude over something he might not even have... kids these days...


----------



## Derpinheimer

Oh... who really cares about the TDP? 120% is enough for every game, AFAIK. More voltage, at least, is going to yield better clocks...


----------



## fat4l

So I put my card under water. Used liquid metal. Load temps 34°C, lol. Water temp 25°C. 2164 MHz boost so far...


----------



## boredgunner

Quote:


> Originally Posted by *fat4l*
> 
> So i put my card under water. Used liquid metal. Load temps 34C lol. Water temps 25C. 2164 mhz boost so far..


Sounds like a win to me!


----------



## juniordnz

Quote:


> Originally Posted by *fat4l*
> 
> So i put my card under water. Used liquid metal. Load temps 34C lol. Water temps 25C. 2164 mhz boost so far..


Wow, that's amazing, congrats. Hope I can at least stay below 50°C 24/7 with an adapted AIO.


----------



## GreedyMuffin

If anyone else wants to use their FE backplate with the EK block.


----------



## sew333

Quote:


> Originally Posted by *Bdonedge*
> 
> It's very possible. The drivers have had a bunch of issues that took a while to surface, so maybe it's "mini" crashing or something. I don't even know what that means, but what you're saying makes sense


Had the same issue on my Gigabyte 1080 Xtreme. It happened once.

At some point after powering the system on, while it had been idle for a while, my GTX 1080 became permanently throttled. The graphics clock was halved from a maximum of 1950-1960 MHz (factory OC) to 1285 MHz. When this occurs, it doesn't matter how much I load the card, be it games or other demanding applications; it never goes past that cap. The only solution I have found so far is to reboot the system. GPU-Z reports the "PerfCap Reason" as PWR.

I have 0 driver crashes. Normally the event log shows a driver crash report, which in my case it doesn't, so it's not related to a driver crash.

But maybe some kind of unstable factory OC on my card? A mini crash?

Please answer me, guys. I'm really worried about the card and am considering RMAing it.

Also, there are many reports from users on the GeForce forum about similar issues. THANK YOU









----------



## Bdonedge

Quote:


> Originally Posted by *sew333*
> 
> Had the same issue on my Gigabyte 1080 Xtreme.It happened once.
> 
> At some point after having my system powered on, while the system was some time idle, my GTX 1080 becomes permanently throttled. The graphics clock was halved from a maximum of 1950-1960mhz (factory OC) to 1285mhz . When this occurs, it doesn't matter how much I load my card, be it games or other demanding applications, it does not ever go past the above caps. The only solution I have had up until now was to reboot the system .GPU-Z reports the "perfcap" reason as PWR
> 
> I have 0 driver crashes. And normally the event log will show the drive crash report which in my case it doesn't show. So its not related to driver crash.
> 
> But maybe some kind of unstable factory oc by my card? Mini crash?
> 
> Please answer me guys. I am really worried about card and maybe rma this thing.
> 
> Also there are many reports by users in geforce forum about similiar issues. THANK YOU
> 
> 
> 
> 
> 
> 
> 


Yeah, that's exactly the same error I experienced. What are people on the forums saying? It seems it's not limited to one GPU vendor, at least?

What does "PerfCap" as PWR mean?


----------



## GanGstaOne

Quote:


> Originally Posted by *sew333*
> 
> Had the same issue on my Gigabyte 1080 Xtreme.It happened once.
> 
> At some point after having my system powered on, while the system was some time idle, my GTX 1080 becomes permanently throttled. The graphics clock was halved from a maximum of 1950-1960mhz (factory OC) to 1285mhz . When this occurs, it doesn't matter how much I load my card, be it games or other demanding applications, it does not ever go past the above caps. The only solution I have had up until now was to reboot the system .GPU-Z reports the "perfcap" reason as PWR
> 
> I have 0 driver crashes. And normally the event log will show the drive crash report which in my case it doesn't show. So its not related to driver crash.
> 
> But maybe some kind of unstable factory oc by my card? Mini crash?
> 
> Please answer me guys. I am really worried about card and maybe rma this thing.
> 
> Also there are many reports by users in geforce forum about similiar issues. THANK YOU
> 
> 
> 
> 
> 
> 
> 


I have tested a Gigabyte 1080 Xtreme and I own a Gigabyte 1080 G1, and I never had that problem. I also tested an EVGA 1080 FTW, an FE and a Zotac 1080 AMP Extreme, all of them with different BIOSes from other cards. Everything works, FE or not, no matter whether it's 2x8-pin or 1x8+1x6-pin. Try another BIOS on your card; it may fix the problem. You can use any BIOS from the TechPowerUp database, just don't flash a Galax BIOS.


----------



## sew333

.


----------



## sew333

Quote:


> Originally Posted by *Bdonedge*
> 
> Yeah that's exactly the same error I experienced. What are people on the forums saying? It seems like it's not only a specific GPU vendor at least?
> 
> What does "perfcap" as PWR mean


There are many people on the GeForce forum with the same issue, and many opinions on why it can happen: unstable cards, the PSU, PCIe power management, drivers, purportedly binned GPU overclock speeds.

I'm confused now; it did only happen once. Is it something with my card? Drivers pushing my card's factory OC too hard?


----------



## sew333

Quote:


> Originally Posted by *GanGstaOne*
> 
> I have tested Gigabyte 1080 Xtreme and i own Gigabyte 1080 G1 never had that problem also tested EVGA 1080 FTW, FE and Zotac 1080 AMP Xtreme all card i have tested with different bios from other cards all works no matter FE or not no matter if its 2x8 pin or 1x8+1x6 pin all bios works try other bios on your card may fix the problem you can use any bios from techpowerup database just Dont flash Galax bios


My Gigabyte 1080 Xtreme is stable overall at normal clocks, no crashes etc.

Did you leave those cards idle for 4-5 days and then try? Just check it. I have to leave mine idle for a few days to trigger that throttle bug in 3D apps. The PerfCap reason in GPU-Z is PWR.
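If you want to watch for that cap without GPU-Z: `nvidia-smi -q -d PERFORMANCE` prints a "Clocks Throttle Reasons" section, and something like this could pick out the active ones (the sample text below is hand-written to match the report's shape, not captured from my card):

```python
# Pull the 'Active' entries out of nvidia-smi -q -d PERFORMANCE output.

def active_throttle_reasons(report):
    reasons, in_section = [], False
    for line in report.splitlines():
        stripped = line.strip()
        if stripped.startswith("Clocks Throttle Reasons"):
            in_section = True
            continue
        if in_section:
            if ":" not in stripped:  # left the section
                break
            name, _, state = stripped.partition(":")
            if state.strip() == "Active":
                reasons.append(name.strip())
    return reasons

# Hand-written sample ("SW Power Cap" corresponds to GPU-Z's PWR cap):
sample = """\
    Clocks Throttle Reasons
        Idle                        : Not Active
        SW Power Cap                : Active
        HW Slowdown                 : Not Active
"""
print(active_throttle_reasons(sample))  # ['SW Power Cap']
```

If "SW Power Cap" stays Active even at low load, that matches the stuck-at-1278 MHz symptom people are describing.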


----------



## djtmalta

I have an EVGA FTW edition GTX 1080. It has two 8-pin connectors, just like the Zotac GTX 1080 AMP! Extreme. I was able to put the Zotac AMP! Extreme BIOS on my EVGA FTW by using Zotac's new overclocking tool, which has a built-in BIOS flasher. My EVGA FTW with the Zotac AMP! Extreme BIOS worked flawlessly and performed amazingly: I got higher 3DMark scores, and my clock never dropped below 2000 MHz when playing games. This is without any overclocking software.

My question is: I know the Zotac AMP! Extreme pulls 270 W, so I assume the BIOS is designed for that, while my EVGA FTW has a max power draw of 215 W. Is it a bad idea to leave the Zotac AMP! BIOS on my EVGA FTW card?

thanks,

David


----------



## GanGstaOne

Quote:


> Originally Posted by *sew333*
> 
> My Gigabyte 1080 Xtreme is stable overall on normal clocks,no crashes etc .
> 
> Do you left that cards in idle 4-5 days and then try? Just check it. I must left for few days idle to trigger that throttle bug in 3d apps. Perfcap is PWR related from GPUZ.


What do you mean, leave it idle for 4-5 days? No, I turn my PC on only when I use it. But I have used the 1080 G1 for up to 12 hours in games like Fallout 4 with a massive amount of mods, so a very heavy load, with no problems at all; the card stays at 2080-2100 MHz at 55-60°C. And the card has stayed idle for a whole day.

BTW, with the 1080 G1 BIOS you can push your card's fan speed from 3200 RPM to 3800 RPM, and with the FE BIOS even up to 4000 RPM. I don't know why Gigabyte lowered the fan speed so much this time.


----------



## sew333

Quote:


> Originally Posted by *GanGstaOne*
> 
> What do you mean 4-5 days all day for pc to run no i turn on my pc only when i use it but i have used the 1080 G1 for up to 12 hours in games like Fallout 4 with massive amount of game mods so very havy game no problems at all card stays at 2080-2100mhz on 55-60C and the card has stay in idle for whole day
> 
> BTW with 1080 G1 bios you can push your cards fan speed from 3200rpm to 3800rpm with FE bios even uo to 4000rpm dont know why gigabyte has lower the fan speed so much this time


When the PC has been idle for a few days, the boost clock gets stuck at 1278 MHz in 3D applications until I reboot, and then it's back at full boost clock (1990-2000 MHz).

I know, weird issue.


----------



## juniordnz

Quote:


> Originally Posted by *sew333*
> 
> When the PC has been idle for few days. The boost clock speed gets stuck at 1278 Mhz in 3d applications, until I reboot my PC and then it's at full boost clock speed again ( 1990-2000mhz ).
> 
> I know weird issue.


Happened to me once or twice. Both times I had two 3D applications open, and after shutting one down it went away.

Another weird thing that happens here: sometimes, even when I'm just staring at my desktop, the clocks will spike up halfway and repeat those spikes for a while.


----------



## Goroshi

So I returned the FE I got and managed to pick up an FTW. Managed to OC the card stable at 2114 MHz core and 11000 MHz on the memory. Quite happy with this card overall; the only thing I'm a little worried about is the temps. While playing GTA V for a couple of hours I have to run the fans at about 85% to keep the card from going over 70°C. Is that normal? I have seen people say they stay sub-60°C with the fans at 100%.


----------



## octiny

Picked up two FE's a couple weeks ago for my new build.

The top card gets a little toasty with the OC, as expected, but nothing a fan curve and 3000 RPM Noctua fans couldn't solve


----------



## Maligx

Quote:


> Originally Posted by *fat4l*
> 
> So i put my card under water. Used liquid metal. Load temps 34C lol. Water temps 25C. 2164 mhz boost so far..


Are you using Coollaboratory Liquid Ultra?


----------



## juniordnz

Quote:


> Originally Posted by *Goroshi*
> 
> So I returned the FE I got and managed to pick up an FTW. Managed to OC the card stable at 2114Mhz core and 11000Mhz on the memory. Quite happy with this card overall the only thing I am a little worried about is the temps. While playing GTA V now for a couple of hours I have to have the fans at about 85% to keep the card from going over 70C is that normal? I have seen people say they are sub 60C with the fans at a 100%.


Your temps are normal. Try ramping the fans up to 100% and see if it gets better. Good airflow also plays a crucial role; make sure you have something blowing cold air right at the card.


----------



## Goroshi

Quote:


> Originally Posted by *juniordnz*
> 
> your temps are normal. Try ramping up to 100% fans and see if it gets better. Good airflow also play a crucial role, make sure you hame something blowing cold air right to the card.


At 100% it probably drops to around 65-67°C during gameplay in GTA V. How far down do you think temps would go if I put a hybrid cooler on this thing? Sub-50?


----------



## juniordnz

Quote:


> Originally Posted by *Goroshi*
> 
> At a 100% probably drops to around 65-67C during gameplay in GTA V. How far down would temps go if I put a hybrid cooler on this thing you think? sub 50?


No idea. Below 49°C would be optimal, but I don't know if a hybrid could keep it that low.

Anyone?


----------



## axiumone

Quote:


> Originally Posted by *juniordnz*
> 
> Have no idea. Below 49 would be optimal, but I don't know if a hybrid could keep it that low.
> 
> Anyone?


Depends on the usage. I have two hybrids in SLI. [email protected] keeps GPU usage at around 60-70%. With my ambient temp of 77°F, I'm hovering around 48-49°C with both cards clocked at 2088. If both cards are under 100% load, the temps top out around 58°C. That's with both cards using Corsair SP120 high static pressure fans in push/pull at around 1300 RPM.

Also, the clock throttling starts at 40°C, not 49°C.


----------



## fat4l

Quote:


> Originally Posted by *Maligx*
> 
> Are you using Coollaboratory Liquid Ultra?


No. I used Thermal Grizzly Conductonaut.


----------



## TWiST2k

Quote:


> Originally Posted by *Goroshi*
> 
> At a 100% probably drops to around 65-67C during gameplay in GTA V. How far down would temps go if I put a hybrid cooler on this thing you think? sub 50?


My FTW barely goes over 60°C with the fans @ 100%. I have a Corsair Air 540 case with all Noctua 3K fans. I need to set up a sig for my rig; I'll have to add that to my action item list, lol.

I would also switch to the second BIOS on the FTW. I did not test to be sure, but I know I have mine on the second BIOS and have the 130% power limit. I also noticed today that there is a major CPU-hogging issue with imprecision-X that I do not experience when using Afterburner.


----------



## nexxusty

Quote:


> Originally Posted by *axiumone*
> 
> Depends on the usage. I have two hybrids in sli. [email protected] keeps the gpu usage at around 60-70%. With my ambient temps of 77c, I'm hovering around 48-49c with both cards clocked at 2088. If both cards are under 100% load, the temps top out around 58c. That's with both cards using corsair SP120 high static pressure fans in push/pull at around 1300 rpm.
> 
> *Also, the clock throttling starts at 40c, not 49c.*


Seriously sick of this crap being posted.

Prove it! How about that?

Many, many others would refute your claim. I'm one of them.


----------



## Derpinheimer

I believe Anandtech found it to throttle at 40°C? My card does not, however; it throttles at more like 45.


----------



## nexxusty

Quote:


> Originally Posted by *Derpinheimer*
> 
> I believe Anandtech found it to throttle at 40c? My card does not, however. It throttles at like 45.


Anandtech is a joke.

That's EXACTLY what I was going to say. Throttle temps ARE different between cards.

Simply put, that's the reason I don't agree with axiumone or Anand. Data or not, my data shows differences, as does yours, it seems.

Thanks for the reply.


----------



## Derpinheimer

Really? I've always found their reviews to be nice and thorough. What's wrong with them?


----------



## nexxusty

Quote:


> Originally Posted by *Derpinheimer*
> 
> Really? I've always found their reviews to be nice and thorough. Whats wrong with them?


I don't know. I have a knack for reading reviewers... never liked Lal Shimpi.

I feel like there are flaws in his reviews. He used to be better, in his early days. Just like Tom from Tom's Hardware.

They both suck now, IMO. For video cards I go to Guru3D; they are the best for GPU reviews. For CPU and RAM reviews I talk to my boys here.

Never an issue. I'd use Anand for SSD reviews or whatever else. GPU? No.

I feel like they're a little noobish and act like they are capable of more than they actually are. Comes off like that to me, anyway.


----------



## axiumone

Quote:


> Originally Posted by *nexxusty*
> 
> Seriously sick of this crap being posted.
> 
> Prove it! How about that?
> 
> Many, many others would refute your claim. I'm one of them.


Have you ever had your FE start out at 30°C? This is just what I see with my hybrid cards. My idle is around 34°C, and as soon as the temps go over 40°C the cards tick down one step. It may very well depend on the quality of the chip and vary card to card; I have no idea. Like most others, I can't wait for the ability to customize the BIOS and regain control over my cards.


----------



## nexxusty

Quote:


> Originally Posted by *axiumone*
> 
> Have you ever had your FE below at 30c to start with? It's just what I see with my hybrid cards. My idle is around 34c, as soon as the temps go over 40c the cards tick down one step. It may very well be dependent on the quality of the chip and vary card to card, I have no idea. Just like most others, I can't wait for the ability to customize the bios and regain control over my cards.


No offense meant to you, bro.

I just think there is more to it than saying "Pascal throttles at 39°C".

Me too, man. Me too. Not feeling the hardware mods like I used to. No time anymore; OCing takes enough time...


----------



## Derpinheimer

The temperature it throttles at seems to vary as well, maybe based on the voltage?

For example, before, I was using my OC curve and it throttled at ~45°C (either at 45 or 46).
Now I tried default, and I'm getting "PerfCap reason: Thrm" at just 41°C?

I might be giving bad info here. I know for sure that I saw PerfCap at those values, though; I just might be off on the exact temperature it started at.


----------



## escalibur

Guys what clocks are you getting with EVGA FTWs?


----------



## fat4l

So guys....... 2202 MHz so far.... Going strong


----------



## KickAssCop

Gaming or benching? Air or Water?


----------



## fat4l

Quote:


> Originally Posted by *KickAssCop*
> 
> Gaming or benching? Air or Water?


Benching and gaming. On water. TDP modded. Holding stable clocks, not even one drop. Same for the volts.









Here... valley bench.

Temps ~33°C (water temp 25°C), liquid metal on the core!
Core 2202 MHz
Mems 11008 MHz
TDP mod done per der8auer's guide.











*FS X: 12 261 Graphics score*
http://www.3dmark.com/fs/9630685

*3DMark TimeSpy: 8304 Graphics Score*
http://www.3dmark.com/3dm/13850050


----------



## tin0

So I traded my abysmally clocking FE for an MSI ARMOR 8G OC card. Taking this one since it's the same PCB as the MSI GAMING X and Z, so I can use the GAMING Z BIOS on it. My FE couldn't clock any higher than 1976 MHz, no matter which BIOS or which settings in AB, even at a max load temp of 49°C with the aircon blowing over the card.

Still have the Corsair HG10 bracket lying around; hopefully I can still make it fit, since the PCB is totally different from the FE's. Not scared of a little modding


----------



## Goroshi

Quote:


> Originally Posted by *TWiST2k*
> 
> My FTW barely goes over 60c with the fans @ 100%. I have a Corsair Air 540 case with all Noctua 3k fans, I need to setup a sig for my rig, will have to add that to my action item list lol.
> 
> I would also switch to the second bios on the FTW, I did not test to be sure, but I know I have mine on the second bios and have the 130% power limit. I also noticed today that there is a major CPU hogging issue with imprecision-X that I do not experience when using afterburner.



Big difference there then, ha. Do you have your Noctua fans at max speed to get those temps, though?

My FTW is in a Define R4 with the stock case fans at the front and a Corsair AIO on the bottom pulling air in toward the card, so definitely not ideal airflow. I guess I should just be happy with how it's performing for now; maybe in the future I'll put a hybrid on it to bring temps and noise way down.

So you are saying the slave BIOS has a higher power limit? My card doesn't ever hit the power limit anyway; I'm always hitting the voltage limit.


----------



## HeXBLiTz

Looking to jump in on the 1080 Club but wanted some input.

I'm coming from a 2015 Titan X on water, OC'd to 1450 on the core and +400 on the RAM.

Only gaming at 1080p now, as I like to stream and it's just easier to duplicate and export a 1080p signal, but at the same time I love hitting the 144 FPS my monitor supports (or close to it).

With BF1 coming, I'm hoping the 1080 can do that for me, and there are a few waterblock-compatible options, from the Strix OC to the EVGA SC to the Founders.

Basically: is it worth the extra money for the Strix with its higher default boost clock, or is there a card that'll surpass its clock speed when overclocking?

Also, is the upgrade worth it for me?


----------



## wangle0485

Quote:


> Originally Posted by *fat4l*
> 
> Benching and gaming. On water. Tdp moded. Holding stable clocks.not even one drop. Same for volts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here... valley bench.
> 
> Temps ~33C(water temp 25C)-liquid metal on the core!
> Core 2202MHz
> Mems 11008MHz
> TDP mod done-der8auer's guide.
> 
> *FS X: 12 261 Graphics score*
> http://www.3dmark.com/fs/9630685
> 
> *3DMark TimeSpy: 8304 Graphics Score*
> http://www.3dmark.com/3dm/13850050


I'm hitting the same voltage limit as on my Super Jetstream; loving your temps and clocks, though!

I'm on air and can hit 2063 on the core @ 49°C. I've got a couple of AF120 fans throwing air exclusively at the GPU, along with 75% fan speed on the card itself.

Do you find your card going past the 104% power limit at all?


----------



## TWiST2k

Quote:


> Originally Posted by *Goroshi*
> 
> Big difference there then ha, do you have your Noctua fans at max speed to get those temps though?
> 
> My FTW is in a Define R4 with the stock case fans at the front and a corsair APU on the bottom pulling air into card, so definitely not ideal airflow I guess I should just be happy with how it's performing for now maybe in the future I'll put a hybrid on it being temps and noise way down.
> 
> So you are saying that the slave bios has higher power limit? My card doesn't ever hit power limit anyways I'm always hitting voltage limit.


I have all of my case and radiator fans for my CPU set at 20/20, 40/40 and 60/100. They are all on a splitter and controlled by the CPU header on the motherboard, so they all spin up and down together.

I have not tested it so I cannot confirm, but I did read that BIOS 2 has the higher power limit. I am thinking of hooking up one of my AIO coolers to my FTW, but haven't found an adapter like the G10 or HG10 that will fit this card for sure. I heard EVGA is going to release a hybrid cooler for the FTW, but who knows how overpriced it's going to be.


----------



## looniam

Quote:


> Originally Posted by *nexxusty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Derpinheimer*
> 
> Really? I've always found their reviews to be nice and thorough. Whats wrong with them?
> 
> 
> 
> I don't know. I have a knack for knowing reviewers..... never liked La Shimpi.
> 
> I feel like there are flaws in his reviews. He used to be better. In his early days. Just like Tom from Tom's Hardware.
> 
> They both suck now IMO. For video cards I go to Guru3d. They are the best for GPU reviews. For CPU and RAM reviews I talk to my boys here.
> 
> Never an issue. I'd use Anand for SSD reviews or whatever else. GPU? No.
> 
> I feel like they're a little n00bish and act like they are capable of more than they actually are. Comes off like that to me anyway.
Click to expand...

i guess you don't know anand lal shimpi left two years ago to work for apple, and ryan smith, who is well respected among his peers (hilbert included), has been the gpu editor since. temp throttling has been known to happen with maxwell; there is a bios modding guide to remove it:
https://www.computerbase.de/forum/showthread.php?t=1445972 (yeah, it's in german, but you know, google translate if you don't use chrome)
Quote:


> *Voltage Table (980 Ti, Titan X) --- max. set voltage and throttle limits*
> .
> .
> .
> *Temp throttle limit.* This defines how far the card is allowed to throttle. Currently it seems the 900 series throttles down a voltage step somewhere around 50° - 60°, which also reduces the boost clock (depending on how you set the Voltage Table and Boost Table).
> To prevent premature throttling it is advisable to set the regulator in the 3rd column to the same value.
> The max. *Temp Target is not circumvented*; this simply prevents the card from reducing the voltage by one step on the way to the max. Temp Target.
> .


personally, i've seen this behavior dozens of times myself and in the 980 ti owners thread. with pascal having a higher transistor density, it's not unfathomable to see the behavior at a lower temp.

will every card have it? no, but since a bios mod removed it from maxwell, that is a possible reason. and it certainly isn't grounds to dispute every claim that it does happen.


----------



## juniordnz

I'm pretty sure my 1080 Armor gets its first clock throttle at 49°C. Will take a better look today. I'm on air in a 30°C room, so maybe it's getting hot so quickly that I don't even notice the first step down. It opens at 2088 and stabilizes at 2050. 2062 if it's a not-so-heavy game/bench.

Hopefully BIOS modding will give us some control over this thermal throttle. There's no way it's a sane safety measure. Above 80°C? OK! But not in the 50s.


----------



## SauronTheGreat

Why can't I read the ASIC quality of my card? It says ASIC quality is not supported.


----------



## GreedyMuffin

ASIC quality readings are not supported on the 1080/Titan X/1070.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> if you are on a single card, change the PCIE slot to Gen 2 - your CPU score will return to normal.


Confirming. On an MSI board, changing to GEN1/GEN2/GEN3 or keeping it on AUTO does not change anything at all for the 6700K with a single 1080.


----------



## juniordnz

Why are MSI cards so expensive? An Armor costing 10 bucks more and a Gaming X 40 bucks more than an EVGA FTW? That makes no sense. What's so revolutionary about MSI that justifies the premium price?


----------



## pez

Quote:


> Originally Posted by *juniordnz*
> 
> Why are MSI cards so expensive? An Armor costing 10 bucks more and a Gaming X 40 bucks more than an EVGA FTW? That makes no sense. What's so revolutionary about MSI that justifies the premium price?


I think their coolers are among the more silent and efficient, but other than that....I'm honestly not sure. Quite the joke, honestly.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Why are MSI cards so expensive? An Armor costing 10 bucks more and a Gaming X 40 bucks more than an EVGA FTW? That makes no sense. What's so revolutionary about MSI that justifies the premium price?


My MSI FE gets to 2260 but drops down bit by bit quickly, on the stock reference cooler.


----------



## juniordnz

Quote:


> Originally Posted by *pez*
> 
> I think their coolers are among the more silent and efficient, but other than that....I'm honestly not sure. Quite the joke, honestly.


I can't speak for the Gaming series, but the Armor gets pretty hot! The only way to sustain 60C gaming is with fans@100% and a huge fan blowing cold air like a turbine with the case open (posted some pics earlier in this thread).

Maybe it's the hype price over the Twin Frozr V or whatever it's called.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> My MSI FE gets to 2260 but drops down bit by bit quickly, on the stock reference cooler.


That's just luck of the draw I guess. My Armor gets a pretty mediocre OC on air (2050MHz stable with the fan turbine mod). Bear in mind they all (Armor, Gaming X/Z) share the exact same PCB design.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> That's just luck of the draw I guess. My Armor gets a pretty mediocre OC on air (2050MHz stable with the fan turbine mod). Bear in mind they all (Armor, Gaming X/Z) share the exact same PCB design.


Tried a few of the Gaming X BIOSes; much worse overclockers even with the higher power limit. Try the FE BIOS. Also, the reference fan gets up to 4200RPM








and mine rarely gets to 100%; it mostly sits around 68 degrees in gaming.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Tried a few of the Gaming X BIOSes; much worse overclockers even with the higher power limit. Try the FE BIOS. Also, the reference fan gets up to 4200RPM
> 
> 
> 
> 
> 
> 
> 
> 
> and mine rarely gets to 100%; it mostly sits around 68 degrees in gaming.


Thanks for the hint. But I'm returning it tomorrow for a FTW, so no more fiddling with BIOS flashing on this one.

Hopefully I'll get better silicon with EVGA and better temps at load.

Also, the MSI custom PCB is so wide that I couldn't even close my side panel with 2 fans.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Thanks for the hint. But I'm returning it tomorrow for a FTW, so no more fiddling with BIOS flashing on this one.
> 
> Hopefully I'll get better silicon with EVGA and better temps at load.
> 
> Also, the MSI custom PCB is so wide that I couldn't even close my side panel with 2 fans.


Got a Titan X coming Saturday, will probably sell this 1080 after a few tests.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Got a Titan X coming Saturday, will probably sell this 1080 after a few tests.


That's nice! Post some pics of that baby when it arrives









What % increase in performance are you expecting? Haven't seen anything about the Titan lately...


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> That's nice! Post some pics of that baby when it arrives
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What %increase in performance are you expecting? Haven't seen anything about titan lately...


20 to 25% at best.


----------



## Barterlos

Titan X is a beast, no question about that, but strangely the Titan XP is not the KING of the Pascal family; it's not the full-fat Pascal die. Maybe this time around there will be no full Pascal die in the GeForce series and no 1080 Ti


----------



## GreedyMuffin

Quote:


> Originally Posted by *Barterlos*
> 
> Titan X is a beast, no question about that, but strangely the Titan XP is not the KING of the Pascal family; it's not the full-fat Pascal die. Maybe this time around there will be no full Pascal die in the GeForce series and no 1080 Ti


That is what I'm thinking.

No way AMD can compete with the Titan X, so Nvidia won't need to release a 1080 Ti.


----------



## Barterlos

Quote:


> Originally Posted by *GreedyMuffin*
> 
> That is what I'm thinking.
> 
> No way AMD can compete with the Titan X, so Nvidia won't need to release a 1080 Ti.


yes, with another GPU released on Pascal, Nvidia is basically competing with themselves; that's why they removed the SLI fingers on the GTX 1060. GTX 1060s at 2GHz in SLI would probably be faster than a GTX 1080 in DX11 games


----------



## GanGstaOne

1080 TI is on its way with 3328 cores 320-bit 10GB GDDR5X
Titan X is 3584 and 384-bit 12GB GDDR5X


----------



## Xenozx

Hey all, I have an EVGA FTW 1080 on air. I am able to get 2114 core in game, and for the most part it will stay above 2000 all the time, depending on temp/load; usually I see around 2047. I can run 3DMark and The Witcher at these speeds. The only place I have issues is with DirectX 12 tests, which is weird, where I do need to lower it a little, but all DirectX 11 games run flawlessly at these speeds.

I have the stock BIOS, but I am interested in trying one that increases TDP, because mine does shoot quite high. My temps seem in order for air though; I usually never see my GPU go over 60c during Firestrike benchmarking. It sits around 77c after playing Witcher 3 for a few hours at 2047MHz.

What do you guys think, is it worth it? Could I break the 2200MHz barrier?

Are there BIOSes that increase TDP and voltage? What's odd is I tested a lot of overclocks, and leaving core voltage in Afterburner at 0 is actually the most stable for me. Does that make sense? I thought adding voltage would add stability, but it seems to take it away.


----------



## jase78

Has the guy with the "wonder bios" shared any info on his results before and after? I understand he doesn't want to share it, but wouldn't he want to contribute to these discussions about what Pascal can do once its limits are removed?


----------



## Joshwaa

Quote:


> Originally Posted by *Xenozx*
> 
> Hey all, I have an EVGA FTW 1080 on air. I am able to get 2114 core in game, and for the most part it will stay above 2000 all the time, depending on temp/load; usually I see around 2047. I can run 3DMark and The Witcher at these speeds. The only place I have issues is with DirectX 12 tests, which is weird, where I do need to lower it a little, but all DirectX 11 games run flawlessly at these speeds.
> 
> I have the stock BIOS, but I am interested in trying one that increases TDP, because mine does shoot quite high. My temps seem in order for air though; I usually never see my GPU go over 60c during Firestrike benchmarking. It sits around 77c after playing Witcher 3 for a few hours at 2047MHz.
> 
> What do you guys think, is it worth it? Could I break the 2200MHz barrier?
> 
> Are there BIOSes that increase TDP and voltage? What's odd is I tested a lot of overclocks, and leaving core voltage in Afterburner at 0 is actually the most stable for me. Does that make sense? I thought adding voltage would add stability, but it seems to take it away.


Have you clicked over to the slave BIOS with the higher TDP?


----------



## juniordnz

Quote:


> Originally Posted by *Joshwaa*
> 
> Have you clicked over to the slave BIOS with the higher TDP?


So FTW Slave BIOS has higher TDP than the main one?


----------



## Joshwaa

Quote:


> Originally Posted by *juniordnz*
> 
> So FTW Slave BIOS has higher TDP than the main one?


Yes, it has a 130% power limit compared to 120%.


----------



## juniordnz

Quote:


> Originally Posted by *Joshwaa*
> 
> Yes, it has a 130% power limit compared to 120%.


Nice to know that, thanks! Will use the slave one to overclock it as soon as I get it then. After we get access to BIOS modding it won't matter anymore though.


----------



## GanGstaOne

wow, all the Nvidia 1080 cards so far, even the EVGA Classified, Gigabyte Xtreme, Zotac AMP Extreme and so on, come with Micron memory


----------



## Joshwaa

Quote:


> Originally Posted by *juniordnz*
> 
> Nice to know that, thanks! Will use the slave one to overclock it as soon as I get it then. After we get access to BIOS modding it won't matter anymore though.


My FTW never goes over 109% anyways.


----------



## Xenozx

Quote:


> Originally Posted by *Joshwaa*
> 
> Have you clicked over to the slave BIOS with the higher TDP?


No, I did not know there was a switch on my card for that. I am pretty sure I've seen it go well above 120 before, and maybe that's my issue? I am +110 core over the stock clocks of the FTW though, and +550 on the memory. I will give this a try! Thanks for the heads up.


----------



## juniordnz

Quote:


> Originally Posted by *Joshwaa*
> 
> My FTW never goes over 109% anyways.


Not even on heavy benchmarks? Have you tried Firestrike Ultra Stress test? Mine will reach 121% and power throttle.


----------



## Joshwaa

Yea, I have tried all the different benches and stress tests. I hit voltage regulation before the TDP gets close to 110%. Once I am on water I expect this to change, as it should hold higher clocks for longer with the temp throttle not kicking in.


----------



## Barterlos

I remember when I got my GTX 970 G1 Gaming SLI, my 970s always throttled because the VRM section was getting too hot; when I put another fan near the VRM section the throttling went away. Maybe it's the same with the GTX 1080 FE. My 2038 overclock always ends up at 1911/46MHz after 20 mins of gameplay due to that throttling, and my temps never go above 78c. When I check MSI Afterburner it shows I was hitting the power target, but playing Witcher 3 in 4K with the RivaTuner overlay my FE rarely hits that 120% PT. Go figure.


----------



## LolCakeLazors

Sent my card in for RMA yesterday; looks like it's arriving Tuesday at EVGA. I'll probably receive my replacement the week after next. In the meantime... back to my 290X.


----------



## nexxusty

Quote:


> Originally Posted by *GanGstaOne*
> 
> 1080 TI is on its way with 3328 cores 320-bit 10GB GDDR5X
> Titan X is 3584 and 384-bit 12GB GDDR5X


No.
Quote:


> Originally Posted by *GanGstaOne*
> 
> wow, all the Nvidia 1080 cards so far, even the EVGA Classified, Gigabyte Xtreme, Zotac AMP Extreme and so on, come with Micron memory


This surprises you? GDDR5X is only manufactured by Micron. No one else.


----------



## Thetbrett

There I was saying I was going to pass on the 1080, and was certainly not paying 1200 Aussie dollars for one, when I noticed Amazon would ship to Australia. I know I should support local business and all that, but I nabbed a Zotac AMP for 880 AUD including postage. That's 300 cheaper than anywhere here, and well under the 1000 limit I set myself for a card, not by much mind you. I always thought there were geo restrictions on cards. Anyway, it's on its way. I also have a buyer for my 980 Ti at 400, so 480 for a 1080? Couldn't say no. Just hoping there are no warranty issues, but the local Zotac site said nothing about buying from overseas; I checked that out before I bought. Gotta wait now, but boy, I love the feeling of new gear coming. I will be sure to validate and join. Anyone else here with the AMP (not Extreme) edition?


----------



## fat4l

Quote:


> Originally Posted by *wangle0485*
> 
> Hitting the same voltage limit as my Super Jetstream, loving your temps and clocks though!
> 
> I'm on air and can hit 2063 on the core @ 49c; got a couple of AF120 fans throwing air exclusively at the GPU, along with 75% fan speed on the card itself.
> 
> Do you find your card going past 104% power limit at all?


No not really








I modded the TDP resistor (shunt), and if you do +20% in Afterburner you will never hit the +120% target ...
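To see why a shunt mod keeps the card away from its power target, here's a back-of-the-envelope sketch. The board senses current from the voltage drop across the shunt; soldering an equal-value resistor in parallel halves the effective resistance, so the same real draw is reported as roughly half. All resistor values and wattages below are illustrative assumptions, not measurements from any specific card (the 180 W / 120% figures are the FE reference numbers):

```python
# Illustrative sketch (made-up values): how a shunt mod skews the power
# reading that the card compares against its limit. Current is sensed as
# V_drop / R_shunt, so halving the shunt halves the reported power.

def reported_power(real_power_w: float, r_stock_ohm: float, r_modded_ohm: float) -> float:
    """Reported power scales with the modded/stock shunt ratio."""
    return real_power_w * (r_modded_ohm / r_stock_ohm)

R_STOCK = 0.005            # hypothetical 5 mOhm sense resistor
R_MODDED = R_STOCK / 2     # equal resistor soldered in parallel

TDP_W = 180.0              # GTX 1080 FE reference TDP
LIMIT = 1.20               # power slider maxed at 120%

real_draw = 250.0          # actual watts pulled under load (assumed)
seen = reported_power(real_draw, R_STOCK, R_MODDED)
print(f"Real draw {real_draw:.0f} W is reported as {seen:.0f} W "
      f"({seen / TDP_W:.0%} of TDP, limit is {LIMIT:.0%})")
```

So a card really pulling well past the limit reports itself comfortably inside it, which matches never seeing the +120% target trip after the mod.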


----------



## JCrimson

Hello guys! Two quick questions.

I used to be a part of the GTX Titan owners club, but I sold both of my OG Titans for one EVGA 1080 FTW. One thing I LOVED was the Skyn3t BIOS for them. Do we have anything like that available yet for the 1080s? In the works?

Also, what is the purpose of the dual BIOS on our FTWs if we can't modify the BIOS yet?


----------



## juniordnz

Quote:


> Originally Posted by *JCrimson*
> 
> Hello guys! Two quick questions.
> 
> I used to be a part of the GTX Titan owners club, but I sold both of my OG Titans for one EVGA 1080 FTW. One thing I LOVED was the Skyn3t BIOS for them. Do we have anything like that available yet for the 1080s? In the works?
> 
> Also, what is the purpose of the dual BIOS on our FTWs if we can't modify the BIOS yet?


There's no BIOS modding tool available for Pascal yet. We are all hoping to get one soon. Until then, the only purpose of the dual BIOS is to have a backup if you like to adventure into cross-flashing.


----------



## JCrimson

Who will be providing this tool? Is there a date or thread about it somewhere? Thanks for the help! I just want to kill the boost and run the core clock all the time lol.


----------



## Barterlos

Quote:


> Originally Posted by *JCrimson*
> 
> Who will be providing this tool? Is there a date or thread about it somewhere? Thanks for the help! I just want to kill the boost and run the core clock all the time lol.


prolly from the same guy who made the Kepler/Maxwell BIOS tweaker


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *JCrimson*
> 
> Who will be providing this tool? Is there a date or thread about it somewhere? Thanks for the help! I just want to kill the boost and run the core clock all the time lol.


There's a big discussion on the BIOS in this thread, and the tool is being worked on by the same people as the Maxwell tool (I believe), but long story short, nobody has decrypted the Pascal BIOS yet. It's pretty interesting following the discussion in that thread though, about how people have picked up on snippets of hex here and there and correlated them to definitions that apply to the cards.


----------



## toncij

Anyone got stats for the two FTW BIOSes? Clocks look the same, OC potential also; they look identical. I expected more from these. 2114 max stable (voltage in Precision to max, fan 100%, 120%/92°C) is not really the best. But still, one more thing bothers me: the performance of the 1080 is not really that great. Without an OC to 2.1 core / 5.5 mem it can't pass an OC'd Titan X at 1.5GHz, and max OC makes it only 13% faster. I expected more. Any way to raise voltage (past what Precision X does, ofc)?


----------



## jorpe

Quote:


> Originally Posted by *TWiST2k*
> 
> Man I have a 980 Ti Classy and I was pretty happy with it, but the 749 tag on the 1080 Classy turned me off to it, my 1080 FTW will be here tomorrow. What kind of speeds are you getting with the Classy?


2150 boost, 11000 memory. Me getting that Classy was a lesson in always getting the in-store warranty at Micro Center. The day the Classys came out I had a fan bearing let go on my MSI 980 Gaming 4G card. I took it into the store and got to pick the 1080 Classy (the only one in stock) off the shelf and pay the $100 difference. I was ready to just swap it straight across for the MSI 980 or equivalent.


----------



## jorpe

Quote:


> Originally Posted by *Xenozx*
> 
> Hey all, I have an EVGA FTW 1080 on air. I am able to get 2114 core in game, and for the most part it will stay above 2000 all the time, depending on temp/load; usually I see around 2047. I can run 3DMark and The Witcher at these speeds. The only place I have issues is with DirectX 12 tests, which is weird, where I do need to lower it a little, but all DirectX 11 games run flawlessly at these speeds.
> 
> I have the stock BIOS, but I am interested in trying one that increases TDP, because mine does shoot quite high. My temps seem in order for air though; I usually never see my GPU go over 60c during Firestrike benchmarking. It sits around 77c after playing Witcher 3 for a few hours at 2047MHz.
> 
> What do you guys think, is it worth it? Could I break the 2200MHz barrier?
> 
> Are there BIOSes that increase TDP and voltage? What's odd is I tested a lot of overclocks, and leaving core voltage in Afterburner at 0 is actually the most stable for me. Does that make sense? I thought adding voltage would add stability, but it seems to take it away.


Depending on your individual card, higher voltage can destabilize an OC very quickly.


----------



## Derpinheimer

Has anyone tried OCCT gpu test with error checking? I seem to be game stable at 1.093v / 2164 but get some errors in that test.


----------



## SAFX

Just got a 1080 SC, still in the box; holding out for a FTW, but none are in stock unless I settle for the FTW DT available now on Newegg.
Biggest concern is that DT cards do not overclock well.


----------



## Vellinious

Quote:


> Originally Posted by *jorpe*
> 
> Depending on your individual card, higher voltage can destabilize an OC very quickly.


I'd never flash to a bios that wasn't meant for the card I'm flashing it to. I see a lot of these guys doing it, but.....they're playin with fire. Especially since the FTW is a custom board.

Wait for the Pascal BIOS editor, increase the TDP and power limit, and see if you think you need to raise the voltage. I can tell you with a great amount of certainty, though, that unless your GPU is under water, increasing voltage isn't going to yield much of anything....if anything at all. Pascal's biggest hurdles are TEMPS and POWER LIMIT, not voltage.


----------



## toncij

Quote:


> Originally Posted by *Vellinious*
> 
> I'd never flash to a bios that wasn't meant for the card I'm flashing it to. I see a lot of these guys doing it, but.....they're playin with fire. Especially since the FTW is a custom board.
> 
> Wait for Pascal Bios Editor, increase the TDP and Power Limit and see if you think you need to raise the voltage. I can tell you with a great amount of certainty though, that unless your GPU is under water, increasing voltage isn't going to yield you much of anything....if anything at all. Pascal's biggest hurdles are TEMPS and POWER LIMIT, not voltage.


Actually, FTWs have no power problems and temps are not an issue. The chips are starved of voltage, or there is another problem we don't see.


----------



## Kenshiro 26

Received my EVGA 1080 FTW last week, huge step up from a reference R9 290X!


----------



## Barterlos

Guys, are you testing your overclock at 4K res?! At 1080p I don't have a problem with throttling for no reason; my speed is 2012MHz all the time at 99% GPU load. But at 4K it's a different story, like something is constantly monitoring GPU load, and when there is 100% GPU load for like 5-10 mins it will downclock even when temps are OK, like mid 60s, with spare power. I think in Pascal chips there are lots of different limiters affecting GPU speed that we don't know about yet. And strangely, Nvidia hard-locked the BIOS; they used very strong encryption. They definitely don't want people to know more about Pascal


----------



## xer0h0ur

Quote:


> Originally Posted by *Barterlos*
> 
> Guys, are you testing your overclock at 4K res?! At 1080p I don't have a problem with throttling for no reason; my speed is 2012MHz all the time at 99% GPU load. But at 4K it's a different story, like something is constantly monitoring GPU load, and when there is 100% GPU load for like 5-10 mins it will downclock even when temps are OK, like mid 60s, with spare power. I think in Pascal chips there are lots of different limiters affecting GPU speed that we don't know about yet


Your clock speed drops due to temperature not load percentage.


----------



## Barterlos

Quote:


> Originally Posted by *xer0h0ur*
> 
> Your clock speed drops due to temperature not load percentage.


Sorry to say, but not in my case. I've got a FE; with fans at 100% my temps are 64c max, and my boost clock at 4K is 1889/1911MHz at 99% load, compared to 2025/12MHz at 99% load at 1080p.

It's due to load.


----------



## axiumone

Quote:


> Originally Posted by *Barterlos*
> 
> Guys, are you testing your overclock at 4K res?! At 1080p I don't have a problem with throttling for no reason; my speed is 2012MHz all the time at 99% GPU load. But at 4K it's a different story, like something is constantly monitoring GPU load, and when there is 100% GPU load for like 5-10 mins it will downclock even when temps are OK, like mid 60s, with spare power. I think in Pascal chips there are lots of different limiters affecting GPU speed that we don't know about yet. And strangely, Nvidia hard-locked the BIOS; they used very strong encryption. They definitely don't want people to know more about Pascal


It's due to the power limit. I've noticed that at higher resolutions you'll hit the 120% much easier than at lower resolutions. If you're constantly close to 120% the card will downclock regardless of the temperature.
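A cheap way to settle the power-vs-temp throttle question is to log the card while playing and see which ceiling each sample is sitting at. The sketch below parses lines shaped like the CSV that nvidia-smi's query interface emits; the field list in the comment is nvidia-smi's, but the 98% heuristic, the throttle temperature, and all sample numbers are made-up assumptions for illustration:

```python
# Rough sketch: decide whether a logged sample looks power-limited or
# temp-limited. Input lines mimic the CSV that
#   nvidia-smi --query-gpu=power.draw,power.limit,temperature.gpu,clocks.gr --format=csv,noheader
# emits; the sample data below is invented for illustration.

def classify(line: str, temp_throttle_c: int = 83) -> str:
    draw, limit, temp, clock = [part.strip() for part in line.split(",")]
    draw_w = float(draw.split()[0])    # "215.8 W" -> 215.8
    limit_w = float(limit.split()[0])
    temp_c = int(temp.split()[0])
    if draw_w >= 0.98 * limit_w:       # pegged against the power target
        return "power-limited"
    if temp_c >= temp_throttle_c:      # at/above the temp target
        return "temp-limited"
    return "ok"

samples = [
    "215.8 W, 216.0 W, 64 C, 1911 MHz",   # power limit hit, temps fine
    "150.2 W, 216.0 W, 85 C, 1822 MHz",   # hot, plenty of power headroom
    "180.0 W, 216.0 W, 62 C, 2012 MHz",   # neither limit hit
]
for sample in samples:
    print(classify(sample))
```

If most samples at 4K come back "power-limited" while temps sit in the mid 60s, that matches the behaviour described above.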


----------



## Barterlos

Quote:


> Originally Posted by *axiumone*
> 
> It's due to the power limit. I've noticed that at higher resolutions you'll hit the 120% much easier than at lower resolutions. If you're constantly close to 120% the card will downclock regardless of the temperature.


Probably you are right. I've tested Witcher 3 in 4K with the Afterburner overlay; average power consumption was 105-115%, sometimes hitting 120% but not constantly.

Maybe it's that shunt resistor, cuz even with the 120% PT the VRM section is not letting the GPU draw 225 watts, or the Afterburner statistics are not 100% accurate


----------



## GreedyMuffin

Anyone with a 1080 can you post your Valley score? I think mine is a bit off?


Spoiler: Warning: Spoiler!


----------



## Asus11

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Anyone with a 1080 can you post your Valley score? I think mine is a bit off?
> 
> 
> Spoiler: Warning: Spoiler!


Seems a bit on the low side; could be anything. Make sure G-Sync is off.

My max score is 132; it was rank 2 till the Titan XP came out. Anyway, you should be getting 120s easy.

You're on water as well?


----------



## Barterlos

This is my score. Your score is OK, cuz mine is worse, but my min FPS is 5fps higher than yours



Spoiler: Warning: Spoiler!


----------



## Zeek

Picked up some stuff from Newegg earlier this week


----------



## Barterlos

holy smoke, amazing stuff, with a GTX 1080 144fps will be easy







))) games will be smooth as butter or silky smooth


----------



## Asus11

Is it me, or does anyone else think the 1080 Ti will still be based on GP104? It makes sense to carry on using 8GB of RAM and leaving the Titan X with the ''crazy amount'' of 12GB, so they can still price it competitively. Also, after seeing the missing parts on the 1070 (more missing, obviously) and 1080 PCBs, it makes sense that they could maybe add some more cores, close to 3k, and have it 15-20% faster than the current 1080, at a 210W TDP?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Asus11*
> 
> Seems a bit on the low side; could be anything. Make sure G-Sync is off.
> 
> My max score is 132; it was rank 2 till the Titan XP came out. Anyway, you should be getting 120s easy.
> 
> You're on water as well?


Yeah. No throttling. :/


----------



## GreedyMuffin

Quote:


> Originally Posted by *Asus11*
> 
> seems abit on the low side, could be anything, make sure g sync is off
> 
> my max score is 132 it was rank 2 till titan XP came out anyway you should be getting 120s easy
> 
> ur on water aswell?


What speed is your GPU on? Anybody else got any tips?


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> Thank's for the hint. But I'm returning it tomorrow for a FTW, so no more fiddling with BIOS flashing with this one.
> 
> Hopefully I'll get a better sylicon with EVGA and better temps at load.
> 
> Also, MSI custom PCB is so wide that I couldn't even close my side panel with 2 fans.


We gotta compare FTWs when you get yours bro!


----------



## Barterlos

But what about price? If the 1080 costs around $700-800 and the Titan X is like 25-30% faster at most, a 1080 Ti should cost at least $999, but the performance jump from the 1080 will be like 10 to 20% at best


----------



## TWiST2k

Quote:


> Originally Posted by *Zeek*
> 
> Picked up some stuff from Newegg earlier this week


What monitor is that?


----------



## Barterlos

ASUS 3D LightBoost 1080p 144Hz


----------



## Asus11

Quote:


> Originally Posted by *GreedyMuffin*
> 
> What speed is your GPU on? Anybody else got any tip?


I think it was at 2164 and +575 on the memory.

Also, I had it heavily optimised just to run that bench; I'll dig up some non-optimised scores.

I was getting 118-122 scores from the look of my screenshots, so yours isn't all that bad


----------



## Zeek

Quote:


> Originally Posted by *TWiST2k*
> 
> What monitor is that?


ASUS VG248QE


----------



## Asus11

Quote:


> Originally Posted by *Zeek*
> 
> ASUS VG248QE












I accidentally broke my brother's VG278H with built-in 3D in the car the other day; when I saw your monitor it reminded me


----------



## GreedyMuffin

G-Sync doesn't make a difference. I dunno..


----------



## Asus11

Quote:


> Originally Posted by *GreedyMuffin*
> 
> G-Sync doesn't make a difference. I dunno..


Disable G-Sync, close all other programs, and just have Valley open and see how you get on. Obviously you already know: power slider to max and core slider to max


----------



## GreedyMuffin

Quote:


> Originally Posted by *Asus11*
> 
> Disable G-Sync, close all other programs, and just have Valley open and see how you get on. Obviously you already know: power slider to max and core slider to max


Yeah. Power and temp maxed out, voltage is on stock (0%). GPU is running 2126 and mem is 1386MHz.


----------



## Zeek

Quote:


> Originally Posted by *Asus11*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I accidentally broke my brother's VG278H with built-in 3D in the car the other day; when I saw your monitor it reminded me


How does such a thing happen. I need details lmao


----------



## GreedyMuffin

With my card I could push it further. No voltage limit is ever met, only TDP.









Probably a 2200 mhz GPU.


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> We gotta compare FTWs when you get yours bro!


Sure thing! Hopefully I'll get it tomorrow, and as soon as I plug it in I'll post some "out of the box" info.

How did yours perform out of the box? Hopefully I can get more than the top 2088MHz my Armor does.


----------



## ikjadoon

Quote:


> Originally Posted by *Zeek*
> 
> Picked up some stuff from Newegg earlier this week


I love my ASUS VG248QE. If you care about motion clarity / blur, you should--without a doubt--check out Lightboost. It'll tweak the colors in a weird way, but there are color profiles to fix it. _Insanely_ smooth; @ 120Hz LB, it feels better than my old CRT:


----------



## GanGstaOne

2230 at 1.062V with a Gigabyte 1080 G1 Gaming, and that is with the Zotac FE BIOS; will test if it goes as high with its own BIOS


----------



## GreedyMuffin

What happens if I flash the FTW slave BIOS onto my card?

Will it work? I'd get a slightly higher TDP limit, right?


----------



## GanGstaOne

Quote:


> Originally Posted by *GreedyMuffin*
> 
> What happens if I flash the FTW slave BIOS onto my card?
> 
> Will it work? I'd get a slightly higher TDP limit, right?


Do you have the FTW 1080 slave BIOS?


----------



## ikjadoon

Quote:


> Originally Posted by *fat4l*
> 
> No. I used thermal grizzly conductonaut.


I thought this was the other Grizzly [Kryonaut] stuff, but this one is actual liquid metal. And, from the few benches I've found, it's as good as or even a little better than CLU, plus a fair bit easier to apply. Is that how yours went?

http://forum.notebookreview.com/threads/liquid-metal-showdown-thermal-grizzly-conductonaut-vs-cool-laboratory-liquid-ultra-pro.791489/


----------



## Snabeltorsk

Quote:


> Originally Posted by *GanGstaOne*
> 
> 2230 at 1.062V with a Gigabyte 1080 G1 Gaming, and that is with the Zotac FE BIOS; will test if it goes as high with its own BIOS


Did you try the Classified?


----------



## GanGstaOne

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Did you try the Classified?


No, I completely forgot; will test it tomorrow. But I found that with water cooling you can get a better overclock, because the card doesn't power any fans


----------



## xer0h0ur

I have a hard time believing the cooler fan(s) factor into the GPU's power envelope. If so, that is some grade-A senseless engineering.


----------



## GanGstaOne

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have a hard time believing the cooler fan(s) factor into the gpu's power envelope. If so, that is some grade A senseless engineering.


They do add to the overall TDP of the card, so yes, with them removed you get more power for OC


----------



## GanGstaOne

Go check the EVGA site and look at the card description; it says their fans use less power, which helps with GPU and memory OC.


----------



## juniordnz

Something like 2W for each 92mm fan running at full 12V. And that's shooting high.
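To put that estimate in perspective, here's a quick back-of-envelope sketch. The per-fan wattage and the 180W reference TDP are assumed round numbers for illustration, not measurements:

```python
# Back-of-envelope: how much do the cooler fans contribute to the card's
# power budget? Both figures below are assumed round numbers.

FAN_POWER_W = 2.0   # generous estimate per 92 mm fan at full 12 V
NUM_FANS = 2        # typical dual-fan cooler
TDP_W = 180.0       # GTX 1080 reference TDP

fan_share = (FAN_POWER_W * NUM_FANS) / TDP_W
print(f"Fans draw roughly {fan_share:.1%} of the card's power budget")
```

Even doubling the fan estimate keeps the share well under 5% of the budget, so removing the fans frees only a few watts for the core.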


----------



## Zenophobe

Can hit 2138 at the start, but that's it. The memory cannot be OC'd for me. Tried everything I could.


----------



## GreedyMuffin

Wow. Did a driver re-install (DDU + newest BIOS).

Getting 107 FPS at 2100/5400... This can't be normal?


----------



## GreedyMuffin

Increasing voltage won't work either...

Tried in both MSI Afterburner and EVGA Precision X.

The voltage is locked at 1.0500V?


----------



## Derpinheimer

Are you using manual curve? I can't get the voltage to go up unless I use a manual curve AND increase the voltage slider.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Derpinheimer*
> 
> Are you using manual curve? I can't get the voltage to go up unless I use a manual curve AND increase the voltage slider.


Overclocking like I've always done it with older cards: max out PT/TT, then add an offset to the GPU core and mem, perhaps increasing the voltage as well.

This POS refuses to work.


----------



## GreedyMuffin

If anyone can help me with my issue, it would be much appreciated!

I'm going to bed now; I've been tinkering with it for the last three hours. It's 04:05 AM here.

I could try reinstalling Windows, but that's a lot of work.

Cheers!


----------



## fat4l

So guys, is there any good BIOS to try out on the FE 1080 that could possibly help with OC or better scores?

I'm on water.
TDP mod done.
Thanks


----------



## Vellinious

Quote:


> Originally Posted by *toncij*
> 
> Actually, FTWs have no power problems and temps are not an issue. Chips are starved on voltage or there is another problem we don't see.


Temps are always an issue. lol


----------



## Fixxxer696

Hello all, new to the forum; I just had a couple of things I was curious about. Why is the Founders Edition getting so much flak? I have that edition, and it works flawlessly in my rig. Granted, I haven't overclocked yet; I currently can't because of summer SoCal weather. My room is a furnace, in the mid 70s at its coolest, around 2 in the morning when I get home from work. I needed a blower card for my Node 202 case, and I've had no issues whatsoever, hence my confusion about the FE's bad rap. I'm running Doom at 1080p 60FPS on Ultra settings; average GPU temp is around 76° and CPU at 41°.


----------



## ssgwright

Anyone know where I can pick up the new Titan X? I've been looking everywhere.


----------



## toncij

Quote:


> Originally Posted by *Vellinious*
> 
> Temps are always an issue. lol


Ambient 29°C and the card doesn't go over 62°C on air and 2114. I don't see that to be a thermal problem. Am I missing something?

Quote:


> Originally Posted by *Derpinheimer*
> 
> Are you using manual curve? I can't get the voltage to go up unless I use a manual curve AND increase the voltage slider.


Manual curve and main control are mutually exclusive, aren't they?


----------



## toncij

Quote:


> Originally Posted by *ssgwright*
> 
> Anyone know where I can pick up the new Titan X? I've been looking everywhere.


The only way is through Nvidia store directly. There is no other way (except eBay for $4000).


----------



## Derpinheimer

Quote:


> Originally Posted by *toncij*
> 
> Ambient 29°C and the card doesn't go over 62°C on air and 2114. I don't see that to be a thermal problem. Am I missing something?
> Manual curve and main control are mutually exclusive, aren't they?


You would think so, but it's not - at least not completely. I'm guessing the issue is that the GPU will never use manual-curve voltage values above stock without also setting your voltage offset to +X.

I.e. if stock is 1.05v, and I have a value set at 1.093v, it will never use that value unless I have set the voltage slider to +100% as well.
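That behavior can be sketched as a toy model. All numbers below are illustrative, not real Pascal telemetry; the point is just that curve points above the stock voltage cap are ignored until the slider raises the cap:

```python
# Toy model: curve points edited above the stock voltage cap are ignored
# unless the voltage slider also raises that cap. Numbers are invented.

STOCK_VMAX = 1.050   # stock maximum voltage (V)
HARD_VMAX = 1.093    # effective cap with the voltage slider at +100%

# (voltage V, boost clock MHz) - a manually edited frequency/voltage curve
curve = [(1.000, 1987), (1.025, 2050), (1.050, 2088), (1.093, 2126)]

def boost_clock(curve, slider_pct):
    """Highest clock whose voltage fits under the effective voltage cap."""
    vmax = STOCK_VMAX + (slider_pct / 100.0) * (HARD_VMAX - STOCK_VMAX)
    return max(mhz for v, mhz in curve if v <= vmax + 1e-9)

print(boost_clock(curve, 0))    # slider at 0%: the 1.093 V point is unused
print(boost_clock(curve, 100))  # slider at +100%: the top point is reachable
```

In this model the 2126 point only becomes selectable once the slider moves the cap from 1.050 V to 1.093 V, which matches the behavior described above.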


----------



## Agavehound

Hey gang, been tinkering with my FTW and finally was able to get it stable at +130/500 using the slave bios.

I ran Heaven without G-sync and it ran perfectly with a max temp of 57c and second run with G-sync off but I got some weird waves on my screen. What would cause the waves with G-sync off?


----------



## Derpinheimer

Are you sure the waves you're referring to aren't screen tearing (the primary thing G-sync fixes)?


----------



## xTesla1856

The Furys are sold; bit the bullet on an MSI GTX 1080 Gaming X. Should be here in about 2 weeks.


----------



## Agavehound

Quote:


> Originally Posted by *Derpinheimer*
> 
> Are you sure the waves you're referring to aren't screen tearing (the primary thing G-sync fixes)?


You're right, I thought it was screen tearing but wasn't sure. Should I run my benches with G-sync enabled or not? Are there other settings I can adjust to eliminate it? I've never spent much time OC'ing my GPU, I usually find a mild stable OC and let it run.


----------



## toncij

Quote:


> Originally Posted by *Agavehound*
> 
> Hey gang, been tinkering with my FTW and finally was able to get it stable at +130/500 using the slave bios.
> 
> I ran Heaven without G-sync and it ran perfectly with a max temp of 57c and second run with G-sync off but I got some weird waves on my screen. What would cause the waves with G-sync off?


How do you tell which BIOS is which? Is the slave the one with the switch closer to the power connectors, or away from them?


----------



## aylan1196

Bit the bullet: sold one of my 1080s and bought a Titan X Pascal :/ Can't resist that beast. I'll get it soon and post impressions...
Now to sell the other one, and no more SLI.
Hail to the new king.


----------



## grimboso

How much difference is there between the regular Grizzly Kryonaut, the Grizzly liquid metal, and, for instance, the paste included with the EK blocks? I also have a tube of Noctua paste; I know it's one of the better ones for CPUs (at least in the last reviews I read). How is that for watercooled GPUs?


----------



## Kold

Anyone know why my PNY 1080 is doing this? It's keeping me from getting into my BIOS because my backlit mechanical keyboard isn't initializing fast enough now.
Quote:


> Originally Posted by *toncij*
> 
> How do you tell which BIOS is which? Is the slave the one with the switch closer to the power connectors, or away from them?


Yes it's the one closest to the power connectors.

Quote:


> Originally Posted by *aylan1196*
> 
> Bit the bullet: sold one of my 1080s and bought a Titan X Pascal :/ Can't resist that beast. I'll get it soon and post impressions...
> Now to sell the other one, and no more SLI.
> Hail to the new king.


Oh, I ALMOST did the same thing. Newegg issued my store credit for my 1080 and I kept thinking about trying to wait and get a Titan XP. Ultimately, I purchased a PNY FE 1080 knowing full well I'll sell it the moment the 1080Ti is announced.

Quote:


> Originally Posted by *grimboso*
> 
> How much difference is there between the regular Grizzly Kryonaut, the Grizzly liquid metal, and, for instance, the paste included with the EK blocks? I also have a tube of Noctua paste; I know it's one of the better ones for CPUs (at least in the last reviews I read). How is that for watercooled GPUs?


I wouldn't recommend using any liquid metal based thermal paste on a GPU. Far too risky. As far as the regular Grizzly goes, I know it dethroned my favorite (Gelid GC-Extreme) by 2-3C. I wouldn't go out of my way to purchase it over the EK paste, though. I used what EK supplied with my OG Titan X and it never went above 45C.


----------



## grimboso

Quote:


> Originally Posted by *Kold*
> 
> I wouldn't recommend using any liquid metal based thermal paste on a GPU. Far too risky. As far as the regular Grizzly goes, I know it dethroned my favorite (Gelid GC-Extreme) by 2-3C. I wouldn't go out of my way to purchase it over the EK paste, though. I used what EK supplied with my OG Titan X and it never went above 45C.


I used CLU on the die of my 6700K when I delidded it, but I guess there is a lot more to break on the GPU.
Might as well stick with the EK paste. Seems the 1080 is cool enough not to warrant the hassle for that 1-2 degrees.
My FTW sits nice and cool at 58 degrees at my 2126/11000 OC. Putting the voltage slider back to 0 gave me a more stable RAM OC. Can't wait for the FTW waterblock to come!


----------



## toncij

Quote:


> Originally Posted by *grimboso*
> 
> I used CLU on the die of my 6700K when I delidded it, but I guess there is a lot more to break on the GPU.
> Might as well stick with the EK paste. Seems the 1080 is cool enough not to warrant the hassle for that 1-2 degrees.
> My FTW sits nice and cool at 58 degrees at my 2126/11000 OC. Putting the voltage slider back to 0 gave me a more stable RAM OC. Can't wait for the FTW waterblock to come!


With no voltage added you get stable 2126?!
130%/92°C? Fans RPM? Ambient temp?


----------



## grimboso

Quote:


> Originally Posted by *toncij*
> 
> With no voltage added you get stable 2126?!
> 130%/92°C? Fans RPM? Ambient temp?


130/92 indeed. Fan at 100%, ambient 22 degrees, open window and case, which I guess helps a lot. You can see one of my earlier posts here; the card does not throttle in Heaven or in light games. The Division / Witcher 3 / Fallout 4 with the 4K res mod, however, throttle the card down to 2114. Guess this is a good win in the lottery?

My OC is more stable at 0% voltage than at 100%. Going to test whether I can push it higher with something in between.


----------



## max883

I am impressed with how quiet this GPU is! The fan speed never goes over 40%. It gets cooled by the Noctua fans that pull cool air from outside the case.


----------



## GanGstaOne

Can anyone confirm that non-EVGA cards can't use the K-Boost function of EVGA Precision X OC?


----------



## juniordnz

Quote:


> Originally Posted by *GanGstaOne*
> 
> Can anyone confirm that non-EVGA cards can't use the K-Boost function of EVGA Precision X OC?


K-Boost works on Pascal? Only on EVGA cards? That's news to me...


----------



## Xenozx

Quote:


> Originally Posted by *grimboso*
> 
> 130/92 indeed. Fan at 100%, ambient 22 degrees, open window and case, which I guess helps a lot. You can see one of my earlier posts here; the card does not throttle in Heaven or in light games. The Division / Witcher 3 / Fallout 4 with the 4K res mod, however, throttle the card down to 2114. Guess this is a good win in the lottery?
> 
> My OC is more stable at 0% voltage than at 100%. Going to test whether I can push it higher with something in between.


Same deal. I have it set to +80 core on top of the already-overclocked FTW speeds, and I see 21xx when starting benchmarks and such. It will eventually drop to something like 2088, but that's 100% stable. I use 0 extra voltage and 130/92 limits. I actually didn't know about the switch until yesterday, so I was at 120/92, but same overclock; I can't overclock any higher stably going from 120 to 130.

Adding extra voltage usually just adds instability. Example: I can pass 3DMark at +110 core with 0 extra voltage, but if I add max voltage, it will crash every time.

I can benchmark at +110 to the core and stay over 21xx the whole time, but it can't run games long-term like that. The issue I usually have is the game will run, but if I alt-tab out, I'll get an "NVIDIA display adapter failed to respond" and the game will exit.


----------



## juniordnz

You should go by your max stable clock, not the value on the core clock bar. Of course +110 is more stable at +0 voltage than at +100: +100 will make the curve reach much higher boost clocks than +0, even if the slider bar is at the same +110 value.


----------



## Xenozx

Quote:


> Originally Posted by *juniordnz*
> 
> You should get your max stable clock, not the value on the core clock bar. Of course +110 is more stable on +0 voltage than it is on +100. +100 will make the curve get to boost clocks much higher than +0, even if the slider bar is at the same value of +110.


Wait, have I been overclocking wrong?

So if I set +100 core voltage and, say, +80 core MHz, I might have a higher overclock than with 0 core voltage and +110 core clock?


----------



## juniordnz

Quote:


> Originally Posted by *Xenozx*
> 
> Wait, have I been overclocking wrong?
> 
> So if I set +100 core voltage and, say, +80 core MHz, I might have a higher overclock than with 0 core voltage and +110 core clock?


Exactly. More voltage tells the card to pick a clock from a higher point on the MHz/V curve. Press CTRL+F and you'll see what I'm talking about.


----------



## fireyfire

So... Apparently I had my cooler mounted incorrectly, and it did affect my maximum stable OC (Pascal is *EXTREMELY* affected by temps...). I wanted to try the stock cooler on my card to check the temps and the OC results... Let's just say it wasn't worth my time (or was it?). When I put my AIO water cooler back on I gobbed on the thermal paste in a rush job, and for some reason (*cough* convex coldplate *cough*) I got better cooling results this time and am able to OC further. 2202MHz is stable; 2228 can make it halfway through the Heaven benchmark.
So, if you have an AIO water cooler (excluding EVGA Hybrid coolers), make sure to put on a bit of extra thermal paste.


----------



## GreedyMuffin

If I want to TDP mod the card, will it be possible to remove it if I need to RMA?


----------



## Snabeltorsk

Quote:


> Originally Posted by *GreedyMuffin*
> 
> If I want to TDP mod the card, will it be possible to remove it if I need to RMA?


That's no problem if you use liquid metal like CLP/CLU.


----------



## tin0

Received my MSI GeForce GTX 1080 ARMOR 8G OC today and couldn't be happier







It's fully stable at 2113MHz no matter what I throw at it. It even keeps the 2113MHz core clock rock steady with a custom fan curve. Temperatures are also great (especially compared to the FE). Went from a really poor, hot-running 1967MHz FE (not even fully stable or maintaining those clocks) to a cool and quiet 2113MHz with the MSI card. Even at max fan speed the sound isn't that bad, and it keeps the card at 60 degrees Celsius under full load.

Haven't touched voltage yet; I'll wait with that until I mod this baby with a Corsair H80 + flash the GAMING Z BIOS, since it's the same PCB.


----------



## GreedyMuffin

Seems like 2200 is no problem with voltage, just a no-no with TDP.

The only way I managed to increase the voltage was through the custom curve thingy; otherwise it maxes at 1.0500V.

I don't know why NVIDIA even added some voltage headroom when the TDP is an issue even at stock voltage.

Can't wait for BIOSes to come out.

I'm going to Sweden on Sunday, so I won't be tinkering with this POS until next week.


----------



## toncij

OK, what is that curve thing? The one in Precision X is so tiny that I can't see anything on it...


----------



## juniordnz

Quote:


> Originally Posted by *tin0*
> 
> Received my MSI GeForce GTX 1080 ARMOR 8G OC today and couldn't be happier
> 
> 
> 
> 
> 
> 
> 
> It's fully stable at 2113MHz no matter what I throw at it. It even keeps the 2113MHz core clock rock steady with a custom fan curve. Temperatures are also great (especially compared to the FE). Went from a really poor, hot-running 1967MHz FE (not even fully stable or maintaining those clocks) to a cool and quiet 2113MHz with the MSI card. Even at max fan speed the sound isn't that bad, and it keeps the card at 60 degrees Celsius under full load.
> 
> Haven't touched voltage yet; I'll wait with that until I mod this baby with a Corsair H80 + flash the GAMING Z BIOS, since it's the same PCB.


You got a very nice piece of silicon there... mine won't even get close to that.


----------



## GreedyMuffin

Your stock voltage is a bit higher than mine. Mine only goes up to 1.0500V at 2126MHz.

Getting 110 FPS in Valley. Can somebody else please post their Valley score, so I can calm my nerves and see whether it's my card or a normal score? Thanks a lot!

EDIT: @Juniordnz, love your sig. *This is my rig. There are many like it, but this one is mine. It is my life. I must master it as I must master my life. Without me my rig is useless. Without my rig, I am useless. My rig is human, even as I am human, because it is my life. Thus, I will learn it as a brother. I will learn its weaknesses, its strengths, its parts, its accessories, its cpu and its gpu. I will keep my rig clean and ready, even as I am clean and ready. We will become part of each others*


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> EDIT: @Juniordnz, love your sig. *This is my rig. There are many like it, but this one is mine. It is my life. I must master it as I must master my life. Without me my rig is useless. Without my rig, I am useless. My rig is human, even as I am human, because it is my life. Thus, I will learn it as a brother. I will learn its weaknesses, its strengths, its parts, its accessories, its cpu and its gpu. I will keep my rig clean and ready, even as I am clean and ready. We will become part of each others*


oorah!


----------



## OCPG

Love the sig bro!









I've got a 1080 FE and I'm wondering if it's advisable to have the fan off below a certain temp (50C or so). What would be the easiest/best way to do that?


----------



## Whitechap3l

So, a short update from my side:
I put my Asus Strix (non-OC) under water, flashed the FTW BIOS on it, and my results:
~24,400 GPU score in Fire Strike
2177MHz GPU clock


----------



## tin0

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Your stock voltage is a bit higher than mine. Mine only goes up to 1.0500V at 2126MHz.
> 
> Getting 110 FPS in Valley. Can somebody else please post their Valley score, so I can calm my nerves and see whether it's my card or a normal score? Thanks a lot!
> 
> EDIT: @Juniordnz, love your sig. *This is my rig. There are many like it, but this one is mine. It is my life. I must master it as I must master my life. Without me my rig is useless. Without my rig, I am useless. My rig is human, even as I am human, because it is my life. Thus, I will learn it as a brother. I will learn its weaknesses, its strengths, its parts, its accessories, its cpu and its gpu. I will keep my rig clean and ready, even as I am clean and ready. We will become part of each others*


Mine also runs 1.0500V during the bench; the 1.0620V is just a spike. Yours also seems like a great clocker.








Will now flash the GAMING Z BIOS to see if I can push it any further.


----------



## fjordiales

Quote:


> Originally Posted by *tin0*
> 
> Mine also runs 1.0500V during the bench; the 1.0620V is just a spike. Yours also seems like a great clocker.
> 
> 
> 
> 
> 
> 
> 
> 
> Will now flash the GAMING Z BIOS to see if I can push it any further.


Can you please point me to where I can find the GAMING Z BIOS? TPU only has the GAMING X. I have the regular GAMING and will try this out.


----------



## Phinix

Advice needed: If you had to pick between the "MSI 1080 Sea Hawk", " MSI 1080 Gaming X" or the "EVGA FTW Gaming" which would you choose?


----------



## fat4l

Has anyone tried this?

http://forum.hwbot.org/showpost.php?p=455871&postcount=20
_
"New bios which has improved performance at the same clock speed (+400 points Fire Strike GPU score). Note that it does not have the overclocked frequency by default."_
http://www.mediafire.com/download/22ifgc9yuk72eyg/strix1080xoc_t4.zip

I see people are getting 2.3GHz at 1.2V with the Strix OC BIOS... curious if the real performance is really there...


----------



## Derpinheimer

Wow, 2300, that's nice! I'll give it a shot!

EDIT: Never mind, it's for Strix only.


----------



## USlatin

WOW, that sounds interesting


----------



## USlatin

Can you link to the thread?


----------



## ACleverName

Quote:


> Originally Posted by *Phinix*
> 
> Advice needed: If you had to pick between the "MSI 1080 Sea Hawk", " MSI 1080 Gaming X" or the "EVGA FTW Gaming" which would you choose?


I got a Sea Hawk; stock core clock is 1987 and it stays nice and cool.


----------



## zlpw0ker

Quote:


> Originally Posted by *Phinix*
> 
> Advice needed: If you had to pick between the "MSI 1080 Sea Hawk", " MSI 1080 Gaming X" or the "EVGA FTW Gaming" which would you choose?


Sea Hawk for sure, and not just because I already have one here; I like watercooled AIO GPUs. Even though they take up a little more space, I like my PC as cool as possible.


----------



## Phinix

Quote:


> Originally Posted by *ACleverName*
> 
> I got a Sea Hawk; stock core clock is 1987 and it stays nice and cool.


Quote:


> Originally Posted by *zlpw0ker*
> 
> Sea Hawk for sure, and not just because I already have one here; I like watercooled AIO GPUs. Even though they take up a little more space, I like my PC as cool as possible.


Thanks for your input. I was originally waiting for the Gigabyte 1080 Waterforce, but it's just constant delays, and it's overpriced in my country, so I'm switching to something else.


----------



## USlatin

Oh, dude, I thought that was without mods and there had been a breakthrough of some sort...


----------



## zlpw0ker

Quote:


> Originally Posted by *Phinix*
> 
> Thanks for your input. I was originally waiting for the Gigabyte 1080 Waterforce, but it's just constant delays, and it's overpriced in my country, so I'm switching to something else.


I was originally waiting for the 1080 Hybrid myself, but due to its length covering the SATA ports, the required power adapter from EVGA, and the dozens of delays, I no longer consider the 1080 Hybrid.
I chose the Sea Hawk because it fits the black and white theme in my PC.
I never even considered the Waterforce due to its ugly color scheme.
I have the 1080 Sea Hawk here, but here in Norway it costs 1030 dollars; not cheap here either.


----------



## USlatin

The MSI looks slick! The only downside is possible micro down-clocks due to power limitations, but they should be practically unnoticeable. If the Hybrid doesn't offer something special (slightly higher OC, price, looks) I will go with the MSI.


----------



## toncij

I have two EVGA FTWs... Silent and fast but not stable above 2114MHz.


----------



## Snabeltorsk

I have a Sea Hawk EK. It does 2126 core and 11130 memory, totally game stable, never crashed. Higher than that I need more voltage and a higher power limit.


----------



## GanGstaOne

Guys, can you please upload the EVGA 1080 slave BIOS? Thanks.


----------



## Derpinheimer

Quote:


> Originally Posted by *toncij*
> 
> I have two EVGA FTWs... Silent and fast but not stable above 2214MHz.


How high can you get without errors in OCCT?


----------



## toncij

Quote:


> Originally Posted by *Derpinheimer*
> 
> How high can you get without errors in OCCT?


2114... that's the top with stock fans and no hw mods... 100% voltage, 130%/92°C and +91/+500.


----------



## Kold

Are there any BIOSes for the Founders Edition cards? I need to flash my PNY 1080 to get this annoying splash screen at boot removed. I can't get into my motherboard's BIOS because of it.


----------



## tin0

Quote:


> Originally Posted by *fjordiales*
> 
> Can you please point me to where I can find the GAMING Z BIOS? TPU only has the GAMING X. I have the regular GAMING and will try this out.


Sorry to disappoint you: after flashing and not getting the right clocks, it seems I copied the GAMING X BIOS from work to my pen drive instead of the GAMING Z one.








When in the office on Monday I will upload the GAMING Z one to this forum.


----------



## Derpinheimer

Quote:


> Originally Posted by *toncij*
> 
> 2114... that's the top with stock fans and no hw mods... 100% voltage, 130%/92°C and +91/+500.


Oh, well, before the edit it was 2214, so I was really amazed you had two cards capable of doing that.


----------



## toncij

Quote:


> Originally Posted by *Derpinheimer*
> 
> Oh, well before edit it was 2214 so I was really amazed you had two capable of doing that.


Unfortunately it was a typo


----------



## TWiST2k

Quote:


> Originally Posted by *GanGstaOne*
> 
> Guys, can you please upload the EVGA 1080 slave BIOS? Thanks.


Here is the Secondary BIOS off of my 1080 FTW. Rename to .rom.

1080FTW_BIOS2_130_PWR_LIMIT.csv 251k .csv file


----------



## fat4l

Quote:


> Originally Posted by *fat4l*
> 
> Ok nice. This is what I was talking about!
> 
> But the question is: is this BIOS really bringing more performance, or just higher clocks and volts?
> 
> Some people have already said that it's just "visual" and you are not really getting 2300MHz performance but rather 2100MHz (just an example).
> 
> Please test and report back. Compare the stock max-clock BIOS vs this one.
> 
> 
> 
> 
> 
> 
> 
> 
> More info: http://forum.hwbot.org/showthread.php?t=159025
> 
> And "new" bios here: http://forum.hwbot.org/showpost.php?p=455871&postcount=20
> Has anyone tried on FE?


Guys, check this out please and let's focus on it.

1. That's the new XOC BIOS.
2. Has anyone tried it?
3. Is it safe to flash an FE with the XOC BIOS?
4. Does it bring performance or just high clocks?

Thanks


----------



## GreedyMuffin

Will consider TDP modding my card. Been running Valley for the last 4 hours at 2139MHz on stock voltage. My mem is kinda decent, 545+; more than that and it gets bad scores due to the ECC thingy.

I am very happy with 2139 on stock voltage. Glad I purchased it; I was a little worried since it was a demo card (a card sent back for an unknown reason).


----------



## Derpinheimer

Quote:


> Originally Posted by *fat4l*
> 
> Guys, check this out please and let's focus on it.
> 
> 1. That's the new XOC BIOS.
> 2. Has anyone tried it?
> 3. Is it safe to flash an FE with the XOC BIOS?
> 4. Does it bring performance or just high clocks?
> 
> Thanks


I'd try flashing my FTW if I was confident it wouldn't kill it... my guess is it would just not boot, but...


----------



## TK421

So I shorted one of the resistors using liquid metal as per der8auer's instructions, but the difference is my card has 3 resistors at the power connector.

I picked the one on the very left, applied liquid metal just enough to make it shiny, and popped the card back in the system.

How do I verify that the power limit has been removed, or allowed to go higher than the stock limit?


----------



## TWiST2k

Quote:


> Originally Posted by *Derpinheimer*
> 
> I'd try flashing my FTW if I was confident it wouldnt kill it... my guess is it would just not boot, but...


I just flashed my FTW with the Strix OC BIOS and it did give higher power, but my clocks ended up just a tiny bit higher, really nothing groundbreaking, but it did work fine.

If you can rip your master BIOS off of the FTW for me that would be awesome, I forgot to backup the master one before flashing it. I wanted to keep my slave BIOS clean so I could just switch back if things did not pan out well.


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> I just flashed my FTW with the Strix OC BIOS and it did give higher power, but my clocks ended up just a tiny bit higher, really nothing groundbreaking, but it did work fine.
> 
> If you can rip your master BIOS off of the FTW for me that would be awesome, I forgot to backup the master one before flashing it. I wanted to keep my slave BIOS clean so I could just switch back if things did not pan out well.


Can you choose which BIOS the new ROM is flashed over, or does it always flash over the one that's active?


----------



## Derpinheimer

Quote:


> Originally Posted by *TWiST2k*
> 
> I just flashed my FTW with the Strix OC BIOS and it did give higher power, but my clocks ended up just a tiny bit higher, really nothing groundbreaking, but it did work fine.
> 
> If you can rip your master BIOS off of the FTW for me that would be awesome, I forgot to backup the master one before flashing it. I wanted to keep my slave BIOS clean so I could just switch back if things did not pan out well.


Hm, I read some horror story of a guy flashing a 5870 to a 5850 and permanently killing it, so I'm just a bit concerned.

20percentBIOSftw.zip 148k .zip file


The 20% one?


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> Can you choose which BIOS the new ROM is flashed over, or does it always flash over the one that's active?


I just flip the switch to whichever BIOS I want to flash beforehand. I will probably continue to flash the master BIOS and keep my slave BIOS clean, so I can switch back easily to something that works correctly when it's game time haha.

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hm, I read some horror story of a guy flashing a 5870 to a 5850 and permanently killing it so I'm just a bit concerned
> 
> 20percentBIOSftw.zip 148k .zip file
> 
> 
> The 20% one?


Yah, the master BIOS with the 120 limit, as opposed to the slave one with the 130 limit. If this is indeed the master BIOS, then thank you very much!


----------



## Derpinheimer

Quote:


> Originally Posted by *TWiST2k*
> 
> I just flip the switch to whichever BIOS I want to flash before hand. I will probably continue to flash the master BIOS to keep my slave BIOS clean so I can switch back easily to something that works correctly when its game time haha.
> Yah the master BIOS with the 120 limit as opposed to the slave one with the 130 limit. If this is indeed the master BIOS then thank you very much!


It is indeed, no problem.

Were you able to increase the voltage to 1.2V with the Strix BIOS? I assume you're flashing back to the default one, huh?


----------



## fat4l

Quote:


> Originally Posted by *TWiST2k*
> 
> I just flashed my FTW with the Strix OC BIOS and it did give higher power, but my clocks ended up just a tiny bit higher, really nothing groundbreaking, but it did work fine.
> 
> If you can rip your master BIOS off of the FTW for me that would be awesome, I forgot to backup the master one before flashing it. I wanted to keep my slave BIOS clean so I could just switch back if things did not pan out well.


now the question is if your scores increased or not


----------



## fat4l

Quote:


> Originally Posted by *TK421*
> 
> So I shorted one of the resistors using liquid metal as per der8auer's instructions, but the difference is my card has 3 resistors at the power connector.
> 
> I picked the one on the very left, applied liquid metal just enough to make it shiny, and popped the card back in the system.
> 
> How do I verify that the power limit has been removed, or allowed to go higher than the stock limit?


Well, just check whether your card is still reaching its TDP limit.
Also, you need to put on a little bit more CLU. It should look like a bubble, not like a coating.
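One way to make that check concrete: the shunt mod works by making the card under-report its draw, so under the same heavy load the reported power should stop pinning at the board limit. Here's a small sketch; the `nvidia-smi` query in the comment uses real flags, but the captured wattage samples below are invented so the snippet runs without a GPU:

```python
# Sketch: checking whether the power-limit (shunt) mod took effect.
# The mod makes the card under-report its draw, so under a fixed heavy
# load the reported draw should no longer pin at the board limit.
# Readings like the samples below can be captured with:
#   nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader -l 1
# All wattage figures here are invented for illustration.

before_mod = "179.8 W, 180.00 W\n180.0 W, 180.00 W\n179.9 W, 180.00 W"
after_mod = "148.2 W, 180.00 W\n151.7 W, 180.00 W\n149.9 W, 180.00 W"

def peak_tdp_pct(csv_text):
    """Highest reported draw as a percentage of the board power limit."""
    pcts = []
    for line in csv_text.strip().splitlines():
        draw_s, limit_s = line.split(",")
        draw = float(draw_s.split()[0])
        limit = float(limit_s.split()[0])
        pcts.append(100.0 * draw / limit)
    return max(pcts)

print(f"before: {peak_tdp_pct(before_mod):.0f}% of limit")  # pinned at the cap
print(f"after:  {peak_tdp_pct(after_mod):.0f}% of limit")   # headroom appeared
```

If the post-mod readings still sit at 100% of the limit under load, the short probably isn't making good contact; GPU-Z's PerfCap Reason field (which should stop showing "Pwr") is another way to cross-check.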


----------



## TWiST2k

Quote:


> Originally Posted by *fat4l*
> 
> now the question is if your scores increased or not


Quote:


> Originally Posted by *Derpinheimer*
> 
> It is indeed, no problem
> 
> Were you able to increase voltage to 1.2v with the strix bios? I assume you are flashing back to the default one huh?


It would increase on its own as the temp rose, and no, I am not flashing back. I only use the slave BIOS anyway; I just wanted a backup in case I need to return the master BIOS to stock.

Quote:


> Originally Posted by *fat4l*
> 
> now the question is if your scores increased or not


They did actually, I got about 300 higher in the timespy benchmark.


----------



## fat4l

Quote:


> Originally Posted by *TWiST2k*
> 
> They did actually, I got about 300 higher in the timespy benchmark.


Nice. +rep
Did you get 1.2V? If so, did you have to touch the voltage slider?

Now... someone flash an FE with this Strix OC BIOS (the newest one) and report, hah. Compare scores before and after, please.


----------



## TWiST2k

Quote:


> Originally Posted by *fat4l*
> 
> Nice. +rep
> Did you get 1.2v? If so did you have to touch voltage slider?
> 
> Now....someone flash FE with this strix OC bios(the newest one) and report hah. Compare scores before and after pls


Thanks! I am using that BIOS from the hwbot forum where you were posting as well. The power limit and temp limit sliders are disabled for me in Afterburner when using the Strix OC BIOS; I even tried installing the Asus GPU OC software (whatever it's called) and it made no difference. With my FTW BIOS it's +70 on the core, but with the Strix OC BIOS it's +190.


----------



## pez

Quote:


> Originally Posted by *juniordnz*
> 
> I can't speak for Gaming series, but Armor get's pretty hot! Only way to sustain 60C gaming is with [email protected]% and a huge fan blowing cold air like a turbine with the case open. (posted some pics earlier in this thread).
> 
> Maybe it's the hype price over twinfrozr5 or whatever is called.


Yeah... I was immediately turned off the Armor after seeing its price, and then seeing no backplate included and a rather stark-looking PCB (IMO). I mean, the FE might have a plastic, two-piece backplate, but it's something.
Quote:


> Originally Posted by *juniordnz*
> 
> That's nice! Post some pics of that baby when it arrives
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What %increase in performance are you expecting? Haven't seen anything about titan lately...


Just because you asked nicely (I know you weren't talking to me here, but I did this to troll my boss earlier today; excuse my failure to notice the orientation was wrong):


----------



## kx11

My PC is lying on its side just to save those heavy GPUs from sagging.



After I did that I can hear the CPU cooler (EK Predator 240) pushing water through the tubes harder. Should I be fine like that, or should I stand the PC back up?

Oh, and ignore the terrible cable management.


----------



## fat4l

Quote:


> Originally Posted by *TWiST2k*
> 
> Thanks! I am using that BIOS from the hwbot forum where you were posting as well. The power limit and temp limit sliders are disabled for me with afterburner when using the strix oc bios, I even tried to install the Asus GPU OC whatever its called software and it did not make a difference. With my FTW bios its +70 on core but with the Strix OC bios its +190.


What about voltage, bro?
Also, have you tried using the curve in Afterburner? It should affect voltage too. Ctrl+F in AB.

Also, any guide you used for flashing? I've only flashed AMD cards before. I have an FE now. Thx


----------



## Derpinheimer

STRIX bios increased FPS in bf4 test range menu from 157 to 159 (both boosting to 2025) as well as allowed 1.164v using precision, but I was unable to get the card stable at any better OC than with 1.093v.

The lack of power usage monitoring is also concerning as there seems to be no limit.

Edit: nvm it is allowing better OC. 2254 seems ok in a quick test but i can't do anything more thorough tonight


----------



## andressergio

Anyone here with the ZOTAC GTX 1080AMP! Extreme ?

I would like to know how to increase volts; the Firestorm software is buggy.

here's my setup









[email protected] 4.6GHz 1.31VCore + 32GB DDR4 3200C16
ASRock X99 OCF 3.1
2x 512GB OCZ RD400 NVMe OS and Benchmark Software
7x 1TB SSD Data and Games
2x ZOTAC GTX 1080AMP! Extreme 8GB
Powered By Silverstone ST1500W GS
Cooled by Alphacool, EK Waterblocks

can you add me to the list ?
GPUZ Valid: https://www.techpowerup.com/gpuz/details/frhhg

Kindly, Sergio


----------



## kx11

nvm........


----------



## fat4l

Quote:


> Originally Posted by *Derpinheimer*
> 
> STRIX bios increased FPS in bf4 test range menu from 157 to 159 (both boosting to 2025) as well as allowed 1.164v using precision, but I was unable to get the card stable at any better OC than with 1.093v.
> 
> The lack of power usage monitoring is also concerning as there seems to be no limit.
> 
> Edit: nvm it is allowing better OC. 2254 seems ok in a quick test but i can't do anything more thorough tonight


what card is that?


----------



## TWiST2k

Quote:


> Originally Posted by *fat4l*
> 
> what about voltage bro?
> Also. Have you tried using curve in afterburner? It should affect voltage too. Ctrl F in ab
> 
> Also. Any guide you used for flashing? I only flashed amd cards before. I have FE now. Thx


I am attaching the nvflash I have been using and it's working great. Extract it, place the nvflash folder in C:\ and then launch an admin command prompt.

Code:


cd c:\nvflash

nvflash --protectoff

nvflash -6 whatever.rom

Press y when prompted and reboot to apply the changes. You can also hit Tab to complete filenames in the prompt if you were not aware of that.

Thanks for the tip on the Ctrl+F, I did not know that was there. I am unable to control the power limit or temp limit with the Strix BIOS on my FTW in Afterburner, but as the temps go up the voltage goes up as well, which is interesting, because with my FTW BIOS it is the opposite: as my temps keep going up, my voltage slowly drops along with my clock speeds. I looked for anything in the Afterburner config files that might enable K-Boost or the sliders even with the Strix BIOS, but did not make any progress on that.

I am going to reboot to my strix bios again in a min and do some more testing, was trying to see how hard I could push things in the slave FTW bios again just for a baseline.

nvflash.zip 1167k .zip file


----------



## fat4l

Quote:


> Originally Posted by *TWiST2k*
> 
> I am attaching the nvflash I have been using and its working great. I would extract it and place the nvflash folder in your C:\ and then launch an admin command prompt.
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> cd c:\nvflash
> 
> nvflash --protectoff
> 
> nvflash -6 whatever.rom
> 
> press y when prompted and reboot to apply the changes, you can also hit tab to complete filenames in dos if you are not aware of that.
> 
> Thanks for the tip on the ctrl-f I did not know that was there. I am unable to control the power limit or temp limit with the strix bios on my FTW in afterburner but as the temps go up the voltage goes up as well, which is interesting because with my FTW it is the opposite when my temps keep going up my voltage slowly drops along with my clockspeeds. I was seeing if there was anything in the afterburner config files to enable the k boost or sliders even with the strix bios, but did not make any progress on that.
> 
> I am going to reboot to my strix bios again in a min and do some more testing, was trying to see how hard I could push things in the slave FTW bios again just for a baseline.
> 
> nvflash.zip 1167k .zip file


Nice! I hope this bios will work on FE card


----------



## ucode

Quote:


> Originally Posted by *Derpinheimer*
> 
> STRIX bios increased FPS in bf4 test range menu from 157 to 159 (both boosting to 2025) as well as allowed 1.164v using precision, but I was unable to get the card stable at any better OC than with 1.093v.
> 
> The lack of power usage monitoring is also concerning as there seems to be no limit.
> 
> Edit: nvm it is allowing better OC. 2254 seems ok in a quick test but i can't do anything more thorough tonight


Although the power setting and percentage reading for both the Strix XOC and t4 BIOSes are neutered, you can use HWiNFO to read GPU power draw in watts.


----------



## octiny

I can confirm that the new XOC BIOS will brick FE cards. Just flashed both; probably should've done just one to test it. Now I've got to buy a cheap card so I can flash back.









If anyone has an Asus FE bios I'd much appreciate it, thanks in advance!

Edit: NVM bios found on TPU

Edit 2: Switched to another port on GPU and bios works now.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *octiny*
> 
> I can confirm that new XOC bios will brick FE cards. Just flashed both, probably should've just done 1 to test it. Now got to buy a cheapy card so I can flash back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If anyone has an Asus FE bios I'd much appreciate it, thanks in advance!
> 
> Edit: NVM bios found on TPU


Did you change the DP cable into another port on the gpu?


----------



## octiny

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Did you change the DP cable into another port on the gpu?


Thanks for the heads up! Works now.

+rep


----------



## MrTOOSHORT

No problem.









I had that same problem and went the long way to flash back until someone in this thread told me about the DP port thing.


----------



## fat4l

I will flash it when I wake up. Which DP port will be unusable?
Btw, what are your 2D clocks? Mine stay at 1190MHz or so... maybe because of the TDP hard mod?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *fat4l*
> 
> I will flash it when i wake up. Which dp port will be unusable ?
> Btw whats your 2d clocks? Mine stay at 1190mhz or so.. maybe cuz of tdp hard mod?


Use the circled ports in this pic:



See, the very top port on the Strix is HDMI, while on the FE it's DP. That's why you get the black screen when flashing and then using this port.


----------



## yenclas

Quote:


> Originally Posted by *fat4l*
> 
> Has anyone tried this ?
> 
> http://forum.hwbot.org/showpost.php?p=455871&postcount=20
> _
> "New bios which has improved performance at the same clock speed (+400 points Fire Strike GPU score). Note that it does not have the overclocked frequency by default."_
> http://www.mediafire.com/download/22ifgc9yuk72eyg/strix1080xoc_t4.zip
> 
> I see ppl are getting 2.3GHz and 1.2V with strix OC bios.....curious if the real performance is rly there...


It works!!! Flashed it to my Palit 1080 GameRock and no TDP limit anymore!

And the performance issue from the previous version is fixed.


----------



## Pierre118

Good work!

Would this firmware work on a MSI GTX 1080 Gaming X?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Pierre118*
> 
> Good work!
> 
> Would this firmware work on a MSI GTX 1080 Gaming X?


The previous XOC BIOS worked on my Sea Hawk EK X 1080, same PCB as your card. I'm fairly certain there won't be any issues. Make sure to use the correct DP port after the flash, as in the pic above.


----------



## Whitechap3l

Quote:


> Originally Posted by *yenclas*
> 
> it works !!! Flashed to my Palit 1080 Gamerock and no tdp limit anymore !
> 
> And performance issue from previous version fixed


which OC tool do you use ?
i can't get the TDP regulator on afterburner nor with gpu tweak


----------



## yenclas

The TDP limit slider doesn't work because there is no limit anymore.









Sent from my Mi-4c using Tapatalk


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Whitechap3l*
> 
> which OC tool do you use ?
> i can't get the TDP regulator on afterburner nor with gpu tweak


Quote:


> Originally Posted by *yenclas*
> 
> The TDP limit slider doesn't work because there is no limit anymore.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my Mi-4c using Tapatalk


This. Being locked out in Afterburner and other OC programs is normal.


----------



## ROKUGAN

Quote:


> Originally Posted by *karlahoin*
> 
> Finally got my Zotac 1080 AMP Extreme yesterday. It was a tight fit:
> 
> 
> http://imgur.com/lAHf5
> 
> 
> Seen it at 2185 MHz for some time but it stabilizes at 2088 MHz while gaming (GTA V, Witcher 3 and Fallout 4 mostly). This was at stock.
> 
> I'm not familiar with the OC of Pascal yet, but did a few quick tries just using the sliders on afterburner. It seems to handle +20 MHz GPU and +50 MHz RAM and that was it. Even the +20 didn't seem stable in the long term. Even with that OC, I couldn't notice any difference in FPS.. maybe +1 FPS in W3.
> 
> I'm fairly happy with it though; I never see it below 2 GHz while gaming and it runs pretty cool, as in 57-62C while gaming (in a room with 22C ambient temperature).
> The fans are a bit noisier than I expected though.


My AMP Extreme starts over 2150 (I've seen it reach 2185, like yours) and then settles at 2126 MHz with +60 GPU, temps under 70C without maxing the fans (32C room temp).
I keep the memory at +100 @ 11000 MHz, but it could go higher.

A great deal of silicon lottery going on; in his review Jay could not get his Extreme past 2050 MHz:






And this guy with SLI gets 2152 MHz fully stable with +99 GPU:






In the end, comparing models is pretty much pointless, as one unit will differ greatly from another. But I have to say that I've never seen any reports of a Zotac AMP Extreme not going above 2000 MHz (while I've seen quite a few for other well-known brands), so to me buying this model gives you at least a certain peace of mind that you will reach a pretty decent clock. Mine boosted to 2077 MHz out of the box.

Don't mix it up with the AMP version reports, btw; I had both models and the AMP runs 20C hotter than the AMP Extreme. It's a completely different cooler.


----------



## GreedyMuffin

Which bios removes TDP limit and allows you to increase voltage?

Is it safe to use on an FE over longer periods of time? (My card is under water.) I know the FE only has a 5+1 power phase design, if I remember correctly.

Perhaps I could push 2250-2300? Since I can do 2139 stock? :hmm:










I am folding a lot with my card, so it will probably live a rough life compared to many others.


----------



## Derpinheimer

Quote:


> Originally Posted by *fat4l*
> 
> what card is that?


FTW
Quote:


> Originally Posted by *Whitechap3l*
> 
> which OC tool do you use ?
> i can't get the TDP regulator on afterburner nor with gpu tweak


Any idea why Afterburner has the old green theme and lacks all those options for me, even after updating?


----------



## pantsoftime

Quote:


> Originally Posted by *Derpinheimer*
> 
> any idea why afterburner has the old green theme and lacks all those options even with update for me?


Check in the settings where you can pick the skin. It's on the rightmost tab labeled "User Interface".

In the picture you quoted he had both Afterburner and Asus's tool running side by side though.


----------



## karlahoin

Quote:


> Originally Posted by *ROKUGAN*
> 
> My AMP Extreme starts over 2150 (I've seen it like yours reaching 2185) then gets stable at 2126 Mhz with +60 GPU, temps under 70C without maxing fans (32C room temp)
> I keep Mem at +100 @ 11k Mhz but it could go higher.
> 
> A great deal of lottery going on, in his review Jay could not get his Extreme past 2050 Mhz:


Wow, those are great numbers. No way I can get even close to +100 on memory; it crashes immediately at +50 regardless of temps. Same for the GPU: even at +40 it isn't stable.
I think I finally found numbers that allow 3h+ of gaming without a crash... +15 RAM / +35 GPU at stock voltage (extra voltage didn't seem to make much of a difference).

Guess I wasn't all that lucky with this one, just like Jay. But it's still nice that it is always at 2 GHz or more.


----------



## Derpinheimer

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Which bios removes TDP limit and allows you to increase voltage?
> 
> Is it safe to use on FE over longer periods of time? (My card is under water) I know the FE only got a 5+1 if i remember correctly.
> 
> Perhaps I could push 2250-2300? Since I can do 2139 stock? :hmm:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am folding alot with my card, so it will probably live a rough life compared to many others.


It's the Strix BIOS posted two pages back. One person says it didn't work with their FE, but it works with my EVGA FTW. I wouldn't really recommend overvolting with a custom BIOS for a heavy workload like that.
Quote:


> Originally Posted by *pantsoftime*
> 
> Check in the settings where you can pick the skin. It's on the rightmost tab labeled "User Interface".
> 
> In the picture you quoted he had both Afterburner and Asus's tool running side by side though.


Doh.. thanks









Am I also missing where the voltage-clockspeed curve is, or does only EVGA Precision have that?


----------



## Joshwaa

When you flashed that bios to your FTW what does it allow the volts to go to now?


----------



## Derpinheimer

Quote:


> Originally Posted by *Joshwaa*
> 
> When you flashed that bios to your FTW what does it allow the volts to go to now?


Yeah, with the Precision manual curve I can get to 1.163 V.

The card doesn't seem to like that voltage much though, and I get more stability at 1.143 or 1.153 V.

Max stable seems lower than I thought before, at about 2202 MHz.


----------



## pantsoftime

Quote:


> Originally Posted by *Derpinheimer*
> 
> Am I also missing where the voltage-clockspeed curve is, or does only EVGA Precision have that?


In Afterburner hit Ctrl+F to access the curve; there isn't a button for it at this point. Protip: when using the curve, hold down the Ctrl key to shift all points on the curve simultaneously.


----------



## Barterlos

I have one question for Founders Edition users: did you try your FE with the review-sample BIOS?

In the TechPowerUp BIOS database there is a review-sample BIOS: 86.04.11.00.0C.

The retail FE BIOS is 86.04.17.00.01, which is newer.

I'm curious whether that review-sample BIOS is better for overclocking.

I know I can check myself, but I'm a little bit afraid; I don't wanna brick my card.


----------



## GreedyMuffin

In Norway we get a 5-year warranty, so I'll flash the BIOS anyway. I'm happy as long as it dies after the 1180 is out.


----------



## nexxusty

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Use the circled ports in this pic:
> 
> 
> 
> See the very top port of the Strix is HDMI and on the FE, it's DP. That's why the black screen when flashing and using this port.


Quote:


> Originally Posted by *Zeek*
> 
> ASUS VG248QE


Always use the bottom port on any GPU. Be it DVI, HDMI or DP.

Anything else is high level n00batry.


----------



## GreedyMuffin

Which tool do I need in order to flash?

Can I still control the voltage with the Strix BIOS?

If so, one profile in MSI AB for folding, another one for gaming/benching.


----------



## fireyfire

Quote:


> Originally Posted by *andressergio*
> 
> Anyone here with the ZOTAC GTX 1080AMP! Extreme ?
> 
> i would like to know how to increase volts, Firestorm Software i buggy
> 
> here's my setup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [email protected] 4.6GHz 1.31VCore + 32GB DDR4 3200C16
> ASRock X99 OCF 3.1
> 2x 512GB OCZ RD400 NVMe OS and Benchmark Software
> 7x 1TB SSD Data and Games
> 2x ZOTAC GTX 1080AMP! Extreme 8GB
> Powered By Silverstone ST1500W GS
> Cooled by Alphacool, EK Waterblocks
> 
> can you add me to the list ?
> GPUZ Valid: https://www.techpowerup.com/gpuz/details/frhhg
> 
> Kindly, Sergio


Don't use the Firestorm software for overclocking; last time I checked you can't use the V/F curve OC with it anyway. Use the MSI Afterburner beta, it can control the voltage on my AMP! card just fine.


----------



## kx11

Quote:


> Originally Posted by *andressergio*
> 
> Anyone here with the ZOTAC GTX 1080AMP! Extreme ?
> 
> i would like to know how to increase volts, Firestorm Software i buggy
> 
> here's my setup
> 
> 
> 
> [email protected] 4.6GHz 1.31VCore + 32GB DDR4 3200C16
> ASRock X99 OCF 3.1
> 2x 512GB OCZ RD400 NVMe OS and Benchmark Software
> 7x 1TB SSD Data and Games
> 2x ZOTAC GTX 1080AMP! Extreme 8GB
> Powered By Silverstone ST1500W GS
> Cooled by Alphacool, EK Waterblocks
> 
> can you add me to the list ?
> GPUZ Valid: https://www.techpowerup.com/gpuz/details/frhhg
> 
> Kindly, Sergio


How are the temps during a benchmark? Those cards seem so close to each other.


----------



## grimboso

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yeah, with precision manual curve I can get to 1.163.
> 
> Card doesnt seem to like that voltage much though and I get more stability with 1.143 or 1.153
> 
> Max stable seems lower than I thought before, at about 2202mhz


How much did you firestrike score increase with xoc bios at that clock?


----------



## grimboso

Quote:


> Originally Posted by *pantsoftime*
> 
> In afterburner hit Ctrl + F to access the curve. There isn't a button at this point for it. Protip: When using the curve, hold down the Ctrl key to shift all points on the curve simultaneously.


Just curious, how is this different than the offset slider? When I Ctrl-move the curve like that, AB just changes the offset.


----------



## nexxusty

Seriously astonishes me how many of you just flash BIOS's still..... lol.


----------



## pantsoftime

Quote:


> Originally Posted by *grimboso*
> Just curious, how is this different than offset slider? When I ctrl move the curve like that AB just changes the offset


Ctrl is similar to the offset slider. Its utility is primarily for getting you to a starting-point curve; then you can tune from there. It saves you the time of dragging a bunch of points along when you may only be interested in pushing a few points further than you'd get with the offset. For me I get quite a decent boost by bumping the points between 1025 and 1093, since that's the area where most things tend to run under load. The other points don't seem to matter as much.


----------



## fat4l

Quote:


> Originally Posted by *Derpinheimer*
> 
> Its the STRIX bios posted 2 pages back. 1 person says it didnt work with their FE, but it works with my EVGA FTW. Wouldnt really recommend overvolting with a custom bios for a heavy workload like that.
> Doh.. thanks


I think it works on the FE; you just have to use another DisplayPort, because one will be disabled.


----------



## Latchback

Hi all, I made a post on "GTX 1070/1080/TITAN X(2ND GEN) BIOS - Who has it?" about my results with the new XOC bios. I will share it here too since I found the bios interesting.

For me, the new XOC bios t4 helps my card out. I have a 1080 Gigabyte G1 Gaming.

Pros:
I get 200+ more points with the XOC BIOS at the same clocks in Time Spy. Fire Strike Compare

No more throttling - In multiple games, I would get major power throttle and the card would lower its clocks significantly (which would show up as a frame drop). More stable frames with bios.

Stable voltage at 1.093V (no fluctuations). Note: to limit the voltage I put the Core Voltage slider at 0%, and using the voltage curve I set 1.093 V as the highest overclock point; then for every point above that voltage I set the same clock as the 1.093 V point. It should look like


http://imgur.com/nVt2d

. I have not thoroughly tested higher voltages, but using this method does let me set the max attainable voltage via the curve (though you may need to fiddle with the voltage slider) to attain higher overclocks.
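The capping trick described above (flatten every curve point above your chosen voltage down to that point's clock) can be sketched in a few lines. This is an illustration only, with made-up example points, not any vendor tool's API:

```python
# Sketch of voltage-capping a V/F curve: every point above the cap voltage
# gets the clock of the cap point, so boost never requests a higher voltage.
# Points are (voltage_mV, clock_MHz) pairs; values here are illustrative.

def cap_curve(points, cap_mv):
    """Clamp the clock of every point above cap_mv to the clock at cap_mv."""
    cap_clock = max(clk for mv, clk in points if mv <= cap_mv)
    return [(mv, clk if mv <= cap_mv else cap_clock) for mv, clk in points]

curve = [(1000, 1987), (1050, 2025), (1093, 2076), (1131, 2100), (1162, 2114)]
capped = cap_curve(curve, 1093)
# Points at or below 1093 mV keep their clocks; the 1131 and 1162 mV points
# are flattened to the 1093 mV clock (2076 MHz).
```

This works because the boost algorithm only steps up to a higher-voltage point when it offers a higher clock; once the curve is flat past the cap, there is no reason to go there.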









Cons:
You guessed it - more heat!

Since the card no longer throttles and stays at a stable 1.093 V, my card heats up a bit more. With the default "auto" fan profile (and about 24-26C ambient), on the original BIOS I get temps in the range of 70-74C max when gaming. On the XOC BIOS the range is more like 78-82C max with the "auto" fan profile. Note: I did start to get artifacts/crashes sometimes when the card got into the 80s, so I just made a custom fan profile to bump up the fan speed a little bit and keep it under 80C (which keeps it stable and quiet for me).

Though I am able to set voltages above 1.093V and get better overclocks, the card started heating up more and more to the point where I had to set the fan to 100% and it was still a bit too warm.

I do plan on eventually modding the card with a hybrid water-cooling AIO. I know EVGA sells one, but I am not sure I can use it since I have a different PCB. Anyone have a source, or know someone who has modded their Gigabyte G1 Gaming with an AIO water cooler? I know some people have slapped the 980 Ti AIO on their 1080 FE, but this would be a bit different as I would have to cool the rest of the card (VRM, etc.) with fans. I am sure I could get better overclocks with better temperatures, or at the very least run cooler/quieter.


----------



## octiny

The new XOC bios works great, clocks nearly stay rock solid with better OCing even at high temps versus stock bios. GPU clock adjustment does not work in Precision but memory does for some reason, both clock adjustments work in Afterburner. So essentially you get 1.112v in 3D mode (GPUZ) without any adjustment, and can only use Afterburner with FE cards. Best bios by far, temps are killer though for my top card since it's slotted side by side. I didn't try upping the voltage in Afterburner for obvious reasons, and Precision is kind of useless due to the GPU overclock not kicking in (only memory works). Hope that helps!


----------



## andressergio

Quote:


> Originally Posted by *kx11*
> 
> how are the temps during a benchmark?? those cards seems so close to each other


http://hwbot.org/submission/3275729_andressergio_3dmark___fire_strike_2x_geforce_gtx_1080_30632_marks

The cards barely reach 58C on air on my bench table. I also tested the ZOTAC GTX 980 Ti AMP! Extreme and I've never seen such a quiet yet effective cooler; honestly there's no need for water at all, and I'm an overclocker...

Kindly, Sergio


----------



## andressergio

Quote:


> Originally Posted by *fireyfire*
> 
> Dont use the firestorm software for overclocking, last time I checked you cant use the V/F curve OC with it anyways. Use the MSI Afterburner BETA, It can control the voltage on my AMP! card just fine


Yes, I reported to ZOTAC that it's buggy; I'm using MSI Afterburner. Thanks!


----------



## GreedyMuffin

Quote:


> Originally Posted by *Latchback*
> 
> Hi all, I made a post on "GTX 1070/1080/TITAN X(2ND GEN) BIOS - Who has it?" about my results with the new XOC bios. I will share it here too since I found the bios interesting.
> 
> For me, the new XOC bios t4 helps my card out. I have a 1080 Gigabyte G1 Gaming.
> 
> Pros:
> I get 200+ more points with the xoc bios with same clocks in timespy. Fire Strike Compare
> 
> No more throttling - In multiple games, I would get major power throttle and the card would lower its clocks significantly (which would show up as a frame drop). More stable frames with bios.
> 
> Stable voltage at 1.093V (no fluctuations). Note: to limit the voltage I put the Core Voltage slider at 0% and using the voltage curve I set 1.093 as the highest overclock and then for any above points/voltage I set the same clock as as the 1.093V, it should look like
> 
> 
> http://imgur.com/nVt2d
> 
> . I have not thoroughly tested with higher voltages, but using this method does allow me to set the max attainable voltage using the curve (though you may need to fiddle with the voltage slider) to attain higher overclocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cons:
> You guessed it - more heat!
> 
> Since the card does not throttle anymore, and stays at a stable 1.093V, my cards heats up a bit more. With default "auto" fan profile (and about 24C-26C ambient) with the original bios I get temps from the range of 70-74C max when gaming. On the xoc bios the range is more like 78-82C max with the "auto" fan profile. Note: I did start to get artifacts/crashes sometimes when the card got into the 80's range, so I just made a custom fan profile to bump up the fan speed a little bit to keep it <80C (which keeps it stable and quiet for me).
> 
> Though I am able to set voltages above 1.093V and get better overclocks, the card started heating up more and more to the point where I had to set the fan to 100% and it was still a bit too warm.
> 
> I do plan on eventually installing or modding the card with a hybrid water cooling AIO. I know EVGA sells this one, but I am not sure I can mod that one since I have a different PCB. Anyone have a source or know someone that has modded their gigabyte g1 gaming with a AIO water cooler? I know some people have slapped the AIO 980 TI on their 1080 FE, but this would a bit different as I would have to cool the rest of the card with fans (VRM, etc). I am sure I could get better overclocks with better temperatures; or at the very least cooler/quieter.


+Rep!

Will install it when I'm finished playing R6S.

You got the flash tool download for me?


----------



## Latchback

Quote:


> Originally Posted by *GreedyMuffin*
> 
> +Rep!
> 
> Will install it when I'm finished playing R6S.
> 
> You got the flash tool download for me?


Sure. Attached is the NVFLASH.

nvflash5.292.zip 927k .zip file


Use these commands:
nvflash -i0 --protectoff
nvflash -i0 -6 x.rom

where x.rom is the name of the BIOS file.

Also, please save a backup of your BIOS using GPU-Z first.

Be warned: the Strix has 2 HDMI ports and 2 DP, whereas many other cards have 3 DP and 1 HDMI. So when flashing this, do not use the DP port next to the HDMI port, as it will cause issues/not work.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Latchback*
> 
> Sure. Attached is the NVFLASH.
> 
> nvflash5.292.zip 927k .zip file
> 
> 
> Use commands:
> nvflash -i0 --protectoff
> nvflash -i0 -6 x.rom
> 
> where x.bios where x is the name of the bios.
> 
> Also, please save a backup of your bios using GPU-Z first.
> 
> Be warned the strix has 2 HDMI slots and 2 DP, whereas many of other cards have 3 DP and 1 HDMI. So when flashing this do not use the DP next to the hdmi port as it will cause issues/not work.


Thanks! I already have my or.g bios backed up to my email, and I have switched the DP cable. I'm using the port that is right below my DVI.

Rep!

Thanks!


----------



## karelbastos

Has anyone tested this XOC t4 BIOS with the ZOTAC 1080 FE?

Thanks


----------



## GreedyMuffin

Quote:


> Originally Posted by *Latchback*
> 
> -snip-


The folder is empty/no flash utilities in it.


----------



## Latchback

Quote:


> Originally Posted by *GreedyMuffin*
> 
> The folder is empty/no flash utilities in it.


You sure? I just downloaded it. It's a zip that contains nvflash.exe and nvflsh64.sys. You have to open command prompt (as administrator), go to the nvflash folder (with the two files and the BIOSes) and use the commands I listed.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Latchback*
> 
> You sure? I just downloaded it. It is a zip that contains nvflash.exe and nvflsh64.sys. You have to open command prompt and go to the nvflash folder (with the two files, and the bioses) and use the commands I listed.


NVM! I managed to get it working with winrar. Duh. Thanks again!


----------



## GreedyMuffin

Sorry for bothering you again, but I can't get it to work properly. If I press a button it goes one menu down, and after 6-8 presses it exits. I have flashed several times before, but I forget how every time. Hehe.


----------



## pantsoftime

Quote:


> Originally Posted by *nexxusty*
> 
> Seriously astonishes me how many of you just flash BIOS's still..... lol.


At least this time people are seeing tangible results. Plus we get extra entertainment each time gangsta bricks his card.


----------



## nexxusty

Quote:


> Originally Posted by *pantsoftime*
> 
> At least this time people are seeing tangible results. Plus we get extra entertainment each time gangsta bricks his card.


Haha haha. LOL'd nicely off that.


----------



## GanGstaOne

Quote:


> Originally Posted by *pantsoftime*
> 
> At least this time people are seeing tangible results. Plus we get extra entertainment each time gangsta bricks his card.


Sorry to spoil your fun, but the card works just fine now.


----------



## Latchback

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Sorry for bothering again, but I can't get it to work properly.. If i press a button it will go one menu down, after 6-8 presses it will exit. I have flashed several times before, but I forget how to every time. Hehe.


Alright, you can't just click it; you need to use command prompt. I will list all the steps needed.

1. Make a folder called "nvflash" in the root of the C: drive (where the Program Files and Windows folders are). Put both nvflash.exe and nvflsh64.sys in there. Also put both ROMs in there: the OC Strix one and your old ROM. I renamed the ROMs strix.rom and old.rom.

2. Open command prompt as administrator by going to the start menu and typing "cmd". When "cmd.exe" shows up, right-click it and choose Run as administrator.

3. Now that command prompt is open we want to move to the nvflash folder where all the stuff is. Type "cd c:\nvflash" without the quotes.

4. Now we can flash cards.
First,
"nvflash -i0 --protectoff" no quotes, screen will flicker, wait about 10 seconds.

Then type,
"nvflash -i0 -6 strix.rom" no quotes; your screen will flicker and it will ask you TWICE if you really want to do this. Press "y" each time to continue.

You will have to reboot for it to take effect, and possibly reinstall drivers. If you named your ROM something other than strix.rom, use that name instead.

To go back to your original rom follow the same steps except use old.rom in place of strix.rom.

PLEASE, make sure you have another graphics card (or an integrated GPU) in case something goes wrong and you need to use it to "unbrick" your card. If that does happen, you will have to perform additional steps that are not listed above.


----------



## GreedyMuffin

Will do that now!

Thanks!

I got a GT210 in case something goes very wrong. I'm not scared of this flash thingy; I saved a friend's 780 when he flashed his wrong, so I am familiar with the process.

And I'm folding, so I've got 3x 980 Tis if the 210 doesn't work.









+Rep!


----------



## GreedyMuffin

Got it all working now, the only issue is:

*ERROR: GPU mismatch*

Any idea what that is? I renamed the ROM to EVGA.rom so it would be easier to type. I do have a mobo with PLX chips; can that have something to do with it?


Spoiler: Warning: Spoiler!



NVIDIA Firmware Update Utility (Version 5.292.0)
Simplified Version For OEM Only

Checking for matches between display adapter(s) and image(s)...

Adapter: PLX (8747h) (10B5,8747,10B5,8747) H:--:NRM S:00,B:07,D:00,F:00

NOTE: EEPROM does not contain board ID, skipping board ID check.
WARNING: Firmware image PCI Vendor ID (10DE)
does not match adapter PCI Vendor ID (10B5).
WARNING: None of the firmware image compatible PCI Device ID's
match the PCI Device ID of the adapter.
Adapter PCI Device ID: 8747
Firmware image PCI Device ID: 1B80
WARNING: Firmware image PCI Subsystem ID (1043.8594)
does not match adapter PCI Subsystem ID (10B5.8747).
NOTE: Exception caught.
Nothing changed.

Nothing changed.

ERROR: GPU mismatch

c:\nvflash>


----------



## GreedyMuffin

I used the command, but instead of -i0 I used -F:

'nvflash -F -6 EVGA.rom'

It didn't ask me to press Y twice though, is that OK?


----------



## GreedyMuffin

It worked. Thanks!

Says ASUS in GPU-Z now, and it displays a picture. Thanks!


----------



## nexxusty

Lol.

As long as you're having fun boys.


----------



## ssgwright

wow, with that new XOC bios I'm getting no throttling? I also get 1.111v instead of 1.093? I'm gonna go test Fire Strike and Time Spy and report back. I've got the Zotac FE btw.


----------



## GreedyMuffin

Quote:


> Originally Posted by *ssgwright*
> 
> wow, with that new XOC bios I'm getting no throttling? I also get 1.111v instead of 1.093? I'm gonna go test firstrike and timespy and report back. I've got the Zotac FE btw.


Same here. I'm still running stock though. (2139). But no more throttling in BF4! :-D


----------



## grimboso

Do you think a slim 240 and a regular 360 are enough for a delidded 6700K and a 1080? I want the fans to run as silently as possible, Vardars at 1000 rpm or so. Considering adding an extra slim 240 but I dunno. I am ditching my current scratch build because I bought a 3D printer and I will redo it.


----------



## ssgwright

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Same here. I'm still running stock though. (2139). But no more throttling in BF4! :-D


nice! looks like my max clocks for benching are 2100.. here's my firestrike ultra run, my highest score so far!!


----------



## Benjiw

Quote:


> Originally Posted by *grimboso*
> 
> Do you think a slim 240 and a regular 360 is enough for a deliddad 6700k and a 1080? I want to have the fans go as silent as possible. Vardars at 1000 rpm or so. Considering adding an extra slim 240 but I dno . I am ditching my currently scratchbuild because I bought a 3D printer and I Will redo it.


I have a silent rig with 480mm and 360mm rads, a 1.3v 970 and a 1.5v 4670K; can't hear anything at idle, and while gaming you can only just hear it.


----------



## Latchback

Quote:


> Originally Posted by *ssgwright*
> 
> wow, with that new XOC bios I'm getting no throttling? I also get 1.111v instead of 1.093? I'm gonna go test firstrike and timespy and report back. I've got the Zotac FE btw.


If you want to set the voltage, you have to open the voltage/frequency curve in afterburner using CTRL + F.

Then set the highest voltage you want at the highest clock speed, then for the rest of the voltages, keep the clock speed at the same as the one you put for the highest voltage. For example, I wanted to keep my card at 1.093V so I made the voltage/curve look like this:


http://imgur.com/nVt2d


So you can run higher voltages with this bios, but beware: it will get much hotter and more unstable, so bump up the clocks and voltage slowly using the method above. You can go up to about 1.2V.
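If you'd rather see the flat-curve idea in code, here's a rough sketch; the (voltage, clock) pairs are made-up example values, not a real card's curve, and this is just a model of the clamp, not Afterburner's actual API:

```python
# Model of the "flat curve" trick: pick a target voltage, then clamp every
# point at or above it to the clock chosen at that voltage, so the card
# never requests more voltage in order to clock higher.

def flatten_curve(curve, target_voltage):
    """Return a new V/F curve where no point at or above target_voltage
    exceeds the clock assigned at target_voltage."""
    # the clock chosen at the target voltage (highest point <= target)
    target_clock = max(clk for v, clk in curve if v <= target_voltage)
    return [(v, clk if v < target_voltage else target_clock)
            for v, clk in curve]

if __name__ == "__main__":
    # hypothetical stock curve: clocks keep rising with voltage
    stock = [(1.000, 1987), (1.050, 2025), (1.093, 2088),
             (1.150, 2114), (1.200, 2139)]
    print(flatten_curve(stock, 1.093))
```

The lower points stay untouched and everything from 1.093V upward gets pinned to the 1.093V clock, which is the same shape as the flattened curve linked above.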


----------



## fat4l

Well guys, I used this new bios on my FE, and no joy.
3D is unstable. I can't even keep clocks that are stable on the stock bios stable. 3DMark crashes after 10-15s, even with 1.093v, even with 1.2v... Hmmmm. Not sure why?
I only did the TDP hard mod + flashed this bios....


----------



## grimboso

Quote:


> Originally Posted by *Benjiw*
> 
> I have a silent rig with a 480 and 360mm rads with a 1.3v 970 and 1.5v 4670k, can't hear anything at idle and gaming, you can hear it only just.


Allright. I got a 1.3v 6700k and the standard 1.09v 1080 so I figure it will be somewhat cooler. Might be an idea to squeeze in both of the two rads, as I plan to go SLI at black friday.


----------



## Latchback

Quote:


> Originally Posted by *fat4l*
> 
> Well guys, I used this new bios on my FE, and no joy.
> 3D is unstable. Cant even get "old" clocks that are stable on stock bios, stable. 3D mark crashing after 10-15s. Even with 1.093v. Even with 1.2V....Hmmmm. Not sure why ?
> I only did TDP hard mod + flashed this bios....


My guess would be that the original bios + TDP hard mod performs better than a different vendor's bios, which makes some sense! Since you weren't throttling, it's unlikely to give you any benefit, though it might for people who aren't doing hard mods.

Is it stable at lower clocks at all? It's possible this bios just isn't going to be stable for you. I would stick with your own bios on your already-modded card.


----------



## GreedyMuffin

Tried re-installing drivers?

2200 was a no go, tested at 1.09V, then 1.14V. Didn't want to go higher since the FE's power delivery is nothing special.

2175 at 1.075V seems stable though. Not too bad of a voltage either.


----------



## Derpinheimer

Quote:


> Originally Posted by *grimboso*
> 
> Do you think a slim 240 and a regular 360 is enough for a delidded 6700k and a 1080? I want to have the fans go as silent as possible. Vardars at 1000 rpm or so. Considering adding an extra slim 240 but I dno . I am ditching my currently scratchbuild because I bought a 3D printer and I Will redo it.


Definitely enough. I've got a super-thick 120mm and a "standard" 540mm. A single 180mm fan at 800rpm keeps GPU temps under 46 in an hour or so. The i7 3820 at 1.39v / 4750 stays below 62.

Ambient 24.

Just FYI, I'm not stupid, I just haven't finished putting the fans in yet.


----------



## ssgwright

Quote:


> Originally Posted by *Latchback*
> 
> If you want to set the voltage, you have to open the voltage/frequency curve in afterburner using CTRL + F.
> 
> Then set the highest voltage you want at the highest clock speed, then for the rest of the voltages, keep the clock speed at the same as the one you put for the highest voltage. For example, I wanted to keep my card at 1.093V so I made the voltage/curve look like this:
> 
> 
> http://imgur.com/nVt2d
> 
> 
> So you can do voltages higher with this bios but beware it will get much hotter and more unstable, so bump up the clocks, voltage slowly using the method of above. You can go up to about 1.2V


thanks, but I understand the curve... what I was saying is that my max voltage before was 1.093; now with this bios I get a 1.111 max, about a 0.018 increase

EDIT: oh wow, it boosts all the way to 1.2! Guess I should take the time to read all the posts I missed in the last couple of days lol


----------



## starichok

Hello all.
Quick question: when I overclock my Asus Strix 1080, I've noticed that as soon as I hit 2000MHz+ my core clock starts throttling down to 1950-2000MHz, but my temperature is between 67-72 max...
I get good scores in all the 3DMark tests:
Fire Strike: 23400 card score
Fire Strike Extreme: 11200 card score
Time Spy: 7500
But the throttling is bothering me, since I'm not hitting the 83C where the card should start downclocking due to temperature.
Can anyone give me a heads up on what the reason might be?
Thank you dudes


----------



## Bogga

Ok friends... here comes more questions!

I've downloaded the Strix bios... is that an upgraded bios or the same one the Strix cards come with? I've got two Strix cards, so if it's an upgrade I should have 0 issues (I hope)

So, as I said... I've got two Strix cards... shall I take one out and try flashing just one first, or can I just disable SLI? I read the description a couple of posts back, can I just follow that?

EDIT: And which version of nvflash should I use (where to DL?)


----------



## Derpinheimer

Quote:


> Originally Posted by *starichok*
> 
> Dear all hello.
> Have quick question,when i overclock my Asus strix 1080 I have noticed that as soon as I hit 2000mhz plus my core clock start throttling down to 1950-2000mhz but my temperature is between 67-72 max...
> I have good score at all 3d marks :
> fire strike 23400 card score
> Fire strike Extreme: 11200 card score
> Time Spy:7500
> But throttling is bothering me since I'm not hitting 83C when card should start DE clocking due to temperature.
> Anyone can give me heads up what might be the reason?
> Thank you dudes


It's probably temps. These cards throttle well before 83C. Mine starts to throttle in the mid 40s.
Quote:


> Originally Posted by *Bogga*
> 
> Ok friends... here comes more questions!
> 
> I've downloaded the strix bios... is that an upgraded bios or is it the same one as the strix cards comes with? I've got two strix cards, so if it's an upgrade I should have 0 issues (I hope)
> 
> So, as I said... I've got two strix cards... shall I take one of them out and try it with one of them first or can I just disable SLI? I read the description a couple of posts back, can I just follow those?


I believe it's better than the stock bios, and it also allows voltage >1.1v (I don't think that's the default for the Strix? Not sure)

I think you will be fine disabling SLI and flashing just one of them.


----------



## starichok

Yes, I understand that it is due to temperature... but how can I fix this issue, since the temperature is not too high and the card shouldn't throttle... Maybe I should change the card's bios, or maybe there is something else that can be done besides water cooling... but I don't think I need that with temps topping out at 72.
Thx


----------



## stxe34

hi, I have a question. I have two Zotac 1080 FEs; I can reach 2100MHz on both cards and around +500 on memory, and I don't need any extra voltage to keep them stable. However, I notice running these speeds causes artefacts like red dots flashing. I tried increasing the voltage and playing with the memory speeds, even setting the memory offset to zero, and I still get them. I can only stop them by bringing the clock speed down to 2088MHz. Is that the max for my card, or will a different bios help me out? Heat is not an issue as they are under water.
thanks


----------



## Derpinheimer

Quote:


> Originally Posted by *starichok*
> 
> Yes i understand that it is due to temperature...But how can I fix this issue,since tepretaure is not too high and card shouldn't throttle ...maybe I should change card bios or maybe there is something ells that can be doe besides water cooling...but I don't think I need one with Temp 72 top
> Thx


Only thing you can do at the moment is to run fans at 100%.

You could also give the STRIX bios posted here a shot:
Quote:


> Originally Posted by *fat4l*
> 
> Has anyone tried this ?
> 
> http://forum.hwbot.org/showpost.php?p=455871&postcount=20
> _
> "New bios which has improved performance at the same clock speed (+400 points Fire Strike GPU score). Note that it does not have the overclocked frequency by default."_
> http://www.mediafire.com/download/22ifgc9yuk72eyg/strix1080xoc_t4.zip
> 
> I see ppl are getting 2.3GHz and 1.2V with strix OC bios.....curious if the real performance is rly there...


My final results with that bios on an EVGA FTW, using just one game, Battlefield 4's Test Range (I believe this to be a fairly reliable test as it is very consistent in what is being rendered. Also I'm lazy and don't want to run a ton of benches just to compare):

Both boosting to 2025MHz and 11160MHz memory: (MENU/TANK SPAWN)FPS
FTW: 157/164
STRIX: 159/166

Max OC (FTW = 2164/11160 @ 1.093V / STRIX = 2190/11160 @ 1.182V)
FTW: 168/176
STRIX: 171/179

So the net result is
clock:clock, STRIX is +1.25%
max clock, STRIX is +1.75%

For now I'll be sticking to the standard FTW bios as - and I may be crazy - it seems like I get a lot more coil whine with the STRIX bios.
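The percentages are just the per-scene FPS gains averaged; a quick sketch to reproduce them from the raw numbers above:

```python
# Reproduce the quoted deltas from the BF4 Test Range FPS pairs
# (menu FPS, tank-spawn FPS) for each BIOS.

def pct_gain(baseline, other):
    """Average percent FPS gain of `other` over `baseline` across scenes."""
    gains = [(b - a) / a * 100 for a, b in zip(baseline, other)]
    return sum(gains) / len(gains)

if __name__ == "__main__":
    ftw_stock, strix_stock = (157, 164), (159, 166)
    ftw_max, strix_max = (168, 176), (171, 179)
    print(f"clock-for-clock: {pct_gain(ftw_stock, strix_stock):+.2f}%")  # +1.25%
    print(f"max OC:          {pct_gain(ftw_max, strix_max):+.2f}%")      # +1.75%
```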


----------



## steverebo

tried to flash my 1080 and keep getting a double beep and "cannot open strix.rom" any ideas???


----------



## schoolofmonkey

Quote:


> Originally Posted by *starichok*
> 
> Dear all hello.
> Have quick question,when i overclock my Asus strix 1080 I have noticed that as soon as I hit 2000mhz plus my core clock start throttling down to 1950-2000mhz but my temperature is between 67-72 max...
> I have good score at all 3d marks :
> fire strike 23400 card score
> Fire strike Extreme: 11200 card score
> Time Spy:7500
> But throttling is bothering me since I'm not hitting 83C when card should start DE clocking due to temperature.
> Anyone can give me heads up what might be the reason?
> Thank you dudes


I have the same card and noticed the same problem; if I increase the fan curve, the clocks stay higher.
Usually around 50C you'll start noticing them drop slightly as the temps increase, yet my card never goes over 64C even with a 2000MHz OC (because it can't sustain 2000MHz).
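For intuition, the binning behaviour can be modelled roughly like this; the threshold, degrees-per-bin, and bin size below are assumptions for illustration (they vary card to card), not NVIDIA-documented numbers:

```python
# Rough model of GPU Boost 3.0 temperature binning: above a soft threshold
# the boost clock drops one bin every few degrees, long before the 83C limit.

BIN_MHZ = 13           # size of one Pascal boost bin (assumed)
SOFT_THRESHOLD_C = 40  # temperature where binning starts (varies per card)
DEGREES_PER_BIN = 5    # temperature rise per dropped bin (assumed)

def boost_clock(max_boost_mhz, temp_c):
    """Estimated sustained boost clock at a given GPU temperature."""
    if temp_c <= SOFT_THRESHOLD_C:
        return max_boost_mhz
    bins_dropped = (temp_c - SOFT_THRESHOLD_C) // DEGREES_PER_BIN
    return max_boost_mhz - bins_dropped * BIN_MHZ

if __name__ == "__main__":
    for t in (38, 50, 65, 72):
        print(t, boost_clock(2050, t))
```

With these numbers a card that boosts to 2050MHz when cool would sit around 1985MHz at 65C and 1972MHz at 72C, which is in the same ballpark as the drops people are reporting.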


----------



## TWiST2k

I was reading that you used to be able to take a Precision X skin and load it into Afterburner to enable the K-Boost option, but I guess they patched that method recently. Does anybody know of a new way to enable K-Boost with Afterburner on non-MSI cards?


----------



## fat4l

Is there any other bios to try out on FE ?


----------



## Phinix

I was thinking, since the MSI 1080 Seahawk uses a Corsair H55 cooler, could you technically replace it with a H110 for example since it has the same pump/mounting design?

I wonder if anyone has opened one up yet to check.


----------



## ssgwright

the new xoc bios is the best, can use 1.2v and no throttling! (for me at least)


----------



## octiny

Quote:


> Originally Posted by *ssgwright*
> 
> the new xoc bios is the best, can use 1.2v and no throttling! (for me at least)


+1. Finally a good OC bios.


----------



## steverebo

How are you guys controlling the voltage? Mine tops out at 1.112 on this bios, and the power/temp limits are greyed out so I can't adjust them????


----------



## ssgwright

Quote:


> Originally Posted by *steverebo*
> 
> how are you guys controlling the voltage mine tops out at 1.112 on this bios and the power/temp limits are greyed out so I cant adjust them????


ya, the power and temp limits are maxed, so make sure you've got good cooling. Use the curve to get the voltage up to 1.2; someone posted a good description of how to do it a couple pages back.


----------



## nyk20z3

Any one running the Gigabyte Extreme Gaming 1080 and how do you like it ?


----------



## starichok

When I overclock I can hit 2050MHz, but it only lasts a few minutes until the card hits 65C, and then it drops down to 2000-2012MHz... I thought maybe we could do something so the card keeps working at 100% unless it is really overheated, at 80C...


----------



## moustang

Quote:


> Originally Posted by *Phinix*
> 
> I was thinking, since the MSI 1080 Seahawk uses a Corsair H55 cooler, could you technically replace it with a H110 for example since it has the same pump/mounting design?
> 
> I wonder if anyone has opened one up yet to check.


You probably could, but why bother?

If you're going to go that direction it would be cheaper to get the Gaming, an NZXT G10 bracket, and then the H110. Save yourself $20 and end up with the exact same thing.


----------



## schoolofmonkey

Which OC software are people using, I tried Asus GPU Tweak II and it was horrible, wouldn't even load at startup.
Went back to MSI Afterburner and it just works.

Quote:


> Originally Posted by *Phinix*
> 
> I was thinking, since the MSI 1080 Seahawk uses a Corsair H55 cooler, could you technically replace it with a H110 for example since it has the same pump/mounting design?
> 
> I wonder if anyone has opened one up yet to check.


No, it actually uses a different mount, same pump etc., similar to the MSI GTX 980 Ti Seahawk:
http://media.gamersnexus.net/images/media/2015/nvidia/sea-hawk/msi-sea-hawk-9.jpg

You can't swap the mounting plate onto a different AIO cooler without it leaking.


----------



## fat4l

Quote:


> Originally Posted by *Latchback*
> 
> Alright, you cant just click it you need to use command prompt. I will list all steps needed.
> 
> 1. Make a folder called "nvflash" that is located in the root of the C: drive (where the program files, and windows folders are). Put both nvflash.exe and nvflsh64.sys in there. Also put both the roms in there. The oc strix and the old rom. I renamed the roms, strix.rom and old.rom.
> 
> 2. Open command prompt with administrative by going to start menu, and typing "cmd". Then when "cmd.exe" shows up, right click it and click run as administrator.
> 
> 3. Now that command prompt is open we want to move to the nvflash folder where all the stuff is. Type "cd c:\nvflash" without the quotes.
> 
> 4. Now we can flash cards.
> First,
> "nvflash -i0 --protectoff" no quotes, screen will flicker, wait about 10 seconds.
> 
> Then type,
> "nvflash -i0 -6 strix.rom" no quotes, your screen will flicker and it will ask you TWICE if you really want to do this. Press the key "y" each time to continue.
> 
> You will have to reboot to take effect, and possible reinstall drivers. If you named your rom different than strix.rom put that instead.
> 
> To go back to your original rom follow the same steps except use old.rom in place of strix.rom.
> 
> PLEASE, make sure you have another graphics card (or an integrated GPU) in case something goes wrong and you need to use it to "unbrick" your card. If that does happen, you will have to perform additional steps that are not listed above.


What is this command -i0 for? I flashed my card without using it.
Also do we have to use --protecton after flashing?


----------



## ssgwright

Quote:


> Originally Posted by *fat4l*
> 
> What is this command -i0 for? I flashed my card without using it.
> Also do we have to use --protecton after flashing?


-i0 identifies which card you want to flash (unneeded if you're not running SLI). --protectoff is needed to flash a different vendor's bios (MSI to Asus, etc.)


----------



## ucode

Quote:


> Originally Posted by *fat4l*
> 
> What is this command -i0 for? I flashed my card without using it.
> Also do we have to use --protecton after flashing?


I've always used just -6.


----------



## fat4l

Ok cool







thanks guys.

Is there any other bios worth trying on the 1080 FE?

Also, there's a weird thing going on.

If I run my commands and do nvflash -6 xxxxx.rom, my screen blinks and goes to a super low resolution, and cmd closes....
Is that normal?
The only thing I can do is restart the PC and try to flash again....


----------



## ucode

Is it closing or going off screen? If going off screen put the cmd box up in the top left corner of the screen before running and you should still see it when res goes low. With -6 we're usually asked to confirm with "y" before flashing goes ahead.


----------



## ssgwright

Quote:


> Originally Posted by *fat4l*
> 
> Ok cool
> 
> 
> 
> 
> 
> 
> 
> thanks guys.
> 
> Is there any other bios worth trying for 1080 FE ?
> 
> Also theres a weird thing going on.
> 
> If I do my commands and I do nvflash -6 xxxxx.rom my screens blinks and goes to super low resolution and cmd closes....
> Is that normal?
> The only thing I can do is restart pc and try to flash again ....


when you flash you have to disable the display adapter first


----------



## xTesla1856

I have a Gaming X coming next week; how hard is it to flash the Gaming Z vBIOS? I did a bunch of BIOS flashing when I had my Titans, but with Pascal everything seems a bit different.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *xTesla1856*
> 
> I have a Gaming X coming next week, how hard is it to flash the Gaming Z vBios? I did a bunch of Bios flashing when I had my Titans, but with Pascal, everything seems a bit different


Same thing, just use the newest nvflash.

*https://www.techpowerup.com/downloads/2709/nvflash-5-292-0-for-windows*


----------



## TWiST2k

Quote:


> Originally Posted by *ssgwright*
> 
> when you flash you have to disable the display adapter first


On the 980 Ti I had to disable the card first; when I use protectoff I do not have to disable the card first.


----------



## stxe34

At what frequencies are people seeing artefacts? Is seeing artefacts the max for the card, or will a bios flash help me get rid of them? I have tried playing with settings, but the only thing that works is lowering the boost clock.


----------



## steverebo

New bios flashed, and it has made a difference, but not as much as I had hoped. It got me from 2038 to 2088; I can't get above 2088 at 1.2v, and at 2.1 I get a display driver crash. A bit disappointed really, as it's sitting at 45 degrees at full load, so I have loads of cooling headroom.

Got a 5830 graphics score in 3DMark Fire Strike Ultra.

How does that compare to you guys???


----------



## THEROTHERHAMKID

What's the best temp to keep my 1080 G1 at? Is 65 too hot? Will it throttle too much?


----------



## TWiST2k

Quote:


> Originally Posted by *steverebo*
> 
> New bios flashed and it has made a difference but not as much as I had hoped got me from 2038 to 2088, cant get above 2088 at 1.2v at 2.1 I get the display driver crash. bit disappointed really as its sitting at 45 degrees full load so I have loads of cooling headroom
> 
> Got 5830 Graphics score 3D MARK FIRESTRIKE ULTRA
> 
> How does that compare to you guys???


I will give it a try here in a sec and see what scores I get!


----------



## Whitechap3l

Quote:


> Originally Posted by *steverebo*
> 
> New bios flashed and it has made a difference but not as much as I had hoped got me from 2038 to 2088, cant get above 2088 at 1.2v at 2.1 I get the display driver crash. bit disappointed really as its sitting at 45 degrees full load so I have loads of cooling headroom
> 
> Got 5830 Graphics score 3D MARK FIRESTRIKE ULTRA
> 
> How does that compare to you guys???





Funnily enough, in Ultra the card doesn't peak as high ;D

Fire Strike: 2190MHz GPU: 25,726
Ultra: 2177MHz GPU: 5,990
But I only get 1.08V with the Strix OC bios (flashed my non-OC Strix)


----------



## wangle0485

So I thought I would give this Strix OC bios a whirl on my Super Jetstream, and I've come across an issue I'm hoping someone can help me with: the dual-bios function on this card doesn't seem to be working.

I flicked the switch on the card to the no. 2 position and flashed the Strix bios. Everything worked fine: GPU-Z shows Asus as the vendor, correct clocks etc. I got no improvement overclocking (core stable at 2050, memory at 10950), so I decided just to go back to the stock bios. But when I flicked the switch back to position 1, GPU-Z still shows Asus as the vendor with its factory clock speeds.

I'm guessing either I've done something wrong when flashing or the dual-bios feature is dead on this card. Not too sure I really want to RMA this card, as 2050/10950 seems an OK overclock; could be better, could be worse....

In nvflash I removed protection then flashed the bios, if that makes any difference.

Any help would be greatly appreciated!


----------



## metal409

Quote:


> Originally Posted by *wangle0485*
> 
> So I thought I would give this Strix OC bios a whirl on my Super Jetstream and i've come across an issue i'm hoping someone can help me with- The dual bios function on this card doesnt seem to be working.
> 
> I ficked the switch on the card to the no. 2 position and flashed the Strix bios, everything worked fine, GPU-Z shows Asus as the vendor, correct clocks etc. I got no improvement overclocking (Core stable at 2050, memory at 10950)) so I decided just to go back to the stock bios. So I flicked the switch back to position 1 but GPU-Z is still showing Asus as the vendor with factory clock speeds.
> 
> I'm guessing either i've done something wrong when flashing or the dual bios feature is dead on this card. Not too sure I really want to RMA this card as 2050/10950 seems an ok overclock, could be better but could be worse....
> 
> In nvflash I removed protection then flashed the bios if that makes any difference.
> 
> Any help would be greatly appreciated!


Have you tried flashing back to your original bios and trying then?


----------



## wangle0485

Yeah, I flashed it back using position 1; it then shows the original bios in both positions. It's like the switch is just decorative.


----------



## metal409

Quote:


> Originally Posted by *wangle0485*
> 
> Yeah I flashed it back using position 1, it then shows the original bios in both positions. Its like the switch is just decorative.


I'm not completely sure how the bios switching is handled; hopefully someone will be of more help. Makes me wonder if flashing to the Asus bios in position 2 first and then back to the original in position 1 does anything to the order/slot. The Asus bios may have just ignored the switch altogether, not knowing how to handle it?

Did the switch originally change clocks or anything at all? Palit's site makes it sound like it's there in case of a problem with the bios: the card can auto-switch and still be functional.


----------



## fat4l

Quote:


> Originally Posted by *ssgwright*
> 
> when you flash you have to disable the display adapter first


How do i do it?
Whats the correct steps?


----------



## TWiST2k

Might be a silly question, but did you reboot after you flashed? To disable your device, go to Device Manager, find your graphics card, right-click it and disable it. But I have never had to disable it to flash the 1080; after running protectoff it seems to get the job done.


----------



## Derpinheimer

Quote:


> Originally Posted by *wangle0485*
> 
> So I thought I would give this Strix OC bios a whirl on my Super Jetstream and i've come across an issue i'm hoping someone can help me with- The dual bios function on this card doesnt seem to be working.
> 
> I ficked the switch on the card to the no. 2 position and flashed the Strix bios, everything worked fine, GPU-Z shows Asus as the vendor, correct clocks etc. I got no improvement overclocking (Core stable at 2050, memory at 10950)) so I decided just to go back to the stock bios. So I flicked the switch back to position 1 but GPU-Z is still showing Asus as the vendor with factory clock speeds.
> 
> I'm guessing either i've done something wrong when flashing or the dual bios feature is dead on this card. Not too sure I really want to RMA this card as 2050/10950 seems an ok overclock, could be better but could be worse....
> 
> In nvflash I removed protection then flashed the bios if that makes any difference.
> 
> Any help would be greatly appreciated!


Same issue on an EVGA FTW. Try switching to the bios you want, then turn off the PSU power switch for about a minute, then boot up.


----------



## TWiST2k

Quote:


> Originally Posted by *Derpinheimer*
> 
> Same issue on an EVGA FTW. Try switching to the bios you want, then turn off the PSU power switch for about a minute, then boot up.


On my FTW I can just flip the switch to whatever bios I want to over write while I am in Windows and go ahead and flash it, do a standard reboot and I am good to go.


----------



## fat4l

Quote:


> Originally Posted by *TWiST2k*
> 
> Might be a silly question, but did you reboot after you flashed? And to disable your device go to device manager find your graphics card and right click and disable it. But I have never had to disable it to flash it before with the 1080, after running protect off it seems to get the job done.


yeah. I had to go through the flash process twice and reboot twice. Weird.

So when do I disable it? Before flashing? Before --protectoff? Or... after?


----------



## Derpinheimer

Quote:


> Originally Posted by *TWiST2k*
> 
> On my FTW I can just flip the switch to whatever bios I want to over write while I am in Windows and go ahead and flash it, do a standard reboot and I am good to go.


Weird. There's other people with the same problem on the evga forum.


----------



## Whitechap3l

Quote:


> Originally Posted by *fat4l*
> 
> yeah. I had to go through the flash process twice and reboot twice. weird
> 
> So when do I disable it ? Before flashing ? before --protectoff ? or..after ?


First I disable drivers in device manager
then:
cd C:\nvflash
nvflash --protectoff
nvflash -6 xxx.rom
and restart + enable drivers again


----------



## Cool Mike

My best graphics card purchase in a few years.

*1080 EVGA Classified.*

2152 GPU Boost Clock
2126 Sustained Boost (After down clocking due to temps)
1850 GPU Base Clock
11080 Memory Clock
130% Power Setting
100% Voltage


----------



## Whitechap3l

So I used the EVGA scanner now to adjust voltages, and I get my Strix with the flashed bios to 2228MHz and 11260MHz memory, stable.
I can go to 2253MHz, but it only runs stably in Fire Strike, not in Extreme.

But the score stays the same, roughly 25.5k points in Fire Strike.

Guess I am fine with that?


----------



## grimboso

Quote:


> Originally Posted by *Whitechap3l*
> 
> 
> So I used now Evga scanner to adjust voltages and I get my strix with flashed Bios now to 2228 MHz and 11260 memory clock stable
> I can go to 2253 MHz but it runs only stable in firestrike not in extreme.
> 
> But Scores stays the same roughly 25.5k points in Firestrike
> 
> Guess I am fine with that ?


When I tried to use the scanner, the GUI crashed and then just restarted from the beginning. How do I continue the scan after the graphics driver crashes? My best score is 24.7k; from what I can tell that is quite good, yes?


----------



## GreedyMuffin

If I use the curve in MSI Afterburner, 2177MHz at 1.075V gets a lower score than 2139 at the stock 1.050V.

Unstable GPU clock = decrease in performance?

Also, is 1.2V too much for the FE when benching?


----------



## Whitechap3l

Quote:


> Originally Posted by *grimboso*
> 
> When I tried to use the scanner, it crashes the scanner GUI and then just restarts from start again. How should I continue the scan after the graphic driver crashes? My best score is 24.7k from what I can tell that is quite good, yes?




Thats my curve I get the most mhz and score for me


----------



## fat4l

Hmmmm I'm doing *26100 Graphics Score*

http://www.3dmark.com/3dm/13944459


----------



## Whitechap3l

Quote:


> Originally Posted by *fat4l*
> 
> Hmmmm I'm doing *26100 Graphics Score*
> 
> http://www.3dmark.com/3dm/13944459


Nice! What are you using?


----------



## grimboso

Quote:


> Originally Posted by *fat4l*
> 
> Hmmmm I'm doing *26100 Graphics Score*
> 
> http://www.3dmark.com/3dm/13944459


That is some nice clocks, both the 5.1 on the cpu and 2.2 on the gpu








Please teach us


----------



## fat4l

Quote:


> Originally Posted by *Whitechap3l*
> 
> Nice! What are you using?


What do u mean? software?
Quote:


> Originally Posted by *grimboso*
> 
> That is some nice clocks, both the 5.1 on the cpu and 2.2 on the gpu
> 
> 
> 
> 
> 
> 
> 
> 
> Please teach us


Hmm







You gotta get some $$ and buy a CPU from siliconlottery.com. Then buy a few 1080s, bin them, keep the best one, put it under water, and remove the TDP limits.


----------



## Whitechap3l

Quote:


> Originally Posted by *fat4l*
> 
> What do u mean? software?
> Hmm
> 
> 
> 
> 
> 
> 
> 
> U gotta get some $$ and buy cpu from siliconlottery.com. Then buy a few 1080 and bin them and keep the best one and then put it under water and remove tdp limits


Yes, and maybe which BIOS?
You are a beast, man!
How much vcore do you have to use to hit 5.1GHz?


----------



## ssgwright

Here's mine, almost 25k on the GPU... 2100MHz.


----------



## dentnu

I don't think those high clocks are scaling correctly, as I get a total score of 20445 in Firestrike with clocks at 2139 core and 5580 memory. Running at 2.2GHz core should put your total score a lot higher than mine. I think that new BIOS is just as bad as the first one, which let you push clocks super high but gave you lower scores and FPS than the stock BIOS. You guys need to do some testing on that new BIOS. Maybe it's a good BIOS but you guys are hitting some type of power limit or something. All I know is that something does not look right.

http://www.3dmark.com/fs/9288454


----------



## wangle0485

Quote:


> Originally Posted by *Derpinheimer*
> 
> Same issue on an EVGA FTW. Try switching to the bios you want, then turn off the PSU power switch for about a minute, then boot up.


I'll give this a try this evening

Quote:


> Originally Posted by *metal409*
> 
> I'm not completely sure how the bios switching is handled, hopefully someone will be of more help. Makes me wonder if flashing in position 2 first to the Asus bios and then back to original in position 1 does anything to the order/slot. The Asus bios may have just ignored the switch all together, not knowing how to handle it?
> 
> Did the switch originally change clocks or anything at all? Palit's site makes it sound like its for in case of a problem with the bios it can auto switch and still be functional.


When I first switched to position 2 everything was exactly the same as position 1. I'll try flashing BIOS 1 with something this evening, switching to position 2, and seeing if anything changes. If it doesn't, I'll flip back to position 1 and see if anything happens.


----------



## fat4l

Quote:


> Originally Posted by *dentnu*
> 
> I don't think those high clocks are scaling correctly as I get a total score of 20445 in firestike. With clocks at core 2139 and memory 5580. Running at 2.2 core should put your total score allot higher than mines. I think that new bios is just as bad as the first one which allowed you to push clocks super high but you would get lower scores and FPS then stock bios. You guys need to do some testing on that new bios. Maybe its a good bios but you guys are hitting some type of power limit as something . All I know that something does not look right.
> 
> http://www.3dmark.com/fs/9288454


You gotta look at the graphics score only. The total score is affected by the CPU a lot!
My score at 2202MHz is about 3.2% higher than yours... and that's for ~60MHz more on the core.
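As a quick back-of-the-envelope check of that claim (the clock and score numbers are taken from the two posts above; the arithmetic is the only thing added):

```shell
# Sanity-check the clock-vs-score scaling from the two posts above.
# 2139MHz -> 2202MHz core, with a reported ~3.2% graphics-score gain.
clk_low=2139
clk_high=2202
score_gain=3.2   # percent, as reported

clk_gain=$(awk -v a="$clk_low" -v b="$clk_high" 'BEGIN { printf "%.1f", (b - a) / a * 100 }')
echo "core clock: +${clk_gain}%  ->  graphics score: +${score_gain}%"
```

A ~2.9% clock bump giving ~3.2% more score is close enough to linear that the card is still scaling; noticeably below-linear results usually mean a power or voltage limit is throttling the sustained clock.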
Quote:


> Originally Posted by *Whitechap3l*
> 
> Yes and maybe which Bios?
> You are beast man ??
> How much vcore voltage do you have to use for hitting the 5.1?


I'm using 1.35V, stable... passing RealBench for 4 hours easily.


----------



## Derpinheimer

Quote:


> Originally Posted by *dentnu*
> 
> I don't think those high clocks are scaling correctly as I get a total score of 20445 in firestike. With clocks at core 2139 and memory 5580. Running at 2.2 core should put your total score allot higher than mines. I think that new bios is just as bad as the first one which allowed you to push clocks super high but you would get lower scores and FPS then stock bios. You guys need to do some testing on that new bios. Maybe its a good bios but you guys are hitting some type of power limit or something . All I know that something does not look right.
> 
> http://www.3dmark.com/fs/9288454


I get higher max oc and higher performance per clock with it. It's very good.


----------



## dentnu

Quote:


> Originally Posted by *fat4l*
> 
> You gotta look at graphics score only. Total score is affected by CPU a lot!
> My socre at 2202MHz is about 3.2% higher than urs.... That's for 60MHz more on the core..


Quote:


> Originally Posted by *Derpinheimer*
> 
> I get higher max oc and higher performance per clock with it. It's very good.


Interesting. Can I get a link to this BIOS? And some more info on it, like what's the max voltage and has the power limit been disabled?
Thanks


----------



## Snabeltorsk

Quote:


> Originally Posted by *dentnu*
> 
> Interesting can I get a link to this bios ? Can I get some more info on it like whats max voltage and has power limit been disabled ?
> Thanks


 strix1080xoc_t4.zip 148k .zip file


No TDP limit; max voltage is 1.200V.


----------



## dentnu

Quote:


> Originally Posted by *Snabeltorsk*
> 
> strix1080xoc_t4.zip 148k .zip file
> 
> 
> No Tdp limit, max volt is 1.2000


+Rep Thanks!


----------



## Snabeltorsk

Quote:


> Originally Posted by *dentnu*
> 
> +Rep Thanks!


Just a reminder: you may have to change which DisplayPort you use, because one is disabled since the Asus has 2 DP and 2 HDMI.


----------



## xer0h0ur

Quote:


> Originally Posted by *Snabeltorsk*
> 
> strix1080xoc_t4.zip 148k .zip file
> 
> 
> No Tdp limit, max volt is 1.2000


So this is a FE compatible BIOS?


----------



## GreedyMuffin

Quote:


> Originally Posted by *xer0h0ur*
> 
> So this is a FE compatible BIOS?


Yep, running that on my FE.

One of your DP ports will stop working, since the Strix has 2 HDMI and 2 DP instead of the FE's 1 HDMI and 3 DP.


----------



## xer0h0ur

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yep, runnig that on my FE.
> 
> One of your DP ports will stop working. Since the strix has 2 hdmi and 2 dp instead of the FEs 1 hdmi and 3 dp.


Not a problem for me. I only use two DP ports with my two monitors.


----------



## pantsoftime

Quote:


> Originally Posted by *dentnu*
> 
> Interesting can I get a link to this bios ? Can I get some more info on it like whats max voltage and has power limit been disabled ?
> Thanks


All the info you're looking for is covered in the last few pages. It's an ROG Strix BIOS. There is no power or temperature limit. Max vcore is 1.20V. If you don't have an Asus card then you will lose one of your displayports since Asus uses 2x HDMI. Read back for BIOS links, testimonials, and flashing instructions.


----------



## fat4l

I don't use this Strix T4 BIOS; I will try it later. All the scores were done on my stock BIOS.


----------



## karelbastos

Has anyone flashed an FE with this BIOS and really gotten better benchmark results?

Or is it just a higher clock?

I have two ZOTAC 1080 FEs in SLI and I want to try. Here I get 2025MHz stable in games.

And if I flash my cards, can I go back to the original BIOS (assuming I saved it first, of course)?

Thanks


----------



## ssgwright

Quote:


> Originally Posted by *karelbastos*
> 
> Someone flashed a FE with this BIOS and really get a better Benchmark results ?
> 
> Or its just bigger clock ?
> 
> Thanks
> 
> Thanks
> 
> I have two ZOTAC 1080 FE and i want to try
> 
> Here i get 2025 stable on games.... SLI
> 
> And if i Flash my card, can i go back to original BIOS, if i saved this bios before of course
> 
> Thanks


Yeah, the new XOC BIOS is overclocking better and scoring better than my stock FE BIOS.

Edit: and yes, you can flash back.


----------



## karelbastos

So I can do the BIOS update, and if nothing gets better I can restore my old BIOS?

I have never updated a VGA BIOS; is it safe and easy to do?

Is there a possibility of killing the card?

Sorry for my English.

Thanks


----------



## ssgwright

Quote:


> Originally Posted by *karelbastos*
> 
> So
> 
> I can do the bios update, and if nothing get better i can return my bios back ?
> 
> I never update any VGA bios, it is safe and easy to do ?
> 
> Is there a possibility to kill the VGA ?
> 
> Sorry for my english
> 
> Thanks


yes you can flash back.

A bad flash is rare, but it happens: for example, I flashed a non-reference 1080 BIOS onto my FE and the card wouldn't boot. If this happens you just have to blind-flash it back, or use another video card to flash it. I've never heard of anyone permanently bricking a card from flashing.


----------



## karelbastos

Thanks again.

So the steps to flash are:

Disable the drivers in Device Manager

cd C:\nvflash
nvflash --protectoff
nvflash -6 xxx.rom
then restart and enable the drivers again

Are these the correct steps?

And since I use 2x 1080 SLI, do I need to remove one card and flash them one at a time?

And after flashing both, can I go back to SLI?

Thanks


----------



## stxe34

Use nvflash -i0 -6 xxx.rom for card 1, then nvflash -i1 -6 xxx.rom for card 2. No need to remove them.
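For anyone flashing two cards, that sequence can be sketched as a dry run first. The ROM filename and backup names below are placeholders, and DRYRUN=echo only prints the commands, so nothing is flashed until you clear it:

```shell
#!/bin/sh
# Dry-run sketch of flashing both cards in an SLI setup with nvflash.
# ROM and backup filenames are placeholders -- substitute your own.
ROM="xxx.rom"
DRYRUN=echo   # set DRYRUN= (empty) to actually run the commands, at your own risk

for i in 0 1; do
  $DRYRUN nvflash --index=$i --save "card${i}_backup.rom"   # back up the stock BIOS first
  $DRYRUN nvflash --index=$i --protectoff                   # disable write protection
  $DRYRUN nvflash --index=$i -6 "$ROM"                      # -6 overrides the board-ID mismatch prompt
done
```

Run nvflash --list first to confirm which index maps to which physical slot, then reboot and re-enable the drivers once both cards are flashed.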


----------



## stxe34

I'm still stuck with artefacts above 2088 even after the Strix BIOS flash. I have tried max voltage and zero memory OC. I notice it more in Project CARS and Crysis 3. If I set the OC to 2088, no artefacts. Does this mean I have reached the max for these cards? Thanks


----------



## karelbastos

Quote:


> Originally Posted by *stxe34*
> 
> use nvflash -i0 -6 xxx.rom for card 1 then nvflash -i1 -6 xxx.rom for card 2 no need to remove them


By card 1 you mean the first (upper) slot, and by card 2 the second (lower) slot?

Thanks

One other thing: I have 2x 1080 SLI.

One card on its own can reach +215 on the core clock, stable without crashes.

The other on its own can reach a max of +165 on the core clock, stable without crashes.

Will the BIOS update help my second card get past its silicon-lottery limit of +165 on the core?

Or will it stay the same?

Both are the same ZOTAC 1080 FE. Thanks


----------



## stxe34

nvflash --list will show you which cards are where


----------



## karelbastos

Quote:


> Originally Posted by *stxe34*
> 
> nvflash --list will show you which cards are where


I get "error unable to setup nvflash driver" here. Am I doing something wrong?

--list shows no devices here.

Do I need to do --protectoff first?


----------



## Asus11

Quote:


> Originally Posted by *Snabeltorsk*
> 
> strix1080xoc_t4.zip 148k .zip file
> 
> 
> No Tdp limit, max volt is 1.2000


Is this a different Strix BIOS?

I flashed one a few months ago and don't remember it having 1.2V.

When a few of us tested it, we found the FE BIOS was still better even though it ran lower clocks.

Is it the same with this one, or does it score better than the FE, which would make it the better BIOS despite what the clocks say?


----------



## Spiriva

Quote:


> Originally Posted by *karelbastos*
> 
> Error unable to setup nvflash driver here. I do something wrong ?
> 
> --list here show no devices
> 
> I need to do
> 
> --protectoff first ?


nvflash --index=0 --save 1080org.rom
nvflash --index=0 --protectoff
nvflash --index=0 -6 name.rom


----------



## chronicfx

Quote:


> Originally Posted by *Asus11*
> 
> disable g sync, close all other programs just have valley open and see how you get on obv you already know power slider to max and core slider to max


How do you disable g-sync?
Quote:


> Originally Posted by *Agavehound*
> 
> Hey gang, been tinkering with my FTW and finally was able to get it stable at +130/500 using the slave bios.
> 
> I ran Heaven without G-sync and it ran perfectly with a max temp of 57c and second run with G-sync off but I got some weird waves on my screen. What would cause the waves with G-sync off?


Screen Tearing?


----------



## karelbastos

Quote:


> Originally Posted by *Spiriva*
> 
> nvflash --index=0 --save 1080org.rom
> nvflash --index=0 --protectoff
> nvflash --index=0 -6 name.rom


Resolved the problem here.

I just restarted, and the nvflash commands work great.

But one other thing: I have 2x 1080 SLI.

One card on its own can reach +215 on the core clock, stable without crashes; the other can only reach +165.

Will the BIOS update help my second card get past its silicon-lottery limit of +165 on the core, or will it stay the same?

Both are the same ZOTAC 1080 FE. Thanks


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Asus11*
> 
> is this a different strix bios?
> 
> I flashed one a few months ago and dont remember it having 1.2v?
> 
> when me and a few tested we found out the FE was still better even though it had less clocks
> 
> is it the same for this? or is it better than FE as in scores better which in turn means its better despite what the clocks say


It's version two of the XOC BIOS, further improved:

*http://forum.hwbot.org/showpost.php?p=455871&postcount=20*


----------



## Asus11

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It's version two of the XOC bios, more improved:
> 
> *http://forum.hwbot.org/showpost.php?p=455871&postcount=20*


thanks going to test it now


----------



## Asus11

Quote:


> Originally Posted by *chronicfx*
> 
> How do you disable g-sync?
> Screen Tearing?


Via the Nvidia Control Panel.

It should be on the left; click it, untick the enable box, then click Apply, and that should be it.


----------



## karelbastos

So I have done the BIOS update and will test now.

I see that Afterburner shows no power limit or temp slider options.

I will now test whether I can get past my GPU's +165 core clock limit.

--index=1 is my second-slot card

--index=0 is my first PCIe-slot card

Thanks

So...

I tested, and the GPU that couldn't get past +165 on the core clock is still the same: it can't reach more than +165.

But this is a lottery; my other 1080 (I use SLI) can reach +215 stable.

Now I will test whether, with this Strix T4, I can get more performance at the same +165 core clock.


----------



## Asus11

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It's version two of the XOC bios, more improved:
> 
> *http://forum.hwbot.org/showpost.php?p=455871&postcount=20*


OK, flashed the BIOS. Is it normal for the power/temp sliders to be greyed out?

I don't mind because I'm on water; I just want to make sure the OC programs are working fine with the BIOS.


----------



## pantsoftime

Edit: Redundant post

Yes it's normal for the power and temp limits to be grayed out. This BIOS does that intentionally. Make sure your cooling is up to the task since nothing will protect your chip from burning itself up.


----------



## Asus11

Quote:


> Originally Posted by *pantsoftime*
> 
> Edit: Redundant post
> 
> Yes it's normal for the power and temp limits to be grayed out. This BIOS does that intentionally. Make sure your cooling is up to the task since nothing will protect your chip from burning itself up.


thank you,

yes its fine

watercooled..

time to test it to the FE

brb


----------



## Azazil1190

I too flashed the new XOC BIOS on my Strix OC, but in 3DMark FS the max voltage I get is 1.130V.
What is the way to get that damn 1.2V?

Sorry for my English.


----------



## Asus11

Quote:


> Originally Posted by *Azazil1190*
> 
> I flashed too the new xoc bios on my strix oc but in 3d mark fs the max voltage that I get is 1.130v.
> Who is the way to see that damn 1.2v
> 
> Sorry for my English.


ditto

also flashed this bios and don't see 1.2v


----------



## Azazil1190

Quote:


> Originally Posted by *Asus11*
> 
> ditto
> 
> also flashed this bios and don't see 1.2v


What is the max voltage that you get?
I don't know if the curve can help reach the 1.2V.


----------



## MrTOOSHORT

You have to use the curve in AB to get 1.2V:

*http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,2.html*


----------



## achilles73

Flashed this "new XOC BIOS" on my MSI 1080 Gaming X, and I can go higher in clocks, but not in performance...

I need to clock the XOC BIOS at 2126MHz (1.112V) to get the same performance my original MSI BIOS gives me at 2063MHz (1.093V).
(Tested with 3DMark, Valley, and Rainbow Six Siege.)
I can raise the voltage further, but it crashes and doesn't give more performance.


----------



## Asus11

Quote:


> Originally Posted by *Azazil1190*
> 
> What is the max voltage that you get.
> I dont know if the curve can help for the 1.2


1.093V...

I've never messed with the curve, only manual.

Can you unlock it with the curve?


----------



## Asus11

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Have to use the curve in AB to get 1.2v:
> 
> *http://www.guru3d.com/articles_pages/geforce_gtx_1080_overclocking_guide_with_afterburner_4_3,2.html*


Thanks. The curve makes my head hurt; I have no idea how it works or how to adjust it properly.

:-( Better get learning.


----------



## karelbastos

Tested.

For me, the ZOTAC FE BIOS works better than the T4 BIOS loaded onto the ZOTAC FE.

The voltage here on air reached a max of 1.13V using the Strix BIOS.

But no extra clock headroom or performance in the Heaven benchmark.

I will wait for an FE BIOS mod... maybe better results.

This lack of difference may just be because I have no luck with this GPU; on the stock BIOS I can't go further than +165 on the core clock.

Now I will try it on my first GPU, which can reach +215 on the core clock with the stock BIOS, and see if the T4 BIOS brings any improvement.


----------



## Asus11

Quote:


> Originally Posted by *karelbastos*
> 
> Tested
> 
> For me, the ZOTAC FE bios works better than the T4 bios loaded to ZOTAC FE
> 
> The voltage here on AIR arrived max 1,13v using STRIX
> 
> But no more CLOCK advantage or performance on HEAVEN BENCH
> 
> I will wait for FE BIOS MOD... Maybe better results.
> 
> But this no difference maybe is because i have no lucky with this GPU, that stock bios i can't go futher than +165 on core clock
> 
> Now i will try on my first GPU that i can reach +215 on core clock using stock bios, and see if the T4 Bios do some improvement.


I managed to get 1.2V working via the MSI Afterburner curve: press Ctrl+F to bring it up.

Also noticing some very poor Valley scores even with 2250MHz on the core, a lot worse than 2100MHz on the FE BIOS.


----------



## TWiST2k

Quote:


> Originally Posted by *Asus11*
> 
> I managed to get 1.2v working via the msi afterburner curve.. click cntrl + F to bring it up
> 
> also noticing some very poor valley scores even with 2250mhz on the core.. alot worse than 2100 on FE bios


I was messing with the curve last night as well, but did not have much time. Mind posting a shot of your curve for reference? Thanks dude!


----------



## Derpinheimer

Interesting! I get better scores with it, but that's on an FTW. Maybe the FE BIOS has tighter memory timings? Or... someone mentioned "video clock" on the HWBOT forums?


----------



## Azazil1190

Quote:


> Originally Posted by *Asus11*
> 
> I managed to get 1.2v working via the msi afterburner curve.. click cntrl + F to bring it up
> 
> also noticing some very poor valley scores even with 2250mhz on the core.. alot worse than 2100 on FE bios


How do you do that, mate?
I want the 1.2V too.

Do I have to leave the voltage and core sliders at 0 and play only with the curve profile?


----------



## Asus11

Quote:


> Originally Posted by *Azazil1190*
> 
> How you do that mate?
> I want too 1.2
> 
> 
> 
> 
> 
> 
> 
> 
> Am I have to let the voltage slider at 0 and the core and to play only with the curve profile?




I'm going to flash back to the FE BIOS.

I even tried only 2200 to see if the clocks were just too high (I managed to get 2300 briefly), but it was still worse than 2100 on the FE.


----------



## Azazil1190

Quote:


> Originally Posted by *Asus11*
> 
> 
> 
> im going to flash back to FE
> 
> I even tried only 2200 to see if maybe it was because the clocks was too high managed to get 2300 briefly
> but still was worse than 2100 on the FE


Thanks for the help, mate, appreciate it.
Which 1080 have you got?
Because somewhere I read that the XOC BIOS works better on Asus Strix cards; I don't know if that is true.


----------



## karelbastos

On the curve it goes to 1.2V here too.

ZOTAC FE 1080.

On the curve, Ctrl-click the last point and drag it to 2150MHz.

But temps get so high.

Maybe on water we can get an FE stable at 2150-2200.

On air... it's impossible. The temp goes to 80+ and the clock drops to 2114MHz, and it's not stable.


----------



## Azazil1190

Still I can't make the 1.2V work, even with the curve. I set only the last point with Ctrl to 2150 at 1.2V; the clock goes to 2150 but the voltage stays stuck at 1.125-1.130V.


----------



## karelbastos

Quote:


> Originally Posted by *Azazil1190*
> 
> Still I cant make it work the 1.2 even with the curve ***.i set only the last point with ctrl to 2150 1.2v and only the clock goes to 2150 but the voltage staying stuck at 1.125-1.130


You selected the last points, pressed Ctrl, and dragged them to 2150-2200?

Doing this, the voltage should go to the maximum allowed.

Mine works doing this... ZOTAC FE; what is yours?


----------



## Asus11

Quote:


> Originally Posted by *karelbastos*
> 
> On curve it goes to 1.2v here too
> 
> ZOTAC FE 1080
> 
> Curve with last point CTRL and select 2150 mhz
> 
> But temps get soo high
> 
> Maybe on water we can get FE stable ate 2150 - 2200
> 
> on air... its impossible. Temp goes to 80+ and clock drop to 2114 mhz and not stable.


I'm on water and managed 2200+, and even 2300 for a while, but then it crashed.

I passed a few Valley runs at 2250-2293.

And it's still slower than 2100 on the FE BIOS.

My card is an MSI 1080 FE (makes no difference that it's MSI; all FEs come direct from Nvidia and are just repackaged).

Can anyone else confirm my finding, even with this new Strix BIOS?


----------



## karelbastos

So, above 2100MHz there's no performance increase?


----------



## Azazil1190

Quote:


> Originally Posted by *karelbastos*
> 
> You selected the last points, press controll and goes it to 2150 - 2200 ?
> 
> Doing this, your voltage goes to maximum allowed.
> 
> Mine here works doing this.. ZOTAC FE what is yours ?


I set it to 2150 and the max voltage was 1.175V, but I hit 60c and then the clock drops to 2133 at 1.130V.
The truth is that I didn't notice any improvement in scores vs my card's stock BIOS; maybe a bit, but with more heat and voltage.
My card is the Strix OC.
I'm gonna flash the stock BIOS again, because I can really reach 2126 with 1.075V, and with this BIOS my card needs more voltage and heat for those clocks.
I don't believe anyone can make a proper custom vBIOS for the 10xx series, or any Pascal chip, yet.


----------



## Whitechap3l

Quote:


> Originally Posted by *Azazil1190*
> 
> Thnx mate for the help appreciate.
> Which 1080 you got?
> Cause somwhere I read that xoc bios works better on asus strix cards I dont know if that is true.


For me it is the best BIOS... I get the most MHz AND the best score, and yeah, I am rocking a watercooled Strix (non-OC).


----------



## Azazil1190

Quote:


> Originally Posted by *Whitechap3l*
> 
> For me it is the best BIOS... I get the most Mhz AND Score and yeah i am rocking a watercooled Strix ( non OC )


Really good for you, mate.
Maybe this BIOS works better on water, I don't know.


----------



## fat4l

For me it wasn't good either. We just have to wait for a Pascal BIOS tweaker...


----------



## TWiST2k

Quote:


> Originally Posted by *fat4l*
> 
> For me it wasnt good either. We just have to wait for pascal bios tweaker..


The wait is real man, uggghhhh!!!


----------



## Raisingx

Is it normal to have artifacts on an MSI 1080 Gaming X with a memory overclock over ~250MHz?

The core clock is stable at 2126MHz with the default fan profile.


----------



## Hutzi

@Raisingx: Why should this be normal? It depends: either you are lucky or you are not.
Mine is rock-stable running at +480MHz. It seems pretty solid even at +500, but I had some minor artifacts in Witcher 3 after some hours of gameplay, so I turned it down a bit.

My GPU core is not willing to go above 2050MHz; however, I haven't touched the vcore yet.

(Palit GTX 1080 Gamerock)


----------



## stxe34

So should upping the vcore to 1.2V clear the artefacts?


----------



## Hutzi

@stxe34: Which post are you referring to?
If this is a reply to mine, then "no", because vcore refers to the GPU itself, while the VRAM (memory) has its own voltage.

It depends on where the artifacts come from: the VRAM or the GPU.
That's easy to find out by overclocking just one of them at a time: if the artifacts show up, they come from the part you have currently overclocked.

Raising the vcore to fight artifacts from the VRAM wouldn't do anything.


----------



## juniordnz

Quote:


> Originally Posted by *Raisingx*
> 
> Is it normal to have artifacts on a MSI 1080 Gaming X with the memory overclock over ~250mhz ?
> 
> Core clock is stable at 2126mhz with default fan profile.


I got a lot of artifacts with +425 mem on my Armor (essentially the same card as yours). They come in the shape of snowflakes, and sometimes the screen blanks. I get a stable 2062MHz core when the card is below 62ºC. I read something about MSI recalling some defective 1070s that were showing artifact problems even at stock settings; I don't know if the same applies to 1080s as well...


----------



## stxe34

Quote:


> Originally Posted by *Hutzi*
> 
> @stxe34: Which post do you refer to?
> If this is a reply to mine then "no", because vcore is refered to GPU itself, while the vram (memory) has it's own voltage.
> 
> Its dependent on where the artifacts come from - the VRAM or the GPU.
> Easily to find out by overclocking just one of them at the same time: if the artifacts showing up it comes form the part you have currently overclocked.
> 
> Raising the Vcore to fight artifacts from the VRAM wouldn't do it.


Mine are related to the core, as the memory OC does not affect them. That's why I'm wondering if increasing the vcore will get rid of them.


----------



## stxe34

OK, so I have noticed that after the XOC BIOS I'm stuck at 962mV no matter what. I have unlocked voltage control and set the curve via the Ctrl key to 2101MHz, but the voltage stays at 962mV?


----------



## tin0

As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.



*Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).

When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you









GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


----------



## TWiST2k

Quote:


> Originally Posted by *tin0*
> 
> As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.
> 
> 
> 
> *Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).
> 
> When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


What are these EXE files? Did you write them yourself? What do they do? Do you have the source for them?

When we have the ability to just flash ROM files, I do not understand the need for custom EXE files of unknown origin. No offence to you at all; sharing information is always appreciated, I just like to know the origin of random code I am running on my computer.

I have now looked at the PDF and see it is straight from MSI. As I said, I appreciate the upload; when I first looked at the files I was like, what the hell is this virus-looking madness, haha.


----------



## fat4l

I wonder if someone is gonna flash the Gaming Z BIOS to an FE card.

So far the stock Nvidia BIOS is the best for me. The EVGA SC is not good. The Strix OC T4 is no good either: high clocks, but scores like it's running 50MHz less. 2300MHz crashes...
The max I got was 2252MHz at 1.15V.


----------



## fjordiales

Quote:


> Originally Posted by *tin0*
> 
> As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.
> 
> 
> 
> *Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).
> 
> When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


Thanks. Been looking everywhere for this. I only found the X BIOS and used it on my regular Gaming. Will try this one after work.

Do you mind sharing where you found this bat file? When I used the X bat file I had to reinstall the drivers, but it worked great. My regular Gaming will now turn into a Z.


----------



## fat4l

Here: 2189MHz + 1.081V and 11114MHz mems.
This gives me the same results as 2202MHz + 1.094V and 11114MHz mems, so I'd rather have the lower volts.









FS - 26082 Graphics score http://www.3dmark.com/fs/9690893
FS X - 12329 Graphics score http://www.3dmark.com/3dm/13964135
FS U - 6063 Graphics score http://www.3dmark.com/fs/9691070
TS - 8294 Graphics score http://www.3dmark.com/3dm/13964425


----------



## tin0

Quote:


> Originally Posted by *fjordiales*
> 
> Thanks. Been looking everywhere for this. Only found X and used it on regular gaming. Will try this after working.
> 
> Do you mind sharing where you found this bat file? When I used the X bat file, I had to reinstall drivers though but worked great. Regular gaming will now turn to Z.


In my office







Let me know how this BIOS clocks for you!


----------



## ucode

Quote:


> Originally Posted by *TWiST2k*
> 
> The wait is real man, uggghhhh!!!


Just like waiting to win the lottery.


----------



## Bogga

Tried the Strix-bios on my strix cards... gave me nothing. Perhaps I'll try the curve once I get both cards under water in a couple of weeks


----------



## stxe34

I found out what was going on: for the curve to work, I had to do each card separately. Keep that in mind if you have SLI!


----------



## jase78

So Strix cards do have a dual-BIOS switch? I don't recall seeing it. Are both the same BIOS from stock? I want to try the new BIOS on this card. I wish we had a graph or table comparing results from different cards with this BIOS.


----------



## fjordiales

Quote:


> Originally Posted by *tin0*
> 
> In my office
> 
> 
> 
> 
> 
> 
> 
> 
> Let me know how this BIOS clocks for you!


OC mode is not stable for me. It spikes to 2012, then 2000, then the driver crashes. Gaming mode is stable so far. I do need to reinstall drivers after every BIOS flash. Well, I guess that's why it's just a regular Gaming; it didn't make the cut for 2000MHz.


----------



## cstarkey42

How are you guys getting over 1.1V? Do I need the new bios or does XOC not allow it and I need afterburner? I'm using XOC but at 100% it gets to around 1.09. I've started playing with the linear curve and seem to be getting better results with my 1080 FTW. BF4 would freeze on startup at +125 but with linear set to +50 low/+175 high I haven't had a problem yet. The power even finally broke 110%, even though it's set at 130% (via HW64).


----------



## GreedyMuffin

The XOC bios is stable for me. Been folding for a couple of days at 2139 1.05V. 100 percent stable.


----------



## juniordnz

Guess the XOC wouldn't be a smart choice to someone on air with room temps around 30ºC, would it? Guess I'm sticking with 1.093v until I can put my ftw on water.


----------



## nexxusty

Quote:


> Originally Posted by *Whitechap3l*
> 
> Yes and maybe which Bios?
> You are beast man ??
> How much vcore voltage do you have to use for hitting the 5.1?


He's a beast for buying a binned CPU?

Lol. Give praise when it's warranted..... seriously.


----------



## chronicfx

Quote:


> Originally Posted by *nexxusty*
> 
> He's a beast for buying a binned CPU?
> 
> Lol. Give praise when it's warranted..... seriously.


Takes ballz! I ran my 3570k at 5.1GHz @ 1.56v for a stint until I got some sense "several months" later and downclocked it to 5.0GHz @ 1.51v... then got some more sense "several months" after that and settled on 4.5GHz at 1.28v, and left it there until I gave it to my brother, who now runs it at stock/auto. Mind you, it never blue screened or had WHEA errors; dropping the multipliers was more like... well... what if this chip dies, then what? Anyway, I never learn, as I run my Skylake 6700k at 1.488v and 4.9GHz.







This is OC-MFn-N after all.. Come with the voltage or don't come at all!


----------



## GreedyMuffin

And here I am complaining about not going over 1.250V on my 5960X.


----------



## nexxusty

Quote:


> Originally Posted by *chronicfx*
> 
> Takes ballz! I ran my 3570k at 5.1GHz @ 1.56v for a stint until I got some sense "several months" later and downclocked it to 5.0ghz @1.51v... the got some more sense "several months" later and started to run 4.5ghz at 1.28v and left it there until I gave it to my brother which now runs it at stock auto.. Mind you it never blue screened or had whea errors, the dropping of multipliers was more like... Welll..... What if this chip dies, then what... Anyways I never learn as I run my skylake 6700k at 1.488v and 4.9ghz
> 
> 
> 
> 
> 
> 
> 
> This is OC-MFn-N after all.. Come with the voltage or don't come at all!


Type in Silicon Lottery in Google and you should understand better.

He's a good guy don't get me wrong.... he paid for those balls though. That's what Silicon Lottery is....


----------



## djtmalta

Anyone have an EVGA GTX 1080 Classified BIOS they wouldn't mind posting so I could download it? I have looked all around the internet and no one has put one up for download.


----------



## octiny

I can confirm that the new Gaming Z BIOS that was posted works on FE cards. That batch program is very fast: I pressed one key and it flashed both of my cards at the same time, then restarted.

Testing now!

Edit: Not good for FE cards. Fans are limited to 2492RPM @ 100%, and scores are worse at the same clocks versus the stock FE BIOS. XOC V2 is still the best BIOS.


----------



## fat4l

Quote:


> Originally Posted by *octiny*
> 
> I can confirm that the new Gaming Z bios that was posted works on FE cards. That batch program is very fast, pressed 1 key and it flashed both of my cards at the same time as restarted.
> 
> Testing now!
> 
> Edit: Not good for FE cards, fans are limited to 2492RPM @100% and scores are worse at same speed versus stock FE bios. XOC V2 still the best bios.


Well, for me nvidia's stock BIOS is the best. Yes, XOC T4 gives you more volts, but no real performance gains.


----------



## octiny

Quote:


> Originally Posted by *fat4l*
> 
> well for me, nvidias stock bios is the best. yes XOC T4 gives you more volts but no real performance gains..


Odd, I get 750+ more GPU points with XOC bios in SLI on Timespy. Whatever works best for you!


----------



## juniordnz

Finally got my EVGA 1080FTW today. No more MSI armor with high temps and low clocks.


Spoiler: PICS FTW








Comparison of FTW and ARMOR "out of the box" firestrike bench. 75mhz more for FTW. 2000mhz out of the box without touching anything. I'm happy with that.


----------



## Whitechap3l

Quote:


> Originally Posted by *octiny*
> 
> Odd, I get 750+ more GPU points with XOC bios in SLI on Timespy. Whatever works best for you!


Me too ?


----------



## Asus11

Quote:


> Originally Posted by *octiny*
> 
> Odd, I get 750+ more GPU points with XOC bios in SLI on Timespy. Whatever works best for you!


I might give this BIOS another shot... maybe I rushed it a bit.

What I want to know is:

the power/temp sliders are greyed out,

so what is the power limit of the XOC BIOS? It's 120 on the FE — what's it on the XOC? Don't care about temps, I'm on water.


----------



## wangle0485

Quote:


> Originally Posted by *Hutzi*
> 
> @Raisingx: Why should this be normal? It depents - either you are lucky or you are not.
> Mine is rockstable running with +480 Mhz. It seems pretty solid even with +500 but had some minor artifacts in Witcher 3 after some hours of gameplay, so I turned it down a bit.
> 
> My GPU Core is not willing to go above 2050 Mhz ; however i didnt touched the vcore yet.
> 
> (Palit GTX 1080 Gamerock)


I get exactly the same from my Super Jetstream; I wonder if it's a coincidence? Any other Palit owners on here able to verify?


----------



## juniordnz

After just a few hours with my new 1080 ftw:

Rock solid at 2114/11016 @ 1.093v. That's without much effort; I can get more out of it if I sit down for a few hours, just not in the mood today.

High room temp is keeping me from getting better results. I'm very happy with the trade I made. I could only get 2062MHz core with the MSI Armor, and not even +300 was stable on the memory.

I think I fell in love with a piece of hardware...


----------



## fat4l

Quote:


> Originally Posted by *octiny*
> 
> Odd, I get 750+ more GPU points with XOC bios in SLI on Timespy. Whatever works best for you!


Well, the reason for it might be the TDP limit.
My limits are removed by a hardmod. This BIOS removes TDP limits too, so that's probably why you see more performance: you aren't hitting TDP limits, because there are none.


----------



## Joshwaa

Quote:


> Originally Posted by *juniordnz*
> 
> After just a few hours with my new 1080 ftw:
> 
> Rock solid with 2114/11016 @1,93V. Without much effort, I can get more out of it if I sit a few hours to do it, just not in the mood today.
> 
> High room temp is keeping me from getting better results. I'm very very happy happy with the trade I made. I could get 2062mhz core and no even +300 was stable on memory with the MSI Armor.
> 
> I think I fell in love with a piece of hardware...


Glad you like the FTW. I am really happy with mine too. Just playing the waiting game on EK now.


----------



## karelbastos

Guys,

I switched my two 1080s to water today:

EVGA Hybrid closed-loop water cooler.

Temps went from 67-70°C down to 42-50°C at 100% load.

But to my surprise, my overclock stability is gone.









Before, I could run SLI stable at 2025MHz (+165 core clock, +500 on mem).

Now I can't even run +100 on the core without the driver failing.

Any idea what is happening??

Thanks!


----------



## juniordnz

Quote:


> Originally Posted by *Joshwaa*
> 
> Glad you like the FTW. I am really happy with mine to. Just the waiting game on EK now.


I'm loving it. I can play for long hours at least 6°C cooler. Always under 60°C even with the overclock, and clocks never dip below 2100MHz. Memory rock solid at 11016MHz. This card is just incredible. Can't wait to hook an H105 to it.


----------



## pantsoftime

Quote:


> Originally Posted by *karelbastos*
> 
> Any IDEIA W T F is happennnnnnnnning ??


I would check my workmanship... What thermal paste did you use? Did you pick something partially conductive? If so, get it off of the capacitors around the die on the substrate. Those caps are important for power delivery to the core.

If that's not it, make sure you didn't over-tighten the heatsink around the core. Flexing the PCB can cause some components to fail.

Make sure you didn't affect the thermal tape on your VRAM chips.

Make sure the VRMs are still being cooled properly

Etc.


----------



## nyk20z3

If only Newegg would get some Gigabyte Xtreme Gaming stock in, I could pull the trigger.


----------



## fat4l

Quote:


> Originally Posted by *karelbastos*
> 
> Guys
> 
> I have changed my two 1080 to WATER today
> 
> EVGA Hybrid Water Cooler Closed Loop
> 
> Temps going from 67 - 70 C to 42 - 50 C on 100 % load
> 
> But for my surprise, my overclock stability was gone
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Before i can run SLI stable at 2025 Mhz ( +165 core clock +500 on mem )
> 
> Now i can't run +100 on core and drivers fails
> 
> Any IDEIA W T F is happennnnnnnnning ??
> 
> Thanks ?


Your OC is not about the +xxx MHz offset, but about the FINAL MHz your GPU runs at.
When you lower the temps, GPU Boost 3.0 will boost higher on its own, so you need to add less +xxx MHz to reach the same clocks as before.
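
The offset-vs-final-clock point can be shown with a little arithmetic. This is only a toy sketch of GPU Boost 3.0 behaviour — the boost-table values and temperature cutoffs below are made up for illustration, not NVIDIA's actual bins:

```python
# Toy model of GPU Boost 3.0: the card picks its own auto boost clock
# from temperature, and the user's +offset is added on top of that.
# All numbers are hypothetical, chosen only to show the arithmetic.

def auto_boost(temp_c):
    """Hypothetical temperature -> auto boost clock table (MHz)."""
    if temp_c < 45:
        return 1925   # a cool card boosts higher on its own
    elif temp_c < 65:
        return 1900
    else:
        return 1860

def offset_needed(target_mhz, temp_c):
    """Offset you must apply to reach the same final clock."""
    return target_mhz - auto_boost(temp_c)

# Same 2025 MHz final clock, on air vs under water:
print(offset_needed(2025, 68))  # +165 needed on air
print(offset_needed(2025, 42))  # only +100 needed under water
```

With this made-up table, the same 2025MHz final clock needs +165 at air temps but only around +100 under water — close to what owners in this thread report after fitting blocks.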


----------



## TK421

Anyone's 1080 here throttle -13MHz when it hits 50°C or above?


----------



## ucode

I get a drop at 43C. How much it drops depends on the curve.







Multiple drops over a temperature range.
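
The stepped behaviour being described — losing roughly one 13MHz clock bin as each temperature threshold is crossed — can be modelled like this. The threshold values below are assumptions for illustration only; actual thresholds vary per card and BIOS:

```python
BIN_MHZ = 13            # one GPU Boost clock bin on Pascal
# Hypothetical temperature thresholds (°C); real ones vary per card.
THRESHOLDS_C = [37, 48, 59]

def thermal_clock(base_mhz, temp_c):
    """Base clock minus one bin for each threshold crossed."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return base_mhz - bins_dropped * BIN_MHZ

print(thermal_clock(2050, 35))  # below all thresholds: 2050
print(thermal_clock(2050, 50))  # two thresholds crossed: 2024
```

This matches the "multiple drops over a temperature range" observation: the curve offset changes where the bins land, but the per-threshold step stays about one bin.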


----------



## Snabeltorsk

Quote:


> Originally Posted by *djtmalta*
> 
> Anyone have a EVGA GTX 1080 Classified bios they wouldn't mind posting so I could download it? I have looked all around the internet and no one put up one for download.


 EVGAClassified1080.zip 149k .zip file


----------



## juniordnz

If I remember correctly someone already bricked his card flashing classy BIOS on it. Post back your results.


----------



## TK421

Quote:


> Originally Posted by *ucode*
> 
> I get a drop at 43C. How much it drops depends on the curve.
> 
> 
> 
> 
> 
> 
> 
> Multiple drops over a temperature range.


need bios that kill this throttle!


----------



## karelbastos

Quote:


> Originally Posted by *pantsoftime*
> 
> I would check my workmanship... What thermal paste did you use? DId you pick something partially conductive? If so, get it off of the capacitors around the die on the substrate. Those caps are important for power delivery to the core.
> 
> If that's not it, make sure you didn't over-tighten the heatsink around the core. Flexing the PCB can cause some components to fail.
> 
> Make sure you didn't affect the thermal tape on your VRAM chips.
> 
> Make sure the VRMs are still being cooled properly
> 
> Etc.


I used a Phillips screwdriver to remove the screws. What could that cause?

My other card works great; it can reach 2200 no problem at all.

The second one in the SLI pair I have no luck with — it's just not good for overclocking.

Anyway, I will check.


----------



## karelbastos

Quote:


> Originally Posted by *fat4l*
> 
> Your oc is not about +xxx MHZ,but about the FINAL MHZ your gpu runs at.
> When you lower the temps then gpu boost 3 wil lboost higher = you need to add less +xxxmhz to achieve the same clocks as before.


Hmmm,

I see it now.

Before the water cooler I needed to add +165 to reach 2025 final.

Now I only add +120 and I can reach 2025 final.

But it's not stable.

And before water cooling I could hold 2025 stable in SLI.

That's on one of my GPUs; the second one overclocks worse than the first, and it drags the first one down in SLI.









But with the water cooler, shouldn't I be reaching the same clocks as on air, or more (stable)?

Not less, I think.

Thanks


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> If I remember correctly someone already bricked his card flashing classy BIOS on it. Post back your results.


I flashed the classy to my FTW and it would not post, but I just switched to my slave bios and rebooted and reflashed the master and all was well.


----------



## djtmalta

Thank you Snabeltorsk


----------



## djtmalta

Quote:


> Originally Posted by *Snabeltorsk*
> 
> EVGAClassified1080.zip 149k .zip file


Thank you! Much appreciated!


----------



## SweWiking

I ended up testing the new Strix OCX T4 BIOS, and like the first Strix OCX, my FE overclocked around 50MHz higher at 1.2v compared to 1.080v on the FE BIOS.

But the performance is worse across the board: lower scores in Firestrike/Time Spy/Valley and GTA V, with the GPU running 50MHz higher and the memory at the same speed as on the FE BIOS.

For me at least the original FE BIOS is still the best, and from the looks of it I don't think there will be a BIOS editor for Pascal at all.


----------



## toncij

Any 1080 SLI owners here? How's scaling?


----------



## Whitechap3l

Quote:


> Originally Posted by *SweWiking*
> 
> I ended up testing the new strix ocx t4 bios, and like the first strix ocx my FE overclocked around 50mhz higher at 1,2v compared to the FE 1,080v.
> 
> But the preformance is worse across the board, less score in firestrike/time spy/valley and gta5, with the gpus running 50mhz higher and the mem at the same speed as on the FE bios.
> 
> For me at least the FE original bios is still the best, and from the looks of it i dont think there will be a bios editor at all for pascal.


What Card are you using ?


----------



## SweWiking

Quote:


> Originally Posted by *Whitechap3l*
> 
> What Card are you using ?


I got the evga 1080 FE


----------



## djtmalta

I can confirm that the EVGA GTX 1080 Classified BIOS, when flashed onto my EVGA FTW Edition GTX 1080, bricked the card. I had to switch the BIOS switch to Master just to boot up, then switch over to Slave to reflash the original FTW slave BIOS. All is good now.


----------



## Whitechap3l

Quote:


> Originally Posted by *SweWiking*
> 
> I got the evga 1080 FE


It seems like the Strix works better with the no-TDP Strix BIOS, while the FE just gets better scores with the stock BIOS.


----------



## GreedyMuffin

Yeah, I'm getting a lower score with the stock BIOS. But I get some throttling in BF4 with that BIOS.

2139 is stable on stock voltage, but will throttle due to the TDP limit.


----------



## Hutzi

Yesterday I tried to give my Palit 1080 GameRock a little more juice, since there was plenty of headroom in temperature and power limit. I'm a bit confused, because there was no gain at all.

2050 was the limit without touching the voltage.
2050 is still the limit even with +100% vcore applied.

The only things that changed were power consumption (jumped from about 100% to 110%-ish) and temperature (rose from a solid 68-70°C under full load to 77-79°C). Anyone else seeing this?


----------



## wangle0485

Quote:


> Originally Posted by *Hutzi*
> 
> Yesterday i've just tried to give my Palit 1080 GameRock a little more juice, since there was plenty of room when it comes to temperatur and powerlimit. Thereby I'm a bit confused because there is no gain at all.
> 
> 2050 was the limit without touching the voltage.
> 2050 is still the limit even with +100% vcore applied.
> 
> The only think changed due to the increasement was power consumption (jumped from about 100% to 110%'isch) and temperatur (raised from a solid 68-70°C under full load to 77-79°C). Anyone else with such observation?


I experience the same behaviour with my super jetstream, but at lower temperatures. With fans at 80% and 2 case fans pointed at the cooler I get 50-55c at 2050mhz with 104% tdp at stock voltage or 115%tdp at +100% voltage.

Even with the strix OC bios I cannot push clocks any higher, even with up to 1.2v


----------



## juniordnz

I don't get why the BIOS thread had to be closed. We're all grown-ups here (at least that's what I expect LOL); we paid for our cards and have the right to do whatever we want with them. There were several warnings across the whole thread about the risks of cross flashing. People were aware of the risks and still decided to take them. Now we need to be babysat and told what we can and cannot do with the cards we paid for? C'mon guys, we're better than that. Let the boys have their fun, brick their cards, pull out their hair and struggle to get them working again... it's all part of the game, and everyone knows the risks. All that closing the thread did was bring the discussion here.

Also, wasn't it fun when you said a BIOS would brick the card, another user confirmed it, and yet, after all that, someone asked for said BIOS, flashed it, and reported back an hour later that it had bricked his card?


----------



## Snabeltorsk

Just reflash when it bricks. I just plugged my monitor into the onboard HDMI and reflashed when I tried the Galaxy BIOS. Just LAME to close that thread.


----------



## Hutzi

Quote:


> Originally Posted by *wangle0485*
> 
> I experience the same behaviour with my super jetstream, but at lower temperatures. With fans at 80% and 2 case fans pointed at the cooler I get 50-55c at 2050mhz with 104% tdp at stock voltage or 115%tdp at +100% voltage.


Okay, that's strange. It looks like there is a barrier that doesn't come from the chip itself.
I haven't touched the fans, since the card didn't reach its temperature limit — I don't care about a bit more throttling for now, because it's getting watercooled soon anyway.

I had hoped a different BIOS would fix the problem, so it surprises me that you saw the same issue with one.


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> I don't get why the BIOS thread had to be closed.


Because it wasn't going the way the OP had intended and I guess he didn't want to be associated with bricks. Seems fair enough to me.

Any poster can start a new VBIOS thread, hopefully someone with at least half the dedication towards it.


----------



## juniordnz

Quote:


> Originally Posted by *ucode*
> 
> Because it wasn't going the way the OP had intended and I guess he didn't want to be associated with bricks. Seems fair enough to me.
> 
> Any poster can start a new VBIOS thread, hopefully someone with at least half the dedication towards it.


If I create a thread named "Let's cross flash, peeps" it doesn't make me responsible for anyone else's cards but mine. Again, we don't need to be babysat. All it takes is a "FLASH AT YOUR OWN RISK" written on the first page. It still makes no sense to me.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> I don't get why the BIOS thread had to be closed. We're all grown up here (at least that's what I expect LOL), paid for our cards and have the right to do whatever we want with them. There were several warnings across the whole thread about the risks of cross flashing. People were aware of the risks and yet they decided to take it. Now we need to be babysit and told what we can and cannot do with the cards we paid for? Cmon guys, we're better than that. Let the boys have their fun, brick their cards, pull out their hair and struggle to get it working again...it's all part of the game and everyone is aware about the risks. All that closing that thread did was to bring that discussion here.
> 
> Also, wasn't it fun when you said that a BIOS would brick the card, another user confirmed and yet, after all that, someone asks for said BIOS, flash it and report back an hour later that the BIOS bricked his card?


I agree, but I guess it was not the OP's original intention for the thread. You should start one and we can start over from the beginning with a nice first page holding all the good info.


----------



## wangle0485

I also tried the GameRock and GameRock Premium BIOSes (biosis??) with no difference in what was attainable.


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> I agree, but I guess is was not the OPs original intention for the thread. You should start one and we can start over from the beginning and have a nice first page with all the good info


Currently I don't have the time for that. Also, I'm not that into cross flashing myself, but I have nothing against those who want to venture into those waters...

Finally got my FTW yesterday. Got 2000MHz out of the box, and rock solid 2100/11016 in games with +100 voltage, 120% TDP and a 100% fan profile. Card stays at 56°C most of the time.

Could you share what yours did out of the box and what you get overclocking it? I can squeeze more out of mine, I just haven't had the time to do so yet. But I'm very happy with it. Got 25k graphics in Firestrike, something I could never get with the Armor (with stable 24/7 clocks).


----------



## TK421

I fully shorted two of the three shunt resistors on the Zotac AMP 1080 and half-shorted the third one. Now the reported power consumption is very low and I don't get any kind of power throttling!

Now if only this card didn't throttle -13MHz every time I break 50°C lol...
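
For anyone curious why the shunt mod works: the card's power controller estimates current from the voltage drop across shunt resistors of a known (assumed) resistance, so lowering the real resistance makes the same current report as less power. A back-of-envelope sketch with illustrative values — the 5mΩ shunt and 12V rail are assumptions, not Zotac's actual parts:

```python
# The controller measures the voltage drop across a shunt it assumes
# has resistance r_assumed, then computes I = v_drop / r_assumed.
# Conductive material in parallel lowers the real resistance, so the
# same real current produces a smaller drop and a smaller reported power.

def reported_power(real_current_a, rail_v, r_real, r_assumed):
    v_drop = real_current_a * r_real        # what is physically measured
    inferred_current = v_drop / r_assumed   # what the controller infers
    return inferred_current * rail_v

R = 0.005  # 5 mOhm shunt (order-of-magnitude assumption)
print(reported_power(15, 12, R, R))      # unmodified: reads the true 180 W
print(reported_power(15, 12, R / 2, R))  # shunt halved: reads only 90 W
```

Worth noting the real power draw is unchanged — only the reading drops, which is why the card stops power throttling.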


----------



## karelbastos

Tested some more here.

And I really don't know why, after water cooling the two Zotac GTX 1080 FEs, I can't reach the same clock speed as before (2025 stable),

and can't reach the same performance in benchmarks.

Lower temps should give me better results, but that's not what's happening.









Do any of you guys have an explanation?

Thanks

The most I can get stable now is 2000MHz, with about 150 fewer points in benchmarks like Heaven.

BEFORE: 67-70°C — NOW: 45-50°C


----------



## GanGstaOne

Guys with the Kraken G10 water cooling: what cooling do you use for the VRAM chips?


----------



## gerbil80

Quote:


> Originally Posted by *karelbastos*
> 
> Tested more here
> 
> And i really don't know why after WATER COOLED the two ZOTAC GTX 1080 FE i can't reach the same clock speed as before 2025 stable
> 
> And can't reach the same performance on benchmarks.
> 
> For me, lowe the temps will give me better results, but this not happen to me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any of you guys, have a explanation ?
> 
> Thanks
> 
> Mas i arrive nor stable is 2000mhz and 150 less points on benchmarks like HEAVEN.
> 
> BEFORE 67-70 C - NOW 45 - 50 C


This does not read right at all... My advice would be to uninstall any relevant drivers/software and start again.


----------



## KillerBee33

Quote:


> Originally Posted by *TK421*
> 
> I fully shorted two of the three resistor on the Zotac AMP 1080, half-shorted the 3rd one. Now the power consumption is very low and I don't get any kind of power throttling!
> 
> Now if this card doesn't throttle -13mhz every time I break 50c lol...


How's that Hybrid kit treating 1080? Have you seen 60's?


----------



## karelbastos

Quote:


> Originally Posted by *gerbil80*
> 
> This does not read right at all... My advice would be to uninstall any relevant drivers/software and start again.


I will try reinstalling all drivers to check if anything changes.

.........................................

EDIT

I remember now, it's not a driver issue, because I have 2x 1080 in SLI. Used alone, one card overclocks great with stable results, but the second one doesn't.

But before water, the second one was stable at 2025MHz at least. Now it can't run stable above 2000.

Any idea? Maybe this GPU works better at hot temps? LOL


----------



## juniordnz

Quote:


> Originally Posted by *GanGstaOne*
> 
> guys with Kraken G10 water cooling what cooling do you use for the vram chips


If the card does not have a heatplate covering all the VRAM chips like the Gaming X or FTW, you should buy individual copper heatsinks and put one on each VRAM chip. Cold air blown directly onto them also helps a lot!


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> If the card does not have a heatplate covering all the vram chips like Gaming X or FTW, you should buy individual copper heatsinks and use on all vram chips. Cold air blowed directly onto them also helps a lot!


Exactly what I did, but I still wanted to check, because I'm seeing a lot of people using the G10 with no VRAM heatsinks at all.


----------



## nexxusty

Quote:


> Originally Posted by *karelbastos*
> 
> I have used a Philips screwdriver to remove the screws. What this can cause ?
> 
> But my other vga works great, and i can reach 2200 no problem at all
> 
> The second one on sli i have no lucky, its not good for overclock.
> 
> Any way i will try to check


Wrong post to reply to.

fat4l's post is the one you should be listening to.

Quote:


> Originally Posted by *karelbastos*
> 
> I will try to reinstall all drivers to check if anything change.
> 
> .........................................
> 
> EDIT
> 
> I remember now, its not a drivers case, because i have 2 SLI 1080. When use one alone works great for OC and stable results. But second one not.
> 
> But before Water the second one works stable at last 2025 mhz. Now can't run stable above 2000.
> 
> Any idea ? Maybe this GPU works better on hot temps ? LOL


Why the hell is your 1080 so dirty?

That's disgusting.


----------



## karelbastos

It's normal on air —

just some dust on the backplate.

LOL

^^

My PC is in contact with air, not sitting in a vacuum chamber.


----------



## TK421

Quote:


> Originally Posted by *KillerBee33*
> 
> How's that Hybrid kit treating 1080? Have you seen 60's?


It doesn't fit on the Zotac AMP. But even if it did, it would be very difficult to mount, since there are no heatplate holes to guide the cooler legs into line with the screw holes. Now I'm using the AMP cooler itself, but with two Gentle Typhoons and liquid metal thermal paste.

I believe there's also a plastic cover around the die to protect it, which might also be causing fitment issues with the hybrid cooler.

2-phase VRAM, by the way, and the VRAM chips themselves don't get covered if you use the hybrid cooler.

What copper heatsinks would OCN members suggest getting?


----------



## KillerBee33

Quote:


> Originally Posted by *TK421*
> 
> Doesn't fit on the zotac amp, But even if it does, it's very difficult to mount since there's no heatplate holes to guide the cooler legs to be in line with the screw holes. Now I'm using the AMP cooler itself but with two gentle typhoon and liquid metal thermal paste.
> 
> I believe there's also a plastic cover around the die to protect it, might also be causing fitment issues with the hybird cooler.
> 
> 2 phase vram, and the vram chips itself don't get covered btw if you use the hybrid cooler.
> 
> What copper heatsink would ocn members suggest getting?


What about the FE, or this: http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look — wasn't that you?


----------



## Dr.GumbyM.D.

derp


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *karelbastos*
> 
> Tested more here
> 
> And i really don't know why after WATER COOLED the two ZOTAC GTX 1080 FE i can't reach the same clock speed as before 2025 stable
> 
> And can't reach the same performance on benchmarks.
> 
> For me, lowe the temps will give me better results, but this not happen to me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any of you guys, have a explanation ?
> 
> Thanks
> 
> Mas i arrive nor stable is 2000mhz and 150 less points on benchmarks like HEAVEN.
> 
> BEFORE 67-70 C - NOW 45 - 50 C


I'm getting the same here.

I was able to run +152 for a final speed of ~2025-2050MHz on air, with the FE card running at about 65°C and the EVGA SC at about 55°C.

After 2x EK blocks, temps are much lower, both hovering around 40-50°C at full load (I don't have a super efficient loop; I prefer silence and looks, so so-so performance (delta T) is fine for me). I haven't tried squeezing it yet, but from roughly adjusting settings in AB I was able to get +100 to run, and it would clock to about 2000MHz.

I did notice though that the clocks didn't vary AT ALL, whereas when I was running the cards on air, even at stock clocks, the clocks were constantly bouncing all over the place during gaming or Firestrike Ultra. Once they were put on water, the clock went right to 2000MHz and sat there until whatever I was running was 100% done (benchmark, gaming, etc.). I chalk it up to GPU Boost 3.0 working as intended. I think people (myself very much included) need to better understand how GPU Boost 3.0 works, how it differs from prior versions, and what it means for squeezing performance out of the cards (not necessarily higher clocks).


----------



## GreedyMuffin

On air, anything over 2100 would fail. Under water, 2139 is 100 percent stable. Been folding for a few days now, 24/7, no issues. (At least not yet!)


----------



## cj0612

So I just ordered the EVGA Hybrid AIO cooler for my 1080. Currently I can't seem to put more than +145 or so on the clock, and the boost ranges from 1980-2025MHz. From a few of the previous comments I've read, it sounds like the water cooling should keep that boost clock up in the 2000s, correct? On air I normally hit 65-70°C, and there does seem to be some flutter, or bounce, which looks like thermal throttling to me even though it isn't at 82°C. Anyone able to confirm this?

Also, I am thinking about trying the liquid metal TDP mod, as my card does seem to hit its TDP when playing Overwatch, and I think that is what was causing my previous OCs to fail. It's weird, though, as these overclocks didn't fail before I upgraded to Windows 10 plus a new motherboard and processor.


----------



## nexxusty

Quote:


> Originally Posted by *karelbastos*
> 
> Its normal on AIR
> 
> Just some dust on the backplate
> 
> LOL
> 
> ^^
> 
> My pc is in contact with AIR, not on a vacuum chamber


It's normal without filters maybe.... 

All intakes should be filtered. On anything. In existence.


----------



## toncij

I have pretty good cards on me atm, FTW models that boost to 2114 if cooled correctly (open case, 85%+ fan). Memory goes up to 5500 without a blink. Now, what tears me up is: TXP vs dual 1080s, considering how bad SLI support has been lately...


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> I have pretty good cards on me atm, FTW models that boost to 2114 if cooled correctly (open case, 85%+ fan). Memory goes up to 5500 without a blink. Now, what tears me up is - TXP vs dual 1080s considering how bad SLI is lately...


jUST dO iT !








nO rEGRETS!


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *toncij*
> 
> I have pretty good cards on me atm, FTW models that boost to 2114 if cooled correctly (open case, 85%+ fan). Memory goes up to 5500 without a blink. Now, what tears me up is - TXP vs dual 1080s considering how bad SLI is lately...


Yea, that last question is driving me nuts too. I could never justify a single $1,200 video card, but at the same time, here I am with $1,400 worth of video cards, which I'm 100% certain I said I'd never do.

Between SLI and TXP though, for the epeen of SLI on water with those fancy EK bridge blocks, I stuck with SLI. I'm very disappointed in the limited application of SLI though. I expect to only see it matter once Battlefield 1 comes out, since I haven't played BF4 in years, The Division has terrible SLI support (negative scaling, even with the popular tweaks), and I haven't played GTA V yet, but I guess I may see it there as well.

I've been gaming at 4K for 2 years now, and was actually pretty happy with my overclocked aftermarket 290. So far, the biggest difference for me between the old (2.5-3 year old) rig and the new one is that the new rig is silent after watercooling. The performance increase is underwhelming, particularly in my current main game, The Division, particularly at 4K (was at 45 FPS with most settings on medium, now at 50 FPS with most settings on high; still can't touch ultra because SLI doesn't work).

So I chose the epeen and look (and maybe performance, though not in anything I play) of SLI over the practical use of a TXP. I'd feel bad, but OCN is the land of diminishing returns.


----------



## TK421

Quote:


> Originally Posted by *KillerBee33*
> 
> What about the FE, or this "http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look"? Wasn't that you?


It's me, but I don't have my FE anymore; I exchanged it for this one, which is cheaper.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *KillerBee33*
> 
> jUST dO iT !
> 
> 
> 
> 
> 
> 
> 
> 
> nO rEGRETS!


----------



## KillerBee33

Quote:


> Originally Posted by *Dr.GumbyM.D.*


iNDEED


----------



## KillerBee33

Quote:


> Originally Posted by *TK421*
> 
> it's me, but I don't have my FE anymore, exchanged for this which is cheaper


So what was the absolute top for a 1080 on that Hybrid? The highest I've seen on a 980 was 61°C with room temperature over 80.


----------



## juniordnz

Quote:


> Originally Posted by *toncij*
> 
> I have pretty good cards on me atm, FTW models that boost to 2114 if cooled correctly (open case, 85%+ fan). Memory goes up to 5500 without a blink. Now, what tears me up is - TXP vs dual 1080s considering how bad SLI is lately...


Same here. FTW with 2114/5500 without even trying. That's on air with an open case and fans at 100%. What could we expect from putting it under water?


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> jUST dO iT !
> 
> 
> 
> 
> 
> 
> 
> 
> nO rEGRETS!


You did it?








Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> Yea, that last question is driving me nuts too. I could never justify a single $1,200 video card, but at the same time, here I am with $1,400 worth of video cards, which I'm 100% certain I say I'd never do.
> 
> Between SLI and TXP though, for the epeen of SLI on water with those fancy EK bridge blocks, I stuck with SLI. I'm very disappointed in the limited application of SLI though. I expect to only see it matter once Battlefield 1 comes out, since I haven't played BF4 in years, The Division has terrible SLI support (negative scaling, even with the popular tweaks), and I haven't played GTA V yet, but I guess I may see it there as well.
> 
> I've been gaming in [email protected] for 2 years now, and was actually pretty happy with my aftermarket 290 overclocked. So far, the biggest difference for me between the old (2.5-3 year old) rig and new one is that the new rig is silent after watercooling. The performance increase is underwhelming, particularly in my current main game, The Division, particularly at 4K (was at 45 FPS with most settings on medium, now at 50FPS with most settings on high, still can't touch ultra because SLI doesn't work).
> 
> So I chose epeen and look (and maybe performance, but not of anything I play) of SLI over practical use of TXP. I'd feel bad, but OCN is the land of diminishing returns


Yes, price-wise, 1080 SLI at roughly 50% scaling ends up costing about as much extra as a TXP. The only problem I see is that the TXP isn't sold here; I'd need to actually smuggle it in. So, you chose SLI intentionally?
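To put that trade-off in numbers, here's a tiny cost-per-performance sketch. The prices (~$700 per 1080, $1,200 for a TXP, figures mentioned in this thread) and the relative-performance guesses (50% SLI scaling, ~30% for the TXP over a single 1080) are rough assumptions for illustration, not benchmarks.

```python
# Rough dollars-per-unit-of-performance for the SLI-vs-TXP debate above.
def cost_per_perf(price, relative_perf):
    """Dollars per unit of relative performance (single 1080 = 1.0)."""
    return price / relative_perf

single_1080 = cost_per_perf(700, 1.00)   # baseline
sli_1080    = cost_per_perf(1400, 1.50)  # two cards, ~50% scaling assumed
titan_xp    = cost_per_perf(1200, 1.30)  # ~30% faster, assumed

for name, value in [("1080", single_1080), ("1080 SLI", sli_1080), ("TXP", titan_xp)]:
    print(f"{name:9s} ${value:.0f} per perf unit")
```

Under these assumptions the SLI pair and the TXP land within about 1% of each other in cost per unit of performance, which is why the choice mostly comes down to game support rather than price.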


----------



## toncij

Quote:


> Originally Posted by *juniordnz*
> 
> Same here. FTW with 2114/5500 without even trying. That's on air with open case and fans 100%. What could we expect from putting it under water?:


I guess something like 2150-2160, but what still bothers me is the TXP battle.







And the SLI problem really annoys the hell out of me.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> You did it?


http://www.3dmark.com/3dm/13985049
Just a start: first driver and beta OC software, and still no BIOS tools.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.3dmark.com/3dm/13985049
> Just to start : first Driver and Beta OC Sof. & still no BIOS Tools.


Can you do an Ultra?


----------



## DStealth

Will play tomorrow with my new toy...actually with the curve...not bad for stock cooler
Palit [email protected]/11058 + 5280k


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Can you do an Ultra?


When i get home.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *toncij*
> 
> You did it?
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, price wise, 1080 SLI is, when scaling at about 50% exactly as much more expensive than a TXP. The only problem I see is that TXP isn't sold here. I need to actually smuggle it..So, you chose SLI intentionally?


I did choose SLI intentionally. I haven't ever had SLI before (I had done Crossfire with 6950s; those were the good ol' days). I wanted to try out the new EK parallel connector block with their full cover waterblocks. I did not realize that SLI was so poorly supported by most games out there (including recent/current AAA titles). For me, this was more an exercise of building an awesome rig, something I've never been able to afford to do. Performance, practicality, and budget were secondary.

If I were concerned about performance, practicality, and budget, I probably would have bought one AIB 1080 and been done with it. The returns diminish very very quickly after that (and even then, they're pretty low).


----------



## GreedyMuffin

At 2139 core and +545 on memory I only get a 24K graphics score, with no throttling. Does anyone have an idea why I'm getting lower scores compared to others? Perhaps I need to reinstall Windows?


----------



## karelbastos

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> I'm getting the same here.
> 
> I was able to run +152 for an final speed of ~2025-2050mhz when on air with the FE card running at about 65c and the EVGA SC at about 55c.
> 
> After 2xEK blocks, temps are much lower, both hovering around 40-50c at full load (I don't have a super efficient loop, I prefer silence and look, so-so performance (delta T) is fine for me) I haven't tried squeezing it yet, but from roughly adjusting settings in AB, I was able to get +100 to run, and it would clock to about 2000mhz.
> 
> I did notice though that the clocks didn't vary AT ALL, whereas when I was running the cards on air, even at stock clocks, the clocks were constantly bouncing all over the place during gaming or Firestrike Ultra. Once they were put on water, the clock went right to 2000mhz and sat until whatever I was running was 100% done (benchmark, gaming, etc.). I chalk it up to GPU Boost 3.0 working as intended. I think people (myself very much included) need to better understand how GPU Boost 3.0 works, how it works differently than prior versions, and what it means for squeezing performance out of the cards (not necessarily higher clocks).


The problem for me is that I'm getting a lower score in the Heaven benchmark on water ^^

Like 5820 on air (2025 MHz) and 5680 on water (2000 MHz).

When I use just a single 1080, my first card, I can reach 2100+ MHz.

But my second 1080 has no luck and doesn't allow a big overclock.

So in SLI the top clock will be the lower one, from the second card

^^

Maybe I will sell this second card to someone who doesn't want to overclock, and buy another to try my luck.

Quote:


> Originally Posted by *nexxusty*
> 
> It's normal without filters maybe....
> 
> All intakes should be filtered. On anything. In existence.


My case doesn't have filters on the side fans.


----------



## nexxusty

Quote:


> Originally Posted by *karelbastos*
> 
> The problem for me is that i'm getting less score on bench HEAVEN with water ^^
> 
> Like 5820 on AIR ( 2025 mhz ) and 5680 on water ( 2000 mhz )
> 
> When i try just use single 1080, my first card, i can reach more 2100+ mhz
> 
> But my second 1080 i have no lucky and he do not allow me a big overclock.
> 
> So when sli the high clock will be the smaller one, from second card
> 
> ^^
> 
> Maybe i will sell this second card for someone that do not want to overclock and buy other to try LUCKY.
> My case do not have filter on the SIDE FANS


Maaaaaan! Get on that.

I'll keep making fun of you.


----------



## KillerBee33

Quote:


> Originally Posted by *GreedyMuffin*
> 
> On 2139 core, 545+ on mem i only get 24K graphic score, no throttling. Does anyone have an idea why I'm getting lower scores compared to others? Perhaps I need to reinstall windows?


Lower your memory to a flat +500 and voltage to stock. Just try it.




You want to try to get it up to 2150MHz for 25000.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *karelbastos*
> 
> The problem for me is that i'm getting less score on bench HEAVEN with water ^^
> 
> Like 5820 on AIR ( 2025 mhz ) and 5680 on water ( 2000 mhz )
> 
> When i try just use single 1080, my first card, i can reach more 2100+ mhz
> 
> But my second 1080 i have no lucky and he do not allow me a big overclock.
> 
> So when sli the high clock will be the smaller one, from second card
> 
> ^^
> 
> Maybe i will sell this second card for someone that do not want to overclock and buy other to try LUCKY.
> My case do not have filter on the SIDE FANS


As my friend keeps reminding me/rubbing in my face, if you really want to do SLI right, you buy a handful of cards, bin them for the two highest-clocking cards, make sure they still work well together, and then assemble the system, selling the rest (or returning them and potentially paying the restocking fee). If you're not doing that, you're going to get stuck with the lower-performing card as your cap. Which I'm fine with, because I'm still getting excellent performance and dead silence, but if you're chasing clocks, then I can see where you'd get disappointed.


----------



## toncij

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> I did choose SLI intentionally. I haven't ever had SLI before (had done Crossfire with 6950s, those were the good ol days). I wanted to try out the new EK parallel connector block with their full cover waterblocks. I did not realize that SLI was so poorly supported by most games out there (including recent/current AAA titles). For me, this was more an exercise of building an awesome rig, something I've never been able to afford to do. Performance, practicality, and budget were secondary.
> 
> If I were concerned about performance, practicality, and budget, I probably would have bought one AIB 1080 and been done with it. The returns diminish very very quickly after that (and even then, they're pretty low).


I don't see a solo 1080 as a solution, since I own a Titan X as prior experience. A TX at 1.5 is almost the same as a 1080 at 1.8, which is, well, low. Only at 2GHz+ do I get something out of a single 1080.
Regarding games, I play Overwatch, The Division, Battlefront, and plan on BF1. Only The Division lacks support; Civ 6 will support multi-GPU and I expect other games will too... But then, StarCraft 2 doesn't support it...

What I see as a problem: you can't actually buy a TXP anywhere back here in Europe, Croatia in particular. There's no NV shop for it, or I don't know of any... I'd need to smuggle it in...


----------



## GreedyMuffin

Voltage is stock. My card is a pretty good overclocker, I believe.

Will test memory at +500 when I get home from Sweden on Friday!


----------



## KillerBee33

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Voltage is stock. My card is pretty good OCing i believe.
> 
> Will test mem at 500+ when I get home from sweden on friday!


With the FE I noticed higher performance at stock voltage and memory no more than +500.
http://www.3dmark.com/3dm/13638712


----------



## karelbastos

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> As my friend keeps reminding me/rubbing in my face, if you really want to do SLI right, you buy a handful of cards, bin them for the two highest clocking cards, make sure they still work well together, and then assemble the system, selling the non-highest cards (or returning them and potentially paying the restocking fee). If you're not doing that, you're going to get stuck with the lower of the performing cards being your cap. Which I'm fine with, because I'm still getting excellent performance and dead silence, but if you're chasing clocks, then I can see where you'd get disappointed.


I don't care about clocks unless they give me a performance increase.

Like in The Witcher 3: I can play at 4K 60+ FPS without problems using SLI 1080s at stock settings.

But if I can sell the low-clocking card and buy another to try the lottery, why not? ^^

Meanwhile I'll keep playing with this capped SLI.




Sorry for my English, ppl.


----------



## Asus11

anyone noticed lower performance when using the curve overclocking?


----------



## KillerBee33

Quote:


> Originally Posted by *Asus11*
> 
> anyone noticed lower performance when using the curve overclocking?


Yes, even with constant highest voltage: a much higher recorded clock with much lower scores.


----------



## GreedyMuffin

Quote:


> Originally Posted by *KillerBee33*
> 
> With the FE i noticed higher Performance on stock Voltage and Memory no more than + 500
> http://www.3dmark.com/3dm/13638712


Will test it ASAP.

I should at least get 25K? My CPU is no slouch. Only running 4.5GHz daily though.


----------



## KillerBee33

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Will test it ASAP.
> 
> I should at least get 25K? My CPU is no slouch. Only running 4500 daily though.


Lowered mine to 4.6 & 1.32V daily. It's in the upper 90s with 90% humidity here; my AC at home sounds like it's under water.


----------



## juniordnz

Quote:


> Originally Posted by *Asus11*
> 
> anyone noticed lower performance when using the curve overclocking?


Offset worked much much better for me. Had no luck with the curve.


----------



## LolCakeLazors

Great news from EVGA! My 1080 FTW just arrived in CA, and they approved me for a brand new 1080 instead of a refurbished one. Can't wait until the new one arrives here.


----------



## nyk20z3

The lack of non-reference 1080 stock is rather annoying at this point. How long do you guys think this will continue?


----------



## LolCakeLazors

There's plenty in stock. Just stalk https://www.nowinstock.net/computers/videocards/nvidia/gtx1080/


----------



## shibumi

My local Microcenter just got 10+ Classifieds in stock today.


----------



## Menthol

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.3dmark.com/3dm/13985049
> Just to start : first Driver and Beta OC Sof. & still no BIOS Tools.


Very good graphics score, right up there with the best of them


----------



## nyk20z3

Quote:


> Originally Posted by *shibumi*
> 
> My local Microcenter just got 10+ Classifieds in stock today.


I only use newegg for the most part.

I have a MC 10 minutes from me, but why would I pay tax on a $750 GPU, ya know.


----------



## KillerBee33

Quote:


> Originally Posted by *Menthol*
> 
> Very good graphics score, right up there with the best of them


Got an Ultra FS run in the top 3, if just for today or so.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.3dmark.com/3dm/13985049
> Just to start : first Driver and Beta OC Sof. & still no BIOS Tools.


NOOOOICE


----------



## Sazexa

Some benchmark results on two air-cooled Founders Edition GTX 1080s with a 6950X, all at stock clocks. I'll redo them once I get my watercooling loop set up, as there was definitely some throttling.


----------



## juniordnz

Switching to the FTW slave BIOS with 130% TDP got rid of all the vRel throttling I was getting. Now I'm rock solid at 2114MHz with no perfcap, but that's all I can get on air, I guess; 2126MHz gave me some minor artifacts. Very happy with the results though. I'm even reconsidering whether it's worth putting it under water, since the cooler works so well that thermal throttling happens much, much less than with the Armor version.

I couldn't get +300MHz on vram stable with the Armor. I can get a solid +500 with the FTW, and +525 showed no artifacts at all, but performance decreased. Could it be that +500 is something like a sweet spot for these Micron chips?
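One likely explanation for "higher offset, lower score" is that GDDR5 and GDDR5X error detection silently retransmits corrupted transfers, so past a point extra clock costs throughput without visible artifacts. If that's what's happening, it's worth tuning memory by score rather than by clock. A minimal sketch of that sweep; `run_benchmark` is a hypothetical stand-in for launching your benchmark and reading back its graphics score.

```python
# Pick the memory offset with the highest benchmark score, not the
# highest stable clock. run_benchmark: offset (MHz) -> score.
def best_offset(offsets, run_benchmark):
    results = {off: run_benchmark(off) for off in offsets}
    return max(results, key=results.get), results

# Example with made-up scores mirroring the +500 sweet spot reported above:
fake_scores = {400: 24800, 450: 24900, 500: 25000, 525: 24850, 550: 24700}
best, all_results = best_offset(fake_scores, fake_scores.get)
print(best)  # 500
```

In practice each `run_benchmark` call would be a full Fire Strike or Heaven pass, so a coarse sweep (steps of 50) followed by a fine one around the peak keeps the total run count manageable.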


----------



## karelbastos

Quote:


> Originally Posted by *juniordnz*
> 
> Switching to FTW slave BIOS with 130% TDP got rid of all the vRel throttle I was getting. Now I'm rock solid at 2114mhz with no perfcap. But that's all I can get on air I guess. 2126mhz got me some minor artifacts. Very happy with the results though. I'm even reconsidering if it's worth putting it under water since the cooler works so well that thermal throttle happens much much less than with Armor version.
> 
> I couldn't get +300mhz on vram stable with Armor. I can get solid +500 with FTW, 525 showed no artifacts at all, but performance decreased. Could be that 500 is something like a sweetspot for these micron chips?


VRAM here: +500 OK, +550 no performance increase.


----------



## kx11

Steady 2060MHz on both cards playing Crysis 3 at 4K/85Hz; it goes up to 2122 sometimes but never lower than 2020MHz. +500 mem.


----------



## kx11

A Zotac factory tour: how do they make your GPUs?!




Apparently they use ASUS mobos.


----------



## Whitechap3l

Quote:


> Originally Posted by *Asus11*
> 
> anyone noticed lower performance when using the curve overclocking?


It is so strange... As I said earlier, my Strix goes ham with that no-TDP BIOS. I tried the FE, FTW and Zotac BIOSes, and with none of them could I hit 25.5k+ scores in Firestrike. Only the Strix T4 gives me a solid 700ish score boost, with more MHz as well.
The curve gives me roughly a 300 to 400 point boost; with the FE BIOS it was even more.









This card drives me kind of crazy, with so many unknown factors.


----------



## ucode

For me the curve is better, as it allows more voltage; otherwise it seems much the same.


----------



## Thetbrett

Quote:


> Originally Posted by *kx11*
> 
> a Zotac Factory Tour , how do they make your GPUs ?!
> 
> 
> 
> 
> 
> 
> apparently they use ASUS mobos


Nice vid. I have an AMP coming on Friday; interesting to see where it came from.


----------



## supermi

Quote:


> Originally Posted by *kx11*
> 
> steady 2060mhz on both cards playing Crysis3 @4k/85hz , it goes up to 2122 sometimes but never lower than 2020mhz , 500+mem


What 4k monitor do you have running at 85hz?


----------



## max883

Without Thermal Grizzly Kryonaut.

With Thermal Grizzly Kryonaut the temp is the same, but fan speed maxed out at 40%.

Testing Doom 4 now.


----------



## Konstantink

Hello everybody!

I did some tests with HB/normal sli bridge yesterday:

*6850k (16x3.0 + 16x3.0) + HB sli bridge:*
FSU (10 516) - http://www.3dmark.com/fs/9708412
FSE (17 019) - http://www.3dmark.com/fs/9708804
TS (11 606) - http://www.3dmark.com/spy/244950
*6850k (16x3.0 + 16x3.0) + Normal sli bridge:*
FSU (10 457) - http://www.3dmark.com/fs/9708764
FSE (17 004) - http://www.3dmark.com/fs/9708569
TS (11 592) - http://www.3dmark.com/spy/244745
*6700k (8x3.0 + 8x3.0) + HB sli bridge:*
FSU (10 100) - http://www.3dmark.com/fs/9558907
TS (10 750) - http://www.3dmark.com/spy/175150

Unfortunately I didn't have two separate soft bridges to test as well. But as you can see from the tests above, the benefit from the HB bridge is minimal!

Test config:
I7 6850K (4.2 - 1.2)
I7 6700k (4.7 - 1.3)
Sli MSI 1080 gaming X (2100 - 11000)


Spoiler: Photo
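Putting a number on "minimal": the relative gains from the 6850K scores above work out to well under one percent in every test. The comparison is just a percentage delta (scores copied from the post):

```python
# HB vs. normal bridge, percentage gain per test (6850K scores from above).
scores = {
    "FSU": (10516, 10457),  # (HB bridge, normal bridge)
    "FSE": (17019, 17004),
    "TS":  (11606, 11592),
}

gains = {t: (hb - normal) / normal * 100 for t, (hb, normal) in scores.items()}
for test, gain in gains.items():
    print(f"{test}: +{gain:.2f}%")  # every delta is under 0.6%
```

Deltas that small are within run-to-run variance for 3DMark, so these overall scores alone can't distinguish the bridges; frametime logs (discussed a few posts down) are the more sensitive measurement.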


----------



## toncij

Quote:


> Originally Posted by *max883*
> 
> 
> 
> 
> 
> 
> WItout thermal grizzly Kryonaut
> 
> With thermal Grizzly Kryonaut Temp is the same but fann speed maxed out at 40%
> 
> Testing Doom 4 now.


How much did you gain in effect? If you set a fixed fan speed of, say, 50%, how much does the temperature differ?


----------



## kx11

Quote:


> Originally Posted by *supermi*
> 
> What 4k monitor do you have running at 85hz?


I don't, actually; I just created a custom resolution in NVCP and played with that.


----------



## nexxusty

Quote:


> Originally Posted by *Konstantink*
> 
> Hello everybody!
> 
> I did some tests with HB/normal sli bridge yesterday:
> 
> *6850k (16x3.0 + 16x3.0) + HB sli bridge:*
> FSU (10 516) - http://www.3dmark.com/fs/9708412
> FSE (17 019) - http://www.3dmark.com/fs/9708804
> TS (11 606) - http://www.3dmark.com/spy/244950
> *6850k (16x3.0 + 16x3.0) + Normal sli bridge:*
> FSU (10 457) - http://www.3dmark.com/fs/9708764
> FSE (17 004) - http://www.3dmark.com/fs/9708569
> TS (11 592) - http://www.3dmark.com/spy/244745
> *6700k (8x3.0 + 8x3.0) + HB sli bridge:*
> FSU (10 100) - http://www.3dmark.com/fs/9558907
> TS (10 750) - http://www.3dmark.com/spy/175150
> 
> Unfortunately i didn't have two separate soft bridges to test it as well. But as you can see from the tests above, the profit from HB bridge is minimal!
> 
> Test config:
> I7 6850K (4.2 - 1.2)
> I7 6700k (4.7 - 1.3)
> Sli MSI 1080 gaming X (2100 - 11000)
> 
> 
> Spoiler: Photo


It's been proven frametimes are better with the HB SLi bridge.


----------



## tin0

Quote:


> Originally Posted by *juniordnz*
> 
> Switching to FTW slave BIOS with 130% TDP got rid of all the vRel throttle I was getting. Now I'm rock solid at 2114mhz with no perfcap. But that's all I can get on air I guess. 2126mhz got me some minor artifacts. Very happy with the results though. I'm even reconsidering if it's worth putting it under water since the cooler works so well that thermal throttle happens much much less than with Armor version.
> 
> I couldn't get +300mhz on vram stable with Armor. I can get solid +500 with FTW, 525 showed no artifacts at all, but performance decreased. Could be that 500 is something like a sweetspot for these micron chips?


Seems we've got the same GPU lottery, as my Armor performs exactly the same: 2114MHz / +500MHz memory (1350MHz).


----------



## Konstantink

Quote:


> Originally Posted by *nexxusty*
> 
> It's been proven frametimes are better with the HB SLi bridge.


Could you specify who has proven it? I just tested it yesterday on the latest NVIDIA drivers, and framerates were nearly the same with a single soft SLI bridge and the HB bridge in 4K and 2K =)


----------



## toncij

Quote:


> Originally Posted by *Konstantink*
> 
> Could you specify who had proven it? I just tested it yesterday on the latest Nvidia drivers and framerates where nearly same with soft sli bridge and HB bridge in 4k and 2k=)


You need to demand more bandwidth than a soft bridge provides to actually see a change, and as far as I can tell, 4K still doesn't exceed it. Dual soft bridges should be the very same as an HB bridge; it's just wiring.
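A back-of-envelope check on the demand side of that guess: in two-way AFR the second card ships every other finished frame across the bridge. The 32-bit framebuffer and 60Hz figures below are assumptions for illustration, not measurements.

```python
# Estimate the SLI-bridge traffic that 2-way AFR generates at 4K.
width, height, bytes_per_pixel, fps = 3840, 2160, 4, 60

frame_bytes = width * height * bytes_per_pixel   # one finished frame
traffic_gbs = frame_bytes * (fps / 2) / 1e9      # second card sends every other frame

print(f"frame size:     {frame_bytes / 1e6:.1f} MB")   # ~33.2 MB
print(f"bridge traffic: {traffic_gbs:.2f} GB/s")       # ~1.0 GB/s at 4K60
```

Whether ~1 GB/s saturates a single soft bridge depends on the bridge's actual link rate, which NVIDIA doesn't publish precisely; the sketch only shows what the resolution and refresh rate ask of it.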


----------



## justinyou

Quote:


> Originally Posted by *tin0*
> 
> Seems we've got the same GPU lottery as my ARMOR performs exactly the same, 2114MHz / +500MHz memory (1350MHz)


Are you using the Gaming Z BIOS on your Armor card now?


----------



## pez

Quote:


> Originally Posted by *Konstantink*
> 
> Could you specify who had proven it? I just tested it yesterday on the latest Nvidia drivers and framerates where nearly same with single soft sli bridge and HB bridge in 4k and 2k=)


Framerates =/= frametimes.

You should be able to retest and log using MSI AB, as you can monitor and log frametimes there.
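If you log with AB's hardware monitor and export to CSV, the comparison can be scripted. A minimal sketch, assuming a CSV with a `Frametime` column in milliseconds; the column name and file names below are hypothetical, so adapt them to your actual log format.

```python
# Compare frametimes (not framerates) from two logged runs.
import csv
import statistics

def frametime_stats(path, column="Frametime"):
    """Return (mean, 99th-percentile) of frametimes in milliseconds."""
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]
    times.sort()
    p99 = times[int(len(times) * 0.99) - 1]   # simple 99th-percentile pick
    return statistics.mean(times), p99

# Usage (hypothetical files):
# print(frametime_stats("hb_bridge.csv"))
# print(frametime_stats("soft_bridge.csv"))
```

The 99th percentile is the interesting number for the HB-vs-soft-bridge debate: two runs can have identical average framerates while one has noticeably worse frametime spikes.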


----------



## juniordnz

Quote:


> Originally Posted by *max883*
> 
> 
> 
> 
> 
> 
> WItout thermal grizzly Kryonaut
> 
> With thermal Grizzly Kryonaut Temp is the same but fann speed maxed out at 40%
> 
> Testing Doom 4 now.


Temp throttling must be heavy with those temps, no?
Quote:


> Originally Posted by *tin0*
> 
> Seems we've got the same GPU lottery as my ARMOR performs exactly the same, 2114MHz / +500MHz memory (1350MHz)


+500 on memory results in more than 1350MHz, doesn't it? I'm getting a total of 11016MHz effective memory clock.

Anyway, I had no luck with the Armor I got. And since it was a "not so official purchase", the 3-year warranty I got with EVGA was the game changer for me.

Luck of the draw. The 1080 Armor is a great budget option, though, especially if you plan to put it under water.


----------



## fat4l

Quote:


> Originally Posted by *Whitechap3l*
> 
> It is so strange... As I said earlier : my strix goes ham with that no tdp Bios.. I tried FE, FTW and Zotac BIOS and with non I could hit 25.5k+ scores in Firestrike. Just the Strix t4 gives me a solid 700ish score boost with more mHz as well
> And curve gives me a roughly 300 to 400 Point boost with FE Bios it was even more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This Card drives me kind of crazy with so much unknown factors


You get more performance because of no TDP limits, not because of this BIOS...
What you have to test is unlocked-TDP FE/Zotac etc. vs unlocked-TDP Strix T4; then you will see if there's a difference. But obviously we don't have an unlocked-TDP FE BIOS.

My card has unlocked TDP (FE) since I did the hard mod, and this BIOS is no good for me. For some people it might actually be better because, as I said, it unlocks the TDP limits, so you will get more performance than with the FE BIOS.

Makes sense, right?


----------



## boredgunner

Quote:


> Originally Posted by *tin0*
> 
> Seems we've got the same GPU lottery as my ARMOR performs exactly the same, 2114MHz / +500MHz memory (1350MHz)


I also have an Armor. I thought my +500 was stable, but then I tried the 3DMark Time Spy stress test; now I'm at +400. The Fire Strike Ultra stress test is even worse.

Mine gets way too hot to hit 2114MHz. 3DMark Fire Strike Extreme and Ultra, and Time Spy, bring it up to a steady 91°C due to the temp limit. Does yours get this hot in those tests?


----------



## tin0

I've got a fixed fan speed of 80% when gaming, where it's perfectly capable of keeping the card cool and never bothers me in terms of noise (it's still pretty silent); same noise level as my case fans.
At this speed the card's max temp is 76 degrees Celsius.


----------



## TK421

Quote:


> Originally Posted by *tin0*
> 
> I've got a fixed fan speed of 80% when gaming, where it's perfectly capable of keeping the card cool and never bothers me in terms of noise (it's still pretty silent), same noise level of my case fans.
> At this speed the card's max. temp is 76degress celsius.


76°C is still pretty high though; is this with the stock thermal paste?


----------



## juniordnz

Quote:


> Originally Posted by *boredgunner*
> 
> I also have an ARMOR. I thought my +500 was stable but then I tried 3DMark Time Spy stress test. Now I'm at +400. Fire Strike Ultra stress test is even worse.
> 
> Mine gets way too hot to hit 2114 MHz. 3DMark Fire Strike Extreme and Ultra, and Time Spy, bring it up to a stable 91c due to temp limit. Does yours get this hot in those tests?


I could never hit anything like that with mine. +425 seemed stable, but all it took was a Fire Strike Ultra stress test and it crashed in less than 30 seconds. Tomb Raider had red blinking dots like Christmas lights too. I couldn't even get +300 stable. The core clock would temp throttle like crazy here: I could get 2088 max and it would drop to 2025, and that's with 100% fan speed, an open case, and a huge fan blowing air on it.

At least you get some good clocks and will benefit from watercooling it.
Quote:


> Originally Posted by *tin0*
> 
> I've got a fixed fan speed of 80% when gaming, where it's perfectly capable of keeping the card cool and never bothers me in terms of noise (it's still pretty silent), same noise level of my case fans.
> At this speed the card's max. temp is 76degress celsius.


How much does it throttle from the starting clock?


----------



## Dr Mad

Hello,

Is it possible to successfully fit the original backplate with the EK waterblock?

I own the new Titan, but the EK backplate isn't expected to be available until the very end of the month,
so I plan to use the original backplate instead.

Thanks


----------



## boredgunner

Quote:


> Originally Posted by *tin0*
> 
> I've got a fixed fan speed of 80% when gaming, where it's perfectly capable of keeping the card cool and never bothers me in terms of noise (it's still pretty silent), same noise level of my case fans.
> At this speed the card's max. temp is 76degress celsius.


Depending on the game I'll get similar temps, mid 70s with 80% fan speed. Some games only push it to the low 60s, and Cryostasis: Sleep of Reason pushes it to low 80s.

For those wondering, in the low 80s the core clock alternates between 2025 MHz and 2050 MHz, while in the 60s and 70s it's usually 2063 MHz.


----------



## Whitechap3l

Quote:


> Originally Posted by *fat4l*
> 
> you get more performance cuz of no TDP limits not cuz of this bios....
> What you have to test is, unlocked TDP FE/zotac etc vs unlocked TDP strix T4. Thenyou will see if theres a difference. But obviously we dont have unlocked TDP FE.
> 
> My card has unlocked TDP(FE) as I did the hard mod, and this bios is no good for me. For some ppl it might actually be cuz as I said, it unlocks TDP limits so you will get more performance than with FE bios.
> 
> Makes sense right ?


Totally agreed.
Perhaps the Strix T4 is not the best BIOS, but it is (so far) the only one without a TDP limit, and it seems some cards benefit more from it than others.

Yeah, hard modding is too risky for me personally.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Dr Mad*
> 
> Hello,
> 
> Is it possible to fit original backplate with EK waterblock with success?
> 
> I own the new Titan but EK backplate is expected to be available at the very end of the month.
> So I plan to use the original backplate instead.
> 
> Thanks


Yep. Running it myself. I have a few pics of it.


----------



## nexxusty

Quote:


> Originally Posted by *Konstantink*
> 
> Could you specify who had proven it? I just tested it yesterday on the latest Nvidia drivers and framerates where nearly same with single soft sli bridge and HB bridge in 4k and 2k=)


Did I say Framerate?









Frametimes.

Major difference. I seem to remember The Witcher 3 benefitting HEAVILY from the HB SLi bridge.

Like big time.


----------



## TK421

The hard mod is to short the resistor, right?


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yep. Running it myself. I have a few pics of it.


I didn't use the stock backplate for my FE card, but would it work to just use the original double set of screws (one set holding the PCB to the cooler, one holding the backplate to the PCB, all in one assembly) instead of the screws EK includes? On my EVGA SC (FE PCB) I used the original EVGA screws into the EK block and it fit fine (and was Phillips instead of the astoundingly annoying double-screw bolts on the FE card). I put the EVGA card on top with the backplate, and on the lower card you can't tell that it doesn't have a backplate; I'm not noticing any sag at all on the cards.


----------



## juniordnz

Has anyone tried the famous 1.2V XOC BIOS on the FTW? How did it go temp-wise?

Also, should we consider the vicious thermal throttling that occurs on Pascal a warning that this architecture may be more sensitive to heat than Maxwell? I'm trying to put a number on the "max acceptable temperature" for these cards.


----------



## GreedyMuffin

Yeah. You can see where I used which screws.










Spoiler: Warning: Spoiler!


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yeah. You can see where I used what scews.


Perfect. Thanks for confirming. And no thanks to NVIDIA for using such ridiculous screws that I don't have a socket anywhere that fits those awful things; if you aren't careful you can easily strip them with pliers or break off components if using an adjustable wrench. I get why they did it, but EVGA didn't have to resort to such ridiculousness when they designed their backplate. But I guess the NV one is sleeker...


----------



## GreedyMuffin

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> Perfect. Thanks for confirming. And no thanks, to NVIDIA for using such ridiculous screws that I don't have a socket anywhere that fits those awful things, and if you aren't careful you can easily strip them with pliars or break off components if using an adjustable wrench. I get why they did it, but EVGA didn't have to resort to such ridiculousness when they designed their backplate. But I guess the NV one is sleeker...


I used a socket when removing them. No stress at all. Easy peasy!









I think I used a 4 mm socket. I would never have installed a WB (i.e. removed the cooler/FE backplate) if I had to use pliers or the like.

People who don't have the correct tool should wait until they get it.


----------



## fat4l

Quote:


> Originally Posted by *TK421*
> 
> the hard mod is to short the resistor right?


-
yes


----------



## fat4l

I just did tests comparing curve vs. offset in AB.
No difference in points.
Tested with FS X.


----------



## Dr Mad

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Yeah. You can see where I used what scews.


This is perfect indeed.









I think I'll get this tool to make the mounting more secure:

https://www.ekwb.com/shop/hex-socket-4mm


----------



## GanGstaOne

Does anyone know how to apply the power mod to the Gigabyte 1080 G1 Gaming?


----------



## Menthol

The smallest one in this set works great

https://www.amazon.com/gp/product/B000BQ4XPQ/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1


----------



## TK421

Quote:


> Originally Posted by *fat4l*
> 
> -
> yes


Ok nice.

I completely shorted the left/middle shunt resistor on my 1080 AMP! and partially shorted the rightmost one.

Now the reported power draw is extremely low, letting me run 1.093 V 24/7!
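For context on why shorting those resistors "lowers" power draw: they are shunt resistors the VRM controller uses to sense current, so reducing their effective resistance makes the card under-report its own consumption. A rough sketch of the arithmetic (all values hypothetical, not from any 1080 datasheet):

```python
def reported_power_w(actual_current_a, rail_v, r_assumed_ohm, r_effective_ohm):
    """Power the controller *thinks* is drawn when the shunt is modded.

    The controller measures the voltage drop across the shunt and divides
    by the resistance it assumes is there. Lowering the effective
    resistance (shorting or stacking the shunt) shrinks the measured
    drop, so reported current and power shrink with it.
    """
    v_drop = actual_current_a * r_effective_ohm   # real voltage across the shunt
    i_reported = v_drop / r_assumed_ohm           # controller's current estimate
    return i_reported * rail_v

# Hypothetical numbers: 20 A on the 12 V rail, 5 mOhm stock shunt,
# modded down to an effective 2.5 mOhm.
actual_w = 20 * 12                                   # 240 W really drawn
seen_w = reported_power_w(20, 12, 0.005, 0.0025)     # 120 W reported
```

Which is why a fully shorted shunt reports near-zero draw and the power limiter never kicks in; the obvious catch is that the real draw is now invisible to every protection mechanism on the card.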


----------



## juniordnz

Quote:


> Originally Posted by *TK421*
> 
> Ok nice.
> 
> I completely shorted the left/middle shunt resistor on my 1080 AMP! and partially shorted the rightmost one.
> 
> Now the reported power draw is extremely low, letting me run 1.093 V 24/7!




No, seriously...there must be a catch to it. It's not possible that something so simple yields so many benefits without a huge downside. Also, why wouldn't it already be like that by design?


----------



## Snabeltorsk

Quote:


> Originally Posted by *juniordnz*
> 
> 
> 
> No, seriously...there must be a catch to it. It's not possible that something so simple yields so many benefits without a huge downside. Also, why wouldn't it already be like that by design?


Because the power consumption would be higher than intended.


----------



## juniordnz

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Because the power consumption would be higher than intended.


So they cap the card to maintain low power consumption, even though the GPU they designed can benefit a lot from more power?

That seems like designing a Ferrari engine and then limiting it so it doesn't use too much fuel.


----------



## looniam

^ marketing.


----------



## juniordnz

Anyone did the hard mod on the FTW?


----------



## Derpinheimer

Quote:


> Originally Posted by *juniordnz*
> 
> Anyone did the hard mod on the FTW?


Not sure it would achieve much, since we already have a 130% TDP limit. 120% can be hit in some cases, but 130% is enough to almost never be a barrier.


----------



## juniordnz

Quote:


> Originally Posted by *Derpinheimer*
> 
> Not sure it would achieve much since we already have a 130% TDP limit. 120% can be hit in some cases but 130% is enough to almost never be a barrier.


I had the impression it would also help to keep the voltage stable. With 130% I still get vRel perfcap.


----------



## ikjadoon

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Because the power consumption would be higher than intended.


Quote:


> Originally Posted by *juniordnz*
> 
> So they cap the card to maintain low power consumption, even though the GPU they designed can benefit a lot from more power?
> 
> That seems like designing a Ferrari engine and then limiting it so it doesn't use too much fuel.


Well, I'm just gonna throw this out there. About 15 people will reply to me and say that their overvolted card has had no issues for the past 3 years while mining 24/7....OK, sure, that's possible (no sarcasm).
Quote:


> According to Nvidia Senior PR Manager Bryan Del Rizzo, overvolting is supported "up to a limit," in order to "protect the life of the product." Del Rizzo claims *Nvidia won't stop graphics card makers who want to overvolt their products wildly or want to provide users that freedom via voltage controls*. However, doing so disqualifies products from receiving warranty support from Nvidia. Add-in board makers are free to provide their own warranty coverage, of course.


Quote:


> MSI's GeForce GTX 680 Lightning Edition card reportedly offered users too much leeway to tweak voltages and had to be scaled back to comply. Del Rizzo notes that *MSI chose warranty coverage over extreme overvolting support*, just as EVGA appears to have done with its Classified card.


Now, someone will say, "but what about CPUs? Intel doesn't sell a 4.5GHz i7-6700K!" *But*, Intel also has not locked overvolting like NVIDIA (and interestingly, AMD, too!). You can go right ahead and ram 2.0V through your i7-6700K, ain't nobody gonna care except maybe your wallet. If Intel wanted...they could've locked voltage support down (like they later "forced" motherboard manufacturers to put out BIOS updates to disable non-K overclocking).

However, NVIDIA *and* almost all their AIB partners decided to stick with that warranty instead of the overvolting. Why?

I'm not saying this is true (GPUs are more prone to voltage-degradation than, say, CPUs)--I'd actually love to have a discussion about this. Who has something to contribute, _besides_ anecdotes?

I mean, why would _both_ NVIDIA and AMD lock down both voltages, _even_ on their high-end cards?


----------



## Snabeltorsk

Quote:


> Originally Posted by *ikjadoon*
> 
> Well, I'm just gonna throw this out there. About 15 people will reply to me and say that their overvolted card has had no issues for the past 3 years while mining 24/7....OK, sure, that's possible (no sarcasm).
> 
> Now, someone will say, "but what about CPUs? Intel doesn't sell a 4.5GHz i7-6700K!" *But*, Intel also has not locked overvolting like NVIDIA (and interestingly, AMD, too!). You can go right ahead and ram 2.0V through your i7-6700K, ain't nobody gonna care except maybe your wallet. If Intel wanted...they could've locked voltage support down (like they later "forced" motherboard manufacturers to put out BIOS updates to disable non-K overclocking).
> 
> However, NVIDIA *and* almost all their AIB partners decided to stick with that warranty instead of the overvolting. Why?
> 
> I'm not saying this is true (GPUs are more prone to voltage-degradation than, say, CPUs)--I'd actually love to have a discussion about this. Who has something to contribute, _besides_ anecdotes?
> 
> I mean, why would _both_ NVIDIA and AMD lock down both voltages, _even_ on their high-end cards?


So you need to buy two cards to get the performance you want.


----------



## juniordnz

By overvolting you mean going over the stock 1093 mV, right? Like that 1200 mV XOC BIOS.

I would guess it's about the lithography. Maybe these new GPUs are more fragile and therefore can't stand as much heat and voltage. (So they locked the voltage and made the thermal throttle much more sensitive.)

But I'm nobody, and that's just a guess.


----------



## Derpinheimer

Quote:


> Originally Posted by *juniordnz*
> 
> Had the impression it would also help to keep voltage stable. With 130% I still get vRel perfcap.


Hm, I just can't get it. I might be CPU-limited (i7 3820 @ 4.75 GHz).
Max I could get in Time Spy or BF4 or Heaven was a short spike to 114%.
GPU is at 2164 / 1.093V


----------



## DStealth

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hm, I just cant get it. I might be CPU limited (i7 3820 @ 4.75)
> Max I could get in Time Spy or BF4 or Heaven was a short spike to 114%.
> GPU is at 2164 / 1.093V


These values depend on chip quality, leakage, and temperature. Not all cards are the same.


----------



## ucode

Quote:


> Originally Posted by *ikjadoon*
> 
> If Intel wanted...they could've locked voltage support down (like they later "forced" motherboard manufacturers to put out BIOS updates to disable non-K overclocking).


My old C2D was voltage-locked too. I got around that with a hardware mod, and the same can be done on a GTX 1080 if one wants to go that way.

As it is, it has already been mentioned that the voltage limit is supposedly 1.25 V, and the curve values would seem to confirm this. We have seen two VBIOSes that enable 1.2 V, which is more than the "standard" 1.093 V we usually see.

Power limits exist for both CPUs and GPUs, as you are well aware, especially with your laptop exploits. As for the unlocked non-K Skylake saga: that loophole required disabling power management, which meant temperature control relied solely on the catastrophic shutdown temperature of 125-130C, as one of the consequences of unlocking BCLK. Temperature plays a big role in semiconductor life expectancy, and you will usually see de-rating factors for high temps.

Intel does not endorse operating their CPUs out of spec, although they do offer a one-time deal for K and X desktop processors at extra cost. AFAIK there isn't one for the i7-6820HK, which is BGA and so not a simple replacement. GPUs are also BGA.

If you would like to operate your GTX 1080 out of spec and void the warranty, there is nothing stopping you; guides have already been shown.


----------



## IronAge

Quote:


> Originally Posted by *juniordnz*
> 
> Switching to FTW slave BIOS with 130% TDP got rid of all the vRel throttle I was getting.


May i beg you to upload this Bios to a file hoster (zippyshare) and post the link ? thx in advance.


----------



## max883

I'm using the Windows 10 Anniversary Update, and after 5-10 minutes the Virtual Disk Service Manager starts and my GTX 1080 starts to make coil whine! Never had coil whine before!

It uses 100% GPU!!!!

Edit: in this folder: C:\Users\max883\AppData\Local\TileDataLayer\MSSvc there was a Blake-256 miner that was messing with my GTX 1080, making it use 100% GPU. Deleted it and now all is OK!


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> By overvolting you mean going over the stock 1093mv right? Like that 1200mv XOC bios.
> 
> I would guess it's about the litografy. Maybe these new GPUs are more fragile and therefore can't stand too much heat and voltage. (So they locked the voltage and made thermal throttle much more sensitive)
> 
> But I'm no one and that's just anecdotal


You aren't nobody, man. We're FTW!
Quote:


> Originally Posted by *IronAge*
> 
> May i beg you to upload this Bios to a file hoster (zippyshare) and post the link ? thx in advance.


I posted it within the last week or so.

1080FTW_BIOS2_130_PWR_LIMIT.csv 251k .csv file


Rename it to .rom and spend some time reading, man. I had to read through this whole beast and the other BIOS thread that got closed to get to where I am now.


----------



## juniordnz

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hm, I just cant get it. I might be CPU limited (i7 3820 @ 4.75)
> Max I could get in Time Spy or BF4 or Heaven was a short spike to 114%.
> GPU is at 2164 / 1.093V


By vRel I mean I get the blue line in GPU-Z and one clock bin down, from 2114 to 2101 MHz, in some applications. It doesn't mean I'm hitting 130%.
BF4 is very CPU-bound and Heaven is light. Try something like a full pass of the Fire Strike Ultra stress test (about 11 minutes) and then look at the GPU-Z statistics.

In most applications, though, I get no perfcap at all and a rock-steady 2114 MHz @ 1.093 V. (As long as temps don't go over 53ºC.)
Quote:


> Originally Posted by *TWiST2k*
> 
> You aren't nobody, man. We're FTW!


Indeed


----------



## 5150 Joker

Sold my two Hybrid Titan Xs and grabbed a single Zotac AMP! Edition 1080 as a holdover card until the 1080 Ti shows up in a few months (hopefully). The Zotac does a pretty great job of overclocking: without any extra voltage it hits 2050 MHz with power at 120% and a temp target of 92C, with the fan set at 82% (which is still very quiet). The max temperature I've seen in Fire Strike is about 67C, and in Heaven it hit about 74C, which isn't too bad considering how hot it is where I live (Arizona).

3DMark compare between my old OC'd Titan X (single) vs Zotac 1080: http://www.3dmark.com/compare/fs/6521300/fs/9730932


----------



## IronAge

Quote:


> Originally Posted by *TWiST2k*
> 
> You aren't nobody, man. We're FTW!
> I posted it in the last week or so.
> 
> 1080FTW_BIOS2_130_PWR_LIMIT.csv 251k .csv file
> 
> 
> Rename to .rom and spend some time reading man, I had to read thru this whole beast and the other BIOS thread that got closed, to get to where I am now.


Thanks a lot, man ... yeah, it's really a beast, and I am short on time and still waiting for my Gigabyte G1 to arrive.

So I will try this one on a comrade's FE and hopefully not brick it.









Hopefully we will get our hands on a Pascal Bios Tweaker soon.


----------



## juniordnz

Let us all come together in prayer for the Pascal Bios Tweaker.

Amen!


----------



## GreedyMuffin

Amen!


----------



## LeviathanVI

More confused now than when I started.









For a custom loop would it be better to just get 2 reference FEs or 2 partner cards?


----------



## IronAge

Used FEs are rather inexpensive, they're shunt-moddable, and it's easier to get full-cover waterblocks for FEs ... IMHO, FEs are a no-brainer.


----------



## IronAge

Quote:


> Originally Posted by *juniordnz*
> 
> Let us all come together in prayer for the Pascal Bios Tweaker.
> 
> Amen!


Amen!


----------



## LeviathanVI

Quote:


> Originally Posted by *IronAge*
> 
> Used FEs are rather inexpensive, shunt mod-ability + it is easier to get fc waterblocks for FEs ... IMHO FEs = no brainer.


I thought as much. Thank you!

Edit: Any idea where I can get two new ones?


----------



## Hutzi

In my opinion it's not a no-brainer.
You pay less, but if you sell the card one day you will also get less back.
I was an AMD guy for 15 years, and I don't think it's different for NVIDIA when I tell you that reference cards are more likely to coil whine (R9 290X, oh god...).
Also, reference 1080s have just one 8-pin connector and therefore less OC potential, even though that doesn't seem to be the limiter at the moment.

By the way, you can get waterblocks for partner cards as well, and you can also power-mod the partner cards ... wait ... there is no need to do that.


----------



## LeviathanVI

Hm. Well, I don't think I'll be able to get my hands on a reference FE right now, let alone two of them. So I'll probably have to go for a partner card anyway.
Right now I'm leaning towards the EVGA Founders Edition, because I've heard good things about EVGA.


----------



## IronAge

I have seen quite a few early adopters selling FEs, even with waterblocks, moving from single/SLI 1080 to the Pascal Titan.

Selling them for 10-20% less than what they paid.









Of course there are non-FEs on the FE/reference PCB with different blower coolers ... for instance the MSI GTX 1080 Aero OC and the Asus TURBO-GTX1080-8G.

You probably don't want to spend extra money on custom coolers/PCBs when you're putting full-cover waterblocks on them.

EVGA service is unbeaten, and it's a no-brainer if you can get two used EVGA FEs, since there is no trouble with a guest RMA as a 2nd, 3rd, etc. owner.


----------



## KillerBee33

Quote:


> Originally Posted by *IronAge*
> 
> Amen!


Not sure we're gonna get one; looks like Boost 3.0 is what we're gonna get. Think about it: 40% boost over stock right from the factory.


----------



## IronAge

But where's the fun when there's no fiddling with the BIOS values?


----------



## juniordnz

I HAVE A DREAM! That one day we will be able to edit our BIOS the way we want.

But, srsly now, I'm hoping it becomes possible. I don't see why it wouldn't be. We already have BIOSes allowing 1.2 V...

Even though I'm pretty happy with a rock-solid 2114 MHz core / 11016 MHz memory @ 1.093 V, 130% TDP, never reaching 60ºC on air in a hot country.


----------



## KillerBee33

Quote:


> Originally Posted by *IronAge*
> 
> But where is the fun when there is no fiddling with the Bios values ?


There isn't one. But even if one shows up it won't be as fun; you're already getting +40%, and at most we'll get roughly +15% on top of that.
I think they tried to squeeze most of that extra juice out at the factory with Boost 3.0.


----------



## kx11

And I'm sitting here with my HOF 1080s; can't mod the BIOS or anything like that.


----------



## cstarkey42

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hm, I just cant get it. I might be CPU limited (i7 3820 @ 4.75)
> Max I could get in Time Spy or BF4 or Heaven was a short spike to 114%.
> GPU is at 2164 / 1.093V


I've got the same setup and am getting the same results with my FTW using the 130% BIOS. One thing I finally realized, several years and four GTXes later, is that you have to run the NVIDIA-made exe, found in the googlesphere, in order to open the 3820 up to PCIe 3.0. I'm not sure it will make much of a difference for TDP or benchmarking, but it feels good knowing I finally have it activated.


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> and i'm here sitting with my HOF 1080s , can't mod the bios or anything like that


Why would you? I've seen the GHOF 9-series BIOS and it's rock solid; more than sure they did the same thing with the 10 series. A slight OC wouldn't hurt, of course, but power and voltage on the HOF should be solid.


----------



## IronAge

Very likely he will be getting freezes after running 3D with MSI AB or EVGA Precision X, even just when browsing the internet.

That has been reported by HOF/Classified owners in Germany.

The Classified has the same issues, since the HOF and Classified are supposed to use the same IR VRM controller.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Why would you? I've seen GHOF 9 series BIOS and it's rock solid, more than sure they did the same thing with the 10s, slight OC wouldn't hurt ofcorse but Power and Voltage on HOF should be solid


I'm talking about pushing the voltage higher than 1.095 V, which seems to be the maximum. Core-clock-wise I can see it play at a steady 2050+ MHz for a while, not dropping below 2020 MHz, with memory at 5500 MHz.

Solid cards for sure.


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> i'm talking about pushing the voltage to run higher than 1.095v which seems to be the highest , core clock wise i can see play @ steady 2050+ for a while and not dropping below 2020mhz + 5500mhz
> 
> solid cards for sure


Boosting voltage on the 10 series seems to cripple your OC. I posted two results a while back; if it's the case for the FE, I can only assume it's the case for the whole series, but I might be wrong.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Boosting voltage on the 10 series seems to cripple your OC. I posted two results a while back; if it's the case for the FE, I can only assume it's the case for the whole series, but I might be wrong.


Tested that with both the Armor and the FTW. Both got the best results with +100 voltage and max TDP. I could get the same clock with less voltage, but with occasional drops due to vRel. With +100 voltage and 130% TDP, there's no vRel in most applications.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Tested that with both armor and ftw. Both got the best results with +100 voltage and max TDP. Could get the same clock with less voltage, but with occasional drops due to vRel. With +100 voltage and 130% TDP, there's no vRel in most applications.


I tried the Gaming X OC BIOS on mine and got bad results. Have you tried the FE BIOS with everything stock? Reason I'm asking: a few people posted better OC results with the FE BIOS.
Btw this was my last run on the 1080, at the stock 1.06 V; adding voltage to 1.09 V would result in under 25000.
http://www.3dmark.com/3dm/13638712


----------



## IronAge

@LeviathanVI

You might also want to check out those ones ... pretty good price for a custom PCB with 8+2 Phase VRM Design.

http://it-supplier.co.uk/gigabyte-geforce-gtx-1080-windforce-oc-8gb-gv-n1080wf3oc-8gd

FC waterblock + backplates available from EK:

https://www.ekwb.com/configurator/step1_complist?gpu_gpus=2110


----------



## xer0h0ur

Quote:


> Originally Posted by *Hutzi*
> 
> In my opinion its not a no brainer.
> You pay less, but if you are going to sell the card one day you will also get less back.
> I used to be an AMD Guy (for 15 Years) and i dont think it's different for nvidia when i tell you that refrence cards are more likely to coil whine (R9 290X oh god...)
> Also the reference 1080's just have 1x 8-PIN and therefore less OC-Potential, even tho this does not seem to be the limiter at the moment.
> 
> Btw you get watercoolers for partnercards aswell and you can also powermod the partnercards ... wait... there is no need to do that.


My reference 290X doesn't have coil whine, my 295X2 had coil whine and my GTX 1080 has more coil whine than the 295X2. At least its not Fury X banshee levels of screaming though. Those cards if you had coil whine, gud lawd, rest in pepperonis.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> I tried GamingX OC bios on mine and got bad results , have you tried FE bios with everything Stock? Reason i'm asking , few people posted better OC results with FE Bios.
> Btw this was my last run on 1080 and on Stock 1.06V, adding Voltage 1.09 would result in under 25000.
> http://www.3dmark.com/3dm/13638712


Haven't tried FE bios yet, might do it out of pure curiosity, but something tells me that it won't make any difference (for better, at least).


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Haven't tried FE bios yet, might do it out of pure curiosity, but something tells me that it won't make any difference (for better, at least).


Like I said, I tried the Gaming X OC BIOS; I think it was 1709 MHz stock and 1999 MHz boost, and the OC options with that BIOS were way worse. The power limit was also set to 110% by default in that BIOS.
Check it out, I guess; it won't hurt.







Just get EVGA's FE BIOS.


----------



## LeviathanVI

Quote:


> Originally Posted by *IronAge*
> 
> I have seen that quite a view early adopters selling FEs even with waterblocks going from single/SLI 1080 to the Pascal Titan.
> 
> Selling them for 10-20% less what they have payed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course there are non FEs on the FE/reference PCB with different blower coolers ... for instance MSI GTX1080 Aero OC, Asus TURBO-GTX1080-8G.
> 
> Probably you don't want to spend extra money for custom coolers / pcb when you put fc waterblocks on em.
> 
> EVGA Service is unbeaten no brainer when you can get two used EVGA FEs since there is no trouble with guest RMA as a 2nd,3rd etc. owner.


I figured that it would be fine to pay for the blowers that come with the partner cards for two reasons.
1. Because they'd already be overclocked to a certain extent, which would be nice.
2. Because they'd be easier to sell down the line.

But I'm pretty new to this, so I'm always prepared to be wrong.
Quote:


> Originally Posted by *IronAge*
> 
> @LeviathanVI
> 
> You might also want to check out those ones ... pretty good price for a custom PCB with 8+2 Phase VRM Design.
> 
> http://it-supplier.co.uk/gigabyte-geforce-gtx-1080-windforce-oc-8gb-gv-n1080wf3oc-8gd
> 
> FC waterblock + backplates available from EK:
> 
> https://www.ekwb.com/configurator/step1_complist?gpu_gpus=2110


Thanks! That does look good and I like the price.
How are Gigabyte with warranty and service?

Yep, all my parts will be EK except fittings and tubing.


----------



## karelbastos

GTX 1080 Hybrid installation..

Just for fun ^^

Portuguese (BR) audio. No English.


----------



## IronAge

When you put them back on air cooling and you are the first owner, they are under a three-year warranty AFAIK.

Self-induced damage is at your own risk ... just the same as with the other manufacturers.

Besides the RGB lighting on the cooler and the clock rates, these match the G1 editions.

So it should be easy to flash them with the G1 BIOS, but the shunt mod won't work on those.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Like i said , tried GamingX OC i think it was 1709Mhz stock and 1999 Boost and OC options with that bios were way worse and also Power limit was set to 110 in that bios by default.
> Check it out i guess , it wont hurt
> 
> 
> 
> 
> 
> 
> 
> Just get EVGAs FE BiOS


Will try it as soon as I get home. I'm also curious to test the 1.2V bios and see how hot it can get on air.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Will try it as soon as I get home. I'm also curious to test the 1.2V bios and see how hot it can get on air.


1.2 V? What vendor uses 1.2 V on what card?


----------



## cj0612

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hm, I just cant get it. I might be CPU limited (i7 3820 @ 4.75)
> Max I could get in Time Spy or BF4 or Heaven was a short spike to 114%.
> GPU is at 2164 / 1.093V


I noticed it also depends on whether or not you are limiting your frame rate, whether through Vsync or the particular application.

I found that I could lower my frame cap from the monitor's refresh rate (144 Hz, which it wouldn't push constantly anyway in Overwatch maxed out) to, say, 100 fps, and it would lower my TDP% substantially. Once I did this I could overclock my GPU higher, until I was back in the 11x% range of TDP usage. Hoping to do the TDP mod once I get some Conductonaut in. The power mod should let me keep a high overclock while remaining under what the card thinks is its "power limit".
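The frame-cap effect can be eyeballed with a crude first-order model: below the cap, board power scales roughly with how many frames the GPU actually renders (a simplifying assumption; real scaling is messier):

```python
def capped_tdp_pct(uncapped_tdp_pct, uncapped_fps, cap_fps):
    """Rough TDP% after capping, assuming power scales with frames rendered."""
    if cap_fps >= uncapped_fps:
        return uncapped_tdp_pct          # cap never engages
    return uncapped_tdp_pct * cap_fps / uncapped_fps

# e.g. a 114% TDP spike at 144 fps drops to roughly 79% when capped at 100 fps,
# freeing headroom for a higher clock before the power limiter bites.
```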


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> 1.2V what Vendor uses 1.2 on what Card?


Gotta dig through the old BIOS thread. I think it's the XOC BIOS; people were getting 1.2 V out of that. Will look for it as soon as I get home.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> 1.2V what Vendor uses 1.2 on what Card?


No card does. I think the BIOS can allow up to 1.2 V, which we all know is useless without subzero cooling anyway.

You said it earlier: extra voltage cripples OCs.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> No card does. I think the BIOS can allow up to 1.2v. Which we all know is useless without subzero cooling anyway.
> 
> You said it earlier. Extra voltage cripples OC's.


That's what I was asking:







What vendor uses 1.2 V on their 1080 from the factory?


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Thst's what i was asking
> 
> 
> 
> 
> 
> 
> 
> What Vendor uses 1.2 V on their 1080 from factory


Not one vendor and you know it.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Not one vendor and you know it.


You mean to tell me someone managed to custom-tweak 1.2 V into a BIOS?


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> You mean to tell me someone managed to Custom Tweak 1.2V into bios?


No no, I am not sure of this, as I have not used the BIOS myself.

However, apparently it ALLOWS UP TO 1.2 V. It doesn't start there.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> No no, I am not sure of this as I have not used the BIOS myself.
> 
> However, apparently it ALLOWS UP TO 1.2v. Doesn't start there.


I see. Really wondering now if someone with, say, an EVGA FTW has flashed it with the EVGA FE BIOS and can post the OC changes, if any.


----------



## fat4l

Quote:


> Originally Posted by *KillerBee33*
> 
> I tried GamingX OC bios on mine and got bad results , have you tried FE bios with everything Stock? Reason i'm asking , few people posted better OC results with FE Bios.
> Btw this was my last run on 1080 and on Stock 1.06V, adding Voltage 1.09 would result in under 25000.
> http://www.3dmark.com/3dm/13638712


Because with higher voltage you hit the TDP limit faster -> lower clocks = lower score.
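That follows from dynamic power scaling roughly with V²·f: a small voltage bump raises board power disproportionately, so the fixed TDP cap is hit sooner and the boost algorithm pulls clocks back. A back-of-the-envelope sketch (the quadratic model is the standard CMOS approximation, not a measured 1080 curve):

```python
def relative_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Dynamic power ratio under the approximation P ~ C * V^2 * f."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Going from 1.06 V to 1.09 V at the same clock costs about 5.7% more power,
# which is exactly the headroom that vanishes before the TDP limiter steps in.
ratio = relative_power(1.09, 1.06)   # about 1.057
```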


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> I see , really wondering now if some one with lets say EVGA FTW flashed it with EVGA FE and post OC changes if any


I'll satisfy your curiosity, just a moment.


----------



## Phinix

Hi everyone I need your help and advice.

I just got my MSI 1080 Seahawk, and while it runs cool (sub 50C under load) I don't think the performance is matching up.

Comparing its benchmarks to my 980 (stock) I had in before, it seems to only be 60% to 70% faster.
From what I had seen in reviews it was pretty much double the performance of a 980.

I used DDU, have tried both the new 368.95 driver and the 368.81 driver.
I have tried running with and without HPET enabled, and have disabled all power saving features.
I have also tried using a different PCIe slot.

The card is ramping up to 99% usage and runs at a 1987 MHz core clock on its own. (I haven't done any overclocking yet.)

I currently have a 2600K @ 4.5 GHz on a Maximus IV Extreme on Win 7. As far as I knew this wouldn't be a bottleneck, but now I'm unsure.

I have been trying everything to try and figure out where this extra 30% to 40% of performance has gone.

If anyone has any suggestion I'd really appreciate it.


----------



## DStealth

W/o curve, and not that bad results....stock-cooled with a Palit JS 1080 and the T4 XOC BIOS.












Agreed with the scores...just curious about some unusual behavior.


----------



## fat4l

Quote:


> Originally Posted by *Phinix*
> 
> Hi everyone I need your help and advice.
> 
> I just got my MSI 1080 Seahawk, and while it runs cool (sub 50C under load) I don't think the performance is matching up.
> 
> Comparing its benchmarks to my 980 (stock) I had in before, it seems to only be 60% to 70% faster.
> From what I had seen in reviews it was pretty much double the performance of a 980.
> 
> I used DDU, have tried both the new 368.95 driver and the 368.81 driver.
> I have tried running with and without HPET enabled, and have disabled all power saving features.
> I have also tried using a different pci slot.
> 
> The card is ramping up to 99% usage and runs at 1987 core clock on its own. (haven't done any over clocking yet.)
> 
> I currently have a 2600K @ 4.5Ghz on a maximus 4 extreme on win 7. As far as I knew this wouldn't be a bottle neck, but now I'm unsure.
> 
> I have been trying everything to try and figure out where this extra 30% to 40% of performance has gone.
> 
> If anyone has any suggestion I'd really appreciate it.


And where do you see it's "only" 60-70% faster?


----------



## Phinix

Quote:


> Originally Posted by *fat4l*
> 
> And where do you see its "only" 60-70% faster?


I ran benchmarks on a few games and synthetics before I switched: average, minimum, and maximum. I took readings of my 980 at stock and overclocked.

Comparing my stock 980 results to the results I am getting with the 1080, it is only 60% to 70% faster.


----------



## juniordnz

Quote:


> Originally Posted by *Phinix*
> 
> I ran benchmarks on a few games and synthetics before I switched - average, minimum and maximum. I took reading of my 980 at stock and over clocked.
> 
> Comparing my stock 980 results to the results I am getting with the 1080, It is only 60% to 70% faster.


Seems normal to me. My 1080 is 90% faster than my 970. What did you expect?


----------



## lyang238

Quote:


> Originally Posted by *nexxusty*
> 
> Putting an H110i on my 1080 tomorrow. I'm expecting load temps to be 55c max. Probably less.


Seems a bit overkill. I think you would be OK with a H90 or even H55. My Sea Hawk X is doing 50C max in an Inwin 805.
Quote:


> Originally Posted by *juniordnz*
> 
> Seems normal to me. My 1080 is 90% faster than my 970. What did you expect?


I mean going from 40 fps to 70 fps is a really good jump considering.


----------



## juniordnz

Guys, how can I flash the master BIOS while I'm running on the slave BIOS? Is it possible? I bricked the master one.


----------



## Phinix

Quote:


> Originally Posted by *juniordnz*
> 
> Seems normal to me. My 1080 is 90% faster than my 970. What did you expect?


Well, this is why I'm asking you guys if you think this is normal.

I bring it up because the Guru3D review of the card shows certain games like Hitman, Tomb Raider and The Witcher 3 having a 90% to 100% fps boost when compared to a 980.

So when I have the same game settings but am seeing a much lower fps gain, I wonder.

On my system:
Rise of the Tomb Raider went from 59 to 91 fps (144 fps in the review)
Hitman went from 69 to 90 fps (120 fps in the review)
The Witcher 3 went from 53 to 75 fps (109 fps in the review)
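For anyone checking the math, the gains above are easy to compute. This is just a throwaway sketch (the `uplift` helper is mine, not from any tool mentioned in this thread):

```python
# Percent fps gain going from the 980 numbers to the 1080 numbers above.
def uplift(before, after):
    """Return the percentage fps increase from `before` to `after`."""
    return (after / before - 1) * 100

for game, before, after in [
    ("Rise of the Tomb Raider", 59, 91),
    ("Hitman", 69, 90),
    ("The Witcher 3", 53, 75),
]:
    print(f"{game}: +{uplift(before, after):.0f}%")
# Rise of the Tomb Raider: +54%, Hitman: +30%, The Witcher 3: +42%
```

So the observed uplift really is in the 30-55% range, well short of the 90-100% the review saw.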


----------



## juniordnz

Quote:


> Originally Posted by *Phinix*
> 
> Well this is why I'm asking you guys if you think this is normal?
> 
> I bring it up because the guru3d review on the card shows certain games like Hitman, Tomb raider and Witcher 3 having 90% to 100% fps boost when compared to a 980.
> 
> So when I have the same game settings but am seeing much lower fps gain, I wonder.
> 
> On my system:
> The rise of the tomb raider went from 59 to 91 (144fps in the review)
> Hitman went from 69 to 90 (120fps in the review)
> Witcher 3 went from 53 to 75 (109fps in the review)


Do you have the exact same build as the reviewer? FPS is not only about the GPU.


----------



## karelbastos

Quote:


> Originally Posted by *juniordnz*
> 
> Guys, how can I flash over the master BIOS when I'm on the slave BIOS. Is it possible? I bricked the master one


Junior,

with nvflash you can do:

nvflash --list

nvflash will list all GPUs.

In my case:

GPU on the first slot was --index=0

GPU on the second slot was --index=1

So, to flash the first one ( CLOSER TO THE CPU ) you need to:

disable the drivers in device manager
nvflash --index=0 --protectoff
nvflash --index=0 -6 XXXXX.ROM

For the second one use --index=1. This is the way that I did it.

And go for it... hehe
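The steps above can be wrapped in a small dry-run helper so you can see the exact commands before committing to a flash. This is my own sketch, not an official tool: it assumes nvflash is on the PATH and uses the flags from the post. Run as admin with the display driver disabled, and only set dry_run=False once you trust the ROM file.

```python
# Dry-run wrapper around the nvflash sequence from the post above.
# Hypothetical helper names; only the nvflash flags come from the post.
import subprocess

def flash_commands(index, rom):
    """Build the nvflash command sequence for the GPU at --index=N."""
    return [
        ["nvflash", f"--index={index}", "--protectoff"],
        ["nvflash", f"--index={index}", "-6", rom],
    ]

def flash(index, rom, dry_run=True):
    """Print each command; actually execute only when dry_run=False."""
    for cmd in flash_commands(index, rom):
        print(" ".join(cmd))
        if not dry_run:
            subprocess.run(cmd, check=True)

flash(0, "XXXXX.ROM")  # first slot; use index=1 for the second card
```

The dry run just prints the two commands, which is a cheap way to double-check the index before touching the wrong card in an SLI setup.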


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*


Flashed the FE with simple old-school commands:
cd c:\nvflash
nvflash -6 ***.rom
Used nvflash 5.292 https://www.techpowerup.com/downloads/2709/nvflash-5-292-0-for-windows


----------



## juniordnz

Already figured it out: you just have to boot with the good BIOS and then, before flashing, turn the switch to the bad one, and nvflash will flash over it.

FTW owners be advised: don't try the XOC BIOS. I was afraid of the 1.2V anyway.


----------



## Phinix

Quote:


> Originally Posted by *juniordnz*
> 
> Do you have the exact same build as the reviewer? FPS is not only about GPU


No I don't: a 2600K @ 4.4GHz on Windows 7, while the review is on a 5960X on Windows 10.

Would you say that the large difference in fps is caused by that?


----------



## grimboso

Quote:


> Originally Posted by *Phinix*
> 
> No I don't, 2600K @ 4.4 on windows 7, the review is on a 5960X on windows 10.
> 
> Would you say that the large difference in fps is caused by that?



Well, Windows 10 supports WDDM 2.0, and that alone can contribute a lot. My fps increased across all games going from 7 to 10.


----------



## juniordnz

Quote:


> Originally Posted by *Phinix*
> 
> No I don't, 2600K @ 4.4 on windows 7, the review is on a 5960X on windows 10.
> 
> Would you say that the large difference in fps is caused by that?


Of course! That's the difference you were looking for.
Quote:


> Originally Posted by *KillerBee33*
> 
> I see , really wondering now if some one with lets say EVGA FTW flashed it with EVGA FE and post OC changes if any


Did it with the ASUS FE BIOS (couldn't find EVGA's, but I guess they are all the same).

FTW slave: 2100-2113 core / 500 mem / 25050 graphics on Firestrike
ASUS FE: 2126-2139 core / 500 mem / 25250 graphics on Firestrike

Two clocks up and +200 graphics score on the FE BIOS. The card also seems to run 2-4 degrees cooler and draws something like 15% less TDP.

Something strange happened though. When I set the fan to 100% on the FE BIOS, the fans went crazy fast, something like double the speed and noise compared to FTW. It blows much, much more air: on the FTW BIOS GPU-Z showed the fans at 2700rpm, while on the FE BIOS they reached 3600rpm at 100%. Guess that's dangerous, isn't it?
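As an aside, Pascal boost clocks move in roughly 13 MHz bins, which is why the FTW-to-FE difference above lands on a clean "2 clocks up". A quick sketch of the arithmetic (the helper name is mine):

```python
# Pascal boost clocks step in ~13 MHz bins, so observed clock differences
# snap to multiples of 13. Tiny hypothetical helper for the arithmetic:
BIN_MHZ = 13

def bins_between(low_mhz, high_mhz):
    """Number of ~13 MHz boost bins between two observed core clocks."""
    return round((high_mhz - low_mhz) / BIN_MHZ)

print(bins_between(2100, 2126))  # FTW floor vs. FE floor from the post
```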


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> Of course! That's the difference you were looking for.
> Did it with ASUS FE (couldn't find EVGA, but I guess they are all the same).
> 
> FTW Slave: 2100-2113 core / 500mem / 25050 graphics on firestrike
> ASUS FE: 2126-2139 core / 500mem / 25250 graphics on firestrike
> 
> 2 clocks up and +200 on FE bios. The card also seems to run 2-4 degrees cooler and draws something like 15% less TDP.
> 
> Something strange happened though. When I set fan to 100% on FE bios, the fans went crazy fast, something likle double the speed and noise compared to FTW. It blows much much more air and GPUZ shows fans at 2700rpm, while on FE BIOS they reached 3600rpm at 100%. Guess thats dangerous, isn't it?


I also get the best results with the FE BIOS on my 1080 G1 Gaming. Strange, but it works.


----------



## wangle0485

Quote:


> Originally Posted by *DStealth*
> 
> W/o curve and not that bad results....stock cooled with Palit JS 1080 and T4 XOC BIOS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Agreed with the scores...just curious of some unordinary behavior


Air or water? I can only get 2050 out of my super jetstream on air









Edit: read your post properly, stock cooled.....


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Of course! That's the difference you were looking for.
> Did it with ASUS FE (couldn't find EVGA, but I guess they are all the same).
> 
> FTW Slave: 2100-2113 core / 500mem / 25050 graphics on firestrike
> ASUS FE: 2126-2139 core / 500mem / 25250 graphics on firestrike
> 
> 2 clocks up and +200 on FE bios. The card also seems to run 2-4 degrees cooler and draws something like 15% less TDP.
> 
> Something strange happened though. When I set fan to 100% on FE bios, the fans went crazy fast, something likle double the speed and noise compared to FTW. It blows much much more air and GPUZ shows fans at 2700rpm, while on FE BIOS they reached 3600rpm at 100%. Guess thats dangerous, isn't it?


Heh.

Figure out your fans' max RPM and set a custom curve; the reference blower goes up to 4200rpm.
I had a similar issue when I flashed a custom AIB's BIOS to a reference card: the highest RPM was 2800, which wasn't enough.
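If a BIOS swap changes what 100% fan means, a rough rule of thumb is to scale the duty cycle by the ratio of max RPMs. The RPM figures below come from the posts above; the helper itself is hypothetical, and the linear scaling is only an approximation of a real fan curve:

```python
# Rough helper: what fan % on the new BIOS reproduces an old RPM ceiling?
# Assumes RPM scales roughly linearly with duty cycle, which is approximate.
def equivalent_duty(target_rpm, max_rpm_on_bios):
    """Fan percentage on a BIOS whose 100% duty hits max_rpm_on_bios."""
    return min(100.0, target_rpm / max_rpm_on_bios * 100)

# FTW fans maxed at ~2700rpm on the FTW BIOS; the FE BIOS spins them to
# ~3600rpm at 100%, so about 75% duty gets back to the old ceiling.
print(equivalent_duty(2700, 3600))  # 75.0
```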


----------



## juniordnz

Great, I got better results with the FE BIOS than with the one made specifically for my card. But that bugs me: what's the hocus pocus with these FE BIOSes? Why is my card running faster, better, cooler and with less power?


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Great, got better results with FE bios than the one made specifically for my card. But that bugs me, whats the hocus pokus with theses FE bios? Why is my card running faster, better, cooler and with less power?


Try all that, but with the voltage untouched now.


----------



## karelbastos

Are all the FE BIOSes the same?

ASUS
EVGA
ZOTAC
MSI

Do all the FE versions have the same BIOS?


----------



## nexxusty

Quote:


> Originally Posted by *lyang238*
> 
> Seems a bit overkill. I think you would be OK with a H90 or even H55. My Sea Hawk X is doing 50C max in an Inwin 805.
> I mean going from 40 fps to 70 fps is a really good jump considering.


H110i GT won't even fit. Nor will the Antec Kuhlers I bought.

I'm now in for over $150 on cooling.... should have bought a block. Oh well.

Quote:


> Originally Posted by *Phinix*
> 
> No I don't, 2600K @ 4.4 on windows 7, the review is on a 5960X on windows 10.
> 
> Would you say that the large difference in fps is caused by that?


Sandy Bridge? Even SB-E is too slow today. 4.4 is really weak for a 2600K as well. I've never seen a 2600K that didn't do 4.6GHz; one that can't is super rare.


----------



## Phinix

Quote:


> Originally Posted by *juniordnz*
> 
> Of course! That's the difference you were looking for.


Thanks for letting me know. I appreciate it.

Will look into upgrading to Win 10 sooner rather than later.

Also, I haven't overclocked my card yet (will do it tomorrow), but I noticed that it only goes up to 1.031V under load. Is that normal?
Quote:


> Originally Posted by *nexxusty*
> 
> Sandy Bridge? Even SB-E is too slow today. 4.4 is really weak for a 2600k as well. I've never seen a 2600k that didn't do 4.6ghz.. super rare.


My chip has degraded a lot over the years; I tried hitting 4.6 yesterday but it was requiring more than 1.4V and I didn't want to go that high.


----------



## boredgunner

Quote:


> Originally Posted by *nexxusty*
> 
> Sandy Bridge? Even SB-E is too slow today. 4.4 is really weak for a 2600k as well. I've never seen a 2600k that didn't do 4.6ghz.. super rare.


I went from an i7 2600 non-K (4.2 GHz) to an i7 6700K and the difference in games is minimal, even with the 6700K at 4.6 GHz. Looking at gamegpu.ru, only a few games really benefit from the 6700K over previous-gen i7s, like the BF1 alpha.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Try all that but with Voltage Untouched now.


Tried that briefly; I couldn't get the same clocks, but the Firestrike graphics score stayed around 25250. Guess higher clocks are ego boosters, since I'm getting the same scores with less voltage, less power, less heat and a lower clock. What's odd is that with 130% TDP on the FTW BIOS I get no perfcap at all, while with the FE BIOS I have a VRel perfcap and still get better clocks and better performance.

Need to sit and do it right, with time, and see what I can get. I will also try other benchmarks to see if the results are solid. But thanks for the tip!








Quote:


> Originally Posted by *nexxusty*
> 
> H110i GT won't even fit. Nor will the Antec Kuhlers I bought.


Wait, what? Why? How? It should fit...what's getting in the way?


----------



## nexxusty

Quote:


> Originally Posted by *boredgunner*
> 
> I went from an i7 2600 non-K (4.2 GHz) to an i7 6700k and the difference in games is minimal, even with the 6700k at 4.6 GHz. Looking at gamegpu.ru only a few games really benefit from the 6700k over previous gen i7's, like BF1 alpha.


That's not true at all.

I've owned every i7 generation and the difference between SB and Skylake is astounding.

Minimum framerates are much, much lower in modern games with a 2600K.
Quote:


> Originally Posted by *juniordnz*
> 
> Tried that briefly, couldn't get the same clocks but firestrike graphics score stayed around 25250. Guess higher clocks are ego boosters, since I'm getting the same scores with less voltage, less power, less heat and less clock. What's odd is that with 130%TDP on ftw BIOS, I get no perfcap at all, and with FE Bios I have vRel perfcap and still get better clocks and better performance.
> 
> Need to sit and do it right, with time, and see what I can get. Will also try other benchmarks to see if the results are solid. But thanks for the tip!
> 
> 
> 
> 
> 
> 
> 
> 
> Wait, what? Why? How? It should fit...what's getting in the way?


The H110i, and any "i" series AIO from Corsair, is not an Asetek design. The block won't mount.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> Tried that briefly, couldn't get the same clocks but firestrike graphics score stayed around 25250. Guess higher clocks are ego boosters, since I'm getting the same scores with less voltage, less power, less heat and less clock. What's odd is that with 130%TDP on ftw BIOS, I get no perfcap at all, and with FE Bios I have vRel perfcap and still get better clocks and better performance.
> 
> Need to sit and do it right, with time, and see what I can get. Will also try other benchmarks to see if the results are solid. But thanks for the tip!
> 
> 
> 
> 
> 
> 
> 
> 
> Wait, what? Why? How? It should fit...what's getting in the way?


NP.

My suspicions started when I flashed a "better" BIOS with much higher everything and got much lower performance.


----------



## toncij

Quote:


> Originally Posted by *nexxusty*
> 
> H110i or any "I" series AIO from Corsair is not Asetek design. Block won't mount.


How do you mean? What block won't mount where?


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> How do you mean? What block won't mount where?


With a G10 or similar Asetek GPU AIO bracket.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> H110i or any "I" series AIO from Corsair is not Asetek design. Block won't mount.


What adapter are you trying to use with it? The G10? I'm very tempted to buy an H110i to put on my 1080; I'll just have an adapter custom made.
Quote:


> Originally Posted by *nexxusty*
> 
> With a G10 or similar Asetek GPU AIO bracket.


Oh, I figured it would be hard to adapt that G10 to coolers other than the H105 or H75. See if you can trade yours for an H105. You won't lose a thing in performance and the Kraken will fit perfectly. The H105 is even better than the H110i at higher fan speeds.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> What adapter are you trying to use with it? G10? I'm very prone to buy a H110i to put on my 1080, I'll just have an adapter custom made.


Yep Kraken G10.

I bought an H90 to put on it with some Noctua 140mm fans. Should be great temp wise. Full DIY Hybrid with copper shim.

*edit*

Hmm.... H105 eh?


----------



## KillerBee33

A dude just posted a few temps on a Titan (Pascal) with EVGA's Hybrid; it tops out @ 55 degrees.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> Dude just posted few temps. on a TitanP with EVGAs Hybrid tops out @ 55 degrees .


Good to know. H90 with Kryonaut and Noctuas should do under 50c.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Good to know. H90 with Kryonaut and Noctuas should do under 50c.


I really wanted to slap an H90 on it for its 140mm fan, but couldn't resist when EVGA told me the 10 Series Hybrid was available.


----------



## nexxusty

Quote:


> Originally Posted by *KillerBee33*
> 
> I really wanted to slap an H90 for it's 140mm but couldnt resist when EVGA sent me 10Series Hybrid available


The pump matters A LOT with these AIOs. The EVGA pump for the Hybrid Kit is the best one for GPUs.

Saw a write-up about it somewhere. The Hybrid Kit was by far the most efficient cooler. Not the best... the most efficient... Hehe.


----------



## toncij

Running the Corsair bracket and an H115i on a Titan X Maxwell with no problems. I think the bracket needs only some slight modification to fit Pascals...


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> The pump matters A LOT with these AIO's. The EVGA pump for the Hybrid Kit is the best one for GPU's.
> 
> Saw a write up about it somewhere. The Hybrid Kit was by far the most efficient cooler. Not the best... the most efficient... Hehe.


I can honestly say a 980 @ 1557MHz @ 1.25V @ 380 max power topped out @ 60C MAX even with the room being hot as hell, around 80.

So EVGA's Hybrid isn't a bad way to go.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> Hmm.... H105 eh?


Try to find some reviews that have the H110i GT and the H105 on the same chart. There aren't too many, but the ones I found showed the H110i with an advantage at low fan speeds and the H105 when the fans are faster.


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> Running Corsair bracket and H115i on TitanX Maxwell with no problems. I think the bracket needs only some slight modification to fit Pascals...


Hmph. Is H115i Asetek style?


----------



## juniordnz

Quote:


> Originally Posted by *toncij*
> 
> Running Corsair bracket and H115i on TitanX Maxwell with no problems. I think the bracket needs only some slight modification to fit Pascals...


Did you drill the bracket? Because the holes are a lot wider than the 53mm spacing on NVIDIA cards.
Quote:


> Originally Posted by *nexxusty*
> 
> Hmph. Is H115i Asetek style?


Nope, won't work with the G10. The G10 works with the H75, H90, H105 and H100 if I'm not mistaken: those round pumps with "teeth" around them.


----------



## GanGstaOne

A 1080 G1 Gaming with a Kraken G10 + Thermaltake Water 3.0 Extreme S 240mm: even at 2200MHz it keeps the card below 40C with 4x 1200rpm fans.


----------



## nexxusty

Quote:


> Originally Posted by *GanGstaOne*
> 
> 1080 G1 Gaming with Kraken G10 + Thermaltake water 3.0 Extreme S 240mm even on 2200mhz keeps the card below 40 C with 4x1200rpm fans


Forgot about Tt AIOs. AFAIK they are all Asetek mount style.

Quote:


> Originally Posted by *juniordnz*
> 
> Did you drilled the bracket? Because the wholes are a lot wider than the 53mm on nvidia cards
> Nope, won't work with G10. G10 works with H75, H90, H105 and H100 if I'm not mistaken. Those round pumps with "teeth" around it.


FWIW, I thought this was the case as well. Let's see what he says....

Going to go finish the mod now. Hoping the shim is going to work well. 25mm x 25mm x 1.2mm is what I have; 20mm is ideal, however I don't see any issues with an extra 5mm. It does fit fine and isn't much bigger than the die.

That shim was sent to me, BTW, by a VERY nice OCN.NET member. For free. He also sent an extra one.

Love this forum.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> Guys, how can I flash over the master BIOS when I'm on the slave BIOS. Is it possible? I bricked the master one


I have done that as well. When you boot with the slave, just flip the switch back to master when you're in Windows, then flash, and you are good to go.


----------



## toncij

Quote:


> Originally Posted by *juniordnz*
> 
> Did you drilled the bracket? Because the wholes are a lot wider than the 53mm on nvidia cards
> Nope, won't work with G10. G10 works with H75, H90, H105 and H100 if I'm not mistaken. Those round pumps with "teeth" around it.


Nope... matched perfectly:
https://www.amazon.de/gp/product/B01ALIU7AS


----------



## juniordnz

Would a card that already doesn't throttle due to heat benefit from watercooling to stay in the 40s? I mean, my FTW doesn't throttle at all, maybe 1 clock down in some very heavy situations. Would watercooling make it possible to achieve higher clocks, or have I already achieved the maximum clock if I'm not throttling due to temperatures?


----------



## ganzosrevenge

I'm trying to fill out the form. I just bought a 1080 Classy. Neither "Founders Edition" nor "Reference" fits it, and I don't know where my GPU-Z validation goes. Would it be OK to put up a screenshot (or several)?

Jason


----------



## boredgunner

Quote:


> Originally Posted by *ganzosrevenge*
> 
> I'm trying to fill out the form. I just bought a 1080 Classy. There's nothing there that fits "Founders Edition" nor "Reference", and I don't know where my GPU-Z Validation goes. Would it be OK to put up a screenshot (or many screenshots)?
> 
> Jason


You should be able to email the validation to yourself. But yeah the form is outdated.


----------



## nexxusty

Quote:


> Originally Posted by *toncij*
> 
> Nope... matched perfectly:
> https://www.amazon.de/gp/product/B01ALIU7AS


That's a Corsair bracket.... we aren't talking about that.

Get with the program....


----------



## Hutzi

Quote:


> Originally Posted by *juniordnz*
> 
> Would a card that already doesn't throttle due to heat benefit from watercooling to stay at 40s? I mean, my FTW doesn't throttle at all, maybe 1 clock down in some very heavy situations. Would watercooling make it possible to achieve higher clocks, or have I already achieved maximum clock if I'm no throttling due to temperatures.


It would at least reduce the noise, if you are a silent guy like me.
Due to better cooling the card will also draw a little less power (my 290X OC drew 470 watts from the wall (whole system) on air and 440 watts while watercooled) and the components are going to live longer thanks to better temperatures.

I did some other tests on my 1080 yesterday and found out my card clocks up to 2055 if I put the fans on 100% for cooling.
It goes up to 2088 without changing anything besides adding some voltage. I didn't test if that also increases performance, though.

Personally I'll watercool it for the noise and hopefully a better OC result, which comes at least partly with the cooling (72°C (auto) and 55°C (100% fan) compared to 45°C watercooled).


----------



## nexxusty

Cut my Kraken G10 Bracket.

So ugly.... I don't know how people can stand it.

All that's needed is the damn circular retention teeth and the mounting holes. So I dremeled off the fan mount portion and am now grinding it down to size. It will work great in the end, and won't look ridiculous with the damn bracket and the NZXT logo showing.

I will of course give pics when I'm done shaping the bracket. I recommend everyone do this to be honest. If you have the tools, absolutely. This essentially makes your Homebrew DIY kit identical to the EVGA one. Installation is exactly the same, however you have Radiator options this way.

No brainer IMO.

Have to finish up tomorrow though, as it's late and I already woke someone up using my Dremel and grinder.... 3am grinding isn't too cool unless you live alone.

Major reason to clean my garage out. Lol.


----------



## Zyther

Can someone run the atitool 3d view cube and tell me if they get coil whine?
https://www.techpowerup.com/downloads/436/atitool-0-26

This is what mine sounds like :/


----------



## Phinix

To those who helped me yesterday with what I thought was poor performance on my new 1080: I did more benchmarking with Shadow of Mordor, as it isn't very CPU intensive, and compared my results to the review at 2560x1440, 3840x2160 and 5120x2880. My results ended up being 3%, 2.8% and 2.2% respectively below their results, which is perfect.

So at least now I know that my card is fine and that any huge difference in other games is due to either Windows 7 or my CPU bottleneck.

Thanks again.


----------



## Thetbrett

Bloody DHL. Rang them this morning and they assured me I'd have it today. Apparently it got "misplaced" and now I have to wait until Monday to get my paws on it. My Friday night just got bummed out.


----------



## alawadhi3000

Quote:


> Originally Posted by *Zyther*
> 
> Can someone run the atitool 3d view cube and tell me if they get coil whine?
> https://www.techpowerup.com/downloads/436/atitool-0-26
> 
> This is what mine sounds like :/


Of course you'll get coil whine at +4000 fps.


----------



## Jaju123

Hey guys, can anyone help me out? I got my new GTX 1080 FTW edition and it overclocks to +95 core and +350 memory stable, giving about a 2080 sustained boost clock. I also want to change voltages to achieve 2100MHz, though, and am confused as to how it works. Is it safe to just jack it up to the +100% voltage limit and explore from there? The graph method is also very confusing, and the Precision X tool is not very well designed. Thanks for any help.

Sent from my ONEPLUS A3003 using Tapatalk


----------



## pantsoftime

Quote:


> Originally Posted by *Zyther*
> 
> Can someone run the atitool 3d view cube and tell me if they get coil whine?


Yep I get it too on NVIDIA FE. Not surprised given the framerates involved.


----------



## Zyther

Quote:


> Originally Posted by *alawadhi3000*
> 
> Of course you'll get coil whine at +4000 fps.


No, I get it even in games. Here I'm getting about 76 fps and you can hear it; I then alt-tab to the desktop and the whine goes away.


----------



## Hutzi

Coil whine is a huge problem when you want a silent PC. It's one of the main reasons I had to replace my 290X (when they were new I had to order 3 of them and kept the one with the least coil whine, but it was still the loudest noise in my whole PC). Very annoying.
I'll never buy a reference card again.

My 1080 (Palit GameRock) has no coil whine at all (tested in scenarios with up to 400 FPS) and I am very happy with that.

@Zyther: Did you overclock your card? My coil whine drastically increased with a memory OC. I ended up not OCing the memory because of that.


----------



## Zyther

Quote:


> Originally Posted by *Hutzi*
> 
> @Zyther: Did you overclock your card? Because my coil while drastically increased with memory-OC. I ended up to not OC memory because of that.


No, running stock


----------



## IronAge

Voltage Offset Control for Pascal could be available soon with another AB Beta.

http://www.guru3d.com/articles-pages/nvidia-titan-x-(pascal)-overclock-guide,2.html
Quote:


> For this article we use AfterBurner, this is a new Beta stage development build in which we opened up voltage offset control, the release is still pending and not available to the generic public just yet.


----------



## Hutzi

Well, that's unfortunate then.

Did you think about replacing it? It seems there are just very few 1080s with coil whine.


----------



## Hutzi

Quote:


> Originally Posted by *IronAge*
> 
> Voltage Offset Control for Pascal could be available soon with another AB Beta.
> 
> http://www.guru3d.com/articles-pages/nvidia-titan-x-(pascal)-overclock-guide,2.html


???
You can download the 4.3.0 Beta already.
https://gaming.msi.com/features/afterburner


----------



## Zyther

Quote:


> Originally Posted by *Hutzi*
> 
> Well, thats unfortunate then.
> 
> 
> 
> 
> 
> 
> 
> 
> Did you thought about replacing it? It seems there a just very few 1080's with coil whine.


I'm going to log an RMA and see what they say, as one of the fans has a fluttery sound as well.


----------



## Hutzi

Quote:


> Originally Posted by *Zyther*
> 
> Im going to log a RMA and see what they say, as 1 of the fans has a fluttery sound aswell.


I wish you the best of luck with that! Although I don't think coil whine is a reason to RMA a card, maybe the faulty fan helps.


----------



## IronAge

Quote:


> Originally Posted by *Hutzi*
> 
> ???
> You can download the 4.3.0 Beta already.
> https://gaming.msi.com/features/afterburner


That is AB 4.3.0 Beta 4 ... Guru3d has a 4.3.0 Beta 11 ... seems to be different.


----------



## Hutzi

Well, I thought it was about the voltage unlock, which is already working in the 4.3.0 Beta I linked for you.


----------



## IronAge

So Guru3D is outdated ... I am still waiting for the Gigabyte G1 and Inno3D iChill X3 to arrive.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> Cut my Kraken G10 Bracket.
> 
> So ugly.... I don't know how people can stand it.
> 
> All that's needed is the damn circular retention teeth and the mounting holes. So I dremeled off the fan mount portion and am now grinding it down to size. It will work great in the end, and won't look ridiculous with the damn bracket and the NZXT logo showing.
> 
> I will of course give pics when I'm done shaping the bracket. I recommend everyone do this to be honest. If you have the tools, absolutely. This essentially makes your Homebrew DIY kit identical to the EVGA one. Installation is exactly the same, however you have Radiator options this way.
> 
> No brainer IMO.
> 
> Have to finish up tomorrow though as its late and I already woke someone using my Dremel and Grinder.... 3am grinding isnt too cool unless you live alone.
> 
> Major reason to clean my garage out. Lol.


You cut it off? LOL. Well, if you only want the bracket, that's the way to go...

Please post some before and after temps. I'm very interested in this matter. Good luck with the mod.









Besides, what about VRAM/VRM cooling? I don't know if your card has a heatplate on them. If so, do you think the heatplate, without air blowing on it, is enough to keep them cool? (Since you mutilated the fan off your G10.)
Quote:


> Originally Posted by *IronAge*
> 
> That is AB 4.3.0 Beta 4 ... Guru3d has a 4.3.0 Beta 11 ... seems to be different.


Could you link that Beta 11? I can only find Beta 4 in the Guru3D database.


----------



## IronAge

That Beta 11 is not available to the public ... just as stated by the author of the Guru3D guide which I quoted.


----------



## pantsoftime

Quote:


> Originally Posted by *Hutzi*
> 
> Well i thought it's about the unlock for voltage - which is already working for 4.3.0 Beta which I linked for you.


I think he's thinking of the Titan X Pascal voltage control, which is absent from the current public version of Afterburner. The new release will just get the Titan controls on par with the 1080.


----------



## escalibur

Soon.


----------



## juniordnz

Quote:


> Originally Posted by *escalibur*
> 
> Soon.


NICE!









Please, if you can, post temps before and after. You'll probably need a copper shim to install the pump over the heatplate fingers of the FTW.

PS: what's that small brown box?


----------



## IronAge

Isn't the EVGA Hybrid cooler supposed to perform better than other AIOs with brackets?

http://www.gamersnexus.net/guides/2441-diy-gtx-1080-hybrid-thermals-100-percent-lower-higher-oc-room

Also, I remember an overclock.net member took orders for self-made AIO brackets at a much more reasonable price.


----------



## escalibur

Quote:


> Originally Posted by *juniordnz*
> 
> NICE!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please, if you can, post temps before and after. You'll probably need a cooper shim to install the pump over the heatplate fingers of the FTW.
> 
> PS: what's that small brown box?


Sure. That's the reason I have to delay my installation for a week or something. I need the data for 'before'.









The shim is on its way from the UK; I'm expecting it at the beginning of next week.

The brown box contains http://noctua.at/en/nf-b9-redux-1600-pwm

I installed my card last night, so I didn't even have time to try some OC.


----------



## IronAge

The GTX 1070 FTW Hybrid is available ... it should not take too long until the 1080 FTW Hybrid is released.


----------



## KillerBee33

Quote:


> Originally Posted by *IronAge*
> 
> GTX1070FTW Hybrid ist available ... should not take too long and 1080FTW Hybrid will be released.


http://www.evga.com/Products/Product.aspx?pn=08G-P4-6288-KR


----------



## IronAge

It's nice that they use a custom PCB for the Hybrids now ... for the GTX 980/Ti Hybrids they used the reference PCB, AFAIK?


----------



## Phinix

Well, I am very impressed with the temps on my Sea Hawk. Ambient is 18C to 20C at the moment; the card idles at 23C and is at 46C under load.
Currently I have the fan pinned at 60%, which is quiet, but I have two Corsair ML120 fans I just got that I want to try on the radiator in push-pull to see if there's any change.


----------



## karelbastos

Quote:


> Originally Posted by *Phinix*
> 
> Well I am very impressed with my temp on my seahawk, ambient is 18C to 20C at the moment and idle the card is at 23C and under load at 46C.
> Currently have the fan pinned at 60% which is quiet but have two corsair ML120 fans that I just got that I want to try on the radiator for push pull and see if there's any change.


Same here with my ZOTAC 1080 FE with the EVGA Hybrid.

Idle: 25-28C

Full load: 40-46C (sometimes 50C)

With stock fan speed.

Now I just need to wait for the voltage limit to be removed via BIOS mod, so we can increase the voltage and maintain better clocks with low temps...


----------



## Phinix

Quote:


> Originally Posted by *karelbastos*
> 
> Same here with my 1080 ZOTAC FE with EVGA Hybrid..
> 
> Idle 25 - 28 C
> 
> Full Load 40 - 46 C ( sometimes 50 C )
> 
> With fan stock speed
> 
> 
> 
> No i just neet to wait for VOLTAGE limit to be removed on BIOS mod, so we can increase the voltage and maintain better clocks with low temps...


That's awesome. I just finished overclocking my card and it ended up being pretty good, I think: got it to +150 on the core and +500 on the memory.
The memory could go to +850ish but it had the best performance at +500. So in total it's at 2126 core and 5508 memory, although the voltage has never gone higher than 1.050.


----------



## juniordnz

Now I just found out something really incredible.

Asetek 5th-gen watercoolers work with the G10! Looking at them in pictures, I thought the bracket would be different, but after downloading the Asetek manual I see the bracket is exactly the same, just covered by the pump in the pictures the stores display.

Take a look:

http://www.asetek.com/media/1796/asetek_gen5_premium_installation_web.pdf

That opens up a lot of possibilities, like the H115i, H80i V2 and H100i V2.

There's a lot of talk about how good these gen5 Asetek coolers are; any thoughts on that? I was pretty much sold on CoolIT, but after seeing that I'm not so sure anymore.

PS: you can identify a gen5 Asetek by the tubing: it comes from the top of the pump, not the sides.


----------



## cj0612

Just thought I'd throw this out there: last night I put the EVGA Hybrid cooler on my FE 1080 and it dropped my temps from 65-70C to 45-50C. I was also able to push my OC from 2012MHz (in game) to 2100MHz (in game), a little higher in Heaven, and it's completely stable. I'm curious what the TDP mod will let me achieve; most people who commented on that video squeezed at least another 100MHz or so out of it.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> You cut it off? LOL Well, if you only want the bracket, that's th way to go...
> 
> Please post some before and after temps. I'm very interested in this matter. Good luck with the mod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides, what about VRAM/VRM cooling? Don't know if your card has a heatplate on them. If so, do you think the heatplate, without air blowing on it, is enough to keep them cool? (since you mutilated the fan out of your G10
> 
> 
> 
> 
> 
> 
> 
> )
> Could you link that beta 11? I can only find beta 4 on guru3d database.


Right? Bracket only FTW.

I'm running an FE. I'm doing the mod this way specifically so I can use the stock VRM/RAM heatplate. Stock fan is still on there.

It will look exactly like an EVGA Hybrid when it's done. I'll take pics.

Just woke up... lol. Going to finish now.


----------



## GreedyMuffin

Quote:


> Originally Posted by *nexxusty*
> 
> I'm running an FE. I'm doing the mod this way specifically so I can use the stock VRM/RAM heatplate.
> 
> Just woke up... lol. Going to finish now.


I would love some pics!


----------



## nexxusty

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I would love some pics!


Np boys. I won't leave you hanging.


----------



## jedimasterben

I got around a hundred megahertz from doing the TDP mod on my FE. From 1975ish to 2088-2100, depending on whether it crosses 55C or not lol.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Right? Bracket only FTW.
> 
> I'm running an FE. I'm doing the mod this way specifically so I can use the stock VRM/RAM heatplate.
> 
> Just woke up... lol. Going to finish now.


Let us know if the rest of the reference shroud can fit on top of that pump; you might have to unscrew the plexiglass from inside.


----------



## karelbastos

Quote:


> Originally Posted by *jedimasterben*
> 
> I got around a hundred megahertz from doing the TDP mod on my FE. From 1975ish to 2088-2100, depending on whether it crosses 55C or not lol.


With the TDP mod (the short mod), what is your max voltage?

I'm wondering if it's worth doing the TDP hard mod.

Thanks


----------



## cj0612

Quote:


> Originally Posted by *jedimasterben*
> 
> I got around a hundred megahertz from doing the TDP mod on my FE. From 1975ish to 2088-2100, depending on whether it crosses 55C or not lol.


Thinking about doing this. I just put the EVGA Hybrid cooler for the 10 series on last night and now I don't cross 50C and can manage 2025-2100MHz. Wondering if the TDP mod will push this a bit more..


----------



## nexxusty

Not pretty, as the damn thing is painted instead of being black anodized.

But functional. I think I did a good job with the grinding; there were so many edges to take care of after cutting.

Now it's exactly what you'd get from EVGA or the artisan store here.

Also hatched this while grinding....


----------



## juniordnz

After some further testing on the FTW versus FE BIOS:


Spoiler: FTW Slave BIOS









Spoiler: FE BIOS







TL;DR: The FE BIOS got me one clock bin up (it actually hits 2126), a few degrees lower, lower TDP%, and all that at 1.062V versus 1.093V on the FTW's original BIOS. Oh, and almost 200 points more in Fire Strike.

Go figure...


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> After some further testing on the FTW versus FE BIOS:
> TL;DR: The FE Bios got me 1 clock up (it hits 2126 actually), a few degrees less, less TDP% and all that with 1.062V against 1.093V from FTW original BIOS. Oh, and almost 200 points more on FS.
> Go figure...


What was your fan RPM @ 100% on the factory BIOS? I'd be careful not to blow those fans with the FE BIOS.


----------



## DStealth

Quote:


> Originally Posted by *juniordnz*
> 
> After some further testing on the FTW versus FE BIOS:
> 
> TL;DR: The FE Bios got me 1 clock up (it hits 2126 actually), a few degrees less, less TDP% and all that with 1.062V against 1.093V from FTW original BIOS. Oh, and almost 200 points more on FS.
> 
> Go figure...


Quite similar (pictures, I mean).
Go try the Strix T4 BIOS; for me it's much faster in pretty much everything tested, both clock-for-clock and maxed out.
Example: at my 24/7 settings it just pushed the memory further... stock cooler on my Palit 1080 card.


----------



## GanGstaOne

Guys who want to use the Kraken G10, take a look at the Arctic Cooling Liquid Freezer: the 120 has a 49mm-thick radiator and the 240 a 38mm one. I'm definitely getting one of these beasts.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> What was you fans RPM @ 100% on factory bios? i'd be carefull not to blow those with FE bios


2700 RPM on the factory BIOS. I had to recalculate the fan % to get the same results, since the FE BIOS was pushing it to almost 4000.







68% on the FE BIOS gives me the same 2700 RPM as the FTW BIOS.

It's nice to get slightly better results with less voltage and lower temps. That's ALWAYS much appreciated hehe

Would there be any downside to using the FE BIOS on my FTW? Also, the TDP% is much lower with the FE BIOS; that's awesome too.
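For anyone redoing the same fan recalculation across BIOSes, the new duty cycle is just the target RPM divided by the new 100% RPM. A minimal sketch (the 2700/4000 RPM figures come from this post; the helper name is made up):

```python
def equivalent_duty(target_rpm, max_rpm):
    """Fan duty-cycle % that reproduces target_rpm on a BIOS
    whose 100% setting corresponds to max_rpm."""
    return round(100 * target_rpm / max_rpm)

# FTW BIOS maxes out around 2700 RPM; the FE BIOS pushes ~4000 RPM at 100%,
# so to keep the same 2700 RPM on the FE BIOS:
print(equivalent_duty(2700, 4000))  # -> 68
```

This is only the linear approximation; real fan curves aren't perfectly linear in duty cycle, so verify the RPM readout afterwards.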
Quote:


> Originally Posted by *DStealth*
> 
> Quite similar
> 
> 
> 
> 
> 
> 
> 
> ) Pictures I mean.
> Go and try Strix T4 for me is much faster in pretty everything tested clock per clock and maxed
> Example 24/7 settings with it just pushed the memory further...stock cooler on my Palit 1080 card


Tried the XOC BIOS yesterday; it bricked my card







All that voltage would be too much on air in a hot country anyway...


----------



## juniordnz

Quote:


> Originally Posted by *GanGstaOne*
> 
> Guys that want to use Kraken G10 take a look at Arctic Cooling Liquid Freezer 120 one is 49mm and 240 is 38mm thick radiator i'm defenetly getting one of these beasts




Now that's one beefy radiator LOL

This one is also a gen5 Asetek like the H80i V2, H100i V2 and H115i. Nice to see it comes with 4 fans too, push/pull ready.

If only I could get one of those here in Brazil. Also, we get a 5-year Corsair warranty and support here; that would be hard to get with Arctic.


----------



## KillerBee33

Quote:


> Originally Posted by *juniordnz*
> 
> 2700rpm Factory. Had to recalculate the fan % to get the same results, since FE BIOS was pushing it to almost 4000
> 
> 
> 
> 
> 
> 
> 
> 68% on FE BIOS gives me the same 2700 from the FTW.
> 
> It's nice to get slightly better results with less voltage and lower temps. That's ALWAYS much appreciated hehe
> 
> Would there be any downside on using FE bios with my FTW? Also, the TDP% is much lower with the FE Bios, that's awesome too.


Other than the fan RPM issue there can't be a downside, but I'd search for your vendor's FE BIOS just to see the difference.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> 
> 
> Now that's one beefy radiator LOL
> 
> This one is also a 5gen Asetek like H80i V2, H100i V2 and H115i. It's to see it comes with 4 fans too, push and pull ready.
> 
> If only I could get one of those here in Brazil. Also, we get 5 years Corsair warranty and support here, that would be difficult to get with arctic


You could get one.

Trusted member... PayPal him, he buys it and ships it.

I'm sure more than a few people would be willing.

*edit*

No Scyther love? Pfft... Dragonite whores... lol.


----------



## juniordnz

Quote:


> Originally Posted by *KillerBee33*
> 
> Other than FAN rpm issue there cant be a downside but i'd search for your Vendors FE BIOS just to see the difference


I'll see if I can find an EVGA FE BIOS. Couldn't find one in the techpowerup database.
*
If anyone here with an EVGA FE could be so kind..







*
Quote:


> Originally Posted by *nexxusty*
> 
> You could get one.
> Trusted member... PayPal him, he buys it and ships it.
> I'm sure more than a few people would be willing.
> *edit*
> No Scyther love? Pfft... Dragonite whores... lol.


That's interesting. PMed you.









sry, I'm just not a pokemon guy


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> I'll see if I can find an EVGA FE BIOS. Couldn't find one on techpowerup database.
> *
> if anyone here with an EVGA FE coud be so kind..
> 
> 
> 
> 
> 
> 
> 
> *
> That's interesting. PMed you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> sry, I'm just not a pokemon guy


I have an EVGA FE. Latest BIOS too.

As soon as my rig is up I'll dump it for you.

Bro... I'm 33 and still play it. I see 45-year-old women playing, for god's sake.

It's not even the pokemon aspect that's the best. It's getting out and using your GPS to find these little guys. It's so fun.

If you have kids or a niece or nephew or cousins.... it's perfect. Trust me.


----------



## GreedyMuffin

I'm 16. So I don't need an excuse lol. ^^

Wondering if I want to flash my card back to stock FE bios. Can drop mine in here if you still need it.


----------



## nexxusty

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I'm 16. So I don't need an excuse lol. ^^
> 
> Wondering if I want to flash my card back to stock FE bios. Can drop mine in here if you still need it.


Wouldn't have guessed you were 16. I knew you were younger, but not 16.

You're on your way to being a respected member. Keep it up man.

Lol, I am always told I look 23, so it's not too bad. I blend in well. The hard part is all the drunk teenage girls in tight pants asking you how to play... ugh.

It's not fair.... lol.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> I have an EVGA FE. Latest BIOS too.
> As soon as my rig is up I'll dump it for you.
> 
> Bro... I'm 33 and still play it. I see 45 year old women playing for god sakes.
> It's not even the pokemon aspect that's the best. It's getting out and using your GPS to find these little guys. It's so fun.
> If you have kids or a niece or nephew or cousins.... it's perfect. Trust me.


That would be much appreciated if you could do so.

I got no beef with Pokemon Go, bro. Everybody plays it here; my college teacher caught one in the middle of class yesterday







it's just not my thing I guess.
Quote:


> Originally Posted by *GreedyMuffin*
> 
> I'm 16. So I don't need an excuse lol. ^^
> 
> Wondering if I want to flash my card back to stock FE bios. Can drop mine in here if you still need it.


That would be nice, thanks!


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> That would be much appreciated if you could do so.
> 
> I got no beef with pokemon go, bro. Everybody plays it here, my college teacher got one in the middle of the class yesterday
> 
> 
> 
> 
> 
> 
> 
> it's just not my thing I guess.
> That would be nice, thanks!


Have you even tried it though?









I'll stop. Haha.

Your college teacher? Hahaha. So funny.


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> I'll see if I can find an EVGA FE BIOS. Couldn't find one on techpowerup database.
> *
> if anyone here with an EVGA FE coud be so kind..
> 
> 
> 
> 
> 
> 
> 
> *
> That's interesting. PMed you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> sry, I'm just not a pokemon guy


Techpowerup has it in Unverified uploads

https://www.techpowerup.com/vgabios/?architecture=Uploads&manufacturer=EVGA&model=GTX+1080&interface=&memType=&memSize=


----------



## stxe34

Well, I have tried the XOC BIOS at 2101MHz and higher, and I have decided to go back to the stock FE BIOS as there is no gain in performance, just heat. I have an SLI setup. 2088MHz seems to be the highest-performing clock. Anyone else noticed this?


----------



## bfedorov11

Here is mine.. EVGA FE.. rename it to .rom

gp104.csv 251k .csv file


I was wondering how these cards would perform with less voltage, since PT is still an issue. Does any other software allow you to lower voltage besides the old 4.x Precision X? Overclocking works with it, but it won't unlock voltage control.


----------



## GreedyMuffin

Quote:


> Originally Posted by *juniordnz*
> 
> That would be nice, thanks!


Here you go buddy!

Evga1080FEROM.zip 149k .zip file


You need to un-zip it.
Quote:


> Originally Posted by *nexxusty*
> 
> Wouldn't of guessed you were 16. I knee you were younger, but not 16.
> 
> You're on your way to being a respected member. Keep it up man..


Thanks!

Most people don't think I'm 16. The reason I got this hardware is Folding@home (folding for other folders, getting folding gear, etc.). So my parents and I teamed up for folding! 3x 980 Tis 24/7 and 1x 1080 half-time. (At least when it's colder, and not so hot.







)


----------



## nexxusty

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Here you go buddy!
> 
> Evga1080FEROM.zip 149k .zip file
> 
> 
> You need to un-zip it.
> Thanks!
> 
> Most people don't think I'm 16. The reason I got this hardware is Folding@home (folding for other folders, getting folding gear, etc.). So my parents and I teamed up for folding! 3x 980 Tis 24/7 and 1x 1080 half-time. (At least when it's colder, and not so hot.
> 
> 
> 
> 
> 
> 
> 
> )


That's pretty freakin cool. Good on your parents. Huge.


----------



## GreedyMuffin

Quote:


> Originally Posted by *nexxusty*
> 
> That's pretty freakin cool. Good on your parents. Huge.


Thanks man! Really appreciate it!








Quote:


> Originally Posted by *bfedorov11*
> 
> I was wondering how these cards would perform with less voltage since PT is still an issue. Does any other software allow you to lower voltage besides the old 4.x precision x? Overclocking works with it, but it won't unlock voltage control.


As far as I know, only editing the ROM/BIOS would allow for a voltage *decrease*, sadly. Would be fun to see how far down I could go at, e.g., 2100MHz.








Quote:


> Originally Posted by *stxe34*
> 
> well i have tried the xoc bios at 2101 mhz and higher and i have decided to go back to the stock fe bios as there is no gain in performance, just heat. i have an sli setup. 2088 mhz seems the highest performing clock. anyone else noticed this?


Yeah. I flashed my GPU back to my old stock bios that came with it. Since I was running stock voltage I didn't really need the XOC bios, and it lowered my scores. Seems like going with a FE was a smart choice after all.


----------



## Gabkicks

My EVGA GTX 1080 FTW won't go beyond 1.075V under load, according to Precision X's and Afterburner's monitoring... is this normal?

Here is a 3DMark run I just did: http://www.3dmark.com/3dm/14074612?


----------



## KillerBee33

Quote:


> Originally Posted by *Gabkicks*
> 
> My EVGA GTX 1080 FTW won't go beyont 1.075v according to precisionx and afterburner's monitoring under load... is this normal?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> heere is a 3dmark run i Just did . http://www.3dmark.com/3dm/14074612?


Core clock 2,152MHz. Higher voltage isn't always better.


----------



## juniordnz

Jeez, everybody is winning the silicon lottery but me..

EDIT:

Now that's odd... I couldn't get the same overclock with the EVGA FE BIOS that I did with the ASUS FE BIOS.


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> gzus, everybody is winning the silicon lottery but me..
> 
> EDIT:
> 
> Now that's odd...couldn't get the same overclock I did with ASUS FE with EVGA FE.


The ASUS FE BIOS version ends with 00.01, the EVGA one with 00.80; it's actually a different BIOS. All FE BIOSes end with 00.01 except EVGA's.


----------



## Starkinsaur

Juniordnz, it looks like you may have accidentally posted the same screenshot for both BIOSes on the last page. Thanks for doing that test, btw; it's very curious to see the FE BIOS producing better results at reduced clocks. I wonder if the difference in measured TDP and VDDC has to do with differences in the monitoring circuits on these boards, i.e. the two BIOSes interpreting the signals differently.

Does anyone have the Gigabyte FE BIOS? I'd like to try the same thing on my G1 Gaming card (25k in Fire Strike on custom water with the TDP mod and Xtreme Gaming BIOS @ 2177MHz atm).


----------



## GanGstaOne

Quote:


> Originally Posted by *Starkinsaur*
> 
> Juniordnz, it looks like you may have accidentally posted the same screenshot for both bioses on the last page. Thanks for doing that test btw; it's very curious to see the FE bios producing better results at reduced clocks. I wonder if the difference in measured TDP and VDDC has to do with differences in monitoring circuits within these boards? Ie the two bios interpret the signals differently.
> 
> Does anyone have the Gigabyte FE bios? I'd like to try the same thing on my G1 gaming card (25k in firestrike on custom water with TDP mod and xtreme gaming bios @2177mhz atm)


1080 G1 Gaming here, same thing: higher score and a very good overclock with the FE BIOS. TechPowerUp has the Gigabyte FE BIOS in the unverified uploads.


----------



## Starkinsaur

Sorry for the double post.

Gabkicks, can you confirm that the AB voltage offset slider is at 100% and that, in the curve, the 1093mV frequency offset exceeds the offset for the 1075mV point? If the curve goes flat above a certain point, it seems the card won't bother boosting to those higher points.
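The flat-curve behavior described above can be sketched: if the frequency curve stops rising past some voltage point, boost has no reason to go above it. A hypothetical example (the curve values are invented, not from any real card):

```python
def max_useful_voltage(curve):
    """Given a list of (mV, MHz) points sorted by voltage, return the
    lowest voltage that already reaches the curve's peak frequency;
    boost gains nothing from going higher."""
    peak = max(freq for _, freq in curve)
    return min(mv for mv, freq in curve if freq == peak)

# Flat-topped curve: no frequency gain above 1075 mV,
# so the card never bothers boosting to the 1093 mV point.
curve = [(1000, 2000), (1050, 2050), (1075, 2088), (1093, 2088)]
print(max_useful_voltage(curve))  # -> 1075
```

So if your monitoring shows voltage capped below the slider maximum, check whether the curve's top is flat; raising the offsets on the higher-voltage points is what lets boost use them.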


----------



## BigBeard86

I have a watercooled EVGA 1080 FE. My OC stays solid at 2126MHz and the card does not use more than stock voltage. However, the power limit is constantly hit, and I feel that is keeping my OC from going higher.

Is there a BIOS yet to change this? What about the physical mod: where can I find the best video/tutorial, and is it reversible?


----------



## juniordnz

Just got my maximum overclock with the EVGA FE BIOS: 2126MHz @ stock voltage and 120% TDP. With the FTW BIOS I could get a maximum of 2113MHz @ 1.093V and 130% TDP. Also, after a few benchmarks, my TDP never hits 80%, while with the FTW BIOS it would go over 100% in the same test, same conditions.

That's crazy IMO. And like it was said above, I wonder if different BIOSes make GPU-Z/AB read the information differently. It's not possible that EVGA, or any of the other companies for that matter, would ship a BIOS that makes their product WORSE. Not in a market where you have to beat the competition on higher clocks, lower temps, everything.

Really, it doesn't make ANY sense. The results are there, but there's no logic to it.


----------



## nexxusty

Finished.

I run a Core X9, so I had to orient the tubing toward the front of the GPU; I need all the tubing length I can get to mount the rad.

Anyone with a smaller case could orient the tubing toward the back of the GPU and still have the stock fan shroud on there too.

It won't make much, if any, difference though. Just aesthetics...

So... best Kraken G10 mod or what?

I'm going to throw the bracket in turpentine to strip the paint and repaint it once I take the card apart for maintenance.


----------



## Starkinsaur

BigBeard86:
We're still locked out of BIOS mods at this point. Some of the AIB cards allow for a higher TDP if you decide to flash them. However, as juniordnz is finding, the AIB BIOSes may come with a drop in performance relative to the FE BIOS.

If you want to do the shunt mod, it's very easy. Get a tube of Coollaboratory Liquid Ultra (or Pro) and use a q-tip to apply the liquid metal to the top of the resistors circled in the image below. Rub well toward the conductive ends of the resistor to ensure a good connection; you're basically just shorting out the resistor. The liquid metal can be removed later with turpentine without a trace, probably.
Reported TDP will drop significantly. The reported number doesn't seem to be the same one the card uses to decide when it hits the power limit, though, so you'll have to trial it to find the limits.
I disclaim any liability for this information.
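The reason reported TDP drops is plain parallel-resistor math: the controller infers current from the voltage across the shunt, so lowering the effective shunt resistance makes it under-read proportionally. A rough sketch; the 5 mOhm shunt value is an assumption (check your own board), and the function names are made up:

```python
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_fraction(r_shunt, r_bypass):
    """Fraction of the true power the card reports after bridging the
    shunt with a bypass path of resistance r_bypass (same units)."""
    return parallel(r_shunt, r_bypass) / r_shunt

# Assumed 5 mOhm shunt bridged by a ~5 mOhm blob of liquid metal:
# the card would report roughly half the real power draw.
print(round(reported_fraction(0.005, 0.005), 2))  # -> 0.5
```

This also shows why solder (near-zero bypass resistance) drives the reading toward zero, which may be what triggers the idle clock-lock behavior discussed below the shunt posts: liquid metal leaves a higher, safer bypass resistance.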


----------



## jedimasterben

Quote:


> Originally Posted by *karelbastos*
> 
> With TDP mod, short mod.
> 
> what is your max voltage ?
> 
> I'm thinking if is a worth to do thw TDP hard mod..
> 
> Thanks


1.093v rock steady.
Quote:


> Originally Posted by *cj0612*
> 
> Thinking about doing this, I just put the EVGA hybrid cooler for the 10 series on last night and now I don't cross 50C and I can manage 2025-2100mhz. Wondering if the tdp mod will push this a bit more..


Doooo ittttt
Quote:


> Originally Posted by *Starkinsaur*
> 
> Bigeard86;
> We're still locked out of bios mods at this point. Some of the AIB cards allow for a higher TDP if you decide to flash them. However, as juniordnz is finding, the AIB bioses may be associated with a drop in performance relative to the FE bios.
> 
> If you want to do the shunt mod, it's very easy. Get a tube of Coolaboratory Liquid Metal Ultra (or pro) and use a q-tip to apply the liquid metal to the top of the resistors circled in the image below. Rub well towards the conductive ends of the resistor to ensure there is a good connection. You're basically just shorting out the resistor. The liquid metal can be removed later with turps without a trace - probably.
> Reported TDP will drop significantly. Seems the reported number is not the same as what the card is using to determine when to hit Pwr limit though so you'll have to trial it to find the limits.
> I disclaim myself entirely from liability for this information.


Some people (trusted ones, not just Joe Schmoes) are saying that this mod locks clock speeds at low desktop idle, but I shorted the three resistors in question with 63/37 solder and a small piece of 18AWG solid wire, don't have that problem, and have no attainable TDP limit.


----------



## Starkinsaur

Quote:


> Originally Posted by *jedimasterben*
> 
> Some people (though trusted, not just joe schmoes) are saying that this mod locks clock speeds at low desktop idle, but I shorted the three resistors in question with 63/37 solder and a small piece of 18AWG solid wire, and don't have that problem, and have no attainable TDP limit


That's a good point. I imagine this is a result of the card detecting that the voltage drop across the shunt resistors is approaching 0 and responding by locking the clocks at 291MHz (from memory). I'd suggest you're less likely to encounter this issue when using CLU, as its resistance is higher compared to solder and wire. Further, with CLU the mod is more easily reversed.


----------



## cj0612

Quote:


> Originally Posted by *jedimasterben*
> 
> 1.093v rock steady.
> Doooo ittttt
> Some people (though trusted, not just joe schmoes) are saying that this mod locks clock speeds at low desktop idle, but I shorted the three resistors in question with 63/37 solder and a small piece of 18AWG solid wire, and don't have that problem, and have no attainable TDP limit


Once I get my Conductonaut in, I plan on it. Since all I'm using is the liquid metal, I plan on just rubbing it over the one resistor under the 8-pin like in the video. Do you think that will work, or do I need to do more?


----------



## Phinix

In Afterburner, what does your curve look like? After 1.012V my curve flatlines.


----------



## JaredC01

Quote:


> Originally Posted by *Starkinsaur*
> 
> Bigeard86;
> We're still locked out of bios mods at this point. Some of the AIB cards allow for a higher TDP if you decide to flash them. However, as juniordnz is finding, the AIB bioses may be associated with a drop in performance relative to the FE bios.
> 
> If you want to do the shunt mod, it's very easy. Get a tube of Coolaboratory Liquid Metal Ultra (or pro) and use a q-tip to apply the liquid metal to the top of the resistors circled in the image below. Rub well towards the conductive ends of the resistor to ensure there is a good connection. You're basically just shorting out the resistor. The liquid metal can be removed later with turps without a trace - probably.
> Reported TDP will drop significantly. Seems the reported number is not the same as what the card is using to determine when to hit Pwr limit though so you'll have to trial it to find the limits.
> I disclaim myself entirely from liability for this information.


Quote:


> Originally Posted by *jedimasterben*
> 
> 1.093v rock steady.
> Doooo ittttt
> Some people (though trusted, not just joe schmoes) are saying that this mod locks clock speeds at low desktop idle, but I shorted the three resistors in question with 63/37 solder and a small piece of 18AWG solid wire, and don't have that problem, and have no attainable TDP limit


I did the 10-ohm resistor mod on top of the capacitors for the actual sense chip; TDP reads considerably lower as well. If the card doesn't lock to slow clock speeds with the shunt resistor(s) shorted, all good. If it does, do the 10-ohm mod.


----------



## ucode

Quote:


> Originally Posted by *Phinix*
> 
> In afterburner what does your curve look like, after 1.012mV my curve flat lines.


AFAIK that's normal for the default. You can change it, though.

This is the default for the Galax FE. It has a weird dip at the beginning, but AFAIK you won't see this in Afterburner, as it only shows from 0.8V to 1.2V IIRC.


----------



## gree

I ordered a 4K monitor for my 1080; should I be expecting an average of 30-45 FPS?

The games I play are older, like Assetto Corsa, Mortal Kombat, and Crysis.


----------



## ralphi59

Hi man.
Pick a G-Sync monitor.
It's a dream with only one GTX 1080.


----------



## gree

I got the Acer XB271HK with G-Sync.

What's your average frame rate?


----------



## ralphi59

I have the XB321HK.
In the majority of games, 60 FPS without AA.
The Witcher 3, for example, is between 45 and 60.
Butter smooth with G-Sync.
All games are butter smooth.
G-Sync is fantastic.


----------



## gree

Sweet, I'm excited; can't imagine what two 1080s would do.
Lucky you lol, 32" is too expensive; I got the 27" for $600.


----------



## TWiST2k

I don't even care to game at 4K; I am very happy with 2560x1440, but I would love to get a nice IPS G-Sync monitor and have no idea which to get. I currently have a Q-Nix and it is beautiful, but no G-Sync.


----------



## VPII

I seem to have run into a problem.... Sorry if my post below gives too much detail, but I figured it would be good to give as much as possible.

My EVGA GTX 1080 Founders Edition with a Hybrid mod has been running perfectly well at 2139MHz core and 11200MHz memory. About two weeks ago, due to stupidity on my part, I messed up the CPU socket pins on my RVE and decided to get an Asus X99 Strix Gaming; in doing so I went through the process of reinstalling Windows 10. Last weekend I did a dry-ice bench session with only my 5930K under dry ice, sitting at 5.55GHz on 2 cores and 5.3GHz on all 6, but I was using a fresh installation of Windows 8.1 on another hard drive. All seemed to work okay, though I mostly ran some older 3D benchmarks and some 2D benchies.

Today, when I tried 3DMark Fire Strike, Time Spy and even 3DMark11, I found I am getting what could be artifacts: red, blue, green and purple flares (good quality, actually) popping up while benching, to the point where the benchmark fails. I tried reverting to an older driver, but the problem was the same. A GPU shouldn't degrade that far that quickly. Stock seems to work 100%, but it now overclocks only to 2078MHz core and +500 memory.

In case you question my results with the card, here is the link to my hwbot profile: http://hwbot.org/user/vpii/


----------



## nyk20z3

For anyone looking to pick up an Xtreme 1080 on the cheap; I picked up an Xtreme Gaming 980 Ti myself, since I like its aesthetics over the 1080 version:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125873R


----------



## LeviathanVI

Quote:


> Originally Posted by *IronAge*
> 
> @LeviathanVI
> 
> You might also want to check out those ones ... pretty good price for a custom PCB with 8+2 Phase VRM Design.
> 
> http://it-supplier.co.uk/gigabyte-geforce-gtx-1080-windforce-oc-8gb-gv-n1080wf3oc-8gd
> 
> FC waterblock + backplates available from EK:
> 
> https://www.ekwb.com/configurator/step1_complist?gpu_gpus=2110


@IronAge

When I try to add it to my cart, it adds a qnap NAS to my cart instead.
https://i.gyazo.com/8d348e2b3fc5161b456dfcfe7d58dd4c.png









Back to the drawing board, I guess.

Edit. This is a headache, so I think I'm just going to buy 2 of these based on their warranty and out of the box speed. The RGB lighting might be nice too.

Hope I'm making the right decision.


----------



## StreaMRoLLeR

I need help: my Strix 1080 is running hot (54% fan, 76C) in the bottom x8 slot. The case is an Obsidian 900D, but when I first tried it this morning in the top x16 slot the card ran at 67-68C, as review sites stated. What's wrong here? Please help. Also, the card is a poor overclocker: max 2050MHz (voltage makes no difference).


----------



## IronAge

Quote:


> Originally Posted by *LeviathanVI*
> 
> @IronAge
> 
> When I try to add it to my cart, it adds a qnap NAS to my cart instead.
> https://i.gyazo.com/8d348e2b3fc5161b456dfcfe7d58dd4c.png


Solution:

Search the shop for the SKU GV-N1080WF3OC-8GD and use the Add to Cart button in the search results ... I got the Gigabyte added to the cart that way.


----------



## ralphi59

Got my 32" for 999 euros.
It is absolutely fantastic.
Incredible.


----------



## LeviathanVI

Quote:


> Originally Posted by *IronAge*
> 
> Solution:
> 
> Search the Shop for the SKU GV-N1080WF3OC-8GD and use Add to Cart Button in the search result ... got the Gigabyte added to the cart that way.


Think I broke it.








https://i.gyazo.com/f9d62f78ebbcc16266f05e57deaf6f88.png

Got it. Just pasted the name instead and it worked.

Thank you so much!


----------



## ralphi59

Gaming at 4k gsync ips = win


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> 
> 
> Finished.
> 
> I run a Core X9 so I had to orient the tubing toward the front of the GPU. Need all the tubing length I can get to mount the Rad.
> 
> Anyone with a smaller case could orient the tubing toward the back of the GPU and still have the stock fan shroud on there too.
> 
> It will not make much if any difference though. Just aesthetics...
> 
> So... best Kraken G10 mod or what?
> 
> I'm going to throw it in turpentine to get rid of the paint and re paint the bracket once I take the card apart for maintenence.


So, any chance of having the 2 pieces of Shroud back ?


----------



## toncij

Quote:


> Originally Posted by *LeviathanVI*
> 
> @IronAge
> 
> When I try to add it to my cart, it adds a qnap NAS to my cart instead.
> https://i.gyazo.com/8d348e2b3fc5161b456dfcfe7d58dd4c.png
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to the drawing board, I guess.
> 
> Edit. This is a headache, so I think I'm just going to buy 2 of these based on their warranty and out of the box speed. The RGB lighting might be nice too.
> 
> Hope I'm making the right decision.


I have two of these FTWs - both cards have consecutive serials and clock the same - 2114/+600 (ends up 2114/5620) and damn fast...


----------



## LeviathanVI

I bought 2 GA Windforce OCs in the end. Saved £200 which I can spend on the EK HB bridge.


----------



## Derpinheimer

Quote:


> Originally Posted by *toncij*
> 
> I have two of these FTWs - both cards have consecutive serials and clock the same - 2114/+600 (ends up 2114/5620) and damn fast...


Just out of curiosity, did you pick +600 memory because it's the max non-artifacting value, or for best performance? Mine will go to +900 without problems, but best performance is at +575.


----------



## xTesla1856

I have an MSI Gaming X coming next week as well as an EK Predator 360. Has anyone tried the EK TF6 blocks yet?


----------



## TapTwo

Every time I attempt to flash the BIOS on my Gigabyte 1080 G1 it gives me a "no NVIDIA display adapter found" error.



Please, I need some help with the revving fan issue on this card. I have to update from the F1 BIOS to F2, but it will not take the commands or flash the BIOS no matter what I do.


----------



## IronAge

Have you tried disabling the GTX 1080 in Device Manager and executing the program with administrator rights?

(Assuming you have downloaded the BIOS update from the Gigabyte homepage.)


----------



## wangle0485

Quote:


> Originally Posted by *Streamroller*
> 
> I need help. My Strix 1080 is running hot (54% fan, 76C) in the bottom x8 slot. The case is an Obsidian 900D, but when I first tried it this morning in the top x16 slot the card ran 67-68C, as review sites stated. What's wrong here? Please help. Also, the card is a poor overclocker, max 2050 MHz (voltage makes no difference).


My Super JetStream would crash in Firestrike at anything over 2050. I repasted it today with Arctic MX-2 I had on hand and can now get 2078. It runs exactly the same temperature as before, but clocks better and scores an extra 150 in Fire Strike.


----------



## CoolRonZ

hi everyone, here's my MSI Sea Hawk EK X; seems like there aren't many of these kicking around. If only the voltages were unlocked...


----------



## gree

Quote:


> Originally Posted by *CoolRonZ*
> 
> hi everyone, here's my MSI Sea Hawk EK X; seems like there aren't many of these kicking around. If only the voltages were unlocked...


Sweet but why a seahawk instead of a gaming z?


----------



## CoolRonZ

Quote:


> Originally Posted by *gree*
> 
> Sweet but why a seahawk instead of a gaming z?


Price. The Sea Hawk was cheaper than buying the cheapest GTX 1080 and an EK block, and that's not even including the backplate.....


----------



## TapTwo

Quote:


> Originally Posted by *IronAge*
> 
> Have u tried disabling the GTX1080 in the device manager and execute the programm with administrator rights ?
> 
> (assuming you have dowmloaded the Bios Update from Gigabyte Homepage)


Yeap tried that with the same result unfortunately


----------



## GanGstaOne

Quote:


> Originally Posted by *TapTwo*
> 
> Yeap tried that with the same result unfortunately


Just download the BIOS file from TechPowerUp (it ends with .FD) and flash it with nvflash or the Zotac program.


----------



## SAFX

Why is the EVGA 1080 Classified impossible to find? heck, even evga.com doesn't sell it, wth?


----------



## TWiST2k

Quote:


> Originally Posted by *SAFX*
> 
> Why is the EVGA 1080 Classified impossible to find? heck, even evga.com doesn't sell it, wth?


Because it's totally overpriced, lol. I have a 980 Ti Classy and it was an OK card, but this $50+ premium, ugh. FTW all the way, man.


----------



## Snabeltorsk

Quote:


> Originally Posted by *gree*
> 
> Sweet but why a seahawk instead of a gaming z?


Because it is watercooled?


----------



## pantsoftime

Quote:


> Originally Posted by *SAFX*
> 
> Why is the EVGA 1080 Classified impossible to find? heck, even evga.com doesn't sell it, wth?


There are a few reports in this thread that classifieds are regularly in stock at microcenter.


----------



## turtletrax

Just got my Heatkiller IV blocks and backplates from Aquatuning.us and I'm not very impressed.

They made a shipping box by cutting down a bigger one, which was ridiculous. On top of that, they put the backplates on an outside edge with no extra packaging (just the thin plastic bag they come in), and UPS beat the tar out of it. I opened the box to find both backplates with dented corners, one of them missing 7 screws that fell out of the bag, and their ridiculous shipping box.

Then I opened the Heatkiller blocks to find you can't even see through the acrylic tops, they were so badly machined. Now I have to polish them by hand. For the cost, I have to say I'm not very impressed.


----------



## gree

Quote:


> Originally Posted by *CoolRonZ*
> 
> Price. The Sea Hawk was cheaper than buying the cheapest GTX 1080 and an EK block, and that's not even including the backplate.....


Fair enough







I got their overpriced Gaming X card because I didn't want to wait for their other 1080s. Have to say, for what I use it for (1440p gaming) it's pretty great.


----------



## CoolRonZ

Quote:


> Originally Posted by *gree*
> 
> Fair enough
> 
> 
> 
> 
> 
> 
> 
> i got their overpriced gaming X card cos i didnt want to wait for their other 1080s. Have to say for what i use it for its pretty great. (1440p gaming)


nice! I was a little surprised when they blocked the X and not the Z myself.... but hey.... also, for some odd reason no one seems to be buying the Sea Hawk EK here in Canada; it's one of the few cards that's actually in stock and has remained in stock.... but it's got a rock solid 1987 clock out of the box.


----------



## CoolRonZ

Quote:


> Originally Posted by *turtletrax*
> 
> Just got my Heatkiller IV blocks and backplates from Aquatuning.us and I'm not very impressed.
> 
> They made a shipping box by cutting down a bigger one, which was ridiculous. On top of that, they put the backplates on an outside edge with no extra packaging (just the thin plastic bag they come in), and UPS beat the tar out of it. I opened the box to find both backplates with dented corners, one of them missing 7 screws that fell out of the bag, and their ridiculous shipping box.
> 
> Then I opened the Heatkiller blocks to find you can't even see through the acrylic tops, they were so badly machined. Now I have to polish them by hand. For the cost, I have to say I'm not very impressed.


just wow.. so sorry to hear that.... the last couple of XSPC blocks I've gotten were pretty shoddy too... I've gone EK and never had any reason to go anywhere else personally. I always thought Heatkillers were right up there in quality though, but I have never owned any, mainly due to lack of availability here in Canada....


----------



## turtletrax

Quote:


> Originally Posted by *CoolRonZ*
> 
> just wow.. so sorry to hear that.... the last couple xspc blocks I have gotten were pretty shoddy too... have gone EK and never had any reason to go anywhere else personally. but I always thought heat killers were right up there in quality tho, but I have never owned any, mainly due to lack of availability here in Canada....


Ya, I have had success and failure with all brands to be honest, but was really hoping Watercool would remain at the top of my list. The good thing is the actual waterblocks can be fixed with some elbow grease and acrylic polish, but the backplates will never go in my rig damaged. For what I paid for them, and for shipping from Germany, I will fight for a replacement or refund. I am in Canada (Alberta) as well, and after I ordered, a shipment came into Dazmode.com, or I would have ordered from him for sure. The packaging from Aquatuning was pitiful. Daz doesn't roll like that...

I am sure the Heatkillers will be pretty awesome once I find time to polish them









Here is some photos...


----------



## CoolRonZ

Quote:


> Originally Posted by *turtletrax*
> 
> Ya, I have had success and failure with all brands to be honest, but was really hoping Watercool would remain at the top of my list. The good thing is the actual waterblocks can be fixed with some elbow grease and acrylic polish, but the backplates will never go in my rig damaged. For what I paid for them, and for shipping from Germany, I will fight for a replacement or refund. I am in Canada (Alberta) as well, and after I ordered, a shipment came into Dazmode.com, or I would have ordered from him for sure. The packaging from Aquatuning was pitiful. Daz doesn't roll like that...
> 
> I am sure the Heatkillers will be pretty awesome once I find time to polish them


oh ok, all the best on that one, bro. I have personally never bought from Aquatuning; it's either Swiftech, EK or Performance-PCs if I had to go outside Canada.
ya, I love Dazmode, I try to support Daz as much as I can. He's got a good amount of EK blocks for the 1080/1070 FE last time I checked.


----------



## Bishop07764

Originally had a Pascal Titan X in my cart, I just couldn't pull the trigger. My self control didn't win out for a 1080 though. MSI Seahawk EK here as well. Also boosts to about 1970 or so out of the box. Not a huge fan of acrylic threads but EK has done it again. Max temp trying it out tonight was 34 C. Kind of sad to retire my 780 Lightning though. Going to miss that 287% power limit on that beast.


----------



## CoolRonZ

Quote:


> Originally Posted by *Bishop07764*
> 
> Originally had a Pascal Titan X in my cart, I just couldn't pull the trigger. My self control didn't win out for a 1080 though. MSI Seahawk EK here as well. Also boosts to about 1970 or so out of the box. Not a huge fan of acrylic threads but EK has done it again. Max temp trying it out tonight was 34 C. Kind of sad to retire my 780 Lightning though. Going to miss that 287% power limit on that beast.


nice nice, congrats, but I really love the new Titan X... I'm not sure why everyone is afraid of the acrylic threads; I personally prefer the clear, just be careful not to overtighten.. and I really liked the EK/MSI logo, really nice touch, but what's with the SN sticker placement? What a hokey spot to put it...


----------



## IronAge

Quote:


> Originally Posted by *TapTwo*
> 
> Yeap tried that with the same result unfortunately


Disabled the iGPU and cleaned the device entries with DDU?


----------



## toncij

So, how high do Classifieds go?


----------



## IronAge

No better or worse than other custom cards with a huge heat sink.

A forum friend got one and it does 2126 without touching VDDC.

That's a benchmark on the main BIOS, which has a 122% PL; CPU is a 6700K @ 4.2 GHz.

http://www.3dmark.com/3dm/14074426


----------



## pantsoftime

Has anyone done any investigating on the reports that the EVGA FE BIOS seems to be performing/oc'ing better than other FE BIOSes?


----------



## TapTwo

Quote:


> Originally Posted by *GanGstaOne*
> 
> Just download the bios file from techpowerup it ends with .FD and flash with nvflash or zotac program


I continue to get the same error


----------



## IronAge

Clean Device entries with DDU ... had the same, did the same, after cleaning it has worked.


----------



## VPII

Quote:


> Originally Posted by *VPII*
> 
> I seem to have run into a problem.... Sorry if my post below gives too much detail but I figured it would be good to give as much detail as possible.
> 
> My EVGA GTX 1080 Founders Edition with a Hybrid mod has been running perfectly well at 2139 MHz core and 11200 MHz memory. About two weeks ago, due to stupidity on my part, I messed up the CPU socket pins on my RVE and decided to get an Asus X99 Strix Gaming. In doing so I went through the process of reinstalling Windows 10. Last weekend I did a dry ice bench session with only my 5930K under dry ice, sitting at 5.55 GHz on 2 cores and 5.3 GHz on all 6, but I was using a fresh installation of Windows 8.1 on another hard drive. All seemed to work okay, but I mostly ran some older 3D benchmarks and some 2D benchies.
> 
> Today when I tried 3DMark Fire Strike, Time Spy and even 3DMark 11, I found that I am getting what could be artifacts: red, blue, green and purple flares (good quality, actually) popping up while benching, to the point where the benchmark would fail. I tried reverting to an older driver, but the same problem was there. A GPU shouldn't degrade that much that quickly. Stock seems to work 100%, but it overclocks only to 2078 MHz core and +500 memory.
> 
> In case you question my results with the card, well here is the link to my hwbot profile http://hwbot.org/user/vpii/


Seem to have found the problem.... I removed the hybrid cooler, cleaned everything and reseated it, which resulted in a 15 to 17C load temp drop.

Benching, max temps are around 36C and I'm able to get through the benchmark at 2139 MHz core and 11200 MHz memory. Running Doom for an hour or more gives 46C max load temps.

Sent from my SM-G925F using Tapatalk


----------



## TK421

Is there an accessory that plugs into a PCIe 8-pin and shows the power draw through that particular connector?


----------



## IronAge

Nope ... many review sites would be happy to own something like that ... only THG does reliable GPU-only power consumption reports.

As of now it is not possible to measure PCIe slot current without laboratory instruments + PCIe extenders.


----------



## fat4l

Quote:


> Originally Posted by *pantsoftime*
> 
> Has anyone done any investigating on the reports that the EVGA FE BIOS seems to be performing/oc'ing better than other FE BIOSes?


I tried the EVGA SC FE BIOS. It's not good. The original NVIDIA FE BIOS is the best for me for now.


----------



## Bishop07764

Quote:


> Originally Posted by *CoolRonZ*
> 
> nice nice, congrats, but I really love the new titan x... I'm not so sure why everyone is afraid of the acrylic threads, I personally prefer the clear, just be careful not to over tighten.. and I really liked the ek/msi logo, really nice touch, but what's with the SN sticker placement? what a hokey spot to put it...


Thanks. Let me know how yours overclocks. I just dropped it in my loop late last night and didn't test it very long before heading to bed. I'm not real encouraged about the OC potential of mine; I tried a really quick, conservative +50 core that passed Valley but wouldn't work in Doom. I've got a lot more playing around to do with it still, though. Wish it had PCB, VRM, etc. sensors like on the Lightning, but it shouldn't be anything to worry about with a full-cover block. Will have to try the curve OC method when I get a chance. My voltage was only going to 1.04 or so at max. Far cry from the almost 1.3 volts on my old card.


----------



## kcuestag

To those running two 1080 cards in SLI that are NOT reference cards (custom cards only): how are the temperatures on your top card?

I had bad experiences in the past with two GTX 980 Ti Gaming G1s from Gigabyte; the top card ran way too hot (like 10-14ºC hotter than the bottom one). A friend is looking into buying two custom-cooled cards (G1, Xtreme Gaming, FTW, Classified...), though I told him to stick to FEs for SLI due to heat issues on top cards when running custom cards.


----------



## Agavehound

Well Hell.

Got a black screen and 100%+ fans after a session in FO4. The first crash I got was a wonky screen with lots of artifacts, and the fans went crazy fast. I had Precision running the fans at 100% during gameplay because I don't like seeing temps in the 60s, but during the crash they spun up even faster. I restarted my PC and restarted the game, where it promptly crashed again after about 30 seconds. I restarted again and ran FireStrike with no issues, so I thought it was game/mod related. Sounds like it's the card. Double hell.


----------



## ucode

Quote:


> Originally Posted by *TK421*
> 
> Is there an accessory that plugs to a pcie 8-pin and show power draw from that particular pcie slot?


A few of the Corsair PSUs give current draw readings. My board uses only the 12V supply, so I can take measurements without the graphics card at different CPU loads, then subtract those loads when the graphics card is in use to get the graphics power draw. The GPU power reported by HWiNFO ties in very closely with these 12V rail measurements on my Galax FE, so it seems pretty good. Unless, of course, one uses a hard mod or software mod; then only the 12V rail current draw will show the real draw.

Best to use the graph in HWiNFO rather than the max reading to get a better idea of what's going on, though, IMO.
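The subtraction method described above boils down to one line of arithmetic; here is a minimal sketch, where the wattage figures are made up for illustration and are not measurements from any particular card:

```python
def gpu_power_draw(total_12v_watts, cpu_only_baseline_watts):
    """Estimate GPU-only draw by subtracting a CPU-only baseline measured
    at a comparable CPU load, per the 12V-rail method described above."""
    return total_12v_watts - cpu_only_baseline_watts

# Hypothetical readings: 310 W on the 12V rail while gaming,
# 130 W baseline at a similar CPU-only load.
print(gpu_power_draw(310.0, 130.0))  # -> 180.0 W attributed to the GPU
```

The accuracy of the estimate obviously depends on how well the CPU-only baseline matches the CPU load during gaming.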


----------



## TK421

Quote:


> Originally Posted by *ucode*
> 
> A few of the Corsair PSU's give current draw readings. My board uses only 12V supply so can get some measurements without graphic card for different CPU loads then take those loads away when using the graphics card to get graphics power draw. Measurement of GPU power from HWiNFO info ties in very close to these 12V power rail measurements on my Galax FE so seem to be pretty good unless of course one uses a hard mod or software mod then only 12V rail current draw will show the real draw.
> 
> Best to use the graph in HWiNFO rather than max reading to get a better idea of what's going on though IMO.


I can't use HWiNFO anymore since I shorted the three shunt resistors on my 1080, causing the power draw to read lower in software (inaccurate).

I'm not in the market for a new PSU, unfortunately.


----------



## CoolRonZ

my sea hawk ek at default speeds isn't even 4K ready according to 3DMark... lol









http://www.3dmark.com/3dm/14121404?


----------



## xer0h0ur

Quote:


> Originally Posted by *CoolRonZ*
> 
> my sea hawk ek at default speeds isn't even 4K ready according to 3DMark... lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/14121404?


I was about to say why are you only getting 10975 graphics score until I realized you're showing a Firestrike Extreme score. I haven't even run anything but the standard Firestrike on my 1080 so far.


----------



## CoolRonZ

Quote:


> Originally Posted by *xer0h0ur*
> 
> I was about to say why are you only getting 10975 graphics score until I realized you're showing a Firestrike Extreme score. I haven't even run anything but the standard Firestrike on my 1080 so far.


hehehehe

http://www.3dmark.com/fs/9570022


----------



## KillerBee33

Quote:


> Originally Posted by *CoolRonZ*
> 
> hehehehe
> 
> http://www.3dmark.com/fs/9570022


Push it! You should be getting 25.2-25.5K at least.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KillerBee33*
> 
> Push it! You should be getting 25,2 - 25,5 atleast


It looks about right with a medium clocked 4930k pushing the Sea Hawk and those clocks. The gpu score would be higher if the card was being pushed by a 6700k at the same clocks as the 4930k in this case.

My Sea Hawk EK score:


----------



## CoolRonZ

Quote:


> Originally Posted by *KillerBee33*
> 
> Push it! You should be getting 25,2 - 25,5 atleast


sadly it only volts up to 1.05v and 2101.... I need more volts.....


----------



## CoolRonZ

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It looks about right with a medium clocked 4930k pushing the Sea Hawk and those clocks. The gpu score would be higher if the card was being pushed by a 6700k at the same clocks as the 4930k in this case.


So what volts does your Sea Hawk go to? I can only get a default voltage of 1.05, and if I change the voltage at all it seems to max out at 1.043. And that voltage/MHz curve thing is a total sham to me.... all I can do is add 120 to the core clock... and 700-720ish to the memory....


----------



## xer0h0ur

IMO he is doing just fine. I don't really get a hell of a different score on my rig:

http://www.3dmark.com/fs/9311119

I haven't tested since I bumped the 4960X up to 4.5 GHz though. Doubt it would make much of a difference anyways.


----------



## GreedyMuffin

I don't get over 24.3K on my rig. Makes me so mad..

CPU at 4200 or 4700, doesn't matter. :/


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It looks about right with a medium clocked 4930k pushing the Sea Hawk and those clocks. The gpu score would be higher if the card was being pushed by a 6700k at the same clocks as the 4930k in this case


My 6700 is lightly clocked to 4.6, with a low Physics score of 14.5-14.8.


----------



## KillerBee33

Quote:


> Originally Posted by *CoolRonZ*
> 
> sadly it only volts up to 1.05v and 2101.... I need more volts.....


We went through this in much earlier posts; voltage isn't the issue, at least not for 2100 MHz on the 10 series. Something is wrong with the power settings in most AIB BIOSes. Try the MSI Founders BIOS.








Also, your memory clock seems high; it doesn't pay. Try 1377 or just +500 and run the same Firestrike.


----------



## Snabeltorsk

Quote:


> Originally Posted by *CoolRonZ*
> 
> so what volts does you sea hawk go to? I can only get a default voltage of 1.05 and if I change the voltage at all it seems to max out at 1.043. and that voltage/MHz curve thing is a total sham to me.... all I can do is add 120 to the core clock... and 700-720ish to the memory....


1.093 is the max voltage, and on my card I need at least that to run at 2114; at 2126 I sometimes get Vrel.


----------



## fat4l

Quote:


> Originally Posted by *CoolRonZ*
> 
> hehehehe
> 
> http://www.3dmark.com/fs/9570022


Whatssss up









http://www.3dmark.com/fs/9690893

26082 Graphics


----------



## andressergio

No one here with a ZOTAC GTX 1080 AMP! Extreme?


----------



## KillerBee33

Quote:


> Originally Posted by *fat4l*
> 
> Whatssss up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/9690893
> 
> 26082 Graphics


Heh , Good One!


----------



## DStealth

Quote:


> Originally Posted by *KillerBee33*
> 
> Heh , Good One!


Ridiculous one actually








My best so far is 25333 with 2139 at stock voltage (1.062) on the stock cooler. I'm happy with the result...

Will rebench with my new toy when the time allows


----------



## CoolRonZ

Quote:


> Originally Posted by *KillerBee33*
> 
> We went through this in much earlier posts; voltage isn't the issue, at least not for 2100 MHz on the 10 series. Something is wrong with the power settings in most AIB BIOSes. Try the MSI Founders BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> Also, your memory clock seems high; it doesn't pay. Try 1377 or just +500 and run the same Firestrike.


Hmmmm, tried the lower mem clock. I also heard something about the lower the mem clock, the higher the GPU will go, but I don't know. I did manage to follow that one and clock it to 2113 and 11k on the memory. As for changing the BIOS, I'm a bit sketchy on that one; I have used modified or different BIOSes before, but in the end the one it came with has always been the most stable. And even at default speeds and default refresh rate (75 Hz) on a 2560x1080 panel I'm usually at 0.70v around 1.5-1.6 GHz anyway. But thanks for the advice, appreciate it.







but no I'm not upgrading my 34 ultrawide, I love it soo much, I just happened to go overkill on the GPU for it....









http://www.3dmark.com/fs/9778847

http://www.3dmark.com/fs/9778963

and it has been nice to see what others are getting too







nice work guys!


----------



## Hutzi

So which of the reference BIOSes can I try with a Palit 1080 GameRock?
Since you guys achieve so much better results with the reference BIOS, I want to give it a try.

Getting a 24,500 graphics score in Firestrike with a 2035 MHz core clock after thermal throttling.


----------



## DStealth

I have over 25k with my Palit JetStream BIOS... FE will not give you anything... just the Strix XOC T4 will let you bench stable w/o thermal limits... but results are very similar.
FE will spin your fans to the sky.


----------



## Hutzi

Quote:


> Originally Posted by *DStealth*
> 
> I have over 25k with my Palit Jetstream BIOS...FE will not give you anything...just the Strix XOC t4 will let you bench stable w/o thermal limits...but results are very similar.
> FE will spin you fans to the sky


You think it's worth trying the Strix OC?


----------



## Whitechap3l

Hello my friends.
Let me give you a short update on what I experienced with the T4 Strix BIOS:

I was able to get a stable 2250 MHz and 11,200 on memory @ 1.2V, and my best score in FS was around 25,400.
Then I lowered the clocks with the Afterburner curve and benched at 2195 MHz and 11,000 on memory @ 1.2V, and guess what: after 5 benches I'm consistently hitting around 25,700 points in FS.

I don't know why, since I'm under water and temps are not rising, so I should be getting better scores with the higher clocks, but that doesn't seem to be the case.


----------



## stxe34

Quote:


> Originally Posted by *Whitechap3l*
> 
> Hello my friends.
> Let me give you a short update on what I experienced with the T4 Strix BIOS:
> 
> I was able to get a stable 2250 MHz and 11,200 on memory @ 1.2V, and my best score in FS was around 25,400.
> Then I lowered the clocks with the Afterburner curve and benched at 2195 MHz and 11,000 on memory @ 1.2V, and guess what: after 5 benches I'm consistently hitting around 25,700 points in FS.
> 
> I don't know why, since I'm under water and temps are not rising, so I should be getting better scores with the higher clocks, but that doesn't seem to be the case.


Yep, my best performing clock is 2088 even though I can get 2188 MHz, regardless of low temps, voltage and power limits.


----------



## ucode

Quote:


> Originally Posted by *DStealth*
> 
> My best so far is 25333 with 2139 stock voltage 1.062 on stock cooler I'm happy with the result...


Says "Invalid" in the screen shot.

Quote:


> Originally Posted by *Whitechap3l*
> 
> Now I lowered the clocks with the afterburner curve and tried to bench with 2195 mHz and 11.000 on Memory @1.2V and guess what - after 5 benches I constantly hitting around 25.700 Points in FS


Did you try logging performance limit reasons during the bench?


----------



## DStealth

Quote:


> Originally Posted by *Hutzi*
> 
> You think it's worth trying the Strix OC?


Not gonna hurt at least... yes, if you're going to bench it's the only option to squeeze the maximum from these cards.
Quote:


> Originally Posted by *ucode*
> 
> Says "Invalid" in the screen shot.


Not a valid key...


----------



## juniordnz

Well, it seems 2114 MHz is the number for my 1080 FTW on air. Even though the FE BIOS got me 2126 MHz and 200 more points in Firestrike graphics score, I noticed that on longer runs the FE BIOS will throttle down to 2088, while the stock slave BIOS keeps the clock at 2114 in games, sometimes dropping to 2100 in a demanding benchmark. That may be due to the power limitations of the FE BIOS. It probably doesn't take advantage of the FTW's power delivery system, nor does it have the 130% power limit.

Even though the FE BIOS does a better job on quick benchmarks like Firestrike, in real-world use (like hours of gaming) the FTW stock BIOS proved to be the more reliable option.

That's on air; maybe I can get more juice out of it once I can keep it at 40-50C.


----------



## Hutzi

That's sick.
My BIOS is throttling like hell... while the temperature is below 40°C I get 2100 MHz, but it throttles down to 1976 MHz in BF4 @ 100% load.
I thought this was normal for Pascal, since my card doesn't hit the temperature or power limit at all.


----------



## juniordnz

Unfortunately, that is normal. The first clock-down, on most cards, occurs at 39C. My Armor would have like 5 or 6 clock-downs, the first being at 39C and the clock just freezing over 78C. These cards don't do well with heat at all.
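The stepping described above can be modelled roughly like this. The ~13 MHz bin size and every threshold other than the 39C one are assumptions for illustration only; the real curve is defined by the BIOS and differs per card:

```python
def boost_clock_mhz(max_boost_mhz, temp_c,
                    thresholds_c=(39, 50, 60, 70, 78), bin_mhz=13):
    """Illustrative model of Pascal's temperature-stepped GPU Boost:
    the clock drops one bin at each temperature threshold crossed.
    Thresholds (except 39C, reported above) and bin size are assumed."""
    bins_dropped = sum(1 for t in thresholds_c if temp_c >= t)
    return max_boost_mhz - bins_dropped * bin_mhz

print(boost_clock_mhz(2114, 35))  # -> 2114 (below the first threshold)
print(boost_clock_mhz(2114, 41))  # -> 2101 (one bin down past 39C)
```

This is why watercooled cards that stay under 40C hold their top boost bin, while air-cooled cards shed clocks step by step as they warm up.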


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> It probably doesn't take advantage of the FTW's power delivery system, nor does it have the 130% power limit.


What's that in Watts? Percentages are too misleading.


----------



## juniordnz

Quote:


> Originally Posted by *ucode*
> 
> What's that in Watts? Percentages are too misleading.


Haven't taken any readings from HWiNFO yet; I can do that as soon as I get home. Are they reliable?

According to EVGA's specs, the FTW can draw up to 215W stock, so +30% would end up at around 280W.

----------



## ucode

People can use the NVIDIA command line tool that comes with the driver install:

C:\Program Files\NVIDIA Corporation\NVSMI>nvidia-smi.exe -q -d power

It will give the limits in watts as well as consumption, and it looks pretty good as long as no hardware mod has been done. 5% or 5W accuracy, not sure which. Maybe both, whichever is the greater of the two.
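If you want to log those numbers over time instead of eyeballing the console, the watt lines are easy to scrape. The sample text below is a hand-written approximation of that output, not a real capture, and the exact labels vary by driver version:

```python
import re

def parse_power_readings(nvidia_smi_output):
    """Grab every 'Label : 123.45 W' pair from nvidia-smi power-query text."""
    pattern = re.compile(r"^\s*([A-Za-z ]+?)\s*:\s*([\d.]+)\s*W\s*$", re.MULTILINE)
    return {label.strip(): float(value)
            for label, value in pattern.findall(nvidia_smi_output)}

# Illustrative fragment only; real labels and values depend on the driver.
sample = """
    Power Draw                  : 178.23 W
    Power Limit                 : 180.00 W
    Max Power Limit             : 217.00 W
"""
print(parse_power_readings(sample))
```

Feed it the captured stdout of the command above (e.g. via `subprocess.run`) and you get a dict of readings you can timestamp and graph.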


----------



## juniordnz

Quote:


> Originally Posted by *ucode*
> 
> People can use the nVidia command line tool that comes with the driver install.
> 
> C:\Program Files\NVIDIA Corporation\NVSMI>nvidia-smi.exe -q -d power
> 
> Will give limits in Watts as well as consumption and looks pretty good as long as hardware mod hasn't been done. 5% or 5W accuracy, not sure which. Maybe both, whichever is the greater of the two.


And do I run that before or during stress? I will get some stock/FE/130% measurements when I get home. I'm pretty sure it's the FE BIOS TDP that's making the card clock down after a while under stress.


----------



## ucode

For checking real power limits instead of percentages it's fine to run it anytime. Just be nice to see actual limits in Watts instead of ambiguous percentages. IMO


----------



## VPII

Quote:


> Originally Posted by *juniordnz*
> 
> Unfortunately, that is normal. The first clock down, on most cards, occurs at 39C. My armor would have like 5 or 6 clock downs, the first being at 39C and just freezing over 78C. These cards doesn't do well with heat at all.


What I found is that in FS my clocks are jumping around all over the place, but never dipping below 2000 MHz. Clocks set to 2139 MHz. Max temps during the bench would be between 37 and 39C, as I modified the card with the EVGA Hybrid cooler from my 980 Ti.

When I play Doom for an hour or more the temps will be between 44 and 46C, with the clocks only dropping now and then to 2126 MHz.

When I run the Heaven Extreme preset it heats up the core pretty quickly, but it only drops to 2126 MHz towards the end, when the temps get to 44C or so.

Sent from my SM-G925F using Tapatalk


----------



## juniordnz

Quote:


> Originally Posted by *VPII*
> 
> What I found is that in FS my clocks are jumping around all over the place, but never dipping below 2000 MHz. Clocks set to 2139 MHz. Max temps during the bench would be between 37 and 39C, as I modified the card with the EVGA Hybrid cooler from my 980 Ti.
> 
> When I play Doom for an hour or more the temps will be between 44 and 46C, with the clocks only dropping now and then to 2126 MHz.
> 
> When I run the Heaven Extreme preset it heats up the core pretty quickly, but it only drops to 2126 MHz towards the end, when the temps get to 44C or so.
> 
> Sent from my SM-G925F using Tapatalk


Are you talking about the Firestrike bench or the stress test? If it's the bench, that's completely normal, since there are 4 phases in that test: 2 that push the GPU, 1 pushing the CPU, and the last one combining both.

Your numbers look more than OK to me. A stable 2126 MHz is very nice.


----------



## Whitechap3l

And the next update from me, watercooled and flashed to the Strix T4 BIOS: as I started to play some hours of Overwatch and The Witcher, I got some black screens and had to restart my PC again.
Flashed to the EVGA FE BIOS and I get 2177 MHz in FS and 2170 MHz when I play for some hours. The FS score at 1.09V is nearly the same as the Strix T4 at 1.2V and 2250 MHz.
I don't think that many people really boost their cards with that T4 BIOS. And in games it is horrible and so unstable, at least for me.

For me it seems there is a sweet spot on these cards and you can't get much further. I mean, the difference between no TDP limit, 1.2V and +100 MHz on the core clock should be way more noticeable compared to 120% TDP and 1.08-1.09V, yet there is little to no difference in score..


----------



## VPII

Quote:


> Originally Posted by *juniordnz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *VPII*
> 
> What I found is that in FS my clocks are jumping around all over the place, but never dipping below 2000 MHz. Clocks set to 2139 MHz. Max temps during the bench would be between 37 and 39C, as I modified the card with the EVGA Hybrid cooler from my 980 Ti.
> 
> When I play Doom for an hour or more the temps will be between 44 and 46C, with the clocks only dropping now and then to 2126 MHz.
> 
> When I run the Heaven Extreme preset it heats up the core pretty quickly, but it only drops to 2126 MHz towards the end, when the temps get to 44C or so.
> 
> Sent from my SM-G925F using Tapatalk
> 
> 
> 
> Are you talking about the Fire Strike bench or the stress test? If it's the bench, that's completely normal since there are four phases in that test: two that push the GPU, one that pushes the CPU, and a final one combining both.
> 
> Your numbers look more than OK to me. A stable 2126MHz is very nice.
Click to expand...

Thanks...
The problem, though, is that in game test one the GPU core is all over the place, while in game test two it's more constant.

Sent from my SM-G925F using Tapatalk


----------



## juniordnz

Quote:


> Originally Posted by *ucode*
> 
> For checking real power limits instead of percentages it's fine to run it anytime. Just be nice to see actual limits in Watts instead of ambiguous percentages. IMO


Here it is:

FE BIOS - 120% TDP



FTW BIOS - 130% TDP


----------



## Joshwaa

OK, so help me understand this. If the FE BIOS and an AIB BIOS will both run a card at the same clocks and volts, where are the extra watts/heat being generated when running an AIB BIOS? More amps? More power to the memory?


----------



## juniordnz

I believe that, at least in my case, more watts lead to more stability. I could get higher clocks for brief periods with the FE BIOS, but with long use like gaming, it would downclock itself to clocks lower than the FTW. Also, with 130% TDP and 1.093V, I get no VRel at all with the FTW BIOS, while with the FE BIOS I'm getting a VRel perfcap even at the stock 1.062V.


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> Here it is:
> 
> FE BIOS - 120% TDP
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> FTW BIOS - 130% TDP
> 
> 
> Spoiler: Warning: Spoiler!


Now that's what I'm talking about. You have one VBIOS at 120% and one at 130%, only 10 points above 100% or roughly 8% between them. The reality, though, is that the 130% setting is 29% higher than the 120% setting: 280W vs 217W. The default (100%) setting is different in each VBIOS, making the whole percentage thing misleading IMHO.
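To put numbers on the point above, here's a quick sketch of the arithmetic. The 180W and 215W baselines are assumptions read off the screenshots, not official figures, so treat them as illustrative:

```python
# Rough sketch: convert BIOS power-limit slider percentages into watts.
# Baseline TDPs (180W FE, 215W FTW) are assumptions from the screenshots.
def power_limit_watts(baseline_tdp_w, limit_percent):
    """Translate a slider percentage into an absolute power ceiling."""
    return baseline_tdp_w * limit_percent / 100

fe_limit = power_limit_watts(180, 120)   # FE BIOS at its 120% cap
ftw_limit = power_limit_watts(215, 130)  # FTW BIOS at its 130% cap

print(f"FE  @120%: {fe_limit:.0f} W")    # ~216 W
print(f"FTW @130%: {ftw_limit:.0f} W")   # ~280 W
print(f"Real gap: {(ftw_limit / fe_limit - 1) * 100:.0f}%")  # ~29%
```

Same-looking sliders, very different ceilings, which is exactly why percentages alone are misleading.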

Quote:


> Originally Posted by *Joshwaa*
> 
> Ok so help me understand this. If the FE Bios and AIB bios will both run a card at the same clock and volts. Where is the extra watts/heat being generated at running an AIB BIOS? More Amps? More power to memory?


Depends on the load; for instance, running FurMark will draw a lot more power than some light graphics load at the same voltage and clocks.


----------



## GanGstaOne

Quote:


> Originally Posted by *ucode*
> 
> Now that's what I'm talking about. You have one VBIOS with 120% and one 130%, only a 10% increase from 100% or 6% between them. The reality though is that the 130% setting is 29% higher than the 120% setting, 280W vs 217W. The default setting (100%) is different in each VBIOS making the whole percentage thing misleading IMHO.
> Depends on the load, for instance running furmark will draw a lot more power than some light graphics at the same voltage and clocks.


Yes, that's why it would be good to have a program that holds the full boost clock all the time, like the K-Boost option in Precision X. But EVGA made the new Precision app unlock that option only if you enter an EVGA card serial. Very stupid.


----------



## ucode

@GanGstaOne I've never used EVGA software, what is it you want to do exactly? Does using the "L" option under AB curve not do what you are asking?


----------



## GanGstaOne

Quote:


> Originally Posted by *ucode*
> 
> @GanGstaOne I've never used EVGA software, what is it you want to do exactly? Does using the "L" option under AB curve not do what you are asking?


Don't know about this L option, but K-Boost in EVGA Precision keeps the card at max boost clock at all times once activated. Very useful in games.


----------



## ucode

Click on the Afterburner curve marker where you want to operate, press 'L' and apply. Free, no serial required. Still subject to power and temp limits, though.


----------



## ondoy

joining this club...


----------



## fat4l

Quote:


> Originally Posted by *ucode*
> 
> Click on the AfterBurner curve marker where you want to operate, press 'L' and apply for free, no serial required. Still subject to power and temp though.


Never heard of this L ...whats that for ??


----------



## GanGstaOne

Quote:


> Originally Posted by *ucode*
> 
> Click on the AfterBurner curve marker where you want to operate, press 'L' and apply for free, no serial required. Still subject to power and temp though.


will test now


----------



## VPII

Quote:


> Originally Posted by *juniordnz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *VPII*
> 
> Okay..... just did another Doom run with the core at 2139MHz..... stayed there all the way through 45 minutes of gameplay. It is 1080p, but not once did it drop below that, and the max core temp was 47C
> 
> Sent from my SM-G925F using Tapatalk
Click to expand...


----------



## kx11

here's a fun test for everyone

1. Set everything to stock using MSI AB
2. Shut down any browsers, games or 3D apps
3. Run GPU-Z, open the Sensors tab, then run the GPU stress test

Give us a screenshot of the max clock your GPU is running at after 2 minutes

here's mine
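If anyone wants a number instead of eyeballing the sensor graph, the driver can also be polled from the command line with `nvidia-smi --query-gpu=clocks.gr --format=csv,noheader` and the log reduced afterwards. A tiny sketch of that reduction (the sample lines below are made up, not from my card):

```python
# Sketch: find the max graphics clock in nvidia-smi CSV output.
# On a live system you'd collect lines from nvidia-smi in a loop;
# here we just parse sample lines to show the bookkeeping.
def max_clock_mhz(csv_lines):
    """Each line looks like '1936 MHz'; return the highest value seen."""
    clocks = [int(line.split()[0]) for line in csv_lines if line.strip()]
    return max(clocks)

samples = ["1936 MHz", "1949 MHz", "1911 MHz"]  # made-up readings
print(max_clock_mhz(samples))  # 1949
```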


----------



## Whitechap3l

give that man a reward


----------



## Whitechap3l

Quote:


> Originally Posted by *ucode*
> 
> Click on the AfterBurner curve marker where you want to operate, press 'L' and apply for free, no serial required. Still subject to power and temp though.


Is it harmful to put the GPU constantly under that high a clock and voltage?


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> here's a fun test for everyone
> 
> 1. set everything to stock using MSI AB
> 2.shutdown any browser,game or 3d apps
> 3.run GPUz and open sensors tab then run GPU stress test
> 
> give us a screenshot of what is the max clock your GPU is running at after 2 minutes
> 
> here's mine


At 1809MHz base you've got the highest factory-clocked 1080s on the market, but as usual watch out for KINGPIN; they'll make the same card and sell it as if it were made of diamonds.


----------



## GanGstaOne

Quote:


> Originally Posted by *ucode*
> 
> Click on the AfterBurner curve marker where you want to operate, press 'L' and apply for free, no serial required. Still subject to power and temp though.


OK, I can't unlock that voltage curve. Do I need an MSI BIOS or what?
In settings I have unlocked both voltage control and monitoring; what else do I need to do?


----------



## juniordnz

Quote:


> Originally Posted by *ondoy*
> 
> joining this club...


geez, that must have been a nice, long drive home







welcome
Quote:


> Originally Posted by *kx11*
> 
> here's a fun test for everyone
> 
> 1. set everything to stock using MSI AB
> 2.shutdown any browser,game or 3d apps
> 3.run GPUz and open sensors tab then run GPU stress test
> 
> give us a screenshot of what is the max clock your GPU is running at after 2 minutes
> 
> here's mine


Nice!

Mine boosts to 2000MHz out of the box with 100% fan (room temp here hits 30°C, give me a break)

2100/11000 with stock voltage and 130% TDP


----------



## GanGstaOne

@ucode The L option doesn't do anything; it's not boosting clocks to max at all times like the Precision K-Boost function. So no boost option for us, and no BIOS tweaker either. Sad, very sad for 1080 users.

Edit: hey, it does work, just like K-Boost. Great, after the 3rd try, but who cares, it works now. Thanks man


----------



## karelbastos

Quote:


> Originally Posted by *kx11*
> 
> here's a fun test for everyone
> 
> 1. set everything to stock using MSI AB
> 2.shutdown any browser,game or 3d apps
> 3.run GPUz and open sensors tab then run GPU stress test
> 
> give us a screenshot of what is the max clock your GPU is running at after 2 minutes
> 
> here's mine


Here is mine, FurMark stress test after 7 min.


----------



## juniordnz

That's insane. Furmark will power throttle like crazy here.


----------



## kx11

Quote:


> Originally Posted by *karelbastos*
> 
> Here is mine FURMARK stress Test after 7Min.


dude, you OC'd the GPU!!!


----------



## karelbastos

Quote:


> Originally Posted by *kx11*
> 
> dude you OC the GPU !!!


I forgot the non-OC part

^^

I will test again and post results....


----------



## karelbastos

Quote:


> Originally Posted by *kx11*
> 
> dude you OC the GPU !!!


Here is my Zotac FE at stock after 5 min


----------



## GanGstaOne

Apparently there will be a new 1080 card: NVIDIA_DEV.1BA0 = "NVIDIA GeForce GTX 1080". Anyone have any news?


----------



## kx11

Quote:


> Originally Posted by *karelbastos*
> 
> Here is my ZOTAC FE stock after 5 Min


yeah that looks normal


----------



## danjal

Anyone with the MSI Aero oc? How are they running?


----------



## -terabyte-

Quote:


> Originally Posted by *GanGstaOne*
> 
> apperently there will be new 1080 card NVIDIA_DEV.1BA0 = "NVIDIA GeForce GTX 1080" anyone has any news ??


Maybe it is the "new" one on 14nm by Samsung?


----------



## GanGstaOne

Quote:


> Originally Posted by *-terabyte-*
> 
> Maybe it is the "new" one on 14nm by Samsung?


Yeah, just what we need, for them to release a better version of the 1080


----------



## axiumone

I'm thinking - NVIDIA_DEV.1BA0 = "NVIDIA GeForce GTX 1080" - will be the mobile variant. Apparently it's a full fat 1080.


----------



## toncij

Quote:


> Originally Posted by *axiumone*
> 
> I'm thinking - NVIDIA_DEV.1BA0 = "NVIDIA GeForce GTX 1080" - will be the mobile variant. Apparently it's a full fat 1080.


Yes, it's a mobile 1080.


----------



## GanGstaOne

Quote:


> Originally Posted by *toncij*
> 
> Yes, it's a mobile 1080.


Yeah, maybe, since MSI and Eurocom already say they have new laptops with next-gen GPUs. But if NVIDIA releases a new, better 1080, I will throw mine out the window and get a 1080 Ti or Titan X


----------



## toncij

Quote:


> Originally Posted by *GanGstaOne*
> 
> Ya maybe cause MSI and Eurocom already says they have new laptops with next gen gpus but if nvidia releases new better 1080 i will throw mine out the window and get 1080 TI or Titan X


You wouldn't! You could kill someone!

P.S. We all know the 1080 Ti will come out, the 1080 will drop to $600, and the 1080 Ti will be as fast as the Titan XP for $750.

- Many owners of 1080s and Titan XPs will throw their cards out the window. It happens every year.


----------



## shadow85

What is the cheapest 1080 I can get out there, that I can slap an EK or XSPC waterblock on it?


----------



## RayanLhindi

Checking in to the club with some hot benchmarking!
Got my G1 Gaming pushed to 25k+ on graphics with the XOC T4 BIOS, stable OC.

http://www.3dmark.com/3dm/14133339?

Stable at 2.19GHz. I will post some screenshots in a couple of hours of the CPU and GPU OC settings, since I'm doing more benchmarking and tweaking, searching for the sweet spot.


----------



## boredgunner

Quote:


> Originally Posted by *shadow85*
> 
> What is the cheapest 1080 I can get out there, that I can slap an EK or XSPC waterblock on it?


Zotac AMP or EVGA ACX 3.0 I think.


----------



## RayanLhindi

Quote:


> Originally Posted by *shadow85*
> 
> What is the cheapest 1080 I can get out there, that I can slap an EK or XSPC waterblock on it?


You can go for the G1 Gaming.


----------



## ryanallan

Quote:


> Originally Posted by *shadow85*
> 
> What is the cheapest 1080 I can get out there, that I can slap an EK or XSPC waterblock on it?


Most all the AIBs have a low-cost option: MSI Aero, Asus Turbo. Doesn't matter which, as they all use the same PCB.


----------



## juniordnz

Guys who have watercooled (AIO) their 1080s:

Is one thick 120mm rad like the H80i or Arctic Liquid Freezer 120 enough to keep a 1080 below the 50s? I'm really not sure whether a 240mm would be overkill.


----------



## axiumone

Quote:


> Originally Posted by *ryanallan*
> 
> Most all the AIB's have a low cost option. MSI AERO / Asus Turbo. Doesn't matter which as they all use the same PCB.


The Turbo is not a reference design; no waterblock available. I just looked at the pics for the Aero: it doesn't have an NVIDIA logo by the PCIe slot, so there's a possibility it may not be a reference design either.


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> Guys who have watercooled (AIO) their 1080s:
> 
> Is one thick 120mm rad like H80i or Arctic Liquid Freezer 120mm enough to cool a 1080 to stay below 50s? I'm really not sure if 240mm would be overkill.


I have a G10 + 240mm rad, and temps at max load at 2200MHz stay in the low 40s C, so I don't think a 120mm will be enough. But if it's 40-50mm thick, it may do the job.


----------



## juniordnz

Quote:


> Originally Posted by *GanGstaOne*
> 
> I have with G10 + 240mm rad and temps at max load with 2200mhz stays in low 40C so i dont think 120mm will be enough but if its 40mm-50mm thick may do the job


I would only consider the two 120mm units I mentioned because they are thick: 38mm and 42mm, if I'm not mistaken. But I'm still afraid of buying one and it not working. I may test my old H80i with zip ties and MX-4 to see if it's enough; if not, I'll get a 280 or 240.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Guys who have watercooled (AIO) their 1080s:
> 
> Is one thick 120mm rad like H80i or Arctic Liquid Freezer 120mm enough to cool a 1080 to stay below 50s? I'm really not sure if 240mm would be overkill.


My FE and H90 in push/pull with the best Noctua 140mm fans hits 45C max. Kryonaut too.

You wouldn't get under 50C, I'd think. Around 52-55C.
Quote:


> Originally Posted by *axiumone*
> 
> I'm thinking - NVIDIA_DEV.1BA0 = "NVIDIA GeForce GTX 1080" - will be the mobile variant. Apparently it's a full fat 1080.


Yep. It's not MXM either... I can't even explain how much that pisses me off.


----------



## juniordnz

Quote:


> Originally Posted by *shadow85*
> 
> What is the cheapest 1080 I can get out there, that I can slap an EK or XSPC waterblock on it?


Get an Armor, same PCB/power design as the Gaming X/Z. EK makes FC waterblocks for it; just search for TF6.
Quote:


> Originally Posted by *nexxusty*
> 
> My FE and H90 in push/pull with the best Noctua 140mm fans does 45c max. Kryonaut too.
> 
> You wouldn't get under 50c I'd think. Around 52-55c.


Why? Because of my crazy room temps?

I really need it to stay mid 40s, otherwise it's just not worth it. I guess I won't be able to go cheap on this....


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Get an AMOR, same PCB/Power design as GamingX/Z. EK make FC waterblocks for it, just search for TF6.
> Why? Because of my crazy room temps?
> 
> I really need it to stay mid 40s, otherwise it's just not worth it. I guess I won't be able to go cheap on this....


The H90 is 140mm; I just think a 120mm rad wouldn't do under 50C. My room temp is always under 24C though, usually as close to 22C as possible.

I get 45C loaded at 2100MHz and 1.0500V with a 140mm rad in push/pull. I really doubt a 120mm rad would do under 50C, even in my room's environment.

What are your ambient room temps? Anything above 24C and it's not happening. Positive of that.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> H90 is 140mm, I just think a 120mm Rad wouldn't do under 50c. My room temp is always under 24c though. Usually as close to 22c as possible.
> 
> I get 45c loaded at 2100mhz 1.0500v with a 140mm Rad in push/pull. I really doubt a 120mm Rad would do under 50c, even in my rooms environment.
> 
> What are your ambient room Temps? Anything above 24c and it's not happening. Positive of that.


Most of the year around 28. Sometimes during summer 32, and during winter 22-24. Yeah, it's hot.

You do know that both H80i and Arctic Liquid Freezer are very thick rads right? Yours must be 27mm, H80i 38mm and Arctic's even more than that.

I believe I can wait till summer and test my old CoolIT H80i with the "MacGyver zip-tie mod" and see how it handles the heat. I need 47C tops to make it worth it.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Most of the year around 28. Sometimes during summer 32, and during winter 22-24. Yeah, it's hot.
> 
> You do know that both H80i and Arctic Liquid Freezer are very thick rads right? Yours must be 27mm, H80i 38mm and Arctic's even more than that.
> 
> I believe I can wait till summer and test my old Coolit H80i with "macgiver ziptie mod" and see how it handles the heat. I need 47C tops to make it worth it.


Room Temp? Ugh that sucks... I couldn't even sleep in 28c, let alone 32c.

A 120mm×120mm×38mm Rad might do it. I would hedge bets to say it would be close. Not on 32c days though. No way.

Try. I would. Do push/pull for that extra 1-2c.


----------



## justinyou

Quote:


> Originally Posted by *toncij*
> 
> You wouldn't! You could kill someone!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. We all know 1080Ti will get out, 1080 will drop to $600 and 1080Ti will be as fast as Titan XP for $750.
> 
> 
> 
> 
> 
> 
> 
> - Many owners of 1080 and TitanXP will throw cards out the window. It happens every year.


I predict the price of the 1080 Ti will be at least USD 800, and the 1080 Ti FE will be USD 899.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> Room Temp? Ugh that sucks... I couldn't even sleep in 28c, let alone 32c.
> 
> A 120mm×120mm×38mm Rad might do it. I would hedge bets to say it would be close. Not on 32c days though. No way.
> 
> Try. I would. Do push/pull for that extra 1-2c.


Yeah, it can get pretty uncomfortable here at some times of the year. Around Christmas it's like a huge open sauna.

Anyway, I also discovered that with Gen5 Asetek coolers we don't even need a modded bracket or the G10's ugly one. You can just drill the original one and it will do just fine. Sorry for not posting it earlier; I just found that out today.

So, if anyone wants to attach an AIO to their NV card, just get a Gen5 Asetek and mod the original bracket; it will also be invisible below the waterblock.

It would be nice to get a thick 240mm rad for the CPU and a thick 120mm for the GPU and fill up the front of my case, though. I'm hoping those Gen5 thick rads are that good.


----------



## jorgerp86

Finally got my GTX 1080 to replace the 980Ti. Love playing it on my 1440p/144Hz monitor....gaming bliss! Feel dated as I'm still using an i5 2500K lol.


----------



## fat4l

What is this L function in AB? Thx


----------



## ucode

It basically selects a fixed voltage point to use.

http://www.guru3d.com/news-story/download-msi-afterburner-4-3-beta-4.html
Quote:


> You may press L after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling.


----------



## Bishop07764

Quote:


> Originally Posted by *tin0*
> 
> As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.
> 
> 
> 
> *Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).
> 
> When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


Awesome! Thanks so much for posting this. +Rep! Flashing the OC Gaming Z BIOS to my Sea Hawk EK now has it boosting to an unwavering 2076MHz at default settings. I only hit the 90% power limit briefly when playing Doom maxed out at 4K. I've had a lot of fun playing around with this card for the brief time I've been able to thus far. I might eventually get around to trying to overclock it.

All that I lack now is an acrylic brace for this card. Dang, it's heavy.


----------



## ucode

Quote:


> Originally Posted by *Bishop07764*
> 
> I only hit 90% power limit briefly when playing Doom maxed out at 4k.


90% would be just over 240W FYI.


----------



## GanGstaOne

Quote:


> Originally Posted by *nexxusty*
> 
> Yep. It's not MXM either... I can't even explain how much that pisses me off.


Just like they did with the full 980 for laptops, giving it 8GB of RAM while the desktop got only 4GB


----------



## nexxusty

I have an EVGA FE, TDP limit removed... Loads at 40c max at 2075mhz...

However my voltage is locked at 1.0250. The voltage slider in AB 4.3.0 Beta 4 does nothing. Same with Precision X....

The curve in Precision X will allow me to get the voltage up to 1.0750 but with VREL as the PerfCap reason. Doesn't work well.

How are you guys ACTUALLY adjusting your voltage? Will this only be remedied by a BIOS flash? 2075mhz is not enough. Especially with all the work I put into this thing.


----------



## GanGstaOne

Quote:


> Originally Posted by *nexxusty*
> 
> I have an EVGA FE, TDP limit removed... Loads at 40c max at 2075mhz...
> 
> However my voltage is locked at 1.0250. The voltage slider in AB 4.3.0 Beta 4 does nothing. Same with Precision X....
> 
> The curve in Precision X will allow me to get the voltage up to 1.0750 but with VREL as the PerfCap reason. Doesn't work well.
> 
> How are you guys ACTUALLY adjusting your voltage? Will this only be remedied by a BIOS flash? 2075mhz is not enough. Especially with all the work I put into this thing.


MSI Afterburner is the best for the job. Use the voltage-frequency curve, not the voltage slider, because that does nothing.


----------



## nexxusty

Quote:


> Originally Posted by *GanGstaOne*
> 
> MSI AfterBurner is the best for the job use the voltage-frequency curve not voltage slider cause that does nothing


Every single selection I make with the curve causes VREL in PerfCap Reason in GPUZ.

Doesn't seem to work at all.... the curve does nothing in Afterburner.

Ugh ***...


----------



## GanGstaOne

Quote:


> Originally Posted by *nexxusty*
> 
> Every single selection I make with the curve causes VREL in PerfCap Reason in GPUZ.
> 
> Doesn't seem to work at all.... the curve does nothing in Afterburner.
> 
> Ugh ***...


Strange, it works perfectly for me. I can set whatever voltage I want, up to 1.2V, for every clock frequency. Maybe another BIOS will do it for you.


----------



## nexxusty

Quote:


> Originally Posted by *GanGstaOne*
> 
> Strange it works for me perfect i can set for every clock frequency whatever voltage i want up to 1.2 maybe another bios will do it for you


You have an EVGA FE with stock BIOS on it?


----------



## DStealth

Just re-benched with my new CPU, and Physics boosted a lot, from [email protected] 18k to nearly 19.5k with [email protected]


----------



## GanGstaOne

Quote:


> Originally Posted by *nexxusty*
> 
> You have an EVGA FE with stock BIOS on it?


No, a Gigabyte 1080 G1 Gaming with the F2 beta BIOS; it ends at 00.FD


----------



## nexxusty

Quote:


> Originally Posted by *GanGstaOne*
> 
> No Gigabyte 1080 G1 Gaming with F2 Beta bios ends at 00.FD


K thanks.

Can anyone with an FE tell me how to get actual voltage control?

Or is it even possible?

This doesn't make sense to me...


----------



## jodasanchezz

Only one day left till my Classified 1080 arrives


----------



## danjal

Quote:


> Originally Posted by *boredgunner*
> 
> Zotac AMP or EVGA ACX 3.0 I think.


Don't think the Zotac AMP has a waterblock available yet.


----------



## KGBinUSA

Quote:


> Originally Posted by *juniordnz*
> 
> Yeah, it can get pretty uncomfortable here sometimes of the year. Around Christmas is like a huge open sauna
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, I also discovered that with gen5 asetek coolers we don't even need a modded bracket or G10s ugly one. You can just drill the original one and it will do just fine. Sorry for not posting it earlier, just found that out today.
> 
> So, if anyone wants to attach an AIO to their nv cards, just get a Gen5 Asetek and mod the original bracket, it will also be invisible below the waterblock.
> 
> It would be Nice to get a thick 240mm rad for CPU and a thick 120mm for the GPU and fill up the front of my case though. I'm hoping those gen5 thick rads are that good.


My OC'd 980 Ti @ 1500 (with a BIOS flashed for voltage) gets into the low 50s at 100% load with a G10, an AMD FX liquid cooler and a single GT AP-15 in pull. My ambient temps during summer are 28-30C, so a single thick 120mm radiator with a good fan should keep a 1080 in the 40s.


----------



## TobsenHB

Does anyone have good experiences with overclocking an EVGA SC Gaming ACX 3.0? I am considering this card, because it uses the reference pcb and fits in my custom build. I am planning on watercooling it.


----------



## Whitechap3l

Quote:


> Originally Posted by *TobsenHB*
> 
> Does anyone have good experiences with overclocking an EVGA SC Gaming ACX 3.0? I am considering this card, because it uses the reference pcb and fits in my custom build. I am planning on watercooling it.


From what I've heard so far, the SC BIOS isn't the greatest OC-wise. I mean, you can flash it anyway, so it's not that important which one you get... my Asus Strix runs with the EVGA FE BIOS


----------



## toncij

Quote:


> Originally Posted by *KGBinUSA*
> 
> My OCed 980 TI @ 1500 (with flashed bios for voltage) gets into low 50s at 100% load with a G10, an AMD FX Liquid Cooler and single pull GT AP-15. My ambient temps during summer are 28-30c. So a single thick 120mm radiator with a good fan should keep a 1080 in the 40s.


An H115i keeps a TX-M under 65°C, so it should, yes.


----------



## kx11

The new driver should boost Time Spy results, with WDDM 2.1 support finally included:

http://www.guru3d.com/files-details/geforce-372-54-whql-driver-download.html

No more Windows XP support from NVIDIA, though.


----------



## GanGstaOne

This time NVIDIA is giving notebooks the whole 1000 line: full 1080, 1070 and 1060 with the same specs as the desktop ones


----------



## jodasanchezz

Hi there, do I read this right?

We are able to flash different BIOS versions, e.g. an Asus Strix BIOS on an EVGA FE card? A custom-PCB BIOS onto a reference PCB?

How is this possible?

Thanks in advance


----------



## GanGstaOne

Quote:


> Originally Posted by *jodasanchezz*
> 
> HI There do i read here right?
> 
> We are eable to Flash Different Bios Version Asus Strix on an EVGA FE Card ? Custom PCB BIOS to Reference PCBs Bios?
> 
> How is thit Possible?
> 
> Thanks in advance


Because the 1080 BIOS is all the same. Only the Galax 1080 HOF BIOS is somehow modified, so it doesn't work on all PCBs; all the other ones do work


----------



## nexxusty

Quote:


> Originally Posted by *nexxusty*
> 
> K thanks.
> 
> Can anyone with an FE tell me how to get actual voltage control?
> 
> Or is it even possible?
> 
> This doesnt make sense to me...


Got 1.062V. FEs top out at this voltage.

God, I want a BIOS editor....

Testing stability at 2138MHz/500MHz now. Loading at 41C.... 9% TDP.

*edit*

The curve is absolutely useless. Lowers performance. I expected different when core temps don't exceed 41C.... no artifacts... just crap perf.

[email protected]/1.0310v (whatever it feels like doing) is my max overclock. Really seems like this card has more in it.... just did an 8163 graphics score on Time Spy, however.

My best yet.


----------



## GreedyMuffin

Quote:


> Originally Posted by *nexxusty*
> 
> Got 1.062v. FE's top out at this voltage.
> 
> God I want I BIOS editor....
> 
> Testing stability at 2138mhz/500mhz now. Loading at 41c.... 9% TDP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *edit*
> 
> Curve is absolutely useless. Lowers performance. I expected different when core temps don't exceed 41c.... no artifacts... just crap perfs.
> 
> [email protected]/1.0310v (whatever it feels like doing) is my max overclock. Really seems like this card has more in it.... just did 8163 Graphics Score on Time Spy however.
> 
> My best yet.


How do you decrease voltage like that?

I would really like to do that to mine, as I'm hitting the TDP limit. I'd rather go down 39MHz to 2100 or so at a lower voltage.

Thanks!

NVM!

Testing 2100 at 1.000V.

Don't think it's stable, but we'll see. The power consumption went down quite a bit.


----------



## Whitechap3l

Quote:


> Originally Posted by *jodasanchezz*
> 
> HI There do i read here right?
> 
> We are eable to Flash Different Bios Version Asus Strix on an EVGA FE Card ? Custom PCB BIOS to Reference PCBs Bios?
> 
> How is thit Possible?
> 
> Thanks in advance


Yeah, sure. It's basically all the same. The custom partners like EVGA, MSI and so on don't change the BIOS NVIDIA gives them that much, I guess.

Only with the real flagships (HOF, Classified, etc.) COULD you have problems


----------



## jodasanchezz

Quote:


> Originally Posted by *Whitechap3l*
> 
> Yeah sure. Ist basically all the same.. The custom Partners like EVGA, MSI and so on do not Change the BIOS that much what Nvidia is given them I guess.
> 
> 
> 
> 
> 
> 
> 
> 
> Only with the real flagships HOF Classy etc you COULD have Problems


Thanks for the answer.
I get my Classy tomorrow...

Is there a tool to read the BIOS information, like the Maxwell BIOS Tweaker? I know editing is not possible atm, but just for comparing?


----------



## nexxusty

Quote:


> Originally Posted by *GreedyMuffin*
> 
> How do you decrease voltage like that?
> 
> I would really to that to mine as I'm hitting the TDP limit. Will rather go down 39mhz to 2100 or something at a lower voltage.
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVM!
> 
> Testing 2100 at 1.000V.
> 
> 
> 
> 
> 
> 
> 
> Don't think it's stable, but we'll see. The power consumption went down quite a bit.


I did the shunt resistor mod with silver solder. Clocks are not locked to idle clocks, and there's no actual power limit. The highest I've seen the power-limit reading rise to is 24%.

This mod is recommended over the 10-ohm mod; I recommend this one anyway....

Flux and silver rosin-core solder from one end to the other of the RS1, RS2 and RS3 shunt resistors, and you don't have a power limit.
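For anyone wondering why this works: the card senses current as the voltage drop across each shunt (I = V/R), so soldering across a shunt lowers the effective resistance and the controller under-reads current and power. A rough sketch of the arithmetic (the resistor and rail values below are illustrative examples, not measured from the card):

```python
# Illustrative only: how a lowered effective shunt resistance makes the
# power controller under-read. All values are made-up examples.
def reported_power_w(true_current_a, rail_voltage_v,
                     stock_shunt_ohm, effective_shunt_ohm):
    """The controller assumes the stock shunt value when converting the
    measured voltage drop back into current."""
    v_drop = true_current_a * effective_shunt_ohm   # what the sense line sees
    assumed_current = v_drop / stock_shunt_ohm      # controller's math
    return assumed_current * rail_voltage_v

# 15 A on a 12 V rail through a stock 5 mOhm shunt:
print(reported_power_w(15, 12, 0.005, 0.005))  # reads ~180 W (true power)
# Same load after bridging drops the effective shunt to 1 mOhm:
print(reported_power_w(15, 12, 0.005, 0.001))  # reads ~36 W (a fifth of true)
```

The card keeps drawing the real power either way; it just never sees itself approach the limit, which is why the reading sits at a tiny percentage after the mod.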

Quote:


> Originally Posted by *jodasanchezz*
> 
> Thanks for the Answer,
> I Get may Classy tomorrow...
> 
> Is there a tool to read the Bios Information such like the Maxwell bios tweaker / i know editing is not possible atm but just for comparing?


Not yet.


----------



## GreedyMuffin

How can I lower the voltage?

It was lower when folding and in R6S, but not when playing BF4?


----------



## Whitechap3l

Quote:


> Originally Posted by *GreedyMuffin*
> 
> How can I lower the voltage?
> 
> It was lowered when folding and R6S, but not when playing bf4?


Afterburner -> Curve -> x-axis to select your desired voltage and y-axis desired clock Speed









Not only useful for overclocking


----------



## wholeeo

Purchased an open-box Zotac FE this weekend and am thinking I should have gone with a custom design like the Zotac AMP edition, which is only $10 more.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Whitechap3l*
> 
> Afterburner -> Curve -> x-axis selects your desired voltage, y-axis your desired clock speed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not only useful for overclocking


Doesn't work. Jumps up to 1.0620V.

Not like the 0.993/1.000V I set it to. :/

NVM:

Figured it out. Had to try CTRL + L inside the curve. Silly me.


----------



## Whitechap3l

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Doesn't work. Jumps up to 1.0620V.
> 
> Not like the 0.993/1.000V I set it to. :/
> 
> NVM:
> 
> Figured it out. Had to try CTRL + L inside the curve. Silly me.


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Figured it out. Had to try CTRL + L inside the curve. Silly me.


Really? Ctrl+L makes the card stay at the desired clock/voltage? That's very interesting.

My card can do a stable 2113mhz on 1093V or 2100 on 1062. Just decided the extra 31V are not worth 13mhz. Now I'll test what's the minimum voltage I can get 2100mhz. Less heat is always appreciated for someone on air


----------



## ralphi59

In precisionx
Manual Mode
Click at 0.875 and +150
One bar appear.
If you only make 1 point, you stay at 0.875
Dont forget to click apply
You can define the curve as you want.


----------



## Hutzi

So is there an easy way to figure out the lowest possible voltage for every step?


----------



## Whitechap3l

Quote:


> Originally Posted by *juniordnz*
> 
> Really? Ctrl+L makes the card stay at the desired clock/voltage? That's very interesting.
> 
> My card can do a stable 2113mhz on 1093V or 2100 on 1062. Just decided the extra 31V are not worth 13mhz. Now I'll test what's the minimum voltage I can get 2100mhz. Less heat is always appreciated for someone on air


31V would let my Card explode









yeah, on water it's not that big of a deal


----------



## ralphi59

You can start with +150 at each voltage without a doubt
Stop the curve at the voltage you want at load


----------



## GreedyMuffin

Should I test 2100 on 0.975V?

I feel like it's gonna insta crash.









Can test with Valley or something. Doesn't hurt.


----------



## juniordnz

Quote:


> Originally Posted by *Whitechap3l*
> 
> 31V would let my Card explode
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah, on water it's not that big of a deal


31mv!!!!!!!!!!!







You understood the first time
















Yeah, room temps can get pretty hot in here, so anything that keeps the card cooler is appreciated. Actually, I'm getting 2113mhz with 1.062V, but after 47C or so it clocks down to 2100 and stays there (I'm gaming in the mid 50s).

I could get 2126mhz with 1093v but it would throttle down to 2113 as well, so no big improvement, just added heat.


----------



## ralphi59




----------



## ralphi59

I can do +275 under 1.00v
and +250 after
But I run at 0.900v
1886mhz
And +500 memory
With a custom fan curve, my FE is super silent.
The lower the voltage, the lower the power limit: less heat, less noise.


----------



## ralphi59

Make different curves of different lengths.
Save them as profiles and admire the difference in benchmark scores !!!!


----------



## GreedyMuffin

Testing 2000 mhz at 900 mv.


----------



## ralphi59

Optimistic, I think


----------



## ralphi59

Witcher 3 at 4K, nearly maxed out, is a good and fast stability test.


----------



## GreedyMuffin

If 2000 is stable at 900 mv I'll run at that voltage; I seem to run 2100 at 1000mv though.


----------



## ralphi59

Nice card


----------



## GreedyMuffin

Quote:


> Originally Posted by *ralphi59*
> 
> Nice card


Thanks!

Testing 2025 at 900mv. Going out for a couple of hours. So will let it fold in the meantime.


----------



## ralphi59

Let me know the results.
See you
Bye


----------



## shadow85

Quote:


> Originally Posted by *juniordnz*
> 
> Get an Armor, same PCB/power design as the GamingX/Z. EK make FC waterblocks for it, just search for TF6.


So they have the same PCB/power designs?

What about the G1 and ACX models, do they also have a reference PCB? Or does EK still make blocks for them if they aren't reference?


----------



## boredgunner

Quote:


> Originally Posted by *shadow85*
> 
> So they have the same PCB/power designs?
> 
> What about the G1 and ACX models, do they also have a reference PCB? Or does EK still make blocks for them if they aren't reference?


Yeah, like he said, the ARMOR has the same PCB and power design as the GAMING models. I'm putting mine under water eventually, unless I get a new card first or decide to wait on one.

ACX uses reference PCB. I thought G1 used reference PCB but I've read conflicting things about it.


----------



## Whitechap3l

Quote:


> Originally Posted by *boredgunner*
> 
> Yeah like he said ARMOR has same PCB and power design as the GAMING models. I'm putting mine under water eventually, unless I get a new card first or decide to wait on one.
> 
> ACX uses reference PCB. I thought G1 used reference PCB but I've read conflicting things about it.


https://www.ekwb.com/news/official-list-of-ek-water-blocks-for-gtx-1080-series/

There you can see all the custom blocks from EKWB - custom blocks means differences in PCB, I guess

Series like the Armor fit in the TF6 range, yes

EKWB also mentioned on their page :

"Please note that more GeForce® GTX 1080 graphics cards will be supported due to water blocks compatibility with PCB"


----------



## boredgunner

Quote:


> Originally Posted by *Whitechap3l*
> 
> https://www.ekwb.com/news/official-list-of-ek-water-blocks-for-gtx-1080-series/
> 
> There you can see all the custom blocks from EKWB - custom blocks means differences in PCB, I guess


You're right so that confirms it. Classified and FTW use different PCB and require different water blocks I see. Wouldn't have expected that.


----------



## GanGstaOne

Quote:


> Originally Posted by *shadow85*
> 
> So they have the same PCB/power designs?
> 
> What about the G1 and ACX models, do they also have a reference PCB? Or does EK still make blocks for them if they aren't reference?


The 1080 G1 uses a custom PCB with 1x 8-pin and 8+2 or 8+3 phases, from what I saw by testing different 1080 cards like the EVGA FTW, Classy, Asus Strix, Gigabyte Xtreme, Zotac AMP Extreme and a few others.
The 1080 G1 has one of the best custom PCBs out there. I got the G1 because it was the only one available at that time. I wanted to change it, since I have a friend with a PC store so that's no problem, but after all the testing we did I decided to stick with the G1.


----------



## cj0612

Quote:


> Originally Posted by *juniordnz*
> 
> Guys who have watercooled (AIO) their 1080s:
> 
> Is one thick 120mm rad like H80i or Arctic Liquid Freezer 120mm enough to cool a 1080 to stay below 50s? I'm really not sure if 240mm would be overkill.


I have the EVGA hybrid cooler, and with the stock thermal paste it comes with, temps were between 45 and 50. I was already using Thermal Grizzly Conductonaut for the TDP mod, so I went ahead and VERY CAREFULLY put it on the GPU too, and it works amazingly. Doesn't go above 40 while gaming and 45 under stress tests.


----------



## ROKUGAN

Quote:


> Originally Posted by *wholeeo*
> 
> Purchased an open-box Zotac FE this weekend and am thinking I should have gone with a custom design like the Zotac AMP edition, which is only $10 more.


My advice is that if you plan to stay on air cooling, get the AMP Extreme, as it runs 20C cooler than the AMP (60s vs 80s @ 4K) and the TDP is higher (270W vs 230W). With Pascal being so temp-sensitive, the Extreme gives you a much better shot at achieving a stable +2100Mhz (2126 vs 2066 in my personal experience with both cards).

If one day those Pascal BIOS limiters get "hacked" (pretty uncertain atm), the Extreme cooler is so good that it would allow 2200 on air without problems, while the AMP version is already temp limited, hitting +80C on [email protected]%.


----------



## wholeeo

Quote:


> Originally Posted by *ROKUGAN*
> 
> My advice is that if you plan to stay with air-cooling, get the AMP Extreme as it runs 20C cooler than the AMP (60s vs 80s @ 4K) and TDP is higher (270W vs 230W). Being Pascal so temp sensitive, the Extreme gives you a much higher shot to achieve +2100Mhz stable (2126 vs 2066 in my personal experience with both cards).
> 
> If one day those Pascal BIOS limiters get "hacked" (pretty uncertain atm), the Extreme cooler is so good that it would allow 2200 on air without problems, while the AMP version is already temp limited hitting +80C on [email protected]%.


Thanks for the suggestion but I don't believe an Extreme AMP would fit in my case due to the 2.5 slot cooler. (NZXT Phantom).

So what is everyone using nowadays to manage these cards? MSI AB? EVGA PX? or something else?


----------



## ROKUGAN

Quote:


> Originally Posted by *wholeeo*
> 
> Thanks for the suggestion but I don't believe an Extreme AMP would fit in my case due to the 2.5 slot cooler. (NZXT Phantom).
> 
> So what is everyone using nowadays to manage these cards? MSI AB? EVGA PX? or something else?


I think MSI Afterburner 4.3.0 Beta 4 is the most widespread, as far as I've seen. The only "problem" I have with my card is that it runs so well that, without the possibility of tweaking the BIOS, it gets almost boring very soon. With the 980 Ti the BIOS tweaking was pretty fun


----------



## GanGstaOne

Quote:


> Originally Posted by *ROKUGAN*
> 
> I think MSI Afterburner 4.3.0 Beta 4 is the most widespread as far as I´ve seen. The only "problem" I have with my card is that it runs so fine that without the possibility of tweaking the BIOS it gets almost boring very soon. With the 980ti the Bios tweaking was pretty fun


Don't hold your breath; we probably won't see a BIOS tweaker for a long time, if it comes at all. For Maxwell the BIOS tweaker was released very soon after the cards launched, Kepler and Fermi too.


----------



## justinyou

For those interested in OC, there is this step-by-step guide on how to OC NVIDIA Pascal by "JayzTwoCents"









A twenty-minute-long video.


----------



## cj0612

Just thought I'd post my experience... I have a 1080 FE and on air the max in-game clock I could get stable was 2012mhz; it would drop to 1999mhz occasionally and temps would be around 65-70C. I could get a little higher in benchmarks and in games other than Overwatch, maybe around 2050mhz... so I thought I had a pretty dud overclocker.

What I noticed was that when I set my fan profile to 100%, it would stay in the higher range of boost clocks until around 55C and then jump back and forth. Also, in Overwatch I was hitting anywhere from 110-128% TDP, which is over max, so that was throttling it as well. When I set an fps cap of 60 or 100 (144hz monitor) the TDP% would drop a good bit and then I could bump up and maintain slightly higher clocks.

After that I decided to get the EVGA AIO cooler, and after putting that on I was immediately able to overclock to 2100mhz, but in Overwatch it would drop down to 2025 and anywhere in between, with temps maxing out at about 52C.

So after this I put Thermal Grizzly Conductonaut over just the one resistor right under the 8-pin connector, and then, since I already had to redo the thermal paste anyway and the water pump was copper, I put the Conductonaut on the GPU as well. This made my temps drop to 40C in game, with the max I've seen in Heaven at 45, at probably about 50-60% fan speed. This let me achieve 2126mhz stable in-game with no drops in clock whatsoever, and TDP doesn't go over 80% anymore.

Memory clock set to +300 throughout. Couldn't notice any discernible differences between 300 and 500+ so I just left it there.

Kinda sad there isn't much else for me to mess with lol.


----------



## Andreadeluxe

I installed this AIO cooler on my MSI GTX 1080 Armor OC.

Thermaltake Water 3 Performer C 55€
NZXT Kraken G10 34€
GELID PWM - VGA Fan Adaptor Cable for Cooler from 3 4 PIN to VGA-PWM-ADAPTOR 5€

Finally the Armor cooler's bad performance is gone.

From 87° with noise to 65° without noise.

PS: the fan over the radiator spins very slowly... but I prefer silence over extreme cooling.

PS2: tomorrow I'll exchange the Kraken's original fan for a dead-silent Noiseblocker PWM


----------



## TobsenHB

Quote:


> Originally Posted by *Whitechap3l*
> 
> From what I've heard so far, the SC BIOS isn't the greatest OC-wise... I mean you can flash it anyway, so it's not that important which one you got... my Asus Strix runs with the EVGA FE BIOS


Thanks for your input - it is greatly appreciated!


----------



## juniordnz

Quote:


> Originally Posted by *Andreadeluxe*
> 
> I installed this AIO cooler on my MSI GTX 1080 Armor OC.
> PS: the fan over the radiator spins very slowly... but I prefer silence over extreme cooling.


Could you test with 100% fan speed on the rad? Just a brief load test is enough. I'm planning on watercooling mine and I'm not sure if a 120mm rad will do it or if I need to go 240/280mm. It would help me a lot! Thanks!


----------



## fat4l

The new driver brings instability for me... my OC is no longer stable. Reverting back.


----------



## piee

I read on the 980 Ti Classy club site that the EK 780 Classy block may fit the 1080 Classy card; it fits the 980 Ti Classy card.


----------



## juniordnz

Quote:


> Originally Posted by *fat4l*
> 
> New driver brings instability for me... My oc no longer stable. Reverting back


New drivers bring improvements in how the card uses some features that it didn't before. Could that be the reason for your instability with the new drivers? Your card wasn't being used to its full capabilities because of driver issues; now the new driver is exploiting it more, hence the instability in an OC you once thought was stable. I've always thought of it like that...


----------



## arrow0309

Quote:


> Originally Posted by *Andreadeluxe*
> 
> i installed that AIO cooler on my MSI GTX 1080 Armor OC.
> 
> 
> 
> 
> 
> 
> 
> 
> Thermaltake Water 3 Performer C 55€
> NZXT Kraken G10 34€
> GELID PWM - VGA Fan Adaptor Cable for Cooler from 3 4 PIN to VGA-PWM-ADAPTOR 5€
> 
> finally the armor cooler bad performance is going away.
> 
> from 87° with noise to 65° without noise.
> 
> PS: fan over the radiator spin very slow... but i prefer the silence over the extreme cooling.
> 
> PS2: tomorrow i exchange the kraken original fan with a dead silent noiseblocker pwm


+Rep!
Cool and well done bro








But I'd like to see some 100% fan results as well, 65C is what I get on air


----------



## GreedyMuffin

2025mhz on 0.900V was not stable, but it seems like 2000 is.

Is it worth it? When gaming at 2139 it can dip down to 2088. Same if I'm gaming at 2100. So my max stable freq. on the stock bios is only 2088mhz. Even though with stock mem 2139 is not a problem.

Undervolting is kinda neat though.^^


----------



## DStealth

Quote:


> Originally Posted by *fat4l*
> 
> New driver brings instability for me... My oc no longer stable. Reverting back


The opposite for me... just got my highest FS Time Spy GPU score, close to 8400


----------



## Whitechap3l

Quote:


> Originally Posted by *GreedyMuffin*
> 
> 2025mhz on 0.900V was not stable, but it seems like 2000 is.
> 
> Is it worth it? When gaming at 2139 it can dip down to 2088. Same if I'm gaming at 2100. So my max stable freq. on the stock bios is only 2088mhz. Even though with stock mem 2139 is not a problem.
> 
> Undervolting is kinda neat though.^^


Did you get higher clocks / score / stability with "normal" voltage?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Whitechap3l*
> 
> Did you get higher clocks / score / stability with "normal" voltage?


Nope. With stock voltage (1.050V) the card would throttle. Never tried a voltage increase with the stock bios due to the low TDP limit.

Undervolting showed a nice decrease in power usage. Will report back when I test more later tonight.
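As a rough sanity check on why the drop is so noticeable: dynamic power in CMOS scales roughly with frequency times voltage squared, so even a small voltage cut compounds. A quick sketch using clocks/voltages from this page (the 180 W baseline board power is an assumption, not a measurement):

```python
# Rough dynamic-power scaling estimate: P ~ f * V^2 (CMOS approximation).
# The 180 W baseline is an assumed board power, not a measured figure.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale a baseline power figure to a new clock/voltage point."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

base = 180.0  # assumed watts at 2139 MHz / 1.062 V
undervolted = scaled_power(base, 2139, 1.062, 2100, 1.000)
print(f"~{undervolted:.0f} W at 2100 MHz / 1.000 V")
```

So dropping ~60 mV and one clock bin already shaves a double-digit percentage off the estimate, which lines up with the TDP headroom people are seeing after undervolting.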


----------



## juniordnz

I believe I have found my sweetspot:

2126mhz / +500mem / 130%TDP / 1.031V

Tried 1.025V but it was a no-go. Still, it's a great improvement over the stock 1.062V, and even more over 1.093, where I could get a maximum of 2138mhz. 62mv less at the expense of 13mhz (which I couldn't keep anyway, since the card would throttle due to heat)

This Ctrl+L thing is nice.

Oh, and best score on FS also: 25200 graphics, while at 1.093 I would get something like 24980. Stability checked with the Firestrike stress test (it works well for me).

And the new drivers are doing fine for me. No problem at all!


----------



## Whitechap3l

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Nope. With stock voltage (1.050V) the card would throttle. Never tried a voltage increase with the stock bios due to the low TDP limit.
> 
> Undervolting showed a nice decrease in power usage. Will report back when I test more later tonight.


OK ;-)


----------



## juniordnz

Wait a sec... this Ctrl+L thing keeps the card at full boost even when there are no 3D applications open? Now that's a bummer... I thought it would force the max boost only in 3D apps...


----------



## Whitechap3l

Quote:


> Originally Posted by *juniordnz*
> 
> Wait a sec... this Ctrl+L thing keeps the card at full boost even when there are no 3D applications open? Now that's a bummer... I thought it would force the max boost only in 3D apps...


No, it stays at that clock and voltage. Don't know if it hurts the GPU or if it's not that big of a deal


----------



## juniordnz

Quote:


> Originally Posted by *Whitechap3l*
> 
> No, it stays at that clock and voltage. Don't know if it hurts the GPU or if it's not that big of a deal


Even though the card is not under a load drawing more watts, the clock stays at 2126mhz, voltage stays at 1.031V and idle temps are like 5 degrees higher.

That's a no-no for me. Maybe it's nice for those who fold or whatever and keep the card under stress all the time. But I don't want my card "all out" when I'm browsing the web.

I was so happy with the results


----------



## jorpe

Quote:


> Originally Posted by *juniordnz*
> 
> Wait a sec... this Ctrl+L thing keeps the card at full boost even when there are no 3D applications open? Now that's a bummer... I thought it would force the max boost only in 3D apps...


What is the ctrl+L thing? Is that in EVGA precision or Afterburner or am I missing something?


----------



## Whitechap3l

Quote:


> Originally Posted by *juniordnz*
> 
> Even though the card is not dealing with load drawing more watts, the clock stays at 2126mhz, voltage stays at 1.031V and idle temps are like 5 degrees higher.
> 
> That's a no no for me. Maybe it's nice for those who fold or whatever that keeps the card under stress all the time. But I don't want my card "all out" when I'm browsing the web.
> 
> I was so happy with the results


Yeah sucks ass I know...


----------



## GreedyMuffin

For me it doesn't matter. I'm either folding or gaming.









Will test more at 2100 on 1000mv as that seemed stable. That or 2000 at 900mv. Can't decide.


----------



## juniordnz

Well, I believe I got it sorted out.

Since you can't lock it down using Ctrl+L, just find your max OC at stock voltage (1.062) and start going down in voltage until you are unstable. Mine was at 1.031V. So what I did was set my max OC at 1.031V and flatline it until 1.062.

That tells the card that it can go as high as your max OC but can use whatever voltage is needed at the time. In Firestrike, on more demanding parts of the test it would use 1.050 (not even the maximum) and on less demanding ones it goes down and uses only 1.031V.

Maybe if there's something heavier, it will go up to 1.062V. Gotta do more testing.

Just wanted to share...
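The flatline idea in code form, if it helps anyone picture it: treat the curve as (voltage, clock) points like Afterburner does, clamp everything at or above your chosen voltage to your max stable clock, and leave the lower points alone so the card still downclocks at idle. The point values here are illustrative, not dumped from a real card:

```python
# Sketch of the "flatline" V/F curve trick: cap every point at or above
# a chosen voltage to the max stable clock, leave lower points untouched.
# The (voltage, clock) pairs below are illustrative, not from a real card.

def flatline(curve, from_voltage, max_clock):
    """Return a new curve flatlined at max_clock from from_voltage upward."""
    return [(v, max_clock if v >= from_voltage else clk) for v, clk in curve]

curve = [(0.800, 1733), (0.900, 1886), (1.000, 2050),
         (1.031, 2088), (1.050, 2100), (1.062, 2114), (1.093, 2138)]

flat = flatline(curve, 1.031, 2126)
for v, clk in flat:
    print(f"{v:.3f} V -> {clk} MHz")
```

The result is exactly the behaviour described above: the card may pick any voltage from 1.031 V up, but every such point targets the same max clock, so it only draws more voltage when the load actually needs it.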


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Well, I believe I got it sorted out.
> 
> Since you can't lock it down using Ctrl+L, just find your max OC at stock voltage (1.062) and start going down in voltage until you are unstable. Mine was at 1.031V. So what I did was set my max OC at 1.031V and flatline it until 1.062.
> 
> That tells the card that it can go as high as your max OC but can use whatever voltage is needed at the time. In Firestrike, on more demanding parts of the test it would use 1.050 (not even the maximum) and on less demanding ones it goes down and uses only 1.031V.
> 
> Maybe if there's something heavier, it will go up to 1.062V. Gotta do more testing.
> 
> Just wanted to share...


Have you ever experienced positive results with the curve?

I haven't. Even once.

Seems like it only degrades performance.


----------



## Whitechap3l

Quote:


> Originally Posted by *nexxusty*
> 
> Have you ever experienced positive results with the curve?
> 
> I haven't. Even once.
> 
> Seems like it only degrades performance.


Curve gives me a lot more.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> Have you ever experienced positive results with the curve?
> 
> I haven't. Even once.
> 
> Seems like it only degrades performance.


Yes, actually using the curve gives me better results than offset. I can get one more clock step with the curve than with the offset.

When I had the Armor it was the other way around, better results with offset. IDK, maybe each chip overclocks better with one method than the other. But at least with the FTW I get my best results with the curve.


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> Yes, actually using the curve gives me better results than offset. I can get a clock more with the curve than with the offset.
> 
> When I had the Armor it was the other way, better results with offset. IDK, maybe each chip will overclock better with a method than the other. But at least with FTW I get my better results with the curve.


Same here, Gigabyte G1; using the V/F curve, great performance and control


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Yes, actually using the curve gives me better results than offset. I can get a clock more with the curve than with the offset.
> 
> When I had the Armor it was the other way, better results with offset. IDK, maybe each chip will overclock better with a method than the other. But at least with FTW I get my better results with the curve.


Clock means nothing. I can do 2156mhz with the curve and the performance is worse....

Have any of you actually tested?

I doubt what I'm seeing is abnormal. Basically everyone says that they get less performance with the curve.

What makes you and that other guy different? Test your performance when using the curve vs your highest overclock without it.

I'll bet your highest offset OC is faster performance wise.


----------



## KillerBee33

Quote:


> Originally Posted by *nexxusty*
> 
> Clock means nothing. I can do 2156mhz with the curve and the performance is worse....
> 
> Have any of you actually tested?
> 
> I doubt what I'm seeing is abnormal. Basically everyone says that they get less performance with the curve.
> 
> What makes you and that other guy different? Test your performance when using the curve vs your highest overclock without it.
> 
> I'll bet your highest offset OC is faster performance wise.


Sounds about right, I managed to lock it @ 2202 with the curve with no real performance gain


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> Clock means nothing. I can do 2156mhz with the curve and the performance is worse....
> 
> Have any of you actually tested?
> 
> I doubt what I'm seeing is abnormal. Basically everyone says that they get less performance with the curve.
> 
> What makes you and that other guy different? Test your performance when using the curve vs your highest over clock without it.
> 
> I'll bet your highest offset OC is faster performance wise.


Did it already. My max overclock with both curve and offset hovers around 2100-2114, with its peak at 2126. With the curve I can get 25250 and with offset 25000 Firestrike graphics. Also, with the curve the card uses the whole 1.062V only when necessary; most of the time I'm at 1031, 1050 tops.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Did it already. My max overclock with both curve and offset hovers around 2100-2114, with its peak at 2126. With the curve I can get 25250 and with offset 25000 Firestrike graphics. Also, with the curve the card uses the whole 1.062V only when necessary; most of the time I'm at 1031, 1050 tops.


Well congrats.... you'd be the first person I've seen say that the curve improved performance.

Lol.


----------



## juniordnz

Quote:


> Originally Posted by *nexxusty*
> 
> Well congrats.... you'd be the first person I've seen say that the curve improved performance.
> 
> Lol.


Just did it again here. Same everything but:

Curve with 2126mhz: 25.258
Offset with 2126mhz: 25.244

Pretty much the same. The difference being that the curve uses 1031-1050mv and offset uses 1062mv. Same score with less power. I'll take it









But I get what you're saying. With the Armor the curve was horrible too. It just happens to be different with the FTW. Don't ask me why...


----------



## GreedyMuffin

A lower-end curve gives me better performance than offset. A higher-end OC is better with offset.

At 2050 I get 23.8K. At 2139 I get 24.3K.


----------



## RedRumy3

I just played Overwatch for about 5 hours with this overclock and everything was great, but am I missing the power limit setting or does the 1080 not have one?

I wonder if Corsair is going to make a water cooler for the 1080. I haven't checked into other cooling, but I don't like running this card at 100% fan speed to keep it at 60C while gaming for hours lol.


----------



## nexxusty

Quote:


> Originally Posted by *RedRumy3*
> 
> I just played overwatch for about 5 hours with this overclock and everything was great but am I missing power limit or does the 1080 not have that?
> 
> I wonder if corsair is going to make a water cooler for the 1080 I haven't checked into other cooling but I don't like running this card at 100% fan speed to keep it at 60C while gaming for hours lol.


Just make your own hybrid cooler. It's extremely easy.


----------



## juniordnz

Quote:


> Originally Posted by *RedRumy3*
> 
> am I missing power limit or does the 1080 not have that?


That's weird. All 1080s have a TDP setting. You should be seeing yours there.

On topic: just got my PR with a combination of offset and curve. Max overclock set using offset to 2126mhz/1.031V, then used the curve to flatline all the voltages above that to 2126mhz. 25300 graphics.


----------



## RedRumy3

Quote:


> Originally Posted by *nexxusty*
> 
> Just make your own hybrid cooler. It's extremely easy.


Hmm, looks like EVGA sells one for the 1080, so I might just do that

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1

hmm
Quote:


> Originally Posted by *juniordnz*
> 
> That's weird. All 1080s have TDP setting. You should be seeing yours there.


Yeah idk I noticed when I first got my card that I don't have that setting.


----------



## GanGstaOne

Finally I managed to find a seller that ships Thermal Grizzly Conductonaut here


----------



## LolCakeLazors

So I can get a stable 2088-2101 with my max voltage hitting 1.093. Not sure if I should dial it back or not. Temps are fine and stay in the low 60s.

EDIT: Yeah, I dialed it back and it's mostly stable at 1.075. Could probably lower it more. What are you guys using? Afterburner?


----------



## Bishop07764

Taking a stab at overclocking tonight. Pascal is a different beast alright. I get less stability when upping the voltage at all, at least in the testing I've done with Doom. Thinking I would surely have to up the voltage for overclocking, I figured it might just be a poor clocker, at least for Doom. Took the voltage back to stock, and it's rocking an unwavering 2164 core and 555+ memory just fine in Doom now. And it also appears to be doing it at a lower voltage of 1.043 volts, where I could swear it was doing 1.064 volts or so last night on the stock Gaming Z overclock bios.









Edit: Have to up the core voltage to +75 to get Valley to pass at the same settings. Weird.
Quote:


> Originally Posted by *LolCakeLazors*
> 
> So I can get a stable 2088-2101 with my max voltage hitting 1.093. Not sure if I should dial back or not. Temps are fine and stay in the low 60s.
> 
> EDIT: Yeah I dialed it back and it's stable mostly at 1.075. Could probably lower it more. What are you guys using? Afterburner?


Yeah. I think most everyone is using Afterburner. It's working great for me. Boost 3.0 is taking some getting used to though. Looks like some pretty good clocks there.


----------



## cstarkey42

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah. I think most everyone is using Afterburner. It's working great for me. Boost 3.0 is taking some getting used to though. Looks like some pretty good clocks there.


I prefer Afterburner, but I've figured out that you can use Precision XOC's pixel clock to OC your monitor while still using DSR. The DSR settings are at the monitor's default refresh rate, but the native resolution keeps the OC refresh rate. It's annoying because playing with Afterburner's curve is so much better than PXOC.


----------



## GanGstaOne

Quote:


> Originally Posted by *cstarkey42*
> 
> I prefer Afterburner, but I've figured out that you can use Precision XOC's pixel clock to OC your monitor while still using DSR. The DSR settings are at the monitor's default refresh rate, but the native resolution keeps the OC refresh rate. It's annoying because playing with Afterburner's curve is so much better than PXOC.


Ya EVGA Precision X is not what it was


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> Even though the card is not dealing with load drawing more watts, the clock stays at 2126mhz, voltage stays at 1.031V and idle temps are like 5 degrees higher.
> 
> That's a no no for me. Maybe it's nice for those who fold or whatever that keeps the card under stress all the time. But I don't want my card "all out" when I'm browsing the web.
> 
> I was so happy with the results


Profiles?

Quote:


> Originally Posted by *nexxusty*
> 
> Well congrats.... you'd be the first person I've seen say that the curve improved performance.
> 
> Lol.


Curve is good for me too. I needed 1.2V to reach 2.2GHz and can score 25.5k graphics in FS with that; I'm lucky to get near 24k with the stock BIOS, and I need every bit of that 1.2V in x-flash to hit 25.5k. The curve flat-lines, so if one doesn't adjust it we are left with the lower voltage at the start of the flat-line. I run up to 80C with this setting though, so clocks drop to 2150 average, I guess. Certainly diminishing returns with higher voltage, and debatable whether it's worth it or not. Okay, I guess, for trying to achieve the best benches.

Quote:


> Originally Posted by *RedRumy3*
> 
> I just played overwatch for about 5 hours with this overclock and everything was great but am I missing power limit or does the 1080 not have that?


You're using an old skin without power limit settings. Try a newer skin in preferences/options. An option to change percentages to actual Watts would have been nice IMO. BTW you can do power limit settings with the built-in nVidia nvsmi utility, and do it in Watts
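To make the percentages-vs-Watts point concrete: Afterburner's slider is a percentage of the board's default TDP, so converting is just multiplication, and nvidia-smi's `-pl` switch then takes the absolute Wattage. A tiny sketch (the 180 W default TDP is an assumed reference-1080 value; check yours with `nvidia-smi -q -d POWER`, and setting the limit needs admin rights):

```python
# Convert an Afterburner-style power-limit percentage to Watts and print
# the matching nvidia-smi command. The 180 W default TDP is an assumed
# value for a reference GTX 1080; query yours with `nvidia-smi -q -d POWER`.

def power_limit_watts(percent, default_tdp_w=180.0):
    """Absolute power limit in Watts for a given percentage slider setting."""
    return default_tdp_w * percent / 100.0

watts = power_limit_watts(120)        # a 120% slider setting
print(f"nvidia-smi -pl {watts:.0f}")  # set the limit in Watts (run as admin)
```

It only prints the command rather than running it, since changing the limit requires elevated rights and a supported driver.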









Quote:


> Originally Posted by *ralphi59*
> 
> In precisionx
> Manual Mode
> Click at 0.875 and +150
> One bar appear.
> If you only make 1 point, you stay at 0;875
> Dont forget to click apply
> *You can define the curve as you want.*


FYI the curve has 80 points; how many does the Precision software allow you to adjust?


----------



## nexxusty

Quote:


> Originally Posted by *ucode*
> 
> Profiles?
> Curve is good for me too. I needed 1.2V to reach 2.2GHz and can score 25.5k graphics in FS with that, lucky to get near 24k with stock BIOS and need every bit of that 1.2V in x-flash to hit 25.5k. Curve flat-lines so if one doesn't adjust the curve we are left with the lower voltage of the start of the flat-line. Run up to 80C with this setting though so clocks drop to 2150 average I guess. Certainly diminishing returns with higher voltage and debatable whether worth it or not. Okay I guess for trying to achieve best benches.
> Your using an old skin without power limit settings. Try a newer skin in preferences/options. An option to change percentages for actual Watts would have been nice IMO. BTW you can do power limit settings with the in-built nVidia nvsmi utility and do it in Watts
> 
> 
> 
> 
> 
> 
> 
> 
> FYI the curve has 80 points, how many does Precision software allow to adjust?


You don't have an FE correct?


----------



## ucode

I have a Galax FE, with stock cooling.


----------



## Whitechap3l

Quote:


> Originally Posted by *ucode*
> 
> Profiles?
> Curve is good for me too. I needed 1.2V to reach 2.2GHz and can score 25.5k graphics in FS with that, lucky to get near 24k with stock BIOS and need every bit of that 1.2V in x-flash to hit 25.5k. Curve flat-lines so if one doesn't adjust the curve we are left with the lower voltage of the start of the flat-line. Run up to 80C with this setting though so clocks drop to 2150 average I guess. Certainly diminishing returns with higher voltage and debatable whether worth it or not. Okay I guess for trying to achieve best benches.
> Your using an old skin without power limit settings. Try a newer skin in preferences/options. An option to change percentages for actual Watts would have been nice IMO. BTW you can do power limit settings with the in-built nVidia nvsmi utility and do it in Watts
> 
> 
> 
> 
> 
> 
> 
> 
> FYI the curve has 80 points, how many does Precision software allow to adjust?


On Air ?


----------



## jodasanchezz

Quote:


> Originally Posted by *ucode*
> 
> Profiles?
> Curve is good for me too. I needed 1.2V to reach 2.2GHz and can score 25.5k graphics in FS with that, lucky to get near 24k with stock BIOS and need every bit of that 1.2V in x-flash to hit 25.5k. Curve flat-lines so if one doesn't adjust the curve we are left with the lower voltage of the start of the flat-line. Run up to 80C with this setting though so clocks drop to 2150 average I guess. Certainly diminishing returns with higher voltage and debatable whether worth it or not. Okay I guess for trying to achieve best benches.
> Your using an old skin without power limit settings. Try a newer skin in preferences/options. An option to change percentages for actual Watts would have been nice IMO. BTW you can do power limit settings with the in-built nVidia nvsmi utility and do it in Watts
> 
> 
> 
> 
> 
> 
> 
> 
> FYI the curve has 80 points, how many does Precision software allow to adjust?


How do you get 1.2V out of your card?
And what is x-flash?


----------



## ucode

Yes, on air with 90% effective fan speed from the x-flash. Need that fan blowing though. Not sure if I forgot to set it once or if it stopped by itself, but after playing some not-so-demanding Skyrim for a couple of hours I came out of the game to find 96C logged! Never had the OSD on, didn't think I needed it, so not sure why that happened. Turns out 96C is the NVIDIA hard slowdown temp and 99C is the shutdown temp.

x-flash = cross-flashing: using a BIOS meant for a different board. Usually a good chance of bricking, i.e. a dead card.
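For the curious, a cross-flash session typically looks something like the dry-run sketch below. The nvflash flags shown (`--save` for backup, `--protectoff`, `-6` to override the PCI subsystem-ID mismatch) are the commonly cited ones, and the target ROM filename is a placeholder. This only prints the commands; actually running them carries the brick risk described above:

```python
# Dry-run sketch of a cross-flash (x-flash). Prints the commands only;
# executing them against real hardware risks a bricked card.
# "target_bios.rom" is a placeholder, not a real file.
steps = [
    "nvflash --save backup_original.rom",  # 1. always back up the stock BIOS
    "nvflash --protectoff",                # 2. lift the write protect
    "nvflash -6 target_bios.rom",          # 3. -6 overrides the subsystem ID check
]
for cmd in steps:
    print(cmd)
```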


----------



## Whitechap3l

Quote:


> Originally Posted by *ucode*
> 
> Yes, on air with 90% effective fan speed from the x-flash. Need that fan blowing though. Not sure if I forgot to set it once or if it stopped by itself but after playing some not so demanding Skyrim for a couple of hours came out of the game to find 96C logged!. Never had the OSD on, didn't think I needed it, so not sure why that happened. Turns out 96C is the nVidia hard slowdown temp and 99C is the shutdown temp.


Wouldn't be worth it for me personally. If you want good clocks with low temps, get a waterblock or at least a hybrid solution.









I get 25.6k scores with only 1.09V on water...


----------



## ucode

Yep, and some people get over 26k. Others aren't so lucky.







Personally I will probably just go back to basic on air, speed is good enough for what I use it for.


----------



## Whitechap3l

Quote:


> Originally Posted by *ucode*
> 
> Yep, and some people get over 26k. Others not so lucky
> 
> 
> 
> 
> 
> 
> 
> Personally I will probably just go back to basic on air, speed is good enough for what I use it for.


Sure, true.








But for me it isn't worth hard-modding my card to gain maybe 500 points in FS, the equivalent of 1 to 2 fps in games.
Different people, different opinions and goals.


----------



## VPII

So I figured out that I needed to refit the hybrid cooler on my EVGA GTX 1080 FE.

Right now it's back to a max of 47C when gaming Doom for 2 hours or so, and about 36C when running FS or Spy. No BIOS mod and no voltage-curve change, and I can run 2139MHz core and 11200MHz memory 24/7, but given that I'm only gaming at 1080p I leave it stock and will OC for benching.

Sent from my SM-G925F using Tapatalk


----------



## ralphi59

Yep ucode look at this


----------



## ralphi59

I've made only 3 points.
My load voltage is 0.893V.
You know what I mean now?


----------



## ucode

Thanks for the pic, I can see 30 points on that and wanted to know if the other 50 points are there or not. Just curious, I guess I should download it and check. Tried AB a while back and it only does 40 of the 80.


----------



## ralphi59

I've used Precision X OC since the first beta.
Works great.
Excuse my French writing.
This is an excellent place to exchange hardware tips and to improve my English.
Have a nice day, man.


----------



## justinyou

Quote:


> Originally Posted by *tin0*
> 
> As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.
> 
> 
> 
> *Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).
> 
> When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


I have just flashed my MSI Gaming X to this Gaming Z bios, so far so good.
Will be performing some tests afterwards.
Thanks for sharing this amazing bios, as I can't find it from the MSI website.


----------



## shadow85

Was about to buy 2x G1 Gaming 1080s and EK FC GTX 1080 G1 blocks to slap on them, but realised the EK website says the new NVIDIA SLI HB bridges are not compatible with these blocks, though regular SLI bridges can be used.

I thought the new HB SLI bridges are needed to reduce SLI stuttering. What should I do?


----------



## boredgunner

Quote:


> Originally Posted by *shadow85*
> 
> Was about to buy 2x G1 gaming 1080s, and EK FCGTX 1080 G1 blocks to slap on them, but realised on the ek website it says the New Nvidia Sli HB bridges are not compatible with these blocks but regualr SLi bridges can be used.
> 
> I thought the new HB SLi bridges are needed to reduce SLi stuttering. What should I do?


Any alternatives to EK, whether it's for the G1 Gaming or other GTX 1080s? If you're set on SLI, HB bridge should take priority.


----------



## GanGstaOne

Quote:


> Originally Posted by *shadow85*
> 
> Was about to buy 2x G1 gaming 1080s, and EK FCGTX 1080 G1 blocks to slap on them, but realised on the ek website it says the New Nvidia Sli HB bridges are not compatible with these blocks but regualr SLi bridges can be used.
> 
> I thought the new HB SLi bridges are needed to reduce SLi stuttering. What should I do?


Better to get one Titan X Pascal and not deal with SLI problems.


----------



## shadow85

Quote:


> Originally Posted by *boredgunner*
> 
> Any alternatives to EK, whether it's for the G1 Gaming or other GTX 1080s? If you're set on SLI, HB bridge should take priority.


Well, I just pulled the trigger. Apparently EK are making their own HB SLI bridges soon, ready for their blocks.

Going to be interesting, as this will be my first ever custom loop build as well, and for 2x G1s on a dedicated loop.


----------



## Works4me

Quote:


> Originally Posted by *boredgunner*
> 
> Any alternatives to EK, whether it's for the G1 Gaming or other GTX 1080s? If you're set on SLI, HB bridge should take priority.


You have this little puppy:
http://www.aliexpress.com/store/product/VGA-water-block-n680-x-gtx680-gtx770-gtx670-gtx660-gtx760-graphics-card-water-block/431286_32250446082.html
I bought 2 for my MSI GTX 1080 GAMING (the ones compatible with my cards, not this one of course).
Testing them later this week.


----------



## stxe34

Quote:


> Originally Posted by *shadow85*
> 
> Well I just pulled the trigger. Apparantly EK are making there own HB SLI bridges soon ready for their blocks.
> 
> Going to be interesting as this will be my first ever custom loop build aswell, and for 2x G1s on a dedicated loop.


I just cut 3mm off the ends of the NVIDIA HB bridge. You can't notice!


----------



## jodasanchezz

Time for some testing.


----------



## juniordnz

Quote:


> Originally Posted by *jodasanchezz*
> 
> TIme for some Testing


Nice, post back the results. btw, what case is that?


----------



## IronAge

Pretty in pink.


----------



## jodasanchezz

Quote:


> Originally Posted by *juniordnz*
> 
> Nice, post back the results. btw, what case is that?


I'll be back later with results.

The case is a Phanteks Enthoo Evolv ATX Tempered Glass:
http://www.phanteks.com/Enthoo-Evolv-ATX-TemperedGlass.html


----------



## LolCakeLazors

Ran Heaven overnight and it seems that it crashed. Looks like I'll have to do more tweaking. Really want at least 2088-21xx core clock though. Going to look into using Afterburner and not Precision X. Also is it normal for my FTW to cap its voltage at around 1.093? I set the TDP to 120% w/ 100% voltage use in Precision X and it always hit 1.093 and reported it hit the voltage limit. I don't know how some of you are hitting 1.2v.


----------



## juniordnz

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Ran Heaven overnight and it seems that it crashed. Looks like I'll have to do more tweaking. Really want at least 2088-21xx core clock though. Going to look into using Afterburner and not Precision X. Also is it normal for my FTW to cap its voltage at around 1.093? I set the TDP to 120% w/ 100% voltage use in Precision X and it always hit 1.093 and reported it hit the voltage limit. I don't know how some of you are hitting 1.2v.


It is normal; 1.093V is the limit. You can only get 1.2V using the XOC BIOS, which bricked my FTW.

Try switching to the slave BIOS with 130% TDP. And IME it's best to use stock voltage. I can get 2126MHz at stock voltage (even undervolting to 1.031V), so you should be able to get something similar too.


----------



## LolCakeLazors

Could I have a screenshot of your settings in Afterburner? I was OCing in Precision X and running Heaven (in 1440p because I have a 1440p monitor) and it would crash if I went any higher than around +80 Clock offset at stock voltages. Nothing like artifacting but it would crash the program and W10 would report that the GPU driver crashed and restarted it.

EDIT: I'll try switching to the Slave bios and see if that helps too.


----------



## juniordnz

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Could I have a screenshot of your settings in Afterburner? I was OCing in Precision X and running Heaven (in 1440p because I have a 1440p monitor) and it would crash if I went any higher than around +80 Clock offset at stock voltages. Nothing like artifacting but it would crash the program and W10 would report that the GPU driver crashed and restarted it.
> 
> EDIT: I'll try switching to the Slave bios and see if that helps too.


Not at home right now, but:

Voltage: 0% (leave it)
TDP%: 130%
TEMPERATURE: 92C
CLOCK: +121 (Press ctrl+F and see what clock is set to 1.062V at the curve. Mine is 2126)
MEMORY: +500

A full run of Firestrike Stress Test (about 10min) is a nice fast way to check stability.

Your clock will open at 2126. How much it clocks down will depend on temperature. I can hold 2101MHz until 54-55C.
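That temperature behavior follows GPU Boost 3.0's thermal bins: roughly one ~13 MHz bin is dropped each time the core crosses another temperature threshold. The exact thresholds vary card to card, so the ones below are assumptions chosen to roughly mimic the numbers above (full boost cold, about two bins down by the mid-50s):

```python
# Rough GPU Boost 3.0 thermal model: drop one ~13 MHz bin per threshold crossed.
# Thresholds are assumed/illustrative; every card bins slightly differently.
BIN_MHZ = 13
THRESHOLDS_C = [38, 50, 58, 66, 74, 82]  # each crossing costs one bin

def boosted_clock(cold_clock_mhz, temp_c):
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return cold_clock_mhz - bins_lost * BIN_MHZ

print(boosted_clock(2126, 35))  # 2126 -- cold, full boost
print(boosted_clock(2126, 55))  # 2100 -- two bins down by the mid-50s
```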


----------



## LolCakeLazors

Quote:


> Originally Posted by *juniordnz*
> 
> Not at home right now, but:
> 
> Voltage: 0% (leave it)
> TDP%: 130%
> TEMPERATURE: 92C
> CLOCK: +121 (Press ctrl+F and see what clock is set to 1.062V at the curve. Mine is 2126)
> MEMORY: +500
> 
> A full run of Firestrike Stress Test (about 10min) is a nice fast way to check stability.
> 
> Your clock will open at 2126. How much clock downs will depend on temperatures. I can get 2101mhz until 54-55C.


Thanks I'll try it when I get back home.


----------



## wsarahan

Hi guys, how are you?

Is anyone else seeing a memory clock bug after the new NVIDIA driver released yesterday?

372.54, the Anniversary Update.

The memory usage always shows a huge number, something like 437890876.

I already closed RivaTuner and Afterburner and opened them again, nothing; restarted the PC, and nothing either.

Using 1080 SLI and 4.3 beta 4.

EDIT: On the NVIDIA forums a guy with 980 Ti SLI has the same issue with the new driver; I'm trying to get more info and more cases of the bug.

EDIT 2: Reverted to the old driver and everything is working like a charm again, so the issue is with the new NVIDIA driver and SLI, as I tested here. Another issue with the new driver is that the core frequency of the cards is lower than with the older driver: with the older driver I get 2063 by default, and with the new driver in the same scenario I get 2050 with Afterburner.

Sent from my iPhone using Tapatalk


----------



## fat4l

New driver is no good. Unstable for me.


----------



## boredgunner

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys how are you?
> 
> Are someone with Memory clock bug after the new Nvidia driver released yesterday?
> 
> 372.54 Aniversary Update
> 
> The memory usage always appears a huge number something about 437890876
> 
> I already closed the rivatuner and afterburner opened again and nothing, restarted the pc and nothing also
> 
> Using 1080 SLI and 4.3 beta 4
> 
> EDIT: At Nvidia Foruns a guy with 980TI SLI has the same issue with new driver, i`m trying to get more info and more cases about the bug
> 
> EDIT 2 : Reverted to the old driver and everything is working like a charm again, so the issue is with the new Nvidia Driver and SLI owners as i tested here, another issue with the new driver is that the core frequency from the cards are lower than with older driver, with older driver by default i get 2063 and with the new driver with same scenario i get 2050 with afterburner
> 
> Enviado do meu iPhone usando Tapatalk


Quote:


> Originally Posted by *fat4l*
> 
> New driver is no good. Unstable for me.


I guess I should avoid the new driver then? And NVIDIA is supposed to be the one with better drivers than AMD, lol. Although I knew this hasn't really been true for a while.


----------



## GreedyMuffin

New driver is stable here.


----------



## GanGstaOne

New driver great here too


----------



## wsarahan

Quote:


> Originally Posted by *boredgunner*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wsarahan*
> 
> Hi guys how are you?
> 
> Are someone with Memory clock bug after the new Nvidia driver released yesterday?
> 
> 372.54 Aniversary Update
> 
> The memory usage always appears a huge number something about 437890876
> 
> I already closed the rivatuner and afterburner opened again and nothing, restarted the pc and nothing also
> 
> Using 1080 SLI and 4.3 beta 4
> 
> EDIT: At Nvidia Foruns a guy with 980TI SLI has the same issue with new driver, i`m trying to get more info and more cases about the bug
> 
> EDIT 2 : Reverted to the old driver and everything is working like a charm again, so the issue is with the new Nvidia Driver and SLI owners as i tested here, another issue with the new driver is that the core frequency from the cards are lower than with older driver, with older driver by default i get 2063 and with the new driver with same scenario i get 2050 with afterburner
> 
> Enviado do meu iPhone usando Tapatalk
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> New driver is no good. Unstable for me.
> 
> 
> I guess I should avoid the new driver then? And NVIDIA is supposed to be the one with better drivers than AMD, lol. Although I knew this hasn't really been true for a while.

For SLI users it's not OK at all. Afterburner is buggy as hell, memory usage is bugged, and the core clock is lower with this driver (some boost issue?).

Anyone with same issues?

Sent from my iPhone using Tapatalk


----------



## juniordnz

No problems with the new drivers here. Didn't lose or gain anything with them.


----------



## raidflex

Quote:


> Originally Posted by *juniordnz*
> 
> No problems with the new drivers here. Didn't lose or win anything with it.


Same here; I did find DPC latency has improved too.


----------



## metal409

Quote:


> Originally Posted by *nexxusty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *juniordnz*
> 
> Yes, actually using the curve gives me better results than offset. I can get a clock more with the curve than with the offset.
> 
> When I had the Armor it was the other way, better results with offset. IDK, maybe each chip will overclock better with a method than the other. But at least with FTW I get my better results with the curve.
> 
> 
> 
> Clock means nothing. I can do 2156mhz with the curve and the performance is worse....
> 
> Have any of you actually tested?
> 
> I doubt what I'm seeing is abnormal. Basically everyone says that they get less performance with the curve.
> 
> What makes you and that other guy different? Test your performance when using the curve vs your highest over clock without it.
> 
> I'll bet your highest offset OC is faster performance wise.

I just did some testing of my own on this, using the curve does give me a boost on my MSI FE.

Stock settings - Graphics Score 23,542
http://www.3dmark.com/fs/9811310

Core Offset +190 (2101mhz), Memory +500 - Graphics Score 24,404
http://www.3dmark.com/fs/9811358

Curve OC 2202Mhz @ 1.062v, Memory +500 - Graphics Score 24,856
http://www.3dmark.com/fs/9811467
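A quick sanity check on those runs; the percentages below are just arithmetic on the three graphics scores posted:

```python
# Relative gains from the Fire Strike graphics scores above.
stock, offset_oc, curve_oc = 23542, 24404, 24856

def gain(new, base):
    """Percentage improvement of `new` over `base`, one decimal place."""
    return round((new - base) / base * 100, 1)

print(gain(offset_oc, stock))     # 3.7  -- offset OC over stock
print(gain(curve_oc, stock))      # 5.6  -- curve OC over stock
print(gain(curve_oc, offset_oc))  # 1.9  -- curve over offset
```

So the curve bought roughly an extra 2% over the plain offset on this card.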


----------



## GreedyMuffin

NVM. The new driver is NOT ok.

I got green dots on my screen after playing 5 min of BF4. Before this it went many hours without a crash.

And dots are usually memory, right?


----------



## fat4l

Quote:


> Originally Posted by *GreedyMuffin*
> 
> NVM. New driver is NOT ok.
> 
> I got green dots on my screen after playing 5 min of BF4. Before many hours didn't crash?
> 
> And dots is usually mem, right?


Like I said, my OC is not stable with this driver. So... no good.


----------



## Bishop07764

Quote:


> Originally Posted by *justinyou*
> 
> I have just flashed my MSI Gaming X to this Gaming Z bios, so far so good.
> Will be performing some tests afterwards.
> Thanks for sharing this amazing bios, as I can't find it from the MSI website.


I concur. This has been an excellent BIOS for me as well on my Seahawk EK. It boosts to a fixed ~2070 without me doing anything. Still trying to feel out the max overclock; it's looking like it might be 2164 or so. Let us know how it works for you. I might have to try the curve; I've just been using the offset so far.


----------



## GanGstaOne

Gigabyte1080XtremeGamingReviewSampleUpdatedBIOS.zip 251k .zip file


----------



## Whitechap3l

Holy **** that new driver gives me grief...
By the way, does anyone know if it is harmful to use the boost lock in Afterburner so that your card sits constantly at 1.09V and, say, 2170MHz?


----------



## cstarkey42

I spent 30 minutes after switching SSDs trying to figure out why FO3 kept crashing when loading a new/save game only to discover it was the new driver. Rolled back and it works again. Nvidia Inspector also has trouble with it, though there is an update that fixes that problem.


----------



## GanGstaOne

Quote:


> Originally Posted by *Whitechap3l*
> 
> Holy **** that new driver gives me aids...
> By the way ;Someone knows if it is harmful to use the boost configuration in afterburner so that your card is constantly at 1.09v and let's say 2170mhz?


Don't know about the 1080, but I used the same option in Precision on my 980 and never had a problem. It's useful in game to maintain a constant clock.
Quote:


> Originally Posted by *cstarkey42*
> 
> I spent 30 minutes after switching SSDs trying to figure out why FO3 kept crashing when loading a new/save game only to discover it was the new driver. Rolled back and it works again. Nvidia Inspector also has trouble with it, though there is an update that fixes that problem.


An update for Nvidia Inspector??


----------



## GreedyMuffin

Yep, same here. The old driver gives me no crashes anymore.

Downclocked my CPU for fun: 4000MHz at 1.030V, with the GPU running at 2050/0.950V. Should draw less power.

Dunno if I lose a lot of FPS compared to 4500. We'll see.

Temps are a lot lower as well.


----------



## Whitechap3l

Maybe the Strix T4 BIOS works better now :-D


----------



## cstarkey42

Quote:


> Originally Posted by *GanGstaOne*
> 
> update for nvidia inspector ??


https://forums.geforce.com/default/topic/957810/v372-54-nvidia-inspector-profile-editor-crashing-update-here-/?offset=4


----------



## GanGstaOne

Quote:


> Originally Posted by *cstarkey42*
> 
> https://forums.geforce.com/default/topic/957810/v372-54-nvidia-inspector-profile-editor-crashing-update-here-/?offset=4


Thanks, but this is just the profile inspector, not the whole program. Guess they stopped updating the program after v1.9.7.6.


----------



## juniordnz

Let me correct myself: the new driver is not ok here either.

I get sudden crashes on R6S with the same settings that were rock solid stable before.

Going from 1.031V to 1.062V at the same clock seems to solve the crashes, but no thanks. Rolling back here...


----------



## cstarkey42

Quote:


> Originally Posted by *GanGstaOne*
> 
> thanks but this is just the profile inspector not the whole program guess they stop updating the program after v1.9.7.6


Just the profile inspector? That's like 99% of the reason I even use NI.


----------



## GanGstaOne

Same BIOS, different drivers.

368.95
(This is actually 368.95; NVIDIA just reports 372.54 because that was the last driver installed. The mismatch comes from swapping BIOSes: install a driver while one BIOS is active, return to the previous BIOS, and the card uses the older driver.)



372.54



The power limit changed by 50W.


----------



## EmerilLIVE

Just got my new EVGA GTX 1080 FTW in today. Unfortunately it has massive coil whine. I've opened a ticket with support & suppose I'll have to RMA. Has anyone else gotten a new EVGA GTX 1080 FTW with bad coil whine?


----------



## LolCakeLazors

Quote:


> Originally Posted by *EmerilLIVE*
> 
> Just got my new EVGA GTX 1080 FTW in today. Unfortunately it has massive coil whine. I've opened a ticket with support & suppose I'll have to RMA. Has anyone else gotten a new EVGA GTX 1080 FTW with bad coil whine?


It's not massive but it's there. I actually RMAed my 1080 FTW in hopes of getting a non-coil whine 1080 but this one has coil whine too. I just settled for it since it's hard to hear with the case closed and fans running.


----------



## justinyou

Quote:


> Originally Posted by *Bishop07764*
> 
> I concur. This has been an excellent bios for me as well on my Seahawk Ek. It boosts fixed to about 2070 without me doing anything. Still trying to feel out the max overclock. It's looking like it might be 2164 or so. Let us know how it works for you. I might have to try the curve. Just been using the offset so far.


The truth is, I am a lazy man. I try not to have to use MSI Afterburner to OC the Gaming X card before I start playing a game, and that's why I flashed it to the Gaming Z BIOS.
Now, when a game starts, the Gaming Z BIOS automatically boosts the card's core clock to 2025MHz, which did not happen with the original Gaming X BIOS.

I will try doing some manual OC when I have free time over the weekend, but for now I am totally enjoying the performance of the card.








I'm finishing Rise of the Tomb Raider, which has great detail, and will start playing The Witcher 3 after I finish ROTR.


----------



## juniordnz

Well, rolled back to 368.95 and I'm still getting crashes within minutes when playing R6S undervolted to 1.031V. That's odd; I did some pretty heavy testing with these settings yesterday, including a full hour or more of R6S, and it went perfectly fine. I just don't get it...

Undervolting had saved me some 10% TDP and kept temps 2 degrees lower.


----------



## EmerilLIVE

Quote:


> Originally Posted by *LolCakeLazors*
> 
> It's not massive but it's there. I actually RMAed my 1080 FTW in hopes of getting a non-coil whine 1080 but this one has coil whine too. I just settled for it since it's hard to hear with the case closed and fans running.


UGH. Really hoping to get one with no coil whine. My computer is pretty quiet with a water cooling loop & I'm planning on getting the EK water block for the EVGA GTX 1080 FTW so I'm hoping for very little noise. My Gigabyte GTX 1070 G1 Gaming has no coil whine whatsoever and the computer is very quiet with it installed.


----------



## Bishop07764

Quote:


> Originally Posted by *justinyou*
> 
> The truth is, I am a lazy man, I try not to use the MSI AfterBurner to OC the Gaming X card before I start playing a game, that's why I flashed it to the Gaming Z bios.
> Now, when the game starts, the Gaming Z bios will automatically boost the card's core clock to 2025Mhz, and this will not happen with the original Gaming X bios.
> 
> I will try doing some manual OC when I have some free time during the weekend, but for now, I am totally enjoying the performance of the card.
> 
> 
> 
> 
> 
> 
> 
> 
> Am finishing the Rise of The Tomb Raider, which have great game details and will start playing the Witcher3 after finishes the ROTR.


I can relate to that. With my old 780 I had to use rbby tool in combination with Afterburner to set my clocks every time. I'm loving just letting the card automatically boost to 2070+. Ridiculously easy to flash to with the MSI batch file too.


----------



## LolCakeLazors

Looks like I can't keep a core clock offset greater than +87; Heaven black-screens/crashes any higher. It's currently on the slave BIOS with 130% TDP, 92C temp limit, +87 core clock, and +400 memory clock. Haven't touched voltage, but it levels out at 1.050V after a while. From there it's just 2076 all the way.


----------



## jodasanchezz

So after a short time testing the 1080 Classified I'm disappointed at the moment.

Tested the normal BIOS and the LN2 one.
Seems to be the same 2 BIOSes as the 1080 FTW.

Scores are bad and clocks are low...
I'm not at home but I will show results in the evening (EU time).

Max clocks I can get stable are 2037MHz:

100% voltage
+38 core
+500 mem
100% fan

I tried leaving the voltage untouched and reached the same speeds...

I use Afterburner.
Any advice on how I can push the card more?


----------



## Bishop07764

Quote:


> Originally Posted by *jodasanchezz*
> 
> So after not much time Testing with the 1080 Classi im dissapointes atm.
> 
> Testen Normal BIOS ans LN
> Seems to be the Same 2 BIOS as 1080 FTW.
> 
> Scores are Bad and clocks are low...
> Im not at home but i will schow resaults in the evenin (EU Time).
> 
> MAX Clocks i can get stabel at are 2037mhz
> 
> 100% Voltage
> +38Core
> +500Mem
> 100%Fan
> 
> If tried to let Voltage untouched and reched the same Speeds....
> 
> I Use Afterburner.
> Any advice how i can try to push the Card more


I'm still in the process of dialing my clocks in myself. Pascal is extremely temperature sensitive. What are your temps? Does it boost higher and then settle at 2037? The colder the card is the more consistent and stable your clocks will be. I would try seeing what you can get without adding any voltage. Try dialing back memory and adding more to the core to see if that helps. Be sure to max the power limit so that you aren't voltage and power starved.


----------



## MrDerrikk

Hey guys, I've been reading since the start of the thread but missed the last 40 pages or so due to uni starting. My EVGA FTW GTX 1080 has finally, finally arrived though (only took 3 and a half months!), so I tried out a little overclocking. Sadly, it seems I can only push the core clock by +77 before large amounts of stuttering occur in Heaven at 2560x1080, resulting in a clock speed of only 2062. I'm really hoping I'm missing something here, as I'm a first-time overclocker and all; below are some captures during and after the benchmark:


Spoiler: Warning: Spoiler!








PS: I haven't played around with memory much yet, so don't pick on me too much for that.


----------



## justinyou

Quote:


> Originally Posted by *Bishop07764*
> 
> I concur. This has been an excellent bios for me as well on my Seahawk Ek. It boosts fixed to about 2070 without me doing anything. Still trying to feel out the max overclock. It's looking like it might be 2164 or so. Let us know how it works for you. I might have to try the curve. Just been using the offset so far.


I just checked the price of both the Gaming X and Gaming Z on Newegg, and guess what: the MSI 1080 Gaming Z is selling at USD 1049. Just ridiculous.








The Gaming X is only USD 859. By transforming my card into the Gaming Z version just by flashing the BIOS, I have saved myself USD 190.


----------



## jodasanchezz

Quote:


> Originally Posted by *Bishop07764*
> 
> I'm still in the process of dialing my clocks in myself. Pascal is extremely temperature sensitive. What are your temps? Does it boost higher and then settle at 2037? The colder the card is the more consistent and stable your clocks will be. I would try seeing what you can get without adding any voltage. Try dialing back memory and adding more to the core to see if that helps. Be sure to max the power limit so that you aren't voltage and power starved.


My temps are around 60-62C at 100% fan.

The clocks go up to 2050 and, after warming up, settle at 2037.

When I try hitting 2050+ (2067 for my card), Heaven crashes after a few seconds...
The LN2 BIOS has a 130% power target.

I tried Afterburner and Precision X. I think I need more time for testing, but most people manage 2100+ I think...


----------



## gingerbreadman

Hi guys, I'm deciding if I should get a MSI 1080 gaming or gaming z

Is it the same if I OC the gaming compared to a gaming z?


----------



## justinyou

Quote:


> Originally Posted by *gingerbreadman*
> 
> Hi guys, I'm deciding if I should get a MSI 1080 gaming or gaming z
> 
> Is it the same if I OC the gaming compared to a gaming z?


Mate, if you want an MSI 1080, then I recommend the Gaming X; it can save you some money.
Like Bishop07764 and myself, you can take the MSI 1080 Gaming X card and transform it into a Gaming Z. This is really easily done by flashing the BIOS.


----------



## gingerbreadman

Quote:


> Originally Posted by *justinyou*
> 
> Mate, if you want a MSI 1080, then i recommend the Gaming X, this can save you some money.
> Like Bishop07764 and myself, we have the MSI 1080 Gaming X card but have transformed it to become Gaming Z. This can be really easily done by flashing the bios.


Cheers! Is it true that the Z can be OC'd further than the X, or is it just down to luck?


----------



## justinyou

Quote:


> Originally Posted by *gingerbreadman*
> 
> Cheers! Is it true that the Z can be OC'd further than the X, or is it just down to luck?


As far as I can tell, the Gaming X and Z share the same PCB design and use the same components (capacitors, chokes, etc.), with the exception of the RGB lighting on the backplate.
So unless someone can tell me MSI is doing some sort of binning to hand-pick the GPU chips for the Gaming Z, they should OC about the same.


----------



## Thetbrett

looks like my Zotac has gone missing in transit. DHL are looking into it, but not feeling good about it. What's Amazon like with this kind of thing?


----------



## toncij

Quote:


> Originally Posted by *Thetbrett*
> 
> looks like my Zotac has gone missing in transit. DHL are looking into it, but not feeling good about it. What's Amazon like with this kind of thing?


It can't be missing actually. There is tracking. Amazon will refund you if it can't be found. It's being sent from their end so it's on them and the courier.


----------



## ucode

Quote:


> Originally Posted by *GanGstaOne*
> 
> Power Limit is changed by 50W


Default is 250W (100%). If you had previously set 120% (300W), it may have reset to 100% when the new driver was installed. A 375W limit should mean most users won't need the resistor mod. Does AB allow you to set 375W (150%)? You can also set this via nvidia-smi: "nvidia-smi.exe -pl 375"
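The percent-to-watts mapping behind those sliders is simple to work out. A minimal sketch, assuming the 250 W (100%) baseline quoted in this post (the real baseline varies per card and BIOS; `nvidia-smi -q -d POWER` reports a board's actual default and max limits):

```python
# Convert a power-target percentage into an absolute limit in watts.
# The 250 W default below is the figure quoted in the post above;
# treat it as an assumption, since it differs between cards/BIOSes.

def power_limit_watts(percent: float, default_watts: float = 250.0) -> float:
    """Return the absolute power limit for a given power-target percent."""
    return default_watts * percent / 100.0

print(power_limit_watts(120))  # 300.0, a typical pre-driver-update setting
print(power_limit_watts(150))  # 375.0, what "nvidia-smi.exe -pl 375" sets directly
```

Setting the limit in watts via nvidia-smi sidesteps the percentage slider entirely, which is why it is a handy cross-check when a driver update silently resets the target to 100%.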


----------



## GanGstaOne

Quote:


> Originally Posted by *ucode*
> 
> Default is 250W (100%). If you had previously set 120% (300W), it may have reset to 100% when the new driver was installed. A 375W limit should mean most users won't need the resistor mod. Does AB allow you to set 375W (150%)? You can also set this via nvidia-smi: "nvidia-smi.exe -pl 375"


Yes, I can set 150%; the only BIOS that allows this is the Gigabyte 1080 Xtreme one.


----------



## Whitechap3l

Quote:


> Originally Posted by *GanGstaOne*
> 
> Yes, I can set 150%; the only BIOS that allows this is the Gigabyte 1080 Xtreme one.


Did you also get a performance boost? Because that is the whole problem: you can increase voltage, TDP, and clocks, but if you gain nothing it is more or less worthless.


----------



## GanGstaOne

Quote:


> Originally Posted by *Whitechap3l*
> 
> Did you also get a performance boost? Because that is the whole problem: you can increase voltage, TDP, and clocks, but if you gain nothing it is more or less worthless.


Well, I didn't actually check that at max clocks, but at the same 2088MHz clocks I do get a small boost in benchmarks with the Xtreme BIOS over the G1 BIOS, plus a much higher TDP limit.


----------



## Whitechap3l

Quote:


> Originally Posted by *GanGstaOne*
> 
> Well, I didn't actually check that at max clocks, but at the same 2088MHz clocks I do get a small boost in benchmarks with the Xtreme BIOS over the G1 BIOS, plus a much higher TDP limit.


And what card are you actually using?


----------



## jodasanchezz

Average clock speeds for the 1080:

Hi guys, is it correct to say that average overclocks are around 2100?

I have seen a lot of people achieving 2164MHz @ 1.09V. What should be possible?

I'm able to send my 1080 Classified back within 14 days if I'm not happy.
ATM max stable is ~2038MHz.

I need some more time to test, but it doesn't look good to me.


----------



## Whitechap3l

Quote:


> Originally Posted by *jodasanchezz*
> 
> Average clock speeds for the 1080:
> 
> Hi guys, is it correct to say that average overclocks are around 2100?
> 
> I have seen a lot of people achieving 2164MHz @ 1.09V. What should be possible?
> 
> I'm able to send my 1080 Classified back within 14 days if I'm not happy.
> ATM max stable is ~2038MHz.
> 
> I need some more time to test, but it doesn't look good to me.


Around 2000-2100 is the average on air, I think (I am on water, getting 2177MHz with the FE BIOS).
You have to understand that with these 1080s, clocks alone don't seem to mean much.

You can get higher clocks with e.g. the Strix T4 BIOS, with no TDP limit and 1.2V (I hit around 2260MHz), BUT there is no real performance boost over an FE BIOS at 1.08-1.09V and 2170MHz.

So the real issue is converting the extra TDP and voltage into performance, I guess.


----------



## jodasanchezz

Quote:


> Originally Posted by *Whitechap3l*
> 
> Around 2000-2100 is the average on air, I think (I am on water, getting 2177MHz with the FE BIOS).
> You have to understand that with these 1080s, clocks alone don't seem to mean much.
> 
> You can get higher clocks with e.g. the Strix T4 BIOS, with no TDP limit and 1.2V (I hit around 2260MHz), BUT there is no real performance boost over an FE BIOS at 1.08-1.09V and 2170MHz.
> 
> So the real issue is converting the extra TDP and voltage into performance, I guess.


Thanks for the advice!


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> It is normal. 1.093 is the limit. You can only get 1.2V using XOC BIOS which bricked my FTW.
> 
> Try switching to the slave BIOS with 130%TDP. And IME is best to use stock voltage. I can get 2126mhz with stock voltage (even undervolting to 1.031) so you should be able to get something similar too.


The XOC Bios bricked your FTW? Like completely or just the bios you flashed it to?


----------



## shadow85

So which generally overclocks better and is more stable: the G1 Gaming or the EVGA SC ACX models?

I already ordered the G1 but am awaiting stock for it. But everywhere I look there seems to be a lot of SC ACX stock already, and I'm tempted to just go for that so I can get the card sooner.

They will be going under water.

EDIT: I forgot to mention I am in Australia, so I can't order the G1 from Newegg; I am awaiting Amazon stock.

I don't want to buy locally because I'd be paying an extra $200 AUD.


----------



## GanGstaOne

Quote:


> Originally Posted by *Whitechap3l*
> 
> And what card are you actually using?


Gigabyte 1080 G1 Gaming


----------



## Bishop07764

Quote:


> Originally Posted by *justinyou*
> 
> I just checked the price for both the Gaming X and Gaming Z on Newegg, and guess what, the MSI 1080 Gaming Z is selling at USD 1049. Just ridiculous.
> 
> 
> 
> 
> 
> 
> 
> 
> The price of the Gaming X is only USD 859. By transforming my card into the Gaming Z version just by flashing the BIOS, I have saved myself USD 190.


Yeah! There appears to be a reason this is no longer posted on the MSI website. I can only find the BIOS for the regular Gaming X on their site.







Thankfully, mine appears to have zero coil whine just like my 780 Lightning. But those prices.







I paid less than 800 USD for my Seahawk EK.
Quote:


> Originally Posted by *jodasanchezz*
> 
> My temps are around 60-62°C at 100% fan.
> 
> The clocks go up to 2050 and, after warming up, settle at 2037.
> 
> When I try hitting 2050+ (2067 for my card), Heaven crashes after a few seconds. The LN2 BIOS has a 130% power target.
> 
> I tried Afterburner and Precision X. I think I need more time for testing, but most people seem able to reach 2100.


I wouldn't say that most people are getting 2100. There seem to be more people with Founders Editions hitting those clocks. I wanted to go the cheapest route possible for mine and got one with a waterblock pre-installed, plus I was able to use a coupon on Newegg and knocked it further below its MSRP. I did some further testing with Doom last night, and I get a few red streaks in some levels at 2164. They seemed to go away after lowering to 2154, but this might not be stable in other games; The Witcher 3 or ROTTR seem to be good tests. Overclocking these is a bit different. I originally thought mine was going to clock terribly because, with the default BIOS and a quick overclock in Doom, it only did about 2025. My problem was that Doom didn't like me upping the voltage at all. It's still a complete crapshoot/lottery with these things. I would be disappointed as well since it's a Classified. If you are unhappy and able to exchange it, you might get a better one next time. Not guaranteed, though; the next one might only go to 2050 or something. Can you send it back for any reason?

Quote:


> Originally Posted by *gingerbreadman*
> 
> Hi guys, I'm deciding if I should get a MSI 1080 gaming or gaming z
> 
> Is it the same if I OC the gaming compared to a gaming z?


The gaming Z bios has worked amazingly well for me as well as justinyou. It brings my default boost to almost 2080 and overclocks the memory slightly as well. There may be some type of binning possibly going on with them as I would think they have to guarantee the Z cards to be able to hit the OC bios default on them; but the risk for flashing should be minimal since it's the exact same pcb and everything from what we can tell. You just miss out on the rgb backplate. I wouldn't pay extra for the Z myself.


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> The XOC Bios bricked your FTW? Like completely or just the bios you flashed it to?


Just the BIOS I flashed on. I had to switch to the slave, boot into Windows, change the switch to master, and flash over it. But since I've done that, whatever I flash using nvflash appears to be copied over both the master and slave BIOS, no matter where the switch is.

Any ideas on that? nvflash is overwriting both BIOSes...


----------



## Deders

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Nope, they don't have the GLH BIOS, only the GS; I was hoping maybe someone here would have the GLH.


I could upload a GLH bios. Pm me in a few hours to remind me. I should get a chance later on.


----------



## fat4l

Quote:


> Originally Posted by *jodasanchezz*
> 
> Average clock speeds for the 1080:
> 
> Hi guys, is it correct to say that average overclocks are around 2100?
> 
> I have seen a lot of people achieving 2164MHz @ 1.09V. What should be possible?
> 
> I'm able to send my 1080 Classified back within 14 days if I'm not happy.
> ATM max stable is ~2038MHz.
> 
> I need some more time to test, but it doesn't look good to me.


Getting 2202MHz at 1.09V on water + the TDP hard mod, and passing all benches. I have it set to 2190MHz for gaming with no issues at all.
I'm hoping for a BIOS editor soon, so we can get 1.15V on water (at least) and 2250MHz+.

Usually these clock between 2050 and 2150MHz. I wouldn't keep any card that can't reach 2100MHz.


----------



## fjordiales

@Bishop07764, @justinyou, @gingerbreadman, I'm leaning towards the theory that they do test which chips pass as X or Z. I have the REGULAR 1080 Gaming (1620 core) and both the X and Z BIOS work. The issue I'm having is that the OC BIOS for the Z is not stable; more precisely, 2025+ clocks are not stable. The Z gaming BIOS is fine, and all the X BIOSes are fine.

I can't really complain since I got this as an open-box item. I think the previous owner thought it was a Gaming X, since it looks exactly the same, just not the clocks. I inspected the card and found no blemishes or fingerprints at all; the fan sticker is even intact. Not bad for $560 just to have a 1080, especially with the supply/demand pricing of the card. It just didn't make the cut for the Z.


----------



## LolCakeLazors

God, I just noticed how loud my coil whine is once I overclock. The voltage hits 1.050 and it sounds like a wire is stuck in a fan. I don't know if it's just the GPUs I've been getting, since my last 1080 FTW had coil whine and my old 290X Lightning had it as well. I might go and buy an EVGA P2 or something to see if my EVGA G2 isn't up to par.


----------



## GanGstaOne

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Nope, they don't have the GLH BIOS, only the GS; I was hoping maybe someone here would have the GLH.


Gainward 1080 GLH BIOS:

https://www.techpowerup.com/vgabios/184684/184684


----------



## Deders

Quote:


> Originally Posted by *GanGstaOne*
> 
> Gainward 1080 GLH BIOS:
> 
> https://www.techpowerup.com/vgabios/184684/184684


That's the same bios number as my glh.


----------



## kx11

I uploaded my HOF BIOS to the TechPowerUp database a month ago and they haven't published it yet.


----------



## Bishop07764

Quote:


> Originally Posted by *fjordiales*
> 
> @Bishop07764, @justinyou, @gingerbreadman, I'm leaning towards the theory that they do test which chips pass as X or Z. I have the REGULAR 1080 Gaming (1620 core) and both the X and Z BIOS work. The issue I'm having is that the OC BIOS for the Z is not stable; more precisely, 2025+ clocks are not stable. The Z gaming BIOS is fine, and all the X BIOSes are fine.
> 
> I can't really complain since I got this as an open-box item. I think the previous owner thought it was a Gaming X, since it looks exactly the same, just not the clocks. I inspected the card and found no blemishes or fingerprints at all; the fan sticker is even intact. Not bad for $560 just to have a 1080, especially with the supply/demand pricing of the card. It just didn't make the cut for the Z.


That's an awesome deal. I think getting a Gaming X card would give you a pretty reasonable chance of being successful and stable with a Gaming Z flash. Not guaranteed, though. I wasn't sure that mine would handle the default OC mode on the Z BIOS, but it's handled it fine and more. I've been quite pleased.


----------



## GanGstaOne

Quote:


> Originally Posted by *kx11*
> 
> I uploaded my HOF BIOS to the TechPowerUp database a month ago and they haven't published it yet.


It's in the unverified uploads under NVIDIA; they only publish BIOS files they personally check.

Edit: there are 125 GTX 1080 BIOS files in the unverified uploads and only 25 in the NVIDIA uploads verified by TechPowerUp, so if you want a 1080 BIOS, check the unverified ones; there are all sorts of BIOS files there.


----------



## GanGstaOne

Quote:


> Originally Posted by *fat4l*
> 
> Getting 2202MHz 1.09V on water + tdp hard mod and passing all benches. Having it set 2190MHz for gaming and no issues at all.
> I'm hoping for bios editor soon, so we can get 1.15v on water(at least) and 2250MHz+
> 
> Usually it clocks between 2050-2150MHz. I would keep any card that cant reach 2100MHz..


You can get more than 1.15V even now with a different BIOS like the Strix T4 or Gigabyte Xtreme; both are able to give you 1.2V, and the Xtreme has a max TDP of 375W.


----------



## ucode

Quote:


> Originally Posted by *GanGstaOne*
> 
> Edit: there are 125 GTX 1080 BIOS files in the unverified uploads and only 25 in the NVIDIA uploads verified by TechPowerUp, so if you want a 1080 BIOS, check the unverified ones; there are all sorts of BIOS files there.


Be aware that the same BIOS with a different serial number will register on TPU as a different hash, and consequently as a different BIOS, even though they are essentially the same. Even with the same serial number, two dumps can differ if one was flashed back with a different flash version.
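That hash point is easy to demonstrate. A minimal sketch with made-up byte strings (fabricated placeholders, not real ROM contents): two images that differ only in an embedded serial string hash differently, so a database keyed on the file hash lists them as distinct BIOSes.

```python
# Two otherwise-identical "BIOS images" that differ only in an embedded
# serial string. The bytes are fabricated for illustration; real ROMs
# obviously look nothing like this.
import hashlib

rom_a = b"\x55\xaa" + b"GTX1080-BIOS" + b"SERIAL-0001" + b"\x00" * 32
rom_b = b"\x55\xaa" + b"GTX1080-BIOS" + b"SERIAL-0002" + b"\x00" * 32

# The functional payload is the same, but the file hashes differ,
# so a hash-keyed database treats them as two separate entries.
print(hashlib.sha256(rom_a).hexdigest() == hashlib.sha256(rom_b).hexdigest())  # False
```

The practical takeaway is that several "different" TPU entries for one card model can be functionally the same BIOS, and comparing version strings and build dates tells you more than comparing hashes.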


----------



## GanGstaOne

Quote:


> Originally Posted by *ucode*
> 
> Be aware that the same BIOS with a different serial number will register on TPU as a different hash, and consequently as a different BIOS, even though they are essentially the same. Even with the same serial number, two dumps can differ if one was flashed back with a different flash version.


I know they have several BIOS files of the same version, but they differ in build date or Gigabyte part number. So far I only get better results with the Xtreme and Xtreme Waterforce BIOSes on my G1.

EDIT: That's why I want to ask everyone here with Gigabyte 1080 Xtreme and Waterforce cards to please upload their BIOS to TechPowerUp.
Thanks


----------



## LolCakeLazors

Anyone recommend any good Platinum PSUs? Going to order one off of Newegg and see if it helps the coil whine. I have Premier so returns are hassle-free. Currently eyeing the 850 P2.


----------



## fjordiales

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Anyone recommend any good Platinum PSUs? Going to order one off of Newegg and see if it helps the coil whine. I have Premier so returns are hassle-free. Currently eyeing the 850 P2.


EVGA is my top choice, especially since they have a 10-year warranty on the P2. It's made by Super Flower. Seasonic is also a great choice, and I believe Corsair is good too.

But with the choices mentioned, pick the best price once you narrow it down.


----------



## boredgunner

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Anyone recommend any good Platinum PSUs? Going to order one off of Newegg and see if it helps the coil whine. I have Premier so returns are hassle-free. Currently eyeing the 850 P2.


Several great choices:

The EVGA P2 is great, especially for the price. Reliable, a top-notch performer, 10-year warranty, and EVGA customer service is legendary. The only downside is that it relies on capacitors installed on the cables to get excellent ripple performance. The cables still aren't excessively bulky, but if something happens to the caps, that's not good. On the plus side, they reduce the length of the PSU, which matters in some cases.

Seasonic Platinum/Snow Silent delivers similar performance and the same warranty. Slightly more ripple but still barely any, no capacitors on the cables. I don't think their modular interface is quite as good as EVGA, but good enough.

Antec HCP Platinum is another top notch choice, but typically overpriced. They're about as good as it gets for PSUs and don't rely on capacitors being installed on the cables, but they are very long because Delta overbuilds these things. Only the 750W and 1300W were worth the asking price before. The 750W is discontinued now it seems.

Corsair AX = Seasonic Platinum so more excellence. Corsair AXi is also top tier but typically overpriced, I'd buy Seasonic PRIME instead for that Titanium level efficiency. I would not buy Corsair RMi and RMx which aren't even Platinum rated, since they're CWT built and CWT soldering and QC are still far from perfect.

Go with Seasonic PRIME if in stock and if you can afford it. Not Platinum, but higher.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Anyone recommend any good Platinum PSUs? Going to order one off of Newegg and see if it helps the coil whine. I have Premier so returns are hassle-free. Currently eyeing the 850 P2.


I built this new 2x1080 rig with the EVGA SuperNOVA 850W P2 Platinum. Solid PSU, plenty of modular cables, dead silent with the fan operating. 850W is probably a lot for today's systems, but it's always nice to have some overhead; I don't think I've seen 500W yet on my Kill A Watt. Also, the price was right at Newegg. I considered most of the other PSUs, particularly the Seasonic, but I just couldn't justify the extra price when these EVGAs have had such solid reviews and, as always, excellent customer service through EVGA.

I need to get some good pictures of my rig up, I'm really happy with how it turned out.


----------



## LolCakeLazors

So due to shipping, I can either get the AX760 or the P2 850 for the same price... Tough decision.


----------



## Kenshiro 26

What about be quiet! PSUs?

Was thinking about one for a HTPC build.


----------



## jedimasterben

Seasonic or bust.


----------



## LolCakeLazors

Quote:


> Originally Posted by *jedimasterben*
> 
> Seasonic or bust.


I guess the Corsair it is (since it's manufactured by Seasonic). The straight-up Seasonic models cost way too much.


----------



## jedimasterben

Quote:


> Originally Posted by *LolCakeLazors*
> 
> I guess the Corsair it is (since it's manufactured by Seasonic). The straight-up Seasonic models cost way too much.


I'm pretty sure Seasonic has not manufactured Corsair's PSUs in quite a while. I know their initial units were, but not subsequent ones, because some of them haven't been that great. Seasonic is expensive, but that's what happens when you buy the best of the best.


----------



## LolCakeLazors

Quote:


> Originally Posted by *jedimasterben*
> 
> I'm pretty sure Seasonic has not manufactured Corsair's PSUs in quite a while. I know their initial units were, but not subsequent ones, because some of them haven't been that great. Seasonic is expensive, but that's what happens when you buy the best of the best.


Ah screw it I'll take the plunge and get the Seasonic Prime.


----------



## jedimasterben

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Ah screw it I'll take the plunge and get the Seasonic Prime.


#winning


----------



## LolCakeLazors

Quote:


> Originally Posted by *jedimasterben*
> 
> #winning


Don't worry. I'm returning this if it doesn't fix the coil whine


----------



## Bishop07764

Replaced my old Antec 900W gamer PSU with an EVGA G2 1300W. I haven't heard coil whine on any card I've owned since. My two cents. Many excellent choices, though. Hope it solves your coil whine issues.


----------



## looniam

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Don't worry. I'm returning this if it doesn't fix the coil whine


Super Flower (the EVGA G2/P2/T2 series) has been slapping Seasonic around for ~2 years now:

750 PRIME 12V ripple at 62A load:


850 P2 12V ripple at 70.8A load:


Both are superb, but... less is always better.


----------



## LolCakeLazors

Quote:


> Originally Posted by *looniam*
> 
> Super Flower (the EVGA G2/P2/T2 series) has been slapping Seasonic around for ~2 years now:
> 
> 750 PRIME 12V ripple at 62A load:
> 
> 
> 850 P2 12V ripple at 70.8A load:
> 
> 
> Both are superb, but... less is always better.


I think someone mentioned that the EVGA had less ripple but the drawback was that the cables had capacitors in them. I'll probably look into the P2 if the Seasonic doesn't work.


----------



## boredgunner

Quote:


> Originally Posted by *looniam*
> 
> Super Flower (the EVGA G2/P2/T2 series) has been slapping Seasonic around for ~2 years now:
> 
> 750 PRIME 12V ripple at 62A load:
> 
> 
> 850 P2 12V ripple at 70.8A load:
> 
> 
> Both are superb, but... less is always better.


Negligible. The more substantial difference here is that Superflower/EVGA needs capacitors installed on the cables to do this, which are a liability especially if you ever decide to use custom cables or do your own sleeving. Now imagine if Seasonic used this cheap capacitor cable method, how low the ripple would be.

The efficiency difference between PRIME and P2 is also a more substantial difference but that's an unfair comparison given the existence of EVGA T2.


----------



## looniam

Quote:


> Originally Posted by *boredgunner*
> 
> Negligible. The more substantial difference here is that Superflower/EVGA needs capacitors installed on the cables to do this, which are a liability especially if you ever decide to use custom cables or do your own sleeving. Now imagine if Seasonic used this cheap capacitor cable method, how low the ripple would be.
> 
> The efficiency difference between PRIME and P2 is also a more substantial difference but that's an unfair comparison given the existence of EVGA T2.


I did say BOTH were superb, but _cheap?_

Please, added ripple suppression is ALWAYS a good thing. But yeah, IF one is custom sleeving, then look elsewhere. Which brings up: those who removed the caps while sleeving noticed no increase in ripple that "suddenly" caused an issue.

In other words, those caps are negligible.


----------



## nexxusty

Quote:


> Originally Posted by *LolCakeLazors*
> 
> God, I just noticed how loud my coil whine is once I overclock. The voltage hits 1.050 and it sounds like a wire is stuck in a fan. I don't know if it's just the GPUs I've been getting, since my last 1080 FTW had coil whine and my old 290X Lightning had it as well. I might go and buy an EVGA P2 or something to see if my EVGA G2 isn't up to par.


EVGA PSU fanboy or something? Why don't you buy an i-series Corsair? Best PSUs one could own.


----------



## boredgunner

Quote:


> Originally Posted by *looniam*
> 
> I did say BOTH were superb, but _cheap?_
> 
> Please, added ripple suppression is ALWAYS a good thing. But yeah, IF one is custom sleeving, then look elsewhere. Which brings up: those who removed the caps while sleeving noticed no increase in ripple that "suddenly" caused an issue.
> 
> In other words, those caps are negligible.


There have been tests on them without cable capacitors? Interesting, I want to see. I wasn't sure if it'd be like the old Antec TPQ 1200, although I'm glad the cables aren't that horrendously bulky. They actually feel rather normal.


----------



## LolCakeLazors

Quote:


> Originally Posted by *nexxusty*
> 
> EVGA PSU fanboy or something? Why don't you buy an i-series Corsair? Best PSUs one could own.


Just a fan of their warranty. They advance RMA'ed my G2 PSU last time with no questions asked. Just the little things.


----------



## looniam

Quote:


> Originally Posted by *boredgunner*
> 
> There have been tests on them without cable capacitors? Interesting, I want to see. I wasn't sure if it'd be like the old Antec TPQ 1200, although I'm glad the cables aren't that horrendously bulky. They actually feel rather normal.

Did I say there was testing?

Nope, I said _noticed_; ya know, stuff like hardware instability, shutdowns...

If you want to see, go look in the owners' thread, where Oklahoma Wolf (who does the reviews for jonnyGURU) mentions, when those sleeving ask about them, that the caps don't influence ripple _as much as most believe_.

So yeah, PSUs have come a long way since 2009/10 when that Antec was made.


----------



## justinyou

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah! There appears to be a reason this is no longer posted on the MSI website. I can only find the BIOS for the regular Gaming X on their site.
> 
> 
> 
> 
> 
> 
> 
> Thankfully, mine appears to have zero coil whine just like my 780 Lightning. But those prices.
> 
> 
> 
> 
> 
> 
> 
> I paid less than 800 USD for my Seahawk EK.


Less than 800 USD for a Seahawk EK? That's a steal, man.









Quote:


> Originally Posted by *fjordiales*
> 
> @Bishop07764, @justinyou, @gingerbreadman, I'm leaning towards the theory that they do test which chips pass as X or Z. I have the REGULAR 1080 Gaming (1620 core) and both the X and Z BIOS work. The issue I'm having is that the OC BIOS for the Z is not stable; more precisely, 2025+ clocks are not stable. The Z gaming BIOS is fine, and all the X BIOSes are fine.


From what I've heard, the NVIDIA GTX 1080 Founders Edition cards are easier to OC to higher frequencies, maybe because the chips have been binned? Just maybe...


----------



## shadow85

Hey guys, Gigabyte G1 gaming, or ASUS STRIX?


----------



## boredgunner

Quote:


> Originally Posted by *shadow85*
> 
> Hey guys, Gigabyte G1 gaming, or ASUS STRIX?


Odd choice since the STRIX costs much more. Between the two I'd get the STRIX. Less coil whine complaints it seems, cool lighting modes. But given the normal prices, I wouldn't get either unless I was water cooling in which case I'd consider the G1 Gaming.


----------



## IronAge

Gigabyte Windforce ... same PCB as G1, just lower clock rates and no RGB LEDs.

Nice deal:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817139130&cm_re=ax860i-_-17-139-130-_-Product

I own the Corsair AX760i and have never had any coil whine issues.

The be quiet! Dark Power Pro 11 solved coil whine issues for some GTX 1080 owners.


----------



## Bishop07764

I finally got around to testing out Firestrike.

http://www.3dmark.com/3dm/14228825

My CPU is a bit of a dinosaur now I guess. Thought my graphics score might be closer to 25k. Opinions? Memory clocked too high maybe? Chalk it up to older hardware?

http://www.3dmark.com/3dm/14229051

Timespy results too if it helps.


----------



## juniordnz

Quote:


> Originally Posted by *Bishop07764*
> 
> I finally got around to testing out Firestrike.
> 
> http://www.3dmark.com/3dm/14228825
> 
> My CPU is a bit of a dinosaur now I guess. Thought my graphics score might be closer to 25k. Opinions? Memory clocked too high maybe? Chalk it up to older hardware?


You should be getting more performance with those clocks...I get 25300 with 2126/1377


----------



## Bishop07764

Quote:


> Originally Posted by *juniordnz*
> 
> You should be getting more performance with those clocks...I get 25300 with 2126/1377


Think it might possibly be related to the older processor? I'm not sure how much I really want to chase down a score, but I do want it performing as it's supposed to. Thanks for sharing.

edit:
Thanks so much for your help. It turned out to be the memory. I had it clocked too high. I might still but good to know. My score kept going up in Valley. Firestrike is another story.

http://www.3dmark.com/3dm/14229292

Slight boost in Timespy.

http://www.3dmark.com/3dm/14229703?


----------



## jodasanchezz

So after hours of testing with my 1080 Classified, I'm not sure if I should keep this card.
In Germany I'm able to send it back within 14 days of purchase at no cost.
This is my max stable and reproducible score.

What do you guys think? Should I just go with the 1080 FTW (€100 less)?


----------



## TWiST2k

Quote:


> Originally Posted by *jodasanchezz*
> 
> So after hours of testing with my 1080 Classified, I'm not sure if I should keep this card.
> In Germany I'm able to send it back within 14 days of purchase at no cost.
> This is my max stable and reproducible score.
> 
> What do you guys think? Should I just go with the 1080 FTW (€100 less)?


I have a 980 Ti Classified and it was a good card at a good price, but when I looked at the 1080 Classified, the extra 70 dollars just wasn't worth it over the FTW. I now have a 1080 FTW and could not be happier with it! I would say ditch it and get an FTW.


----------



## nexxusty

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Just a fan of their warranty. They advance RMA'ed my G2 PSU last time with no questions asked. Just the little things.


They do RMA's right. That's for sure.

Corsair is good with PSU RMA's from my experience. Sent me an extra bag of cables.


----------



## Bishop07764

Quote:


> Originally Posted by *jodasanchezz*
> 
> So after hours of testing with my 1080 Classified, I'm not sure if I should keep this card.
> In Germany I'm able to send it back within 14 days of purchase at no cost.
> This is my max stable and reproducible score.
> 
> What do you guys think? Should I just go with the 1080 FTW (€100 less)?


I'd second that. Pocket the extra cash.


----------



## Bogga

Anyone with strix card/s that have tried the alternative strix bios? I tried it but the only difference I noticed was higher temps...


----------



## Menthol

Quote:


> Originally Posted by *jodasanchezz*
> 
> So after hours of testing with my 1080 Classified, I'm not sure if I should keep this card.
> In Germany I'm able to send it back within 14 days of purchase at no cost.
> This is my max stable and reproducible score.
> 
> What do you guys think? Should I just go with the 1080 FTW (€100 less)?


What's wrong with that? It looks like a good card to me. If you had cool water it would clock a little higher, but the chances of getting a better card are slim.


----------



## Vellinious

Got my FTWs in finally. Did some initial testing on air....waiting for EK to release the blocks. The first card I tested was extremely average. Could only hit 2136 on air....wasn't too happy with it. The 2nd one seems to do a lot better, but the thermal throttling is holding it back. I've had it up to 2164, but...it doesn't last very long there before it starts to drop clocks. No power limit throttling, though...so that's good.

Just tinkering still...but getting close to what I was hoping for.

8487: http://www.3dmark.com/spy/301239

21750: http://www.3dmark.com/fs/9832300

I'll do some SLI runs this weekend.


----------



## AllGamer

After a very long wait it finally arrived!

Awesome product, _*looks*_ like good quality











but it was made in China













I'll move my existing GTX 1080 FE to my Home Theatre PC and use it for VR (HTC Vive)

BTW, HTC Vive doesn't play well if you are using nVidia Surround View, or with the 3+1 setup, it's extremely annoying.

So I'm moving my VR headset to my Home Theatre room instead of my gaming rig room.


----------



## jodasanchezz

Quote:


> Originally Posted by *Vellinious*
> 
> Got my FTWs in finally. Did some initial testing on air....waiting for EK to release the blocks. The first card I tested was extremely average. Could only hit 2136 on air....wasn't too happy with it. The 2nd one seems to do a lot better, but the thermal throttling is holding it back. I've had it up to 2164, but...it doesn't last very long there before it starts to drop clocks. No power limit throttling, though...so that's good.
> 
> Just tinkering still...but getting close to what I was hoping for.
> 
> 8487: http://www.3dmark.com/spy/301239
> 
> 21750: http://www.3dmark.com/fs/9832300
> 
> I'll do some SLI runs this weekend.


Hi there,
I learned yesterday that clocks are not everything with Pascal.

Look at my graphics score in the last post with 2062 MHz on the GPU (CPU at stock ATM).

Try to find the sweet spot between clocks and results... totally different from Maxwell IMO


----------



## Whitechap3l

Quote:


> Originally Posted by *Bogga*
> 
> Anyone with strix card/s that have tried the alternative strix bios? I tried it but the only difference I noticed was higher temps...


Mine runs best with the EVGA FE BIOS. The Strix T4 BIOS gives you (at least with the newest drivers) insane FS scores, but unfortunately the card gets some errors when you stress test it.
So far I haven't tried the normal Strix OC BIOS; maybe I will today.


----------



## Whitechap3l

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi there,
> I learned yesterday that clocks are not everything with Pascal.
> 
> Look at my graphics score in the last post with 2062 MHz on the GPU (CPU at stock ATM).
> 
> Try to find the sweet spot between clocks and results... totally different from Maxwell IMO


When you are using the no-TDP Strix T4 BIOS, more is actually more with the newest NVIDIA drivers.








Started yesterday at 2190 MHz: around 25,400 in FS. 2224 MHz gives ~25,600 in FS, and 2244 gives me scores of nearly 26k.









As I said earlier, unfortunately the card crashes in games/stress tests, for me at least.


----------



## IronAge

Quote:


> Originally Posted by *Bishop07764*
> 
> I'd second that. Pocket the extra cash.


It is the silicon lottery as usual ... he could get an FTW which does worse. Plus, with the latest WHQL driver you get slightly lower clock rates but higher points in Time Spy.

The Classified has a better heatsink ... a buddy's does 2126/5600, game stable/stress test stable.


----------



## jodasanchezz

Well, I found an old score of my sold 980 Ti... I regret selling it.
In comparison, the 1080 Classified doesn't look too good for €850.

I think I'll send the 1080 back, game for a few months on my 780, and wait for the 1080 Ti or Volta.

What do you think?

*980ti Score*
http://www.3dmark.com/fs/6072537

*1080 Classified Score*
http://www.overclock.net/t/1601288/lightbox/post/25449918/id/2853723


----------



## Deders

The 1080 link won't open for me but you should be getting close to 20,000 on it?


----------



## shadow85

Can we use 2 different STRIX models in SLI? e.g.

STRIX-GTX1080-8G-GAMING
and
STRIX-GTX1080-A8G-GAMING

and do they use the same EK blocks?


----------



## Whitechap3l

Quote:


> Originally Posted by *shadow85*
> 
> Can we use 2 different STRIX models in SLI? e.g.
> 
> STRIX-GTX1080-8G-GAMING
> and
> STRIX-GTX1080-A8G-GAMING
> 
> and do they use the same EK blocks?


You can put the same BIOS on both cards. The A8G is the same card with slightly higher clocks, so I guess it should work, but no guarantee here; I'm not that familiar with SLI.

Yeah, the EK block is the same.


----------



## Whitechap3l

Quote:


> Originally Posted by *jodasanchezz*
> 
> Well, I found an old score of my sold 980 Ti... I regret selling it.
> In comparison, the 1080 Classified doesn't look too good for €850.
> 
> I think I'll send the 1080 back, game for a few months on my 780, and wait for the 1080 Ti or Volta.
> 
> What do you think?
> 
> *980ti Score*
> http://www.3dmark.com/fs/6072537
> 
> *1080 Classified Score*
> http://www.overclock.net/t/1601288/lightbox/post/25449918/id/2853723


Still don't get why people are complaining...
980 -> 1080
980 Ti -> coming 1080 Ti

It's like comparing a 1 Series BMW with a 3 Series BMW, in my eyes.


----------



## Pepillo

Hello.

Finally, I can run my MSI 1080 Gaming X with the Gaming Z BIOS and a Kraken G10 + X31 at 2202 MHz core, 11016 MHz memory. Very happy with this; max temp 43°C.

http://www.3dmark.com/spy/299923


----------



## IronAge

@shadow85

you should be able to flash the A8G Bios to the 8G and get slightly higher clock rates with that bios.

they have got the same PCB ... so Strix waterblocks will fit on both.

@jodasanchezz

compare with DX12 Benchmark Timespy and there will be a more noticeable difference between GTX980Ti and GTX 1080.

my Gigabyte GTX980Ti Xtreme @1550 MHz does around 10500 Graphics score in Firestrike Extreme.

my Inno3D GTX1080 iChill X3 @ ~2050 MHz does around 11500 Graphics Score in FSE


----------



## schoolofmonkey

Ok I've just started to overclock the Strix and have come across something interesting.
This card hits the power limit even on stock settings, I increased it to 120% with stock clocks and it still hit the power limit, just not as much.

From the Googling I did it seems to be common, I'm guessing the only way around it is a custom BIOS right?


----------



## shadow85

Quote:


> Originally Posted by *IronAge*
> 
> @shadow85
> 
> you should be able to flash the A8G Bios to the 8G and get slightly higher clock rates with that bios.
> 
> they have got the same PCB ... so Strix waterblocks will fit on both.


I have never flashed BIOS before on GPUs. Is it easy to do?


----------



## jodasanchezz

I found an old score of my 980 Ti AMP Extreme (sold) to compare with the 1080 Classified... well, the 1080 is not worth the €850, I think. Will send the card back.

1080 Classified

980ti AMP Extreme
http://www.3dmark.com/fs/6072537
Quote:


> Originally Posted by *Whitechap3l*
> 
> Still dont get it why people complaining...
> 980 -> 1080
> 980TI -> coming 1080TI
> 
> Its like to compare a 1er BWM with a 3er BWM in my eyes


It's easy: I came from a 980 Ti and now I have a 1080, and I want to see a benefit or no benefit. The 980 Ti's price back in the day was €750; the 1080 Classified is €850. In Fire Strike, 3000 points more. Is that worth the money? Yes or no?
That's the comparison....

for me
----------



## toncij

Quote:


> Originally Posted by *Whitechap3l*
> 
> *coming* 1080TI


That's probably not going to happen this year at all.


----------



## Whitechap3l

Quote:


> Originally Posted by *jodasanchezz*
> 
> I found an old score of my 980 Ti AMP Extreme (sold) to compare with the 1080 Classified... well, the 1080 is not worth the €850, I think. Will send the card back.
> 
> 1080 Classified
> 
> 980ti AMP Extreme
> http://www.3dmark.com/fs/6072537
> It's easy: I came from a 980 Ti and now I have a 1080, and I want to see a benefit or no benefit. The 980 Ti's price back in the day was €750; the 1080 Classified is €850. In Fire Strike, 3000 points more. Is that worth the money? Yes or no?
> That's the comparison....
> 
> for me


Prices went up, yes... I mean, look at the Titan price...
Hmm... I mean, you get the 1080 for the same price, I guess, and more performance.









It is also a question of what you came from. Sure, I can understand that when you sold a 1500 MHz 980 Ti, your improvement with a 1080 is not that huge (at least money-wise), but then I think you have to ask yourself why not wait for the 1080 Ti series...


----------



## Whitechap3l

Quote:


> Originally Posted by *shadow85*
> 
> I have never flashed BIOS before on GPUs. Is it easy to do?


Not that hard









http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980
Link for Tutorial
I run nvflash --protectoff before flashing, but I don't know if it is actually necessary.

https://www.techpowerup.com/vgabios/?architecture=NVIDIA&manufacturer=&model=GTX+1080&interface=&memType=&memSize=8192

Link for approved 1080 Bios









If you have any questions, feel free to ask.
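If you do flash, it's worth a quick sanity check on the ROM file before pointing nvflash at it. Here's a minimal sketch in Python (the helper name and the 512 KB size cap are my own assumptions, not part of any flashing tool): every PCI option ROM starts with the bytes 0x55 0xAA, and byte 2 declares the ROM size in 512-byte blocks, so a file that fails these checks is definitely not a valid video BIOS.

```python
def looks_like_vbios(data: bytes, max_size: int = 512 * 1024) -> bool:
    """Basic sanity checks on a video BIOS image before flashing.

    Every PCI option ROM begins with the signature bytes 0x55 0xAA,
    and the byte at offset 2 gives the ROM size in 512-byte blocks.
    """
    if len(data) < 3 or len(data) > max_size:
        return False
    if data[0] != 0x55 or data[1] != 0xAA:
        return False
    rom_size = data[2] * 512          # declared size in bytes
    return rom_size <= len(data)      # declared size must fit in the file

# Example: a tiny fake image declaring one 512-byte block
fake = bytes([0x55, 0xAA, 0x01]) + bytes(509)
print(looks_like_vbios(fake))         # True
print(looks_like_vbios(b"\x00\x00"))  # False: no 0x55AA signature
```

Passing this check obviously doesn't prove the BIOS matches your card; it just catches a corrupted or mislabeled download before you flash it.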


----------



## toncij

Is there any picture of FTW vs Classified heatsink?

My FTWs can't go over 2114 without crashing.


----------



## IronAge

Just compare the dimensions ... the Classified is about 22mm taller ... reviews of the Classified are rare as of now.

Actually, I have not found any.









FTW costs only 30€ less than Classified (EVGA Europe Shop)

Think i am done with the iCHill X3 and will install the Gigabyte G1 now, got a Zotac Amp on the way.

The Inno3D iChill X3 is nice for a silent PC ... high memory clocks/stock boost ... but the fans only do 1620 RPM at 100%.


----------



## GanGstaOne

Quote:


> Originally Posted by *shadow85*
> 
> Hey guys, Gigabyte G1 gaming, or ASUS STRIX?


I have the G1 Gaming. Super good card for its price. Flashed the Xtreme BIOS: great scores and FPS, no TDP limits, overclocks very well.


----------



## grimboso

I would like to know how this guy managed 2700 on a 1080..



http://hwbot.org/submission/3278564_


----------



## IronAge

Subzero Cooling and ~200 MHz more with GPUPi versus a real 3D Benchmark.


----------



## grimboso

Quote:


> Originally Posted by *IronAge*
> 
> Subzero Cooling and ~200 MHz more with GPUPi versus a real 3D Benchmark.


So GPUPi is a very light test, then?

2500 still sets it as one of the highest LN2 runs on a 1080.


----------



## GanGstaOne

How are those guys able to get that much out of a 1080 without a BIOS tweaker or hard mods?


----------



## Bishop07764

Quote:


> Originally Posted by *IronAge*
> 
> It is silicon lottery as usual ... he could get a FTW which does worse + with the latest WHQL you get slightly lower clock rates but higher points in Time Spy.
> 
> Classified has a better heatsink ... the one of a buddy does 2126/5600 game stable/stress test stable.


This is true. It might clock even worse; but with pascal, the PCB seems to make less difference. He paid quite a bit more for the classified. I was originally holding out for a classified myself. When i saw how little it mattered, i just went the cheapest water cooling route. I mean here for me to buy a classified would have cost almost as much as my seahawk ek. Not worth it to me anyway.
Quote:


> Originally Posted by *Pepillo*
> 
> Hello.
> 
> Finally, I can run my MSI 1080 Gaming X with the Gaming Z BIOS and a Kraken G10 + X31 at 2202 MHz core, 11016 MHz memory. Very happy with this; max temp 43°C.


Awesome! Glad to see the gaming z bios treating you well. Nice clocks, my score starts going down slightly when i go over 2154 core.
Quote:


> Originally Posted by *AllGamer*
> 
> After a very long wait it finally arrived!
> 
> Awesome product, _*looks*_ like good quality
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but it was made in China
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll move my existing GTX 1080 FE to my Home Theatre PC and use it for VR (HTC Vive)
> 
> BTW, HTC Vive doesn't play well if you are using nVidia Surround View, or with the 3+1 setup, it's extremely annoying.
> 
> So I'm moving my VR headset to my Home Theatre room instead of my gaming rig room.


Wow. How many 1080's do you have? Loving my seahawk ek here.


----------



## toncij

Quote:


> Originally Posted by *Bishop07764*
> 
> This is true. It might clock even worse; but with pascal, the PCB seems to make less difference. He paid quite a bit more for the classified. I was originally holding out for a classified myself. When i saw how little it mattered, i just went the cheapest water cooling route. I mean here for me to buy a classified would have cost almost as much as my seahawk ek. Not worth it to me anyway.
> Awesome! Glad to see the gaming z bios treating you well. Nice clocks, my score starts going down slightly when i go over 2154 core.
> Wow. How many 1080's do you have? Loving my seahawk ek here.


Where did you get SeaHawk? Europe store or US? Seems unavailable for the most stores...


----------



## Bishop07764

Quote:


> Originally Posted by *toncij*
> 
> Where did you get SeaHawk? Europe store or US? Seems unavailable for the most stores...


I got it from Newegg here in the US. Been waiting on it for a long time, I was checking stock constantly. Like most of the 1080's stock here, it appears to sell out within minutes at Newegg at least. I really should have another to justify all the rad space that it's sitting on. It would be quite hard to actually get another right now at least. Cheapest way that I could find to go with a full block.


----------



## AllGamer

Quote:


> Originally Posted by *Bishop07764*
> 
> Wow. How many 1080's do you have? Loving my seahawk ek here.


Just the 3 so far.

I still need 1 more for my wife gaming rig,

but the Sea Hawk EK is sold out everywhere, it's so hard to get hold of them.


----------



## LolCakeLazors

Quote:


> Originally Posted by *Bishop07764*
> 
> This is true. It might clock even worse; but with pascal, the PCB seems to make less difference. He paid quite a bit more for the classified. I was originally holding out for a classified myself. When i saw how little it mattered, i just went the cheapest water cooling route. I mean here for me to buy a classified would have cost almost as much as my seahawk ek. Not worth it to me anyway.


Yep, clocks are highly tied to temperature on Pascal due to GPU Boost 3.0. Anything above 50 degrees C and my card starts to throttle its voltage resulting in a downclock. If I leave my case open with my AC on, I can hold 1.062 @ 2114 core clock. Otherwise it drops to 2088 after a while.
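The temperature behaviour described above can be pictured as GPU Boost stepping the clock down in discrete ~13 MHz bins as the core heats up. Here's a toy model in Python; the 50°C threshold and the one-bin-per-5°C rate are illustrative guesses of mine, not NVIDIA's real (unpublished) tables.

```python
BIN_MHZ = 13  # Pascal adjusts boost clocks in roughly 13 MHz steps

def boosted_clock(max_clock: int, temp_c: float,
                  throttle_start: float = 50.0,
                  degrees_per_bin: float = 5.0) -> int:
    """Illustrative GPU Boost model: drop one 13 MHz bin per
    `degrees_per_bin` degrees once temp passes `throttle_start`."""
    if temp_c <= throttle_start:
        return max_clock
    bins_dropped = int((temp_c - throttle_start) // degrees_per_bin) + 1
    return max_clock - bins_dropped * BIN_MHZ

print(boosted_clock(2114, 45))  # 2114: a cool card holds max boost
print(boosted_clock(2114, 55))  # 2088: two bins down at 55 C
```

With these made-up numbers the model happens to reproduce the 2114 -> 2088 drop quoted above: 55°C is two bins past the threshold, and 2 x 13 MHz = 26 MHz.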


----------



## IronAge

Quote:


> Originally Posted by *GanGstaOne*
> 
> I have the G1 Gaming. Super good card for its price. Flashed the Xtreme BIOS: great scores and FPS, no TDP limits, overclocks very well.


Just did the same ... flashed the G1 with the Xtreme BIOS and it runs into Perfcap PWR at around 2100 MHz during 3DMark Fire Strike Extreme.

With the G1 BIOS, VDDC is only 1.043 V ... I flashed it with the FTW BIOS, now VDDC is 1.062 V ... set PL to 110% ... works better now.

So what overclock are you getting with the Xtreme BIOS on your G1?


----------



## Bishop07764

Quote:


> Originally Posted by *AllGamer*
> 
> Just the 3 so far.
> 
> I still need 1 more for my wife gaming rig,
> 
> but the Sea Hawk EK is sold out everywhere, it's so hard to get hold of them.


Just the 3. Ha ha. I salute you sir. Enjoy.








Quote:


> Originally Posted by *LolCakeLazors*
> 
> Yep, clocks are highly tied to temperature on Pascal due to GPU Boost 3.0. Anything above 50 degrees C and my card starts to throttle its voltage resulting in a downclock. If I leave my case open with my AC on, I can hold 1.062 @ 2114 core clock. Otherwise it drops to 2088 after a while.


I'm sure that my temps never going above 40°C for my card definitely helps. I have been testing out The Witcher 3 at 4K this AM, maxed minus HairWorks. It would artifact occasionally at 2154. Took the offset down by 3 and lowered the voltage. Same clocks, and the artifacts appear to be gone. Anyone else noticed anything similar? I was just testing briefly and long-term stability is still to be determined, but I wonder if it was trying to boost higher and that's what might have been causing the occasional artifact.


----------



## grimboso

Quick question guys: if I were to run two reservoirs in my loop, how much of a performance loss do you think I am looking at, with one before the pump and one after?
Would the water hit the "bottom" of the second res and lose pressure, or will it find its way through without a large performance drop?


----------



## fat4l

Quote:


> Originally Posted by *GanGstaOne*
> 
> You can get more then 1.15v even now with different bios like Strix T4 or Gigabyte Xtreme both are able to give you 1.2v the Xtreme has max tdp of 375w


That bios is no good for FE. More clocks but less performance..


----------



## LolCakeLazors

Quote:


> Originally Posted by *Bishop07764*
> 
> I'm sure that my temps never going above 40 C for my card definitely helps. I have been testing out the Witcher 3 at 4k this am maxed minus hairworks. It would artifact occasionally at 2154. Took down the offset by 3 and lowered the voltage. Same clocks and artifacts appear to be gone. Anyone else noticed anything similar? I was just testing briefly and long term stability to be determined still, but I wonder if it was trying to boost higher and thats what might have been causing the occasional artifact.


I don't have an exact answer but I noticed that it's better to leave the voltage at stock for me. It becomes way more unstable if you set voltage to 100 in Afterburner/Precision X. I could barely do 21xx when I touched the voltage meter in Precision X and it was clocking 2088 @ 1.092V.


----------



## GanGstaOne

Quote:


> Originally Posted by *IronAge*
> 
> Just did the same ... flashed the G1 with the Xtreme BIOS and it runs into Perfcap PWR at around 2100 MHz during 3DMark Fire Strike Extreme.
> 
> With the G1 BIOS, VDDC is only 1.043 V ... I flashed it with the FTW BIOS, now VDDC is 1.062 V ... set PL to 110% ... works better now.
> 
> So what overclock are you getting with the Xtreme BIOS on your G1?


Stable in 3DMark at 2260 MHz with the Xtreme BIOS under water cooling; temps never go over 45°C. With the G1 BIOS, 2200 MHz. Never tried above 2260 because I don't need it. I keep my card at 2088 MHz, the sweet spot: very good score and FPS. With the G1 BIOS too, but I prefer the Xtreme. Found a few more different Xtreme BIOS files; I'll try them too after I reinstall Windows on my PC.


----------



## GanGstaOne

Quote:


> Originally Posted by *fat4l*
> 
> That bios is no good for FE. More clocks but less performance..


I've seen people with the FE and Strix T4 getting good performance, but under water.


----------



## IronAge

Quote:


> Originally Posted by *GanGstaOne*
> 
> Stable in 3DMark 2260Mhz with the Xtreme one with water cooling temps never go over 45C with G1 2200Mhz never tried above 2260 cause i dont need it i keep my card at 2088Mhz sweet spot very good score and fps with Xtreme one G1 bios too but i prefer Xtreme found few more different Xtreme bios file will try them too after i reinstall my pc win


Did you use the Xtreme review sample BIOS?

Perhaps you could upload the one you have on your G1, or let me know where you downloaded it?

Watercooling makes more difference than a beefier VRM or power supply, then.

Really nice overclocking you are getting there, even for H2O cooling ... congrats.


----------



## GanGstaOne

Quote:


> Originally Posted by *IronAge*
> 
> Did you use the Xtreme review sample BIOS?
> 
> Perhaps you could upload the one you have on your G1, or let me know where you downloaded it?
> 
> Watercooling makes more difference than a beefier VRM or power supply, then.
> 
> Really nice overclocking you are getting there, even for H2O cooling ... congrats.


Thanks, but as I said, I think for almost all 1080s the sweet spot is between 2000 and 2100 MHz, so find something in between and stick with it.
You can download all the BIOS files I have from TechPowerUp's unverified uploads.
And yes, I'm using the Xtreme review sample updated BIOS, which ends in .A3. There are three more .A3 BIOS files, but one of them is different; check the Gigabyte number below the BIOS version and you will see they differ. Same with the Xtreme Waterforce BIOS files that end in .B2.
The G1 BIOSes are .07, .66, and .FD.


----------



## Joshwaa

Quote:


> Originally Posted by *grimboso*
> 
> Quick question guys: if I were to run two reservoirs in my loop, how much of a performance loss do you think I am looking at, with one before the pump and one after?
> Would the water hit the "bottom" of the second res and lose pressure, or will it find its way through without a large performance drop?


What is the reason for the second res?


----------



## kx11

it's instock again

https://www.bhphotovideo.com/c/search?Ntt=geforce+gtx+1080&N=0&InitialSearch=yes&sts=ps&typedValue=


----------



## grimboso

Quote:


> Originally Posted by *Joshwaa*
> 
> What is the reason for the second res?


Purely aesthetics. I want to "frame" the hardware and make use of the two cylinder reservoirs I have, if possible.

I originally opted for a dual loop, but as I didn't go SLI, I don't see a reason for a loop for the GPU only.

Still have the hardware though, so I might as well use it.


----------



## Pepillo

Quote:


> Originally Posted by *Bishop07764*
> 
> Awesome! Glad to see the gaming z bios treating you well. Nice clocks, my score starts going down slightly when i go over 2154 core.


Thanks.

My score is scaling well with every core step up:

Perhaps I can go further...


----------



## DrFreeman35

So I'm building a rig for the first time, and was going to buy another FTW when available. I was waiting on Caselabs to take orders again, but apparently that's not happening until end of year. Any news on Ti versions? Should I wait? Or just buy another FTW? Haven't opened this one, or a majority of my components, due to not having a case to work with. Thanks


----------



## LolCakeLazors

Quote:


> Originally Posted by *DrFreeman35*
> 
> 
> 
> So I'm building a rig for the first time, and was going to buy another FTW when available. I was waiting on Caselabs to take orders again, but apparently that's not happening until end of year. Any news on Ti versions? Should I wait? Or just buy another FTW? Haven't opened this one, or a majority of my components, due to not having a case to work with. Thanks


You could just build the computer outside of a case and run it on the motherboard box. You probably should in the first place, to test if all the components work. The 1080 Ti will probably come out early next year to coincide with Zen.


----------



## DrFreeman35

Quote:


> Originally Posted by *LolCakeLazors*
> 
> You could just build the computer outside of a case and run it on the motherboard box. You probably should in the first place, to test if all the components work. The 1080 Ti will probably come out early next year to coincide with Zen.


Thanks for the reply, and I would build early, but I'm going to be doing a custom loop for WC'ing my rig. The CPU I'm buying doesn't have a fan with it, and no way to control temps. Would rather not spend more than the crazy amount of $ that I've already put on my CC lol. This computer enthusiast stuff is addicting and I'm trying to behave. I'll probably buy another FTW and just get 2 WB from EK when they actually become available. Thanks


----------



## juniordnz

Guys who have modded their cards for water cooling, help me out here...

I'm going to buy a copper shim to use on my FTW. Since it has those fingers on the heatplate, a copper shim is needed to get the waterblock in contact with the die (pic below).


Spoiler: FTW Heatplate Fingers







My question is: how thick should this copper shim be? I could only find up to 2mm thickness, should that be enough to overcome the heatplate?

Thanks in advance.


----------



## fat4l

Quote:


> Originally Posted by *GanGstaOne*
> 
> I've seen people with FE and Strix T4 with good performace but with water


That's because of the no-TDP limits. I don't have TDP limits since I did the hard mod.
The BIOS itself is no good.


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> Guys who have modded their cards for water cooling, help me out here...
> 
> I'm going to buy a copper shim to use on my FTW. Since it has those fingers on the heatplate, a copper shim is needed to get the waterblock in contact with the die (pic below).
> 
> 
> Spoiler: FTW Heatplate Fingers
> 
> 
> 
> 
> 
> 
> 
> My question is: how thick should this copper shim be? I could only find up to 2mm thickness, should that be enough to overcome the heatplate?
> 
> Thanks in advance.


You can find guys on Reddit using even a 1mm-thick shim with the G10 and 1080 FTW, but better to use 1.5mm if you can get one; 2mm works too.


----------



## LolCakeLazors

For anyone that has 2 1080s SLI'ed, can you tell me the max load wattage used? Interested in maybe putting two 1080 FTWs under water in the future.

EDIT: In other news, Newegg gave me Fedex Home Delivery so I'm getting my Seasonic Prime on Saturday (tomorrow)


----------



## Bishop07764

Quote:


> Originally Posted by *LolCakeLazors*
> 
> I don't have an exact answer but I noticed that it's better to leave the voltage at stock for me. It becomes way more unstable if you set voltage to 100 in Afterburner/Precision X. I could barely do 21xx when I touched the voltage meter in Precision X and it was clocking 2088 @ 1.092V.


It's weird because I have experienced it both ways. Doom, with extra voltage added, says F that: stock all the way for stability. GTA V and The Witcher 3 appear stable with extra voltage. Valley wants extra voltage. My voltage doesn't always even go up when I move the slider up, either. Guess chalk it up to Boost 3.0 only using what it wants, when it wants. Definitely the most instability seen with voltage maxed, just like you.

Quote:


> Originally Posted by *Pepillo*
> 
> Thanks.
> 
> My score it is scaling well with every core step up:
> 
> 
> 
> Perhaps I can go more ......


See how high you can go. My scores start going down past 2154. Interestingly, I got your same graphics score at 2154 core. Guess core clocks aren't always everything. Really starting to think that my voltage slider is a placebo.









Edit: Have you tried a lower memory clock? I forgot that the Z BIOS had already upped my memory before I added +500 to it. Dropped mine back by 50 and my score definitely went up. Wondering if you would see the same.


----------



## Fediuld

I bought an MSI Armor OC last week, and watercooled it last night (EK).
At *stock* settings with air cooling it was hitting 1911; under water that's risen to 1946.

Overclocking, it used to stay at 2101; now it hits a constant 2126 with a couple of spikes at 2154 while benching. My settings are +100% vcore, +121% PL, +185 core, +0 vram (normal speed 5006), and temps do not go above 44°C while benching.
Max power stays at 110% (the axis goes all the way to 150%), and it hits the voltage limit only when going to max speed. It doesn't hit the power or temp limit at all.

Now, when gaming, I see a constant 2139 on the overlay, but somehow that cannot be achieved while benching (using Spy mainly).

Is there any way to improve the voltage limit? Some said use the Gaming X (maybe better the Gaming Z?) BIOS, but I do not want to brick the card.
Any other ideas?

Thank you


----------



## Bishop07764

Quote:


> Originally Posted by *grimboso*
> 
> Purely aesthetics. I want to "frame" the hardware and make use of the two cylinder reservoirs I have, if possible.
> 
> I originally opted for a dual loop, but as I didn't go SLI, I don't see a reason for a loop for the GPU only.
> 
> Still have the hardware though, so I might as well use it.


Sounds like you do have a reason then to go with a dedicated GPU loop, if you have the parts already. I'm not sure how it would impact performance if added to your existing loop. I opted for independent loops. It let me slap in my 1080 faster, anyway. Even if the rads do get bored cooling just a single graphics card. Sometimes I don't even run all the fans. I love being able to get good temps with my fans spinning no faster than maybe 800 RPM. Some of the benefits of having another loop, that I've seen anyway.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *LolCakeLazors*
> 
> For anyone that has 2 1080s SLI'ed, can you tell me the max load wattage used? Interested in maybe putting two 1080 FTWs under water in the future.
> 
> EDIT: In other news, Newegg gave me Fedex Home Delivery so I'm getting my Seasonic Prime on Saturday (tomorrow)


Just tested with Heaven running full 4K; I didn't get over 690 W. I imagine my CPU was pretty bored, so it might go higher, but I can't imagine much higher. I'll reply with GTA V results as well, since that'll hit everything very hard at full 4K.
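For turning a measurement like that into a PSU choice, a common rule of thumb (a guideline, not a spec) is to keep sustained load at or below about 80% of the unit's rating. A quick sketch in Python; note that a wall-meter figure like the 690 W above already includes PSU conversion losses, so sizing from it is on the conservative side.

```python
import math

def min_psu_watts(measured_load_w: float, target_utilisation: float = 0.8,
                  round_to: int = 50) -> int:
    """Smallest PSU rating (rounded up to a common size step) that keeps
    the measured load at or below `target_utilisation` of its rating."""
    needed = measured_load_w / target_utilisation
    return math.ceil(needed / round_to) * round_to

print(min_psu_watts(690))  # 900: a 900 W unit runs 690 W at ~77% load
```

So for that SLI setup, a quality 850-1000 W unit is in the right ballpark, with headroom for transient spikes that a wall meter averages away.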


----------



## Bishop07764

Quote:


> Originally Posted by *Fediuld*
> 
> I bought an MSI Armor OC last week, and watercooled it last night (EK).
> At *stock* settings with air cooling it was hitting 1911; under water that's risen to 1946.
> 
> Overclocking, it used to stay at 2101; now it hits a constant 2126 with a couple of spikes at 2154 while benching. My settings are +100% vcore, +121% PL, +185 core, +0 vram (normal speed 5006), and temps do not go above 44°C while benching.
> Max power stays at 110% (the axis goes all the way to 150%), and it hits the voltage limit only when going to max speed. It doesn't hit the power or temp limit at all.
> 
> Now, when gaming, I see a constant 2139 on the overlay, but somehow that cannot be achieved while benching (using Spy mainly).
> 
> Is there any way to improve the voltage limit? Some said use the Gaming X (maybe better the Gaming Z?) BIOS, but I do not want to brick the card.
> Any other ideas?
> 
> Thank you


Personally, I would try decreasing your voltage to see if that helps; it helps my overclocks. I would suggest that you back up your BIOS and think about trying the Z BIOS, as that PCB is the same as the X and Z, if I'm not mistaken. It might help out. I've definitely seen an improvement on mine with the Z BIOS, but I didn't try the stock one for very long.

If it helps at all, the BIOS is a batch-file-executed BIOS that comes with very good instructions. It's made directly by MSI. You literally have to click yes for the Windows prompt, press G for the Gaming or O for the OC BIOS, maybe click yes again, and it'll do its thing and automatically reboot. Something to think about anyway. You will have to reinstall your display driver too.

My power limit has never gone above 90% that I can recall. It will change your limit to 107%, which is normal. You could always flash back if you so desired. They have the same procedure and official BIOS for the Gaming X.


----------



## Fediuld

Thank you, I will give it a try and let you know.


----------



## Pepillo

Quote:


> Originally Posted by *Bishop07764*
> 
> See how high you can go. My scores start going down past 2154. Interestingly I got your same graphics score at 2154 core. Guess core clocks aren't always everything. Really starting to think that my voltage slider is placebo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Have you tried a lower memory clock?I forgot that the Z bios upped my memory already before adding +500 to the memory. Dropped mine back by 50 and my score definitely went up. Wondering if you would see the same.


With the memory, if I set 50 MHz more or less my score goes down; 11016 MHz is the maximum. The core, I really don't know its limit; every 13 MHz more, the score goes up without problems. Now, 2215 MHz:
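For anyone confused by the memory figures in this thread: GDDR5X on the GTX 1080 transfers 8 bits per pin per command-clock cycle, so the "effective" rate is 8x the clock GPU-Z reports, while Afterburner-style tools display half the effective rate. A small converter sketch in Python (the function names are just mine, for illustration):

```python
def effective_rate(command_clock_mhz: float) -> float:
    """Effective GDDR5X transfer rate: 8 transfers per command-clock cycle."""
    return command_clock_mhz * 8

def from_afterburner(displayed_mhz: float) -> float:
    """Tools like Afterburner show half the effective rate for GDDR5X."""
    return displayed_mhz * 2

print(effective_rate(1251))    # 10008.0: the stock "10 Gbps" GTX 1080 spec
print(from_afterburner(5508))  # 11016.0: the overclocked rate quoted above
```

So an "11016 MHz" memory overclock is a 1377 MHz command clock, and the "5006" figure some posters quote is the same stock speed viewed through the Afterburner convention.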


----------



## Bishop07764

Quote:


> Originally Posted by *Pepillo*
> 
> The memory, if I set 50 MHz more or less, my score go down, 11.016 MHz are the maximum. The core, really I do not know his limit, every 13 MHz more, the score go up without problems. Now, 2.515 MHz::


Wow. That's awesome. Keep pushing it. You hit 2515?


----------



## Pepillo

Quote:


> Originally Posted by *Bishop07764*
> 
> Wow. That's awesome. Keep pushing it. You hit 2515?


Oops ...... my mistake, 2215 MHz.


----------



## Bishop07764

Quote:


> Originally Posted by *Pepillo*
> 
> Ups ...... my mistake, 2.215 MHz


Are you using the curve in Afterburner to do these clocks? I haven't tried it yet. It would probably be my only hope for bypassing the wall that I hit at 2154. Results with the curve earlier in the thread appear rather mixed to poorer-performing, though.
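For context on the curve editor: GPU Boost 3.0 exposes a per-voltage-point frequency curve, and Afterburner lets you offset individual points instead of shifting the whole curve. Here's a rough sketch in Python of how per-point offsets behave; the base curve values are hypothetical, and the flattening step mimics the rule that the applied curve can never decrease at higher voltages.

```python
# Hypothetical base V/F curve: {voltage_mV: clock_MHz}
base_curve = {1000: 1936, 1031: 1999, 1062: 2050, 1093: 2088}

def apply_offsets(curve, offsets):
    """Return a new curve with per-point MHz offsets applied, then
    flattened so clocks never decrease at higher voltage points."""
    out = {}
    best = 0
    for v in sorted(curve):
        clk = curve[v] + offsets.get(v, 0)
        best = max(best, clk)  # carry the highest clock forward
        out[v] = best
    return out

flat = apply_offsets(base_curve, {1062: 104})  # push only the 1.062 V point
print(flat[1062], flat[1093])  # 2154 2154
```

The flattening is why pushing one point hard can pin the whole upper curve at that clock, which is also why per-point tuning sometimes scores worse than a plain offset: the card loses its room to step down gracefully.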


----------



## LolCakeLazors

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> Just tested with Heaven running full 4K, I didn't get over 690w. I imagine my CPU was pretty bored, so it might go higher, but I can't imagine much higher. I'll reply with GTA5 results as well since that'll hit everything very hard playing at full 4K.


Alright cool, thanks for your hard work.


----------



## Fediuld

Quote:


> Originally Posted by *Bishop07764*
> 
> Personally I would try decreasing your voltage to see if that will help you. It helps my overclocks. I would suggest that you back up your bios and think about trying the Z bios as that PCB is also the same as the X and Z if I'm not mistaken. It might help out. I've definitely seen an improvement on mine with the Z bios, but I didn't try the stock one for very long. If it helps at all, the bios is a batch file executed bios that comes with very good instructions. It's made directly by MSI. You literally have to click yes for the Windows prompt, press g for gaming or O for the OC bios, maybe click yes again and it'll do its thing and automatically reboot. Something to think about anyway. You will have to reinstall your display driver too. My power limit has never gone above 90% that I can recall. It will change your limit to 107% which is normal. You could always flash back if you so desired. They have the same procedure and official bios for the gaming X.


Can you provide me the link for the Z bios? I couldn't find it, or I am blind.


----------



## gree

Is Assetto Corsa not very demanding? It felt pretty smooth with a full grid at 4K.


----------



## boredgunner

Quote:


> Originally Posted by *gree*
> 
> Is assetto corsa not very demanding? It felt pretty smooth with a full grid at 4k


When fully maxed out at 2560 x 1440 it brought my GTX 780 Ti to its knees somewhat, and perhaps the GTX 980 too. Never had an issue with the GTX 980 Ti or GTX 1080 however, the frame rate is through the roof with both.


----------



## Bishop07764

Quote:


> Originally Posted by *tin0*
> 
> As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.
> 
> 
> 
> *Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).
> 
> When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


Quote:


> Originally Posted by *Fediuld*
> 
> Can you provide me the link for the Z bios because couldn't find it, or I am blind


No problem. Here is the post and file by tin0. I chose the OC option and now my card boosts to about 2080 by default.


----------



## DrFreeman35

Is an EK 560 CE enough to cool 2x 1080 when OCed? Or should I put another 240 radiator in with it? Or go with a thicker radiator? New to this; figured someone would have some info, as this is where all the 1080 users are at...


----------



## axiumone

Quote:


> Originally Posted by *DrFreeman35*
> 
> EK 560 CE enough to cool 2x1080 when OC? Or should I put another 240 radiator in with it? Or go with a thicker radiator? New to this, figured someone would have some info as this is where all the 1080 users are at....


More than enough, if a 120 rad in the hybrid can keep a 1080 @ 2050 in the low 50c range in push/pull.
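As a rough sanity check on that advice, here's a back-of-the-envelope sketch using the common rule of thumb of roughly 100 W of heat shed per 120 mm of radiator at modest fan speeds. The per-card wattage and the rule itself are assumptions for illustration, not measurements.

```python
# Back-of-the-envelope radiator sizing (illustrative; the ~100 W per 120 mm
# rule of thumb and the ~220 W per overclocked GTX 1080 are assumptions).
def radiator_capacity_w(total_length_mm, w_per_120mm=100):
    """Approximate heat a radiator can shed, by the 100 W / 120 mm rule."""
    return total_length_mm / 120 * w_per_120mm

gpu_heat = 2 * 220                          # two overclocked GTX 1080s
ek_560_ce = radiator_capacity_w(4 * 140)    # a 560 rad = four 140 mm fans
print(ek_560_ce >= gpu_heat)                # True: the 560 alone should cope
```

By this estimate a 560 handles roughly 470 W against a ~440 W load, which lines up with the "more than enough" answer above.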


----------



## DStealth

Quote:


> Originally Posted by *Pepillo*
> 
> The memory, if I set 50 MHz more or less, my score go down, 11.016 MHz are the maximum. The core, really I do not know his limit, every 13 MHz more, the score go up without problems. Now, 2.215 MHz::


Are you using a curve? I achieve better GPU scores with 100 MHz less... and here's my best one @2152 MHz


----------



## DrFreeman35

Quote:


> Originally Posted by *axiumone*
> 
> More than enough, if a 120 rad in the hybrid can keep a 1080 @ 2050 in the low 50c range in push/pull.


Ok thanks, was looking into adding a 240, that will save me some $ lol.


----------



## DStealth

Could be interesting for all of you having issues with memory overclocking. I did some research, keeping conditions as close as possible for all the runs: the temp goes to 36° at the start and 50° at the end.
All tests are made with a 2100 MHz fixed core and the FS GT1 test... It seems my memory sweet spot is between +530 and +540, actually right in the middle at +535.

Code:


+0 GT1 117.31
+100 GT1 115.96
+200 GT1 118.82
+300 GT1 118.08
+400 GT1 120.97
+500 GT1 118.64
+530 GT1 121.41
+533 GT1 121.89
+534 GT1 122.24
+535 GT1 121.96
+536 GT1 121.69
+540 GT1 121.77
+550 GT1 120.59
+560 GT1 119.62
+600 GT1 119.29
+700 GT1 117.80

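For what it's worth, a sweep like the one above can be checked mechanically. This little sketch (with the offset/FPS pairs copied from the table) just picks the offset with the highest GT1 result:

```python
# Sweet-spot finder for the memory-offset sweep above (illustrative sketch;
# the offset -> GT1 FPS pairs are copied from the table in this post).
runs = {
    0: 117.31, 100: 115.96, 200: 118.82, 300: 118.08, 400: 120.97,
    500: 118.64, 530: 121.41, 533: 121.89, 534: 122.24, 535: 121.96,
    536: 121.69, 540: 121.77, 550: 120.59, 560: 119.62, 600: 119.29,
    700: 117.80,
}

# The best offset is simply the one with the highest GT1 FPS.
best_offset = max(runs, key=runs.get)
print(best_offset, runs[best_offset])  # 534 122.24
```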

----------



## DStealth

Quote:


> Originally Posted by *DStealth*
> 
> Could be interesting for all of you having issues with memory overclock. I did some research as close as possible for all the runs when the temp goes to 36* and to 50* at the end
> all test are made with 2100 fixed core and FS GT1 test...It seems my memory sweet spot is between +530 and +540 actually in the middle +535
> 
> Code:
> 
> 
> +0 GT1 117.31
> +100 GT1 115.96
> +200 GT1 118.82
> +300 GT1 118.08
> +400 GT1 120.97
> +500 GT1 118.64
> +530 GT1 121.41
> +533 GT1 121.89
> +534 GT1 122.24
> +535 GT1 121.96
> +536 GT1 121.69
> +540 GT1 121.77
> +550 GT1 120.59
> +560 GT1 119.62
> +600 GT1 119.29
> +700 GT1 117.80


Edit: Here's the best run, +534 MHz


Edit 2: Just got my best FS GPU score with this discovery







25361


----------



## toncij

Careful with 372.54.
Not sure how that driver got certified, but it's a mess. My 1080s are seriously artifacting and crashing with it. Not even the slightest overclock works.


----------



## Deders

Hmm. There have been quite a few reports of artifacting with these drivers over on the geforce forums, even at factory overclocked defaults.

Does anyone know how to switch debug mode off? For me, once I switch it on, it stays ticked and greyed out. I had to reinstall the drivers to get my overclock back.


----------



## MattBee

Hi all,

I am thinking about getting an Asus STRIX-GTX1080-A8G.

Is this card OK? Or are the OC versions better?


----------



## kx11

Quote:


> Originally Posted by *toncij*
> 
> Careful with 372.54.
> Not sure how that driver got certified but it's a mess. My 1080s are seriously artifacting and crashing with that driver. Not even a slightest overclock works.


My 1080s are not affected by that driver. I OC them at +80 core / +500 mem and they passed the Heaven benchmark.


----------



## GanGstaOne

Quote:


> Originally Posted by *toncij*
> 
> Careful with 372.54.
> Not sure how that driver got certified but it's a mess. My 1080s are seriously artifacting and crashing with that driver. Not even a slightest overclock works.


It's working fine here; I even get higher scores with it than with 368.95 and 369.05.


----------



## Deders

Quote:


> Originally Posted by *kx11*
> 
> my 1080s are not affected by that driver , OC them @ 80+ core clock - 500+ mem and passed Heaven benchmark


Heaven stresses the tessellation units the most but that might not be the source of the artifacting.

For me, after playing Rise of the Tomb Raider in DX11 mode for 4-5 hours, I saw the occasional few lines that shouldn't have been there. They went away very quickly, but an hour later I glimpsed them again in another scene. They could almost have been mistaken for rain.

It's going to vary from game to game, as well as card to card.

Temp has been a constant 77c throughout.


----------



## Pepillo

Quote:


> Originally Posted by *Bishop07764*
> 
> Are you using the curve in Afterburner to do these clocks? I haven't tried it yet. It would probably be my only hope for bypassing the wall that I hit at 2154. Results appear rather mixed to poorer performing with the curve earlier in the thread though.


Quote:


> Originally Posted by *DStealth*
> 
> Are you using a curve, while i achieve better GPU scores with 100mhz less...and here's my best one @2152mhz


Yes, I am using the curve. DStealth, I normally see better graphics scores with Skylake than Haswell-E on Time Spy (my 5960X only gives 4,400-4,500 MHz), but I will try overclocking without the curve later and post results. Thanks for the tip.


----------



## shadow85

Quote:


> Originally Posted by *MattBee*
> 
> Hi all,
> 
> I am thinking about getting a Asus STRIX-GTX1080-A8G.
> 
> Is this card ok? Or are the oc versions better


No, wait for the MSI Lightnings.


----------



## Fediuld

Quote:


> Originally Posted by *Bishop07764*
> 
> No problem. Here is the post and file by tin0. I chose the OC option and now my card boosts to about 2080 by default.


Thank you


----------



## Fediuld

Quote:


> Originally Posted by *Pepillo*
> 
> Yes, I am using the curve. "DStealth", normally I can see better graphics sores with Skylake than Haswell-E on Time Spy (my 5960X only gives 4.400-4500 MHz), but I will try later to overclock without curve and post results, thanks for the tip.


Can you post here your curve please?


----------



## oGodMyNameWontF

Hello,

I am dithering between which 1080 I should buy (possible SLI later).
They will be water cooled, and OC.

Water cool brand: EKwb or HeatKiller

Which card should I buy for the best OC results (over 2000MHz) ?


----------



## GanGstaOne

Quote:


> Originally Posted by *oGodMyNameWontF*
> 
> Hello,
> 
> I am dithering between which 1080 I should buy (possible SLI later).
> They will be water cooled, and OC.
> 
> Water cool brand: EKwb or HeatKiller
> 
> Which card should I buy for the best OC results (over 2000MHz) ?


The Gigabyte Xtreme is the best card out there, much better than the Zotac AMP Extreme or the EVGA FTW and Classified.


----------



## oGodMyNameWontF

Quote:


> Originally Posted by *GanGstaOne*
> 
> Gigabyte Xtreme best card out there much better then Zotac AMP Xtreme or EVGA FTW and Classified


Isn't there any Founders Edition capable of going over 2 GHz (essentially for the water block)?


----------



## DrFreeman35

Quote:


> Originally Posted by *oGodMyNameWontF*
> 
> Hello,
> 
> I am dithering between which 1080 I should buy (possible SLI later).
> They will be water cooled, and OC.
> 
> Water cool brand: EKwb or HeatKiller
> 
> Which card should I buy for the best OC results (over 2000MHz) ?


They all OC around the same and most have been over 2000 MHz, so look at pricing and availability, not opinions...


----------



## Pepillo

Quote:


> Originally Posted by *Fediuld*
> 
> Can you post here your curve please?


----------



## Bishop07764

Quote:


> Originally Posted by *Deders*
> 
> Heaven stresses the tessellation units the most but that might not be the source of the artifacting.
> 
> For me, after playing rise of the tomb raider in dx11 mode for 4-5 hours, I saw the occasional few lines that shouldn't have been there. They went away very quickly, but then an hour later I glimpsed them again in another scene. Could have almost been mistaken for rain.
> 
> It's going to vary from game to game, as well as card to card.
> 
> Temp has been a constant 77c throughout.


Interesting. I've seen something similar. Everything I tried was stable at 2154, then I moved to Just Cause 3 last night. Slight red dots on the water wouldn't totally go away until I lowered it to 2126 with no extra voltage. I have a feeling this might also vary between drivers.

Quote:


> Originally Posted by *shadow85*
> 
> No wait for the MSI Lightnings.


Based on everyone's experience thus far, including Classified owners, I'm not sure it would make much of a difference. Pains me to say as an owner of a 780 Lightning that ran beautifully at 1.375-1.4 GHz in constant use for about 3 years. But that one won the silicon lottery, I suppose.


----------



## Whitechap3l

Quote:


> Originally Posted by *oGodMyNameWontF*
> 
> Hello,
> 
> I am dithering between which 1080 I should buy (possible SLI later).
> They will be water cooled, and OC.
> 
> Water cool brand: EKwb or HeatKiller
> 
> Which card should I buy for the best OC results (over 2000MHz) ?


I totally agree with DrFreeman35...
Get the cheapest one you can with water-block compatibility. Maybe get one with an extra 6- or 8-pin power connector; a better BIOS/mod may become available at some point, but at the moment the differences between cards aren't huge.
And you can flash BIOSes anyway.


----------



## LolCakeLazors

Quote:


> Originally Posted by *shadow85*
> 
> No wait for the MSI Lightnings.


Considering the weirdness that is GPU Boost 3.0, I doubt it would really help.


----------



## Whitechap3l

Quote:


> Originally Posted by *Pepillo*


Is that the Xtreme BIOS?
With the Strix this BIOS doesn't seem to work really well... I get crashes with clocks around 2025.


----------



## Pepillo

No, Gaming Z bios on Gaming X with Kraken G10 + X31


----------



## Whitechap3l

Quote:


> Originally Posted by *Pepillo*
> 
> No, Gaming Z bios on Gaming X with Kraken G10 + X31


Ah okay, I will try it later then.


----------



## smonkie

Has anyone tried reapplying thermal paste on the MSI GTX 1080 Gaming X? MSI claims they use a premium compound, but I wonder how premium it really is. Would it be worth giving that a shot?


----------



## Whitechap3l

Quote:


> Originally Posted by *Pepillo*
> 
> No, Gaming Z bios on Gaming X with Kraken G10 + X31


Do you use the older or the newest Nvidia drivers?


----------



## Deders

Quote:


> Originally Posted by *smonkie*
> 
> Has anyone tried to reapply thermal paste on the MSI Gaming 1080 X? MSI claims they use a premium compound, but I wonder how premium it really is. Would it be worth it to give that a shot?


Unless you are having temp/noise problems, I wouldn't bother. Aren't they amongst the coolest/quietest?


----------



## Fediuld

Quote:


> Originally Posted by *smonkie*
> 
> Has anyone tried to reapply thermal paste on the MSI Gaming 1080 X? MSI claims they use a premium compound, but I wonder how premium it really is. Would it be worth it to give that a shot?


Here is how it looked when I removed the MSI Armor OC heatsink to fit the EK waterblock.
I bet they're using the same stuff.


----------



## boi801

hello!

I did this:


----------



## wangle0485

I've spent a couple of hours running benchmarks today to test a few fan configurations and came across something unusual: I can complete Time Spy, FS Ultra and FS Extreme runs time and again, but regular Fire Strike crashes every time during graphics test 2, around 18 seconds in. Hitting 2076 core / 10990 memory and stable in absolutely everything else :-/ Has anybody experienced/resolved this before?


----------



## pantsoftime

Quote:


> Originally Posted by *wangle0485*
> 
> I've spent a couple of hours running benchmarks today to test out a few fan configurations and come across something unusual - I can complete Time spy, FS Ultra & FS extreme runs time and time again, but Regular Firstrike crashes everytime during graphics test 2, around 18 seconds in. Hitting 2076 core/10990 memory and stable in absolutely everything else :-/ Anybody experienced/resolved this before?


I often run into stability problems like this when overclocking as well. When you're running extreme stress cases (FSE, Time Spy, etc) you're always in the higher points of the voltage curve. When you're running less stressful cases you may end up lower on it and expose different issues. You may want to tweak your curve and focus a bit on voltages down around 1.012 volts and below - maybe drop them a few MHz.
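A hypothetical illustration of the curve tweak described above: the voltage/clock points below are made up and the helper name is mine, but it shows the idea of pulling the low-voltage end of a GPU Boost curve down by a few MHz while leaving the upper points alone.

```python
# Illustrative sketch of tweaking a GPU Boost voltage/frequency curve
# (voltage in volts -> clock in MHz). The curve values are invented for
# this example; real curves come from a tool like MSI Afterburner.
def soften_low_voltage_points(curve, v_limit=1.012, drop_mhz=13):
    """Return a new curve with clocks reduced at or below the voltage limit."""
    return {v: (mhz - drop_mhz if v <= v_limit else mhz)
            for v, mhz in curve.items()}

curve = {0.900: 1898, 0.950: 1949, 1.000: 2000, 1.012: 2025,
         1.050: 2076, 1.081: 2126}
tweaked = soften_low_voltage_points(curve)
print(tweaked[1.012], tweaked[1.050])  # 2012 2076
```

Only the points at or below 1.012 V move, which is the region that gets exercised in less stressful workloads.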


----------



## Whitechap3l

Quote:


> Originally Posted by *pantsoftime*
> 
> I often run into stability problems like this when overclocking as well. When you're running extreme stress cases (FSE, Time Spy, etc) you're always in the higher points of the voltage curve. When you're running less stressful cases you may end up lower on it and expose different issues. You may want to tweak your curve and focus a bit on voltages down around 1.012 volts and below - maybe drop them a few MHz.


Maybe try the EVGA BIOS. I tested both Strix BIOSes, the AMP, G1, Xtreme, and MSI Gaming X and Z; even though with some I can hit 2250+ easily, none of them gave me better scores than the EVGA FE BIOS at 2177 with 1.081 V under water.


----------



## THEROTHERHAMKID

Why have I got a black margin down either side of the screen? Drivers?


----------



## Pepillo

Quote:


> Originally Posted by *Whitechap3l*
> 
> Do you use the older or the newest Nvidia drivers?


The latest one. Today, 2.228 MHz. I ran it a couple of times to be sure, and the score continues scaling up:



Tomorrow I will try 2.240 MHz.


----------



## tps3443

Hey everyone, sign me up for the CLUB!

I picked up a sealed-in-the-box GTX 1080 FE for $280 on Craigslist. What a deal.

A young man going to college for graphic design received the card, paid for by financial aid. He changed his classes around, so he no longer needs it. Even though he didn't really know what it was, I was fortunate to pick it up for $280 cash. I paid more than that for my GTX 1070.

This is a once-in-a-lifetime deal for me.

https://www.techpowerup.com/gpuz/details/g83dk

I came across this card because I was looking to trade my GTX 1070 plus cash for a GTX 1080, so it worked out even better. The GTX 1080 is a great 4K gaming card.


----------



## tps3443

Can anyone catch me up? What is a good bios to flash on my Founders edition 1080?


----------



## LolCakeLazors

Got my Seasonic Prime and alas it didn't fix the coil whine on my 1080 FTW. I even moved it around to a couple of rooms and plugged it directly into the wall, and it didn't help. In fact, the PSU itself has really minor buzzing (put my ear next to its top vent) and coil whine when you turn the system off. Returning this for an EVGA T2, or I'm going to RMA my Prime and see if a new one helps.

EDIT: Or I'm just going to bother EVGA for another 1080 FTW replacement.

EDIT2: Experimented with my Prime and a surge protector that was properly grounded. Holy **** it's almost bearable now. Almost as loud as the fans in my case running around 40%. Completely inaudible with the side panel on (unless you put your ear right up against it)


----------



## Derpinheimer

Quote:


> Originally Posted by *boi801*
> 
> hello!
> 
> I did this:


Looks great! On a 1080 FTW I just left the default thermal plate on with a universal waterblock. Without airflow the metal plate gets near 75C. Going to have to beef it up some.


----------



## tps3443

Is anyone playing Fallout 4 at 4K with a GTX 1080? It seems to run kind of shaky. To be fair, I am not really rendering at 4K; I have upscaling forced in the driver settings. All of my other games run really smooth and look just amazing!

Fallout 4 at 4K, high shadows and high god rays, no AA, everything else maxed out: walking around buildings, streets or towns I'm averaging 60+ FPS, but once I walk into thick woods my frame rate DROPS to roughly 25-30. It's terrible. GTA V runs way smoother than this at 4K. Is this normal?

I have a fresh Windows install and the newest NVIDIA 372 drivers. My video card is overclocked fairly high; I have tried both default and overclocked. It seems like a really bad performance issue, and based on GTX 1080 reviews I shouldn't be going below 45?

What kind of performance are you guys getting at 4K?


----------



## ssgwright

Quote:


> Originally Posted by *tps3443*
> 
> Hey everyone sign me up for the CLUB!
> 
> I picked up my sealed in the box GTX1080 FE for $280 bucks on Craigslist. What a deal.
> 
> A Young man, going to college for graphics design received the card paid for by "Financial Aid" He changed his classes around, so now he no longer needs it! So,even though he didn't even know what it was, And, I was fortunate to pick it up for $280 bucks CASH money. I paid more then that for my GTX 1070.
> 
> This is a once in a lifetime deal for me.
> 
> https://www.techpowerup.com/gpuz/details/g83dk
> 
> I came across this card, because I was looking to trade my GTX1070, and add cash on my end to get a GTX1080. So, it turns out it worked out even better. The GTX1080 is a great 4K gaming card.


sure...


----------



## justinyou

Quote:


> Originally Posted by *boi801*
> 
> hello!
> 
> I did this:


Why did you add two layers of thermal pad?


----------



## Fediuld

With those settings I managed a constant 2164 on the core and achieved a Time Spy GPU score of 8255.








Huzzah.



Using the normal Armor OC BIOS.

http://www.3dmark.com/spy/313615


----------



## sew333

Hello. Sometimes when starting games, GPU-Z reports a perfcap reason of THRM for about 15 seconds on my GTX 1080 Xtreme. I don't know why, because temps are very low, around 50C. Screen:

Any ideas why? Restarting the PC fixes it completely until it happens again. My temps are very low, so I don't know why it's reporting THRM. Thanks for any suggestions!


----------



## Asus11

Quote:


> Originally Posted by *fat4l*
> 
> thats cuz of no tdp limits. I dotn have tdp limits as I done the hard mod.
> Bios itself is no good..


What hard mod have you done? I thought that mod didn't really remove the TDP limits?


----------



## Deders

Quote:


> Originally Posted by *sew333*
> 
> Hello. Sometimes GPUZ reporting at starting games for a moment, perfcap reason THRM which is for 15 seconds, on my Gtx 1080 Xtreme. I dont know why because temps are very low 50C. Screen
> 
> 
> 
> Any ideas why?Ahh and restart pc fixing this completelly until again. Thx for suggestion. My temps are very low , so i dont know why it reporting THRM. THX!


Just going to point out that it isn't just reporting THRM, it's reporting power, voltage reliability (vRel) as well as temperature (Thrm) all at the same time.

I've been through this with him on other forums and from what I can see it's due to a bug where GPUboost 3 isn't reporting correctly, which can lead to lower power states being implemented until you restart the computer.

He may find some comfort in knowing that this happens to quite a lot of us and will most likely be resolved in future driver updates. There is a thread over on the Geforce forums that Nvidia have taken note of and reproduced the downclocking issue.


----------



## sew333

Quote:


> Originally Posted by *Deders*
> 
> Just going to point out that it isn't just reporting THRM, it's reporting power, voltage reliability (vRel) as well as temperature (Thrm) all at the same time.
> 
> I've been through this with him on other forums and from what I can see it's due to a bug where GPUboost 3 isn't reporting correctly, which can lead to lower power states being implemented until you restart the computer.
> 
> He may find some comfort in knowing that this happens to quite a lot of us and will most likely be resolved in future driver updates. There is a thread over on the Geforce forums that Nvidia have taken note of and reproduced the downclocking issue.


So nothing to worry about with this Pwr/THRM/Vrel perfcap?


----------



## Deders

Quote:


> Originally Posted by *sew333*
> 
> So
> So nothing to worry about this Pwr THRM(THRM) Vrel ???????????????


I think it's all related to the same issue. Gpu boost isn't reporting correctly so the card defaults to lower clockspeed as a fail-safe.

Nothing to worry about especially as many others have been having the same issue and Nvidia are looking into it.


----------



## killuchen

Hey guys, I haven't been following this forum lately due to a busy work schedule. Is there a certain BIOS I can flash to achieve higher clocks on my 1080 FTW?


----------



## Fediuld

What settings have you tried in MSI AB?

At stock speeds, with the voltage at +100%, what is your boost clock while gaming? (Use the MSI overlay.)

How are your temps? The 1080 starts throttling the boost from 60C, even when it happily works at 70+...


----------



## sew333

Quote:


> Originally Posted by *Deders*
> 
> Just going to point out that it isn't just reporting THRM, it's reporting power, voltage reliability (vRel) as well as temperature (Thrm) all at the same time.
> 
> I've been through this with him on other forums and from what I can see it's due to a bug where GPUboost 3 isn't reporting correctly, which can lead to lower power states being implemented until you restart the computer.
> 
> He may find some comfort in knowing that this happens to quite a lot of us and will most likely be resolved in future driver updates. There is a thread over on the Geforce forums that Nvidia have taken note of and reproduced the downclocking issue.


So did you find anybody on other forums with the same Vrel/Thrm/Pwr perfcap? Or only a Pwr perfcap?


----------



## Deders

You are the only one I've seen who has hit all three at the same time, but it's still a GPU Boost related bug. Best to wait for the driver that fixes it and see if it still happens.


----------



## fat4l

Quote:


> Originally Posted by *Asus11*
> 
> what hardmod have you done I thought that mod didn't really stop the tdp limits?


This one.


----------



## boi801

Quote:


> Originally Posted by *justinyou*
> 
> Why did you add 2 layers of thermalpad?


The original thermal pads with the MSI cooler didn't touch the memory chips, so I had to add thicker ones...


----------



## sew333

Quote:


> Originally Posted by *Hilpi234*
> 
> below 25° [email protected]
> above 25° [email protected]
> above 30° [email protected]
> above 40° [email protected]
> 
> If it Spikes over a certain Point only for 1 second your clock drops, or adds more Voltage, until the Tempreature reaches its lower Breakpoint
> 
> 
> 
> If you have those Stripes in GpuZ, it will throttle, or is going to... these, do not appear, if you use an Offset Overclock, because the GPU has complete control, over its Voltage.
> 
> It seems to be, somekind of stability mode...
> 
> 
> 
> ... because the average frames, are much lower 64 to 61
> 
> Also, the Card gets hotter... this could be, the little more voltage in Offsetmode, but there is still the 3 frame difference, in consecutive runs...


What is the reason for your perfcap of Pwr/Thrm/Vrel/Vop? Is this OK? I am sometimes getting a similar perfcap for 10 seconds after starting 3D on my GTX 1080 Xtreme: Pwr/Thrm/Vrel. Screen:


----------



## kx11

Zotac 1080 PGF info, because why not?


----------



## looniam

^ can also be used as a plate when eating at the buffet.


----------



## xer0h0ur

What is that abomination? Kill it with fire!


----------



## DrFreeman35

Quote:


> Originally Posted by *xer0h0ur*
> 
> What is that abomination? Kill it with fire!


To each their own I guess lol


----------



## Faydes

So how do you guys rank the BIOSes? I'm on an MSI Gaming X sitting on the EVGA FTW BIOS.


----------



## Bishop07764

Quote:


> Originally Posted by *wangle0485*
> 
> I've spent a couple of hours running benchmarks today to test out a few fan configurations and come across something unusual - I can complete Time spy, FS Ultra & FS extreme runs time and time again, but Regular Firstrike crashes everytime during graphics test 2, around 18 seconds in. Hitting 2076 core/10990 memory and stable in absolutely everything else :-/ Anybody experienced/resolved this before?


I have experienced complete stability in Fire Strike and Time Spy but artifacts in games at the same clocks, namely Just Cause 3. I think these cards may be more finicky with max overclocks than what we've seen in the past. Upping voltage doesn't always help me either, and actually seems to hurt in several games.

Quote:


> Originally Posted by *tps3443*
> 
> Hey everyone sign me up for the CLUB!
> 
> I picked up my sealed in the box GTX1080 FE for $280 bucks on Craigslist. What a deal.
> 
> A Young man, going to college for graphics design received the card paid for by "Financial Aid" He changed his classes around, so now he no longer needs it! So,even though he didn't even know what it was, And, I was fortunate to pick it up for $280 bucks CASH money. I paid more then that for my GTX 1070.
> 
> This is a once in a lifetime deal for me.
> 
> https://www.techpowerup.com/gpuz/details/g83dk
> 
> I came across this card, because I was looking to trade my GTX1070, and add cash on my end to get a GTX1080. So, it turns out it worked out even better. The GTX1080 is a great 4K gaming card.


You sure that it wasn't an old Gtx 280 that you bought?







Enjoy. I don't have Fallout 4 but have tested GTA V and some others. GTA V starts having slowdowns when there's a lot of foliage at 4K. Forget about the ultra foliage setting at that res.


----------



## GreedyMuffin

Wondering about flashing back to the Strix BIOS. Seems like I can hit 2176 at 1.075 V. Not bad if it's stable.

Temps are currently at 38C while folding. I need a separate loop for the GPU and the CPU, as in gaming it rises to 44-48C.


----------



## tps3443

I am having a hard time overclocking my GTX 1080 FE. I can go +225 on the core and +600 on the memory stable, but once I run Fire Strike the core clock goes upwards of 2126 MHz and then, the warmer the card gets, works its way down to roughly 1974 MHz. I am running 100% fan speed and temps do not go above 70C.

My memory at 11,200 is great, tons of bandwidth, but my core speed is just all over the place! lol

The most I can get in Fire Strike graphics score is 24,100. I assume this is normal?


----------



## schoolofmonkey

Quote:


> Originally Posted by *tps3443*
> 
> I am having a hardtime overclocking my MSI GTX1080 FE model. I can go +225 on my core, and + 600 on my memory stable. And once I open firestrike and run it, the core clocks seem to go upwards of 2126Mhz, and then the warmer the card gets, it just works its way down to roughly 1974mhz. I am running 100% fan speeds. Temps do not go above 70C.


It's the same on my non-overclocked Strix (it basically runs at FE clocks with the Strix cooler, but has 6+8 power connectors): as soon as the temps hit 50C the clocks start dropping in 13 MHz increments until they stabilize at [email protected], and this is with a 100 MHz core overclock.

Even with the card at stock it will boost to 1987 MHz anyway but settles at 1897 MHz; its advertised boost clock is 1743 MHz, so it goes well over that at stock.

You'd think 65-70C max temps would be acceptable for the card, but nope, it starts dropping clocks once it hits 50C. Tested with 100% fan speed: that kept the clocks higher, but it was stupidly loud...
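A rough model of that stepping behaviour, purely illustrative: the temperature thresholds below are assumptions, not NVIDIA's actual table, but each threshold crossed sheds one 13 MHz bin, which matches the increments described above.

```python
# Illustrative model of GPU Boost 3.0 temperature stepping: one 13 MHz bin
# is dropped per temperature threshold crossed. The threshold values are
# assumptions for this sketch, not NVIDIA's real table.
def boost_clock(base_boost_mhz, temp_c, step_mhz=13,
                thresholds=(50, 54, 58, 63, 68, 74, 80)):
    """Return the effective boost clock after temperature-based bin drops."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - bins_dropped * step_mhz

print(boost_clock(1987, 45))  # 1987: below 50 C, no throttling
print(boost_clock(1987, 60))  # three thresholds crossed -> 1948
```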


----------



## DStealth

Quote:


> Originally Posted by *looniam*
> 
> ^ can also be used as a plate when eating at the buffet.


Lol should be focused at the Indian market with such gypsy look will have a great success over there









Just exceeded a 35k GPU score in 3DMark 11... thanks to the OC socket on the new motherboard, the couple of extra MHz I needed were there.







[email protected] cooler 2177/11080 + [email protected]
*P26 858 with NVIDIA GeForce GTX 1080(1x) and Intel Core i7-5820K Processor
Graphics Score 35 065
Physics Score 16 162
Combined Score 15 238*


----------



## boredgunner

Quote:


> Originally Posted by *kx11*
> 
> Zotac 1080 PGF infos because why not??


Biggest PCB I've ever seen on a GPU.


----------



## DStealth

Quote:


> Originally Posted by *boredgunner*
> 
> Biggest PCB I've ever seen on a GPU.


Made especially for those willing to remove side fans from their cases


----------



## shadow85

If I am to have a dedicated loop for 2x GTX 1080 STRIX, what size radiator(s) will I need if I plan on Ocing them to around 2GHz boost.


----------



## Cornerer

Quote:


> Originally Posted by *kx11*


From what I read on Chinese forums, the 16+2 power phases (stated in this pic) don't really seem to help much.
The majority of users are still getting sub-2.1GHz OCs, with the highest I read being 2134MHz, roughly the same as what's reported for the AMP (Extreme) Edition.
System power consumption was also close to the 350W level on a certain 6700K bench, which is ridiculous for a 1080.

It's likely just a gimmick aimed at a certain group of buyers, and that's why it's only being sold in China.


----------



## reb00tas

The one closest to the power connector seems to give 80-90% instead of 120%. I don't know what shorting the other two does; someone claims it will lock you to 135MHz.


----------



## Avant Garde

Is anyone running the EVGA FTW on the slave BIOS?


----------



## toncij

Quote:


> Originally Posted by *Avant Garde*
> 
> Anyone runs EVGA FTW on Slave BIOS?


Everyone?


----------



## KickAssCop

Asus Strix AG vs. OG: can I flash the BIOS on the AG to the OG without hassle? Has anyone tried it yet?
What about the non-OC Strix versions? Worth spending $60 extra for higher clocks, or OK to just flash the non-OC version to the OG BIOS?

What up?


----------



## Whitechap3l

Quote:


> Originally Posted by *KickAssCop*
> 
> Asus Strix AG vs. OG. Can I flash the bios on AG to OG without hassle? Anyone tried it yet?
> What about the non overclock strix versions? Worth spending 60 extra for higher clocks or OK to just flash the non OC version to OG bios?
> 
> What up?


I got the cheapest non-OC Strix and I can run every BIOS I want with no problems so far








Save money: buy the non-OC Strix and download a BIOS here from the forum or
https://www.techpowerup.com/vgabios/?architecture=NVIDIA&manufacturer=&model=GTX+1080&interface=&memType=&memSize=8192

Mine is under water and seems to perform best with the No Limit T4 Strix BIOS (only with the newest NVIDIA drivers), the EVGA FE BIOS, and the Strix OC version BIOS









Hope this helps


----------



## kx11

Quote:


> Originally Posted by *Cornerer*
> 
> From what I read in Chinese forums, that 16+2 power phases (stated in this pic) doesn't really seem to help much.
> Majority of users still getting sub 2.1GHz OC, with highest I read 2134MHz, roughly same as what's reported for AMP (Extreme) Edition.
> System power consumption was also close to 350W level on a certain 6700K bench which is ridiculous for a 1080.
> 
> It's likely to be just those gimmicky for certain group of not-so-smart Chinese and that's why it's only being sold in China.


That is disappointing.

On another note, the GALAX/KFA2 HOF 1080 Limited Edition has a base clock of 1809MHz and boosts to 1961MHz, which are the highest clocks promoted for a GPU.

Link:
http://www.galax.com/en/graphics-card/hof/galax-geforcer-gtx-1080-hof.html


----------



## justinyou

Quote:


> Originally Posted by *Faydes*
> 
> So how do you guys rank the BIOS? Im on an MSI Gaming X sitting on EVGA FTW BIOS.


Hi, can you share the EVGA FTW BIOS with me?
I am currently using the Gaming X with the Gaming Z BIOS, and so far so good.
I'm thinking of trying another BIOS version, and since you have tried the EVGA FTW BIOS, I would like to test it out.


----------



## Asmola

Quote:


> Originally Posted by *Whitechap3l*
> 
> I got the cheapest non OC Strix and I can run every BIOS i want with no Problems so far
> 
> 
> 
> 
> 
> 
> 
> 
> Safe Money - buy the non OC Strix and download BIOS here from Forum or
> https://www.techpowerup.com/vgabios/?architecture=NVIDIA&manufacturer=&model=GTX+1080&interface=&memType=&memSize=8192
> 
> Mine is under water and seems to perform best with No Limit t4 strix bios ( only with newest nvidia Drivers ) , evga FE bios and strix OC Version bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope I could help


Have you noticed any stability problems in games with the XOC T4 BIOS? So far the T4 BIOS has been working well for me, but some people have had crashes in games while using it. Currently fully stable @ 2114MHz on air with the T4 BIOS; temps are also fine.


----------



## Whitechap3l

Quote:


> Originally Posted by *justinyou*
> 
> Hi, can you share the EVGA FTW bios to me?
> I am currently using the Gaming X with the Gaming Z bios, and so far so good.
> Thinking to try another bios version and since you have tried the EVGA FTW bios, i would like to test it out


https://www.techpowerup.com/vgabios/?architecture=NVIDIA&manufacturer=&model=GTX+1080&interface=&memType=&memSize=8192

You can find it there too.


----------



## Whitechap3l

Quote:


> Originally Posted by *Asmola*
> 
> Have you noticed any stability problems on games with XOC T4 bios? So far T4 bios have been working well, but some people have had crashes on games while using it. Currently full stability @ 2114MHz AIR with T4 bios, temps are also fine.


With the NVIDIA 368 drivers I hit good scores but had no chance of getting it stable in games / stress tests.. Now with the newest drivers it seems I can go a little bit higher.. Yesterday I managed 2202MHz with a performance increase over my 2177MHz EVGA BIOS, and it was stable in the stress test. I will try it out in games today


----------



## Denilson

What to buy, the KFA2 GeForce GTX 1080 HOF or the EVGA GTX 1080 Classified...???


----------



## TWiST2k

Quote:


> Originally Posted by *Denilson*
> 
> what to buy KFA2 GeForce GTX 1080 HOF or EVGA gtx 1080 classified....???


EVGA 1080 FTW, no reason to waste money on the Classified this time around.


----------



## jodasanchezz

Quote:


> Originally Posted by *toncij*
> 
> Is there any picture of FTW vs Classified heatsink?
> 
> My FTWs can't go over 2114 without crashing.


I just had a 1070 FTW and now a 1080.
Quote:


> Originally Posted by *TWiST2k*
> 
> EVGA 1080 FTW, no reason to waste money on the Classified this time around.


Absolutely right!!!

I had a Classified for 5 days and sent it back because the max stable clock was 2038MHz... Firestrike graphics score ~24,500.
Buy at least a FTW and you'll get better results.


----------



## Whitechap3l

Quote:


> Originally Posted by *Denilson*
> 
> what to buy KFA2 GeForce GTX 1080 HOF or EVGA gtx 1080 classified....???


At the moment, from what I've experienced and also read here, it basically makes no difference which card you get...
A "standard" or FE card with only one 8-pin performs nearly as well as, or in some cases even better than, a Classified / HOF etc...

It is not a power/voltage/clock problem at the moment, I guess.

I don't know what the prices are in your country, but for me, e.g., the price difference between an FE / Palit Super JetStream and a HOF is nearly €80.
I would buy myself something extra with the saved money, but that's me.


----------



## Cornerer

Quote:


> Originally Posted by *Whitechap3l*
> 
> At the moment, what I experienced and also read here it is basically indiffernet which Card you get...
> A "standart" or FE with only 1 6pin performs nearly or in some cases even better than a Classified / Hof etc...
> 
> It is not a power/voltage/clock Problem at the moment I guess
> 
> I dont know what the prices are in your country but for me e.g the price differnce between a FE/ palit super jetstream to a HOF is nearly 80€.
> I would buy me something extra with this safed Money but thats me


Agree. Though you're probably referring to 8-pin rather than 6-pin.









My Palit JetStream (not even the Super version) can manage a steady 2138MHz when using a fixed fan profile.


----------



## Cornerer

Quote:


> Originally Posted by *kx11*
> 
> that is disappointing
> 
> in another note GALA\KFA2 HOF 1080 limited edition have a base clock of 1809mhz and boost to 1961mhz which is the highest clocks promoted for a GPU
> 
> link
> http://www.galax.com/en/graphics-card/hof/galax-geforcer-gtx-1080-hof.html


The actual boost clocks vary from card to card, and are usually higher. Top cards like the Galax HOF, Zotac AMP and ASUS Strix OC are said to offer out-of-box boost clocks as high as 2050-2101MHz.

But even most of those won't get close to the 2.15GHz mark.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *DrFreeman35*
> 
> EK 560 CE enough to cool 2x1080 when OC? Or should I put another 240 radiator in with it? Or go with a thicker radiator? New to this, figured someone would have some info as this is where all the 1080 users are at....


Quote:


> Originally Posted by *shadow85*
> 
> If I am to have a dedicated loop for 2x GTX 1080 STRIX, what size radiator(s) will I need if I plan on Ocing them to around 2GHz boost.


So I'm running 2x1080 SLI, one was an EVGA SC, one was EVGA FE, both are on the SC bios (just duplicated and flashed to both cards for simplicity). I'm using the EK blocks with EK bridge, CPU is also on the loop (6600K, stock), and all three are cooled with just a single 280mm radiator with EK Vardar F2 fans. I haven't spent a ton of time pushing my clocks, but the cards are easily keeping below 55c on the GPUs (that's for the top GPU, the lower GPU stays about 5c cooler) while playing GTA 5 at 4K with almost everything on very high (the shadows and grass on ultra bring framerate from solid 60 to ~45, unacceptable). I usually run between 1950 and 2050mhz on the GPUs, and even when looping firestrike ultra, the cards don't go over 55c over a prolonged period of time. The fans kick up to their 1600rpm max with this, but it's fine, it's not that loud, and my setup isn't "optimized".

I thought I'd need much more radiator, but I'm pretty pleased at the performance I'm getting with just 280mm on CPU+2GPUs. The die shrink is real! Hopefully this helps people wondering how much radiator they need. I would recommend my setup at minimum, I wouldn't go smaller than 280/360mm for CPU+2GPUs, but I can see where CPU+GPU or just 2 GPUs you could get away with 240mm without much problem.
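The radiator result above can be sanity-checked with a back-of-envelope heat budget. A sketch assuming ~180W per GTX 1080 (FE board power), ~65W for a stock 6600K, and the common "~100W per 120mm of radiator" rule of thumb at moderate fan speeds; all three numbers are assumptions, not measurements from the post:

```python
# Back-of-envelope loop heat budget for a CPU + 2x GPU loop.
GPU_TDP_W = 180    # ASSUMED: GTX 1080 FE board power
CPU_TDP_W = 65     # ASSUMED: i5-6600K at stock
W_PER_120MM = 100  # ASSUMED rule of thumb at moderate fan speed

def loop_heat(num_gpus: int) -> int:
    """Total heat dumped into the loop, in watts."""
    return num_gpus * GPU_TDP_W + CPU_TDP_W

def radiator_mm_needed(watts: float) -> float:
    """Radiator length suggested by the rule of thumb, in mm."""
    return 120 * watts / W_PER_120MM

load = loop_heat(2)                    # 425 W
print(load, radiator_mm_needed(load))  # ~510 mm by the rule of thumb
```

By the rule of thumb, ~425W would call for roughly 510mm of radiator, which is consistent with a single 280mm coping only by running its fans near their 1600rpm maximum, as described above.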


----------



## Bishop07764

Quote:


> Originally Posted by *shadow85*
> 
> If I am to have a dedicated loop for 2x GTX 1080 STRIX, what size radiator(s) will I need if I plan on Ocing them to around 2GHz boost.


If you are going the dedicated GPU loop route, I would recommend that you get the largest radiator you can fit. I went a bit overkill with mine because back in the day I was originally planning on having two massively overvolted 780 Lightning firebreathers on my dedicated loop (480 + 280 rads). Consequently, I can keep my fans at an extremely low and rather silent speed, or even turn some off altogether, with excellent cooling results. Shooting for a specific size, I would say a good 360mm rad would do the trick, though I personally like to keep my fan speed as low as possible. Maybe others who actually have SLI 1080s in their loops can chime in.


----------



## Reckit

Quote:


> Originally Posted by *KickAssCop*
> 
> Asus Strix AG vs. OG. Can I flash the bios on AG to OG without hassle? Anyone tried it yet?
> What about the non overclock strix versions? Worth spending 60 extra for higher clocks or OK to just flash the non OC version to OG bios?
> 
> What up?


Pay the extra money, I would say. I've had two non-OC 1080 Strix cards (one was returned due to dodgy fan bearings) and flashed the BIOS on both. I run the OC BIOS not because I automatically achieve a higher clock, but because in my personal experience the OC BIOS makes the card more stable when overclocked.

Non-OC BIOS @ 1900: attempts to boost to 2050-2060 in Firestrike, and crashes on the second graphics test.
OC BIOS @ 1900: only boosts to 2010-2025, but runs the Firestrike bench without issues.

Both cards were near enough identical.

So with the OC BIOS I'm actually reducing the clock by 36MHz compared to the OC profile in GPU Tweak, and somehow the boost is less aggressive and gives a more stable overclock.
However, you could take your chances and go for the non-OC version; you may get a better GPU than me.


----------



## DrFreeman35

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> So I'm running 2x1080 SLI, one was an EVGA SC, one was EVGA FE, both are on the SC bios (just duplicated and flashed to both cards for simplicity). I'm using the EK blocks with EK bridge, CPU is also on the loop (6600K, stock), and all three are cooled with just a single 280mm radiator with EK Vardar F2 fans. I haven't spent a ton of time pushing my clocks, but the cards are easily keeping below 55c on the GPUs (that's for the top GPU, the lower GPU stays about 5c cooler) while playing GTA 5 at 4K with almost everything on very high (the shadows and grass on ultra bring framerate from solid 60 to ~45, unacceptable). I usually run between 1950 and 2050mhz on the GPUs, and even when looping firestrike ultra, the cards don't go over 55c over a prolonged period of time. The fans kick up to their 1600rpm max with this, but it's fine, it's not that loud, and my setup isn't "optimized".
> 
> I thought I'd need much more radiator, but I'm pretty pleased at the performance I'm getting with just 280mm on CPU+2GPUs. The die shrink is real! Hopefully this helps people wondering how much radiator they need. I would recommend my setup at minimum, I wouldn't go smaller than 280/360mm for CPU+2GPUs, but I can see where CPU+GPU or just 2 GPUs you could get away with 240mm without much problem.


Thanks for the info, I'll keep that in mind.


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> Pay the extra money I would say. I've had 2 (one was returned due to dodgy fan bearings) non oc 1080 strix and flashed the bios on both. I run the oc bios not because I automatically achieve a higher clock. but my personal experience is that the oc bios makes the card more stable when overclocked .
> 
> Non oc bios @ 1900 attempts to boost to 2050-2060 on firestrike this crashes on the second gfx test
> oc bios @ 1900 only boost's to 2010-2025 but runs firestrike bench without issues.
> 
> Both cards were near enough identical
> 
> So with oc bios Im actually reducing the clock by 36(MHz) compared to the OC profile on GPU Tweak and somehow the boost is less aggressive and gives me a more stable overclock
> However you could take your chances and go for the non oc version you may get a better GPU than me.


So you basically paid €50 for a BIOS?
I am pretty sure that the non-OC, A8 and OC versions are exactly the same cards. Production would be way too expensive if they made three different cards.









It is just the BIOS, and unless you have two left hands you should be able to flash it. Even RMAing a flashed card is no problem; I've done it at least twice before.


----------



## Reckit

Quote:


> Originally Posted by *KickAssCop*
> 
> Asus Strix AG vs. OG. Can I flash the bios on AG to OG without hassle? Anyone tried it yet?
> What about the non overclock strix versions? Worth spending 60 extra for higher clocks or OK to just flash the non OC version to OG bios?
> 
> What up?


Quote:


> Originally Posted by *Whitechap3l*
> 
> So you basically paid 50€ for a Bios ?
> I am pretty sure that the non OC, A8 and OC Versions are exactly the same Cards. The production would be way to expensive if they would make 3 Cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is just the Bios and when you dont have two left Hands you should be able to Flash the bios. Even rma a flashed Card is no Problem, I did it at least 2 times before



No, I bought the non-OC version because there was a supply issue at the time and I couldn't wait. I had a very expensive paperweight under my desk.

Both cards would not run stable at the 1936MHz clock speed the OC version is claimed to achieve using the OC profile. I agree the cards are probably exactly the same, but the GPUs must be hand-picked for the OC version.


----------



## juniordnz

I believe the only explanation for those cards that have exactly the same construction but different names/clocks (like the MSI Armor / Gaming X / Gaming Z) is that they test each GPU and brand the best ones as the top/OC models.

If some Armors can't handle Gaming Z OC Mode (like mine), even though they are exactly the same hardware beneath the heatplate/backplate/cooler, that means something.


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> No I bought the non oc version because there was a supply issue at the time and couldnt wait. I had a very expensive paper weight under my desk.
> 
> Both cards would not run stable at the 1936 clock speed the oc version is claimed to achieve using the OC profile. I agree the cards are probably exactly the same but the gpu's must be hand picked for the oc version.


Ah okay, I got your point.








Yes, maybe that is the case, but I don't know what their production looks like... I mean, they guarantee a certain clock speed on all three versions. If I were the boss and all my chips could hit the max (OC version) clocks, I wouldn't even bother selecting "special ones" for the OC version...

But that is all pure speculation.


----------



## Kerian

Well, after tweaking my MSI GTX 1080 FE since release day, here are my final results for a fully stable OC (games + benchmarks):

Graphics setup:
- 1440p, 144Hz, G-SYNC

Computer:
- i5 4670K at 4.2GHz
- 16GB RAM at 1600MHz
- Asus Z97-A motherboard, BIOS 1503

My OC results:

Parameters:
+120% TDP
Temp max: 85°C

- Core +150MHz, giving a stabilized clock of 1949MHz at full load
- Memory +425MHz
- Temp at full load: ~80°C with a custom fan curve (1%/1°C beyond 40°C)
- Max monitored Vcore: ~1.06V

I start getting artifacts in games (in particular The Witcher 3) at +450MHz on memory, which is a poor OC compared with some of yours.
Core +175MHz will crash the drivers in some games/benchmarks. Example: 3 loops stable in Unigine Heaven, but a crash in Unigine Valley at the exact same moment in the benchmark.
Boosting Vcore doesn't help the OC. I get the same result with +100% Vcore (MSI Afterburner).

I'm a little disappointed by these results as I wanted to break the 2GHz barrier, but the gaming experience is amazing and I have no regrets about purchasing this card.
Coming from a GTX 680 2GB, the performance increase is absolutely incredible.
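The fan curve above (1% per °C beyond 40°C) can be written out directly. A sketch; the 30% idle floor is an assumed value, not stated in the post:

```python
# Sketch of the custom fan curve described above: fan speed rises by
# 1% per degree beyond 40C, clamped to 100%.
IDLE_PERCENT = 30  # ASSUMED idle floor below 40C

def fan_percent(temp_c: float) -> float:
    """Fan duty cycle (%) for a given core temperature."""
    if temp_c <= 40:
        return IDLE_PERCENT
    return min(100.0, IDLE_PERCENT + (temp_c - 40))

print(fan_percent(80))  # 70% at the reported ~80C full-load temperature
```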


----------



## juniordnz

Quote:


> Originally Posted by *Kerian*
> 
> Well after tweaking my MSI GTX 1080 FE since release day here is my final results for a full stable OC (game + benchmark) :
> 
> Graphic setup :
> - 1440p , 144 Hz, GSYNC
> 
> Computer :
> - i5 4670k at 4.2 Ghz
> -16 Go RAM at 1600 MHz
> - Motherboard Asus Z97-A BIOS 1503
> 
> My OC results:
> 
> Parameters :
> +120% TDP
> Temp max : 85°C
> 
> - Core +150 MHz giving a stabilized clock of 1949 Mhz at full load
> - Memory + 425 MHz
> - Temp at full load : ~ 80°C with custom fan curve (1%/1°C beyond 40 °C)
> - Maxed monitored Vcore : ~ 1,06 V
> 
> I start having artifacts in games (in particular Witcher 3) with a +450 MHz on memory, which is a poor OC in comparison with some of your OC.
> Core + 175 MHz will crash drivers in some games / benchmark. Example : 3 loops stable with Unigine heaven but crash with Unigine Valley at the exact same moment into the benchmark
> Boosting Vcore doesn"t help the OC . I have the same result with +100% vcore (MSI afterburner).
> 
> I'm a little bit disappointed by those results as I wanted to reach 2GHz barrier but the gaming experience is amazing and I have no regrets in purchasing this card.
> Coming from a GTX680 2Go the performance increase is absolutely incredible.


You're probably throttling a lot at those temps. If you fit an AIO water cooler you'll certainly get that 2GHz you want.


----------



## Reckit

Quote:


> Originally Posted by *juniordnz*
> 
> I believe that the only explanation for those cards that have exactly the same construction but different names/clocks (Like MSI Armor/GamingX/GamingZ) is that they test each GPU and brand the best ones as the top/oc models.
> 
> If we have some Armors that can't handle GamingZ OC Mode (like mine), even though they are exactly the same hardware beneath the hetplate/backplate/cooler, that means something.


This must be the case. Strix OC cards are like rocking horse s*&t; there is a waiting list as long as my arm. If they were exactly the same, surely there wouldn't be a supply issue. Unless they are doing it deliberately to increase the price. They wouldn't do that, would they


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> This must be the case, strix oc cards are like rocking house S*&t there is a waiting list as long as my arm. If they were exactly the same surely there wouldnt be a supply issue. Unless they are doing it deliberately to increase the price. They wouldn't do that would they












I lost faith in humanity anyways so who knows


----------



## fat4l

Ok guys...

New scores!

Time Spy:
4790K @ 5.1Ghz - GTX 1080 @ 2189MHz Core / 11016MHz Mem

*8508 graphics score!*
http://www.3dmark.com/3dm/14317122?


----------



## Fediuld

Quote:


> Originally Posted by *fat4l*
> 
> Ok guys...
> 
> New scores!
> 
> Time Spy:
> 4790K @ 5.1Ghz - GTX 1080 @ 2189MHz Core / 11016MHz Mem
> 
> *8508Graphics score!*
> http://www.3dmark.com/3dm/14317122?


Challenge accepted, if you share the curve









Currently, the best I've achieved with a linear (aka basic) overclock is 8255 on my MSI Armor


----------



## Sirstiv

Hello there guys.

I recently won a GTX 1080 courtesy of NVIDIA.

It's the discrete Asus GTX 1080 reference "Turbo" model. Single blower.

I was surprised to see it benching well, clocking to 2088MHz / 11,000MHz.

Scored 15,500ish in 3DMark with the basic edition.

Factory vs. OC saw me get a 12-15% FPS increase in games like Tomb Raider and Shadow of Mordor.

Temps were 79C while OC'd.

Do many cards go over 2100MHz? Not many were listed in the 3DMark results.


----------



## HeyThereGuy

I am in the process of upgrading my build. Have there been any findings of a non-reference 1080 that is binned decently, or a specific card that seems consistent at getting close to 2100MHz? Thanks in advance.


----------



## TWiST2k

Quote:


> Originally Posted by *Kerian*
> 
> Well after tweaking my MSI GTX 1080 FE since release day here is my final results for a full stable OC (game + benchmark) :
> 
> Graphic setup :
> - 1440p , 144 Hz, GSYNC
> 
> Computer :
> - i5 4670k at 4.2 Ghz
> -16 Go RAM at 1600 MHz
> - Motherboard Asus Z97-A BIOS 1503
> 
> My OC results:
> 
> Parameters :
> +120% TDP
> Temp max : 85°C
> 
> - Core +150 MHz giving a stabilized clock of 1949 Mhz at full load
> - Memory + 425 MHz
> - Temp at full load : ~ 80°C with custom fan curve (1%/1°C beyond 40 °C)
> - Maxed monitored Vcore : ~ 1,06 V
> 
> I start having artifacts in games (in particular Witcher 3) with a +450 MHz on memory, which is a poor OC in comparison with some of your OC.
> Core + 175 MHz will crash drivers in some games / benchmark. Example : 3 loops stable with Unigine heaven but crash with Unigine Valley at the exact same moment into the benchmark
> Boosting Vcore doesn"t help the OC . I have the same result with +100% vcore (MSI afterburner).
> 
> I'm a little bit disappointed by those results as I wanted to reach 2GHz barrier but the gaming experience is amazing and I have no regrets in purchasing this card.
> Coming from a GTX680 2Go the performance increase is absolutely incredible.


What monitor do you have? I have been looking at picking up a 1440p g-sync monitor and just curious what you use?


----------



## kx11

Quote:


> Originally Posted by *Denilson*
> 
> what to buy KFA2 GeForce GTX 1080 HOF or EVGA gtx 1080 classified....???


I don't know about the 1080 Classy, but the HOF can run games at around 2126MHz most of the time. You have to use Xtreme Tuner Plus for that, then close it and run MSI AB at default clocks just so you can have the RTSS OSD stuff on screen.


----------



## boredgunner

Quote:


> Originally Posted by *TWiST2k*
> 
> EVGA 1080 FTW, no reason to waste money on the Classified this time around.


I'm not so sure. Most of the FTW's I see become unstable at above 2050 MHz or so, while most Classifieds are running games at over 2100 MHz no problem. This is based on forum comments and verified reviews for what it's worth.

The HOF is probably a monster as well.


----------



## DrFreeman35

Quote:


> Originally Posted by *boredgunner*
> 
> I'm not so sure. Most of the FTW's I see become unstable at above 2050 MHz or so, while most Classifieds are running games at over 2100 MHz no problem. This is based on forum comments and verified reviews for what it's worth.
> 
> The HOF is probably a monster as well.


No links? Proof of any kind? I've seen all kinds of "verified" posts about most cards being completely different. Depends on luck of the draw with these 1080's..... Buddy has 1080 OC strix and can't get above 2088, but I've seen some go over 2100.


----------



## cheddle

So, my 1080 FE isn't a bad little clocker. 2130-2150MHz looks to be the max, but I'm tired of these perf caps. The power limit stops me upping the volts, and stock volts stop me upping the MHz a bit more. If I downclock my RAM I can get more out of the core...

Leading me to believe I am hitting the TDP wall...

What is the go-to BIOS to flash onto the FE to remove the TDP wall (or at least give me another 20% power)? I don't really want more than 1.1V; I'd just like a steady clock speed. I've read that an AIB BIOS on an FE card results in less perf at higher clocks? Has anyone modified an FE BIOS?

Temps are fine, sub-35C on a full-cover block.
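The "another 20% power" request above is simple arithmetic on the power limit. A sketch assuming the commonly reported FE defaults (180W board power, 120% maximum slider); a modded or AIB BIOS would change both numbers:

```python
# Power-limit arithmetic behind the "TDP wall" discussed above.
FE_DEFAULT_W = 180    # ASSUMED: FE default board power
FE_MAX_PERCENT = 120  # ASSUMED: FE maximum power-limit slider

def power_cap_watts(percent: int, default_w: int = FE_DEFAULT_W) -> float:
    """Board power cap in watts for a given slider percentage."""
    return default_w * percent / 100

print(power_cap_watts(FE_MAX_PERCENT))  # 216 W: the stock FE ceiling
print(power_cap_watts(140))             # 252 W: what another ~20% would allow
```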


----------



## Reckit

Quote:


> Originally Posted by *cheddle*
> 
> SO. my 1080 FE isn't a bad little clocker. 2130-2150mhz looks to be the max but im tired of these pcaps though. power limit stops me upping the volts, and stock volts stops me upping the mhz a bit more. if I declock my ram I can get more out of the core....
> 
> Leading me to believe I am hitting TDP wall...
> 
> What is the go-to BIOS to flash onto the FE for removing the TDP wall (or at least giving me another 20% power) - I dont really want more than 1.1v - id just like a steady clock speed. ive read the AIB BIOS on a FE card results in less perf at higher clocks? anyone modified a FE BIOS?
> 
> temps are fine with sub 35c on a full cover block.


I would say yes


----------



## Reckit

Quote:


> Originally Posted by *TWiST2k*
> 
> What monitor do you have? I have been looking at picking up a 1440p g-sync monitor and just curious what you use?


I've got an Asus PG279Q and it's a beast. There is talk of panel bleed, and it's true there is some, but it is minimal and I only notice it in the corners in a dark room.

It's expensive, but it is definitely the best monitor I have ever owned.


----------



## TWiST2k

Quote:


> Originally Posted by *Reckit*
> 
> I've got an Asus PG279Q and its a beast. There is talk of panel bleeding and its true to say there is some but it is minimal and I only notice it in the corners in a dark room.
> 
> Its expensive but it is definitely the best monitor I have ever owned


That is the exact one I was looking at. I am waiting for Amazon to get stock in for a better price, and so I can use Prime for easy returns if I get a dud. Currently there are only third-party sellers and they're like $959, which is absurd. I read that Amazon in the UK stopped selling it on their site; I hope the same is not true for America, because it was on their site as recently as August 8th and I missed it.

Where did you get yours from, and how much did you pay, if you don't mind me asking?


----------



## KickAssCop

Thanks to those who answered my ASUS STRIX related question.


----------



## Whitechap3l

Quote:


> Originally Posted by *KickAssCop*
> 
> Thanks to those who answered my ASUS STRIX related question.


I read some more user comments about the Strix non-OC and OC versions.
On air, most people get their non-OC Strix between 2065 and 2100MHz; I didn't see anyone get a better result with the OC version.









Whether the OC version uses the "better" chips I can't say either way, but I don't see the point of going with the OC version, especially on air.








Maybe later on, with new BIOSes / mods etc., you COULD get some additional MHz out of an OC version under water, but from the comments I read the max clock is ~2177MHz with the old NVIDIA drivers, for OC and non-OC alike.

If that minor advantage that COULD exist is worth the additional price to you, go for the OC.

As for me, I got my non-OC Strix for €735 and I'm pretty happy with that.


----------



## juniordnz

Quote:


> Originally Posted by *boredgunner*
> 
> I'm not so sure. Most of the FTW's I see become unstable at above 2050 MHz or so, while most Classifieds are running games at over 2100 MHz no problem. This is based on forum comments and verified reviews for what it's worth.
> 
> The HOF is probably a monster as well.


My 1080 FTW is rock solid at 2126MHz / 1.050V / 130% TDP. Gaming at 2101-2114MHz depending on temps (I'm on air), never below that, even for hours.

Most reviewers got a first batch of FTWs and those really were poor overclockers; I think bjorn3d stated that they will retest with newer samples...

----------



## Reckit

Quote:


> Originally Posted by *Reckit*
> 
> I would say yes


Quote:


> Originally Posted by *TWiST2k*
> 
> That is the exact one I was looking at, I am waiting for Amazon to get stock in for a better price and so I can use prime for easy returns if I get a dud. Currently there is only 3rd party sellers and there like $959, so absurd. I read that Amazon in the UK stopped selling it on there site, I hope the same is not true for America because it was just on there site on like August 8th and I missed it.
> 
> Where did you get yours from and how much did you pay if you don't mind me asking?


I got mine from www.scan.co.uk (it was a lot cheaper than amazon at the time). it cost me £671.00 inc VAT.


----------



## tin0

Quote:


> Originally Posted by *HeyThereGuy*
> 
> I am in the process of upgrading my build. Has there been any findings of a non reference 1080 that is binned decent or a specific card that seems to be consistent at getting close to 2100mhz? Thanks in advance.


MSI GTX 1080 GAMING Z cards are binned.


----------



## Deders

But how do the different companies bin their chips?

Do they just test how high a chip will boost at the standard voltage? That wouldn't give any clues as to what the chip's ceiling would be.


----------



## TWiST2k

Quote:


> Originally Posted by *Reckit*
> 
> I got mine from www.scan.co.uk (it was a lot cheaper than amazon at the time). it cost me £671.00 inc VAT.


Right after I checked your comment, I went and checked Amazon just for the heck of it and, lo and behold, they had stock! I placed my order and refreshed and they were already gone! I hope I get a good one; I would rather not have to send this back to Amazon a few times until I get one with decent QA, haha.


----------



## IronAge

Quote:


> Originally Posted by *tin0*
> 
> MSI GTX 1080 GAMING Z cards are binned.


They are not binned by MSI... MSI does not bin at all; that's what I was told by an MSI PM during an MSI event.

He said GPUs get binned by NVIDIA and that they buy GPUs from different bin ranges.

A boost clock of 1911MHz... most likely all GTX 1080s will do that with enough VDDC and an adequate power supply / cooling / fan curve.

I know they will actually boost higher under full load... no worries.









The third card I got is a ZOTAC GTX 1080 AMP! (non-Extreme); its max power limit of 120% equals 258 W.

Funny thing is I removed my G1 (running the EVGA FTW BIOS) without using DDU and put in the AMP.

3DMark SystemInfo was still displaying the EVGA FTW-based info and I was able to reach these clock rates / scores:

http://www.3dmark.com/3dm/14320295

After I used DDU and the AMP was recognized as an AMP by the driver, the max OC got worse and VDDC went from 1.062 to 1.050.

So I am about to flash the Zotac AMP with the FTW Master BIOS too.
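The 120% figure above maps to watts the way the power-limit slider works in Afterburner and similar tools: the percentage is applied to the board's base power target. A quick sketch; the 215 W base value is inferred from the 258 W number above, not an official spec:

```python
def max_board_power(base_power_w: float, limit_pct: float) -> float:
    """Board power cap in watts for a given power-limit slider setting."""
    return base_power_w * limit_pct / 100

# a 215 W base target with the slider at 120 % gives the 258 W cap quoted above
print(max_board_power(215, 120))  # 258.0
```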


----------



## Besty

Quote:


> Originally Posted by *DrFreeman35*
> 
> No links? Proof of any kind? I've seen all kinds of "verified" posts about most cards being completely different. Depends on luck of the draw with these 1080's..... Buddy has 1080 OC strix and can't get above 2088, but I've seen some go over 2100.


My Classified would run 2.1 GHz @ 1.09 V, but nothing beyond that. Not much quicker than my 980 Ti @ 1.5 GHz, so a bit of a side-grade.

Both the 1080 Xtreme Waterforce and the 1080 Classified boost to 2050 out of the box no problem; beyond that it's a lottery.

Both cards have been sent back due to the lack of voltage control beyond 1.09 V; might as well buy a cheap reference card and stick a block on it IMHO. For air cooling, the 1080 Classified was a very nice piece of kit, can highly recommend. 1000 EUR/USD though!


----------



## IronAge

Classified has an EVBOT Plug ... so OV should be possible ... not sure if EVGA has released EVBOT Firmware Update for GTX1080 Classified yet.


----------



## x-apoc

I couldn't get my Zotac 1080 AMP past 2075, and that's with OC included.


----------



## IronAge

So have you returned it, or do you still own it? The EVGA FTW BIOS should help.


----------



## Reckit

Quote:


> Originally Posted by *TWiST2k*
> 
> Right after I checked your comment, I went and checked Amazon just for the heck of it and low and behold they got stock! I placed my order and refreshed and they were already gone! I hope I get a good one, I would rather not have to send this back to Amazon a few times till I get one with decent QA haha.


Fingers crossed, hopefully it will be alright. There will be some bleed; it just depends on how bad it is.


----------



## fat4l

Quote:


> Originally Posted by *Fediuld*
> 
> Challenge accepted, if you share the curve
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Currently the best achieved with linear (aka basic) overclock is 8255 for my MSI Armor


Here it is


----------



## toncij

Quote:


> Originally Posted by *Besty*
> 
> My classified would run 2.1Ghz @ 1.09v, not more beyond that though. Not much quicker than my 980ti @ 1.5ghz so a bit of a side-grade.
> 
> Both the 1080 xtreme waterforce and the 1080 classified boost to 2050 out of the box no problem, beyond that is a lottery.
> 
> Both cards have been sent back due to lack of voltage control beyond 1.09v, might as well buy a cheap reference card and stick a block on it IMHO. For air cooling, the 1080 classified was a very nice piece of kit, can highly recommend. 1000 EUR/USD though !


My FTW goes to 2139 with no sweat, but above that it runs into a thermal wall even at max fan. I'd expect the Classified to hit 2150 with ease.


----------



## x-apoc

I still have the card.


----------



## Cornerer

Quote:


> Originally Posted by *Besty*
> 
> Both cards have been sent back due to lack of voltage control beyond 1.09v, might as well buy a cheap reference card and stick a block on it IMHO. For air cooling, the 1080 classified was a very nice piece of kit, can highly recommend. 1000 EUR/USD though !


Quote:


> Originally Posted by *IronAge*
> 
> Classified has an EVBOT Plug ... so OV should be possible ... not sure if EVGA has released EVBOT Firmware Update for GTX1080 Classified yet.


Not sure about the voltage thing. Even the voltage hard-mod page claimed that Pascal doesn't respond well and becomes unstable at higher voltages. The ASUS Strix is probably the best example of that.

https://xdevs.com/guide/pascal_oc/#voltsc

As for air cooling, EVGA also has models with the exact same cooler on a reference board for way cheaper. Even so, the best air-cooled models are probably the AMP Extreme, Gigabyte Xtreme Gaming, ASUS Strix and most Palit/Gainward variants, all featuring nothing short of a gigantic heatsink.


----------



## fat4l

Quote:


> Originally Posted by *toncij*
> 
> My FTW goes to 2139 with no sweat, but above that runs into thermal wall even at max fan.I'd expect Classified to hit 2150 with ease.


It doesn't depend on the "name" of the card but rather on the chip itself. Most of the AIB cards can go over 2.1 GHz. Period.


----------



## x-apoc

Can anyone confirm if reflashing bios to FE on zotac changes anything?


----------



## IronAge

Try the FTW BIOS for the AMP .... fits better IMHO ... dual fan and zero-dB mode @ idle.


----------



## THEROTHERHAMKID

Does changing power settings in nvidia control panel from optimal to performance make a difference when overclocking?


----------



## Fediuld

Quote:


> Originally Posted by *fat4l*
> 
> Here it is


Thank you. I will try to apply it and let you know.

It's a shame the curve shows all the way to 1.2 V








Wish we could crack the BIOS to allow it to use that voltage.


----------



## x-apoc

Quote:


> Originally Posted by *IronAge*
> 
> try FTW Bios for the AMP .... fits better IMHO ... Dual Fan and Zero db Mode @ idle.


Thanks, will give it a try tonight.


----------



## danjal

My new Zotac 1080 AMP Edition: +100 core, +250 mem, 120% power limit, +10% voltage. Still tinkering with it...

[email protected]
2560x1440 75 Hz monitor.


----------



## tps3443

Quote:


> Originally Posted by *Kerian*
> 
> Well after tweaking my MSI GTX 1080 FE since release day here is my final results for a full stable OC (game + benchmark) :
> 
> Graphic setup :
> - 1440p , 144 Hz, GSYNC
> 
> Computer :
> - i5 4670k at 4.2 Ghz
> -16 Go RAM at 1600 MHz
> - Motherboard Asus Z97-A BIOS 1503
> 
> My OC results:
> 
> Parameters :
> +120% TDP
> Temp max : 85°C
> 
> - Core +150 MHz giving a stabilized clock of 1949 Mhz at full load
> - Memory + 425 MHz
> - Temp at full load : ~ 80°C with custom fan curve (1%/1°C beyond 40 °C)
> - Maxed monitored Vcore : ~ 1,06 V
> 
> I start having artifacts in games (in particular Witcher 3) with a +450 MHz on memory, which is a poor OC in comparison with some of your OC.
> Core + 175 MHz will crash drivers in some games / benchmark. Example : 3 loops stable with Unigine heaven but crash with Unigine Valley at the exact same moment into the benchmark
> Boosting Vcore doesn't help the OC. I have the same result with +100% Vcore (MSI Afterburner).
> 
> I'm a little bit disappointed by those results as I wanted to reach 2GHz barrier but the gaming experience is amazing and I have no regrets in purchasing this card.
> Coming from a GTX680 2Go the performance increase is absolutely incredible.


If my temps go over 80C with an extreme overclock, I get the same driver crashes.

I usually run +185 core and +600 on my memory. With the fan at the default profile, my GTX 1080 FE crashed the drivers.

After setting my fan to a more aggressive profile and keeping temps below 80C, the card no longer crashes the drivers.
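The "1%/1°C beyond 40°C" curve quoted above is just a linear ramp. A minimal sketch of that idea; the base speed and slope defaults here are assumptions for illustration, not Kerian's exact settings:

```python
def fan_speed_pct(temp_c: float, start_c: float = 40.0,
                  base_pct: float = 40.0, slope: float = 1.0) -> float:
    """Linear fan curve: hold base_pct up to start_c, then add slope % per
    degree C above it, capped at 100 %."""
    if temp_c <= start_c:
        return base_pct
    return min(100.0, base_pct + slope * (temp_c - start_c))

print(fan_speed_pct(75))  # 75.0 -> with these assumed defaults, 75 % fan at 75 C
```

A more aggressive profile is just a steeper `slope` or lower `start_c`, which is what keeps the card under the 80C crash point mentioned above.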


----------



## Fediuld

Quote:


> Originally Posted by *fat4l*
> 
> Here it is


I tried to match it as closely as possible (because my card starts with an extra boost). I did the first Time Spy test and the FPS were +10%; however, it crashed almost at the end of the second test.
Planning to use photo editing software to find the exact point locations tomorrow







But it looks like the curve is sound. Thank you


----------



## x-apoc

I think I found the right BIOS; the file was called gp104, for the EVGA FE.
I flashed it and did some tests; it started artifacting while running the Valley bench at a 2100 MHz GPU clock, max power, @ 1.08 V, +500 mem clock. My temps were OK, 72C at like 60% fan speed.

Rolled back bios to original.


----------



## 0ppositeLock

Hey guys. Is it safe to use Coollab liquid pro on the 1080?


----------



## Bravoexo

Getting ready for Forza H3!!!










My 5th set of SLI cards (previously: 6600 GT, 8800 GT, GTX 295s, GTX 680 4GB)


----------



## Reckit

Quote:


> Originally Posted by *Bravoexo*
> 
> Getting ready for Forza H3!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 5th SLi cards (prev 6600GT, 8800GT, GTX295s, GTX680 4Gb)


----------



## KickAssCop

Quote:


> Originally Posted by *Bravoexo*
> 
> Getting ready for Forza H3!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 5th SLi cards (prev 6600GT, 8800GT, GTX295s, GTX680 4Gb)


Probably won't support MGPU like Forza 6.


----------



## newguy69

Anyone have the Palit GameRock Premium Edition? I have the regular JetStream, but I would love to try out the GameRock Premium Edition BIOS so I don't have to use any 3rd-party software.

If anyone could be so kind as to save and send me the GameRock Premium Edition BIOS, I would be more than happy.


----------



## sena

Folks, which GPU block is compatible with the SLI HB bridge besides Aqua Computer's?


----------



## IronAge

Quote:


> Originally Posted by *newguy69*
> 
> Anyone have Palit Gamerock Premium Edition? I have regular jetstream but I would love to try out Gamerock Premium Edition bios so I don't have to use any 3rd party software.
> 
> If anyone could be so helpful and save and send the Gamerock Premium Edition bios I would be more than happy.


Have you tried this review sample BIOS from Palit?

https://www.techpowerup.com/vgabios/183823/palit-gtx1080-8192-160601

Boost clock 1886 ... that's very likely the GameRock Premium BIOS ... and usually review sample BIOSes are not bad.


----------



## IronAge

Quote:


> Originally Posted by *0ppositeLock*
> 
> Hey guys. Is it safe to use Coollab liquid pro on the 1080?


IMHO it's not really safe, since there are SMD parts around the die.

It would be safe if you cover those parts with clearcoat, Plasti Dip, or a non-conductive TIM from Shin-Etsu.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *sena*
> 
> Folks which gpu block is compatible with sli hb bride beside aqua computer.


If you don't mind minor modifications, they should all fit. With my EK blocks, I just took off the LED/designed front (simple screws from the back), and then clipped the points on the PCB with a wire cutter, and it fit right in with my EK blocks and EK liquid bridge.

I don't think any waterblocks fit with it stock currently.


----------



## juniordnz

Quote:


> Originally Posted by *IronAge*
> 
> IMHO its not really safe since there are SMD parts around the die.
> 
> it would be safe when you cover those parts with clearcoat or plastidip or a TIM from Shin Etsu which is not conductive.


MX-4 is non-conductive too, isn't it? Would it work if you cover the area around the die with a thick layer of MX-4?


----------



## IronAge

TIM viscosity may be influenced by temperature ... so clearcoat or Plasti Dip would be safer and more durable.


----------



## Benjiw

Quote:


> Originally Posted by *juniordnz*
> 
> MX4 is non conductive also isnt it? Would work if you cover around the die with a thick layer of mx4


Cleaning that up would be an utter PITA. Clear nail polish is easier to apply, and acetone or isopropyl alcohol will clean it off extremely easily.


----------



## sena

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> If you don't mind minor modifications, they should all fit. With my EK blocks, I just took off the LED/designed front (simple screws from the back), and then clipped the points on the PCB with a wire cutter, and it fit right in with my EK blocks and EK liquid bridge.
> 
> I don't think any waterblocks fit with it stock currently.


Thx mate, I heard in the Titan X topic that the Aqua Computer block would fit. I really want my rig to look like a beast; I need that LED, since everything in my build will be green-themed.


----------



## VPII

Quote:


> Originally Posted by *IronAge*
> 
> Quote:
> 
> 
> 
> Originally Posted by *0ppositeLock*
> 
> Hey guys. Is it safe to use Coollab liquid pro on the 1080?
> 
> 
> 
> IMHO its not really safe since there are SMD parts around the die.
> 
> it would be safe when you cover those parts with clearcoat or plastidip or a TIM from Shin Etsu which is not conductive.

I've reapplied thermal paste many times on my 1080 and never had an issue. I've used MX-4 but have now gone with CM Extreme Fusion X1, which appears to work a little better. The components around the GPU aren't affected by the thermal paste, at least not by most of those you get these days. But I still prefer to apply just enough to cover the die when pressing down the cooler heatsink.

Sent from my SM-G925F using Tapatalk


----------



## fat4l

Quote:


> Originally Posted by *0ppositeLock*
> 
> Hey guys. Is it safe to use Coollab liquid pro on the 1080?


Yeah, if you cover the capacitors/resistors around the die.


----------



## newguy69

Quote:


> Originally Posted by *IronAge*
> 
> Have you tried this Review Sample Bios from Palit ?
> 
> https://www.techpowerup.com/vgabios/183823/palit-gtx1080-8192-160601
> 
> Boost Clock 1886 ... thats very likely the Gamerock Premium Bios ... and usually Review Sample Bios are not bad.


This would be the right BIOS for me. Thanks.

One more stupid and maybe silly question: can I update the BIOS from Windows? It has been a long time since I updated a VGA BIOS, and I'm pretty sure something has changed in a couple (read: ten) of years. So I'm asking for instructions on how to do this. Sorry for all the trouble, and thanks to anyone who's willing to help a newbie out.


----------



## IronAge

No need to say sorry ... we are here to help.

Flashing works well under Windows these days ... in fact, you need the Windows command line to flash the BIOS.

Download this file and unpack it:

http://www13.zippyshare.com/v/BjqTjBZ3/file.html

Copy the downloaded ROM to the same folder.

Open the Windows command line with admin rights, change to the unpacked folder, and enter:

nvflash -6 biosname.rom

Confirm with y and wait until nvflash has finished.

Use DDU http://www.wagnardmobile.com/?q=node/94 to wipe the driver and the device entries, restart the system, and reinstall the driver.

btw: it is recommended to back up the original BIOS of your graphics card using GPU-Z or nvflash before you flash a different one.


----------



## newguy69

Quote:


> Originally Posted by *IronAge*
> 
> No need to say sorry ... we are here to help.
> 
> flashing works well under Windows these days ... in fact you need Windows command line to flash the Bios
> 
> download this file and unpack it:
> 
> http://www13.zippyshare.com/v/BjqTjBZ3/file.html
> 
> copy the downloaded rom to the same folder
> 
> open windows command line with admin rights and change to the unpack folder and enter:
> 
> nvflash -6 biosname.rom
> 
> confirm with y and wait until nvflash has finished
> 
> use DDU http://www.wagnardmobile.com/?q=node/94 to wipe the driver and the device entries, restart the System and reinstall driver
> 
> btw: it is recommended backing up the orignial Bios of your graphics card using gpu-z or nvflash before you flash a different one.


I did all the steps and the card now works perfectly (10 minutes of gameplay).

On the regular BIOS it boosted to around 1800. Now it's 2000+ all the time. I can finally get rid of Afterburner.

Thanks for all the help. You really made my day!









E: If you can run GameRock Premium clocks on your JetStream, I can recommend this if you don't want to use 3rd-party software to OC.


----------



## IronAge

Thanks for the feedback that it worked well for you.


----------



## kx11

FINALLY, the right bridge (4 slots) arrived from China.










The bridge is the oddly unadvertised GALAX HB SLI bridge.


----------



## Fediuld

Quote:


> Originally Posted by *kx11*
> 
> FINALLY the right bridge (4 slots) arrived from china
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the bridge is the oddly unadvertised GALAX HB SLi bridge


Wow
this looks amazing


----------



## danjal

Can an [email protected] ghz be bottlenecking a Zotac 1080 AMP Edition?

I had a 1070 AMP Edition and I could overclock it decently; I was scoring 95 fps in Heaven and 5800-5900 in Time Spy.

With the 1080 I'm scoring 109-115 in Heaven and 6780 (so far) in Time Spy, but overclocking isn't netting anywhere near the gains I saw with my 1070.

I mean, with the 1080, going from stock clocks to +100 core / +250 memory gets me like a 4 fps difference in Heaven. Does that make sense?
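For a rough sanity check on those numbers: a +100 MHz offset on a card boosting around 2 GHz (an assumed baseline, not a measured one) is only about a 5% clock bump, so a ~4 fps gain on a ~110 fps Heaven run is in the expected ballpark:

```python
def pct_gain(before: float, after: float) -> float:
    """Percent gain going from 'before' to 'after'."""
    return (after - before) / before * 100

print(round(pct_gain(2000, 2100), 1))  # 5.0 -> clock gain from +100 MHz at an assumed ~2 GHz boost
print(round(pct_gain(110, 114), 1))    # 3.6 -> fps gain, roughly tracking the clock bump
```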


----------



## Fediuld

Quote:


> Originally Posted by *danjal*
> 
> Can an [email protected] ghz be bottlenecking a zotac 1080 amp edition?
> 
> I had a 1070 amp edition and I could overclock it decently, I was scoring 95fps in Heaven, and 5800-5900 in timespy..
> 
> The 1080 I'm scoring 109-115 in heaven and 6780 (so far) in timespy, but overclocking isnt netting any where near the gains I seen with my 1070..
> 
> I mean with the 1080 going from stock clocks to +100core/+250 memory I'm getting like 4fps difference from stock clocks in heaven, does that make sense?


Without overclocking what are your speeds?

Do you hit the Temp Limit at all?


----------



## 0ppositeLock

Thanks for the advice guys. I'll play it safe and get some regular thermal paste.


----------



## Asus11

Has anyone recently done a shunt mod and noticed a good difference?


----------



## danjal

Quote:


> Originally Posted by *Fediuld*
> 
> Without overclocking what are your speeds?
> 
> Do you hit the Temp Limit at all?


These are with default clock settings; temp limit set at 75C. It never went over 70C during benches; I even warmed the card up by letting it loop in Heaven for about 10 minutes, then ran the benches.

stock clocks





overclocked..100/250


----------



## Sazexa

Finally got the 1080's under water. Now, to overclock them, soon. >


----------



## Descadent

Finally got a 1080 I've been wanting and waiting and missing out on. A Gigabyte Xtreme Gaming is on the way for my 34" ultrawide 1440p! Can't wait to see how different VR feels with a 1080 over a 980 in sim racing with the Rift.


----------



## Cornerer

Quote:


> Originally Posted by *danjal*


I ran Heaven at 2088-2126MHz core / 5341MHz mem.
Score: 3113
6600K at 4.78GHz

I don't think an i5 could bottleneck Heaven, though. It's literally a pure GPU benchmark.


----------



## danjal

Quote:


> Originally Posted by *Cornerer*
> 
> I ran Heaven at 2088-2126MHz core / 5341MHz mem.
> Score: 3113
> 6600K at 4.78GHz
> 
> Don't think Heaven could bottleneck an i5 though. It's literally a pure GPU benchmark.


But overclocking the core +100 and mem +250, I'm netting a 4 fps difference; that doesn't seem like much for an overclock.

But I think what you've posted shows my [email protected] is getting close to being bottlenecked at my 4.5 GHz mark with the 1080 overclocked, because it's hitting a point of diminishing returns.

I'm also running at 2560x1440.


----------



## Whitechap3l

Quote:


> Originally Posted by *danjal*
> 
> but overclocking the core +100 and mem+250 I'm netting 4fps difference.. that doesnt seem like much for an overclock.
> 
> But I think what you've posted shows my [email protected] is getting close to being bottlenecked at my 4.5ghz mark with the 1080 overclocked because its hitting a point of diminishing returns.
> 
> I'm also running at 2560/1440.


I can't imagine that your CPU is the bottleneck. Sure, with an i7 you get better results, but an i5 6600K should be more than enough.


----------



## toncij

Quote:


> Originally Posted by *Sazexa*
> 
> Finally got the 1080's under water. Now, to overclock them, soon. >


Dual rad?

Also, do this for SLI bridge (it works as the HB one):


----------



## MattBee

To the post above me: that PC is beautiful.

My GPU fan didn't kick in today playing GTA V and the game slowed down to 30 fps. Turns out it was running at 94C with the fans not spinning. For some reason GPU Tweak's fan control didn't work at all with the fan profile I set and tested.

I'm using MSI Afterburner to set a good fan profile now and it's working well again. I hate the idea of damaging it with heat.


----------



## Cornerer

Quote:


> Originally Posted by *danjal*
> 
> but overclocking the core +100 and mem+250 I'm netting 4fps difference.. that doesnt seem like much for an overclock.
> 
> But I think what you've posted shows my [email protected] is getting close to being bottlenecked at my 4.5ghz mark with the 1080 overclocked because its hitting a point of diminishing returns.
> 
> I'm also running at 2560/1440.







I'm already aware that any i5 would hold us back a bit in CPU-intensive games, even at 4.9 GHz, but I've just searched for 1080 benchmarks again, and after watching the link above I would still say you shouldn't be bottlenecked by your CPU, in your scenario at least. Heaven barely touches the extra 4 threads of an i7. Can't figure out what's going on ....


----------



## toncij

Quote:


> Originally Posted by *MattBee*
> 
> To the post above me that pc is beutiful.
> 
> My gpu fan didnt kick in today playing gta v and the game slowed down to 30 fps. Turns out it was running at 94 c and fans not spinning. For some reason gpu tweak didnt fan work at all for the fan profile I set and tested.
> 
> Im using msi afterburner to set a good fan profile now and its working well again. I hate the idea of damaging it by heat.


The post above you is mine, and I don't find my PC beautiful; it's an ugly forest of cables atm, it's a test bench








But the one I quote, yes, it's very nice (the water cooled one).


----------



## Asus11

Quote:


> Originally Posted by *Sazexa*
> 
> Finally got the 1080's under water. Now, to overclock them, soon. >


beasty rig


----------



## juniordnz

Nice tip about the nail polish to isolate the surroundings of the die...

Another question for those using an AIO solution designed for CPUs: would the fact that the waterblock isn't completely flat (because CPU IHSs aren't completely flat) be a problem when using a liquid metal like CLU, which should be applied in a very thin layer? I mean, would it make complete contact with the GPU die?


----------



## toncij

Quote:


> Originally Posted by *juniordnz*
> 
> Nice tip about the nail polish to isolate the surroudings of the die...
> 
> Another question: For those using an AIO solution designed for CPUs. The fact that the waterblock isn't completely flat (because CPU's IHS aren't completely flat) would be a problem if using liquid metal like CLU, that should be applied in a very thin layer? I mean, would it make complete contact with the GPU die?


Not sure about liquid metal, but normal paste applies very nicely. It's perfectly able to fill small imperfections in the block.


----------



## Klocek001

Quote:


> Originally Posted by *Sazexa*
> 
> Finally got the 1080's under water. Now, to overclock them, soon. >


Oh my God yes! that is TASTEFUL


----------



## SAFX

Installing an EVGA 1080 FTW on Windows 7; I've been using an AMD 295X2 for nearly 2 years.

Aside from the GPU driver, is there anything else I need to upgrade (like chipset drivers) or tweak? Just asking; I want to do this right and avoid complaining later because I missed a step simply for not asking.


----------



## zlpw0ker

Quote:


> Originally Posted by *SAFX*
> 
> Installing evga 1080 FTW on Windows 7, been using AMD 295x2 for nearly 2 years,
> 
> Aside from the gpu driver, is there anything else I need to upgrade, like chipsets, or tweak, just asking, want to do this right and avoid complaining later because I missed a step simply for not asking


Yeah, use DDU to completely remove any AMD drivers as part of the GPU upgrade.


----------



## cr4p

Hi Guys,

Is this a good score for Firestrike?









http://www.3dmark.com/fs/9918688


----------



## Snabeltorsk

Quote:


> Originally Posted by *cr4p*
> 
> Hi Guys,
> 
> Is this a good score for Firestrike?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/9918688


Yes that is a normal score for an overclocked 1080. Most ppl get around 23500 to 25000 with overclocks.


----------



## r0l4n

Did anybody experience a change in max stable clocks when changing the motherboard? I had to reduce the 24/7 clocks by 2 bins after switching to X99, I get artifacts with the same clocks that were stable with my previous Z97...


----------



## wardo3640

I have a pair of 1080 Sea Hawk Xs headed my way. When I get them in my hot little hands, where should I start? Same as most, I would like to hit at least 2100 on the pair.

I also need an HB SLI bridge for them, going in slot 1 and slot 3 on an R5E10 mobo. I am having trouble finding this bridge though.


----------



## SAFX

Just finished installing 1080 FTW, 100% stock, no OC, no custom fan, etc., seeing decent improvements compared to my previous 295x2, lots of room for more









*Overall Improvements*

FPS - *24%*
Score - *24%*
Min FPS - *29%*
Max FPS - *28%*

*Valley (AMD 295x2)*


*Valley (Evga 1080 FTW)*


*1080 Temps*


----------



## Reckit

Quote:


> Originally Posted by *SAFX*
> 
> Just finished installing 1080 FTW, 100% stock, no OC, no custom fan, etc., seeing decent improvements compared to my previous 295x2, lots of room for more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Overall Improvements*
> 
> FPS - *24%*
> Score - *24%*
> Min FPS - *29%*
> Max FPS - *28%*
> 
> *Valley (AMD 295x2)*
> 
> 
> *Valley (Evga 1080 FTW)*
> 
> 
> *1080 Temps*


Is that the temp reading from your benchmark?

I would say that 82 degrees is a little hot for a non-OC benchmark. What was your ambient temperature? I would use a custom fan speed setup; the standard one seems to be a little conservative IMO.


----------



## juniordnz

Quote:


> Originally Posted by *SAFX*
> 
> Just finished installing 1080 FTW, 100% stock, no OC, no custom fan, etc., seeing decent improvements compared to my previous 295x2, lots of room for more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Overall Improvements*
> 
> FPS - *24%*
> Score - *24%*
> Min FPS - *29%*
> Max FPS - *28%*
> 
> *1080 Temps*


Geez, you got over 80ºC with something light like Heaven? I would set a 100% fan profile under load ASAP. You must be throttling like crazy with that.


----------



## toncij

Now, if only 1080Tis would exist...
Quote:


> Originally Posted by *zlpw0ker*
> 
> ye,use DDU to completly remove any AMD drivers regarding the GPU upgrade.


Shouldn't be a need for that on modern machines and Windows versions.


----------



## tps3443

Hey guys, which bios should I use on my MSI GTX 1080 Founders Edition?

It will boost to around 1800 MHz stock. I run Afterburner, but I would like it to boost to around 2000 MHz without Afterburner, like my GTX 1070 SC ACX did.

I've got the flashing software already.

Could someone post a link to the BIOS to use?

Thanks!


----------



## GreedyMuffin

Seems like the 1080 can undervolt without losing too much speed.

I'm playing at 2088 MHz at 0.975 V. Gonna go for 2100 next; mem is at +500. No crashes so far after many hours in BF4 or in folding. Stable enough for me.
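Undervolting pays off because dynamic power scales roughly with frequency times voltage squared. A rough first-order sketch; the 1.062 V stock reference here is an assumed value for illustration, not a measured one:

```python
def rel_dynamic_power(freq_mhz: float, volts: float,
                      ref_freq_mhz: float = 2088.0, ref_volts: float = 1.062) -> float:
    """Dynamic power relative to a reference operating point,
    using the P ~ f * V^2 approximation."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# 2088 MHz at 0.975 V vs the same clock at an assumed 1.062 V stock voltage
print(round(rel_dynamic_power(2088, 0.975), 2))  # ~0.84, i.e. roughly 16 % less board power
```

Less power also means less heat, which is why an undervolted card tends to hold its boost clock longer before the temperature-based clock-downs kick in.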


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Seems like the 1080 can downvolt without loosing too much speed.
> 
> I'm playing on 2088mhz at 0.975V. Gonna go for 2100 next, Mem is on 500+. No crashes so far after many hours in BF4 nor in folding. Stable enough for me.


That's nice. I thought I was stable at 2126 MHz / 1.031 V after several flawless Firestrike runs, but a few minutes in Rainbow Six Siege proved me wrong







1.050V seems to work though.


----------



## GreedyMuffin

Played 3-4 hours of BF4. BF4 is much more intense on the OC than R6S in my experience.


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Played 3-4 hours of BF4. BF4 is much more intense on the OC than R6S in my experience.


I also thought R6S would be very light on the OC, especially after passing several Firestrike Stress Tests at 99%+ (that proved to be a big OC breaker for me), but I can't even play for 5 minutes without getting crazy screen freezes when undervolted.

Bummer; I was very happy to run a steady 2100 MHz after the temp clockdowns with 1.031 V.


----------



## GreedyMuffin

Quote:


> Originally Posted by *juniordnz*
> 
> I also thought R6S would be very light on the OC, specially after passing several Firestrike Stress Tests with +99% (that proved to be a big OC breaker for me), but I can't even play for 5 min that I get crazy screen freezes when undervolted.
> 
> Boomer, I was very happy to run a steady 2100mhz after the temp clockdowns with 1.031V.


I've actually never gotten a crash in R6S. If you have BF4 I would definitely test with that.

A family member is currently playing on my comp. Traveled from Iceland to Norway. So it's a win-win. He will test the system for me. hehe


----------



## Benjiw

If your overclocks are passing stress tests with anything less than 100% and you're crashing in games, then you're not stable, simples.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Benjiw*
> 
> If you're overclocks are passing stress tests by anything less than 100% and you're crashing games, then you're not stable, simples.


^This.

Luckily mine is actually stable.


----------



## juniordnz

Quote:


> Originally Posted by *Benjiw*
> 
> If you're overclocks are passing stress tests by anything less than 100% and you're crashing games, then you're not stable, simples.


Are you familiar with the Firestrike Stress Test? Anything above 97% is considered a successful pass.

"After running a 3DMark Stress Test, you will see your system's Frame Rate Stability score. A high score means your PC's performance under load is stable and consistent. To pass the test, your system must complete all loops with a Frame Rate Stability of at least 97%. For more details, please read our 3DMark Technical Guide." https://www.futuremark.com/pressreleases/check-your-pcs-stability-with-new-3dmark-stress-tests

This rating takes into account framerate consistency. If your card's clock fluctuates, you lose consistency. We all know that Pascal GPUs open at one clock and get downclocked over time due to temperature. Mine gets 2 clock-downs, from 2126 to 2100, going from idle to 55ºC. That's why no card will get 100%.

Go ahead and try with stock settings to see if you get 100% with air cooling and temp throttling.
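The quoted Futuremark text doesn't give the exact formula, but the idea above can be sketched assuming Frame Rate Stability compares the worst loop's average frame rate against the best loop's (an assumed formula, not Futuremark's published one):

```python
def frame_rate_stability(loop_fps) -> float:
    """Worst loop's average FPS as a percentage of the best loop's (assumed formula)."""
    return min(loop_fps) / max(loop_fps) * 100

# a card that opens at 2126 MHz and settles at 2100 MHz loses ~1.2 % clock over the run,
# so early loops come in slightly faster than later ones
loops = [120.0, 119.2, 118.8, 118.6]
print(frame_rate_stability(loops) > 97)  # True -> would still pass the 97 % threshold
```

This shows why a temperature clock-down alone shaves a point or two off the score without the card being unstable in any meaningful sense.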


----------



## PasK1234Xw

edit

sold


----------



## SAFX

Quote:


> Originally Posted by *Reckit*
> 
> Is that the temp reading from your benchmark?
> 
> I would say that 82 degrees is a little hot for a non oc benchmark. What was your ambient temperature? I would use a custom fan speed setup, the standard seems to be a little conservative IMO


A few things were working against me on that last benchmark:

- Front panel CPU rad/fan blowing directly over the 1080
- Ambient temp 76-78 F
- No custom fan profile selected
- Forgot to uninstall SpeedFan (previously used with the 295X2, interfering with the system fans)

So I moved the CPU rad/fan higher so hot air blows out the top of the case, set the fan profile to Aggressive, and uninstalled SpeedFan.

Ran the test again; temps came down 6-8C.
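Since ambient temps in this thread get quoted in both scales, a one-liner for the conversion:

```python
def f_to_c(temp_f: float) -> float:
    """Convert degrees Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

print(round(f_to_c(77)))  # 25 -> the 76-78 F ambient above is roughly 24-26 C
```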


----------



## juniordnz

Quote:


> Originally Posted by *SAFX*
> 
> Few things working against me on that last benchmark,
> 
> Front panel CPU rad/fan blowing directly over 1080
> Ambient temp 76-78 F
> No custom fan profile selected
> Forgot to uninstall SpeedFan (previously used with 295x, interfering with system fan)
> I moved the cpu rad/fan higher so hot air blows out the top of the case.
> Set fan profile to Aggressive.
> Uninstalled SpeedFan
> 
> Ran test again, temps came down 6-8C,


Still too high IMO for an FTW. Mine always stays around 55C with a 100% fan setting. And that's in a hot country with no AC in my room.


----------



## Goroshi

Quote:


> Originally Posted by *juniordnz*
> 
> Still too high IMO for a FTW. Mine always stays around 55C with 100% fan setting. And that in a hot county with no AC in my room.


You must have a chip that doesn't produce much heat then; my FTW at 100% fan hovers at 65-70C in games, and sometimes I've even seen 74-75C when it starts using 110%+ power.


----------



## jase78

Quote:


> Originally Posted by *Goroshi*
> 
> you must have a chip that doesnt produce much heat then, my FTW at 100% hovers 65-70C in games and sometimes I even seen 74-75C when it starts using 110%+ power.


Maybe he's just talking about when benchmarking and not prolonged gaming.


----------



## SAFX

Quote:


> Originally Posted by *juniordnz*
> 
> Still too high IMO for a FTW. Mine always stays around 55C with 100% fan setting. And that in a hot county with no AC in my room.


Appreciate the info, but don't throw numbers around without being clear on what you're actually saying; 55C @100% fan under what scenario? idle, Valley, Heaven, 3DMark, drunk, high, floating in space, etc?









At IDLE, I'm at 43C, fan 1400rpm, ambient 26C/79F, stock gpu.


----------



## tps3443

Which BIOS? Anyone know? I cannot find much on Google. I have a 1080 FE. I want it to boost to 2000 without MSI Afterburner.

Anyone?


----------



## Goroshi

Quote:


> Originally Posted by *jase78*
> 
> Maybe he's just talking about when benchmarking and not prolonged gaming.


He said 55C through a Firestrike Stress Test, that heats up my GPU to around the 68C mark so Idk.


----------



## Reckit

yep mine is about 60-65c on firestrike fan at 60%, ambient around 20 to 25c


----------



## wholeeo

So blew some cash on some EK Gear for my FE. Just waiting on radiators before I throw my mITX under water which I promised myself I wouldn't do again.


----------



## juniordnz

Quote:


> Originally Posted by *jase78*
> 
> Maybe he's just talking about when benchmarking and not prolonged gaming.


Quote:


> Originally Posted by *SAFX*
> 
> Appreciate the info, but don't throw numbers around without being clear on what you're actually saying; 55C @100% fan under what scenario? idle, Valley, Heaven, 3DMark, drunk, high, floating in space, etc?
> 
> 
> 
> 
> 
> 
> 
> 
> At IDLE, I'm at 43C, fan 1400rpm, ambient 26C/79F, stock gpu.


Quote:


> Originally Posted by *Goroshi*
> 
> He said 55C through a Firestrike Stress Test, that heats up my GPU to around the 68C mark so Idk.


My card is now idling at 34C 60% fan speed (inaudible). Local temp is 30C here (can't tell room temp).

I get +-55C with hours of gaming, mostly R6S. Through a Firestrike bench if it's a cool day I get 50C tops. Heaven, which is light too, tops out at 55C. I get veeeeery random, not so often, spikes to 59-60. But most of the time it's around 55C.

2126mhz/+500mem/1.062V/130%TDP/100%fan/No side panel

Clear enough?


----------



## SAFX

Quote:


> Originally Posted by *juniordnz*
> 
> Local temp is 30C here (can't tell room temp).


Much better, ty, and nice temps, especially idle, but what's your ambient temp? Is your house an igloo?


----------



## Cornerer

Quote:


> Originally Posted by *SAFX*
> 
> At IDLE, I'm at 43C, fan 1400rpm, ambient 26C/79F, stock gpu.


You sure your GPU was actually idling (like at 139MHz)?

My Palit JetStream sits at ~38C with fans not spinning. Ambient 28/29C.


----------



## tps3443

My card runs about 72C at 100% fan speed. Is this normal? It's a reference 1080.

Idle is about 40 or so.


----------



## Cornerer

Quote:


> Originally Posted by *tps3443*
> 
> My card runs about 72C at 100% fan speed is this normal? It's a reference 1080.
> 
> Idle is about 40 or so.


I've seen review(s) reporting low 60s at 100% fan. Don't think ambient was mentioned though.


----------



## juniordnz

Quote:


> Originally Posted by *SAFX*
> 
> Much better, ty, and nice temps, especially idle, but what's your ambient temp? Is your house an igloo?


Hahaha, not even close to an igloo, sadly... I have no AC in my room, but I live 7 floors above ground so it's a bit colder than the local temp. I can't really tell because I have no instrument to measure it, but I'd guess 4 or 5 degrees lower than the local temp outside? IDK... When idling I always get something like 5 degrees above local temp, so it must be 10 above room temp or so...

Maybe I have a good piece of silicon, IDK; I get a pretty mediocre overclock at stock voltage. What makes a HUGE difference is case airflow. If you don't have impeccable airflow, with fans on the side blowing cold air at the card, I suggest you keep your side panel open. I wouldn't mind 70C if these cards didn't throttle so much as temperature increases. At 55-60 I can keep a steady 2100mhz under load, and I'm happy with that.


----------



## GanGstaOne

So, first time using liquid metal. I didn't expect much: load CPU temps dropped 4-5C, but my 1080 dropped 3C at idle and 8-10C under load. I'm at 35-37C in 3DMark now, and it'll even go down a few degrees in some games, haha. This Thermal Grizzly is really good.
Will post screenshots tomorrow if I have some time


----------



## juniordnz

Quote:


> Originally Posted by *GanGstaOne*
> 
> So first time using liquid metal didnt expect much load cpu temps drop with 4-5C but my 1080 drop with 3C on idle and 8-10 on load i have 35-37C in 3Dmark now this will even go down few deg. in some games haha this thermal grizzly is really good
> Will post screenshots tomorrow if i have some time


8-10 degrees just by changing normal paste for liquid metal? That's veeery nice! Which card do you have? And which cooler?


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> 8-10 degrees only by changing normal paste for liquid metal? The's veeery nice! Which card do you have? And which cooler?


Gigabyte 1080 G1 Gaming, Kraken G10 + Thermaltake Water 3.0 Extreme S 240mm.
But I'm changing it for an Arctic Cooling Liquid Freezer 240 with a thicker rad.
I didn't expect more than a 2-3C drop, so the GPU temps were a big and pleasant surprise.


----------



## Tdbeisn554

So my GTX 770 is starting to show her age... And I really want a new card. Convince me to buy a 1080 (I really want one, but it is a bit expensive). What model are you recommending?
I am playing at 1080p but I want to ascend to Glorious 4K when there are good 4K IPS 120+ Hz monitors out (more for futureproofing, not really to play all games at 4K 120Hz, so I do not need to buy a monitor every 2 years).


----------



## GanGstaOne

Quote:


> Originally Posted by *Archang3l*
> 
> So my GTX 770 is starting to showing her age... And I really want a new card. Convince me to buy a 1080 (I really want one but it is a bit expensive) and What model are you recommending?
> I am playing at 1080p but I want to ascent to Glorious 4K when there are good 4K IPS 120+ Hz monitors out.


If you really want good 4K, go for the Titan X Pascal or wait for the 1080 Ti, because the 1080
is still not a 100% 4K card.


----------



## Tdbeisn554

Quote:


> Originally Posted by *GanGstaOne*
> 
> If you really want good 4K go for Titan X Pascal or wait for 1080 TI cause 1080
> is still not 100% 4k card


I know, but the Titan is waaay too expensive for me, and I just want a good 4K screen for futureproofing; I plan to keep that monitor for at least 7 years. Plus you can still play games at 1080p or 1440p, right? Or just lower the settings a bit.
I just want a great card I can keep for 4, maybe 5 years.


----------



## toncij

Quote:


> Originally Posted by *Archang3l*
> 
> So my GTX 770 is starting to showing her age... And I really want a new card. Convince me to buy a 1080 (I really want one but it is a bit expensive) and What model are you recommending?
> I am playing at 1080p but I want to ascent to Glorious 4K when there are good 4K IPS 120+ Hz monitors out.


To be able to play at 4K@120Hz you will always need two cards of the Titan level, each year or at most every two years. Games advance, and 4K@120Hz is extremely difficult to reach.


----------



## Tdbeisn554

Quote:


> Originally Posted by *toncij*
> 
> To be able to play at 4K@120Hz you will always need two cards of the Titan level, each year or at most every two years. Games advance, and 4K@120Hz is extremely difficult to reach.


I know, edited my post, 4K 120Hz monitor is more for futureproofing and just having the option to play games at 120Hz on 1080, 1440 or 4K if it is possible.


----------



## TWiST2k

Quote:


> Originally Posted by *Archang3l*
> 
> So my GTX 770 is starting to showing her age... And I really want a new card. Convince me to buy a 1080 (I really want one but it is a bit expensive) and What model are you recommending?
> I am playing at 1080p but I want to ascent to Glorious 4K when there are good 4K IPS 120+ Hz monitors out (More for futureproofing, not really to play all games at 4K 120Hz) So I do not need to buy a monitor every 2 years.)


These posts crack me up. Bro, I don't work for any graphics card company, and "convincing" you has zero effect on my existence. Do the research, do your homework, and make an informed decision; everything you need to know is out in the wild and easily accessible. I have a 1080 FTW and I could not be happier, and I came from a 980 Ti Classified. I game @ 2560x1440; to me 4K is pointless with games right now.


----------



## SAFX

Quote:


> Originally Posted by *Cornerer*
> 
> You sure your GPU was actually idling (like at 139MHz)?
> 
> My Palit JetStream sits at ~38C with fans not spinning. Ambient 28/29C.


AH-HA! ....it wasn't idling after all, locked at 1721Mhz, totally forgot I adjusted power settings.

What a shame, I got so disappointed with my initial temps, needed some cheering up, picked up *CLASSIFIED , KA-BOOM!*








Totally lucked out! MicroCenter was sold out for weeks, settled on FTW, driving back home earlier today, dropped by MC just to check, had 1 in stock, MINE!


----------



## KickAssCop

Got 2 Strix AG incoming.


----------



## SAFX

Quote:


> Originally Posted by *KickAssCop*
> 
> Got 2 Strix AG incoming.


Will be a nice setup! You water cooling?


----------



## MattBee

f


----------



## TWiST2k

Quote:


> Originally Posted by *SAFX*
> 
> AH-HA! ....it wasn't idling after all, locked at 1721Mhz, totally forgot I adjusted power settings.
> 
> What a shame, I got so disappointed with my initial temps, needed some cheering up, picked up *CLASSIFIED , KA-BOOM!*
> 
> 
> 
> 
> 
> 
> 
> 
> Totally lucked out! MicroCenter was sold out for weeks, settled on FTW, driving back home earlier today, dropped by MC just to check, had 1 in stock, MINE!


Thumbs down, should have stuck with the FTW, haha! I have a 980 Ti Classified and it was dope, but this time around it's not worth the extra cash IMO.


----------



## Sirstiv

Hey guys,

It may be a repeated question, but I'm feeling lazy.

Do you void the warranty if you remove the stock fan of a new card like the 1080 and reapply a better thermal solution?

I'd like to knock down some extra temps. (79 degrees max on stock settings.)

Also, what's the best aftermarket air solution (that's not too loud) for a reference 1080? (Asus)


----------



## TWiST2k

Quote:


> Originally Posted by *Sirstiv*
> 
> Hey guys,
> 
> It may be a repeated question but i'm feeling lazy.
> 
> Do you void warranty if you remove the stock fan of a new card like the 1080 and re apply a better thermal solution?
> 
> I'd like to knock down some extra temps.
> 
> Also what's the best aftermarket air solution? (that's not to loud) for a reference 1080. (asus)


Not with EVGA, I do not know about anyone else. Google could probably give you a better answer in much less time.


----------



## SAFX

Quote:


> Originally Posted by *TWiST2k*
> 
> Thumbs down, should of stuck with the FTW haha! I have a 980 Ti Classified and it was dope, but this time around, not worth the extra cash IMO.


*Who said I paid extra for it?*


----------



## TWiST2k

Quote:


> Originally Posted by *SAFX*
> 
> *Who said I paid extra for it?*


Well there you go then, that is worth it haha! Congrats on the *****in' deal!


----------



## KickAssCop

Quote:


> Originally Posted by *SAFX*
> 
> Will be a nice setup! you water cooling?


Not this time. I may put on the NZXT brackets if I am feeling adventurous.
EK blocks are too expensive and a ***** to sell.


----------



## SAFX

Quote:


> Originally Posted by *KickAssCop*
> 
> Not this time. I may put on the NZXT brackets if I am feeling adventurous.
> EK blocks are too expensive and a ***** to sell.


..but those EKs sure look awesome! oh man, the temptation!


----------



## nexxusty

Quote:


> Originally Posted by *Archang3l*
> 
> I know, but TItan is waay to expensive for me and I just want a good 4K screen for future proofing I plan to keep that monitor for at least 7 years then + you can still play games at 1080 or 1440p right? or just settings a bit less.
> I just want to have a great card I can keep for 4 maybe 5 years


You won't find anyone here who keeps a card for 4 or 5 years. That's ridiculous....

The GTX 770 started showing its age the second it was released. You do know it's a 680 with faster memory, right? That's it.

Buy a good card. You'll enjoy it.


----------



## Cornerer

Quote:


> Originally Posted by *nexxusty*
> 
> You won't find anyone here who keeps a card for 4 or 5 years. That's ridiculous....


I had been using my HD 6950 for freakin' 6 years, till recently the VRMs just kept boiling even with the fan at 4700rpm.

Rate my ridiculousness


----------



## nexxusty

Quote:


> Originally Posted by *Cornerer*
> 
> I had been using my HD 6950 for freakin' 6 years till recently when VRMs just keep boiling even at 4700rpm fan.
> 
> Rate my ridiculousness


Extreme.

Maybe find a hobby you can afford?

I kid.... lol.


----------



## Cornerer

Well, obviously not going cheapish on my rig anymore :3

May consider Kraken G10 cooling in the near future, but first I need to decide whether to go dual tower or AIO for the CPU.


----------



## Sirstiv

Has anyone done a thermal paste 'upgrade' on these cards yet with any decent results?


----------



## GanGstaOne

Quote:


> Originally Posted by *Sirstiv*
> 
> Has anyone done a thermal paste 'upgrade' on these cards yet with any decent results?


With custom cooling, I changed MX-4 for Thermal Grizzly Conductonaut with very good results: almost a 10C drop at full load compared to MX-4. But swapping the stock paste for something like MX-4 or another good paste while keeping the stock cooler makes no difference at all.


----------



## reb00tas

heaven on my Evga 1080 FE.

225/500


----------



## andytom69

Hi this is my 1080, run at 2100

IMG_20160825_203605.jpg 3809k .jpg file


----------



## Papazmurf

Quote:


> Originally Posted by *Cornerer*
> 
> I had been using my HD 6950 for freakin' 6 years till recently when VRMs just keep boiling even at 4700rpm fan.
> 
> Rate my ridiculousness


I was right there with you brotha. Had HD6970s in crossfire till about a month ago. That 2GB of VRAM was killing me. My new Gigabyte Xtreme Water Cooling version 1080 is quite the improvement.

Still rocking out on X58 with a Xeon x5660 as well, but now I'm getting off topic


----------



## PasK1234Xw

Quote:


> Originally Posted by *reb00tas*
> 
> heaven on my Evga 1080 FE.
> 
> 225/500


Try running in full screen. If the score doesn't go higher, dial back your memory; I don't think it is stable and error correction is kicking in.

I can get 115 at 2GHz
with +200 on memory.


----------



## barsh90

Anyone have any news on the EK HB bridges?
I currently have 2 1080 EK-Seahawks, and Nvidia's fugly HB bridge is a no-go due to the hideous design that interferes with waterblocks.

2 regular SLI bridges ain't cutting it at 1440p 144Hz, especially with Witcher 3


----------



## PasK1234Xw

IMO all hard bridges are ugly. The only ones I like are the simple floppy bridges with some Plasti Dip.
Everyone likes to go crazy on design. I also hate the logo, even more so when lit up. Keep it simple and sleek and have it sit flush on the cards.


----------



## SAFX

Finally seeing idle temps consistent with expectations and reports from other members, *30C* - EVGA Classified (stock) @253Mhz 40% fan speed, ambient 26C/79F


----------



## Sirstiv

Quote:


> Originally Posted by *GanGstaOne*
> 
> With custom cooling changed MX-4 for thermal grizzly conductonaut very good results almost 10C drop in full load compare to mx-4 but changing stock paste with something like mx-4 or other good paste if you keep stock cooler no difference at all


Hmm was thinking with replacing stock paste with grizzly and stock cooler... may have to test.


----------



## andytom69




----------



## tps3443

Quote:


> Originally Posted by *SAFX*
> 
> AH-HA! ....it wasn't idling after all, locked at 1721Mhz, totally forgot I adjusted power settings.
> 
> What a shame, I got so disappointed with my initial temps, needed some cheering up, picked up *CLASSIFIED , KA-BOOM!*
> 
> 
> 
> 
> 
> 
> 
> 
> Totally lucked out! MicroCenter was sold out for weeks, settled on FTW, driving back home earlier today, dropped by MC just to check, had 1 in stock, MINE!


I got my MSI GTX 1080 Founders Edition for $280, sealed in the box. I am pretty sure I've got everyone beat, LOL.
The seller had picked it up with school grant money for a graphics design program, so $280 was a steal.


----------



## andytom69

gigabyte g1 ek


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *andytom69*
> 
> 
> 
> gigabyte g1 ek


Cool, I have the G1. Expensive to put under water?


----------



## tps3443

Quote:


> Originally Posted by *andytom69*
> 
> 
> 
> gigabyte g1 ek


Looks good! I am glad to see that with the 1070s and 1080s there are a lot more watercooled AIB cards. Before the 10 series it was only EVGA and their Hydro Copper model. I wish I had a custom loop. But unfortunately, all I have is my FE, lol.


----------



## toncij

Quote:


> Originally Posted by *SAFX*
> 
> AH-HA! ....it wasn't idling after all, locked at 1721Mhz, totally forgot I adjusted power settings.
> 
> What a shame, I got so disappointed with my initial temps, needed some cheering up, picked up *CLASSIFIED , KA-BOOM!*
> 
> 
> 
> 
> 
> 
> 
> 
> Totally lucked out! MicroCenter was sold out for weeks, settled on FTW, driving back home earlier today, dropped by MC just to check, had 1 in stock, MINE!


So, how does it work? Max clock under 80% fan & closed case and max at 100%? Btw, gonna SLI it with FTW or just swap?


----------



## zlpw0ker

I have a few questions regarding my 1080 Seahawk X.
My GPU core clock is 1987MHz out of the box; is that good or normal?
Second question: how much can I OC the GPU?
Does anyone with a 1080 Seahawk have a stable overclock that is somewhat good?


----------



## toncij

Quote:


> Originally Posted by *zlpw0ker*
> 
> have a few questions regarding my 1080 seahawk x
> my gpu core clock is 1987mhz out of the box, is that good or normal?
> second question is, how much can I OC the gpu?
> does anyone have the 1080 seahawk that have gotten a stabile overclock that is somewhat good?


It's fine. OC as much as you want. If it doesn't work, the app or the driver will crash.


----------



## zlpw0ker

Quote:


> Originally Posted by *toncij*
> 
> It's fine. OC as much as you want. If it doesn't work, the app or the driver will crash.


I'm not installing any other MSI software besides Afterburner. Last time I installed the MSI software from the driver CD it went haywire and crippled my PC; everything was laggy and it took 3 seconds just to select a folder, for example.

I'm thinking of using Afterburner or XOC, though I'm not sure if it's possible to use XOC on non-EVGA cards.

And a question I don't know the answer to: I set my fan speed to 44%, and every time I reboot my PC it drops back to 33% and auto. How do I make it stay at 44% after rebooting the OS?


----------



## Bishop07764

Quote:


> Originally Posted by *zlpw0ker*
> 
> have a few questions regarding my 1080 seahawk x
> my gpu core clock is 1987mhz out of the box, is that good or normal?
> second question is, how much can I OC the gpu?
> does anyone have the 1080 seahawk that have gotten a stabile overclock that is somewhat good?


I have a Seahawk EK. That's what mine did on stock gaming bios. Didn't try flipping it to the oc stock bios. If you want higher clocks without doing anything else, you can flash the gaming Z bios in oc mode onto your card. That's what I did, and it boosts to about 2080 at stock now. I would back up your bios or just download the regular gaming x bios to flash back if your card can't handle it.


----------



## TWiST2k

Quote:


> Originally Posted by *GanGstaOne*
> 
> With custom cooling changed MX-4 for thermal grizzly conductonaut very good results almost 10C drop in full load compare to mx-4 but changing stock paste with something like mx-4 or other good paste if you keep stock cooler no difference at all


I switched to mx-4 on my FTW and dropped like 2c, nothing mind blowing at all.


----------



## SAFX

Quote:


> Originally Posted by *toncij*
> 
> So, how does it work? Max clock under 80% fan & closed case and max at 100%? Btw, gonna SLI it with FTW or just swap?


Been benchmarking all day; preliminary results are suspiciously worse than my FTW. More work to do, but here are some results.
BTW: In Nvidia Control Panel, I set *Power management mode* to _Prefer maximum performance_. Is it common to leave the other settings at default to ensure max performance? Purely for benchmarking, not gaming.

*Valley (all stock except power management in NCP)*

Code:


Card        score    fps      min/max
------------------------------------------------------
FTW         4116     98       39/186 
CLASS       4048     97       36/170
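For scale, the gap between those two rows is under two percent; a quick check using the scores from the table:

```python
# Relative difference between the two Valley scores from the table above.
ftw_score, classy_score = 4116, 4048

delta_pct = (ftw_score - classy_score) / classy_score * 100
print(f"FTW leads by {delta_pct:.2f}%")  # prints: FTW leads by 1.68%
```

A ~1.7% spread on a single run of each card is close to run-to-run noise, which is why averaging several runs per card matters even in a same-system comparison.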


----------



## Benjiw

Quote:


> Originally Posted by *SAFX*
> 
> been benchmarking all day, preliminary results are suspiciously negative compared to my FTW; more work to do, but here's some results.
> BTW: In Nvidia Control Panel, I set *Power management mode* to _Prefer maximum performance_, is it common to leave other settings default to ensure max performance, purely for benchmarking, not gaming.
> 
> *Valley (All Stock except power managemend in NCP)*
> 
> Code:
> 
> Card        score    fps      min/max
> ------------------------------------------------------
> FTW         4116     98       39/186
> CLASS       4048     97       36/170


I do hope you have the classy on water or better cooling, comparing them both on air is pretty pointless..


----------



## toncij

Quote:


> Originally Posted by *Benjiw*
> 
> I do hope you have the classy on water or better cooling, comparing them both on air is pretty pointless..


Why would it be? A better heatsink and better binned chips should give some benefits.


----------



## SAFX

Quote:


> Originally Posted by *Benjiw*
> 
> I do hope you have the classy on water or better cooling, comparing them both on air is pretty pointless..


Respectfully disagree; comparing benchmarks _between systems_ is pointless given the disparities on case style, air flow, ambient temps, etc., but comparing two cards in the same system is far more accurate, at least relatively, because the system itself is constant.


----------



## Awsan

So, sup boys. Pulling the trigger on an FTW 1080; any last thoughts?


----------



## Vellinious

The Classy's only advantage is the voltage control, if the 10 series even has it, and an increased power limit. Voltage is negated almost immediately by heat generation, and the power limits, hopefully, will get addressed when the Pascal bios tweaker releases.

The Pascal Classy without sub-ambient cooling is a waste....those that fall for the marketing have entertainment value, though.


----------



## SAFX

Quote:


> Originally Posted by *Vellinious*
> 
> The Classy's only advantage is the voltage control, if the 10 series even has it, and an increased power limit. Voltage is negated almost immediately by heat generation, and the power limits, hopefully, will get addressed when the Pascal bios tweaker releases.
> 
> The Pascal Classy without sub-ambient cooling is a waste....those that fall for the marketing have entertainment value, though.


lol, safe to say we're all duped by clever marketing; don't think it only applies to those who purchase classy's


----------



## KickAssCop

Quote:


> Originally Posted by *SAFX*
> 
> ..but those EKs sure look awesome! oh man, the temptation!


They do indeed, but spending another 300 plus shipping on blocks is too rich for my blood for cards that are gimped. Maybe I'll put blocks on the 1080 Tis. These cards are here for only 6-8 months, to be honest.


----------



## Cornerer

Quote:


> Originally Posted by *Papazmurf*
> 
> I was right there with you brotha. Had HD6970s in crossfire till about a month ago. That 2GB of VRAM was killing me.


My 6950 was even the 1GB VRAM version, so bad that my 6970 BIOS flash was pointless in most games :P
Quote:


> Originally Posted by *Vellinious*
> 
> The Classy's only advantage is the voltage control, if the 10 series even has it, and an increased power limit. Voltage is negated almost immediately by heat generation, and the power limits, hopefully, will get addressed when the Pascal bios tweaker releases.
> 
> The Pascal Classy without sub-ambient cooling is a waste....those that fall for the marketing have entertainment value, though.


Exactly. A friend here with an unfortunate Classy got an even lower OC (2101) than I did with my 6+2 phases and no voltage unlock.
On the latest generations of GPUs (even more so for Pascal), max OCs are dominated by the silicon lottery under air or water cooling.


----------



## TWiST2k

Quote:


> Originally Posted by *Awsan*
> 
> So sup boys, Pulling the trigger on a FTW 1080 any last thoughts?


Don't wait, just do it! haha!


----------



## gree

Quote:


> Originally Posted by *TWiST2k*
> 
> Don't wait, just do it! haha!


Don't let your dreams be dreams


----------



## toncij

Yes, putting 1080s under water is a waste. Too short their life will be because Volta will trump it.


----------



## Spieler4

Quote:


> Originally Posted by *zlpw0ker*
> 
> have a few questions regarding my 1080 seahawk x
> my gpu core clock is 1987mhz out of the box, is that good or normal?
> second question is, how much can I OC the gpu?
> does anyone have the 1080 seahawk that have gotten a stabile overclock that is somewhat good?


Out-of-box clocks seem normal.
With the latest driver, 372.54, on Win 7 I can OC stable at 2139MHz without a voltage OC.
Power limit 100%
Core clock +176
Memory clock +496

Radiator fan speed 1050rpm
Temps <55


----------



## Deders

Quote:


> Originally Posted by *SAFX*
> 
> been benchmarking all day, preliminary results are suspiciously negative compared to my FTW; more work to do, but here's some results.
> BTW: In Nvidia Control Panel, I set *Power management mode* to _Prefer maximum performance_, is it common to leave other settings default to ensure max performance, purely for benchmarking, not gaming.
> 
> *Valley (All Stock except power managemend in NCP)*
> 
> Code:
> 
> Card        score    fps      min/max
> ------------------------------------------------------
> FTW         4116     98       39/186
> CLASS       4048     97       36/170


Is that benched at the highest preset?


----------



## shadow85

Finally got my 2x STRIX A8. Both cards are boosting to ~1930-2000 out of the box without my touching anything yet. Is this good?

Finally I can play W3 @4K at mostly max details, 60fps. I have shadows on medium (barely noticeable coming from Ultra) and Hairworks AA on 2x, and it doesn't drop under 60 fps. If I change Hairworks AA to 8x and shadows to Ultra, my fps drops to ~48.
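For reference, those clocks sit well above the GTX 1080's rated 1733 MHz boost; a quick sanity check (the only hard number here is the rated boost, the rest is arithmetic):

```python
# Compare observed boost clocks against the GTX 1080's rated boost clock.
RATED_BOOST_MHZ = 1733  # NVIDIA's published boost spec for the GTX 1080

def pct_over_rated(observed_mhz):
    """Percentage by which an observed clock exceeds the rated boost."""
    return (observed_mhz - RATED_BOOST_MHZ) / RATED_BOOST_MHZ * 100

# 1930-2000 MHz out of the box is roughly 11-15% above the rated boost,
# which is ordinary GPU Boost behavior on Pascal, not a golden chip.
```

So ~1930-2000 out of the box is normal; most 1080s opportunistically boost well past the spec sheet when thermal and power headroom allow.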


----------



## Bravoexo

I recently got mine too. The overclocking headroom is not that great, but that was already somewhat known. I tested up to just +107 core, but decided to stick with +100 and a custom fan curve for now.


----------



## uplink

*Spieler4* - how did you get it stable at +176MHz core?

Here's mine: https://goo.gl/photos/iatXuM2RcekWQxo9A

I can go up to +150/+700 for benchmarks, but for gaming I'm topped out at +100/+500. Do I have a bad piece? My temps never go above 46°C in gaming and 48°C with +150/+700 in Heaven.

I need to have the 105% power limit; without it I can't even run +100/+500 :/

best regards

uplink


----------



## SAFX

Quote:


> Originally Posted by *Deders*
> 
> Is that benched at the highest preset?


To what "preset" are you referring?


----------



## SAFX

Playing around with Scanner in Precision X: I ran a simple test from 0 to +125MHz in 12.5MHz steps at 20s intervals. The test completed, I'm just not sure what to make of that gap?
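That sweep is just a fixed offset schedule; a plain-arithmetic sketch of it (this is not Precision X's API, only the schedule it walks through):

```python
# Offset schedule for a 0..+125 MHz sweep in 12.5 MHz steps,
# holding each step for 20 seconds, as in the Scanner run described above.

STEP_MHZ = 12.5
MAX_OFFSET_MHZ = 125
HOLD_SECONDS = 20

steps = [i * STEP_MHZ for i in range(int(MAX_OFFSET_MHZ / STEP_MHZ) + 1)]
total_time = len(steps) * HOLD_SECONDS

print(steps)        # 0.0 through 125.0 in 12.5 MHz increments (11 steps)
print(total_time)   # prints: 220
```

Eleven 20-second holds means the whole sweep is under four minutes, which is why a gap in the results usually means one step crashed or was skipped rather than the run timing out.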


----------



## Cornerer

Quote:


> Originally Posted by *SAFX*
> 
> Playing around with Scanner in Precision X, ran simple test from 0 to 125MHz, 12.5 steps, 20s intervals. Test completed, just not sure what to make of that gap?


From my recent research, a manual OC should give better results than anything you can do to the voltage curve. Not sure if this has been updated.


----------



## Deders

Quote:


> Originally Posted by *SAFX*
> 
> To what "preset" are you referring?


There are 3 presets, not including custom. Thinking about it now you were probably benching at a custom resolution.


----------



## SAFX

Quote:


> Originally Posted by *Cornerer*
> 
> From my recent research manual OC should give better results than no matter how you did to voltage curve. Not sure if this has been updated.


Is that the general consensus? I'm playing with it because it's kinda cool and interesting, but it's too early for me to comment on performance.


----------



## SAFX

Quote:


> Originally Posted by *Deders*
> 
> There are 3 presets, not including custom. Thinking about it now you were probably benching at a custom resolution.


Factory/stock; the only change I made was *Power management mode* in NCP.


----------



## Spieler4

Quote:


> Originally Posted by *uplink*
> 
> *Spieler4
> -
> *how did You get her stable on +176 MHz Core?
> 
> Here's mine:
> https://goo.gl/photos/iatXuM2RcekWQxo9A
> 
> I can go up to +150/+700 for benchmarks, but for gaming I'm topped on +100/+500. Do I have a bad piece? My temps never go above 46°C in gaming and 48°C with +150/+700 in heaven.
> 
> I need to have 105% power limit, w/o it I can't even go on +100/+500 :/
> 
> best regards
> 
> uplink


I use MSI Afterburner v4.3.08352 beta 4.

With the earlier Nvidia drivers I couldn't get past:
Power limit 94%
Core clock +169
Mem +496

I then tried another power cable and a different plug on the power supply. No difference. Then I moved the cable back. Now it works better 

X79 Deluxe mobo
i7-4820K at 4.5GHz
4x4GB Corsair Vengeance 1866 OC'd at 2133, 9-10-9-27-2


----------



## Deders

Quote:


> Originally Posted by *SAFX*
> 
> Factory/stock, the only change I made was *Performance management mode* in NCP.


I meant with valley.

Now trying to find curve scanner in PrecisionX. Not sure if it's available for other brands. How do you get into it?

Edit: found it, but the run button is greyed out.


----------



## nexxusty

Quote:


> Originally Posted by *tps3443*
> 
> I got my MSI GTX1080 Founders edition for $280 sealed in the box. I am pretty sure ive got everyone beat. LOL
> Some young black kids attained it through "financial aid" after getting a GRANT to go to school for graphics design. So $ 280 was steal


Almost as good as me....


----------



## Cornerer

Quote:


> Originally Posted by *SAFX*
> 
> Is that the general consensus? I'm playing with it cuz it's kinda cool and interesting, but to early for me to comment on performance.







He's using Classy for this vid. Latter part most important.


----------



## SAFX

Quote:


> Originally Posted by *Deders*
> 
> Edit: found it, but the run button is greyed out.


In Scanner, switch from *Basic* to *Manual* mode, then make some edits in the curve, run button should activate.


----------



## SAFX

Quote:


> Originally Posted by *Cornerer*
> 
> 
> 
> 
> 
> 
> He's using Classy for this vid. Latter part most important.


Thanks, watched that last night


----------



## uplink

Quote:


> Originally Posted by *Spieler4*
> 
> I use MSI Afterburner v4.3.08352 beta 4.
> 
> With the earlier Nvidia drivers I couldn't get past:
> Power limit 94%
> Core clock +169
> Mem +496
> 
> I then tried another power cable and a different plug on the power supply. No difference. Then I moved the cable back. Now it works better.
> 
> X79 Deluxe mobo
> i7-4820K at 4.5GHz
> 4x4GB Corsair Vengeance 1866, OC'd to 2133 at 9-10-9-27-2T


I see. I use the newest driver and the same version of MSI AB as you, but I never thought about the power cable. I'll try fiddling with it. Thank you!


----------



## SAFX

If I'm using manual OC settings in PrecisionX, why are clock speeds +/- 150MHz when benching?

Having a problem. I first installed the FTW, then picked up the Classy, removed the FTW and installed that, and now I've removed the Classy and I'm back to the FTW, installing/uninstalling drivers and PrecisionX OC throughout. But now my FTW, on stock settings, appears to be picking up the OC settings I used for benching the Classy and the first FTW. Clock speeds shooting up to 2025MHz, 70C in Valley, ***. I used Revo Uninstaller to remove PrecisionX and deleted all configs/profiles under C:\Program Files (x86)\EVGA\PrecisionX, same problem.

Any ideas?

Never mind, I was staring at the screen for too long and forgot the basics.


----------



## TK421

Anyone know why this is happening?


----------



## SAFX

Quote:


> Originally Posted by *TK421*
> 
> Anyone know why this is happening?


Why _what_ is happening? Some context would be nice.


----------



## TK421

Quote:


> Originally Posted by *SAFX*
> 
> Why _what_ is happening? Some context would be nice.


GPU usage goes up and down for some reason


----------



## Bishop07764

Quote:


> Originally Posted by *TK421*
> 
> Anyone know why this is happening?


If you have been messing around a lot pushing it to the limit, you might try resetting to see if it goes back to normal. That was necessary for me a time or two after a lot of crashes in Firestrike. People have been complaining about general crap and weirdness with the latest drivers 372.54, have you tried older drivers?


----------



## Benjiw

Quote:


> Originally Posted by *TK421*
> 
> GPU usage goes up and down for some reason


PerfCaps are usually the reason; that's what caused the same thing on my 970, until I flashed it and got rid of the PerfCaps and boost.


----------



## Benjiw

Quote:


> Originally Posted by *toncij*
> 
> Why would it be? A better heatsink and better binned chips should give some benefits.


Because these chips like to stay cool, just like Maxwell before them, and air coolers are rubbish at keeping them cool enough. Binned or not doesn't matter if the chips aren't staying cool, because heat holds the clocks back.


----------



## TK421

Quote:


> Originally Posted by *Bishop07764*
> 
> If you have been messing around a lot pushing it to the limit, you might try resetting to see if it goes back to normal. That was necessary for me a time or two after a lot of crashes in Firestrike. People have been complaining about general crap and weirdness with the latest drivers 372.54, have you tried older drivers?


Stupid mistake: it was the Malwarebytes notification causing the usage drops.

Once I disable the notification, GPU usage stays at 99-100%.


----------



## Bishop07764

Quote:


> Originally Posted by *TK421*
> 
> Stupid mistake: it was the Malwarebytes notification causing the usage drops.
> 
> Once I disable the notification, GPU usage stays at 99-100%.


Glad that it was something simple. Forgot that I had vsync forced on my monitor before and was wondering what was going on with my usage.


----------



## tps3443

The GTX 980 4GB had nearly a two-year life cycle. Are we expecting the same with our GTX 1080 8GB cards?

That is a great life cycle, I must admit.


----------



## TK421

Quote:


> Originally Posted by *tps3443*
> 
> The GTX 980 4GB had nearly a two-year life cycle. Are we expecting the same with our GTX 1080 8GB cards?
> 
> That is a great life cycle, I must admit.


I hope so...


----------



## Cornerer

Quote:


> Originally Posted by *tps3443*
> 
> The GTX 980 4GB had nearly a two-year life cycle. Are we expecting the same with our GTX 1080 8GB cards?


http://videocardz.com/63413/nvidia-nvlink-2-0-to-arrive-with-volta-in-2017


----------



## SAFX

Quote:


> Originally Posted by *Bishop07764*
> you might try resetting to see if it goes back to normal.


How did you _reset_ your card?

Quote:


> Originally Posted by *Benjiw*
> 
> Perf caps is usually the reason, that's what caused the same thing on my 970 until I flashed it and got rid of the perfcaps and boost.


WTH! Having similar problems: ran a single Heaven test and the FTW is throttling on stock, temps way too high.
Reinstalled drivers and PrecisionX twice, same issue.

*Nvidia Control Panel*
===========================
*Power management:* _Prefer max performance_
*VSync:* _Use 3D application settings_

*PrecisionX OC*
*Basic curve*
*Power/temp/clock/mem offset:* ALL STOCK


----------



## Denilson

Hello

OK, it's time to pull the trigger... what to "bang":

ASUS ROG Strix GTX 1080 or EVGA GeForce GTX 1080 FTW?

Which of these two monsters overclocks better?


----------



## TK421

Quote:


> Originally Posted by *Denilson*
> 
> Hello
> 
> OK, it's time to pull the trigger... what to "bang":
> 
> ASUS ROG Strix GTX 1080 or EVGA GeForce GTX 1080 FTW?
> 
> Which of these two monsters overclocks better?


The FTW probably has a better cooler, since the heat is spread across more heatpipes than on the Strix.


----------



## MrDerrikk

Hey guys, quick question: I'm trying to overclock my FTW again at 100% fan speed just to find the max, but it seems that if I go over +34 on the core clock it refuses to bump the speed up from 2025, yet it stays perfectly stable at even +85.

I managed to get speeds to almost 2100 last time I tried so I think there's something odd going on, any ideas? Using Afterburner for overclocking and GPU-Z for speeds.

Edit: So after playing around more it seems 2075.5 is the highest I can get on air in my case, any raising of the core clocks from there just increases heat and causes it to clock down. Mine automatically downclocks from 2075.5 to 2062.5 at 50C, and again to 2050 at 55C. That's with a very aggressive fan curve too, so I'm thinking that's the best I can do until I get a hybrid kit for it.
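The stepped downclocking described above (one boost bin lost at each temperature threshold) can be sketched roughly like this. The thresholds and step sizes are just the ones observed on this particular card, not official GPU Boost 3.0 numbers:

```python
# Rough model of temperature-stepped downclocking as reported above.
# Thresholds/steps are observations from one card, not a spec.

def expected_boost(temp_c, base=2075.5, steps=((50, 13.0), (55, 12.5))):
    """Return the expected core clock (MHz) at a given temperature,
    subtracting one boost bin at each threshold crossed."""
    clock = base
    for threshold, drop in steps:
        if temp_c >= threshold:
            clock -= drop
    return clock

print(expected_boost(45))  # 2075.5
print(expected_boost(52))  # 2062.5
print(expected_boost(57))  # 2050.0
```

An aggressive fan curve just delays when the thresholds are crossed; a hybrid kit keeps the card below the first one entirely.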


----------



## KickAssCop

Quote:


> Originally Posted by *Denilson*
> 
> Hello
> 
> ok it's time to pull the trigger....what to "bang"
> 
> ASUS ROG Strix GTX 1080 or EVGA GeForce GTX 1080 FTW
> 
> which this two monster overclock better......


Got the ROG Strix. I read too many complaints about coil whine and poor cooler performance on FTW cards.
My cards should arrive today or tomorrow.


----------



## toncij

Quote:


> Originally Posted by *KickAssCop*
> 
> Got the ROG Strix. I read too many complaints about coil whine and poor cooler performance on FTW cards.
> My cards should arrive today or tomorrow.


Hmm, I hear no coil whine from either of my two, and the cooler is pretty great.


----------



## shadow85

Will an i7-5930K @ 4.0GHz bottleneck ASUS STRIX GTX 1080 SLI?


----------



## Bravoexo

Just finished benching my two Asus ROG Strix GTX-1080-A8G-Gaming cards.

Still running an i7-3770K @ 4.5GHz and 16GB of RAM too. Had to up the vcore of my CPU from 1.285V to 1.3V just to keep 4.5GHz stable when rendering in Premiere.


----------



## KickAssCop

Quote:


> Originally Posted by *toncij*
> 
> Hmm, I hear no coil whine from either of my two, and the cooler is pretty great.


Of course there will be people with good cards as well. Can't win this kind of argument on the internet.


----------



## KickAssCop

Quote:


> Originally Posted by *Bravoexo*
> 
> Just finished benching my two Asus ROG Strix GTX-1080-A8G-Gaming cards.
> 
> Still running an i7-3770K @ 4.5GHz and 16GB of RAM too. Had to up the vcore of my CPU from 1.285V to 1.3V just to keep 4.5GHz stable when rendering in Premiere.


How far do they clock? Hopefully my cards are here today or tomorrow.


----------



## toncij

Quote:


> Originally Posted by *KickAssCop*
> 
> Of course there will be people with good cards as well. Can't win this kind of argument on the internet.


Well, both of my FTWs get to 2139 stable at 100% fan, 70°C. They throttle to 2114MHz at 80% fan. I find that... fine.


----------



## KickAssCop

Getting two cards stable at 2100+ overclocks on air cooling is a feat in itself.
Will post what I get when my cards finally show up in a day or two.


----------



## Bravoexo

Quote:


> Originally Posted by *KickAssCop*
> 
> How far do they clock? Hopefully my cards are here today or tomorrow.


No overvolting, so not much. +107 on the core was all Firestrike would allow; I could easily do +125 in Time Spy though, so that's about +178MHz over base/Founders, I guess. But the long-run boost leveled off around 1946 after running a while; otherwise it'll briefly boost to 2075 max, I think...


----------



## grimboso

I noticed that I am able to get a somewhat higher overclock with vsync/gsync ON vs OFF.

Now obviously my Firestrike scores will be lower due to the 100 fps cap (I have an Acer X34A), but in demanding games where I won't hit a stable 100 fps with everything maxed (Witcher 3, The Division, Fallout 4 with mods, etc.) I am able to increase my average FPS by 3-7 and my min FPS by 4-7.

With vsync off I can only run 2126 on the core and +478 on memory and still pass Firestrike and the stability test.
With vsync on I can run 2151 (two steps up) and +512 on memory and pass everything above.

Anyone else experienced this, or is it perfectly normal?


----------



## GreedyMuffin

That is completely normal.

You are not stressing the card as much once the cap is reached. Lower usage means less stress on the GPU, which means a higher stable OC.


----------



## Benjiw

There is a lot to learn in this thread, it seems. Sadly we won't get the max overclocks out of these cards until BIOS modding is available. You need to keep these cards cool; it was said a billion times with the 9xx series cards and the same holds true here: the cooler the card, the better your overclock with boost enabled.

You also need to keep in mind that just pumping up the sliders in your OC program isn't going to net you that overclock. Valley etc. read the wrong speeds; you should be monitoring speeds with GPU-Z or similar, which will also tell you which PerfCap you're hitting.

Example: on my first 970 I pumped the sliders up and "hit" 1500MHz for the first time, but it wasn't staying there because of PerfCaps; even though Valley read 1500MHz, it never actually held it.

The guys running these cards on water will be removing the thermal barrier, but until we can truly unlock voltage and power we've yet to see their full potential.
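To put a number on "is it actually holding the clock", you can log core clocks (GPU-Z's sensor log, or `nvidia-smi --query-gpu=clocks.gr --format=csv -l 1`) and compute how often the card stayed near the clock you dialed in. A minimal sketch; the sample readings below are made up:

```python
# Scan a list of logged core clocks and report how often the card
# actually held the target clock, within a small tolerance (one boost
# bin is roughly 13 MHz, so 15 MHz is a reasonable default).

def hold_rate(samples_mhz, target_mhz, tolerance=15):
    """Fraction of samples within `tolerance` MHz of the target clock."""
    held = sum(1 for c in samples_mhz if c >= target_mhz - tolerance)
    return held / len(samples_mhz)

log = [2088, 2088, 2075, 2062, 2088, 2050, 2088]  # example readings
print(f"held 2088 MHz {hold_rate(log, 2088):.0%} of the time")
```

If the rate is low, the PerfCap column in the same log tells you whether voltage, power, or temperature is the limiter.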


----------



## grimboso

Quote:


> Originally Posted by *GreedyMuffin*
> 
> That is completely normal.
> 
> You are not stressing the card as much since the cap is reached. The lower usage - the less stress on the GPU = higher OC.


But I am still stressing the card more in The Witcher 3 with the 2151 OC than with the 2126 OC, yet it remains stable. I am not at the 100 fps cap yet, so every extra frame means the card works a bit harder, no?

I get why the stress test is stable with vsync on, but I don't get why the game is stable with vsync on.

If I run the 2151 clock with vsync off, The Witcher 3 causes a driver crash; if I run it with vsync on, the game runs just fine.


----------



## Benjiw

Quote:


> Originally Posted by *grimboso*
> 
> But I am still stressing the card more in The Witcher 3 with the 2151 OC than I do with the 2126 OC, but it remains stable, as I am not at 100 fps cap yet, so every extra frame means the card works a bit harder. No?
> 
> I get why the stress-test is stable with vsync on, but I don't get why the game is stable with vsync on.
> 
> If I run the 2151 clock with vsync off, the witcher 3 will cause driver crash. if I run the 2151 clock with vsync on, the game runs just fine.


Probably because you're hitting performance caps (PerfCaps) if you run GPUZ while running a game or bench it will show you what is going on with your card.


----------



## steeludder

I suppose there is no news/updates on custom bioses (or bios editors) for the 1080 FE's yet?

I performed der8auer's power mod using Thermal Grizzly Conductonaut... That works a treat but basically sent me straight into Volt-throttling. Considering I have a decent watercooling solution, I would think a few more mV could most likely help.


----------



## Fediuld

Quote:


> Originally Posted by *ucode*
> 
> I get a drop at 43C. How much it drops depends on the curve.
> 
> 
> 
> 
> 
> 
> 
> Multiple drops over a temperature range.


It doesn't help the rest of us unless you post the curve as well.

Please do so.


----------



## grimboso

Quote:


> Originally Posted by *Benjiw*
> 
> Probably because you're hitting performance caps (PerfCaps) if you run GPUZ while running a game or bench it will show you what is going on with your card.


I'll do some runs and log them to see which PerfCap I hit. There is probably something different with vsync on vs. off.

It might be because I was dropped as a kid, but I can't wrap my head around how the card is put under more load when enabling gsync/vsync while I do not hit the 100 fps cap. I stay at around 80-85 fps. If anything the card should be under more load using gsync, because the gsync module and the GPU have to talk to each other.

I can understand that enabling vsync and locking the fps to, say, 100 would lessen the load if the load would otherwise be in the 150-200 fps range. But enabling vsync (100) when the fps is lower should, in my uneducated opinion, not affect stability.


----------



## juniordnz

Guys, a quick one: the voltage we see on GPUz and the offset we can add to it applies only to the GPU, right? It doesn't affect VRAM at all?


----------



## TK421

Correct


----------



## Cornerer

My ThunderMaster program has a separate memory overvoltage button. Doesn't seem to affect my best memory OC though...


----------



## TK421

Quote:


> Originally Posted by *Cornerer*
> 
> My ThunderMaster program has a separate memory overvoltage button. Doesn't seem to affect my best memory OC though...


I don't believe it's anything more than a dud; Nvidia won't allow any voltage to go out of spec.


----------



## fat4l

Quote:


> Originally Posted by *grimboso*
> 
> But I am still stressing the card more in The Witcher 3 with the 2151 OC than I do with the 2126 OC, but it remains stable, as I am not at 100 fps cap yet, so every extra frame means the card works a bit harder. No?
> 
> I get why the stress-test is stable with vsync on, but I don't get why the game is stable with vsync on.
> 
> If I run the 2151 clock with vsync off, the witcher 3 will cause driver crash. if I run the 2151 clock with vsync on, the game runs just fine.


Try fiddling with the memory OC. Witcher 3 crashes with +550MHz but runs fine with +480, though.
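Given a known-good and a known-bad memory offset like the +480/+550 above, you can bisect to the limit in a handful of runs instead of guessing. A sketch, where `is_stable` stands in for a manual "set the offset, run the game, report pass/fail" step:

```python
# Bisect the highest stable memory offset between a known-good and a
# known-bad value. `is_stable` is a placeholder for a manual test run.

def find_max_offset(good, bad, is_stable, step=25):
    """Narrow the [good, bad) interval until it is within `step` MHz."""
    while bad - good > step:
        mid = (good + bad) // 2
        if is_stable(mid):
            good = mid
        else:
            bad = mid
    return good

# Pretend the real limit is +520:
print(find_max_offset(480, 550, lambda off: off <= 520))  # -> 515
```

Each run halves the interval, so even a wide 480-550 gap settles in two or three test passes.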


----------



## SAFX

For benching, vsync should be disabled entirely unless the card is hitting performance caps, right?
For regular use, what vsync setting is desirable? I'm using *Use 3D application settings*. I stopped using *Fast* sync because it drops frames exceeding the monitor's refresh rate; no sense in making your GPU work harder for nothing, unless I'm not fully understanding Fast Sync.


----------



## juniordnz

Is it only me, or is GTA V a real OC breaker? I'm getting some crazy red artifacts at 2126MHz.


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> Is it only me, or is GTA V a real OC breaker? I'm getting some crazy red artifacts at 2126MHz.


No problems here in GTA V, Fallout 4, Far Cry Primal, or any game I tested. Most testing done at 2088MHz (my daily driver) and 2160MHz just for fun, on the 372.54 driver.


----------



## Bishop07764

Quote:


> Originally Posted by *SAFX*
> 
> How did you _reset_ your card?


Clocks can sometimes get stuck, with the card not wanting to boost properly, after driver crashes while pushing for your max clocks. A simple PC restart will usually do the trick.


----------



## SAFX

Quote:


> Originally Posted by *GanGstaOne*
> 
> No problems here in GTA V, Fallout 4, Far Cry Primal, or any game I tested. Most testing done at 2088MHz (my daily driver) and 2160MHz just for fun, on the 372.54 driver.


Would love to know your GTA V graphics params, and vsync setting in Nvidia CP


----------



## GanGstaOne

Quote:


> Originally Posted by *SAFX*
> 
> Would love to know your GTA V graphics params, and vsync setting in Nvidia CP


The only change in Nvidia CP is maximum performance for every game, with vsync set to default. Everything else is at max in the in-game settings, plus vsync on in game.


----------



## SAFX

Quote:


> Originally Posted by *GanGstaOne*
> 
> The only change in Nvidia CP is maximum performance for every game, with vsync set to default. Everything else is at max in the in-game settings, plus vsync on in game.


Do you lock FPS at 60 in GTA? That's what I do; otherwise, bad screen tearing.


----------



## GanGstaOne

Quote:


> Originally Posted by *SAFX*
> 
> Do you lock FPS at 60 in GTA? That's what I do; otherwise, bad screen tearing.


Yep


----------



## ucode

Quote:


> Originally Posted by *Fediuld*
> 
> It doesn't help the rest of us unless you post the curve as well.
> 
> Please do so.


Already posted before









Here


----------



## KickAssCop




----------



## grimboso

Quote:


> Originally Posted by *fat4l*
> 
> try to fiddle with memory OC. witcher 3 crashes with +550MHz, runs fine with +480 tho


Both +512 and +478 mem are stable, but the +512 memory clock is only stable when I have vsync on, even though The Witcher only runs at 80-85 fps (100Hz monitor).
If I disable vsync I have to drop the memory down. This is not a real issue, since I will play with gsync/vsync on anyway, but I am curious what causes the benches to be stable with vsync on vs. off.

2151 core / +512 mem, vsync on = pass
Fallout 4 with UHD textures: 81 fps average. The Witcher 3, everything ultra: ~85 fps average. The Division, all ultra: 78 fps average.
2151 core / +512 mem, vsync off = fail / artifacts / driver crash
3DMark stress test fail, 3DMark Ultra stress test fail, 3DMark Extreme stress test fail.
Games run at about the same averages as above. I'm not able to play out scenes etc. for long enough to get a good measurement and account for variance in the gameplay, but the averages are close enough.

2126 core / +478 mem, vsync on = pass
Fallout 4 with UHD textures: 75 fps average. The Witcher 3, everything ultra: ~77 fps average. The Division, all ultra: 71 fps average.
2126 core / +478 mem, vsync off = pass
Fallout 4 with UHD textures: 75 fps average. The Witcher 3, everything ultra: ~77 fps average. The Division, all ultra: 71 fps average.


----------



## THERIDDLER

Which 1080 is the best card to water cool with a custom block? I might wait for the 1080 Ti, but am not 100% sure. Buying parts slowly over the next few months to a year for a build.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KickAssCop*


Did they ship them like that?


----------



## KickAssCop

Seems like it. Wasn't at home to see if this came in a box or direct like the above.


----------



## wardo3640

I would be mad they ruined my pretty box with that ugly sticker lol.


----------



## KickAssCop

Seems like Amazon went full ****** with my cards. Let's hope they are intact.


----------



## toncij

Quote:


> Originally Posted by *KickAssCop*
> 
> Seems like it. Wasn't at home to see if this came in a box or direct like the above.


Some products are, at the manufacturer's request, shipped in their original boxes. When ordering, Amazon offers a "ship in an Amazon box" tick box so that they (free of charge) put the product in a standard Amazon box to protect the contents.

Careful next time.


----------



## KickAssCop

I ordered using my mobile so...


----------



## juniordnz

Guys, the screen and sound freeze for like 5 seconds, then the computer restarts without any BSOD. That looks more like a CPU crash than a GPU one, right? Or could it be the GPU?

I must have gotten the worst 4690K on the whole planet.


----------



## danjal

Quote:


> Originally Posted by *juniordnz*
> 
> Guys, screen and sound freeze for like 5 seconds, then the computer restarts without any BSOD. That looks more like a CPU crash than GPU, right? Or could it be GPU?
> 
> I must have gotten the worst 4690K in the whole planet


Might be the PSU.


----------



## KickAssCop

My SLI bridge is still on the way. Can I use ONE ribbon bridge and get some SLI loving till then?


----------



## juniordnz

Quote:


> Originally Posted by *danjal*
> 
> might be psu.


Really? I thought I had a decent PSU. I researched when I built this rig and found out Seasonic makes XFX PSUs; please correct me if I'm wrong. And I think 850W is more than enough for a 4690K + GTX 1080.


----------



## GreedyMuffin

It's probably your CPU. A complete freeze can be cache-related, from what I've experienced.

I can't think of why it would be the PSU. If it was doing that at stock, that would be another thing.


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> It's probably your CPU. Complete freeze can be cache from what I've experienced.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't think of why it could be the PSU. If it was doing that on stock It's one thing.


My CPU is the worst... I can't get it stable with [email protected] and the stock 3.5GHz cache. That's the setting I was using when I got the crashes. Tried ramping the input voltage from 1.9 to 2.0V but still got crashes. Went back to [email protected] and everything seems fine.

Can't wait to get a Cannonlake and retire this lazy CPU... guess I'm stuck with it for another 1.5 years...


----------



## Tdbeisn554

Quote:


> Originally Posted by *KickAssCop*
> 
> My SLi bridge is on the way still. Can I use ONE ribbon bridge and have some SLi loving till then?


Yeah, you are fine if you play at 1080p or 1440p 60Hz.








http://images.nvidia.com/geforce-com/international/images/nvidia-geforce-gtx-1080/nvidia-geforce-gtx-1080-recommended-sli-bridge-configuration.png


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> My CPU is the worst...I can't get it stable with [email protected] and stock cache 3.5ghz. That's the setting I was using when I got the crashes. Tried ramping input from 1.9 to 2.0V but still got crashes. Went back to [email protected] and everything seems fine.
> 
> Can't wait to get a cannonlake and retire this lazy cpu...guess I'm stuck with it for another 1,5 years...


I have two Asus X99 Pro USB 3.1 mobos and my 5930K overclocks very well, but on one of the mobos I can't get past 3.3GHz cache. So your CPU can be fine and the problem can be in your mobo.


----------



## BrainSplatter

Quote:


> Originally Posted by *juniordnz*
> 
> My CPU is the worst...I can't get it stable with [email protected] and stock cache 3.5ghz. That's the setting I was using when I got the crashes. Tried ramping input from 1.9 to 2.0V but still got crashes. Went back to [email protected] and everything seems fine.


Try 1.35v instead of input 2.0. And check your cache voltage. My 4770K requires 1.38v for 4.5Ghz, btw.


----------



## Bishop07764

Quote:


> Originally Posted by *juniordnz*
> 
> Guys, screen and sound freeze for like 5 seconds, then the computer restarts without any BSOD. That looks more like a CPU crash than GPU, right? Or could it be GPU?
> 
> I must have gotten the worst 4690K in the whole planet


Do you have any WHEA errors in your event viewer? If it's ID 20 errors, you would likely need more volts on your cpu.


----------



## juniordnz

Quote:


> Originally Posted by *GanGstaOne*
> 
> I have two Asus X99 Pro USB 3.1 mobos and my 5930K overclocks very well, but on one of the mobos I can't get past 3.3GHz cache. So your CPU can be fine and the problem can be in your mobo.


The stock cache clock for the 4690K is 3.5GHz, isn't it? I'm almost sure about it... Are you saying I should try setting the cache below stock? I could really use some more power from my CPU, since it is bottlenecking my GTX 1080 ([email protected]).
Quote:


> Originally Posted by *BrainSplatter*
> 
> Try 1.35v instead of input 2.0. And check your cache voltage. My 4770K requires 1.38v for 4.5Ghz, btw.


Cache should be left stock, right? I know my stock cache is 3.5GHz, but I'm not sure about the voltage; it was set to auto on the mobo. I really didn't want to go that high on vcore, but since I have a year or so left with this CPU, why not? Gonna hook it up to an H100i V2 soon...
Quote:


> Originally Posted by *Bishop07764*
> 
> Do you have any WHEA errors in your event viewer? If it's ID 20 errors, you would likely need more volts on your cpu.


When I got the BSOD, yes, it was voltage. But the screen freezes followed by sudden restarts gave me no BSODs or error reports in Windows. Strange...

Sorry about going off track with the thread; I'm just trying to figure out if my crashes in GTA V are CPU or GPU related... guess I'm not TOO off track.


----------



## alton brown

Hi guys, I need some up-to-date advice! Next week I'm going to buy the EVGA GTX 1080 FTW and also a 1440p 144Hz monitor. How well does the 1080 run 1440p? Will I be able to max out games like BF1 or Elite Dangerous? Any input would be appreciated! Thanks!

I'll throw this in there: should I purchase a 1080p 144Hz monitor instead?


----------



## juniordnz

Quote:


> Originally Posted by *alton brown*
> 
> Hi guys, I need some up to date advice! Next week I'm going to buy the GTX 1080 Evga FTW and also a 1440p 144hz monitor. How well does the 1080 run the 1440? Will I be able to max games out like BF1 or Elite Dangerous? Any input would be appreciated! Thanks!
> 
> I'll throw this in there, Should I purchase a 1080p 144hz monitors instead?


I have a [email protected] screen and I'll tell you what I've seen:

You need a powerful CPU to run high FPS on a 1080p screen. My OC'd 4690K is bottlenecking my GTX 1080 (I usually see 70% GPU and 100% CPU utilization).

If you have a powerful CPU you'll probably get 100+ in most games (even considering heavy ones like ROTTR).

At 1440p you can expect sub-100 FPS, and you won't have as much CPU bottleneck as you would at 1080p.

My advice? Go futureproof and get yourself a nice 1440p screen. Don't make the same mistake I did buying a 1080p screen in 2016.


----------



## Whitechap3l

Quote:


> Originally Posted by *juniordnz*
> 
> Really? I thought I had a decent PSU. I researched when I built this rig and found out Seasonic makes XFX PSUs, please someone correct me If I'm wrong. And I think 850W is more than enough for 4690K + GTX1080


850W is overkill in this case.
I have a wattage meter (or whatever that thing is called) at home, and my whole system peaks at 500W max!

i7 @ 1.35v
GTX 1080 @ 1.2v
RAM @ 1.65v
9 fans
custom loop with pump etc.

I just don't get why people buy PSUs with overkill wattage...
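A back-of-the-envelope headroom check along the lines of the 500 W reading above can be sketched like this. All the per-component figures are rough assumptions for illustration, not measurements:

```python
# Rough PSU headroom estimate. Component draws are ballpark assumptions
# (overclocked quad core, GTX 1080 near its stock board power, etc.),
# not measured values.

parts_w = {
    "cpu_oc": 150,          # assumed overclocked quad-core draw
    "gtx1080": 180,         # ~180 W board power at stock
    "board_ram_drives": 70, # motherboard, RAM, SSD/HDD, assumed
    "fans_pump": 50,        # fans plus custom-loop pump, assumed
}
peak = sum(parts_w.values())
psu = 850
print(f"estimated peak: {peak} W, PSU load: {peak / psu:.0%}")
```

Even with generous assumptions the estimate lands near half of an 850 W unit, which is roughly where many PSUs are most efficient anyway, so the headroom is not wasted so much as unneeded.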


----------



## juniordnz

Quote:


> Originally Posted by *Whitechap3l*
> 
> 850W is Overkill....
> I got a Wattagemeter or what that Thing is called at home and my whole System peeks at 500W max!
> 
> i7 @ 1.35ghz
> Gtx 1080 @1.2v
> ram @ 1.65v
> 9 fans
> costum Loop with pump etc.
> 
> I just dont get it why people getting psu with Overkill wattage...


I was going to do a 780ti sli when I built this rig


----------



## Whitechap3l

Quote:


> Originally Posted by *juniordnz*
> 
> I was going to do a 780ti sli when I built this rig


It wasn't addressed to you, just a general thought of mine.

And I mean it's nearly equal whether you get a 750W Platinum or an 850W Bronze certified PSU.


----------



## GanGstaOne

Quote:


> Originally Posted by *alton brown*
> 
> Hi guys, I need some up to date advice! Next week I'm going to buy the GTX 1080 Evga FTW and also a 1440p 144hz monitor. How well does the 1080 run the 1440? Will I be able to max games out like BF1 or Elite Dangerous? Any input would be appreciated! Thanks!
> 
> I'll throw this in there, Should I purchase a 1080p 144hz monitors instead?


1440p will be just fine; very little difference in fps, you won't even feel it, and you can still play at 1080p.


----------



## alton brown

Quote:


> Originally Posted by *juniordnz*
> 
> I have a [email protected] screen and I'll tell what I've seen:
> 
> You need a powerfull CPU to run high FPS with the 1080P screen. My OCed 4690K is bottlenecking my GTX1080 (I usually get 70% GPU and 100% CPU utilization)
> 
> If you have a powerful CPU you'll probably get 100+ in most games (considering heavy ones like ROTTR).
> 
> 1440P you can expect sub100 FPS and you won't have as much CPU bottleneck as you would with 1080P.
> 
> My advice? Go futureproof and get youserlf a nice 1440P screen. Don't make the same mistake I did buying a 1080P screen ins 2016.


Thanks for the input. Greatly appreciated! I'm thinking down the road I could purchase a used 1080 and run sli.


----------



## Whitechap3l

Quote:


> Originally Posted by *alton brown*
> 
> Hi guys, I need some up to date advice! Next week I'm going to buy the GTX 1080 Evga FTW and also a 1440p 144hz monitor. How well does the 1080 run the 1440? Will I be able to max games out like BF1 or Elite Dangerous? Any input would be appreciated! Thanks!
> 
> I'll throw this in there, Should I purchase a 1080p 144hz monitors instead?


Get a 1440p, 1000% sure!
Why would you buy one of the newest cards out there and then go a step back to 1080p?

It is a German page, but you should be able to see the benchmarks:

http://www.gamestar.de/hardware/grafikkarten/asus-geforce-gtx-1080-rog-strix/test/asus_geforce_gtx_1080_rog_strix,1000,3274661,2.html#spiele-benchmarks

I guess BF1 will not require better hardware than BF4/Battlefront.

For the near future you should be good to go with 1440p and a 1080.


----------



## danjal

Quote:


> Originally Posted by *alton brown*
> 
> Hi guys, I need some up to date advice! Next week I'm going to buy the GTX 1080 Evga FTW and also a 1440p 144hz monitor. How well does the 1080 run the 1440? Will I be able to max games out like BF1 or Elite Dangerous? Any input would be appreciated! Thanks!
> 
> I'll throw this in there, Should I purchase a 1080p 144hz monitors instead?


I run a 32" 2560x1440 75Hz monitor, [email protected] and a Zotac 1080 AMP edition... In BF1 I get just over 100fps (around 105) in DX11 or DX12 at stock clocks; the GPU never goes over 70C.


----------



## GreedyMuffin

I need 1.280V for 4600MHz and I find that bad. Lol, I think I should shut up.

My 4670K, which I bought unused in box a week ago, OCs like shiat. Needs 1.1V for 4000MHz...


----------



## BrainSplatter

Quote:


> Originally Posted by *juniordnz*
> 
> Cache should be left stock, right? I know my stock cache is 3.5ghz, but I'm not sure about the voltage, it was set to auto on the mobo.


Even if not overclocked itself, the cache should get some voltage raise. Might be a weak part. Try 1.15v or 1.2v.

Sorry for the off topic. We can take the discussion to this thread http://www.overclock.net/t/1490324/the-intel-devils-canyon-owners-club if you want to try more stuff.


----------



## alton brown

Thanks guys for the advice. I'm going for the 1440p monitor!


----------



## Bishop07764

Quote:


> Originally Posted by *juniordnz*
> 
> Sorry about going off the track with the thread, I'm just trying to figure out if my crashes on GTA V are CPU or GPU related...guess I'm not TOO off track


I've been playing GTA V all morning at about 1 tick down the res scale from 4K. I haven't gotten a single artifact or crash, clocked at 2126 core the whole time. Incidentally, I noticed that the voltage set itself to 1.064V a lot of the time; I haven't really seen many other games do that. Temps 38C. I haven't tried the newest Nvidia drivers. I'm still sitting on 369.09, as a lot of people seemed to be reporting problems with the latest.


----------



## KickAssCop

Is it just me, or does Boost 3.0 work in mysterious ways and can't decide what clocks to run the card at?
I pushed about +185 / +500 on a single AG card and am scoring about 5717 in Firestrike Ultra (5698 graphics).
I see boost clocks ranging from 2000-2088 during the test, with no clear single clock where it stabilizes.

Need to test more. Haven't opened my second card yet.

http://www.3dmark.com/3dm/14508627?


----------



## Benjiw

Quote:


> Originally Posted by *BrainSplatter*
> 
> Even if not overclocked itself, cache should get some voltage raise. Might be a weak part. Try 1.15v or 1.2v.
> 
> Sorry for off topic. We can take discussion to this thread http://www.overclock.net/t/1490324/the-intel-devils-canyon-owners-club if u want to try more stuff.


You should always push the cache when you overclock the core, because having it too low can cause instability, from experience with my Haswell chip.


----------



## Benjiw

Quote:


> Originally Posted by *KickAssCop*
> 
> Is it me or does boost 3.0 works in mysterious ways that can't decide what clocks to run the card at.
> I pushed about +185 / +500 on a single AG card and am scoring about 5717 in Firestrike Ultra (5698 graphics).
> I see boost clocks ranging from 2000 - 2088 during the test with no clear single clock where it stabilizes.
> 
> Need to test more. Haven't opened my second card yet.
> 
> http://www.3dmark.com/3dm/14508627?


It depends on perfcaps and temperature. It's not really a mystery, it's just a pain. That's why, when people mod the BIOS on 9xx cards, they often remove boost and just have it pinned to the max clocks it can reach under load.


----------



## GreedyMuffin

I can get my cache up to 4000 at 1.075V. My core (currently stresstesting) is at 4600 at 1.282V.


----------



## Benjiw

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can get my cache up to 4000 at 1.075V. My core (currently stresstesting) is at 4600 at 1.282V.


Not familiar with your CPU etc., so I couldn't say for certain, but 1.1V seems a bit low for 4GHz cache. Mine is at 1.4V while my CPU sits at 1.5V for 4.7GHz, respectively.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Benjiw*
> 
> Not familar with your CPU etc so couldn't say for certain but 1.1v seems a bit low for 4ghz cache considering. mine is at 1.4v while my cpu sits at 1.5v for 4.7ghz respectively for both.


My cache is stock 3000MHz, at 0.880V or so I believe.


----------



## IronAge

New WHQL driver gets me higher clocks, FSE stable:

http://www.nvidia.com/download/driverResults.aspx/107109/

372.54: GPU @ 2139 with 1.062V; 372.70: GPU @ 2152 with 1.062V.

Zotac AMP! with the Zotac AMP! Extreme BIOS.


----------



## juniordnz

Quote:


> Originally Posted by *IronAge*
> 
> New WHQL Driver gets me higher clocks FSE stable:
> 
> http://www.nvidia.com/download/driverResults.aspx/107109/
> 
> 372.54 GPU @ 2139 with 1.062V 372.70 @ 2152 with 1.0620V.
> 
> Zotac AMP! with Zotac AMP! Extreme Bios.


Are you on water?

Just tested 2138 with new driver but still a no go.


----------



## Benjiw

Quote:


> Originally Posted by *GreedyMuffin*
> 
> My Cache is stock 3000 mhz I believe at 0.880V or so.


I'd try more voltage then, and see if that helps improve stability.


----------



## IronAge

Quote:


> Originally Posted by *juniordnz*
> 
> Are you on water?
> 
> Just tested 2138 with new driver but still a no go.


Nope ... air. It won't do that for 20 loops for sure, since temps go up fast. What BIOS have you got on your FTW?

I have everything outside a case on a desk, GTX 1080 fans @ 100% and a 140x140x38mm fan @ 2000rpm blowing from behind the card.


----------



## juniordnz

Quote:


> Originally Posted by *IronAge*
> 
> Nope ... air .. wont do that for 20 loops for sure since temps go up fast. what Bios you have got on your FTW ?
> 
> I have anything outside a case on a Desk, GTX1080 fans @ 100 % and a 140x140x38mm fan @ 2000 rpm blowing from behind the card.


Stock FTW 130% TDP BIOS. 2126MHz seems to be all that this card can reach.

I was convinced to put it on water, but after a lot of testing I don't think watercooling will make it go any further. It'll just make it run cooler and lose a clock bin or two less to thermal throttling; I don't see it making this card reach higher clocks.


----------



## IronAge

I am pretty sure you would get more with a FC block and temps under 50 degrees Celsius. I bet at least 2-3 steps.

There is less leakage and power consumption when Pascal stays cool.

Better temps are even more effective than a custom PCB and beefed-up power delivery.


----------



## juniordnz

Quote:


> Originally Posted by *IronAge*
> 
> I am pretty sure you would get more with a FC Blocks and Temps under 50 Degree Celcius. i bet at least 2 -3 steps.
> 
> there is less leakage and power consumption when PASCAL stays cool.
> 
> Better temps are even more effective than a custom pcb and a beefed up power delivery.


Yeah, I'm a stubborn fella and will put it under water. Just deciding whether to go overkill with the H100i or uber-overkill with the H115i. I'm hoping to keep mid-40s after all that. Also, will apply CLU.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> I have a [email protected] screen and I'll tell what I've seen:
> 
> You need a powerfull CPU to run high FPS with the 1080P screen. My OCed 4690K is bottlenecking my GTX1080 (I usually get 70% GPU and 100% CPU utilization)
> 
> If you have a powerful CPU you'll probably get 100+ in most games (considering heavy ones like ROTTR).
> 
> 1440P you can expect sub100 FPS and you won't have as much CPU bottleneck as you would with 1080P.
> 
> My advice? Go futureproof and get youserlf a nice 1440P screen. Don't make the same mistake I did buying a 1080P screen ins 2016.


I cannot agree more, 1440p is the sweet spot. I have no interest in 4K gaming; it would basically diminish all of the gains I got going from a 980 Ti Classified to the 1080 FTW. I am currently rocking a 27-inch Qnix panel and I love it, but I noticed all the new panels coming out are 4K and I want nothing to do with that. So I ordered an Asus PG279Q and it should be arriving tomorrow. I really hope I'll end up with one that's past the previous QC issues, but only time will tell.


----------



## Sazexa

So, I'm a bit of a noob when it comes to overclocking GPUs. I downloaded MSI Afterburner for my two FE 1080s. Anyone have some decent suggestions to gain some performance? They're in a water loop, if that matters.


----------



## juniordnz

Quote:


> Originally Posted by *Sazexa*
> 
> So, I'm a bit of a noob when it comes to overclocking GPU's. I downloaded MSI Afterburner for my two FE 1080's. Anyone have some decent suggestions to gain some performance? They're in a waterloop, if that matters.


Max out the TDP slider and test, then go up in +25 increments on the core and test. When you hit instability, go back one clock bin (13MHz) and move on to the memory overclock (there you can start at +500 and see if your card can handle it; if not, you should see artifacts very soon).

You'll probably get 2100/+500 minimum with those FEs.
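The stepwise procedure above can be sketched in Python. This is only an illustration of the search logic; `is_stable` is a hypothetical placeholder for "apply the offset, run your benchmark, and watch for crashes or artifacts", since nothing here talks to a real GPU:

```python
# Sketch of the manual OC search described above. Assumption:
# is_stable(offset) stands in for "apply the core offset in Afterburner,
# run your benchmark, and report whether it survived". No GPU API is used.
def find_max_core_offset(is_stable, step=25, clock_bin=13):
    """Raise the core offset in +25 MHz steps until instability,
    then back off one ~13 MHz Pascal boost bin."""
    offset = 0
    while is_stable(offset + step):
        offset += step
    # We crashed at offset + step; retreat one boost bin from there.
    candidate = offset + step - clock_bin
    return candidate if is_stable(candidate) else offset


# Toy example: pretend the card is stable up to a +170 offset.
best = find_max_core_offset(lambda o: o <= 170)
print(best)  # 162: one 13 MHz bin below the first failing +175 step
```

The back-off-one-bin step matters because Pascal's boost table moves in ~13MHz increments, so "one clock" of headroom is smaller than the +25 search step.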


----------



## jlp0209

I got unlucky. Received my 1080 FTW Hybrid yesterday and while temps are always under 45 degrees, my max stable OC is only 2050mhz. I crash at 2063. For gaming I'm not sure 2050 vs. 2100-ish makes a very big difference. But it would be nice to be able to OC a little higher.


----------



## Deders

Quote:


> Originally Posted by *jlp0209*
> 
> I got unlucky. Received my 1080 FTW Hybrid yesterday and while temps are always under 45 degrees, my max stable OC is only 2050mhz. I crash at 2063. For gaming I'm not sure 2050 vs. 2100-ish makes a very big difference. But it would be nice to be able to OC a little higher.


Theoretically, the extra 50MHz would give you about 1 extra fps for every 50fps.
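That estimate is just linear clock scaling. Assuming fps scales proportionally with core clock (a best-case assumption; real games rarely scale perfectly), the arithmetic is:

```python
def fps_gain(base_fps, base_clock_mhz, extra_mhz):
    """Best-case fps gained from extra core MHz, assuming linear scaling."""
    return base_fps * extra_mhz / base_clock_mhz

# 50 extra MHz on a ~2050 MHz card while rendering 50 fps:
print(round(fps_gain(50, 2050, 50), 2))  # 1.22, i.e. roughly 1 fps per 50
```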


----------



## Benjiw

Quote:


> Originally Posted by *juniordnz*
> 
> Yeah, I'm a stubborn fella and will put it under water. Just deciding if I'll go overkill with H100i or uber overkill with H115i. I'm hoping to keep mid 40s after all that. Also, will apply CLU.


I have a 420mm rad for my cpu and a 360 for my gpu. I'm still yet to hook this up to it...


Quote:


> Originally Posted by *jlp0209*
> 
> I got unlucky. Received my 1080 FTW Hybrid yesterday and while temps are always under 45 degrees, my max stable OC is only 2050mhz. I crash at 2063. For gaming I'm not sure 2050 vs. 2100-ish makes a very big difference. But it would be nice to be able to OC a little higher.


It's not just core temps you need to worry about; you also need to cool your memory chips as best you can when overclocking them. Ideally you should find your max memory overclock first, then do the core. You will see nice improvements from memory overclocking, and benchmarks like Valley love memory clocks; that's why 10x0 cards obliterate 9xx cards in Valley, because of the sheer clock speeds.


----------



## juniordnz

Quote:


> Originally Posted by *Benjiw*
> 
> I have a 420mm rad for my cpu and a 360 for my gpu. I'm still yet to hook this up to it...


Guud lawd! Brace yourselves, winter is coming









Just pulled the trigger on the H100i V2. The price difference for the H115i was like 30% more (the H115i delivers 2% more performance in some situations).

Hopefully with a nice push/pull I'll be able to keep my 1080 in the mid 40s. I'll adapt a Kraken G10 so some cool air is blown directly at my FTW's heatplate.


----------



## jlp0209

Quote:


> Originally Posted by *Benjiw*
> 
> I have a 420mm rad for my cpu and a 360 for my gpu. I'm still yet to hook this up to it...
> 
> 
> It's not just core temps you need to worry about, you also need to cool you memory chips as best you can when overclocking those. Ideally you should find your max memory overclock first then do the core. You will see nice improvements with memory overclocking and benchmarks like Valley love memory clocks, that's why 10x0 cards obliterate 9xx cards in valley because of the sheer clock speeds.


I will give the memory a try when I have the chance, thanks! P.S., that's some serious cooling in that photo, LOL.


----------



## Benjiw

Quote:


> Originally Posted by *jlp0209*
> 
> I will give the memory a try when I have the chance, thanks! P.S., that's some serious cooling in that photo, LOL.


My sister is giving me her old NZXT Phantom case, so I'll probably mod it to take as many big rads as possible and maybe mount the 1080 on the rear panel somehow so it's pretty much contained. I just can't be bothered with noise, so the bigger the rads, the lower the fan speeds under load, which means I can enjoy silence while overclocking as far as I can.


----------



## Papazmurf

Quote:


> Originally Posted by *jlp0209*
> 
> I got unlucky. Received my 1080 FTW Hybrid yesterday and while temps are always under 45 degrees, my max stable OC is only 2050mhz. I crash at 2063. For gaming I'm not sure 2050 vs. 2100-ish makes a very big difference. But it would be nice to be able to OC a little higher.


I got the Gigabyte 1080 Xtreme Water Cooling, and while my max stable OC is 2088MHz, I was definitely expecting more considering the premium I paid for an AIO water-cooled card. At 2100MHz it's a guaranteed crash. I guess no one ever advertised binning.

You definitely beat me on the temps though. With a custom fan curve I've been able to make sure the card rarely goes over 60C, but most people with my card, the MSI Sea Hawk, and your Hybrid seem to get temps that never go beyond 50C, some even 40C. Maybe my ambient is hotter than everyone else's, but I feel like I should be getting better temps with the AIO.

All things considered, the card is still super silent, and that's really what I wanted. The throttle MHz I lose from 50C to 60C is negligible in terms of what I see in performance. I definitely wish I could've got a card that went over 2100MHz though. Just something about that number.









----------



## derekyws

I recently swapped the stock coolers on my two Zotac 1080 FEs for EVGA's hybrid cooler. The temperature improves greatly, from 83C while gaming to 50C (room temperature around 25C). I've flashed both cards to the EVGA FTW BIOS posted earlier from TechPowerUp, but the maximum OC I can get is still 2088/5400, which is a +100/+400 OC in Afterburner, before crashing in Time Spy. I also have to tune the OC down to 2055/5200 while playing BF4, but not in even more GPU-intensive games like Witcher 3. I'm using Noctua 3000rpm fans on the radiators running at a 2000rpm base, but it doesn't reduce the temperature by much. Is what I'm getting normal, or should I be getting more out of this setup? Much appreciated.


----------



## Papazmurf

Quote:


> Originally Posted by *derekyws*
> 
> I recently changed the stock cooler from my two Zotac 1080FE to EVGA's hybrid cooler. The temperature does improve greatly from 83C while gaming to 50C (Room temperature around 25C). I've flashed both cards to the EVGA FTW bios posted earlier from TechPowerUp but the maximum OC i can get is still 2088/5400 which is a +100/+400 OC in afterburner before crashing in Time Spy. I also have to tune down the OC to 2055/5200 while playing BF4 but not even more GPU intensive game like Witcher 3. I'm using Noctua's 3000rpm fans on the radiators running at 2000rpm at base but it doesn't reduce the temperature by much. Is what I'm getting normal or should I gain more performance out of my setup? Much appreciated.


Lovely photos.

In terms of overclocking and temps, you seem to be getting similar results to what I have on my Gigabyte Xtreme 1080, which is also water cooled. I haven't tried The Witcher 3, but I do know I have to keep a separate overclocking profile for BF4 or else I'll get driver errors. My BF4 profile is typically -30 core on the Afterburner slider from my max stable overclock in all my other games, some more intensive than BF4. I had the same issue with my old AMD 6970s and actually had to underclock them to keep BF4 stable.

I'm clearly not an expert and I've never messed with updating GPU BIOSes, but I thought I'd let you know I was having similar experiences with a similar setup, especially regarding BF4.


----------



## KickAssCop

A few questions for the guys.

1) Can't seem to push beyond 5700 marks in Ultra. Dafuq! My CPU is clocked at about 4.3GHz now, but I don't understand how I am almost 1300 below the norm at my clocks (2050 is where the card stabilizes). However, 3DMark is the only application where the card keeps jumping around between 1948-2088; in Deus Ex: MD it is pegged at 2063 and drops to 2050 during the bench and gaming.

2) Can it be because I didn't do a clean driver install?

3) Can it be because I am only using Windows 7?

4) Can it be that the latest NVidia drivers 372.70 are crap?

5) Is there any way to run OC mode on the card without resorting to using GPU Tweak 2?

6) Where can I download GPU Tweak 2 from? The website that I am directed to doesn't let me download it (asus something).

Would love to hear your thoughts.


----------



## DStealth

A 5700 GPU score in Ultra is just fine with those clocks.
You're mistaken if "1300 less" means you think the norm is 7000 in Ultra; only a few cards with higher limits are passing the 6000s.








Here're my best runs @2139 core:


----------



## GanGstaOne

5700-5800 is normal for Ultra. I get the same scores with 2088MHz core, stock memory, and 4.2GHz CPU clocks.


----------



## Whitechap3l

Quote:


> Originally Posted by *KickAssCop*
> 
> A few questions for the guys.
> 
> 1) Can't seem to push beyond 5700 marks in ultra. Dafuq! My CPU is clocked at 4.3 Ghz about now but don't understand how I am almost 1300 less than the norm on my clocks (2050 is where the card stabilizes). However, 3dmark is the only application where the card keeps jumping around between 1948-2088 clocks. In Deus Ex MD it is pegged at 2063 and down to 2050 during the bench and gaming.
> 
> 2) Can it be because I didn't do a clean driver install?
> 
> 3) Can it be because I am only using Windows 7?
> 
> 4) Can it be that the latest NVidia drivers 372.70 are crap?
> 
> 5) Is there any way to run OC mode on the card without resorting to using GPU Tweak 2?
> 
> 6) Where can I download GPU Tweak 2 from? The website that I am directed to doesn't let me download it (asus something).
> 
> Would love to hear your thoughts.


1) Can't say; mine (watercooled) is clocking 2177 and I can go even further.
2) Yes, IMO. I didn't uninstall the driver correctly, installed the new one, and had trouble in games until I removed everything the correct way.
3) Sorry, not an expert at this level, but why should it?
4) No! At least for me the driver is pretty insane; good clocks with a performance increase.
5) Not the OC profile itself, but you can just adjust the curve in Afterburner.
6) https://www.asus.com/support/Download/9/13/0/3/41/ - should be working, but I can get better clocks/scores with Afterburner (I dunno why).


----------



## KickAssCop

Thanks for the replies. Seems my score and clocks aren't too bad. 2050 is all I can get with the AGs. Maybe I should have waited for the OGs to come back in stock, but my patience ran out.
This card seems like it won't go past the advertised clocks. +165 nets me exactly 1836 in AB (the OC mode of the card). I tried +175 and +185; they ran the benches but became unstable afterwards. The AG seems like an edition of the card made from chips that did not pass the OG clocks.

2050 for a single card seems to be the norm. Don't know how it will do once I slot in the second card. My SLI bridge arrives today, so I'll test tonight.


----------



## Whitechap3l

Quote:


> Originally Posted by *KickAssCop*
> 
> Thanks for the replies. Seems my score and clocks aren't too bad. 2050 is all I can get with the AGs. Maybe should have waited for the OGs to come in stock but patience ran out.
> This card seems like it won't go past the advertised clocks. +165 nets me exactly at 1836 in AB (the OC mode of the card). I tried 175 and 185 and it ran the benches but became unstable afterwards. AGs seem like an edition of the card created that did not pass the OG clocks.
> 
> 2050 for a single card seems to be the norm. Don't know how it will do once I slot the second card. My SLi bridge arrives today so will test tonight.


Did you try flashing the Strix OC BIOS on the card?

I saw so many reviews of the non-OC Strixes, and nearly everyone was able to get the same clocks as the OG version.


----------



## Reckit

Quote:


> Originally Posted by *Whitechap3l*
> 
> Did you tried to Flash the Strix OC Bios on the Card?
> 
> 
> 
> 
> 
> 
> 
> 
> I saw so many Reviews of the non OC strixes and nearly everyone was able to get same clocks as OG Version


Makes no difference. Flashing the OC BIOS just changes the default clocks. You can run the normal BIOS and still achieve those clocks if your card can handle it.

2050 boost on a non-OC Strix card is all I can get (stable), and I have had two; both were pretty similar.

In 3DMark:

power limit 120%
+170 core clock
+505 mem clock
fan 90%

boosts to about 2050/2063

Firestrike (high-performance): 24300 graphics score
Time Spy: 7845 graphics score


----------



## GreedyMuffin

Quote:


> Originally Posted by *Benjiw*
> 
> Would try more voltage then see if that helps improve stability.


Hehe, I don't have a stability issue. I was just posting for some reason, I don't remember.


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> Makes no difference. Flashing with OC Bios just changes the default clocks. You can run the norm Bios and still achieve those clocks if your card can handle it.
> 
> 2050 boost on a no oc strix card is all I can get (stable) and I have had two, both were pretty similar.
> 
> On a 3d Mark
> 
> powerlimit 120%
> +170 coreclock
> +505 mem clock
> fan 90%
> 
> boosts to about 2050/65
> 
> firestrike (high-performance 24300 on graphics score)
> Time Spy 7845 graphics score


Well, 2050/2065 isn't that bad, or am I wrong? (on air)
So did I get a good card, or is watercooling the key? I mean I get ~8250 (graphics) Time Spy scores... that's 5% more than what I've found elsewhere, so not bad actually.


----------



## Reckit

Quote:


> Originally Posted by *Whitechap3l*
> 
> Well 2050/65 isnt that bad or am I wrong? ( on air )
> So did I got a good Card or is watercooling the key? I mean I get 8250ish ( graphic ) TimeSpy scores... that is 5% more what I found not that bad actually


I reckon so. As mentioned time and time again, I think temp is the key; if you can keep it down, then I would expect better performance.

Gonna run my Time Spy bench again. I should be able to get more; looking for that 8000 score.


----------



## KickAssCop

I need to run Time Spy. Will do it tonight. I'll also install my second card, so I'll post more results.
Why would the OC BIOS make a difference when the card can't handle anything beyond the AG clocks? Also, the card runs at 1670 by default when not in OC mode, and I don't have GPU Tweak 2 installed.

Voltage or fan speed doesn't seem to do anything for my clocks. At +200 the card fails. At +185 it fails regular Firestrike. At +175 it runs Firestrike but fails in games. So +165 is all I found to be stable. On the memory side, I didn't try more than +500, but many people said they saw lower fps past +500 on their cards.

Oh well. Not the upgrade everyone made it out to be. Going from a 980 Ti Classified, I am not seeing the gains (except in DX:MD, where my average for the Ultra preset at 1440p went from 37.4 to 49.2 fps).


----------



## Whitechap3l

Quote:


> Originally Posted by *KickAssCop*
> 
> Why would OC bios make a difference as the card is unable to handle anything beyond the AG clocks? Also card runs at 1670 defaults when not .


Good question, but then why can I hit 2177 stable with the OC BIOS and not with my non-OC BIOS?

Makes no sense, to me at least...


----------



## BrainSplatter

Quote:


> Originally Posted by *KickAssCop*
> 
> Oh well. Not the upgrade everyone made it out to be. Going from a 980 Ti classified, I am not seeing the gains (except in DX MD where my AVG for Ultra pre-set at 1440P went from 37.4 to 49.2 fps.


Depending on the game and resolution you can expect about a 10-30% improvement (if not CPU limited, of course). The rule of thumb is: the higher the resolution and the more 'advanced' the game engine, the higher the gains going from an OC 980 Ti to an OC 1080. That's what you can see with Deus Ex.


----------



## Reckit

Quote:


> Originally Posted by *Whitechap3l*
> 
> Good question but why I than can hit 2177 stable with OC Bios and not with my non OC Bios ?
> 
> 
> 
> 
> 
> 
> 
> 
> Makes no sense for me at least...


Sorcery


----------



## toncij

On average, a 1080 OC is about 25% faster than a 980 Ti OC and about 15% faster than a Titan X OC. Depends on clocks, but I'm talking about [email protected], [email protected], [email protected]


----------



## Whitechap3l

Quote:


> Originally Posted by *Reckit*
> 
> Sorcery


Yeah probably I just got the magic touch


----------



## KickAssCop

Fine, I will flash my cards tonight. Can someone post links to the latest flash tool and BIOS?
Sorry, a bit too busy to search right now.


----------



## Reckit

Quote:


> Originally Posted by *KickAssCop*
> 
> Fine I will flash my cards tonight. Can someone post links to latest flash tool and bios.
> Sorry a bit busy to search right now.


https://www.techpowerup.com/vgabios/183797/asus-gtx1080-8192-160522 Here is a link to the OC BIOS.

https://www.techpowerup.com/downloads/2709/nvflash-5-292-0-for-windows nvflash link.

Below is a link to a guide on how to do it; it works for the 1080 too:
http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980


----------



## KickAssCop

Amazing. Thanks!


----------



## IronAge

Quote:


> Originally Posted by *toncij*
> 
> On average, 1080 OC is faster than 980Ti OC about 25% and than Titan X OC about 15%. Depends on clock, but I'm talking about [email protected], [email protected], [email protected]


I have compared a GTX 980 Ti @ 1550 and a GTX 1080 @ 2100, and it's about a 15% difference.


----------



## BrainSplatter

Quote:


> Originally Posted by *IronAge*
> 
> i have compared GTX980Ti @ 1550 and GTX1080 @ 2100 and its about 15% difference.


Game and resolution ? It depends a lot on game/game engine.


----------



## Salad Fingers

What's up fellow 1080 owners! Glad to be part of the group









I have managed up to 2189MHz benchmark stable, but with the voltage locked at 1.093V that number is definitely not game stable. Currently running 2113-2138MHz; some games will stay at 2113 while others bump to 2126. Best I can do with this BIOS.


----------



## IronAge

Quote:


> Originally Posted by *BrainSplatter*
> 
> Game and resolution ? It depends a lot on game/game engine.


BF4 2560x1440 ... and my i7 runs @ 4.6 GHz.


----------



## KickAssCop

Quote:


> Originally Posted by *IronAge*
> 
> i have compared GTX980Ti @ 1550 and GTX1080 @ 2100 and its about 15% difference.


Seeing the same.


----------



## moustang

Quote:


> Originally Posted by *IronAge*
> 
> BF4 2560x1440 ... and my i7 runs @ 4.6 GHz.


Not sure I would call BF4 a worthy benchmark of modern GPUs. You're not really benchmarking anything other than raw fill rate. This is a game even a 2GB GTX 770 could run at Ultra settings at more than 60fps.


----------



## BrainSplatter

Quote:


> Originally Posted by *moustang*
> 
> Not sure I would call BF4 a worthy benchmark of modern GPUs. You're not really benchmarking anything other than raw fill rate


Indeed, that's why there is only a 15% difference. More 'modern' engines will show a difference of up to 30%, especially at 1440p or 4K. Older engines might show no performance advantage at all.


----------



## toncij

Quote:


> Originally Posted by *IronAge*
> 
> i have compared GTX980Ti @ 1550 and GTX1080 @ 2100 and its about 15% difference.


Something was wrong with your test. Theoretical performance of a 980 Ti @ 1550 is 8,729,600 MFLOPS, a Titan X @ 1550 is 9,523,200, and a 1080 @ 2100 is 10,752,000. The 1080 is 23% faster than a 980 Ti and 13% faster than a Titan X (Maxwell). If you're getting only 15%, something is limiting your 1080's performance. Pure compute is 23% apart, and in my own tests I have seen that difference, which makes it impossible for a 980 Ti to be as fast as a Titan X (it theoretically can't be), and thus for the 1080 to be that slow in comparison.
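Those numbers come from the standard theoretical-throughput formula, FP32 FLOPS = shader count x 2 (one fused multiply-add per shader per clock) x core clock. A quick sanity check in Python, using the public shader counts (2816 for the 980 Ti, 3072 for the Titan X Maxwell, 2560 for the 1080):

```python
def tflops(shaders, clock_mhz):
    """Theoretical FP32 throughput: shaders x 2 ops (FMA) x clock, in TFLOPS."""
    return shaders * 2 * clock_mhz / 1e6

gtx_980ti = tflops(2816, 1550)   # ~8.73 TFLOPS
titan_x_m = tflops(3072, 1550)   # ~9.52 TFLOPS
gtx_1080  = tflops(2560, 2100)   # ~10.75 TFLOPS

print(round(gtx_1080 / gtx_980ti - 1, 3))  # 0.232 -> ~23% faster than the 980 Ti
print(round(gtx_1080 / titan_x_m - 1, 3))  # 0.129 -> ~13% faster than the Titan X
```

Real-game gains land below these pure-compute ratios whenever memory bandwidth or the CPU is the limiter, which is consistent with the ~15% figures reported in this thread.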


----------



## IronAge

I doubt that a 770 would run my settings in BF4 ... there is no bottleneck, CPU and GPU both running close to 99%.

I have also tested it with 3DMark, and the graphics scores show the same difference (GTX 980 Ti ~10500, GTX 1080 ~11900-12000).

I had also been hoping that DX12 would make a difference ... but BF1 performs worse with DX12 than with DX11...









Not sure if I am going to keep one of my remaining two GTX 1080s or just keep the GTX 980 Ti, which runs @ 1550 constantly and has no clock drops, unlike the GTX 1070/1080.

I don't like Boost 3.0 too much ... in fact I think it's pretty annoying ... and I have been looking for reasons to keep a GTX 1080.


----------



## juniordnz

I had a 970, and my performance gain going to the 1080 was something like 80-90% (measured in Firestrike). I would have definitely skipped this generation if I had a 980 Ti. But that's me...


----------



## IronAge

I have had multiple GPUs each gen, at least ten GTX 980 Tis alone ... and the one I kept is the best of them OC-wise. (Gigabyte Xtreme)

I don't even play that much ... I play more with the cards, trying 10+ different BIOSes on each of them.









btw: I flashed the Gainward Goes Like Hell BIOS and reached my highest memory overclock with that BIOS, 5576 ... maybe even 5584.

It has a 240W max PL though ... but that should be enough for a GTX 1080 under air cooling.

When the difference between the actually used PL and the maximum PL gets too high, I often observe PerfCap VRel.


----------



## Besty

I conducted my testing with a Gigabyte Xtreme Waterforce 980 Ti at 1550/2000 against a 1080 Classified and a 1080 Waterforce at 2050/5000, and my findings were similar (around 15% at 1080p), so I sent both 1080s back.

My opinion is that the higher the 980 Ti clocks, the narrower the performance gap between the two at 1080p.

In terms of a stock 980 Ti vs. a 1080 Classified or Waterforce, I would go for the 1080 every time.

I think those with the newer big ultrawide displays (3440x1440) are probably better off with the 1080, as it has more VRAM, and as a poster pointed out earlier, the higher the resolution, the lower the CPU bottleneck, allowing the 1080 to stretch its legs.


----------



## Thetbrett

So I finally got it; it was "lost in transit" for a while there. Zotac AMP. Happy so far. Boosts to 1987 stock.

The best OC I could do was +65 / +400 at 75% voltage, but I will tinker with that to keep temps down. I have an aggressive fan curve, and it never goes past 77C.

Funny thing is, my 980 Ti still beats it in benchmarks like Valley by a good 10fps, but game-wise the 1080 is clearly better. You don't play benchmarks, though.


----------



## Cornerer

Quote:


> Originally Posted by *IronAge*
> 
> i had multiple gpus each gen at least ten GTX980Ti alone ... and the one i kept is the best of them oc wise. (Gigabyte Xtreme)
> 
> don't even play that much ... playing more with the cards, trying 10+ different Bios on each of them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw: i flashed Gainward Goes like Hell Bios and reached highest memory overlock with that Bios 5576 ... maybe even 5584.
> 
> it has 240W max PL though ... but that should be enuff for a GTX1080 under air cooling.
> 
> when the difference between actually used PL and maximum PL gets too high i often observe PerfCap VRel.


May I ask what are the best BIOSes you've tried so far on 1080s?
The very best one I heard of here was the MSI Sea Hawk X (non-EK).
Some claimed the Asus gave the best clocks but also lower benchmark scores.


----------



## Cornerer

Quote:


> Originally Posted by *Thetbrett*
> 
> and best oc I could do was +65 +400 75% volt, but will tinker with that to keep temps down, but i have an agressive fan curve, never goes past 77c


That's actually quite bad in comparison to quite a few other custom models.
Is it the 2-fan version or the AMP Extreme? And what's the ambient?


----------



## Benjiw

Quote:


> Originally Posted by *Thetbrett*
> 
> so I finally got it, was "lost in transit" for a while there. Zotac Amp. Happy so far. Boosts to 1987 stock:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> and best oc I could do was +65 +400 75% volt, but will tinker with that to keep temps down, but i have an agressive fan curve, never goes past 77c
> 
> 
> 
> 
> Funny thing is, my 980 ti still beats it in benchmarks like valley by a good 10 fps, but game wise it's clearly better. You don't play benchmarks though.


You're hitting a lot of perfcaps in that 2nd image.


----------



## Bishop07764

Quote:


> Originally Posted by *Cornerer*
> 
> May I ask which are the best BIOSes you've tried so far on 1080s?
> The best one I've heard of here is the MSI Sea Hawk X (non-EK).
> Some claimed the Asus gives the best clocks but also lower benchmark scores.


The best one for me thus far has been the MSI Gaming Z BIOS, though I have a Gaming X PCB. I think many people were getting the best results with the Founders Edition BIOS. I have noticed with the latest WHQL driver (372.70) that my clocks have decreased by about 12 MHz at default. I have to kick my core up by another 10 in Afterburner and it goes right back to where it was before. Anyone else notice anything like this?


----------



## Cornerer

Quote:


> Originally Posted by *Bishop07764*
> 
> I have noticed with the latest WHQL driver (372.70) that my clocks have decreased by about 12 MHz at default. I have to kick my core up by another 10 in Afterburner and it goes right back to where it was before. Anyone else notice anything like this?


It's probably stressing the GPU harder than before at the same clock. A few people on the Nvidia forum claimed their max OC is no longer stable and they need to knock it down a notch (e.g. 2088 to 2063).
It isn't necessarily a bad thing, except you might have to spend 10-15 minutes re-tweaking your OC.


----------



## juniordnz

Agreed.

I believe that as drivers develop and mature, the card will be better utilized by applications. That's why the max overclock may go down a bit as time goes by. That happened with Maxwell as well. My 970 was rock solid at 1605MHz in the beginning, but I had to dial it down a clock or two along the way as drivers improved.

Lost 1 clock with last driver, but performance in FS stayed exactly the same.
IMHO, those who stay with old drivers ONLY because of the clocks are chasing numbers, not performance.


----------



## kx11

What about this unlocked BIOS for the 1080? Is it good? Did anyone try it?


----------



## IronAge

Quote:


> Originally Posted by *juniordnz*
> 
> Lost 1 clock with last driver, but performance in FS stayed exactly the same.
> IMHO, those who stay with old drivers ONLY because of the clocks are chasing numbers, not performance.


You could probably try the Gainward GLH BIOS ... it got me about 30 MHz higher VRAM clocks than the EVGA FTW BIOSes.

And to my own surprise, 2126/5576 got me about the same graphics score as 2152/5544.

Plus my AMP only needs 1.050V for 2126, while it needs 1.062V for 2139-2152.


----------



## Thetbrett

Quote:


> Originally Posted by *Cornerer*
> 
> This is actually quite poor compared to many other custom models.
> Is it the 2-fan version or the AMP Extreme? And what's the ambient?


2-fan, 22C ambient. By all accounts, the only difference between the AMP models is the cooler.


----------



## Whitechap3l

Quote:


> Originally Posted by *kx11*
> 
> what about this unlocked bios for 1080 ?! is it good ? did anyone try it ?


Hey mate, I can give you my experience:
The Strix T4 gives me stable clocks around 2230 MHz with the newest drivers, but performance is lower than with the EVGA FE BIOS (the best-working one for me) @ 2177. I also get more noise with more voltage, so it is basically **** for me, at least. Only upside: I can say my card boosts to mid-2200s clocks - wohoo B-)


----------



## arrow0309

Quote:


> Originally Posted by *Whitechap3l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kx11*
> 
> what about this unlocked bios for 1080 ?! is it good ? did anyone try it ?
> 
> 
> 
> Hey mate, I can give you my experience:
> The Strix T4 gives me stable clocks around 2230 MHz with the newest drivers, but performance is lower than with the EVGA FE BIOS (the best-working one for me) @ 2177. I also get more noise with more voltage, so it is basically **** for me, at least. Only upside: I can say my card boosts to mid-2200s clocks - wohoo B-)
Click to expand...

Hi, are you saying you have an Asus Strix 1080 and are using it with an (EVGA) FE BIOS?
All OK? Is it safe? The two cards have such different VRMs.









I'm asking because I also own a 1080 Strix on the Strix OC BIOS, under water as well (Bitspower waterblock), and it's running fine


----------



## Deders

Basically there hasn't been a proper unlocked BIOS released yet, but people have been experimenting with flashing BIOSes across different cards.


----------



## Thetbrett

This was from a Valley run. 100% core voltage gets rid of the PerfCap issues, and 90% fan keeps temps down. 112.4 score though. What do people make of these?


----------



## kx11

Quote:


> Originally Posted by *Whitechap3l*
> 
> Hey mate, I can give you my experience:
> The Strix T4 gives me stable clocks around 2230 MHz with the newest drivers, but performance is lower than with the EVGA FE BIOS (the best-working one for me) @ 2177. I also get more noise with more voltage, so it is basically **** for me, at least. Only upside: I can say my card boosts to mid-2200s clocks - wohoo B-)


If the clocks are higher than normal and the performance is worse, then what's the point?

Nvidia stated somewhere that they'll fix the OC core readings.


----------



## Whitechap3l

Quote:


> Originally Posted by *arrow0309*
> 
> Hi, are you saying you have an Asus Strix 1080 and are using it with an (EVGA) FE BIOS?
> All OK? Is it safe? The two cards have such different VRMs.
>
> I'm asking because I also own a 1080 Strix on the Strix OC BIOS, under water as well (Bitspower waterblock), and it's running fine


Yes, no problems for some weeks now.
The Strix OC BIOS is not the best for me score-wise; the Strix BIOS gave me the same clocks but lower scores. If you're interested I can post some pictures tomorrow afternoon.


----------



## Whitechap3l

Quote:


> Originally Posted by *kx11*
> 
> If the clocks are higher than normal and the performance is worse, then what's the point?
>
> Nvidia stated somewhere that they'll fix the OC core readings.


Sure, that's why I'm using the EVGA one.
I'll keep checking with future drivers and report my experiences.


----------



## Benjiw

So is there any way to mod or make a custom BIOS for the 10x0 cards yet?


----------



## skyn3t

^^ sort of yes.


----------



## Thetbrett

The members form won't accept my GPU-Z validation number.


----------



## Benjiw

Quote:


> Originally Posted by *skyn3t*
> 
> ^^ sort of yes.


Sort of?


----------



## ucode

Quote:


> Originally Posted by *kx11*
> 
> what about this unlocked bios for 1080 ?! is it good ? did anyone try it ?


What unlocked VBIOS? If you mean the Strix XOC without software power and temperature limits, that one's not so great IMO. IIRC the video clock was limited to 1708MHz. The T4 version is better in that it allows the video clock to increase above 1708MHz; it usually sits about 200MHz lower than the GPU clock. I think the hardware temp limit of 96C and shutdown at 99C might still work, but I haven't confirmed that. The T4 VBIOS did enable my FE card to reach a 26k graphics score in Fire Strike, which I couldn't do with the FE BIOS, but given the voltage needed and the power draw, IMO it's not worth it for 24/7 clocks. Results have seemed to vary between users; perhaps the weaker regulation on FE boards doesn't help. IOW, YMMV, so you're probably going to have to find out for yourself.


----------



## skyn3t

Quote:


> Originally Posted by *Benjiw*
> 
> Sort of?


not stable but it can be done


----------



## Benjiw

Quote:


> Originally Posted by *skyn3t*
> 
> not stable but it can be done


Any word on when we'll be able to mod the 10x0 cards and have it be stable, like we can with the 7xx and 9xx cards? I want to pull the trigger on some 1080s, but GPU Boost is a big no for me; I'd rather mod the BIOS and remove it.


----------



## juniordnz

Quote:


> Originally Posted by *Benjiw*
> 
> Any word on when we'll be able to mod the 10x0 cards and have it be stable, like we can with the 7xx and 9xx cards? I want to pull the trigger on some 1080s, but GPU Boost is a big no for me; I'd rather mod the BIOS and remove it.


I'd suggest you wait, then. I'm pretty pessimistic about BIOS modding on these cards... Nvidia has already given all the signs that Pascal is not a "very stable" architecture (voltage lock, insanely high thermal throttle, loss of stability with added voltage even within the 1.093V limit), so I'm not convinced we will have all the flexibility we got with Maxwell or Kepler.


----------



## looniam

Quote:


> Originally Posted by *skyn3t*
> 
> not stable but it can be done


how in the HELL are you brother!!!!!!


----------



## Benjiw

Quote:


> Originally Posted by *juniordnz*
> 
> I'd suggest you wait, then. I'm pretty pessimistic about BIOS modding on these cards... Nvidia has already given all the signs that Pascal is not a "very stable" architecture (voltage lock, insanely high thermal throttle, loss of stability with added voltage even within the 1.093V limit), so I'm not convinced we will have all the flexibility we got with Maxwell or Kepler.


Well, that sucks. I know these cards don't scale with voltage like Maxwell did, but at the same time I thought some third-party vendors would find a way around it.


----------



## skyn3t

Quote:


> Originally Posted by *looniam*
> 
> how in the HELL are you brother!!!!!!


in a very deep hole







. sky still alive


----------



## looniam

Quote:


> Originally Posted by *skyn3t*
> 
> in a very deep hole
> 
> 
> 
> 
> 
> 
> 
> . sky still alive


glad you're still sucking air.







good to see ya but it had to be past my bed time.

hopefully chat with ya later sometime. don't be a stranger.


----------



## skyn3t

Quote:


> Originally Posted by *looniam*
> 
> glad you're still sucking air.
> 
> 
> 
> 
> 
> 
> 
> good to see ya but it had to be past my bed time.
> 
> hopefully chat with ya later sometime. don't be a stranger.


I won't. I never stopped reading all the good stuff here, I'm just not ready to post.


----------



## fat4l

Has anyone tried the newest drivers? How are they?


----------



## jlp0209

For kicks I loaded the Strix OC BIOS onto the 2nd BIOS slot on my 1080 Hybrid. Control over lots of features is missing, but voltage did go up to 1.12v (or 1.15, I forget). It made no difference at all; my card still crashed at 2088 and 2063MHz. I re-loaded the proper Hybrid slave BIOS back onto the card just fine.

I'm really bummed. I'm not going to return a card due to a bad OC; I'd sell it if I choose to get rid of it. I'm so tempted to go and buy a Classified instead, to have the highest clock possible without loading a custom BIOS onto the card. I'd have to believe it will go beyond the 2050MHz wall that my Hybrid hits.

I am able to OC the memory to 10,400MHz.

My main game is F1 2016 and I want to get the minimum FPS as close to 100 as possible (1440p resolution). Core clock at 2050 and stock memory yields a minimum FPS of 84. Simply raising the memory to 10,400MHz gives me a minimum FPS of 86. If I can get a card that will boost to 2130 or 2150, that's a 100MHz difference, which is a lot.

Knowing my luck, if I get a Classified it will be just as bad.

Edit: it's still better than my 980 Ti with the Hybrid kit and a custom BIOS at a 1478MHz core clock, where minimum FPS was 71. So overall I guess I'm pleased with the bump the 1080 gives me; I was just expecting better.


----------



## KickAssCop

Did the DDU. It cleaned about 30 GB of crap in my OS drive lol.

Re-ran the bench. Scored the same.

Installed second card and first default run was 9182 in Firestrike Ultra. My 980 Tis scored about 9003 maxed out.

Need to test more, but thus far it looks like nonsense what people say about 1080s being 30% faster than 980 Tis. Will test games and see what's up.


----------



## Bishop07764

Quote:


> Originally Posted by *Cornerer*
> 
> It's probably stressing the GPU harder than before at the same clock. A few people on the Nvidia forum claimed their max OC is no longer stable and they need to knock it down a notch (e.g. 2088 to 2063).
> It isn't necessarily a bad thing, except you might have to spend 10-15 minutes re-tweaking your OC.


Thanks. It just seemed a bit strange. It appears I can still push my clocks to what I had before, though with only very short testing today. I guess time will tell if it impacts my stability. I eventually ran into this with my old 780 Lightning; I could usually compensate by upping the voltage. That doesn't seem to help Pascal much.


----------



## Cornerer

Quote:


> Originally Posted by *Thetbrett*
> 
> 2-fan, 22C ambient. By all accounts, the only difference between the AMP models is the cooler.


Thanks for the reply.
The AMP Edition was once my first choice due to the 2.1+ GHz core speeds claimed by quite a few buyers, but I was also worried about its cooling and my 30+C ambient (plus I don't like the sag with the Extreme cooler).


----------



## Whitechap3l

Quote:


> Originally Posted by *fat4l*
> 
> Has anyone tried the newest drivers? How are they?


For me it's pretty good; clocks of 2177 are stable again in games.


----------



## arrow0309

Quote:


> Originally Posted by *Whitechap3l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> Hi, are you saying you have an Asus Strix 1080 and are using it with an (EVGA) FE BIOS?
> All OK? Is it safe? The two cards have such different VRMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm asking you cause I also own a 1080 Strix @ Strix OC bios under water as well (Bitspower wb) and is running fine
> 
> 
> 
> Yes, no problems for some weeks now.
> The Strix OC BIOS is not the best for me score-wise; the Strix BIOS gave me the same clocks but lower scores. If you're interested I can post some pictures tomorrow afternoon.
Click to expand...

Quote:


> Originally Posted by *Whitechap3l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> Has anyone tried the newest drivers? How are they?
> 
> 
> 
> For me it's pretty good; clocks of 2177 are stable again in games.
Click to expand...

Hi and thanks, of course I'm interested








Did this EVGA BIOS manage to boost by default up to the same 2088 as the Asus OC, or did you need extra OC?
How about the voltages (because I haven't played with them yet)?
Any word on the temp differences between the two BIOSes?
And lastly, what temps did you get at 2177 (I suppose at default voltage)?









Edit:

I've just run a Time Spy with the Strix OC bios at 2164/1377 vdef:


----------



## GreedyMuffin

I'm going to test later tonight with the new driver.









Very happy with my current 1080. It overclocks well and performs well.

Been playing with my 5960X for the last couple of days. Tweaking my current OC.


----------



## Whitechap3l

Quote:


> Originally Posted by *arrow0309*
> 
> Hi and thanks, of course I'm interested
> 
> 
> 
> 
> 
> 
> 
> 
> Did this EVGA BIOS manage to boost by default up to the same 2088 as the Asus OC, or did you need extra OC?
> How about the voltages (because I haven't played with them yet)?
> Any word on the temp differences between the two BIOSes?
> And lastly, what temps did you get at 2177 (I suppose at default voltage)?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:
> 
> I've just run a Time Spy with the Strix OC bios at 2164/1377 vdef:


I adjust the voltages with the Afterburner curve; it didn't boost that far on its own.
Voltage is at the max of 1.093v. Even if I take the Strix T4 BIOS and push it to [email protected] 1.2v, scores stay the same or even lower (tested on 3 different driver versions), and even if it were fine noise-wise I'd still go with 1.093v.









Temps are around 55 degrees on both BIOSes (no trouble with watercooling).

Your score is roughly the same as what I get with the EVGA BIOS.

Maybe I'll try the Strix OC one with the latest drivers to get some more points.


----------



## r0l4n

Quote:


> Originally Posted by *r0l4n*
> 
> Did anybody experience a change in max stable clocks when changing the motherboard? I had to reduce the 24/7 clocks by 2 bins after switching to X99, I get artifacts with the same clocks that were stable with my previous Z97...


Has anyone experienced something similar? I can't figure out what's up.


----------



## IronAge

Did you use older drivers with your Z97 setup? 368.x allows higher clock rates but results in lower graphics scores in benchmarks.


----------



## Cornerer

Quote:


> Originally Posted by *r0l4n*
> 
> Did anybody experience a change in max stable clocks when changing the motherboard? I had to reduce the 24/7 clocks by 2 bins after switching to X99, I get artifacts with the same clocks that were stable with my previous Z97...


I thought it was a pretty well-acknowledged fact that there are stability differences between motherboards.

My previous HD 6950's best-performing clock shifted from 895 to 912MHz after changing motherboards (it could go higher with both boards, but with a regression in performance).


----------



## Benjiw

Quote:


> Originally Posted by *r0l4n*
> 
> Has anyone experienced something similar? I can't figure out what's up.


Older drivers let you run higher clocks; it's not uncommon for drivers to change stability. I would be very surprised if your X99 were causing the issue unless the board is faulty.


----------



## xTesla1856

Joining the party late as usual; after a brief liaison with a 1070 I picked up a Founders Edition 1080 from EVGA:

What are my options for custom BIOS with this card?


----------



## IronAge

Not too many without a shunt mod and watercooling.


----------



## Thetbrett

Quote:


> Originally Posted by *Cornerer*
> 
> Thanks for the reply.
> The AMP Edition was once my first choice due to the 2.1+ GHz core speeds claimed by quite a few buyers, but I was also worried about its cooling and my 30+C ambient (plus I don't like the sag with the Extreme cooler).


I actually now have a stable 2113 OC. It ran loops of Valley and Heaven with no problem, but I had to have the fan and voltage at 100%. I don't game at that; it happily sits at 2050 at 50% voltage and 80% fan. Memory is at +450 now too. Nice card for the price.


----------



## xTesla1856

Quote:


> Originally Posted by *IronAge*
> 
> Not too many without Shunt-Mod and Watercooling.


The waterblock is on its way already


----------



## r0l4n

Quote:


> Originally Posted by *Benjiw*
> 
> Older drivers let you run higher clocks; it's not uncommon for drivers to change stability. I would be very surprised if your X99 were causing the issue unless the board is faulty.


Quote:


> Originally Posted by *r0l4n*
> 
> Has anyone experienced something similar? I can't figure out what's up.


Well, the OS and the driver version were the same. I'm thinking perhaps the PCIe power delivery is different?


----------



## IronAge

Does your X99 have a PLX chip?
Quote:


> Originally Posted by *xTesla1856*
> 
> Waterblock is on it's way already


You could give the AMP Extreme BIOS a try; it holds 2050/5400 under load with H2O ... without having to use AB.

The Inno3D iChill X3 BIOS allows higher memory clocks, as does the Gainward GLH BIOS.


----------



## Bishop07764

Quote:


> Originally Posted by *r0l4n*
> 
> Well the OS and the driver version were the same. I'm thinking perhaps the pci-e power delivery is different?


Not really familiar with the X99 platform. Doesn't it overclock the PCIe bus when pushing memory above a certain point? Overclocking the PCIe frequency might contribute to some instability.


----------



## Bishop07764

Well. After messing around some more with these new drivers, I am back to 369.09. The newer driver was netting me clocks over 2150 core at stock voltage, but the performance was definitely worse. Especially in the Witcher 3 despite using DDU to uninstall the previous set. Firestrike score definitely went down with the same clocks too.


----------



## skyn3t

Who has the MSI GeForce GTX 1080 Sea Hawk EK?


----------



## Bishop07764

Quote:


> Originally Posted by *skyn3t*
> 
> who has MSI GeForce GTX 1080 SEAHAWK EK.


I have a Sea Hawk EK. Thanks again for your 780 Lightning BIOS; it served me well for about 3 years.


----------



## Benjiw

Quote:


> Originally Posted by *Bishop07764*
> 
> Well. After messing around some more with these new drivers, I am back to 369.09. The newer driver was netting me clocks over 2150 core at stock voltage, but the performance was definitely worse. Especially in the Witcher 3 despite using DDU to uninstall the previous set. Firestrike score definitely went down with the same clocks too.


So did going back get you the same results? If so, then it's driver related, but I wouldn't go back to using old drivers; it's counterproductive really, unless you're on HWBot trying to get the number 1 spot.


----------



## skyn3t

Quote:


> Originally Posted by *Bishop07764*
> 
> I have a Seahawk EK. Thanks again for your 780 Lightning Bios. That served me well for about 3 years.


You're welcome, glad my BIOS worked well for you. How do you like your GPU? I returned my first Sea Hawk and got a new one today.


----------



## Bishop07764

Quote:


> Originally Posted by *Benjiw*
> 
> So did going back get you the same results? If so then it's driver related but I wouldn't go back to using old drivers, it's counter productive really. Unless you're on HWbot trying to get the number 1 spot.


Performance has gone right back to normal. I mainly did it because the Witcher 3 was appallingly worse. Honestly I don't care so much about the benchmarks. I'm usually a fan of running the latest drivers because of the improvements. The core clock gains were impressive, though.


----------



## Bishop07764

Quote:


> Originally Posted by *skyn3t*
> 
> you welcome glad my bios worked good for you. how do you like your GPU, I had returned my first Seahawk and got a new one today.


I've been having a blast with mine. It's been treating me well. I've already flashed the gaming Z BIOS onto it. With this BIOS, it defaults to 2076. I can push it to 2126 game stable with no added voltage. The newer driver let me push it over 2150 but it wasn't performing quite as well in the Witcher 3. It might have more in it as I haven't hit the power limit for it yet. I haven't tried using the curve for overclocking. Temps have been great.


----------



## KickAssCop

Installed ASUS GPU Tweak 2 and set cards to OC mode. Got 9433 in firestrike.

http://www.3dmark.com/3dm/14563561?

Flashed my AGs with the OG BIOS. Cards are boosting to 2038 without any overclock. Overclocked the CPU to 4.5 GHz.

Got 9699. Pretty disappointed right now and don't know what else I can do.

http://www.3dmark.com/3dm/14564066?


----------



## TK421

You guys with a single 1080: how much FPS do you get in the BF1 beta at the lowest settings?


----------



## Salad Fingers

Quote:


> Originally Posted by *TK421*
> 
> You guys with 1080 single, how much fps on BF1 beta lowest setting?


Is it free to play? I could give it a try. Meh, not free. Sorry, I've never been into BF games so I won't pay for it.
Edit: I didn't look hard enough. The beta is free; downloading now. Will keep you posted.

Someone in LTT forums reported 98-132fps on 1440p, but take that with a pinch of salt.


----------



## toncij

All Ultra, 2560x1440 - 99-115 FPS single 1080.


----------



## Salad Fingers

BF1 scales really well...

At 1080p it won't drop below 130fps. I get 130-200, around 150 average. Unfortunately I don't have a monitor higher than 1080p yet; I could try supersampling though if we know the exact slider value for certain resolutions.

Edit: Ultra preset, 0% motion blur, DX12 setting.


----------



## toncij

On Nvidia, so far DX12 runs worse than DX11 so avoid it. Just crank up the resolution slider to 100%.


----------



## Salad Fingers

Quote:


> Originally Posted by *toncij*
> 
> On Nvidia, so far DX12 runs worse than DX11 so avoid it. Just crank up the resolution slider to 100%.


If I'm getting 130-200fps on something that runs worse, I am impressed. What resolution does moving the slider to 100% end up at?

Edit: *Moments later...* It is very unstable at DX11 and 100%. 3 out of 5 times it got stuck while loading the menu. I managed to get in-game only once, but I also crashed haha. It might be my overclock not being stable at those settings. From what I managed to see, I was running around 70fps.


----------



## zlpw0ker

Can I use Asus GPU Tweak 2 even if I don't have an Asus GPU?

Will I get a better TDP on my 1080 Sea Hawk X if I plug in the 6-pin power connector on my X99-E WS/3.1 mobo?


----------



## Salad Fingers

Quote:


> Originally Posted by *zlpw0ker*
> 
> can I use the asus gpu tweak 2 even if I dont have a asus gpu?
> 
> Will I get better TDP on my 1080 seahawk x if I plug in the 6pin power connector on my x99-e ws/3.1 mobo?


You could, but all those tools are Afterburner-like and don't really make a difference. I was trying to "hack" my card to work with EVGA's K-Boost which worked, but only when not in-game







I don't think ASUS's software has anything proprietary like that, though.

If I were you I'd plug both. It's not going to draw additional power if it doesn't need it, so why limit it by hardware?


----------



## IronAge

Flash an Asus BIOS and then you've got an Asus

... there are very few GTX 1080s with a custom VRM controller (the EVGA Classified and GALAX HOF are the only ones I know of).

The Strix OC T4 BIOS together with the 369.09 mod driver has given me my highest graphics score yet ... 12188 with 2126/5576.

BUT the missing limits are a problem under air cooling: since it won't reduce the GPU clock rate when the GPU reaches 62 degrees Celsius, it sometimes freezes or I get lower scores.

So I think the Strix OC T4 BIOS is good for aftermarket/H2O coolers.

With the Sea Hawk EK I would try the Gaming X OC Mode BIOS, the Strix OC review-sample BIOS, or the Strix OC T4 BIOS.


----------



## zlpw0ker

Quote:


> Originally Posted by *Salad Fingers*
> 
> You could, but all those tools are Afterburner-like and don't really make a difference. I was trying to "hack" my card to work with EVGA's K-Boost which worked, but only when not in-game
> 
> 
> 
> 
> 
> 
> 
> I don't think ASUS's software has anything proprietary like that, though.
> 
> If I were you I'd plug both. It's not going to draw additional power if it doesn't need it, so why limit it by hardware?


Ohh, OK, I'll just stick with Afterburner and XOC then, thanks.

I found out the EVGA 1080 FTW Hybrid was longer than I expected and it covers 4 SATA ports on my mobo; it's not a big issue, but if I want to unplug the SATA cables it's going to give me a hard time. I have the XL R2 case and there isn't much space there, and with the length of the 1080 Hybrid plus the required power connector I don't have the space for the GPU.
Secondly, I chose the 1080 Sea Hawk X for its black/silver/white theme (it fits great with the black and white theme in my case) and the fact that it has a single 8-pin connector, which means fewer cables.

But I just wanted to know if adding the PCIe power connector that's on my mobo gets me more TDP % in Afterburner. I have a 105% TDP max on the Sea Hawk X now, and it seems that IF I ever choose to try to OC my GPU, I could push it further with 120% TDP than with 105%.
I don't have plans to mod the BIOS.


----------



## Salad Fingers

Quote:


> Originally Posted by *zlpw0ker*
> 
> But I just wanted to know if adding the PCIe power connector that's on my mobo gets me more TDP % in Afterburner. I have a 105% TDP max on the Sea Hawk X now, and it seems that IF I ever choose to try to OC my GPU, I could push it further with 120% TDP than with 105%.
> I don't have plans to mod the BIOS.


Oh, I'm not sure about that. I think that depends on the BIOS, tbh. Fun fact: my "MSI Gaming" card had a limit of 121%. After flashing the "MSI Gaming X" BIOS, which unlocked the voltage from 1.075V to 1.093V, my slider actually went down to 107%. So I think there's some weird connection between voltage and your power limits, probably because if it's drawing more volts it's actually at a higher % of power anyway. Previously my graphs would show around 70-80% power drawn; now they show 80-90%, for whatever that's worth. So if I had to bet, I'd say the sliders are fixed by the card's firmware no matter which pins are connected, and are also tied to the available voltage limit.
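If the absolute wattage caps are what the firmware fixes, the slider arithmetic works out exactly like that: the slider maximum is just the max wattage divided by the default wattage, so a BIOS that raises the default board power shows a smaller % even when the absolute cap barely moves. A quick sketch of that arithmetic (the wattages below are made-up illustration values, not the actual Gaming / Gaming X figures):

```python
def slider_max_pct(default_watts: float, max_watts: float) -> float:
    """Power-limit slider maximum, as a percent of the BIOS default board power."""
    return round(max_watts / default_watts * 100, 1)

# Hypothetical BIOS A: 180 W default, 217 W hard cap.
print(slider_max_pct(180, 217))  # 120.6
# Hypothetical BIOS B: default raised to 210 W, similar 224 W hard cap.
print(slider_max_pct(210, 224))  # 106.7
```

Similar absolute caps, very different-looking sliders, which would explain a 121% slider dropping to 107% after a voltage-unlocked flash.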


----------



## IronAge

Your GPU/OC runs into PerfCap VRel when the difference between the PL you set and the PL actually used gets too high without adding VDDC.

Many BIOSes I have tried set way too high a default or max PL (for instance the AMP Extreme and Gigabyte Xtreme), and setting it under 100% got me a higher and more stable OC.

The default and max power limit is set by the BIOS in watts, and it's different from BIOS to BIOS.

You can check the default and max power limit in watts by executing this on a command line:

```
C:\Program Files\NVIDIA Corporation\NVSMI> nvidia-smi.exe -q -d power
```

Power Limits of Zotac AMP! Extreme Bios:


----------



## Salad Fingers

Quote:


> Originally Posted by *IronAge*


```
==============NVSMI LOG==============

Timestamp                 : Fri Sep 02 14:54:19 2016
Driver Version            : 372.70

Attached GPUs             : 1
GPU 0000:06:00.0
    Power Readings
        Power Management      : Supported
        Power Draw            : 49.37 W
        Power Limit           : 288.90 W
        Default Power Limit   : 270.00 W
        Enforced Power Limit  : 288.90 W
        Min Power Limit       : 90.00 W
        Max Power Limit       : 291.00 W
    Power Samples
        Duration              : 2.36 sec
        Number of Samples     : 119
        Max                   : 49.61 W
        Min                   : 49.01 W
        Avg                   : 49.22 W
```

Damn. You're much better off than I am. Too bad yellow would look too weird in my build, lol.

EDIT: corrected the above, because I had my power limit at 95% instead of 107%.
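If you'd rather pull those wattage figures out of the log programmatically than eyeball them, here's a small parser sketch (it assumes the plain-text `Label : value W` format that `nvidia-smi -q -d power` prints, as shown above):

```python
import re

def parse_power_limits(nvsmi_text: str) -> dict:
    """Extract the wattage fields from `nvidia-smi -q -d power` output."""
    fields = ("Power Draw", "Power Limit", "Default Power Limit",
              "Enforced Power Limit", "Min Power Limit", "Max Power Limit")
    watts = {}
    for name in fields:
        # Anchor at line start so "Power Limit" doesn't match "Default Power Limit".
        m = re.search(rf"^\s*{re.escape(name)}\s*:\s*([\d.]+) W", nvsmi_text, re.M)
        if m:
            watts[name] = float(m.group(1))
    return watts

log = """\
Power Draw : 49.37 W
Power Limit : 288.90 W
Default Power Limit : 270.00 W
Enforced Power Limit : 288.90 W
Min Power Limit : 90.00 W
Max Power Limit : 291.00 W
"""
limits = parse_power_limits(log)
headroom = limits["Max Power Limit"] / limits["Default Power Limit"]
print(f"max slider: {headroom:.1%}")  # ~107.8% for the log above
```

Handy for logging the enforced limit alongside clocks while comparing BIOSes.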


----------



## juniordnz

To anyone who has their 1080 on water:

Did watercooling make it possible to achieve higher clocks that on air would crash/artifact, or did it just reduce thermal throttling (not affecting the max overclock)?


----------



## Salad Fingers

Water should in theory keep a high clock while drawing a lower voltage... which for Pascal is supposed to be more stable? I'm not familiar with the concept of less voltage = more stable; it has always been the other way around in my experience.







A friend has a FE running at 2190-2207 like a champ under water, never exceeding 40C. Forgot to ask about his voltage though







If anything, water would help prevent the stupid downclocking that happens solely based on temps.


----------



## Darkboomhoney

Is there a tool or BIOS for the Classified with which I can push my voltage?
With the T4 BIOS I can get 2400MHz, but I think there is a bug; I get lower results in benchmarks...


----------



## Cornerer

I don't have water (yet), but GPU Boost will push the clock down by at least 2 notches if you're at 70+C instead of the 40s. There are always 2 downclocking points, at ~50C and ~65C respectively.
Not sure if that's just the "thermal throttling" you're referring to. I would personally call it "GPU Boost doing **** to me".
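For anyone trying to picture what those two points cost: each threshold crossed seems to drop the boost clock by roughly one 13 MHz bin. A toy model of that behavior (the ~50C/~65C thresholds and the 13 MHz step are rough values observed in this thread, not anything official from Nvidia, and the real GPU Boost 3.0 keeps shedding bins as temps climb further):

```python
def boost_clock_at_temp(cool_boost_mhz: int, temp_c: float,
                        thresholds=(50, 65), bin_mhz=13) -> int:
    """Rough GPU Boost clock after temperature-based bin drops (toy model)."""
    bins_dropped = sum(temp_c >= t for t in thresholds)
    return cool_boost_mhz - bins_dropped * bin_mhz

for temp in (45, 55, 72):
    print(temp, boost_clock_at_temp(2088, temp))  # 2088, 2075, 2062
```

Which is why a card sitting at 40C under water holds a couple of bins that the same card loses on air, before any real thermal throttling even kicks in.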


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> To anyone who has their 1080 on water:
>
> Did watercooling make it possible to achieve higher clocks that on air would crash/artifact, or did it just reduce thermal throttling (not affecting the max overclock)?


Not a better overclock, just a more stable card, at least here, with max load temps of 38C. I just don't see how going any lower than that can help, and idle won't drop much more either: the stock cooler with fans off idles at 30C; with my water it's at 27C.


----------



## arrow0309

Quote:


> Originally Posted by *Salad Fingers*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TK421*
> 
> You guys with 1080 single, how much fps on BF1 beta lowest setting?
> 
> 
> 
> Is it free to play? I could give it a try. Meh, not free. Sorry, never been into BF games so I won't pay for it.
> I didn't look good enough. Beta is free, downloading. Will keep you posted.
> 
> Someone in LTT forums reported 98-132fps on 1440p, but take that with a pinch of salt.
Click to expand...

That's exactly the framerate my Strix OC (2088, no throttling) can do at 1440p on Ultra, def resolution (42)


----------



## Thetbrett

Um, so I received a replacement card because the first card was "lost"; 5 weeks later, the replacement turned up the same day as the original card. It's from a rather large company. The devil on my shoulder tells me to SLI them; the angel says send it back. What to do?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Thetbrett*
> 
> um, so I received a replacement card because the first card was "lost", 5 weeks later and the replacement card turned up the same day as the original card. It is from a rather large company. The devil on my shoulder tells me to SLI them. The angel says send it back. What to do?


Keep it of course. I would have.


----------



## falcon26

Oops, double post


----------



## falcon26

Man this sucker flies. Played some Battlefield 1 beta at 3440x1440, everything maxed out, and it never went below 60 FPS, and it only hit about 62 degrees under load. Very impressed. Now I just have to figure out how to tell my wife I spent $700 on it :-( If I'm not online for a while, RIP, my wife did it..


----------



## arrow0309

Quote:


> Originally Posted by *falcon26*
> 
> Man this sucker flies. Played some Battlefield 1 beta at 3440x1440 everything maxed out and it never went below 60 FPS  and it only hit about 62 degrees load  very impressed. Now I just have to figure out how to tell my wife I spent $700 on it :-( If I'm not online for a while RIP my wife did it..


BTW, I've just spent about £500 on a second-hand "new" Asus PG279Q; my wife knows nothing yet


----------



## Benjiw

Quote:


> Originally Posted by *Thetbrett*
> 
> um, so I received a replacement card because the first card was "lost", 5 weeks later and the replacement card turned up the same day as the original card. It is from a rather large company. The devil on my shoulder tells me to SLI them. The angel says send it back. What to do?


I'd probably send it back if I'm honest. Not sure I could cope with the guilt.


----------



## juniordnz

Quote:


> Originally Posted by *Thetbrett*
> 
> um, so I received a replacement card because the first card was "lost", 5 weeks later and the replacement card turned up the same day as the original card. It is from a rather large company. The devil on my shoulder tells me to SLI them. The angel says send it back. What to do?


Daddy taught me how to be a good man and never to want things that aren't mine by right.


----------



## IronAge

New Afterburner Beta with DX12 Support ... OSD now available in DX12.

http://www.guru3d.com/news-story/download-msi-afterburner-4-3-beta-14.html


----------



## juniordnz

You guys never had any headaches with the AB OSD (RTSS)? I never install it because I had several incompatibilities in the past... Yesterday I installed it again to see CPU/GPU usage in GTA V and BF1 and got a lot of crashes that never happened before


----------



## Thetbrett

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Keep it of course. I would have.


In the end I told them; they said keep it. So I tried out the new card: poor OC. I'll sell it because I don't need SLI. Said company was awesome. I had to push for a local courier pickup, then they said it was too hard, keep it.


----------



## IronAge

No problems at all with AB + OSD.

But if you have problems with the AB OSD you might want to try HWiNFO.

It can be used to put up an OSD when RTSS is installed.

I haven't tried it with DX12 and RTSS 6.5.0 though.


----------



## skyn3t

Quote:


> Originally Posted by *juniordnz*
> 
> you guys never had any headaches with AB osd(RTSS)? I never install it because I had several incompatibilities in the past...yesterday I installed it again to see CPU/GPU usage on GTAV and BF1 and got a lot of crashes that never happened before


Only install what is necessary.


----------



## juniordnz

Quote:


> Originally Posted by *IronAge*
> 
> no problems at all with AB + OSD.
> but when you got problems with AB OSD you might wanna try HWINFO.
> it may be used to put up an OSD when RTSS is installed.
> Have not tried it with DX12 and RTSS 6.5.0 though.


I always have HWiNFO running in the background. Maybe it's conflicting with AB or RTSS and causing the crashes? Also, can RTSS get its info from HWiNFO?
Quote:


> Originally Posted by *skyn3t*
> 
> always install what is necessary.


It's not THAAAAT necessary. I just wanted to check for bottlenecks in real time and adjust settings to minimize them.


----------



## IronAge

I would try the HWiNFO OSD to display values through RTSS and disable the AB OSD.


----------



## Fediuld

Guys, some of you have used CLU.

Do I have to scrub the GPU core before putting it on, or can I just clean the core and apply it? I don't want to lose any warranty on the card.
Mine is watercooled, but MSI has no issue with that and honours the warranty as long as I don't physically damage the GPU....


----------



## juniordnz

Quote:


> Originally Posted by *Fediuld*
> 
> Guys, some of you used CLU.
> 
> Do I have to scrub the GPU core to put it on, or I can just clean it and install it? I do not want to lose the any warranty on the card.
> Mine is watercooled but MSI has no issue with this and covers the warranty, as long as I do not destroy physically the GPU....


As far as I remember, MSI puts a warranty seal over one of the 4 screws that hold the cooler in place over the GPU. At least my Armor came like that...

No, you don't scrape anything. Just clean it thoroughly with isopropyl alcohol and make sure you don't touch the surface after you've cleaned it. Also watch out for the small SMDs around the die. An OCN user gave me a nice tip: use nail polish around the die to isolate everything from the liquid metal. It comes off easily with acetone when you need to get the card back to stock one day.

Just waiting for my H100i V2 and my Kraken G10 to arrive to get the mod done here


----------



## Fediuld

Thank you.

MSI has a sticker on one of the screws only, and according to their warranty they're fine with watercooling the card as long as you don't cause physical damage to the parts.


----------



## Bishop07764

Quote:


> Originally Posted by *falcon26*
> 
> Man this sucker flies. Played some Battlefield 1 beta at 3440x1440 everything maxed out and it never went below 60 FPS  and it only hit about 62 degrees load  very impressed. Now I just have to figure out how to tell my wife I spent $700 on it :-( If I'm not online for a while RIP my wife did it..


Quote:


> Originally Posted by *arrow0309*
> 
> BTW, I've just spent about £500 for a second hand "new" Asus PG279Q, my wife knows nothing yet


Awesome.


I had been thinking about maybe getting a nice 1440p G-Sync monitor and telling the wife later. Ha ha. The 1080p monitor is beginning to feel a bit aged, but DSR is awesome, and games like The Witcher 3 look amazingly clean downscaled from 4K. Maybe less strain on the eyes... yes, that might work.


----------



## Whitechap3l

Quote:


> Originally Posted by *arrow0309*
> 
> Hi and thanks, of course I'm interested
> 
> Did this evga bios managed to boost all by default up to the same 2088 as the Asus OC or did you need any extra oc?
> How about the voltages (cause I didn't play with them yet)?
> Any word on the temp differences between the two bioses?
> And lastly, what temps did you get at 2177 (I suppose on vdef)?
> 
> Edit:
> 
> I've just run a Time Spy with the Strix OC bios at 2164/1377 vdef:




So that is my average (range between 8300 - 8315) with the EVGA FE BIOS @ 2177 (8303)

Also stable in games!


----------



## FarmerJo

Hey y'all. Just wondering, is there any BIOS out there yet that disables GPU Boost? Always hated clocks going up and down. If not, what's the best BIOS to use on an MSI Aero card?


----------



## KickAssCop

http://www.3dmark.com/3dm/14591712?

Time Spy seems fine. CPU @ 4.5 GHz and cards at default OC mode (1936 MHz, boost 2025)

http://www.3dmark.com/3dm/14591939?

12361 with card at 2063 boost.

http://www.3dmark.com/3dm/14591802?

9985 in Windows 10.


----------



## Bishop07764

Quote:


> Originally Posted by *KickAssCop*
> 
> http://www.3dmark.com/3dm/14591712?
> 
> Timespy seems fine. CPU @ 4.5 and cards at default OC mode (1936 Mhz, Boost 2025)
> 
> http://www.3dmark.com/3dm/14591939?
> 
> 12361 with card at 2063 boost.
> 
> http://www.3dmark.com/3dm/14591802?
> 
> 9985 in Windows 10.


Personally, my Fire Strike score went way down with the newest drivers. You might try 369.09 to see if it makes a difference, if you're trying to maximize your bench scores.


----------



## arrow0309

Quote:


> Originally Posted by *Whitechap3l*
> 
> Quote:
> 
> Originally Posted by *arrow0309*
> 
> Hi and thanks, of course I'm interested
> 
> Did this evga bios managed to boost all by default up to the same 2088 as the Asus OC or did you need any extra oc?
> How about the voltages (cause I didn't play with them yet)?
> Any word on the temp differences between the two bioses?
> And lastly, what temps did you get at 2177 (I suppose on vdef)?
> 
> Edit:
> 
> I've just run a Time Spy with the Strix OC bios at 2164/1377 vdef:
> 
> So that is my average ( range between 8300 - 8315 ) with EVGA FE Bios @ 2177 ( 8303 )
> 
> Also stable in games!
Click to expand...

When you set the voltage curve, do you still need to max the voltage slider to 100?
And can you tell me why you left the top part of the curve straight/linear?
What does that actually do?


----------



## Fediuld

OK, I chickened out today and didn't put CLU on the GPU.

However, I replaced the EK Ectotherm (8.5 W/mK) with Kryonaut (12.5 W/mK), and the EK thermal pads (3.5 W/mK) with Minus Pad 8 (8 W/mK).

Idle temps dropped by 5°C, and full load temps by a staggering 13°C... Even running Time Spy & Fire Strike Extreme for 3 runs each, non-stop, the card didn't exceed 35°C.
Overclock...
The card happily runs at 2164 core at 1.075v (locked the voltage curve with the new MSI AB), giving me a score of 8323.
http://www.3dmark.com/spy/386694

2177 @ 1.081v is the maximum it can reach, giving me a score of 8350.
http://www.3dmark.com/spy/386723

However, at 1.093v it refuses even 2180, let alone 2189. Any thoughts?

As for refitting the MSI Armor with the EK block: I spent a lot of time today examining the card and took a few notes. The backplate diagram makes no sense. There are 2 components (VRMs or whatever they're called) on the back of the card that are left without a pad (I added pads to them this time),
while it asks the user to put a pad on the other 2 that sit just behind the GPU.
See the red area in the attached pic



Now, since this area of the card is not covered by the block, could you tell me which resistors I can try to bridge with CLU? I can see 3: two near the power sockets on the right side, and one on the left.
Do you believe I'd benefit at all, or should I keep the card as-is with the overclocks I've already obtained? (ofc I have an itch....)

Thanks.


----------



## steveTA1983

So I caved and got an Nvidia-branded FE yesterday. Beast of a card, but it keeps throttling. I've been reading through this forum but haven't found a direct answer: is there any solution (custom BIOS maybe)? I see it's a common issue with FE cards


----------



## steveTA1983

Quote:


> Originally Posted by *falcon26*
> 
> Man this sucker flies. Played some Battlefield 1 beta at 3440x1440 everything maxed out and it never went below 60 FPS  and it only hit about 62 degrees load  very impressed. Now I just have to figure out how to tell my wife I spent $700 on it :-( If I'm not online for a while RIP my wife did it..


Quote:


> Originally Posted by *arrow0309*
> 
> BTW, I've just spent about £500 for a second hand "new" Asus PG279Q, my wife knows nothing yet


Hahahaha, I'm in the same boat as you guys. Luckily I'm selling my Maxwell Titan X to cover the cost, but she'd still be ticked off as hell if she found out I dropped $700 on a new graphics card


----------



## Fediuld

Quote:


> Originally Posted by *steveTA1983*
> 
> So I caved and got an Nvidia branded FE yesterday. Beast of a card, but it keeps throttling. I've been reading through this forum but haven't found a direct answer: any solution (custom bios maybe)? I see it's a common issue with FE cards


Nothing works if your card is operating above 60°C...
Your best option is to watercool it.


----------



## steveTA1983

Quote:


> Originally Posted by *Fediuld*
> 
> Nothing works if your card is operating above 60C....
> Best option you have, watercool it.


Ok cool, thank you!!!


----------



## libremaster

No matter what I do, I can't OC the memory on my MSI GTX 1080 Gaming. Tried Afterburner, Precision X, stock voltage, 100% voltage, nothing. Tried all offsets from +1 up to +500 and it always crashes or artifacts. Guess I didn't get a good card.... I'll have to settle for 2088 on the core, I guess


----------



## skyn3t

Quote:


> Originally Posted by *Fediuld*
> 
> Now, since this area of the card is not covered by the block, could you tell me which resistors I can try to connect with CLU? I can see 3, two near the power sockets to the right side, and one to the left.
> Do you believe I will benefit at all, or keep the card as is with the overclocks I have obtained and stay there? (ofc I have an itch....)
> 
> Thnx.


You got it right. Just short them all and you're good to go.


----------



## PasK1234Xw

Quote:


> Originally Posted by *steveTA1983*
> 
> So I caved and got an Nvidia branded FE yesterday. Beast of a card, but it keeps throttling. I've been reading through this forum but haven't found a direct answer: any solution (custom bios maybe)? I see it's a common issue with FE cards


Keep it cool. Pascal's thermal downclocks kick in at 51, 61 and 71°C, and it will drop about 10 MHz per point. A hybrid kit will keep it under 60°C no problem; mine never goes over 50°C.

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1
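A quick back-of-the-envelope sketch of those thresholds (the temps and the ~10 MHz step are one owner's numbers from this post, not official NVIDIA figures; real GPU Boost 3.0 bins vary per card):

```python
# Rough sketch of Pascal thermal downclocking as described above:
# the boost clock drops roughly one step at each of 51, 61 and 71 C.
# Thresholds and step size are the poster's observations, not NVIDIA's.

THRESHOLDS_C = (51, 61, 71)
STEP_MHZ = 10

def estimated_boost(base_boost_mhz, gpu_temp_c):
    """Estimate the effective boost clock at a given GPU temperature."""
    steps = sum(1 for t in THRESHOLDS_C if gpu_temp_c >= t)
    return base_boost_mhz - steps * STEP_MHZ

print(estimated_boost(2100, 45))  # below all thresholds -> 2100
print(estimated_boost(2100, 65))  # past 51 and 61 C -> 2080
```

Which is why a hybrid kit holding the card under 51°C keeps the full boost clock.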


----------



## steveTA1983

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Keep it cool. Pascal's thermal downclocks kick in at 51, 61 and 71°C, and it will drop about 10 MHz per point. A hybrid kit will keep it under 60°C no problem; mine never goes over 50°C.
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


Might have to buy that. I've set my fan curve so it doesn't go over 68°C; it hovers around 2065 MHz core, and memory is stable at +400 MHz.


----------



## PasK1234Xw

It's well worth it. You won't have to deal with a fan curve anymore and your card will be virtually silent.


----------



## THEROTHERHAMKID

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Its well worth it you wont have to deal with fan curve anymore and your card will be virtually silent.


How hard is this to fit? Will it fit the 1080 G1?


----------



## skyn3t

I just had a bit of time and decided to update my Ez3flash for the community, covering the 1080s down to the 760s

Code:



ooooooooooo               ooooooo  ooooooooooo o888                        oooo
 888    88  ooooooooooo o88    888o 888    88   888   ooooooo    oooooooo8  888ooooo
 888ooo8         8888       88888o  888ooo8     888   ooooo888  888ooooooo  888   888
 888    oo    8888      88o    o888 888         888 888    888          888 888   888
o888ooo8888 o888ooooooo   88ooo88  o888o       o888o 88ooo88 8o 88oooooo88 o888o o888o   by skyn3t

* Have fun. Don't forget to visit us at
* http://www.overclock.net :)

* Ez3flash - In order to use this tool you must rename the rom to "X.rom"
* and place it in the same nvflash directory, otherwise Ez3flash will fail.

#  1. nvflash --protectoff      " This disable EEprom "
#  2. nvflash --save            " This will save the stock bios or vBios before flash "
#  3. nvflash -4 -5 -6          " Normal Flash "
#  4. nvflash -override -6      " Override GPU ID mismatch "

* Use this options to flash each individual GPU in different order
* and if you have *PLX chip* or if any those command above fail to flash.

#  5.  nvflash -i0 -4 -5 -6      " Flash GPU #1 "
#  6.  nvflash -i1 -4 -5 -6      " Flash GPU #2 "
#  7.  nvflash -i2 -4 -5 -6      " Flash GPU #3 "
#  8.  nvflash -i3 -4 -5 -6      " Flash GPU #4 "

* Use this options to Disable / Enable or Restart Display drivers  in order to flash new bios

#  9. Disable GTX 1080
# 10. Enable  GTX 1080
# 11. Restart GTX 1080

# 12. Disable GTX 1070
# 13. Enable  GTX 1070
# 14. Restart GTX 1070

# 15. Disable GTX 1060
# 16. Enable  GTX 1060
# 17. Restart GTX 1060

# 18. Disable GTX 980
# 19. Enable  GTX 980
# 20. Restart GTX 980

# 21. Disable GTX 970
# 22. Enable  GTX 970
# 23. Restart GTX 970

# 24. Disable GTX 960
# 25. Enable  GTX 960
# 26. Restart GTX 960

# 27. Disable GTX 780
# 28. Enable  GTX 780
# 29. Restart GTX 780

# 30. Disable GTX 770
# 31. Enable  GTX 770
# 32. Restart GTX 770

# 33. Disable GTX 760
# 34. Enable  GTX 760
# 35. Restart GTX 760
# 36. nvflash                   " All you need to know about Nvflash "

Type the number to execute the process=

You can use it even when your display drivers crash; you don't need to restart your rig if it crashes during gaming or a benchmark, only if your system is really messed up. Just use the "Restart" option number and hit Enter; the same applies to all the rest.
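For anyone scripting this by hand, the three nvflash steps the menu automates (protect off, back up, flash) can be sketched as a dry run. The flags come straight from the menu above; the `stock.rom` backup filename is my own placeholder, and this sketch only prints the commands rather than invoking nvflash:

```python
# Dry-run sketch of the flash sequence the Ez3flash menu automates.
# nvflash flags are copied from the menu above; "stock.rom" is a
# placeholder backup name. Nothing is executed, commands are only built.

def flash_sequence(rom="X.rom", gpu_index=None):
    """Return the nvflash commands to back up and flash one GPU."""
    idx = [] if gpu_index is None else ["-i%d" % gpu_index]
    return [
        ["nvflash"] + idx + ["--protectoff"],         # disable EEPROM write protect
        ["nvflash"] + idx + ["--save", "stock.rom"],  # save the stock vBIOS first
        ["nvflash"] + idx + ["-4", "-5", "-6", rom],  # normal flash of the new rom
    ]

for cmd in flash_sequence(gpu_index=0):
    print(" ".join(cmd))
```

Print the list and sanity-check it against the menu before running anything for real.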


----------



## wsarahan

Hi guys, how are you?

Can you tell me how much of a stable OC you can get with 1080 SLI?

With a single card it's way easier to get something stable here; with SLI it's much harder

Thanks


----------



## nexxusty

Quote:


> Originally Posted by *shadow85*
> 
> Will an i7-5930K @ 4.0GHz bottleneck ASUS STRIX GTX 1080 SLI?


No. Absolutely not.

4ghz is crap though. What gives?
Quote:


> Originally Posted by *steveTA1983*
> 
> So I caved and got an Nvidia branded FE yesterday. Beast of a card, but it keeps throttling. I've been reading through this forum but haven't found a direct answer: any solution (custom bios maybe)? I see it's a common issue with FE cards


All GTX 1080s throttle,

aside from stock watercooled variants.

Watercool it. 140mm radiator on mine and it never breaks 42°C. Ever.

Boosts at 2100 MHz solid, whatever I throw at it.


----------



## KickAssCop

Quote:


> Originally Posted by *wsarahan*
> 
> HI guys how are you?
> 
> Can you guys tell me how much can you get stable OC with 1080 SLI?
> 
> With a single card is way easier here to get something stable, with SLI way harder
> 
> Thanks


See my signature. 2050/11000 stable on air.

So I played a ton of games over the weekend to test out the cards. My biggest improvements were in the following games:

Ass Creed Syndicate. The game was as smooth as ever. I don't know what frames I was getting, but getting around the city was butter. In the past, the game did feel like it was hitching a bit every time I used the line launcher across far distances as new buildings loaded in. I went to the settings and saw RAM consumption marked at 7 GB; I can see how the 8 GB benefits me in this game.

Replayed a couple of levels in Doom and my word, what smoothness. At 1440p maxed-out settings the game was butter whether I was doing glory kills, jumping off areas or blasting stuff. The game was hitting 145+ fps virtually all throughout. Previously, in the first levels, I used to see dips to around 120 and sometimes even 110 when doing a glory kill. Now it never goes down. The smoothness was real.

Fired up The Division, and again the game performs fantastically. I notice fps ranging from 75 up to 90 outdoors; indoors I'm hitting about 120 fps. The game runs great, though there's no change in smoothness apart from the higher fps.

I also tried Tomb Raider for a bit in DX 11 and it was fine. No real difference.

Tried StarCraft 2 and saw no real difference. Actually, the card was not boosting fully in that game, which was weird.

BF 1 Open Beta was blissful. 95-120 fps was the norm. Nothing was bringing down frames in that game.

Deus Ex is tons better than on my 980 Ti. Not because of SLI, since I'm not sure SLI is even working for me; I'm only getting 53 fps, so I need to troubleshoot that bit.

All said and done, it is not all that bad. 15-20% was expected. Some games see real gains whilst others are nice to have.

I still have some remorse though, and if FH3 and Gears of War 4 don't support SLI, I will be pissed.

P.S. I am running the cards in OC mode. They boost to 2050 but then back down to 2025 or 2012 depending on the game. I can max out at 2050/11000 without any change in voltage or fan curve. Card temperatures are also decent, with the top card maxing out at 76°C and the bottom at 60°C. Usually in most games they hover around 70°C and 55°C.


----------



## Fediuld

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Keep it cool. Pascal's thermal downclocks kick in at 51, 61 and 71°C, and it will drop about 10 MHz per point. A hybrid kit will keep it under 60°C no problem; mine never goes over 50°C.
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


Pascal's thermal downclocks start from 31°C.

Went to the supermarket last night and left the window open; it was a cold night. Came back to see the card idling at 21°C.
Fired up ESO, and the reported speed was 2189 for quite a while, until it hit 32°C and dropped to 2164.

So I wonder: if I put a modded Predator 240 or 360 between the CPU and GPU in the loop, would that drop the temps more?
(ATM using a Predator 360 and the flow is CPU -> GPU.)


----------



## arrow0309

Quote:


> Originally Posted by *Fediuld*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PasK1234Xw*
> 
> Keep it cool. Pascal's thermal downclocks kick in at 51, 61 and 71°C, and it will drop about 10 MHz per point. A hybrid kit will keep it under 60°C no problem; mine never goes over 50°C.
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1
> 
> 
> 
> Pascal thermal down clocks start from 31.
> 
> Went to the super market last night and left the window open, was cold night. Came back to see the card idle at 21C.
> Fired up ESO, and the reported speed was 2189 for quite a while, until it hit 32C and dropped to 2164.
> 
> So I wonder, if I put a modded Predator 240 or 360 between CPU and GPU flow, would that drop the temps more?
> (atm using a Predator 360 and the flow is CPU -> GPU).
Click to expand...

Good to know
And yes, another 240 (or bigger) rad will help drop the temps in any position, not necessarily between CPU and GPU.
The position is almost irrelevant; the (increased) liquid temp is what matters, so keep an eye on it with a liquid temp sensor


----------



## Fediuld

Cheers

Also, annoyingly, since I refitted the cooler I've got bad coil whine that wasn't there before, neither with the Armor heatsink nor with the waterblock :/
Any thoughts?


----------



## IronAge

i would try hwinfo OSD to display RTSS values and disable AB OSD.
Quote:


> Originally Posted by *libremaster*
> 
> No matter what I do, I can't OC the memory on my MSI GTX1080 Gaming. Tried Afterburner, Precision X, stock voltage, 100% voltage, nothing. Tried all offsets from +1 up to +500 and it always crashes or artifacts. Guess I didn't get a good card.... Will have to settle for a 2088 on the core I guess


I flashed the Gaming X OC Mode BIOS to my Zotac AMP and had the same issues with VRAM overclocking.

Just try the Asus Strix OC T4 BIOS or the Gainward GLH Premium BIOS on your card, and I bet your memory will do at least around 5535.

Most likely it will go even higher; with those BIOSes I've been able to run the VRAM @ 5576 without artifacting and with a good performance gain.

@ Fediuld

Try running 3DMark Ice Storm for about 30 minutes (high FPS; all cards have heavy coil whine during Ice Storm) and then check in games/normal fps whether it got better.

Also remove the heatsink again, check the thermal pads, and put the heatsink back.


----------



## Fediuld

Quote:


> Originally Posted by *IronAge*
> 
> i would try hwinfo OSD to display RTSS values and disable AB OSD.
> I flashed Gaming X OC Modus Bios to my Zotac Amp and had the same issues with VRAM overclocking.
> 
> Just try Asus Strix OC T4 Bios or Gainward GLH Premium Bios on your card and i bet your memory will at least do around 5535.
> 
> most likely it will even go higher. with those bioses i been able to run VRAM @ 5576 without artifacting and with a good performance gain.
> 
> @ Fediuld
> 
> try running 3D Mark Icestorm for about 30 minutes (high FPS , all cards have heavy coil whine during icestorm) and check back with games/normals fps if it got better.
> 
> remove heatsink again, check thermal pads and put heatsink back.


I checked all the thermal pads :/ nothing. I even replaced them, and the paste, this morning.
But thank you, I will try Ice Storm and let you know


----------



## PasK1234Xw

..


----------



## tps3443

I'm playing the BF1 beta, running very smooth maxed out at 4K with FXAA at medium and SSAO instead of HBAO. It maintains about 57-60 fps.

If I run FXAA on high with HBAO, it still manages 45-55 fps.

This is a great card!

Did you guys see Vault 1080 yet?

Nvidia spent 6 months creating a vault named after our beloved GTX 1080s. Apparently it shows off some of the best lighting effects today and uses the best graphics available in Fallout 4!

Vault 1080! New Steam update for Fallout 4. I'm on my way there now, somewhere in the wasteland!

This 1080 is a beast! I thought about the Titan P, but I feel the price isn't worth the small increase. It's like going from a 1070 to a 1080, only it costs $500 more.

My graphics score of 25,721 is not far off the 27,000 a Titan P gets. I'd rather have (2) GTX 1080s.

I know a single faster card is better. But in this case the single faster card is not much better. I could be wrong, but I am super happy with the 4K performance from my 1080, so I don't feel investing any more money is necessary, even for more performance, because the gameplay experience right now is so smooth and impressive!

I just love my GTX 1080 Founders Edition! It is truly a damn beast. 4K gaming is now possible


----------



## Agavehound

Anybody running the EVGA Hybrid AIO waterblock, and how does it perform?

Never mind, the FTW version isn't released yet.


----------



## moustang

Quote:


> Originally Posted by *libremaster*
> 
> No matter what I do, I can't OC the memory on my MSI GTX1080 Gaming. Tried Afterburner, Precision X, stock voltage, 100% voltage, nothing. Tried all offsets from +1 up to +500 and it always crashes or artifacts. Guess I didn't get a good card.... Will have to settle for a 2088 on the core I guess


That is strange.

I have no problems clocking the memory on my Gaming X. I can push the memory as high as 5564 MHz before I start getting artifacts, and I run the core at 2113 MHz as my 24/7 overclock. It's totally stable at 2113 MHz core and 5554 MHz memory.


----------



## KickAssCop

You guys think that 2050/11000 is a decent stable clock for SLI on air?
I think it is pretty good for 1080s.

I know I would like 2100 on the core, but it seems like that is not happening anytime soon. Even if I start at 2100 core, it backs down to 2050 after an hour of gaming, irrespective of how much +X I throw at it in the OC tool (GPU Tweak II or AB, doesn't matter).


----------



## Fediuld

Without having modded the card yet, I managed to hit 2190 at 1.093v (curve fiddling).

However, the performance is the same as at 2177 @ 1.081v, even though the card is clearly running 2190 according to the overlay, with locked speeds on the curve panel.
Any thoughts?


----------



## IronAge

Some sort of error correction could be kicking in ... you are not only overclocking the shader cores/ROPs but also the caches/uncore.

When you OC the VRAM too much, you will see graphics scores decrease before you see visible artifacts.

The memory on my Zotac AMP @ 5576: just fine. @ 5584: graphics score decreasing. And above that: visible artifacts in benchmarks (3DMark Fire Strike Extreme).

The line between loss of graphics score and artifacting seems to be finer than with Maxwell GPUs.
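Since the score falls off before anything visibly artifacts, VRAM tuning here is really a search for the score peak rather than the crash point. A minimal sketch of that idea (the offset/score pairs below are made up for illustration; you would fill them in from your own benchmark runs):

```python
# Pick the VRAM offset that maximizes the benchmark score rather than
# the highest offset that doesn't crash: as discussed above, error
# handling silently costs performance before artifacts ever appear.
# The (offset_mhz, graphics_score) pairs are illustrative, not measured.

def best_offset(runs):
    """Return the memory offset with the highest recorded score."""
    return max(runs, key=lambda r: r[1])[0]

runs = [
    (0,    7950),
    (200,  8100),
    (400,  8220),
    (500,  8150),  # score already dropping: past the useful peak
]
print(best_offset(runs))  # -> 400
```

In other words: log a score per offset and stop where the score turns down, not where the driver crashes.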


----------



## Fediuld

VRAM at 5524 seems to be my optimum. Any step up (+20/+40) or down (-20/-40/-60) and the performance goes down.


----------



## IronAge

There are BIOSes which allow higher VRAM clock rates.

I reach higher beneficial VRAM clock rates with the Palit GameRock Premium / Gainward GLH / Inno3D iChill X3 / Strix OC T4 BIOS than with the original BIOS of my Zotac AMP!, for instance.

I would rather run 2126/5576 than 2152/5544 ... the VRAM clock rate has quite an impact on the graphics score.

VRAM-clock-wise, the worst have been the MSI Gaming BIOSes.


----------



## KickAssCop

Link to Strix OC T4 bios please. I will try it tonight and see what's up!


----------



## IronAge

http://www.mediafire.com/download/22ifgc9yuk72eyg/strix1080xoc_t4.zip

Bear in mind that it removes the power limit, thermal limits and voltage limits too.

There is no clock decrease ... when the GPU gets too warm it will instead terminate the benchmark, reset the driver or start artifacting.

With this BIOS and driver 369.09 I reached my highest graphics scores yet.

But I would only recommend it for 24/7 use when the GPU is in an H2O loop.


----------



## ucode

Quote:


> Originally Posted by *IronAge*
> 
> Some sort of error correction could kick in ...


Note that the nVidia software reports there isn't any ECC support for the memory. Mine seems to drop speed dramatically when hitting 1400 MHz in bandwidth-oriented tests and doesn't get much worse beyond that. I don't remember 100%, but it may even start getting a little faster again.


----------



## steveTA1983

Quote:


> Originally Posted by *IronAge*
> 
> http://www.mediafire.com/download/22ifgc9yuk72eyg/strix1080xoc_t4.zip
> 
> bear in mind that it removes the power limit, thermal limits and voltage limits too.
> 
> no clock decrease ... when gpu gets too warm it will rather terminate the benchmark/reset the driver or start artifacting.
> 
> with his bios and driver 369.09 i reached my highest graphics scores yet.
> 
> but i would only recommend it for 24/7 use when the GPU is in an H2O loop.


Safe to use on FE cards too?


----------



## IronAge

There is some sort of internal data transmission control.

As far as I know, DBI is used for CRC on GDDR5 ... but maybe somebody in here knows better and wants to explain it.


----------



## IronAge

Quote:


> Originally Posted by *steveTA1983*
> 
> Safe to use on FE cards too?


Some have already flashed FEs with that BIOS ... so I'm ready to read your results.


----------



## Bishop07764

Quote:


> Originally Posted by *Fediuld*
> 
> Without having mod the card yet, managed to hit 2190 at 1.093v (curve fiddling).
> 
> However the performance is same as 2177 @ 1.081v. Clearly the card is running 2190 according to overlay, and locked speeds on the curve panel.
> Any thoughts?


I believe others have noticed this too when using the curve. I have scored the same as some who were running 40 MHz higher on the core. Core clock doesn't seem to mean everything once you get to a certain point. Maybe the voodoo that is Boost 3.0?


----------



## Koniakki

Quote:


> Originally Posted by *skyn3t*
> 
> in a very deep hole. sky still alive


We missed you for exactly 15 months, man! I hope all is good.

W-E-L-C-O-M-E BACK buddy!!


----------



## steveTA1983

Quote:


> Originally Posted by *IronAge*
> 
> some already flashed FEs with that bios ... so ready to read your results.


Flashed and testing


----------



## steveTA1983

Got a Heaven score, maxed out @ 1080p, of 3123 and 124.0 FPS. Temps reach 80°C though. Safe????


----------



## steveTA1983

NM, it still throttles down due to power/voltage. Really need a custom BIOS.

3097 and 122.9 fps on the stock BIOS with a max temp of 68°C. Just gonna leave it for now


----------



## GanGstaOne

Quote:


> Originally Posted by *steveTA1983*
> 
> NM, still throttled down due to power/voltage. Really need custom bios


What you need is custom cooling much more than a custom BIOS; Pascal works best with temps topping out around 50°C


----------



## HunterKen7

Installed a Gigabyte G1 Gaming GTX 1080 yesterday!


----------



## steveTA1983

Quote:


> Originally Posted by *GanGstaOne*
> 
> What you need is custom cooling much more than a custom BIOS; Pascal works best with temps topping out around 50°C


Yep, I've noticed that. Maybe one day I'll get a cooler, but for now there's nothing this card can't handle anyway, and I "upgraded" from a Maxwell Titan X (no out-of-pocket cost; I sold the TX for what I paid for the 1080, including taxes)


----------



## GanGstaOne

Quote:


> Originally Posted by *steveTA1983*
> 
> Yep, I've noticed that. Maybe one day I'll get a cooler, but for now there is nothing this card can't handle anyways, and I "upgraded" from a Maxwell Titan X (no out of pocket costs, sold the TX for what I paid for the 1080, including taxes)


Good for you. My 980 burned out just before I got the 1080, so I paid full price + custom cooling. Just my luck I guess.


----------



## steveTA1983

Quote:


> Originally Posted by *GanGstaOne*
> 
> Good for you. My 980 burned out just before I got the 1080, so I paid full price + custom cooling. Just my luck I guess.


Ouch man! Either way, these cards are so worth it. With just a slight overclock, there is no way any single Maxwell card can touch them. For me, I would have been happy if I got $600 for the TX, but no complaints for basically an even swap


----------



## juniordnz

Is a little buzz on the PSU normal? Bad sign? Is it going to die?

The PSU is a XFX TX 850 (seasonic made)


----------



## GanGstaOne

Quote:


> Originally Posted by *juniordnz*
> 
> Is a little buzz on the PSU normal? Bad sign? Is it going to die?
> 
> The PSU is a XFX TX 850 (seasonic made)


Well, if it wasn't there before, then it's probably not good.


----------



## jase78

Finally went with W10 and can actually get Time Spy results for my 1080 Strix. What are the numbers you guys are typically seeing? So far 7350 with 8000 graphics is the best it can do. Is this a pretty average score?


----------



## Deders

Quote:


> Originally Posted by *juniordnz*
> 
> Is a little buzz on the PSU normal? Bad sign? Is it going to die?
> 
> The PSU is a XFX TX 850 (seasonic made)


Is it an electrical buzz? Is the power cable inserted firmly? Can sometimes happen with a loose connection.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Is a little buzz on the PSU normal? Bad sign? Is it going to die?
> 
> The PSU is a XFX TX 850 (seasonic made)


Of course it's a bad sign if it wasn't happening before.

Cmon now...


----------



## juniordnz

Quote:


> Originally Posted by *Deders*
> 
> Is it an electrical buzz? Is the power cable inserted firmly? Can sometimes happen with a loose connection.


Yes, electrical buzz. It's a low noise; I had to get very close to the PSU to hear it clearly.
Quote:


> Originally Posted by *nexxusty*
> 
> Of course it's a bad sign if it wasn't happening before.
> 
> Cmon now...


That's the catch... I used to keep my side panel closed, but since the 1080 and the thermal throttle problem I've been leaving it open. I couldn't have noticed it before.


----------



## schoolofmonkey

Quote:


> Originally Posted by *juniordnz*
> 
> Yes, electrical buzz. It's a low noise, had to get very close to the PSU to hear it clear.


I've experienced this before.
These are all my cards over the last 2 or so years that have done the "power buzz", the only card that didn't was my eVGA GTX980ti Hybrid.






My current GTX1080 Strix does it, but only if you stick your head half an inch from the card; otherwise, side panel on or off, you can't hear it.


----------



## Bishop07764

Quote:


> Originally Posted by *jase78*
> 
> Finally went with W10 and can actually get Time Spy results for my 1080 Strix. What are the numbers you guys are typically seeing? So far 7350 with 8000 graphics is the best it can do. Is this a pretty average score?


That score sounds pretty good. You're right where you should be. I think most are getting in the 7-8k range.


----------



## shadow85

Is it wrong that one of my stock STRIX can reach 73 degC in SLi?


----------



## Deders

Quote:


> Originally Posted by *shadow85*
> 
> Is it wrong that one of my stock STRIX can reach 73 degC in SLi?


The top card usually gets hotter when the heat rises from the bottom one.

If you want to be sure, swap the cards over.


----------



## KickAssCop

Quote:


> Originally Posted by *shadow85*
> 
> Is it wrong that one of my stock STRIX can reach 73 degC in SLi?


My top card also does 70-76 C and bottom 55-60 C depending on games.


----------



## shadow85

Ok so this is normal then for STRIX. I thought I may have bad airflow, but I shouldn't because it is in a 900D case.


----------



## KickAssCop

As long as you are not seeing 83C or higher temperatures, the cards are working fine.


----------



## juniordnz

It may not harm the cards, but you are surely losing 3 or 4 boost bins reaching those temps. I would consider watercooling a must with SLI. Just get a couple of G10s and H80is.


----------



## Derpinheimer

Quote:


> Originally Posted by *IronAge*
> 
> Some sort of error correction could kick in ... you are not only overclocking the shader cores/rops but also the caches/uncore.
> 
> When you oc vram too much you will see graphics scores decreasing before you see visible artifacts.
> 
> Memory of my Zotac AMP @5576 - just fine .. @5584 graphics score decreaseing ... and above visible artifacts in benchmarks. (3D Mark Firestrike Extreme)
> 
> the line between loss of graphics score and artifacting seems to be finer than with maxwell gpus.


For the card I have, I've yet to see any artifacting, only crashes.

+575 yields the best results, but it's artifact-free at +800. Wonder how that works..


----------



## Joshwaa

Woohooo! Was able to order the EK full cover for my 1080FTW. It's up for pre order if any of you are waiting for it. Will post as soon as it arrives.


----------



## juniordnz

Quote:


> Originally Posted by *Joshwaa*
> 
> Woohooo! Was able to order the EK full cover for my 1080FTW. It's up for pre order if any of you are waiting for it. Will post as soon as it arrives.


*envy*


----------



## Joshwaa

Quote:


> Originally Posted by *juniordnz*
> 
> *envy*


Really nice part is it was basically free. Ordered with some bitcoins I had lying around. lol


----------



## juniordnz

Quote:


> Originally Posted by *Joshwaa*
> 
> Really nice part it was basically free. Ordered with some bitcoins I had lying around. lol


*even more envy*









That's nice! Post some before and after temperatures when you get it installed.

Here in Brazil it's too expensive to get a decent custom loop







I'll just have to settle for the G10 + H100i that just got delivered here. Hopefully in a few hours I'll get it set up on my 1080FTW and can post back some results.


----------



## skyn3t

This is after playing a round of BF1. Don't pay attention to the voltage slider, as it doesn't work as it should for me.


----------



## tps3443

I was messing around with the noise levels of my GTX 1080 FE. Anyways, my computer is LOUD! Both of my CPU radiator fans run at full blast in a confined area. So, you hear a lot of turbulence and noise coming from the case. So, my wife started to get sick of the noise, because my PC is in the living room. So, I found out that I can run my 6600K without radiator fans at 4.6Ghz. And it never exceeds 60C gaming, and never gets hotter than about 75C at 100% load. So, I have totally removed the radiator fans. And, my computer is SILENT again!

Except when I play games. I can hear my GTX1080 spin up; it is quiet compared to my now-removed radiator fans. Although I put the card on SILENT mode, which I believe is about 40% max fan speed at 80C. Well, with the card overclocked +235 core and +600 memory, I was playing DOOM 2016 and forgot I even had the quiet mode on. I was playing for about 2 hours, and I realized the card ran at 90C during the entire period of gaming, although my core clocks never went below 1927MHz, which I thought was very impressive! I know I can get more speed out of the card. But the noise is phenomenal! My GTX1080 is totally silent!

So, is 90C ok? It just sticks at 1927MHz. I am playing at 4K, which takes a lot of power. And considering how quiet it is, and how much faster than factory speeds it is.

So, anyway, is 90C ok? Or should I allow more fan RPM on my 1080? I also have a 120mm fan blowing at a very low RPM across the card lengthwise, so air is circulating over the top and bottom of the card.


----------



## Deders

Doom doesn't really heat up a 1080 very much compared to other games.

I'd recommend setting the fans to the fastest they can be whilst still being silent, then experiment with the fan curve whilst playing a more demanding game to get the best noise/heat ratio.

It's much easier to prevent a card from reaching a certain temperature than it is to reduce the temperature once it has been reached.
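Deders' advice amounts to a custom fan curve: pick the fastest still-silent speed as the floor, then ramp before the card reaches the throttle point. A minimal sketch of that idea in Python (the breakpoints are made-up illustrative values, not anyone's actual settings):

```python
def fan_speed(temp_c, curve=((30, 35), (60, 50), (70, 80), (80, 100))):
    """Linearly interpolate fan % from a list of (temp C, fan %) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # pegged at max above the last point

print(fan_speed(65))  # 65.0
```

Afterburner's curve editor does essentially this interpolation between the points you drag; the trick is keeping the floor high enough that the card never climbs into throttle territory in the first place.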


----------



## tps3443

I see, lol. I was wondering why, even at 4K and 8xAA, it was just capped at 60fps. It was smooth though! Ok, I will try Fallout 4; it is definitely more demanding.


----------



## juniordnz

I wouldn't feel comfortable running a 1080 at 90C. The vicious thermal throttle is there for a reason and I believe it should be read as a hint that this GPU is very heat sensitive. The cap on voltage control could also be a "safety measure" to keep people from overvolting the GPU.

Anyway, If I were you I would try to keep my card at least below 70s all the time. 90 is definitely a no no.


----------



## chronicfx

My reference 1080's in SLI have had a HUGE improvement, well one of them. The top card used to get to about 77 degrees at 100% fan speed during Heaven runs, with the bottom card at about 68-70C. I put one of these http://www.microcenter.com/product/284960/SB-F1_Fox-1_System_Blower
in between them. I wouldn't recommend it with an aftermarket card, but since it doesn't block the fan on a reference design, both my cards now top out at 68-70C in Heaven, and even a few degrees lower when gaming. Just a thought; the SLI bridge isn't affected either.


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> I wouldn't feel comfortable running a 1080 at 90C. The vicious thermal throttle is there for a reason and I believe it should be read as a hint that this GPU is very heat sensitive. The cap on voltage control could also be a "safety measure" to keep people from overvolting the GPU.
> 
> Anyway, If I were you I would try to keep my card at least below 70s all the time. 90 is definitely a no no.


The thermal threshold is 95C. And I really don't think there are many Founders Edition 1080's keeping it under 70! LOL

But I myself overclock, so the card is going to run hot, especially when it is overclocked. I keep video cards about as long as I keep oil in my car's engine. I may keep this GTX1080 for the whole life cycle though; I love the card. I'm just going to deal with it and allow some more fan speed.

Out of the box, the reference GTX1080 hits about 81-83C. They run hot. Not too hot, but warmer than some cards.


----------



## Benjiw

Quote:


> Originally Posted by *tps3443*
> 
> The thermal threshold is 95C. And I really don't think there are many Founders Edition 1080's keeping it under 70! LOL
>
> But I myself overclock, so the card is going to run hot, especially when it is overclocked. I keep video cards about as long as I keep oil in my car's engine. I may keep this GTX1080 for the whole life cycle though; I love the card. I'm just going to deal with it and allow some more fan speed.
>
> Out of the box, the reference GTX1080 hits about 81-83C. They run hot. Not too hot, but warmer than some cards.


You don't understand: 9xx cards liked to stay cold, and 10x0 cards are the same. You need to keep them cool or they become incredibly unstable, and GPU Boost starts to mess you around. 95C will see the card doing crazy thermal throttling and all that jazz. People running these cards on water are getting the most out of them tbh.


----------



## chronicfx

Quote:


> Originally Posted by *tps3443*
> 
> The thermal threshold is 95C. And I really don't think there are many Founders Edition 1080's keeping it under 70! LOL
>
> But I myself overclock, so the card is going to run hot, especially when it is overclocked. I keep video cards about as long as I keep oil in my car's engine. I may keep this GTX1080 for the whole life cycle though; I love the card. I'm just going to deal with it and allow some more fan speed.
>
> Out of the box, the reference GTX1080 hits about 81-83C. They run hot. Not too hot, but warmer than some cards.


It will not hurt the longevity of the card in terms of performance in new games; all cards are obsolete in three years if you use that metric. But with GPU Boost 3.0, as the newer cards use, the difference between 68C and 83C is going to be about 100MHz on the core clock (just a guess, may not be that bad). So if you want a stable core clock (which I am happy with in my case: it starts out at 2076MHz and settles at 2050MHz during gaming pretty fast), the cooler in the PCI slot made the difference between holding 2050MHz and dropping all the way down to 1988MHz in some instances, even with the thermal allowance set to max. You should download MSI Afterburner and watch your graphs; drag them across the entire screen, then game and benchmark for a couple hours. You will see what we are talking about. Stable temps = stable core speed; lower temps = better core speed. I am happy at 2050, and not sure if spending $250 on a couple of blocks and fittings is going to get me much more. Who knows though.
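The temperature/clock relationship chronicfx describes comes from GPU Boost shedding small clock steps ("bins") as temperature crosses thresholds. A toy model of that behavior, purely for illustration: the ~13 MHz step is the bin size commonly reported for Pascal, but the thresholds below are invented, not NVIDIA's actual table:

```python
BIN_MHZ = 13  # boost-bin step commonly reported for Pascal (assumption)

def boosted_clock(base_boost_mhz, temp_c, thresholds=(37, 54, 63, 71, 80)):
    """Estimate the clock after temperature-based boost bins are shed.

    Each illustrative threshold crossed costs roughly one bin.
    """
    bins_lost = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - bins_lost * BIN_MHZ

print(boosted_clock(2076, 68))  # 2037: three thresholds crossed at 68C
```

The exact numbers vary card to card, but the shape matches what Afterburner graphs show: the clock steps down in fixed increments as temperature climbs, which is why holding a steady temperature holds a steady clock.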


----------



## tps3443

I am just running it how I normally run the card. I have the fan set to a very low speed, until it hits 70C of course and then it ramps to 100%. And my temps seem to stay at 70C. My memory is at 11,600 and my core speeds seem to stay around 2,088 Mhz - 2112 Mhz while gaming.

The performance gains are huge when overclocking a GTX1080. For example in Fallout 4, in this one area of the game at 4K resolution, and default fan profile, not overclocked the card is running around 1685 Mhz-1717Mhz and the temps are at the throttle zone of 82C. Well, in this one area the frame rate drops to 40 FPS.

If I overclock to around 2,088Mhz and 11,600Mhz memory and 100% fan profile, my minimum in this area is 51 FPS.

This is an 11 FPS boost just from overclocking. I am gaining nearly 500MHz and running about 12C cooler, so the difference is HUMONGOUS!

I just wanted to see what type of real-world gains I get. At 1080p gaming with a GTX1080, overclocking is just silly unless you're benchmarking. But there are huge advantages in 4K gaming.

There is nothing better than enjoying gameplay on PC with that 50-60+ frame rate all the time! And the GTX1080 provides this awesome experience.

Some games run stupid easy, and others may need some minor adjusting to get to 60 FPS without affecting graphics quality.

FORZA 6 PC @4K 8XAA ULTRA everything = 100+ FPS

DOOM PC @ 4K 8XAA NIGHTMARE = never seen a dip below 60 FPS vsync on

FALL OUT 4 @4K ULTRA EVERYTHING, GODRAYS=MED, FXAA 60 FPS, dips to 50 sometimes.

OVERWATCH @4K EPIC EVERYTHING, 8XAA, 65-80FPS

BF1 BETA @4K ULTRA EVERYTHING, FXAA LOW, SSAO 57-60+ FPS, if HBAO on it is 49-60+ FPS

It's a hell of a video card for gaming at 4K is all I can say about it.
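The Fallout 4 minimums quoted above (40 FPS stock, 51 FPS overclocked) can be put into percentage terms with a one-liner, which makes the size of the gain clearer than the raw FPS delta:

```python
def pct_gain(before_fps, after_fps):
    """Percentage improvement from before to after."""
    return (after_fps - before_fps) / before_fps * 100

# Minimum FPS in the Fallout 4 spot described above
print(round(pct_gain(40, 51), 1))  # 27.5
```

A ~27% bump in minimum frame rate from an overclock is unusually large; as the post notes, part of it comes from the cooler card also holding higher boost clocks.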


----------



## LiquidHaus

Hey guys, real quick - I'm sure you guys have a consensus on here on which 1080 to get. So, which one to go with? I'm selling my 1070 to my roommate, so I'm gonna get a 1080 this week. And don't say FE.


----------



## tps3443

Quote:


> Originally Posted by *lifeisshort117*
> 
> Hey guys, real quick - I'm sure you guys have a consensus on here on which 1080 to get. So, which one to go with? I'm selling my 1070 to my roommate, so I'm gonna get a 1080 this week. And don't say FE.


I know you said not to recommend a Founders Edition card. But in all honesty, all GTX1080's will overclock about the same and offer the same; the only advantage is better cooling. I know you have a Zotac AMP Extreme, and those cards were pushing $459-$469. You could have achieved the same level of cooling and performance with the budget-oriented $389 Gigabyte G1 Gaming GTX1070 card and saved yourself some money.

(2) 8-pin power connectors, and 8-phase this and that, are not going to help you with regular overclocking one bit!

I would grab the cheapest GTX1080 I can get and RUN WITH IT!

I have a 1080 FE. I sold off my 1070 SC ACX, to get it!

I am pulling roughly 25,600 in 3DMark Fire Strike with this card. It does not run as cool as the other AIB cards. But it's fast!

The GTX1080 Classified had nothing to offer but good cooling. And unless you paid $719.99 or less, you paid more for nothing!

A guy on here got one new for like $684 at microcenter which was a deal! And a nice card for the money.

The Founders Edition cards are built really nicely! They look nice, and you can overclock to over 2,100 core and keep it in the low 70's with a more aggressive fan profile. What more do you need?


----------



## Spiriva

Anyone here who has flashed the "strix1080xoc_t4" BIOS on a 1080 FE and could write a few lines on that experience? Any gains etc.?


----------



## steveTA1983

Quote:


> Originally Posted by *Spiriva*
> 
> Anyone here who has flashed the "strix1080xoc_t4" BIOS on a 1080 FE and could write a few lines on that experience? Any gains etc.?


More heat. It was more stable at higher clocks (only able to do 50MHz more), but still got blocked by the voltage and power limits.


----------



## gothictech13

Hello people this is my first post and I'm a proud owner of the GTX1080 AMP EXTREME edition
here's my video of getting, unboxing, and installing it. Hope you enjoy it


----------



## SauronTheGreat

Quote:


> Originally Posted by *gothictech13*
> 
> Hello people this is my first post and I'm a proud owner of the GTX1080 AMP EXTREME edition
> here's my video of getting, unboxing, and installing it. Hope you enjoy it


Congratulations. Sorry, I did not see the full video, just the unboxing part and then when the 3DMark score comes up. Are you planning to upgrade your CPU etc.? My old build had an i7-3770K @ 4.5GHz and a 980 Ti on a 1080p system. I still remember the max FPS I got in Witcher 3 with all settings maxed was like 70, and when I upgraded to a 6700K @ 4.5GHz I had 90+ FPS with the same 980 Ti.


----------



## tps3443

Quote:


> Originally Posted by *gothictech13*
> 
> Hello people this is my first post and I'm a proud owner of the GTX1080 AMP EXTREME edition
> here's my video of getting, unboxing, and installing it. Hope you enjoy it


I stroke my 1080's too. It's okay.

Nice card, you're gonna love it.


----------



## fat4l

Guys, still no bios tools ?


----------



## gothictech13

Quote:


> Originally Posted by *SauronTheGreat*
> 
> congratulations .. sorry i did not see the full video just the unboxing part then when the 3dmark score comes ... are you planning to upgrade your CPU etc ? because my old build had a i7-3770k @ 4.5ghz and a 980TI on a 1080p system ... i still remember the max fps i got on witcher 3 on all settings maxed were like 70 and when i upgraded to a 6700k @4.5ghz i had 90 + fps with the 980Ti ....


Thank you so much ^-^. Yes, I will upgrade the CPU in about a year maybe. I find it okay so far with my overclock; it runs all my rendering and games really well, and in VR it performs great. It would be great if AMD gets Zen to compete with the enthusiast level from Intel; then maybe I'll upgrade sooner.


----------



## gothictech13

Quote:


> Originally Posted by *tps3443*
> 
> I stroke my 1080's too. It's okay.
> 
> Nice card, your gonna love it.


Who doesn't stroke their GPU these days XD. I do love it, thanks.


----------



## Koniakki

Hey everyone. So I had given an offer for a 1080 FE.

Do you think €600 for a slightly used GTX 1080 FE is good or too much?

It's about €80 less than a new Palit 1080 JetStream (the cheapest I could find).


----------



## Shultzy

So I've been contemplating upgrading my gtx 780 classified to a gtx 1080. I've heard that voltage cannot go past 1.092v, but EVGA still included the evbot port on their new classified cards. I actually still have my evbot and was curious to know if there will be any adjustment to voltage allowed with this tool presently? If not will it be something that can be done in the future?


----------



## tps3443

Quote:


> Originally Posted by *Shultzy*
> 
> So I've been contemplating upgrading my gtx 780 classified to a gtx 1080. I've heard that voltage cannot go past 1.092v, but EVGA still included the evbot port on their new classified cards. I actually still have my evbot and was curious to know if there will be any adjustment to voltage allowed with this tool presently? If not will it be something that can be done in the future?


First of all, that is a HUGE upgrade! If I must say, a GTX 1080 is about as fast as two GTX 980's in SLI. So, coming from a GTX 780, you should be nearly 3x faster in games and benchmarks. Not to mention overclocking the new GTX1080.

I have not heard of the EVBot working on the 1080's, but there are some BIOSes floating around that will allow 1.100-1.250 volts on the Vcore with more power. So you will be fine.

It is a HUGE boost. So, be prepared.


----------



## ShortySmalls

Tossed an EK waterblock on my Gigabyte G1 Gaming GTX 1080 and wow does it run cool, I idle now at 25*c with a full furmark load of just 45*c while running completely silent on a cpu/gpu loop with a single 360mm rad.


----------



## juniordnz

Just started getting some micro green dots, pixel-sized, on the screen, and the error saying that "Windows stopped the device from working". It doesn't even recognize the card.

Boxed it up and I'm returning it to the store tomorrow. Hopefully they won't give me too much trouble getting it replaced.

Wish I could change to Galax...


----------



## Shultzy

Yes, I know it'll be huge, just like the hole it's going to leave in my wallet lol. I just think I'd be a little bummed if I purchased the Classified version and wasn't able to adjust voltage. Otherwise I'll probably just get the FTW version since it'll save me almost $100. I'll be throwing a block on it too once EK releases it. I've been running 2560x1440 resolution and the 780 does the job, but it's starting to show its age.


----------



## IronAge

Quote:


> Originally Posted by *Koniakki*
> 
> Hey everyone. So I had given an offer for a 1080 FE.
> 
> Do you think €600 for a slightly used GTX 1080 FE is good or too much?


It is ok ... if buying a used GPU, I would prefer EVGA over others due to their better support/warranty policy.


----------



## Spiriva

Quote:


> Originally Posted by *steveTA1983*
> 
> More heat. Was more stable at higher clocks (was only able to do 50mhz more), but still got blocked because of voltage and power limit.


Ah, so no real gain at all in flashing this bios then


----------



## steveTA1983

Quote:


> Originally Posted by *Spiriva*
> 
> Ah, so no real gain at all in flashing this bios then


Very small gain, but held back by heat, voltage limit, power limit. Just need to wait it out till an unlocked one arrives


----------



## xTesla1856

Was testing my card for most of yesterday, and I can say that this one is a keeper for me. +220 on the core and +500 on the memory are rock-solid game stable. While the card is still cool, it boosts to 2126, later settling between 2060 and 2088. The waterblock is already on its way.


----------



## ucode

Quote:


> Originally Posted by *steveTA1983*
> 
> Very small gain, but held back by heat, voltage limit, power limit. Just need to wait it out till an unlocked one arrives


? The T4 doesn't have a power limit.


----------



## steveTA1983

Quote:


> Originally Posted by *ucode*
> 
> ? The T4 doesn't have a power limit.


Still shows up in GPU-z for some reason


----------



## ucode

Shouldn't do. Are you confusing power with GPU load %?


----------



## steveTA1983

Quote:


> Originally Posted by *ucode*
> 
> Shouldn't do. Are you confusing power with GPU load %?


No. According to GPU-Z I was reliably hitting voltage and power limit flags. Weird.


----------



## ChaosAD

Received my Inno3D 1080 some hours ago and ran FS of course. Only changed PL and TL to max. Boost at 2050 and max temp 63C with 26C ambient.


----------



## Synthetickiller

For those of you who have water cooled, what option did you take?

The MSI Sea Hawk EK X seems overpriced. ($830+)
The Zotac Arcticstorm is a much better deal. ($720)
GTX 1080 + XSPC/EK/ETC water block. (about the same price as the Zotac, maybe a little more).

I'm having a lot of trouble finding detailed info on WC'ing these cards, since most say it's not necessary.
Anything I should go for or stay away from?


----------



## Fediuld

Quote:


> Originally Posted by *Synthetickiller*
> 
> For those of you who have water cooled, what option did you take?
> 
> The MSI Sea Hawk EK X seems overpriced. ($830+)
> The Zotac Arcticstorm is a much better deal. ($720)
> GTX 1080 + XSPC/EK/ETC water block. (about the same price as the Zotac, maybe a little more).
> 
> I'm having a lot of trouble finding detailed info on WC'ing these cards, since most say it's not necessary.
> Anything I should go for or stay away from?


Get a good GTX1080 and WC it.

I bought an MSI Armor OC; at sub-£600 it was the cheapest non-reference card at OCUK.

Installed the EK block and backplate and attached it to the Predator 360 that cools the CPU. It can do 2177 @ 1.081v; however, 2190 @ 1.093v provides the same performance as the 2177 clock.
The only change was to use Thermal Grizzly pads, which perform more than twice as well as the EK ones, and Kryonaut instead of Ectotherm, which is 50% better.

The card works like a dream with no issues. It gives me 8350 in 3DMark Time Spy. Only hardware-modded 1080s can beat it, with 8500 scores. That says a lot.
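The 2177-vs-2190 result (the same one Fediuld and Derpinheimer reported earlier in the thread) is worth quantifying from the power side. This does not explain why performance is flat, but as a rough CMOS rule of thumb dynamic power scales with f x V^2, so the higher curve point costs measurably more power for no gain. A quick comparison of the two quoted points:

```python
def rel_dynamic_power(freq_mhz, volts):
    """Relative dynamic power, using the f * V^2 CMOS rule of thumb."""
    return freq_mhz * volts ** 2

p_low = rel_dynamic_power(2177, 1.081)   # the 2177 MHz @ 1.081 V point
p_high = rel_dynamic_power(2190, 1.093)  # the 2190 MHz @ 1.093 V point

print(round(p_high / p_low - 1, 3))  # 0.028 -> ~3% more power, same perf
```

That extra ~3% draw also eats into the board power limit, which on Pascal can itself cost boost bins, so the lower point is arguably the better 24/7 setting.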


----------



## Derpinheimer

Quote:


> Originally Posted by *juniordnz*
> 
> Just started getting some micro green dots, pixel size, on the screen and the error saying that "windows stopped the device from working". It doesn't even recognizes the card.
> 
> Boxed it up and returning to the store tomorrow. Hopefully they won't make too much trouble to get it replaced for me.
> 
> Wish I could change to Galax...


Wow that sucks! It reads like you kinda saw this coming? Or was it totally unexpected?


----------



## Derpinheimer

Quote:


> Originally Posted by *Synthetickiller*
> 
> For those of you who have water cooled, what option did you take?
> 
> The MSI Sea Hawk EK X seems overpriced. ($830+)
> The Zotac Arcticstorm is a much better deal. ($720)
> GTX 1080 + XSPC/EK/ETC water block. (about the same price as the Zotac, maybe a little more).
> 
> I'm having a lot of trouble finding detailed info on WC'ing these cards, since most say it's not necessary.
> Anything I should go for or stay away from?


I got an EVGA FTW and used a universal EK VGA Supremacy. Temps are great, but I don't think it's worth the hassle unless you have an existing loop to accommodate it.

Btw, sorry for the double post; I am clumsy on mobile.


----------



## kx11

inno3d released their Hybrid

https://www.techpowerup.com/225664/inno3d-announces-the-geforce-gtx-1080-ichill-black


----------



## ShortySmalls

Seems I got a poor-overclocking GTX 1080. I got the G1 Gaming since it was the cheapest (and the only one in stock when they launched). It overclocks to around +135MHz core and +500 memory; much higher and it either crashes or my benchmark scores go down.

Could this be because it only allows a 108% power limit and only has the one 8-pin connector?


----------



## Synthetickiller

Quote:


> Originally Posted by *Fediuld*
> 
> Get a good GTX1080 and WC it.
> 
> I bought an MSI Armor OC, sub £600 was the cheapest non ref card at OCUK.
> 
> Installed the EK block, backplate and attached it to the Predator 360 that cools CPU. It can do 2177 @ 1.081v, however the 2190 @ 1.093 provides the same perf as the 2177 clock.
> Only change was to use Thermal Grizzly pads, which are more than twice better than the EK ones, and Kryonaut instead of Ectotherm. Which is 50% better.
> 
> Card works like dream with no issues. It gives me 8350 at 3D Mark Spy. Only hardware modded 1080s can beat it with 8500 score. That says a lot.


Good to know. I see that EK supports a lot of cards. I'll have to price everything out and see what happens.
I'm wondering if the Zotac card is the cheapest; maybe that's just the best way to go, assuming it's the lowest price.
Quote:


> Originally Posted by *Derpinheimer*
> 
> I got an EVGA FTW and used a universal EK VGA Supremacy. Temps are great, but i don't think it's worth the hassle unless you have an existing loop to accommodate it.
> 
> Btw sorry for double post i am clumsy on mobile


I have a dual 180mm rad. It does a great job cooling my 4790K & GTX 690. Temps would only drop even more with a single 1080. That's why I brought up WC'ing in the first place.


----------



## Rikuo

Quote:


> Originally Posted by *Synthetickiller*
> 
> For those of you who have water cooled, what option did you take?
> 
> The MSI Sea Hawk EK X seems overpriced. ($830+)
> The Zotac Arcticstorm is a much better deal. ($720)
> GTX 1080 + XSPC/EK/ETC water block. (about the same price as the Zotac, maybe a little more).
> 
> I'm having a lot of trouble finding detailed info on WC'ing these cards, since most say it's not necessary.
> Anything I should go for or stay away from?


I ended up getting 2x MSI 1080 Gaming X and putting EK blocks on them. Unfortunately, I ordered mine before they announced the Sea Hawk EK, or I probably would have just picked that up lol.


----------



## steveTA1983

Ok, so I've been hitting the sauce a little (well, a lot), and while I'm more than pleased with the performance of this card, I am kind of ticked off. Nvidia knew what it was capable of, and they gimped it with the FE (I got mine at Best Buy). It's a beast. Stock, it's better than my previous BIOS-modded Maxwell Titan X, but the power delivery (one 8-pin) cripples it. I mean come on, Nvidia. I get it, you want to make money. But why cripple your "gold standard FE" and let the third-party companies kick your butt? Especially only a few months after release? No one gives a crap about efficiency if they are dropping $699 on a video card!

What we have is a monopoly. Did Nvidia deliver a killer product? Yes. Did they overcharge? Hell yes (mine was basically an even swap, not complaining). What I am mad about is that this card IS capable of new Titan X performance, but they knew from day one it would release soon after the 1080 and they wanted a quick cash grab. Both cards (1080 and new TX) are monsters, but the fact of the matter is the new TX is nothing more than a 1080 w/12GB GDDR5X and two 8-pin power connectors.

I work for a company that has morals. We do not operate this way. Competition is good not only for a great product, but for ensuring the customers are pleased. Nvidia has control and couldn't care less. AMD is toast and Nvidia knows it, and we are the ones who will suffer (from our wallets). The 1080 should really be a $499 card and the 1070 a $299 card. God knows they outsell AMD like crazy.

With that said, the 1080 FE performance is nuts. I just hate that in 3 years a flagship card is going to cost as much as a mortgage payment.


----------



## ucode

Quote:


> Originally Posted by *steveTA1983*
> 
> No. According to GPU-z I was hitting reliable voltage and power limit errors. Weird.


Can get those reports with 15W, GPU clock less than 1GHz, Mem clock less than 100MHz and voltage less than 1V


Perhaps the power limit report is a little buggy. The T4 will allow more than 300W on a cross-flashed FE, so be careful with that in mind.


----------



## tps3443

Quote:


> Originally Posted by *Synthetickiller*
> 
> For those of you who have water cooled, what option did you take?
> 
> The MSI Sea Hawk EK X seems overpriced. ($830+)
> The Zotac Arcticstorm is a much better deal. ($720)
> GTX 1080 + XSPC/EK/ETC water block. (about the same price as the Zotac, maybe a little more).
> 
> I'm having a lot of trouble finding detailed info on WC'ing these cards, since most say it's not necessary.
> Anything I should go for or stay away from?


You should be able to squeeze out a little more. I am in the 25,000 graphics-score range with my MSI GTX 1080 FE. Your card has superior cooling, so it will do fine.


----------



## tps3443

Quote:


> Originally Posted by *steveTA1983*
> 
> Ok, so I've been hitting the sauce a little (well, a lot), and while I'm more than pleased with the performance of this card, I am kind of ticked off. Nvidia knew what it was capable of, and they gimped it with the FE (I got mine at Best Buy). It's a beast. Stock, it's better than my previous BIOS-modded Maxwell Titan X, but the power delivery (one 8-pin) cripples it. I mean, come on Nvidia. I get it, you want to make money. But why cripple your "gold standard FE" and let the third-party companies kick your butt? Especially only a few months after release? No one gives a crap about efficiency if they are dropping $699 on a video card!
> 
> What we have is a monopoly. Did Nvidia deliver a killer product? Yes. Did they overcharge? Hell yes (mine was an even swap basically, not complaining). What I am mad about is that this card IS capable of new Titan X performance, but they knew from day one it would release soon after the 1080 and they wanted a quick cash grab. Both cards (1080 and new TX) are monsters, but the fact of the matter is the new TX is nothing more than a 1080 with 12GB of GDDR5X and two 8-pin power connectors.
> 
> I work for a company that has morals. We do not operate this way. Competition is good not only for making a great product, but for ensuring customers are pleased. Nvidia has control and couldn't care less. AMD is toast and Nvidia knows it, and we are the ones who will suffer (in our wallets). The 1080 should really be a $499 card and the 1070 a $299 card. God knows they outsell AMD like crazy.
> 
> With that said, the 1080 FE performance is nuts. I just hate that in 3 years a flagship card is going to cost as much as a mortgage payment.


Hey bud, your card is not gimped one bit! The GTX 1080 uses about 165 watts stock under load, and once it is overclocked it pushes 200-215 watts. The PCI-E slot can provide 75 watts, while the 8-pin can provide 150+ watts. So you can easily feed a GTX 1080 with 225+ watts. An 8-pin is going to pull as much power as the card asks for.

The 8-pin is not the PROBLEM. The card is limited by a 120% power limiter, meaning the GTX 1080's BIOS caps how much power it can utilize, even though more power is available to be pulled. That also makes it tough to hold a consistent speed while overclocked. Now, the GTX Titan X required an 8-pin and a 6-pin because it is an older architecture and uses more power. People have gotten past the power limiter by tricking the card into thinking it is using less power than it actually is, overclocking into the 2.4-2.5GHz range and just blowing the doors off of benchmarks. But if the GTX 1080 had a higher power limiter, that would not help the refresh release of the faster GTX 1080 Ti 12 months down the road, now would it?
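The back-of-the-envelope math here can be sketched out. This is only an illustration, assuming the commonly cited 180 W reference board power and the nominal connector ratings; none of these numbers are measured from any particular card:

```python
# Rough sketch of the power-budget argument above. All wattages are
# nominal spec ratings, not measurements from a particular card.
PCIE_SLOT_W = 75         # PCI-E x16 slot is specified for up to 75 W
EIGHT_PIN_W = 150        # one 8-pin PCI-E connector is rated for 150 W
REFERENCE_TDP_W = 180    # commonly cited GTX 1080 reference board power
POWER_LIMIT_PCT = 120    # maximum power-limit slider on the FE BIOS

available = PCIE_SLOT_W + EIGHT_PIN_W                # what the wiring can supply
bios_cap = REFERENCE_TDP_W * POWER_LIMIT_PCT // 100  # what the BIOS will allow

print(f"connectors can supply: {available} W")       # 225 W
print(f"BIOS power cap:       {bios_cap} W")         # 216 W
print(f"BIOS-limited before connector-limited: {bios_cap < available}")
```

In other words, the BIOS cap (216 W) sits below what the slot plus one 8-pin can already deliver in spec, which is the point being made: the limiter, not the connector, is the wall.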

I am keeping my GTX 1080 through its 20-24 month life cycle; looking at the performance of the new 2016 GTX Titan X (big Pascal), it is not enough of an improvement. I use my 1080 for 4K gaming, and it is just amazing what it can do... minimum frame rates of about 34 and averaging 60.

It is not the single 8-pin that is holding the GTX 1080 FE back. I have an MSI GTX 1080 FE, and I am pulling a 25,600 Fire Strike graphics score.

Now, I must say, looking at the GTX 980 when it was first released, it was a great gaming card! And just look at its life span: 22 months before the GTX 1080 was released. I am the type of person to upgrade a video card almost every week lol.

In a period of 40 days, I purchased an AMD RX 480, then a GTX 1070, and then a GTX 1080. Now I am happy right where I am. To obtain more power, only 7-9 fps more in 4K gaming, I would need to spend $500 more on a Titan X Pascal. That is unreasonable, for me anyways.


----------



## HunterKen7

Hey guys, I want to get your thoughts on this scary situation I had with my new Gigabyte GTX 1080 G1:

After dinner last night, I fired up DOOM (2016) and after about 5 minutes I noticed my FPS was tanking hard. It slowly fell to about 20fps. I exited out and checked GPU-Z to see that the fans hadn't turned on AT ALL while playing!









I didn't have any OC software running (or even installed) at the time. All I had was GPU-Z running in the background. I decided to reboot my PC and, as soon as it exited Windows, you could hear the fans roar up to speed.

Earlier in the evening I was messing around with Gigabyte's Xtreme Gaming Engine, but frustratingly uninstalled it because it refused to control the LED light. Do you think remnants of that app's settings were affecting the fan curve even with it uninstalled?!

I'm worried it will happen again and cause some damage. Things are OK this morning, though; the fans spin up fine when I run some tests. Oh, and the Xtreme Gaming Engine all of a sudden controls the LEDs again somehow.


----------



## SweWiking

I got two EVGA 1080 FE cards. One OCs very well (2200MHz) and the other not so well (2050MHz). Could I go ahead and just flash the weaker card with the Strix T4 BIOS and leave the good card alone with the original FE BIOS?

Hopefully this will allow me to overclock the weaker card a few more MHz; I would be happier running at 2100MHz.

So the question is: is it dumb to flash just "the bad" card for some reason? The good card can already do 2200MHz at default voltage.


----------



## juniordnz

After having an EVGA card fail on me with less than 1 month of use, I'm considering switching to MSI.

Can anyone speak to the build quality differences between EVGA and MSI? I know MSI advertises a lot about "military grade components", but does that really mean the card itself will last longer?


----------



## DStealth

No


----------



## alton brown

Hey guys! I need some opinions and advice again. What's going to pair better with the 1080: the ASUS ROG PG279Q or the new ViewSonic XG2703-GS? Does anyone feel one company is better than the other? Or can someone recommend another brand of monitor?

Thanks!


----------



## SauronTheGreat

Can anyone help me out with The Witcher 3? At 4K, if you turn off AA and the AA for HairWorks, how much is graphics quality lowered? I see no difference, but FPS is a little lower when they are on. BTW, I am not talking about HairWorks itself, just the HairWorks AA.







.....


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> After having an EVGA card failing on me with less than 1 month of use, I'm considering shifting to MSI.
> 
> Can anyone say anything about build quality differences between EVGA and MSI? I know MSI advertises a lot on "military grade components", but does that really mean the card itself will last longer?


I have an MSI GTX 1080 Founders Edition now, and it replaced an EVGA GTX 1070 SC ACX. So I have experience with both brands, and all of this happened over a period of about a month.

The EVGA looked beautiful when it came out of the box, but both of the ACX cooler fans made a clicking noise that got louder the faster the fans spun. They were not rubbing on anything; it was something going on internally inside the fan, so I couldn't easily take them apart. The card also exhibited some extreme flex, not sure if it was the weight or what. I sold the card after a short ownership of only about 2 weeks. Right before I sold it, the fans started to fail and the card was running a little hotter than usual. It was fast, and performed great for a GTX 1070, but it was a little disappointing.

The MSI GTX 1080 FE is a work of ART! Superior build quality, nice smooth fans that don't click, Japanese caps with a super long life span, and the card is super stiff when mounted. You can literally press on the far corner when it is mounted, and it stays stiff as a board. I know this is not important, and shouldn't even be seen as an issue or a comparison, but when an 11.5" long glowing video card is sagging down, it looks like something is wrong lol. And the MSI just does not SAG! I have never heard any coil whine either, or any other noises from this card for that matter. Everything is perfect about it and I am extremely satisfied with my MSI 1080.

That is not to say there aren't people with perfect EVGA cards, but mine was sealed when I received it locally and it definitely had issues: it made clicking noises, sometimes sounded like a power transformer, and the fans started to fail at the early age of only a few weeks old.


----------



## tps3443

Quote:


> Originally Posted by *DStealth*
> 
> No


Something is definitely not right. I play Doom on Nightmare, everything maxed out with the highest anti-aliasing available, at 4K resolution, and I have never dipped below 54 fps, maybe not even 57 fps, and that is rare! Now, I am still only on the 2nd level, so I am not sure what is to come later in the game that may be more demanding. But the GeForce Experience FPS counter is always on the top left of my screen and it just sticks at 60 frames per second 95% of the time. It is really kind of silly how easily this game runs.

I play Fallout 4 on Ultra everything with TXAA at 4K, with only god rays turned down to LOW and view distances maxed out. The game averages about 55-60fps and dips to 38fps sometimes. It is much more demanding than Doom.


----------



## tps3443

Quote:


> Originally Posted by *SauronTheGreat*
> 
> Can anyone help me out with The Witcher 3? At 4K, if you turn off AA and the AA for HairWorks, how much is graphics quality lowered? I see no difference, but FPS is a little lower when they are on. BTW, I am not talking about HairWorks itself, just the HairWorks AA.
> 
> 
> 
> 
> 
> 
> 
> .....


Hey, I am curious: with every option available maxed out in The Witcher 3, with NO AA turned on, at 4K, what average fps do you get? And what is the minimum?

Is it playable at 4K with everything maxed out, HairWorks etc., and AA turned off?


----------



## SauronTheGreat

Quote:


> Originally Posted by *tps3443*
> 
> Hey, I am curious: with every option available maxed out in The Witcher 3, with NO AA turned on, at 4K, what average fps do you get? And what is the minimum?
>
> Is it playable at 4K with everything maxed out, HairWorks etc., and AA turned off?


It's completely playable even with all types of AA on; I get FPS between 35 and 42. But I find no difference in graphics quality whether AA is on or off...


----------



## tps3443

Well, at 4K I think AA is unnecessary. And anti-aliasing will slow the GTX 1080 more than other cards, since it only has a 256-bit memory bus.

That is more than playable performance though. Especially with a game like Witcher 3.

The GTX1080 just impresses me more and more everyday of how capable of a 4K card it really is.


----------



## boredgunner

Quote:


> Originally Posted by *tps3443*
> 
> Well, at 4K I think AA is unnecessary. And anti-aliasing will slow the GTX 1080 more than other cards, since it only has a 256-bit memory bus.
> 
> That is more than playable performance though. Especially with a game like Witcher 3.
> 
> The GTX1080 just impresses me more and more everyday of how capable of a 4K card it really is.


Aliasing is still extremely visible in most modern games at 4K, especially The Witcher 3, which needs to be run at 5K or more to not look like a mess (but even then other visual aspects drag it down). That 256-bit GDDR5X is clocked at 10000 MHz effective after all, so the memory bandwidth is actually very good, bested only by a few cards (R9 390/390X, R9 Nano/Fury/Fury X, Titan XP).

Plus, TAA and the mostly worthless MLAA/SMAA/FXAA all have essentially zero performance impact on modern GPUs. MSAA runs terribly in most modern games due to deferred rendering, and it is quite useless anyway since the vast majority of aliasing is shader-based, not geometry-based, so forget about MSAA unless you're playing an older game. Without deferred rendering, MSAA has next to no performance impact on modern high-end GPUs, so you can use as much as you want in older titles (although you're better off using supersampling).


----------



## LiquidHaus

Consider me part of the club!



Feels good man


----------



## tps3443

Quote:


> Originally Posted by *lifeisshort117*
> 
> Consider me part of the club!
> 
> 
> 
> Feels good man


The power is a nice thing to have on tap.


----------



## LiquidHaus

Yes it does lol.

Does anyone happen to know the stock FTW thermal pad thickness?

I'm looking to get some better thermal pads and paste on the ACX cooler in the meantime before Watercool releases their block.


----------



## tps3443

Quote:


> Originally Posted by *lifeisshort117*
> 
> Yes it does lol.
> 
> Does anyone happen to know the stock FTW thermal pad thickness?
> 
> I'm looking to get some better thermal pads and paste on the ACX cooler in the meantime before Watercool releases their block.


2mm thick approximately. This is a very close guesstimate from my own findings.


----------



## Martin778

Got my first 1080 SC today. Overclocks are disappointing: only +35MHz in EVGA Precision, resulting in a 2012MHz GPU clock, and that's all folks.









A big step forward is that the 1080 with ACX 3.0 is dead quiet compared to the 980 Ti with ACX 2.0.


----------



## Tdbeisn554

I am pretty sure I am buying a GTX 1080 now, just not really sure which model. The Asus 1080 Strix is really nice and would go really well with my Formula, but it is kind of expensive and I can't find any place to buy it. The EVGA 1080 FTW looks good too, but I am not sure it would match my build; at least I could buy it from the EVGA store for a decent price. Or the Classified, just because it looks awesome, but again, not in stock...


----------



## boredgunner

Quote:


> Originally Posted by *lifeisshort117*
> 
> Consider me part of the club!
> 
> 
> 
> Feels good man


That picture caused a bodily reaction in me that I can't describe here due to the ToS.


----------



## tps3443

Check this out! This is perfect for a GTX 1080. It is not 4K, and it's not 1440p either.

A 3840x1600, 38" curved IPS panel.

New resolution?

http://www.lg.com/us/monitors/lg

I finished reading the specs. For that price it should be 100Hz and have G-Sync. The X34P is a better value at about $1,000-1,100.


----------



## boredgunner

Quote:


> Originally Posted by *tps3443*
> 
> Check this out! This is perfect for a GTX1080. It is not 4K, and it's not 1440P either.
> 
> A 3840x1600, 38" curved IPS panel.
> 
> New resolution?
> 
> http://www.lg.com/us/monitors/lg
> 
> I finished reading the specs. For that price it should be 100Hz and have G-Sync. The X34P is a better value at about $1,000-1,100


It would need G-SYNC to be any good with the GTX 1080, since frame rates will be all over the place. For me, the GTX 1080 is good for 2560 x 1440 but not perfect. Perfect for me is being able to maintain 120 FPS or more so that I can use blur reduction in every game I play. We're not even close to that of course.


----------



## Synthetickiller

I have a unique situation. I think my general "what's the best gtx 1080 for WC'ing" question was too vague.

I'm going to be AC'ing a media closet that'll house my rig and maybe 2 others (the other two won't generate much heat). If I had to guess, the closet should sit between 50-60°F.
On top of that, I'll be watercooling (I currently have a loop, so why not use it?). Temps should be very, very good. We're obviously not talking sub-zero, but I should gain a little headroom from such low ambient temps. I've read about 100 pages of this thread and have a fairly decent understanding of the flashing and its benefits. I'm interested in doing some flashing, specifically the XOC BIOS, unless my OC won't improve.

For my situation, I'm sort of stumped between the following options:

1. Just get a Founders Edition and a water block. The argument is that any OC that isn't sub-zero won't benefit from extra voltage (like the 1.2v BIOS flash), and water blocks are easy to source.
2. Get something like an MSI Armor or Asus Strix (or really any 8+6 power setup). I know the Asus Strix XOC BIOS disables one DisplayPort since the card comes with DP x2 and HDMI x2. I'd need to find a compatible water block if I go with this option.
3. The MSI Sea Hawk EK X or Zotac ArcticStorm, which are nice as drop-in options. The Zotac should be the cheapest, but I can't find much info on what clocks people are getting. The MSI EK X is $100 more and I don't see how that's justified; it costs far more than many custom cards plus a 3rd-party water block.

Some people say to always buy EVGA. I've had excellent luck OCing and pushing cards on air/water without killing anything. My Asus GTX 690 was bought used and it's always been solid. I also ran MSI GTX 570 Twin Frozr cards with great success. I'm not brand loyal; I just want a good setup without spending ridiculous amounts of money for 1 or 2 percent more performance.

Thoughts?


----------



## Fediuld

Quote:


> Originally Posted by *Synthetickiller*
> 
> Another option is to get something like an MSI Armor or Asus Strix (or really any 8 + 6 power setup). I know the Asus Strix XOC bios disables a single display port since it comes with DP x2 & HDMI x2.
> I need to find a compatible water block if I go with option 2.
> 
> Thoughts?


I can only write from my own experience.

EK makes a block for the MSI Armor/Gaming. I have the Armor watercooled and it runs perfectly. Without hard modding it (even with CLU shorting) it can do 2177MHz @ 1.081v and 8350 on Time Spy.
2190MHz @ 1.093v doesn't improve performance compared to the previous clock. And that's with +520 on the VRAM.

No other modifications to the card; just by replacing the EK Ectotherm paste and pads with Thermal Grizzly Kryonaut and Minus Pad 8, temps dropped 13°C, to a 35°C max on the core. And my loop is just a Predator 360.


----------



## tps3443

Quote:


> Originally Posted by *boredgunner*
> 
> It would need G-SYNC to be any good with the GTX 1080, since frame rates will be all over the place. For me, the GTX 1080 is good for 2560 x 1440 but not perfect. Perfect for me is being able to maintain 120 FPS or more so that I can use blur reduction in every game I play. We're not even close to that of course.


Shoot! Perfect for me is 35-70 FPS at 4K maxed out, and that is exactly what this GTX 1080 provides. Some games are better than others, some are worse, but most seem to fall right around the 60 average.


----------



## Fediuld

Quote:


> Originally Posted by *boredgunner*
> 
> It would need G-SYNC to be any good with the GTX 1080, since frame rates will be all over the place. For me, the GTX 1080 is good for 2560 x 1440 but not perfect. Perfect for me is being able to maintain 120 FPS or more so that I can use blur reduction in every game I play. We're not even close to that of course.


Samsung said that their CF791 will come with G-Sync as well some time in the future (it's currently FreeSync only, over HDMI 2.0 as well). That's the 1500R 34" 3440x1440 @ 100Hz quantum dot monitor.


----------



## tps3443

Hey everyone, I had a quick question. I am running a 6600K at 4.82GHz, RAM @ 2345MHz CAS 13, and a GTX 1080 overclocked to around 2,100MHz core and 11GHz memory.

The problem is, I checked my GPU usage in Fallout 4, and it is all over the place! In an easy scene it stays at 100%, but when the FPS drops in something more demanding, it starts bouncing between 88-94% and back to 100%. Is this normal?

Am I really bottlenecked with a 6600K at 4.82GHz?
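The pattern described above is the usual rule of thumb for spotting a CPU bottleneck: a GPU pegged at 100% is itself the limit, while utilization that keeps dipping as FPS drops usually means the CPU can't feed it fast enough. A hypothetical helper (the 95% threshold and 25% dip ratio are illustrative guesses, not from any tool's documentation) encodes that heuristic over logged samples:

```python
# Hypothetical heuristic: classify a run of GPU-utilization samples
# (percentages, e.g. logged from GPU-Z) as GPU-bound or CPU-bound.
# The 95% threshold and 25% dip ratio are illustrative guesses.
def likely_bottleneck(gpu_util_samples, threshold=95, dip_ratio=0.25):
    dips = sum(1 for u in gpu_util_samples if u < threshold)
    return "CPU" if dips / len(gpu_util_samples) > dip_ratio else "GPU"

print(likely_bottleneck([100, 100, 99, 100]))         # steady load: GPU-bound
print(likely_bottleneck([100, 92, 88, 94, 100, 90]))  # bouncing: CPU-bound
```

Applied to the numbers in the post, usage bouncing between 88-94% in demanding scenes would classify as CPU-bound in those scenes.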


----------



## RJacobs28

Don't go freaking out at Fallout 4's expense. That title is unnecessarily CPU hungry.


----------



## Martin778

Are there any BIOS mods for the 1080 yet? I'm severely disappointed watching how the card gets temperature limited. A bench starts at 2025MHz and after a few minutes drops to 1974MHz.

Boosting the fan to 70% brings the clock back to 2GHz, and if you pump the fans up to 84% it gets back to 2025MHz, with temps dropping to 59°C.
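For context on why fan speed buys those clocks back: Pascal's GPU Boost sheds clock roughly one ~13 MHz bin at a time as core temperature crosses successive thresholds. A toy model of that behavior (the threshold temperatures here are illustrative guesses, not NVIDIA-documented values):

```python
# Toy model of Pascal GPU Boost temperature throttling: one ~13 MHz bin
# is shed each time the core temperature passes another threshold.
# Threshold values are illustrative, not from NVIDIA documentation.
BIN_MHZ = 13

def boosted_clock(max_boost_mhz, temp_c, thresholds=(60, 63, 66, 69, 72)):
    bins_shed = sum(1 for t in thresholds if temp_c >= t)
    return max_boost_mhz - bins_shed * BIN_MHZ

print(boosted_clock(2025, 55))  # cool card holds full boost: 2025 MHz
print(boosted_clock(2025, 70))  # warm card sheds 4 bins: 1973 MHz
```

With these made-up thresholds, a card warming from 55°C to 70°C sheds four bins, 2025 to 1973 MHz, which is roughly the 2025-to-1974 drop described above; keeping temps below the first threshold (here via fan speed) holds the full boost bin.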


----------



## Cornerer

Quote:


> Originally Posted by *tps3443*
> 
> Hey everyone, I had a quick question. I am running a 6600K at 4.82GHz, RAM @ 2345MHz CAS 13, and a GTX 1080 overclocked to around 2,100MHz core and 11GHz memory.
>
> The problem is, I checked my GPU usage in Fallout 4, and it is all over the place! In an easy scene it stays at 100%, but when the FPS drops in something more demanding, it starts bouncing between 88-94% and back to 100%. Is this normal?
>
> Am I really bottlenecked with a 6600K at 4.82GHz?


Yes, some games will definitely bottleneck i5s; it's pretty much proven all over the net.
If you want no compromises, the 6700K is the one to choose for now, though the 7700K Kaby Lake is just around the corner.


----------



## perkeleprkl

nevermind...


----------



## fitzy-775

So my Gigabyte G1 Gaming 1080 started playing up on me today, artifacting in all the games I'm playing, and I don't have the card overclocked at all. It has been working fine up until now. Does anyone know how I can fix this?


----------



## tps3443

To perkeleprkl:

The card's life span will be fine, which is like 8-12+ years on an overclocked video card. It will overclock a little better too, so just keep tinkering!
It is amazing how durable and tough these components actually are.

Your GTX 1080 is just fine.


----------



## tps3443

Quote:


> Originally Posted by *Cornerer*
> 
> Yes, some games will definitely bottleneck i5s; it's pretty much proven all over the net.
> If you want no compromises, the 6700K is the one to choose for now, though the 7700K Kaby Lake is just around the corner.


Wow! Considering my overclock, I guess I've just got too much GPU power?

I want all of the performance my GTX 1080 can offer, and I play Fallout 4 more than anything, so I'm trying to justify an upgrade.


----------



## trivium nate

AMD FX-8350 Vishera-4GHz(stock)(8-CORE)//Corsair-H50//ASUS M5A99X EVO-990X Mobo
EVGA GTX-980(SLI)SC(4GB GDDR5)//256GB SSD//8TBHDD(x2)//G.SKILL Ripjaws X-24GB
DVD-RW//BLU-Ray(Drive's)//1000 Watt Corsair PSU//(55"TCL-4K-TV)//55"Insignia-TV

Would it be worth it to go to a 1080, the EVGA 08G-P4-6284-KR GeForce GTX 1080 FTW DT edition? Would I notice an increase in FPS?


----------



## Deders

Only if you find 4GB limiting. Would be a lot to spend for a Vram upgrade.


----------



## Cornerer

Quote:


> Originally Posted by *trivium nate*
> 
> AMD FX-8350 Vishera-4GHz(stock)(8-CORE)


Any current AMD chip is already an issue due to its poor architecture. It doesn't matter whether you've got high clock speeds or whether a given game bottlenecks; you're still losing ~15fps or more in gaming compared to even an i5.

I would say ignore your graphics card for now and aim for an Intel CPU and mobo first if you must switch now, or preferably wait a few months to see how AMD's upcoming Zen chips and Intel's Kaby Lake perform.
This way you should get considerable fps gains at, at most, just above half the price of a GTX 1080.


----------



## trivium nate

Yeah, I play at 4K; games like Deus Ex and Mirror's Edge and stuff like that need more.


----------



## trivium nate

Of course, the issue is my mobo and CPU.

What about just this CPU?

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113347


----------



## Fediuld

Quote:


> Originally Posted by *trivium nate*
> 
> of course the issue is my mobo n cpu
> 
> what about just this cpu?


Currently AMD doesn't have any CPU better than that one. Zen is coming out in a few months, and it will require a new motherboard.

I would say spend some money on cooling and clock it to 4.8GHz; most can take it, and you'd get a 20% performance boost compared to now.


----------



## trivium nate

Yeah, I've got a Corsair H50 water cooler. I updated my earlier post with the CPU link I forgot.


----------



## GunnzAkimbo

Please smash this score with a *SINGLE 1080*.

The cheapest card I can find is the Gainward Phoenix 1080 ($999 AUD); that's the first 1080 I've seen at three digits.

I need a video card upgrade, but it has to be something worthy of an upgrade (I may stick to 1070 territory).


----------



## ChaosAD

That's easy to beat. I'm at 23K with everything stock except the power and temp limits.


----------



## Fediuld

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> Please smash this score with a *SINGLE 1080*.
> The cheapest card I can find is the Gainward Phoenix 1080 ($999 AUD); that's the first 1080 I've seen at three digits.
> I need a video card upgrade, but it has to be something worthy of an upgrade (I may stick to 1070 territory).


Nothing special needed for that GPU score; my MSI Armor OC does 25144 at a 2164MHz clock (I haven't run the bench at 2190 core).

http://www.3dmark.com/fs/9883027

And here is Fire Strike Extreme with a 12075 GPU score.

http://www.3dmark.com/fs/9892724


----------



## Synthetickiller

I've been reading more and I get the following impressions:

- No card is better than another from an OC standpoint; we are all players in the silicon lottery.
- Single 8-pin cards OC just as well as 8+6 or 8+8 pin.
- EVGA cards seem to have a lot of coil whine. (I know this is going to rustle some jimmies, lol)
- The ability to flash almost any BIOS to any card means I can overcome poorly OCing BIOSes.

Based on all of this, price-wise I'm stuck between these three 1080s (bonus 4th):

- Zotac ArcticStorm ($725 shipped, but OOS) + 90° fitting ($10) = $735 final price
- Asus ROG Strix (non-OC) ($630) + EK Strix water block (OOS currently) ($150) = $780 shipped, or Bitspower ($135) = $765 shipped
- MSI Sea Hawk EK X = $815 shipped
- Any $700 Founders Edition + Watercool HEATKILLER IV = about $810

Both the MSI and Zotac seem to be the better deals, as they come with backplates; that adds another $40 to the cost of any other card. Heatkiller makes a really nice block for the Founders Edition, so that competes with the MSI EK X price-wise.

I just feel like I'm at an impasse and can't make a confident choice, lol. This generation is complicated.

Thoughts?
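The options above can be tallied in one place (prices as quoted in the post; the $40 backplate is added to the options whose card doesn't ship with one):

```python
# Total cost per option, using the prices quoted in the post above.
# The $40 backplate is added where the card doesn't include one.
BACKPLATE = 40
options = {
    "Zotac ArcticStorm + 90deg fitting":  725 + 10,
    "Asus Strix + EK block":              630 + 150 + BACKPLATE,
    "Asus Strix + Bitspower block":       630 + 135 + BACKPLATE,
    "MSI Sea Hawk EK X":                  815,
    "FE + Heatkiller IV (~$810 combined)": 810 + BACKPLATE,
}
# Print cheapest first.
for name, total in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"${total:4d}  {name}")
```

On these numbers the Zotac comes out cheapest at $735 all-in, with the backplate-less options clustering $805-850, which matches the "better deal" reasoning in the post.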


----------



## GunnzAkimbo

virtually the same GFX score.


----------



## alton brown

Quote:


> Originally Posted by *alton brown*
> 
> Hey guys! I need some opinions and advice again. What's going to pair better with the 1080: the ASUS ROG PG279Q or the new ViewSonic XG2703-GS? Does anyone feel one company is better than the other? Or can someone recommend another brand of monitor?
> 
> Thanks!


Any input from anyone?


----------



## Synthetickiller

There's a discussion about that monitor & some others here: http://www.overclock.net/t/1600695/asus-pg279q-vs-acer-predator-xb271hu-vs-dell-s2716dg

One consideration is to wait for cheaper Korean panels. Depends on how patient you are.


----------



## alton brown

Thanks. I'm pretty patient, but right now it's on sale at Newegg for $799.00.


----------



## ucode

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> Please smash this score with a *SINGLE 1080*.


Define "smash". If you are looking for double the graphics score it's not going to happen. More like 10% only unless going cold.


----------



## Cornerer

Quote:


> Originally Posted by *trivium nate*
> 
> of course the issue is my mobo n cpu
> 
> what about just this cpu?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819113347


I thought the 9590 wouldn't fit in every 8350 motherboard?
Anyway, even this best AMD chip at 4.8GHz would struggle to keep up with an i5-6600K at slightly lower clocks. This is why you should wait a bit longer until both AMD's and Intel's new lineups arrive in just a few months' time, since you really should get a new CPU+mobo anyway to draw out the max potential of either 980 SLI or a 1080.


----------



## Cornerer

double post


----------



## GunnzAkimbo

Quote:


> Originally Posted by *ucode*
> 
> Define "smash". If you are looking for double the graphics score it's not going to happen. More like 10% only unless going cold.


That's the problem.

I may have to SLI new cards.


----------



## tps3443

Hey, is it normal for my GTX 1080 to always list a PerfCap reason in GPU-Z? It always says I've hit the "Voltage Limit", "Power Limit", and "VRel Limit".

Whether it's stock or overclocked, I run into this. It overclocks well, and I can score 25,000+ in Fire Strike graphics.

For some people I see in GPU-Z, the PerfCap monitor does not list the same things my card does.


----------



## DStealth

PG279Q all the way...
Quote:


> Originally Posted by *GunnzAkimbo*
> 
> Please smash this score with a *SINGLE 1080*.


Ok


Just closed in on an 8500 GPU score in Time Spy at 2164/2151, without the curve... and lower temps..


----------



## MaFi0s0

Is it normal for this card to idle at <200MHz? I think it's causing issues in benchmarks. I have "prefer maximum performance" set.

If it's normal, is there a firmware fix?


----------



## GanGstaOne

Quote:


> Originally Posted by *MaFi0s0*
> 
> Is it normal for this card to idle at <200mhz? Its causing issues in benchmarks I think. I have "prefere max performance".
> 
> If its normal is there a firmware fix?


Yep, normal. It doesn't need a fix and it doesn't cause any issues.


----------



## senna89

*Can anyone tell me if the EVGA cards (SC or FTW) suffer from a particular coil whine problem?* More than other brands?
Is it a quality control issue, or simply a question of how many are out there?

Would you recommend an EVGA card, or am I better off choosing another brand?
Does the FTW have more problems than the ACX and SC?


----------



## GanGstaOne

Quote:


> Originally Posted by *senna89*
> 
> *Can anyone tell me if the EVGA cards (SC or FTW) suffer from a particular coil whine problem?* More than other brands?
> Is it a quality control issue, or simply a question of how many are out there?
>
> Would you recommend an EVGA card, or am I better off choosing another brand?
> Does the FTW have more problems than the ACX and SC?


From what I have seen, MSI and Gigabyte are the most stable cards, and yes, most EVGA cards have coil whine.


----------



## Rhadamanthis

I have an EVGA Founders Edition with the Hybrid kit; no coil whine before or after the mod.


----------



## GanGstaOne

Quote:


> Originally Posted by *Rhadamanthis*
> 
> I have an EVGA Founders Edition with the Hybrid kit; no coil whine before or after the mod


The custom PCBs have coil whine.


----------



## Vellinious

I've owned 3 FTWs and the only time I hear coil whine is when the frame rates are extremely high, around 300+.


----------



## Spiriva

Quote:


> Originally Posted by *SweWiking*
> 
> I got two EVGA 1080 FE cards. One OCs very well (2200MHz) and the other not so well (2050MHz). Could I go ahead and just flash the weaker card with the Strix T4 BIOS and leave the good card alone with the original FE BIOS?
>
> Hopefully this will allow me to overclock the weaker card a few more MHz; I would be happier running at 2100MHz.
>
> So the question is: is it dumb to flash just "the bad" card for some reason? The good card can already do 2200MHz at default voltage.


Only one way to find out


----------



## Joenc

I was leaning towards an EVGA FTW, but after reading about all the problems they've had with black screens and the returns that followed, I'm leaning toward MSI now.

EVGA says they fixed the problem as of Aug 31, but a couple of buyers got new RMA cards back and are still having the black screen problem.

So, I like the size of the FTW, but EVGA messed something up out of the gate. They do have good support, though.

----------



## MaFi0s0

Quote:


> Originally Posted by *GanGstaOne*
> 
> Yep normal and it dosnt need a fix it dosnt cause any issues


Just "fixed" it and didnt help the benchmark.

So I havent mentioned my problem here, since upgrading from a 980Ti my min FPS in Heaven 4.0 drops to 8 or 9. It used to be 35 with the 980Ti. There is also some stuttering around the same area of the bench where the drops are.

I have tried checking power state management in Control Panel, Nvidia settings and bios.

Tried different drivers.

I am even on a different motherboard.

Chipset drivers are updated.

Doube checked Nvidia settings for the app and the launcher for Heaven.
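Since the drops hit the same part of the bench every run, one way to narrow them down is to record a hardware log (Afterburner/RTSS or HWiNFO can export one) and scan it for the dip: if the core clock collapses at the same moment the FPS does, it's a power-state problem; if the clock stays pinned, look elsewhere. A minimal sketch of that scan, assuming the log has been exported to a simple CSV with hypothetical column names (real Afterburner `.hml` logs need extra parsing):

```python
import csv
from io import StringIO

def find_fps_dips(csv_text, fps_floor=30.0):
    """Scan a monitoring log (CSV) and return the rows where the framerate
    fell below fps_floor, so dips can be correlated with clocks and usage."""
    dips = []
    for row in csv.DictReader(StringIO(csv_text)):
        fps = float(row["fps"])
        if fps < fps_floor:
            dips.append({
                "time": row["time"],
                "fps": fps,
                "core_mhz": float(row["core_mhz"]),
                "gpu_usage": float(row["gpu_usage"]),
            })
    return dips

# Hypothetical excerpt: here the dip coincides with the core dropping to a
# low power state, which would point at power management rather than load.
log = """time,fps,core_mhz,gpu_usage
10.0,62.0,1911,98
10.5,9.0,607,31
11.0,60.5,1911,97
"""
dips = find_fps_dips(log)
for d in dips:
    print(d["time"], d["fps"], d["core_mhz"])
```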


----------



## fat4l

Any BIOS mods yet?


----------



## Reckit

Quote:


> Originally Posted by *alton brown*
> 
> Any input from anyone?


I have the Asus and it's one hell of a monitor; I love the G-Sync, viewing angles and colour calibration. Can't really comment on the other.


----------



## boredgunner

Quote:


> Originally Posted by *alton brown*
> 
> Any input from anyone?


The ViewSonic is currently being reviewed by PCMonitors.info so wait for that. But really, both of those and the Acer Predator XB271HU are basically the same thing. Price is the main factor here.


----------



## Fediuld

Samsung is making a 27" 2560x1440 quantum dot monitor with G-Sync.


----------



## alton brown

Quote:


> Originally Posted by *boredgunner*
> 
> The ViewSonic is currently being reviewed by PCMonitors.info so wait for that. But really, both of those and the Acer Predator XB271HU are basically the same thing. Price is the main factor here.


How about the Acer Predator XB271HU? Anyone prefer Acer over the Asus?


----------



## boredgunner

Quote:


> Originally Posted by *Fediuld*
> 
> Samsung is making the 27" 2560x1440 quantum dot, with Gsync.


Are you sure it's 1440p and not 1080p? If so I'm in.
Quote:


> Originally Posted by *alton brown*
> 
> How about the Acer Predator XB271HU? Anyone prefer Acer over the Asus?


@CallsignVega bought five XB271HUs and five PG279Qs and preferred the XB271HU. Less backlight bleed on average, ever so slightly faster response time. The thinner bezel is slightly more attractive too.


----------



## alton brown

Quote:


> Originally Posted by *boredgunner*
> 
> Are you sure it's 1440p and not 1080p? If so I'm in.
> @CallsignVega bought five XB271HUs and five PG279Qs and preferred the XB271HU. Less backlight bleed on average, ever so slightly faster response time. The thinner bezel is slightly more attractive too.


I'm reading about the lower backlight bleed on the Acer model. Correct me if I'm wrong, but isn't some backlight bleed normal for an IPS monitor?


----------



## Martin778

My Dell U2515H has clearly visible bleed in all 4 corners of the screen.


----------



## alton brown

Quote:


> Originally Posted by *Martin778*
> 
> My Dell U2515H has clearly visible bleed in all 4 corners of the screen.


Does it really bother you? Or do you just accept it for the technology?


----------



## Martin778

It's only visible on a full black background, so it doesn't really bother me. I've compared it to an Iiyama TN gaming screen, afaik it was the X2488HSU B2 (?), and the picture quality was plain rubbish compared to the IPS Dell.

*Regarding the GTX 1080 and BIOS mods*:

I've flashed the Asus Strix OC BIOS to my EVGA 1080 SC ACX and it didn't help the overclock at all, except that one of the DP ports stopped working.
Max available voltage was 1.12V according to GPU-Z, Precision and ASUS GPU Tweak II.

I then tried a BIOS from an Inno3D iChill X3 card. It has very high power limits (the TDP shown in GPU-Z was only 40-50%), but there is a cooling issue: 100% fan speed on the Inno3D corresponds to roughly 50% RPM on the EVGA, so your fans will spin at around 1500 RPM max when using this BIOS.

In both cases GPU-Z still showed all kinds of perfcap reasons after a minute of Unigine Valley.

Neither BIOS helped the OC at all, so I reverted back to the stock EVGA BIOS. Voltage scaling on air on this GPU (at least on my sample) is zero.


----------



## Fediuld

Quote:


> Originally Posted by *boredgunner*
> 
> Are you sure it's 1440p and not 1080p? If so I'm in.


True, the CFG70 lineup is 1080p :/

At least last week they announced that the 34CF791 will come with G-Sync next year. That's the 1500R, 3440x1440, 100Hz, 4ms quantum dot panel that comes out with FreeSync this year.


----------



## boredgunner

Quote:


> Originally Posted by *alton brown*
> 
> I'm reading about the less backlight Bleed on the acer model.. correct me if I'm wrong, isn't some backlight bleed normal for a ips monitor?


Backlight bleed is a lottery, but it does come with the territory with edge lit monitors.
Quote:


> Originally Posted by *Fediuld*
> 
> True the CFG70 lineup are 1080p :/
> 
> At least last week they announced that the 34CF791 will come with Gsync next year. That is the 1500R, 3440x1440 100hz 4ms quantum dot panel, that comes out with Freesync this year.


Yeah but there is still not even a single announced high refresh rate 1440p or larger 16:9 VA monitor.


----------



## shadow85

What OC should a stock STRIX on air and in SLI get without going over 75°C?
Quote:


> Originally Posted by *Fediuld*
> 
> Samsung is making the 27" 2560x1440 quantum dot, with Gsync.


This is nice, but I really wish they would make at least 40" screens like this. After using a Philips 40" monitor for gaming I just don't feel like going down in size anymore, no matter how superior the panel may be.


----------



## Deders

Quote:


> Originally Posted by *shadow85*
> 
> What OC should a stock STRIX on air and in SLI get without going over 75°C?


You should be able to max out your chip below 75c so long as you don't mind some fan noise.


----------



## Fediuld

Quote:


> Originally Posted by *shadow85*
> 
> What OC should a stock STRIX on air and in SLI get without going over 75°C?
> This is nice, but I really wish they would make atleast 40" screens like this. After using a Philips 40" monitor for gaming I just don't feel like going downwards in size anymore no matter how superior the panel maybe.


Actually the CFG70 is 1920x1080, my mistake.

The only worthy monitor coming from them is the 34CF791 with G-Sync, and that won't happen until later next year.
Other than Acer and Asus, I can't see any announcements of new G-Sync monitors.


----------



## grimboso

Quote:


> Originally Posted by *senna89*
> 
> *Can anyone tell me whether EVGA cards (SC or FTW) suffer from a particular coil whine problem?* More than other brands?
> Is it a quality-control issue or simply a matter of how widely they're sold?
> 
> Would you recommend an EVGA card, or is it better to choose another brand?
> Does the FTW have more problems than the ACX and SC?


I don't have any noticeable coil whine on my 1080 FTW card.


----------



## SauronTheGreat

This is my Unigine Heaven score at 4K resolution with AA off. Can someone tell me if it's good enough? And one last question: my Gigabyte G1 Gaming 1080 has a custom PCB, so how come it has only one 8-pin connector like the Founders Edition? I have never tried to overclock the card, although it's running in factory OC mode via the Gigabyte Xtreme Engine tool.


----------



## Martin778

Quote:


> Originally Posted by *grimboso*
> 
> I don't have any noticeable coil whine on my 1080 FTW card.


Same here, I'm running 1080 SC which in reality is nothing more than an FE with the ACX3.0 slapped on it.
The coils only whine if I run stuff like 3dmark03/06 with a few hundred FPS.


----------



## Joshwaa

No coil whine at all on my FTW at any clocks or any FPS. I should have it on water next week and then my whole case will be silent. I will let you know if I hear anything then.


----------



## T-Rez

With the stock BIOS on my Strix (non-OC edition) I couldn't go past 1825MHz before throttling at 1.035V due to the power limit (it wouldn't go past 72°C). So I flashed the strix1080xoc_t4 BIOS from this thread. Now I'm hitting a wall at 1950MHz, 74°C, and around 1.131V. Not the greatest, but certainly an improvement over the stock BIOS.

The strange thing is that now the sliders for power limit and temp limit are grayed out in MSI Afterburner, EVGA Precision, and Asus GPU Tweak II. I can't change those values at all now. They were accessible before I flashed the BIOS. Is this normal for the strix1080xoc_t4 BIOS?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *T-Rez*
> 
> With the stock BIOS on my Strix (non-OC edition) I couldn't go past 1825MHz before throttling at 1.035V due to the power limit (it wouldn't go past 72°C). So I flashed the strix1080xoc_t4 BIOS from this thread. Now I'm hitting a wall at 1950MHz, 74°C, and around 1.131V. Not the greatest, but certainly an improvement over the stock BIOS.
> 
> The strange thing is that now the sliders for power limit and temp limit are grayed out in MSI Afterburner, EVGA Precision, and Asus GPU Tweak II. I can't change those values at all now. They were accessible before I flashed the BIOS. Is this normal for the strix1080xoc_t4 BIOS?


That's normal.


----------



## MaFi0s0

Quote:


> Originally Posted by *Fediuld*
> 
> Samsung is making the 27" 2560x1440 quantum dot, with Gsync.


Knowing Samsung they will cap it at 60Hz.


----------



## Joshwaa

Quote:


> Originally Posted by *MaFi0s0*
> 
> Knowing Samsung they will cap it at 60Hz.


Or if it's like the new Note 7 it could just explode when you plug it in too. lol


----------



## LiquidHaus

Something I just thought of for those with coil whine to try:

Get RivaTuner and set the FPS cap to 60 (or whatever your monitor's refresh rate is).

The coil whine is gnarly on most cards because the frame rate is so high when it doesn't need to be. Sure, you can brag about it, but the coil whine that comes with it isn't much to brag about.

BTW, Watercool is making waterblocks for FTW cards. I plan to go that route. Late September or early October.
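The RivaTuner cap suggested above works by inserting a small wait each frame so the present rate never exceeds the target, which is exactly what quiets the coils on static menus. The core idea can be sketched in a few lines (a toy pacing loop, not how RTSS actually hooks the driver):

```python
import time

def run_capped(render_frame, target_fps=60, frames=10):
    """Toy frame limiter: sleep off whatever is left of each frame's
    time budget so the loop never runs faster than target_fps."""
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - t0
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle instead of spitting out frames
    return frames / (time.perf_counter() - start)  # effective FPS achieved

# A trivially cheap "frame" would otherwise run at thousands of FPS;
# capped, the effective rate lands at (or just under) the target.
fps = run_capped(lambda: None, target_fps=100, frames=20)
print(round(fps))
```

The same principle is why capping at the monitor's refresh rate costs nothing visually: frames above it were never displayed anyway.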


----------



## Vellinious

Quote:


> Originally Posted by *lifeisshort117*
> 
> Something I just thought of for those to try with coil whine:
> 
> Get rivatuner and set fps cap to 60 (or whatever your monitor's cap is)
> 
> The coil whine is gnarly on most cards because frames are so high when they don't need to be. Sure you can brag about it, but then again the coil whine associated with it isn't much to brag about.
> 
> BTW, Watercool is making waterblocks for FTW cards. I plan to go that route. Late September or early October.


EK already has them up for pre-order.


----------



## LiquidHaus

Quote:


> Originally Posted by *Vellinious*
> 
> EK already has them up for pre-order.


I know. Watercool's are better looking though.


----------



## Synthetickiller

Quote:


> Originally Posted by *lifeisshort117*
> 
> I know. Watercool's are better looking though.


I almost ordered an FE or EVGA SC 1080 because of the Heatkiller IV Block.


----------



## Fediuld

Quote:


> Originally Posted by *T-Rez*
> 
> With the stock bios on my Strix (non-OC edition) I couldn't go past 1825hz before throttling at 1.035v due to power limit (wouldn't go past 72C). So I flashed the strix1080xoc_t4 bios from this thread. Now I'm hitting a wall at 1950hz, 74C, and around 1.131v. Not the greatest, but certainly an improvement over stock bios.
> 
> The strange thing is that now the sliders for power limit and temp limit are grayed out in MSI afturburner, EVGA precision, and Asus GPU Tweak2. I can't change those values at all now. They were accessible before I flashed the bios. Is this normal for the strix1080xoc_t4 bios?


a) To start, 1.131V is too much, to say the least. The significant majority of air-cooled cards can do 2100 at 1.093V; watercooled ones do 2177 at 1.081V.
b) I wonder whether your 1.131V BIOS truly delivers that voltage or not. If it does, does anyone believe it would work on an MSI Armor OC, and what are the chances of bricking the card?

----
c) Has anyone flashed an MSI Armor with the Gaming Z or the Seahawk BIOS?


----------



## AllGamer

*Re: Coil Whine*

Does anyone know if coil whine only affects Air Cooled cards?

Can anyone running their GTX 1080 on water confirm if they had any Coil Whine issue?


----------



## MaFi0s0

My last card was water cooled and had coil whine at FPS above 200, same with my air cooled card now.


----------



## AllGamer

Quote:


> Originally Posted by *MaFi0s0*
> 
> My last card was water cooled and had coil whine at FPS above 200, same with my air cooled card now.


Damn!

I was afraid of that. I hope these two MSI EK Seahawks don't develop the problem.

My current MSI FE (still on air, soon to be in a WC block) is quiet, no issues; I've been using it for a few months already.


----------



## Synthetickiller

Quote:


> Originally Posted by *AllGamer*
> 
> *Re: Coil Whine*
> 
> Does anyone know if coil whine only affects Air Cooled cards?
> 
> Can anyone running their GTX 1080 on water confirm if they had any Coil Whine issue?


I saw a video on YouTube of the MSI Seahawk EK X with coil whine.

The uploader didn't state whether that was under load, idle, or partial load.


----------



## alawadhi3000

Quote:


> Originally Posted by *AllGamer*
> 
> *Re: Coil Whine*
> 
> Does anyone know if coil whine only affects Air Cooled cards?
> 
> Can anyone running their GTX 1080 on water confirm if they had any Coil Whine issue?


Quote:


> Originally Posted by *Synthetickiller*
> 
> I saw a video on youtube of the MSI Seahawk EK X with coil whine.
> 
> 
> 
> 
> 
> The uploader didn't state if that was underload, idle, partial load or what.


I get coil whine only on super high FPS like 2000 or something.


----------



## AllGamer

Quote:


> Originally Posted by *Synthetickiller*
> 
> I saw a video on youtube of the MSI Seahawk EK X with coil whine.
> 
> 
> 
> 
> 
> The uploader didn't state if that was underload, idle, partial load or what.


Holy crap, that is a loud coil whine.

Most of the others I've seen or heard are very, very faint, the type that fades into the background if you're not listening for it, but the one in this video is loud.

It's like having a cicada inside your case.


----------



## LiquidHaus

Hahaha, a Cicada. So true.

Try limiting FPS thru Rivatuner and see if the problem persists.


----------



## DFroN

Got a Gigabyte G1 1080 today, much quieter and cooler than the Zotac Amp I tried before. I have a few questions I'd be grateful for help with though:

1. How do I control the RGB lighting? I've tried Gigabyte Xtreme Engine; nothing happens when I choose a colour and hit apply. Also tried OC Guru 2 and whatever the other Gigabyte utility was called, with no luck. Even if I do get the colour to change, do I need the Gigabyte utility running all the time to make it stick?
2. Is the BIOS update on the Gigabyte website worth it?
3. I'm getting flickering with my G-Sync monitor on the desktop. I can stop it by lowering the refresh rate to 120 and then back to 144. I gather this is a common issue but I've been unable to find a fix. Using the latest drivers (372.70).

Cheers


----------



## GanGstaOne

So I didn't really have time to enjoy my 1080 G1, but now, playing Fallout 4 with all my texture mods (the 4K ones), the card's load is about 50-70% and it uses 2.5-2.7GB of VRAM. My old 980 G1, with the same game and mods but only the 2K textures, was at 80-100% load and used all 4GB of VRAM.
These 1080s are real beasts; the benchmarks just don't do them justice.


----------



## shadow85

So I added a 120% power target and +100MHz core to my STRIX (A8G) via AB on stock voltage, and AB reports a max core of 2038. Is this good? The only testing I've done was a few hours of gaming in Deus Ex: Mankind Divided on Ultra @ 4K.

Top card hits 71 degC and bottom 61 degC on standard fan profile.

I haven't tried to add more to the core yet but will later tonight.


----------



## Bdonedge

Holy **** I just updated to the newest drivers - somehow it's making my card give coil whine. I never had it before. Omg


----------



## steveTA1983

So I was playing GTA V @ 4K with everything maxed out except no AA, reflections on high, and water reflections on high, and it just cruises at a locked 60 FPS. The card was only running at 1215MHz at 38% power usage, and temps haven't gone above 46°C. Either those readings are wrong or this card is just freakin' nuts! The core runs at 1215MHz, but the memory is at 10.8GHz.


----------



## daunow

Quote:


> Originally Posted by *AllGamer*
> 
> Holy crap, that is a loud coil whine.
> 
> most of the others I've seen / heard are very very faint, the type that if you are not looking for it, it'll fade into the background, but the one in this video is loud.
> 
> it's like having a Cicada inside your case.


The 970's I had from EVGA were even louder than this, it was a pretty awful experience.


----------



## Benjiw

Quote:


> Originally Posted by *daunow*
> 
> The 970's I had from EVGA were even louder than this, it was a pretty awful experience.


I left mine folding for a few days, the noise wasn't so bad afterwards.


----------



## tps3443

I had never heard coil whine on my GTX 1080 until after exiting the Unigine Heaven 4.0 benchmark. Once you hit quit and it shows the marketing/advertising screen, it whines like crazy!

Can anyone confirm theirs does this too?

It happens on the pop-up screen after exiting the Heaven benchmark, and it goes away after hitting ESC to close the pop-up.


----------



## senna89

Quote:


> Originally Posted by *shadow85*
> 
> So I added 120% power target, and +100 MHz core to my STRIX (A8G) via AB on stock voltage, and AB reports I am getting max core of 2038. Is this good? The only testing I done was a few hours of gaming Deus Ex: Mankind Divided on Ultra @ 4K.
> 
> Top card hits 71 degC and bottom 61 degC on standard fan profile.
> 
> I haven't tried to add more to the core yet but will later tonight.


Do you have coil whine with the Strix Advanced (A8G)?


----------



## grimboso

Quote:


> Originally Posted by *lifeisshort117*
> 
> I know. Watercool's are better looking though.


What does the watercool one look like?


----------



## BURGER4life

Quote:


> Originally Posted by *grimboso*
> 
> What does the watercool one look like?


Like this: http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/6260#post_25511256


----------



## grimboso

Quote:


> Originally Posted by *BURGER4life*
> 
> Like this: http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/6260#post_25511256


Yuck.

Not my style.
Thanks for the link!


----------



## shadow85

Ok, stable with +125 core on my STRIX (A8G) SLI on air, boosting to 2076 via AB. Will try +150 soon.


----------



## GreedyMuffin

Something is wrong with my loop. Playing Far Cry 4 with the 5960X at 4200MHz/1.090V and the 1080 at 2139MHz/1.050V, it reaches 50°C!!!

This is with 1x XTX360 in push-pull and 1x XT240 in push. Fans are eLoops running at 750-800 RPM.


----------



## AllGamer

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Something is wrong with my loop. Playing far cry 4 with 5960X at 4200 1.090V and 1080 2139 1,050V it reaches 50'C!!!
> 
> This is with 1x XTX360 push-pull, 1x XT240 push. Fans are eloops running on 750-800 rpm.


did you try running the fans at around 1500 rpm to 1800 rpm?

800 rpm is for when the computer is idle, or doing simple stuff like web, emails, office, and such.


----------



## GreedyMuffin

Quote:


> Originally Posted by *AllGamer*
> 
> did you try running the fans at around 1500 rpm to 1800 rpm?
> 
> 800 rpm is for when the computer is idle, or doing simple stuff like web, emails, office, and such.


That will be noisy, but I can test it, I suppose!


----------



## Benjiw

Quote:


> Originally Posted by *tps3443*
> 
> I have never hear coil whine on my gtx1080 until after exiting the unigine haven 4.0 benchmark. Once you hit quit, it shows the marketing advertising
> screen it whines like crazy!
> 
> Can anyone confirm there's does this also?
> 
> It happens on the pop up screen after exiting unigine haven benchmark. And it goes away, after hitting ESC to close the pop up.


Yes it's normal because it's putting out loads of FPS.


----------



## Groo21

Quote:


> Originally Posted by *tps3443*
> 
> Hey everyone, I had a quick question. I am running a 6600K at 4.82GHz, RAM @ 2345MHz CAS 13, and a GTX 1080 overclocked to around 2100MHz with 11GHz memory.
> 
> The problem is, I checked my GPU usage for Fallout 4, and it is all over the place! In an easy rendering scene it stays at 100%, but when the FPS drops in the game on something more demanding, it starts to bounce around 88-94% and back to 100%. Is this normal?
> 
> Am I really bottlenecked with a 6600K at 4.82GHz?


This is kind of funny because I have the exact opposite problem.

My Fallout 4 is unplayable (thread pending). My CPU utilization rarely exceeds 40%, and the GPU rarely gets beyond 60%. I haven't been able to figure it out yet.


----------



## GreedyMuffin

I get 900-1200 FPS in the FC4 menu, not much coil whine. I've been folding on different clocks and voltages; it has really helped!


----------



## LiquidHaus

Quote:


> Originally Posted by *tps3443*
> 
> I have never hear coil whine on my gtx1080 until after exiting the unigine haven 4.0 benchmark. Once you hit quit, it shows the marketing advertising
> screen it whines like crazy!
> 
> Can anyone confirm there's does this also?
> 
> It happens on the pop up screen after exiting unigine haven benchmark. And it goes away, after hitting ESC to close the pop up.


Yes, that will always happen. It's just something the program makes the card do; it's happened to me hundreds of times. Technically the card is still being utilized, but on a static screen, so the FPS is through the roof and that results in mad coil whine.

Quote:


> Originally Posted by *grimboso*
> 
> What does the watercool one look like?


And you're telling me you don't absolutely love this design!?


----------



## Derpinheimer

Quote:


> Originally Posted by *GreedyMuffin*
> 
> That will be noisy, but I can test i suppose!


What is the idle temp, and does it jump to 50 in the first few seconds and then stop rising, or what?

How it gets to 50 tells you a lot more than anything else.
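The shape of that rise is informative because a water loop behaves roughly like a first-order system: temperature approaches equilibrium exponentially with a time constant set by water volume and radiator capacity. A near-instant jump to 50°C suggests a tiny effective thermal mass (e.g. poor block contact), while a slow climb points at overall loop capacity (radiator area, fan speed). A rough sketch of that model, with made-up numbers:

```python
import math

def loop_temp(t, t_ambient=25.0, t_equilibrium=50.0, tau=300.0):
    """First-order thermal response: temperature approaches the equilibrium
    value exponentially with time constant tau (seconds). More water volume
    and radiator area -> larger tau -> a slower, gentler rise."""
    return t_equilibrium - (t_equilibrium - t_ambient) * math.exp(-t / tau)

# A healthy loop (tau = 300s) is still warming up after one minute,
# while a loop with almost no effective thermal mass (tau = 10s) is
# essentially at its equilibrium temperature already.
big = loop_temp(60, tau=300.0)
small = loop_temp(60, tau=10.0)
print(round(big, 1), round(small, 1))
```

The numbers here (ambient, equilibrium, tau) are illustrative assumptions, not measurements; the point is only that the two failure modes produce visibly different curves.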


----------



## Synthetickiller

Quote:


> Originally Posted by *lifeisshort117*
> 
> And you're telling me you don't absolutely love this design!?


It's too bad MSI didn't partner with watercool or aquatuning.
I think EK's blocks are too industrial in comparison. There seems to be a trend towards minimalism, yet people seem to hate this block (I've asked a couple of people I know who WC & they think it's hideous). To each their own. If I ever decide to go SLI vs upgrading to whatever the 1180 would be, I will certainly buy this block.


----------



## LiquidHaus

Quote:


> Originally Posted by *Synthetickiller*
> 
> It's too bad MSI didn't partner with watercool or aquatuning.
> I think EK's blocks are too industrial in comparison. There seems to be a trend towards minimalism, yet people seem to hate this block (I've asked a couple of people I know who WC & they think it's hideous). To each their own. If I ever decide to go SLI vs upgrading to whatever the 1180 would be, I will certainly buy this block.


Honestly, I don't really like EK. Don't get me wrong - their products are top notch quality and they do an amazing job throughout the whole process. They've obviously made the right connections these past few years and it shows: MSI's Seahawk X EK, EVGA's Hydro Copper line is now EK, Gigabyte teaming up with EK this year, etc.

They're creating an industry standard for the realm of watercooling. I can't knock that. But in my opinion, there are better designs out there. EK's is very simple but most of the time it's too simple.

I have to use EK products in the watercooled systems I build at Xidax because that is what is more widely available and a lower cost. But it's not what I want at all.

So I think the widespread takeover EK is pulling off in the market sways a lot of people, making them think EK really is the best and any other company is subpar at best.

Think about an average joe browsing newegg and he comes across the Seahawk X EK. He'll search EK. He'll find out oh wow these guys make products for all sorts of manufacturers! And then instantly think that's company to always go with.

It's a genius move by EK, but it's warping people's opinions. At least that's the way I see it.

All in all, Watercool is my favorite. All they need is a memory block line and they'd be pretty much covered.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *steveTA1983*
> 
> So I was playing GTA V @ 4K with everything maxed out except no AA, reflections on high, water reflections on high, and it is just cruising at a locked 60 FPS. The card even was running at 1215mhz, only 38% power usage, and temps haven't gone above 46C. Either those readings are wrong or this card is just freakin nuts! Core is running at 1215mhz, but memory is at 10.8ghz


38% power and 1215MHz on air sounds pretty low for pushing GTA V @ 4K... something may not be right. I'm running 2x 1080s under a water cooling loop, and they both run at 2000MHz, 40°C, but only ~40-50% utilization each. But I have two, and they're running at full speed. Double-check your resolution or something. As long as you're not using the advanced NV settings like MFAA or TXAA, or grass on very high, you shouldn't push the card to the limit... but not even hitting full clock speed, that's surprising at 4K.

Hey, if your performance is there, and vsync is keeping it locked to 60hz (which is how I run mine, which is why I get the lower utilization), then that's great.


----------



## HunterKen7

Quote:


> Originally Posted by *DFroN*
> 
> Got a Gigabyte G1 1080 today, much quieter and cooler than the Zotac Amp I tried before. I have a few questions I'd be grateful for help with though:
> 
> How do I control the RGB lighting? I've tried Gigabyte xtreme engine, nothing happens when I choose a colour and hit apply. Also tried OC Guru 2 and whatever the other Gigabyte utility was called with no luck. Even if I do get the colour to change, do I need the Gigabyte utility running all the time to make it stick?
> Is the bios update on the Gigabyte website worth it?
> I'm getting flickering with my G-Sync monitor on desktop, I can stop it by lowering refresh rate down to 120 and then back to 144. I gather this is a common issue but I've been unable to find a fix. Using latest drivers (372.70)
> Cheers



1. You need to use Xtreme Gaming Engine to control the LEDs. I'll warn you that it's one finicky piece of software, though: it will control the LEDs one day and then not work at all the next time you start your PC. I really don't get it.
2. The F2 beta BIOS update was the result of a thread at TechPowerUp; have a read, it is very informative: https://www.techpowerup.com/forums/threads/gigabyte-gtx-1080-g1-gaming-fan-spikes.223629/ A Gigabyte rep was giving almost-live updates on its development and then sort of disappeared. Basically, at around 50°C the fans tend to spin up and down constantly, making a faint "grinding" noise, and the beta BIOS was an attempt to limit the "spikes". Myself, I set a custom fan curve that keeps the fans always running at 30 percent, so I don't have this problem.
3. I'll be honest, I own a 144Hz G-Sync monitor and have no flickering at all with the latest drivers. I know all about the issue and I'm surprised that I don't have it. Lots of discussion at the GeForce forums about it, though.
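A custom fan curve like the one described above is just a set of (temperature, duty) points with linear interpolation between them, plus a floor so the fans never stop and restart around the 50°C threshold. Sketched below with hypothetical curve points (not Gigabyte's defaults):

```python
def fan_duty(temp_c, curve=((0, 30), (50, 30), (70, 60), (85, 100))):
    """Piecewise-linear fan curve with a 30% floor, so the fans never
    spin down and back up around a single temperature threshold."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # above the last point: pin at max duty

# Idle sits at the 30% floor; under load the duty ramps smoothly
# instead of oscillating across a hard on/off threshold.
print(fan_duty(40), fan_duty(60), fan_duty(90))
```

Keeping the curve flat through the problem region (here 0-50°C at a constant 30%) is exactly what removes the spin-up/spin-down "grinding" behavior.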


----------



## Kold

Quote:


> Originally Posted by *DFroN*
> 
> Got a Gigabyte G1 1080 today, much quieter and cooler than the Zotac Amp I tried before. I have a few questions I'd be grateful for help with though:
> 
> How do I control the RGB lighting? I've tried Gigabyte xtreme engine, nothing happens when I choose a colour and hit apply. Also tried OC Guru 2 and whatever the other Gigabyte utility was called with no luck. Even if I do get the colour to change, do I need the Gigabyte utility running all the time to make it stick?
> Is the bios update on the Gigabyte website worth it?
> I'm getting flickering with my G-Sync monitor on desktop, I can stop it by lowering refresh rate down to 120 and then back to 144. I gather this is a common issue but I've been unable to find a fix. Using latest drivers (372.70)
> Cheers


I had the same weird flickering everywhere, including on my desktop, with those drivers. Reverted to the previous ones and the flickering is gone.

So, is there a way to stop my card's core clock from fluctuating so much? I'm running at 1.075V and 2114MHz, but it always bounces all over the place. The power slider is maxed and the card is a Founders Edition with a full EK block.
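The bouncing is largely GPU Boost 3.0 itself: Pascal adjusts the clock in roughly 13MHz bins as temperature (and power/voltage headroom) crosses thresholds, so even a watercooled card swinging a few degrees will step up and down a bin or two. A toy model of the temperature side of that behavior (the ~13MHz bin size reflects Pascal's observed stepping; the threshold spacing here is an assumption for illustration):

```python
BIN_MHZ = 13  # approximate Pascal boost bin size

def boosted_clock(max_clock_mhz, temp_c, start_temp=37, step_c=5):
    """Toy GPU Boost model: lose one ~13MHz bin for every step_c degrees
    above start_temp, so the clock stair-steps down as the card warms up."""
    if temp_c <= start_temp:
        return max_clock_mhz
    bins_lost = (temp_c - start_temp + step_c - 1) // step_c  # ceiling division
    return max_clock_mhz - bins_lost * BIN_MHZ

# Even a cool card moves by whole bins as it crosses each threshold.
print(boosted_clock(2114, 35), boosted_clock(2114, 42), boosted_clock(2114, 50))
```

Under this model the only ways to hold a flat clock are to stay below the first threshold or to flash a BIOS that pins the voltage/frequency point, which matches what people in the thread report.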


----------



## DFroN

Quote:


> Originally Posted by *HunterKen7*
> 
> 
> You need to use xtreme gaming engine to control the LED. I'll warn you that it is one finicky piece of software though. It will control the LEDs one day and then not work at all the next time you start your PC. I really don't get it.
> The F2 beta bios update was a result of a thread discussion at techpowerup. Have a read, it is very informative: https://www.techpowerup.com/forums/threads/gigabyte-gtx-1080-g1-gaming-fan-spikes.223629/ There was Gigabyte rep giving almost live updates on the development of it and then he sort of disappeared. Basically, at around 50C, the fans tend to spin up and down constantly making a faint "grinding" noise. The beta bios was an attempt to limit the "spikes". Myself, I set a custom fan curve that has the fans always running at 30 percent so I don't have this problem.
> I'll be honest, I own a 144Hz gsync monitor and have no flickering at all with the latest drivers. I know all about the issue and I'm surprised that I don't have it. Lot's of discussion at the geforce forums about it, though.


Ah, thanks, very helpful. I haven't noticed any peculiar fan noises so I'll probably not bother with the BIOS update. Although now that I know about it...

I've fixed my 144Hz flickering issue: I set the drivers to "prefer maximum performance" and rebooted, but the flickering persisted, so I changed it back to the default setting ("optimal power" or something) and now it's gone. I think Nvidia's drivers are just buggy in general; they're very finicky when switching between G-Sync and ULMB, and sometimes the monitor just sits in normal mode. I never had this issue with older drivers.


----------



## Bishop07764

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Something is wrong with my loop. Playing far cry 4 with 5960X at 4200 1.090V and 1080 2139 1,050V it reaches 50'C!!!
> 
> This is with 1x XTX360 push-pull, 1x XT240 push. Fans are eloops running on 750-800 rpm.


You might need more rad space if you want to keep temps lower while having your fans spin that slow. I know where you're coming from; my fans never go over 750-800 RPM either, for the silence. I haven't seen my Seahawk EK go over 40°C yet, but it's on its own separate loop. You should get better temps if you up the fan speed.


----------



## Joshwaa

Quote:


> Originally Posted by *Groo21*
> 
> This is kind of funny because I have the exact opposite problem.
> My Fallout 4 is unplayable (thread pending). My CPU utilization rarely exceeds 40%. I haven't be able to figure it out yet. GPU rarely gets beyond 60%.


When you say 40% on the CPU, is that across all cores? I would check whether you have some cores at 100% and others not doing much.


----------



## GanGstaOne

Quote:


> Originally Posted by *Groo21*
> 
> This is kind of funny because I have the exact opposite problem.
> My Fallout 4 is unplayable (thread pending). My CPU utilization rarely exceeds 40%. I haven't be able to figure it out yet. GPU rarely gets beyond 60%.


f....d boost 3.0


----------



## pez

Quote:


> Originally Posted by *Groo21*
> 
> This is kind of funny because I have the exact opposite problem.
> My Fallout 4 is unplayable (thread pending). My CPU utilization rarely exceeds 40%. I haven't be able to figure it out yet. GPU rarely gets beyond 60%.


Unfortunately, this is Fallout 4's fault, or more likely its engine, to be specific. The best advice I've found so far is to turn down shadows, HBAO, and god rays; anything else graphically intensive, crank it.


----------



## Groo21

I sent a GPU trace to nVidia showing the issue.

It shows cpu idle, gpu idle, and hard queues with varying empty times. If this was strictly Fallout, I would have to assume other people would be screaming more.


----------



## GanGstaOne

Quote:


> Originally Posted by *pez*
> 
> Unfortunately, this is Fallout 4's fault. Well most likely the engine to be specific. Best advice I've found so far is to turn down shadows, HBAO, and god rays. Then anything else graphically intensive, crank it.


The 980 didn't have this problem with Fallout 4, only the 1080. They went too far trying to minimize load and save power with that stupid Boost 3.0.


----------



## pez

Quote:


> Originally Posted by *Groo21*
> 
> I sent a GPU trace to nVidia showing the issue.
> 
> It shows cpu idle, gpu idle, and hard queues with varying empty times. If this was strictly Fallout, I would have to assume other people would be screaming more.


I've been having the same issues as well. Someone in the Fallout 4 thread explained it a bit more. Things like the fire pits in the game render if you just so happen to be facing in their general directions. Shadows and NPCs are CPU based so that's what will load up your CPU and leave your GPU at 50-70%. I thought it was specific to NVIDIA cards until someone with a RX 470 or 480 chimed in with a similar issue. Either way, something is wrong.

I did find that downclocking my Titan X to 1400 resulted in higher usage, though not necessarily an increase in performance.

Quote:


> Originally Posted by *GanGstaOne*
> 
> 980 didnt have this problem with fallout 4 only 1080 they went too far trying to minimize load and save power with that stupid boost 3.0


Yeah my SLI 970s even felt like they did better at 1440p than my 1080 or Titan does. Ironically, 4K and SLI 1080s also seemed to run the game much better and more consistently with high GPU usage than my Titan at 3440x1440. Honestly, I'd love for it to be a GPU problem that could be fixed. There is something somewhere that is not communicating between the GPU and game saying that 'Hey, we're in a city, we're getting 45FPS, and we're only using 63% of the GPU and the clocks aren't even boosted high...'.

And something somewhere is responding 'Shhh, just let it happen...'


----------



## brucethemoose

Quote:


> Originally Posted by *Groo21*
> 
> I sent a GPU trace to nVidia showing the issue.
> 
> It shows cpu idle, gpu idle, and hard queues with varying empty times. If this was strictly Fallout, I would have to assume other people would be screaming more.


At this point, players just expect that kind of thing from Bethesda RPGs, and don't scream about it.

I'm not into the Fallout 4 modding scene, but if it's anything like the previous games, you should check the Nexus for a fix.


----------



## Groo21

I have tried every fix known to man, including the ones that make absolutely no sense whatsoever ... "Hey, I drank a PowerAde yesterday and it worked, so, uh, you might want to try that." Ok.... {Note: PowerAde does NOT work}


----------



## juniordnz

I see a lot of people with 5820K/5930K chips here. Just thought I should ask:

Do you see, in ANY scenario, in ANY game, a CPU bottleneck with that processor?

The fact that I bought the 1080 to get 120+ fps and I'm getting seriously bottlenecked by my 4.5GHz 4690K really bothers me. I'm considering an upgrade to a 5820K (since I've read it performs better than a 6800K with both overclocked).

Thanks in advance


----------



## pez

Quote:


> Originally Posted by *juniordnz*
> 
> I see a lot of people with 5820/5830 here. Just tought I should ask:
> 
> Do you see, in ANY scenario, in ANY game, CPU bottleneck with that processor?
> 
> The fact that I bought the 1080 to get 120+ fps and I'm getting seriously bottlenecked by my 4.5ghz 4690K reallt bothers me. I'm considering an upgrade to a 5820K (since i've read it performs better than a 6800K with both overclocked).
> 
> Thanks in advance


You're gaming on 1080p based on what your sig says. That can seriously be a huge factor. And with a 1080, going to 1440p would be a huge step up for you. I understand why a lot of people don't move from 1080p when using high refresh panels, but I'd be willing to bet that's the case.

Are you legitimately seeing 80+% CPU usage?


----------



## juniordnz

Quote:


> Originally Posted by *pez*
> 
> You're gaming on 1080p based on what your sig says. That can seriously be a huge factor. And with a 1080, going to 1440p would be a huge step up for you. I understand why a lot of people don't move from 1080p when using high refresh panels, but I'd be willing to bet that's the case.
> 
> Are you legitimately seeing 80+% CPU usage?


Yes, high refresh rates are the cause of all that bottleneck, and the fact that I'm gaming at 1080p just makes it worse. I really don't feel like compromising fluidity for resolution, so upgrading to 1440p is something I'll do once I'm able to keep 120+ fps at that resolution (1080 SLI, maybe). For now, I'm more interested in solving this bottleneck so I can play at 1080p/120+fps. The 4690K gives me a huge bottleneck even in games like Rise of the Tomb Raider (if I try to lower graphics to get high FPS). Games like Rainbow Six, Battlefield 1, GTA V, and Watch Dogs are the worst. I know those are really CPU bound, but they're the games I enjoy the most right now, and that's why this is so important to me.

I had really decided to wait for Cannonlake, but that seems a little too far away for me, like 1.5 years. And maybe I'll have the opportunity to get a cheap X99 + 5930K or 6800K soon, so I'm considering that.


----------



## CapKrunch

Hi, does anyone know what size screwdriver I need to remove the backplate and fan cooler on the MSI GTX 1080 Gaming X? I'm waiting on a package with a waterblock arriving at my door today, and I'd like to get everything ready.

Thanks


----------



## netok

I saw a benchmark comparing many custom 1080 cards, but I can't find it now.
Could anyone kindly give me the link? Thanks.


----------



## pez

Quote:


> Originally Posted by *juniordnz*
> 
> Yes, high refresh rates are the cause for all that bottleneck. And the fact that I'm gaming in 1080p just makes it worse. I really don't feel like compromising fluidity for resolution. So upgrading to 1400p is something I'll be doing when I'm able to keep 120+ fps at that resolution (1080 sli maybe). But for now, I'm more into solving this bottleneck issue to be able to play at 1080p/120+fps. And the 4690K gives me a huge bottleneck even in games like Rise of The Tomb Raider (if try to lower graphics to get high FPS). Games like Rainbow Six, Battlefield 1, GTA V, WatchDogs are the worst. And I know those are really CPU bound, but those are the game I enjoy the most right now, and that's why this is so important to me.
> 
> I was really decided to wait for cannonlake, but that seems a little too far away for me. Like 1,5 year. And maybe I'll have the oportunity to get a cheap x99+5920K or 6800K soon, so I'm considering that.


What games are you playing that you wouldn't be at 144 fps on that panel? I imagine if you're playing OW or CS:GO you wouldn't have an issue. I realize other AAA titles are going to be difficult, but I'm curious to know your primary focus(es).


----------



## juniordnz

Quote:


> Originally Posted by *pez*
> 
> What games are you playing that you wouldn't be at 144 fps on that panel? I imagine if you're playing OW or CS:GO you wouldn't have an issue. I realize other AAA titles are going to be difficult, but I'm curious to know your primary focus(es).


Actually, I have set my panel to 120Hz so I can use blur reduction; that's why I always mention 120+ and not 144+. That's 17% fewer frames, so it should help a little.
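That 17% figure checks out, by the way (quick arithmetic sketch):

```python
# Frame-count difference between a 144 Hz panel and a 120 Hz cap.
full, capped = 144, 120
reduction = (full - capped) / full  # fraction of frames given up

print(f"{reduction:.1%} fewer frames per second")
```

(144 - 120) / 144 is one sixth, about 16.7%, which rounds to the 17% quoted.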

I left CS:GO after 1400 hours and now I play mostly competitive R6S and some single-player titles like ROTTR, GTA V, and Watch Dogs. That's what I'm playing right now. I'm mostly into first-person, fast-paced shooters. Even in third-person games like ROTTR, GTA V, and Watch Dogs I enjoy the fluidity 120fps gives when making fast movements, aiming, everything.

R6S is not a problem; I can get 120+ most of the time, even though it could get better with more firepower from the CPU. The other games bother me the most. It's like 99/60, 99/70 CPU/GPU usage most of the time.

That's why I'm asking if a 6-core like a 5820K/6800K (which one is better is a whole other topic) would suffice to run everything I throw at it without bottlenecking my 1080.


----------



## pez

Quote:


> Originally Posted by *juniordnz*
> 
> Actually, I have set my panel to 120hz so I can use blur reduction. That's why I alweys mention 120+ and not 144+. That's 17% less frames, so it should help a little.
> 
> I left CSGO after 1400hrs and now I play mostly competitive R6S and some other single players titles like ROTTR, GTA V, WatchDogs. That's what I'm playing right now. I'm mostly into first person, shooting, fast paced games. Even on third person games like ROTTR, GTA V and WatchDogs I enjoy the fluidity 120fps gives when making fast movements, aiming, everything.
> 
> R6S is not a problem, I can get 120+ most of the time. Even though It could get better with more firepower from the CPU. The other games bother me the most. It's like 99/60, 99/70 CPU/GPU usage most of the time.
> 
> That's why I'm asking if a 6 core like a 5820K/6800K (which one is better is a whole other topic) would sufice to run everything I throw at it without bottlenecking my 1080.


Understood. I'm super happy to let G-Sync take over on anything under 100Hz on my panel, and OW and CS:GO definitely don't have an issue, but GTA V at max definitely gets below 100 along with just about any other AAA title. It's an uphill struggle to take the work off of the CPU at 1080p, but your options are (as you mentioned) going X99 or going SLI. The benefit with X99 is that you get x16/x16 for SLI if you still decide to do so. But I have to say, you'd be the first person I know to run SLI 1080s at 1080p.



----------



## juniordnz

Quote:


> Originally Posted by *pez*
> 
> Understood. I'm super happy to let g-sync take over on anything under 100Hz on my panel, but OW and CS:GO definitely don't have an issue. GTA V at max definitely gets below 100 along with just about any other AAA title. it's an uphill struggle to take the work off of the CPU for 1080p, but your options are (what you mentioned) going X99 or going SLI. The benefit though is with X99 you do get x16/x16 for SLI if you still decide to do so. But I have to say you'd be the first person I know to have SLI 1080s and use 1080p


No, no. I would only go for SLI once I upgrade to 1440p. Doing that at 1080p is like killing a cockroach with a grenade (they deserve it, though). That's why I think the important thing now is to get rid of the CPU bottleneck so I can squeeze the maximum juice out of this card. I believe I can get at least pretty close to a steady 120fps in almost everything by compromising one useless filter or another, but to do so I would need a better CPU. Most new titles are scaling very well with increasing core counts/hyperthreading, which is why I'm guessing a 5820K/6800K would be way more "futureproof" than a 6700K. Didn't know about the two x16 lanes, that's nice...


----------



## pez

Yep, that's why I considered it at one point, but ultimately I decided to go small again.


----------



## T-Rez

Quote:


> Originally Posted by *lifeisshort117*
> 
> Something I just thought of for those to try with coil whine:
> 
> Get rivatuner and set fps cap to 60 (or whatever your monitor's cap is)
> 
> The coil whine is gnarly on most cards because frames are so high when they don't need to be. Sure you can brag about it, but then again the coil whine associated with it isn't much to brag about.


True. A few weeks ago I set my fps cap to 164 using Inspector because I have an ASUS XB271HU Predator. This significantly reduced the coil whine to being hardly noticeable at all. Before that, when I would hit over 200fps, the whine was REALLY loud.
Quote:


> Originally Posted by *Fediuld*
> 
> a) To start, 1.131v is too much to say the least if possible. The significant majority that has aircooled cards, can do 2100 at 1.093v. Watercooled ones doing 2177 at 1.081
> b) I wonder if your 1.131v bios represents truly that voltage or not. It if does, does anyone believe that would work on MSI Armor OC and what are the chances to brick the card?
> 
> ----
> c) Has anyone flashed MSI Armor, with Gaming Z or the Seahawk BIOS?


According to every OC utility I've used (MSI, ASUS, EVGA) I'm hitting 1.131V max. I wonder if that is correct, though, because I keep hitting the power limit according to those same utilities. The highest clock I can get is 1980MHz; anything higher and games start crashing. This is with my temps never going over 76C.

I guess I just lost the silicon lottery big time (as usual lol).


----------



## Fediuld

Quote:


> Originally Posted by *T-Rez*
> 
> True. A few weeks ago I set my fps to 164 using inspector because I have an asus xb271hu predator. This significantly reduced the coil whine to being hardly noticeable at all. Before that when I would hit over 200fps the whine was REALLY loud.
> According to every OC utility I've used (MSI, ASUS, EVGA) I'm hitting 1.131v max. I wonder if that is correct though because I keep hitting the power limit according to those same utilities. The highest clock I can get it to is 1980hz. Anything higher and games start crashing. This is with my temps never going over 76C.
> 
> I guess I just lost the silicone lottery big time (as usual lol).


Use MSI AB only.
Create a curve where you do 2164 @ 1.075 and lock it there (L key). Set fan speed at 100% and test it.


----------



## Synthetickiller

First video card I've bought in 4 years. Can't wait to see if I won the silicon lottery or can at least get 2000MHz, lol. :thumb:


----------



## tps3443

Quote:


> Originally Posted by *Groo21*
> 
> This is kind of funny because I have the exact opposite problem.
> My Fallout 4 is unplayable (thread pending). My CPU utilization rarely exceeds 40%. I haven't be able to figure it out yet. GPU rarely gets beyond 60%.


My GPU usage is a little higher because I'm playing Fallout 4 upscaled to 4K! If I run 1080p it is around 30-40% I think, lol.

The gtx1080 is a crusher though. I run it with +235 Core, and +525 memory. It is a boss.

I hope we get a 22 month life cycle with this gtx1080, like the gtx980 had.


----------



## Bishop07764

Quote:


> Originally Posted by *Synthetickiller*
> 
> First video card I've bought in 4 years. Can't wait to see if I won the silicon lottery or at least can get 2000mhz, lol.
> 
> 
> 
> 
> 
> 
> 
> :thumb:


Hope you got a good one. Mine is pretty happy with 2126 core daily driver clock at stock voltage.


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> Hope you got a good one. Mine is pretty happy with 2126 core daily driver clock at stock voltage.


Thanks!
I'm dying to hook it up, but I'm missing a 180mm fan that arrives tomorrow.
I have 2 DP to DVI converters that should let me run 1440p on all of my qnix 2710 monitors.


----------



## Benjiw

Quote:


> Originally Posted by *juniordnz*
> 
> I see a lot of people with 5820/5830 here. Just tought I should ask:
> 
> Do you see, in ANY scenario, in ANY game, CPU bottleneck with that processor?
> 
> The fact that I bought the 1080 to get 120+ fps and I'm getting seriously bottlenecked by my 4.5ghz 4690K reallt bothers me. I'm considering an upgrade to a 5820K (since i've read it performs better than a 6800K with both overclocked).
> 
> Thanks in advance


Bottlenecked? Hmm, I was going to ask if you could get some better cooling, but you already have a decent cooler, and since you're in Brazil, running your chip naked like mine maybe wouldn't be as helpful.

Here's my benching clocks, but my 24/7 clock is 4.7ghz @ 1.5v RAM 2400mhz 11-12-12-28-1T and I think I started to fine tune the other timings but not 100%.


----------



## StreaMRoLLeR

Any non-OC Strix 1080 owners here? I'd like to share some data.

Using stock voltage, 110% PT, Obsidian 900D (average airflow),
+190MHz core, +200MHz memory (games don't care, 200 is the sweet spot).

At 3440x1440 my Strix 1080 hovers around 1940-1970-1911 in Witcher 3 (all ultra, HairWorks off) at around 60-62 fps minimum.

In Just Cause 3 the card can stay near 2025-2050 and never goes below. Why on earth is Witcher 3 so demanding that my card never hits 2000 (well, it hits 2k while it's not warm, for about 10 seconds)?

At +208 core I see artifacts / black boxes in Witcher 3, but in Heaven the card reached 2088MHz (aggressive fan).

I've read a decent amount of info here and others' 1080s generally run around 65-68C max. Mine is 71C max on auto fan. Did I lose the silicon lottery or not? I'd like to know, so please share your data too :thumb:


----------



## juniordnz

Quote:


> Originally Posted by *Benjiw*
> 
> Bottle necked? Hmmm, I was going to say could you not get some better cooling but you have a decent cooler but you're in brazil so maybe running your chip naked like mine wouldn't be as helpful.
> 
> Here's my benching clocks, but my 24/7 clock is 4.7ghz @ 1.5v RAM 2400mhz 11-12-12-28-1T and I think I started to fine tune the other timings but not 100%.


I have a pretty lazy i5; couldn't get it stable at 4.5GHz yet. Tried 1.32V and stock uncore, no go; it crashed like crazy in GTA V and R6S. But even in the short time I could play at that clock, I still got a crazy bottleneck in both games. Are you currently gaming with that i5 and a 1080? If so, could you take a look at some of the games I mentioned (R6S, GTA V, Watch Dogs, Battlefield 1)? Of course, the bottleneck happens because I'm trying to get 120+fps on a 1080p panel.


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> I have a pretty lazy i5. Couldn't get it stable yet at 4.5ghz. Tried 1.32V and stock uncore, no go. Crashed like crazy on GTA V and R6S. But even at the short time I could play with that clock, I still got crazy bottleneck on both games. You're currently gaming with that i5 and a 1080? If you could take a look at some of the games I mentioned (R6S, GTA V, Watch Dogs, Battlefield 1). Of course, the bottleneck happens because I'm trying to get 120+fps on a 1080p panel.


What do you have, a 6600K?

I can pull an 18,400 overall in Fire Strike with a 25,600 graphics score when I'm running the CPU at 4.8 and really pushing my MSI GTX 1080 Founders Edition.

I think about 93% of all 6600Ks will run at 4.68+ GHz.

Here is what my 6600K can do. I bought it from Silicon Lottery on sale as a 4.8GHz chip for $229.99.

Sometimes 1.32V is not enough for 4.5GHz.

I can run mine at:

4.5GHz = 1.345V

4.61GHz = 1.356V

4.7GHz = 1.404V

4.81GHz = 1.428V

5.0GHz = 1.508V

5GHz will run stable in games and benchmarks, but I cannot keep it cool enough for Intel Burn Test or anything like that, lol. It will go from 40C idle to 95C in about 0.2 seconds after clicking "Run burn test".

I run my Uncore at the same speed as my CPU speed, that is usually what is recommended.
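Purely as an illustration, you can linearly interpolate between posted stable points like those to guess a starting voltage for an in-between clock. Real V/F behavior is chip-specific and non-linear, so treat this as a ballpark only:

```python
# Linear interpolation between the posted stable points (GHz -> vcore).
# Rough sketch only: every chip is different and scaling is not linear.
points = [(4.50, 1.345), (4.61, 1.356), (4.70, 1.404),
          (4.81, 1.428), (5.00, 1.508)]

def volts_for(ghz):
    """Estimate required vcore at a clock between two tested points."""
    for (f0, v0), (f1, v1) in zip(points, points[1:]):
        if f0 <= ghz <= f1:
            t = (ghz - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    raise ValueError("clock outside tested range")

print(round(volts_for(4.9), 3))
```

For example, a 4.9GHz target lands at roughly 1.466V on this curve, which you'd then validate with actual stability testing.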


----------



## juniordnz

Quote:


> Originally Posted by *tps3443*
> 
> What do you have a 6600K?
> I can pull a 18,400 overall in FIRESTRIKE with a 25,600 graphics when I'm running at 4.8 CPU, and really pushing my MSI GTX1080 Founders Edition
> I think about 93% of all 6600K's will run at 4.68+ GHz.
> Here is what my 6600K can do, I Bought it from Silicon Lottery on sale as a 4.8Ghz chip For $229.99.
> 1.32 Volts is not enough sometimes for 4.5Ghz
> 
> I can run mine at,
> 4.5Ghz=1.345 Volts
> 4.61Ghz=1.356 Volts
> 4.7Ghz= 1.404 Volts
> 4.81Ghz= 1.428 Volts
> 5.0Ghz= 1.508 Volts
> 
> 5Ghz will run stable in games, and benchmarks. But, I cannot keep it cool enough for Intel burn test or anything like that lol. It will go from 40C idle to 95C in about 0.2 seconds after clicking "Run burn test"
> I run my Uncore at the same speed as my CPU speed, that is usually what is recommended.


I have a 4690K. I'll have to fiddle with the overclock again as soon as I get my new 1080 here. But at 4.5GHz I could get 16500 in Fire Strike.


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> I have a 4690K. I'll have to fiddle again with overclock as soon as I get my new 1080 here. But with 4.5ghz I could get 16500 on firestrike.


If I'm not mistaken you had an EVGA, and you were having issues with it. Which one did you get?

I'm considering a new CPU. I know a modern overclocked i5 is about the minimum for a GTX 1080, and my overclocked 6600K is borderline even being enough.

Other than Fallout 4 being abnormally CPU hungry, I've never bottlenecked an overclocked GTX 1080 in any other game at 1080p.


----------



## juniordnz

Quote:


> Originally Posted by *tps3443*
> 
> If I'm not mistaken you had a evga, and you were having issues with it. Which one did you get?
> 
> I'm considering a new CPU, I know a modern i5 overclocked is about the minimum for a GTX1080, and my 6600K overclocked is border line even being enough.
> 
> Other than Fallout 4 being abnormally CPU hungry, I've never bottlenecked a overclocked GTX1080 in any other games at 1080P.


I had a 1080 FTW. It started showing some green dot artifacts and not being recognized by Windows, so I sent it back last Thursday. A new one is already on the way and I should get it by Friday. Maybe, just maybe, life will compensate for all this trouble with a better position in the silicon lottery? I can only hope...

Try some BF1 later. Try to get 120+ FPS and keep an eye on CPU and GPU load. You'll see what I'm talking about =(


----------



## Benjiw

Quote:


> Originally Posted by *juniordnz*
> 
> I have a pretty lazy i5. Couldn't get it stable yet at 4.5ghz. Tried 1.32V and stock uncore, no go. Crashed like crazy on GTA V and R6S. But even at the short time I could play with that clock, I still got crazy bottleneck on both games. You're currently gaming with that i5 and a 1080? If you could take a look at some of the games I mentioned (R6S, GTA V, Watch Dogs, Battlefield 1). Of course, the bottleneck happens because I'm trying to get 120+fps on a 1080p panel.


Keep uncore close to core speed to help stability.


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> I had a 1080 FTW. Started showing some green dot artifacts and not being recognized by windows, so i sent it back last thursday. A new one is alreadyon the way and I should get it by friday. Maybe, just maybe, life could compensate all this trouble with a better position in silicon lottery? I can only hope...
> 
> Try some BF1 later. Try to get 120+ FPS and keep an eye for CPU and GPU load. You'll see what I'm talking about =(


I cannot play the BF1 beta anymore; it says it has expired. I'll try to get it working.


----------



## juniordnz

Quote:


> Originally Posted by *Benjiw*
> 
> Keep uncore close to core speed to help stability.


Really? I always thought it was the other way around (high uncore cripples core overclocking).


----------



## Benjiw

Quote:


> Originally Posted by *juniordnz*
> 
> Really? I alçways thought it was the other way around (high uncore cripples core overclocking)


Increasing the uncore has helped quite a few people get their clocks stable. I think a higher uncore might lower your memory overclocks, as it speeds up the NB frequency that helps the RAM communicate with the controller on your chip, but I'm not 100% sure as I'm new to Intel.

By the way, I also replied to you in the Haswell overclocking thread.


----------



## Synthetickiller

Did anyone see the 30th anniversary edition of MSI's 1080?

https://www.ekwb.com/wp-content/uploads/2016/09/GeForce-GTX-1080-30th-Anniversary-V336_2D.jpg

http://www.guru3d.com/news-story/msi-releases-limited-edition-geforce-gtx-1080-30th-anniversary-graphics-card.html


----------



## AllGamer

Quote:


> Originally Posted by *Synthetickiller*
> 
> Did anyone see the 30th anniversary edition of MSI's 1080?
> 
> https://www.ekwb.com/wp-content/uploads/2016/09/GeForce-GTX-1080-30th-Anniversary-V336_2D.jpg
> 
> http://www.guru3d.com/news-story/msi-releases-limited-edition-geforce-gtx-1080-30th-anniversary-graphics-card.html


Interesting, yet another EK version.

This is similar to the Corsair version but with EK AIO instead of Corsair AIO

Can't really say I'm a fan of it, but it's a good solution for people that want water cooling on a budget.

...been there, done that, and I outgrew it after learning all the annoying limitations of AIO devices.


----------



## boredgunner

^ It's gorgeous. So it has a full coverage block, and what amounts to an EK Predator 120 system. Not bad. I would like the block separately.


----------



## AllGamer

Quote:


> Originally Posted by *boredgunner*
> 
> ^ It's gorgeous. So it has a full coverage block, and what amounts to an EK Predator 120 system. Not bad. I would like the block separately.


That's true, it's a very sexy card. If I didn't have a custom water loop, I'd have been jumping all over it already.

Just like when they released the EK Sea Hawk 1080 for custom water loops


----------



## Groo21

Any experienced graphics coders in the house?

I took some traces for my current microstutter issues in Fallout 4 on Windows 10 with a GTX 1080.

It looks like the GPU stops working on the Fallout4 process while there are still frames in the context queue. It then restarts at the next vsync on the frame that was left in the device context.

That doesn't seem right.



Full-Res: Here

EDIT: In looking at the trace more, a context change causes the GPU to stop, and it never comes back.


----------



## grimboso

Quote:


> Originally Posted by *Synthetickiller*
> 
> Thanks!
> I'm dying to hook it up, but I'm missing a 180mm fan that arrives tomorrow.
> I have 2 DP to DVI converters that should let me run 1440p on all of my qnix 2710 monitors.


Active versions? Where did you get those? All the active DVI-DP adapters I've found have been in the $100 range.

Passive adapters are, to my knowledge, unusable (and were for me on both the Qnix and the Crossover I've had).


----------



## shadow85

Does overclocking the memory on our GTX 1080s increase fps in 4K games at all?


----------



## juniordnz

Quote:


> Originally Posted by *Benjiw*
> 
> Uncore increase has helped quite a few people get their clocks stable, I think uncore might lower your memory overclocks as it's speeding up the NB frequency that helps ram communicate with the controller on your chip but I'm not 100% as I'm new to intel.
> 
> I replied to you in the haswell overclocking thread also btw.


I'll definitely try that. Thanks for the tip.


----------



## Cornerer

Don't know if this has been posted before, but just a heads up for Palit/Gainward users:

https://www.ekwb.com/news/new-ek-water-blocks-for-multiple-palit-and-gainward-graphics-cards/

Also, I'm now stomping my feet for not holding off on my 1080 purchase until this came out:

http://wccftech.com/msi-announces-gtx-1080-30th-anniversary-edition-limited-edition-gpu-decorated-waterblock/


----------



## GreedyMuffin

Been gaming on 200x MHz at 0.900V. Max peak TDP is 91%.


----------



## Synthetickiller

Quote:


> Originally Posted by *grimboso*
> 
> Active version? Where did you get those? All active DVI-DP i've found have been in the 100$-range
> 
> 
> 
> 
> 
> 
> 
> 
> Passive adapters are to my knowledge unusable (and was for me on both the Qnix and the Crossover I've had).


I haven't tried the ones I ordered. Hopefully, it'll be up & running tonight. I just don't want to say that they work before they work, lol.

Quote:


> Originally Posted by *Cornerer*
> 
> Don't know if this has been posted b4 but just a heads up for Palit/Gainward users:
> 
> https://www.ekwb.com/news/new-ek-water-blocks-for-multiple-palit-and-gainward-graphics-cards/
> 
> Also I'm now stomping my feet for didn't able to hold longer with my 1080 purchase until this comes out:
> 
> http://wccftech.com/msi-announces-gtx-1080-30th-anniversary-edition-limited-edition-gpu-decorated-waterblock/


Yeah, on the previous page.

I wonder how the setup compares to the Seahawk X in terms of temps and cost, and whether it's possible to expand the loop (looks like it, since it has a fill port).


----------



## redshoulder

Msi 30th only £999 at Overclockers UK!


----------



## GreedyMuffin

I wonder how the stock FE cooler would handle the 1080 at 0.900V and 2GHz. Max TDP peak is currently at 91%, which is about 160 watts.

Would be fun to test, but then I'd need to drain the loop etc... :I
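That "about 160 watts" lines up: the GTX 1080 Founders Edition has a 180W reference board power, so a 91% TDP reading works out like this (quick sanity-check sketch):

```python
# GTX 1080 Founders Edition reference board power is 180 W.
# A "91% TDP" reading therefore corresponds to roughly:
reference_tdp_w = 180
reading = 0.91

draw_w = reference_tdp_w * reading
print(f"{draw_w:.0f} W")
```

That gives about 164W, close to the quoted 160W figure.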


----------



## Cornerer

Quote:


> Originally Posted by *redshoulder*
> 
> Msi 30th only £999 at Overclockers UK!


This actually isn't too bad compared to the price of the Zotac ArcticStorm, which doesn't even include the EK "AIO".


----------



## AllGamer

Quote:


> Originally Posted by *redshoulder*
> 
> Msi 30th only £999 at Overclockers UK!


Is that a good thing or a bad thing?

I'm not sure what's considered a "normal" price across the pond.


----------



## chantruong

Has anyone here tried 1080 SLI in x16/x8 mode? How's the performance compared to x16/x16, and can you use an HB bridge at x16/x8? I have a 5820K and would prefer not to upgrade to a 40-lane CPU. I usually game at 5760x1080, if resolution matters. The article about the Titan XP at x16/x16 vs x8/x8 made me think about this.
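For anyone weighing x16 vs x8, the raw per-direction PCIe 3.0 bandwidth is easy to back-of-envelope: 8 GT/s per lane with 128b/130b encoding. (Whether games actually saturate an x8 link is a separate question.)

```python
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding.
# Usable throughput per direction in GB/s = lanes * 8 * (128/130) / 8.
def pcie3_gbps(lanes):
    return lanes * 8 * (128 / 130) / 8  # GB/s per direction

x16 = pcie3_gbps(16)
x8 = pcie3_gbps(8)
print(f"x16: {x16:.2f} GB/s, x8: {x8:.2f} GB/s")
```

So x16 gives roughly 15.75 GB/s and x8 roughly 7.88 GB/s each way; most benchmarks of the era showed only small fps differences between the two.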

Thanks,


----------



## Benjiw

Quote:


> Originally Posted by *Groo21*
> 
> Any experienced graphics coders in the house?
> 
> I took some traces for my current microStutter issues on fallout 4 in windows 10 with a GTX 1080
> 
> It looks like the GPU stops working on the fallout4 process while there are still frames in the context queue. It then restarts at the next vsync frame on the frame that was left in the device context.
> 
> That doesn't seem right.
> 
> 
> 
> Full-Res: Here
> 
> EDIT: In looking at the trace more, a context change causes the GPU to stop, and it never comes back.


Your CPU is at stock; there's the issue. Fallout 4 is a CPU-bound game.


----------



## Bishop07764

Quote:


> Originally Posted by *Cornerer*
> 
> This actually isn't too bad compare to price of Zotac ArcticStorm with EK "AIO" missing.


Dang. My pick might have been different had this been around when I bought my 1080. It would have been significantly cheaper than the Seahawk EK, and flashing the AMP Extreme BIOS onto it would likely bring the default boost over 2GHz. Funny, my max stable tested clocks are exactly what his were. Oh well.


----------



## juniordnz

Well, exactly one week after I took my defective 1080 FTW in for RMA, I got my new one today. I think I did a little better in the silicon lottery with this one; I got a couple more boost bins out of it than my previous card.

The old one would do 2000MHz out of the box and 2113MHz overclocked at stock voltage (with the new drivers; previously it did 2126MHz). The new one does 2025MHz out of the box and 2138MHz overclocked at stock voltage. Got the same +500 on the memory, though.

I won't be overclocking, though. No point doing so if my 4690K is bottlenecking me in almost everything I play.

That's at least a nice compensation for my one-week gaming hiatus.


----------



## tps3443

I'm considering one of those MSI 30th Anniversary cards. I like my video card to be fairly quiet, and I'm at the point now where I've stopped overclocking my MSI GTX 1080 Founders Edition altogether. It makes it scream all the time, and I've been running into bugs with Precision XOC 6.06 and MSI AB Beta 4. They both force 3D clock speeds in 2D mode constantly, and they will not start with Windows.

I've found the card runs about 58°C on the auto default fan profile and boosts to 1911 MHz core; the memory is only 10,000, of course. But until I get a real 4K panel, or at least 1440p or an ultrawide, 1080p just cannot push the card enough to even heat up!

This is just total overkill at 1080p, lol.

I think 3440x1440 is about perfect for a GTX 1080, as performance is about 35% faster than at real 4K/UHD. So average fps are 70-100, which is amazing performance!
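
For what it's worth, the 35% figure is roughly borne out by raw pixel counts. (A back-of-the-envelope sketch only; real frame rates rarely scale perfectly with resolution.)

```python
# Pixel counts for the resolutions being compared.
def pixels(width: int, height: int) -> int:
    return width * height

uhd = pixels(3840, 2160)        # 8,294,400 pixels
ultrawide = pixels(3440, 1440)  # 4,953,600 pixels

# 3440x1440 pushes ~40% fewer pixels than UHD, which is in the same
# ballpark as ~35% higher frame rates, all else being equal.
print(f"{1 - ultrawide / uhd:.0%}")  # -> 40%
```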

An overclocked GTX 1080 would be good for a 1080p HDR display with a 240 Hz refresh rate! Lmao.

Are HDR displays coming out soon yet?

Should I keep waiting on a display?


----------



## pLuhhmm

Hey, does anyone know how to properly configure Pascal voltage/frequency curves manually when overclocking? Specifically with Gigabyte's overclocking software?


----------



## juniordnz

Quote:


> Originally Posted by *pLuhhmm*
> 
> Hey, does anyone know how to properly configure Pascal voltage/frequency curves manually when overclocking? Specifically with Gigabyte's overclocking software?


That's a terrible graph, lots of points missing. Afterburner's graph is a lot more precise; you should use that. What's so special about the Gigabyte software anyway? Anything you actually need it for?


----------



## ShortySmalls

Quote:


> Originally Posted by *juniordnz*
> 
> That's a terrible graph, lots of points missing. Afterburner's graph is a lot more precise; you should use that. What's so special about the Gigabyte software anyway? Anything you actually need it for?


It's the only way I found to control the lighting on Gigabyte's 1080s before I water-cooled mine and got rid of the lighting altogether.


----------



## Synthetickiller

Quick question for everyone here.
I've been benchmarking with Heaven 4.0 Basic (can't run Firestrike or anything else until I figure out exactly why my CPU sits at 90°C in Prime95 while water cooled).
The GPU & memory speeds in Heaven don't match what Afterburner 4.3.0 Beta 14 says. The difference is about 40 MHz on the GPU & 8 MHz on the RAM clock. Minor stuff, but I'm a little confused.

Here's what afterburner is giving me.


Also, here's a nifty physical comparison of a WC'd GTX 690 & the MSI GTX 1080 EK X:



Quote:


> Originally Posted by *grimboso*
> 
> Active version? Where did you get those? All the active DVI-DP adapters I've found have been in the $100 range
> 
> 
> 
> 
> 
> 
> 
> 
> Passive adapters are, to my knowledge, unusable (and were for me on both the Qnix and the Crossover I've had).


The short story is that they work.
The long story is that's not exactly the case when OC'ing. My QNIX QX2710s can push about 72 Hz with these active converters if I'm lucky, but things get a little wonky. A direct DVI connection lets me OC the monitor: I run anywhere from 96 Hz to 110 Hz. Sometimes 120 Hz is stable, but not reliably enough to be worth pursuing. Anything over 120 Hz isn't stable at all.

I used the GoFranco adapter after reading a blog entry about them.
Most adapters seem to only allow 1920x1080/1200. These have absolutely no issue pushing 1440p @ 60 Hz (72 Hz if you're lucky). I try to run 72 Hz/96 Hz when watching movies & whatnot to avoid that annoying judder effect.
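
The judder point is simple arithmetic: 24 fps film only maps cleanly onto refresh rates that are whole multiples of 24. A quick sketch:

```python
# 24 fps content judders unless each frame occupies a whole number of
# refresh cycles, i.e. the refresh rate divides evenly by 24.
FILM_FPS = 24

def judder_free(refresh_hz: int, content_fps: int = FILM_FPS) -> bool:
    return refresh_hz % content_fps == 0

for hz in (60, 72, 96, 110, 120):
    print(hz, judder_free(hz))
# 72, 96 and 120 divide evenly by 24; 60 and 110 do not.
```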


----------



## LiquidHaus

What I wanna know is if a version of GPU-Z will come out that'll show ASIC quality.

Not sure if this new architecture somehow locks out reading ASIC quality, just like it locks down the BIOS so much.


----------



## Cornerer

Quote:


> Originally Posted by *tps3443*
> 
> I've found the card runs about 58°C on the auto default fan profile and boosts to 1911 MHz core; the memory is only 10,000, of course. But until I get a real 4K panel, or at least 1440p or an ultrawide, 1080p just cannot push the card enough to even heat up!
> 
> This is just total overkill at 1080p, lol.
> 
> I think 3440x1440 is about perfect for a GTX 1080, as performance is about 35% faster than at real 4K/UHD. So average fps are 70-100, which is amazing performance!
> 
> An overclocked GTX 1080 would be good for a 1080p HDR display with a 240 Hz refresh rate! Lmao.
> 
> Are HDR displays coming out soon yet? Should I keep waiting on a display?


I know I'm in the minority, but I'm the kind of nerd who can easily tell the difference between 4K and 1440p DSR quality in games on a 1080p panel. Things look awful to my eyes even at 1440p with no DSR.

Even a single GTX 1080 has serious trouble making good use of 1080p 144 Hz if you're also aiming for ultimate downsampling eye candy.


----------



## KickAssCop

So what do you guys think of 2050/10500 daily clocks on air with the default fan curve for ASUS Strix cards in SLI? The cards remain silent and hit 75°C on the top card and 65°C on the bottom card.


----------



## Synthetickiller

I don't think this is a really solid clocking card. 2100 MHz seems to be impossible. I can get 2050/2088 (GPU-Z & Heaven report different speeds). The memory on this thing is crazy, though. +625 MHz in Afterburner is a cakewalk, but 650 MHz fails. I can still push 5500 MHz RAM easily.

I was originally having issues with 3 monitors hooked up while running Heaven on the primary. The 2nd & 3rd used a DP-to-DVI active adapter by GoFranco. The adapters work, but seem just a hair glitchy. Moving to a single monitor (this benchmark) dramatically improved stability & I gained 2 or 3 fps.

Decent score?


----------



## juniordnz

That's with a Sea Hawk EK? And here I thought MSI would use some better GPUs on those...

Must be a bummer to have all that cooling and such an average OC.

Looks awesome, though. And you won't lose a single clock to thermal throttling.


----------



## Joenc

Hey juniordnz...

Have you noticed any physical difference between the old and the new RMA GPU?

I was getting ready to get an EVGA 1070/1080, but after reading about all the problems

on their website, I'm waiting for a bit...

Have you tested the new card with Witcher 3?

Thanks...

Quote:



> Originally Posted by *juniordnz*
> 
> Well, exactly one week after taking my defective 1080 FTW in for RMA, I got my new one today. I think I did a little better in the silicon lottery this time: I got 2 clocks more out of it than my previous one.
> 
> The old one would do 2000 MHz out of the box and 2113 MHz overclocked at stock voltage (with the new drivers; it previously did 2126 MHz). The new one does 2025 MHz out of the box and 2138 MHz overclocked at stock voltage. Got the same +500 on the memory, though.
> 
> I won't be overclocking, though. No point doing so when my 4690K bottlenecks me in almost everything I play.
> 
> That's at least nice compensation for my one-week gaming hiatus.


----------



## Synthetickiller

Quote:


> Originally Posted by *juniordnz*
> 
> That's with a seahawk EK? And here I thought MSI would use some better gpus on those...
> 
> must be a boomer to have all that cooling and such an average oc.
> 
> Looks awesome, though. And you won't loose a single clock to thermal throttle.


Yeah, it's the EK X... I'm halfway tempted to move to a 1080 Ti when that drops, or hold out for Volta. I don't get why MSI wouldn't bin at least 2100 MHz+ GPUs into this thing.
I never plan on air cooling again, since my loop can support anything I'll realistically throw at it. That's why I went with this. Maybe I should sell it & try my luck with a Zotac, lol. That was my original choice (or water-cooling an FE), but they were OOS (or the price was higher) & the price on the EK X dropped $50, so I jumped on it.

Compared to my 690, the 1080 runs everything at higher settings at 1440p (vs 1080p) and at double the frame rate. I'm used to upgrading GPUs yearly or every 2 years. I'm impressed how my 690 held on, but the 4-year gap really makes the jump feel enormous. Realistically, I don't even need to overclock this card, but I just hate spending that kind of cash on a card that barely breaks 2000 MHz.

The temps are insane in Heaven. With everything maxed at 1440p it doesn't break 44°C.
I don't have coil whine either. I take that back: when exiting the Heaven benchmark, I get it momentarily & it's seriously loud. Otherwise, during benchmarking, desktop/2D, or gaming, I hear nothing.

I need to try the voltage curve method for OCing. Maybe I can get this guy to 2100 MHz. If it's compatible, maybe the MSI Gaming Z BIOS would be better & let me eke out that extra 50 MHz+?


----------



## juniordnz

Quote:


> Originally Posted by *Synthetickiller*
> 
> Yeah, it's the EK X... I'm halfway tempted to move to a 1080 Ti when that drops, or hold out for Volta. I don't get why MSI wouldn't bin at least 2100 MHz+ GPUs into this thing.
> I never plan on air cooling again, since my loop can support anything I'll realistically throw at it. That's why I went with this. Maybe I should sell it & try my luck with a Zotac, lol. That was my original choice (or water-cooling an FE), but they were OOS (or the price was higher) & the price on the EK X dropped $50, so I jumped on it.
> Compared to my 690, the 1080 runs everything at higher settings at 1440p (vs 1080p) and at double the frame rate. I'm used to upgrading GPUs yearly or every 2 years. I'm impressed how my 690 held on, but the 4-year gap really makes the jump feel enormous. Realistically, I don't even need to overclock this card, but I just hate spending that kind of cash on a card that barely breaks 2000 MHz.
> The temps are insane in Heaven. With everything maxed at 1440p it doesn't break 44°C.
> I don't have coil whine either. I take that back: when exiting the Heaven benchmark, I get it momentarily & it's seriously loud. Otherwise, during benchmarking, desktop/2D, or gaming, I hear nothing.
> 
> I need to try the voltage curve method for OCing. Maybe I can get this guy to 2100 MHz. If it's compatible, maybe the MSI Gaming Z BIOS would be better & let me eke out that extra 50 MHz+?


You can try it; it won't hurt, since all custom MSI cards share the same PCB/power design. But from my experience it won't raise your maximum overclock. I had a 1080 Armor, and flashing the Gaming Z BIOS only raised my base/boost clocks (as expected); the max overclock stayed exactly the same.

I heard Zotac's waterblock version overclocks like a champ; maybe you can return the MSI you got and get that?
Quote:


> Originally Posted by *Joenc*
> 
> Hey juniordnz...
> Have you noticed any physical difference from old to new rma gpu ?
> 
> I was getting ready to get a 1070/1080 evga but after reading all the problems
> on their website , I'm waiting for a bit...
> 
> Have you tested the new card with witcher 3 ..
> thanks...


Physical? Only in the box contents. The plastic wrapping the card had EVGA's logo on the first card and a generic, unbranded one on the new card. Also, the old card came in a foam-like container inside the box, while the new one came in a rigid plastic one. The card itself is exactly the same.


----------



## Nightingale

Quote:


> Originally Posted by *Synthetickiller*
> 
> Quick question for everyone here.
> I've been benchmarking with Heaven 4.0 Basic (can't run Firestrike or anything else until I figure out exactly why my CPU sits at 90°C in Prime95 while water cooled).
> The GPU & memory speeds in Heaven don't match what Afterburner 4.3.0 Beta 14 says. The difference is about 40 MHz on the GPU & 8 MHz on the RAM clock. Minor stuff, but I'm a little confused.
> 
> Here's what afterburner is giving me.
> 
> 
> Also, here's a nifty physical comparison of a WC'd GTX 690 & the MSI GTX 1080 EK X:
> 
> 
> The short story is that they work.
> The long story is that's not exactly the case when OC'ing. My QNIX QX2710s can push about 72 Hz with these active converters if I'm lucky, but things get a little wonky. A direct DVI connection lets me OC the monitor: I run anywhere from 96 Hz to 110 Hz. Sometimes 120 Hz is stable, but not reliably enough to be worth pursuing. Anything over 120 Hz isn't stable at all.
> 
> I used the GoFranco adapter after reading a blog entry about them.
> Most adapters seem to only allow 1920x1080/1200. These have absolutely no issue pushing 1440p @ 60 Hz (72 Hz if you're lucky). I try to run 72 Hz/96 Hz when watching movies & whatnot to avoid that annoying judder effect.


Just wanted to compliment you on that beautiful water block. I'm also water cooling my Pascal card, a 1070 (can't afford a 1080). It's so nice never having to deal with clock frequency fluctuation due to temperature on water.


----------



## Bishop07764

Quote:


> Originally Posted by *KickAssCop*
> 
> So what do you guys think of 2050/10500 daily clocks on air with the default fan curve for ASUS Strix cards in SLI? The cards remain silent and hit 75°C on the top card and 65°C on the bottom card.


For air and SLI, that sounds pretty good, especially on the default fan profile.
Quote:


> Originally Posted by *Synthetickiller*
> 
> Yeah, it's the EK X... I'm halfway tempted to move to a 1080 Ti when that drops, or hold out for Volta. I don't get why MSI wouldn't bin at least 2100 MHz+ GPUs into this thing.
> I never plan on air cooling again, since my loop can support anything I'll realistically throw at it. That's why I went with this. Maybe I should sell it & try my luck with a Zotac, lol. That was my original choice (or water-cooling an FE), but they were OOS (or the price was higher) & the price on the EK X dropped $50, so I jumped on it.
> 
> Compared to my 690, the 1080 runs everything at higher settings at 1440p (vs 1080p) and at double the frame rate. I'm used to upgrading GPUs yearly or every 2 years. I'm impressed how my 690 held on, but the 4-year gap really makes the jump feel enormous. Realistically, I don't even need to overclock this card, but I just hate spending that kind of cash on a card that barely breaks 2000 MHz.
> 
> The temps are insane in Heaven. With everything maxed at 1440p it doesn't break 44°C.
> I don't have coil whine either. I take that back: when exiting the Heaven benchmark, I get it momentarily & it's seriously loud. Otherwise, during benchmarking, desktop/2D, or gaming, I hear nothing.
> 
> I need to try the voltage curve method for OCing. Maybe I can get this guy to 2100 MHz. If it's compatible, maybe the MSI Gaming Z BIOS would be better & let me eke out that extra 50 MHz+?


I have the EK X myself, flashed to the Gaming Z BIOS. I thought it helped my max overclocks initially, but I honestly didn't test the stock BIOS for very long at all. I would try it, since it's the same PCB, and you can always flash back. Heads up that your power limit in Afterburner will drop to 107. Mine never even comes close to the limit, even at 2164 core.


----------



## Synthetickiller

Quote:


> Originally Posted by *juniordnz*
> 
> You can try it; it won't hurt, since all custom MSI cards share the same PCB/power design. But from my experience it won't raise your maximum overclock. I had a 1080 Armor, and flashing the Gaming Z BIOS only raised my base/boost clocks (as expected); the max overclock stayed exactly the same.
> 
> I heard Zotac's waterblock version overclocks like a champ; maybe you can return the MSI you got and get that?
> Physical? Only in the box contents. The plastic wrapping the card had EVGA's logo on the first card and a generic, unbranded one on the new card. Also, the old card came in a foam-like container inside the box, while the new one came in a rigid plastic one. The card itself is exactly the same.


Yeah, I doubt it would help. I figured I'd ask. It's not worth the trouble.
I bought the card from Amazon, so returning it shouldn't be a problem.
The price on the Zotac has gone up $40, but it's still cheaper than the MSI.

Quote:


> Originally Posted by *Nightingale*
> 
> Just wanted to compliment you on that beautiful water block. I also am water cooling my pascal 1070(can't afford 1080). It's so nice never having to deal with clock frequency fluctuation due to temperature on water.


Thanks. It is a nice-looking card, though I can't say it's my top pick. I actually prefer what Zotac has done with their block, and the Heatkiller IV is also an awesome block. As much as I like the EK, I'd rate it lowest in overall aesthetics.

Quote:


> Originally Posted by *Bishop07764*
> 
> For air and SLI, that sounds pretty good, especially on the default fan profile.
> I have the EK X myself, flashed to the Gaming Z BIOS. I thought it helped my max overclocks initially, but I honestly didn't test the stock BIOS for very long at all. I would try it, since it's the same PCB, and you can always flash back. Heads up that your power limit in Afterburner will drop to 107. Mine never even comes close to the limit, even at 2164 core.


I have the core voltage set to 99% or 100%. The power limit is at 121%.
I've tried the sliders & the curve overclocking method (there really isn't much of a difference unless you set a custom curve). I can't increase my core clock by more than 110 MHz. The memory OC'd like a beast, hitting +625. I could get 650, but felt it might cause stability issues. 11,250 MHz vs 10,000 is a decent jump; I saw an increase of about 5 fps in Heaven from that alone. But since memory speed doesn't give the real gains that core clocks do, I don't see much point in keeping the card.
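
The 11,250 number follows from how Afterburner reports GDDR5X: the offset appears to count double toward the effective rate quoted in spec sheets (10,000 MHz stock on the GTX 1080). That's my reading of the numbers above, not an official MSI statement; a quick sketch:

```python
# GTX 1080 stock effective memory rate (GDDR5X), in MHz.
STOCK_EFFECTIVE_MHZ = 10_000

def effective_rate(offset_mhz: int) -> int:
    """Effective memory rate after an Afterburner offset, assuming the
    offset counts double toward the effective (quad-pumped) figure."""
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_rate(625))  # -> 11250, matching the jump described above
```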

Since Amazon has such an easy return policy, what would you guys do?
I'm torn between getting an EVGA card & buying a Heatkiller IV block vs the Zotac Arctic Storm. The Zotac is probably $50 cheaper. I'm leaning towards the Zotac.


----------



## MrTOOSHORT

@Synthetickiller

The MSI FE BIOS worked best for me on the EK X Sea Hawk. Got more per clock than the original BIOS.


----------



## juniordnz

Quote:


> Originally Posted by *Synthetickiller*
> 
> Since Amazon has such an easy return policy, what would you guys do?
> I'm torn between getting an EVGA card & buying a Heatkiller IV block vs the Zotac Arctic Storm. The Zotac is probably $50 cheaper. I'm leaning towards the Zotac.


I'd return it and get the zotac. It seems to overclock very well.


----------



## VSG

Please don't. From what I have seen, that block has mixed aluminum and copper segments. Jay never takes coolers apart, and this is one case where it can be harmful in the long run.


----------



## Synthetickiller

Quote:


> Originally Posted by *geggeg*
> 
> Please don't. From what I have seen, that block has mixed aluminum and copper segments. Jay never takes coolers apart, and this is one case where it can be harmful in the long run.


As per Zotac:
https://www.zotac.com/news/when-you-combine-game-changing-graphics-fearless-cooling
Quote:


> Our ArcticStorm combines the best of Pascal architecture with the latest in cooling technology. This new solution powers your play with a direct copper contact block containing .3mm microchannels, connected to an aluminum block. This solution is able to maximize heat dissipation while eliminating a lot of the weight. We've also brought back the fan-favorite all-metal wraparound backplate for that extra protection that has come to be expected from our premium line of graphics cards.


I realize that mixing metals in water causes galvanic corrosion. I'm not familiar with direct metal-to-metal contact & how that would affect a loop.
I'm in no way saying it wouldn't; I'm just claiming ignorance here. Every instance of it I've read about & experienced in real life involved water or some other electrolyte.
Either way, it's not good at all! I agree, this can be really bad long term, especially in higher-humidity environments and for those who don't upgrade every 6 months to a year.

https://www.copper.org/applications/architecture/arch_dhb/technical-discussion/fundamentals/arch_considerations.html
Quote:


> It is not necessary to isolate copper from lead, tin or stainless steel under most circumstances. The principal metals of concern in terms of direct contact are aluminum and zinc. Iron and steel are generally not a problem unless their mass is small in comparison to the copper.
> 
> If paints or coatings are used for isolation, they must be compatible with both metals. Bituminous or zinc chromate primers can be used between copper and aluminum. Either of these or a red lead primer can be effective in separating copper from iron and other ferrous metals.
> 
> Taping or gasketing with nonabsorptive materials or sealants are effective methods of separating copper from all other metals. In areas with severe exposure, lead or similar gasketing materials should be used, except between copper and aluminum.
> 
> Regardless of the method used to separate the metals, wash from copper surfaces should be prevented from draining onto exposed aluminum. Traces of copper salts in the wash may accelerate corrosion of the aluminum.


Sheesh!


----------



## juniordnz

I'm a complete noob when it comes to waterblocks. The problem is aluminum in contact with copper? Should the block be 100% copper?


----------



## Synthetickiller

Quote:


> Originally Posted by *juniordnz*
> 
> I'm completely noob to waterblocks. The problem is aluminum in contact with copper? The block should be 100% copper?


It's chemistry (basic chemistry, but I didn't want to say "basic" and sound like a know-it-all).
Here's some info: https://en.wikipedia.org/wiki/Electrolysis

When dissimilar metals like copper and aluminum are coupled, the aluminum acts as the anode and gives up electrons to the copper, so the aluminum corrodes preferentially. If that contaminates the loop, everything can start to corrode.

The block should be all copper; using aluminum is a cost-cutting measure.
It's the same BS we've seen in Toyota Prius batteries, where the copper blocks are connected with aluminum plates. They corrode & fail in high-humidity environments.
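
The severity of a pairing can be eyeballed from standard electrode potentials: the wider the gap, the stronger the drive for the less noble metal (the anode) to corrode. A rough sketch using textbook values; this only ranks pairings, it doesn't predict actual corrosion rates in a loop:

```python
# Standard electrode potentials in volts (textbook values).
POTENTIALS_V = {
    "copper": 0.34,
    "nickel": -0.25,
    "aluminum": -1.66,
}

def galvanic_gap(metal_a: str, metal_b: str) -> float:
    """Absolute potential difference between two metals, in volts."""
    return abs(POTENTIALS_V[metal_a] - POTENTIALS_V[metal_b])

# Copper/aluminum is a far worse couple than copper/nickel, which is
# why nickel-plated copper blocks are unremarkable but bare aluminum
# next to copper raises eyebrows.
print(round(galvanic_gap("copper", "aluminum"), 2))  # -> 2.0
print(round(galvanic_gap("copper", "nickel"), 2))    # -> 0.59
```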


----------



## juniordnz

Quote:


> Originally Posted by *Synthetickiller*
> 
> It's chemistry (basic chemistry, but I didn't want to say "basic" and sound like a know-it-all).
> Here's some info: https://en.wikipedia.org/wiki/Electrolysis
> 
> When dissimilar metals like copper and aluminum are coupled, the aluminum acts as the anode and gives up electrons to the copper, so the aluminum corrodes preferentially. If that contaminates the loop, everything can start to corrode.
> 
> The block should be all copper; using aluminum is a cost-cutting measure.
> It's the same BS we've seen in Toyota Prius batteries, where the copper blocks are connected with aluminum plates. They corrode & fail in high-humidity environments.


Yeah, I sucked at chemistry back in school...

That sucks. I wonder why companies are willing to ship something with such a high chance of going south. Imagine wrecking a whole custom loop because of that.


----------



## LiquidHaus

Quote:


> Originally Posted by *geggeg*
> 
> Please don't, from what I have seen that block has mixed aluminum and copper segments. Jay never takes coolers apart and this is one case where it can be harmful in the long run.


yup yup.

a coworker and I are still trying to find out exactly what the block is made of. never seen a black coating like this before, and Zotac is still beating around the bush when I ask them what it is.

it's for this exact reason that I chose the EVGA FTW card over the Arctic Storm. I'd rather not risk my entire loop over one card.

that block will grenade itself the first time someone runs plain distilled water.


----------



## Synthetickiller

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> @Synthetickiller
> 
> The MSI FE BIOS worked best for me on the EK X Sea Hawk. Got more per clock than the original BIOS.


I'm trying to flash it.
I have the BIOS and the latest version of nvflash, but I can't seem to get it to work.

I followed a few guides (the 980 Ti flashing guides as well as Fermi ones). I keep getting "'nvflash' is not recognized as an internal or external command"...
I also tried the guide listed in this thread: http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/4620#post_25414919
I've moved the folder to the desktop, and moved the files (program & BIOS ROM) to the desktop itself, since some people have done that to get past this issue.

Assuming the program & ROM are in the nvflash folder on the C drive:

Disable the GTX 1080 in Device Manager (in Win 10 x64)
Open CMD as admin
Type "cd c:\nvflash"
Then type any of the following: "nvflash -i0 --protectoff", "nvflash -6 bios.rom", "nvflash -i0 bios.rom" (yes, I'm typing the actual name of the ROM, not literally bios.rom, lol)
Thoughts?
I know this is very, very basic, but command prompts have always hated me, lol.


----------



## MrTOOSHORT

Use the nvflash from this thread:

*http://forum.hwbot.org/showthread.php?t=159025*

Stick the FE BIOS inside the nvflash folder and put that folder in C:\

Open a command prompt and type *cd c:\nvflash*

Press Enter, and then:

nvflash --save "your rom"

Now type:

nvflash -6 "MSI FE bios"

Press Enter, exit, and reboot.

That's all I know, hope it works for you.


----------



## Synthetickiller

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Use the nvflash from this thread:
> 
> *http://forum.hwbot.org/showthread.php?t=159025*
> 
> Stick the FE BIOS inside the nvflash folder and put that folder in C:\
> 
> Open a command prompt and type *cd c:\nvflash*
> 
> Press Enter, and then:
> 
> nvflash --save "your rom"
> 
> Now type:
> 
> nvflash -6 "MSI FE bios"
> 
> Press Enter, exit, and reboot.
> 
> That's all I know, hope it works for you.


Thanks, dude!
It worked like a charm. Unfortunately, my stability isn't improving at all; it still fails at the exact same clocks. I'm running Heaven at stock & at the same stable OCs I got before, just to compare scores.

I'm now noticing coil whine on this card under heavy loads, not just on Heaven's exit screen... lol. Just my luck!

I'm tempted to try the XOC BIOS since I have more cooling than I need.


----------



## fewness

Haven't come here for a while...has anyone figured out how Nvidia is controlling 3-way SLI and limiting it to 3DMark?


----------



## Synthetickiller

Quote:


> Originally Posted by *fewness*
> 
> Haven't come here for a while...has anyone figured out how Nvidia is controlling 3-way SLI and limiting it to 3DMark?


SLI on the 1070 and 1080 (and presumably the Titan X and 1080 Ti) is limited to 2-way, except for benchmarking.
If a developer wants to code for more, they can, but it's not officially supported. Plus, only 2-way bridges are available right now.


----------



## LiquidHaus

For those interested in the Zotac 1080 Arctic Storm, finally got an answer about the block material:



That being said, the aluminum is coated pretty heavily. Not sure with what though. But the black on it does look good.

Just thought I'd share this info I got from them.


----------



## juniordnz

Is it possible that thermal throttling is specific to each card? I'm pretty sure my last FTW would drop its first clock bin at 39°C; now I'm getting it at 48°C. Could it be related to GPU quality? Because this new one is a little better.

Also, this BIOS came pretty different from the old one. The new one has the peak clock at 1.031 V and then flatlines from 1.062 V and above, while the old one would go up gradually and flatline only after 1.062 V.


----------



## Synthetickiller

Quote:


> Originally Posted by *lifeisshort117*
> 
> For those interested in the Zotac 1080 Arctic Storm, finally got an answer about the block material:
> 
> That being said, the aluminum is coated pretty heavily. Not sure with what though. But the black on it does look good.
> 
> Just thought I'd share this info I got from them.


From what I've read in the last 2 minutes, as long as the coating doesn't react with either metal, it should block any form of corrosion; it's basically acting as an electrical insulator. I wish they'd tell us what the coating is, but it's probably a "trade secret," lol.

It's tempting, though, and I'm not really worried... now the question is, what to do? lol.


----------



## pantsoftime

If it's black then the aluminum has been anodized. If the copper is nickel coated there is nothing to fear. Anodized aluminum and nickel plated copper will not have a galvanic coupling issue so long as the coating is not damaged (nicks / scratches) on the areas that are touching.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> Is it possible that thermal throttling is specific to each card? I'm pretty sure my last FTW would drop its first clock bin at 39°C; now I'm getting it at 48°C. Could it be related to GPU quality? Because this new one is a little better.
> 
> Also, this BIOS came pretty different from the old one. The new one has the peak clock at 1.031 V and then flatlines from 1.062 V and above, while the old one would go up gradually and flatline only after 1.062 V.


I didn't even realize your card died; mine has been running strong, thank goodness. Did you check the BIOS version and see if it's any different than the original one?


----------



## Cornerer

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> @Synthetickiller
> 
> The MSI FE BIOS worked best for me on the EK X Sea Hawk. Got more per clock than the original BIOS.


May I ask which BIOSes you've tested so far?


----------



## Madness11

Hello guys. Does anyone have a problem with the new drivers (372.70) and a G-Sync monitor? It's not working for me.

Do you have the same issue?


----------



## Madness11

...and a GTX 1080.


----------



## TWiST2k

Quote:


> Originally Posted by *Madness11*
> 
> Hello guys. Does anyone have a problem with the new drivers (372.70) and a G-Sync monitor? It's not working for me.
> 
> Do you have the same issue?


Quote:


> Originally Posted by *Madness11*
> 
> and gtx 1080


And you couldn't even edit your first post, lol. I have a 1080 FTW with an Asus PG279Q on Windows 10 and everything works perfectly. I would give this amazing new tool called Google a try and see if maybe someone else with a configuration similar to yours has had similar issues.


----------



## Madness11

Please send me your settings from the NV control panel.

Cheers


----------



## TWiST2k

Quote:


> Originally Posted by *Madness11*
> 
> Please send me your settings from the NV control panel.
> 
> Cheers


http://bfy.tw/7kOV


----------



## Madness11

No dude, I mean your settings.

Do you turn V-Sync on in the NV panel or not?


----------



## TWiST2k

Quote:


> Originally Posted by *Madness11*
> 
> No dude, I mean your settings.
> 
> Do you turn V-Sync on in the NV panel or not?


I have my refresh rate set to 144 Hz in the control panel, and I turn on V-Sync inside the game settings.


----------



## Madness11

And you don't have input lag in games? I mean Dota, LoL, CS:GO?


----------



## Bishop07764

Quote:


> Originally Posted by *lifeisshort117*
> 
> For those interested in the Zotac 1080 Arctic Storm, finally got an answer about the block material:
> 
> 
> 
> That being said, the aluminum is coated pretty heavily. Not sure with what though. But the black on it does look good.
> 
> Just thought I'd share this info I got from them.


Thanks for the confirmation. It would definitely make me nervous using it in my loop. I run distilled plus some PT Nuke. I didn't realize the Arctic Storm had aluminum.


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> Thanks for the confirmation. It would definitely make me nervous using it in my loop. I use distilled plus some PT nuke. Didn't realize the Arctic had aluminum.


From the description, only the nickel makes contact with the water. The aluminum should only make "contact" with the copper block, and that's prevented by whatever coating is used.
As long as Zotac is giving truthful answers, there shouldn't be galvanic corrosion. It's too bad they're using an aluminum block to cut corners, but I guess that's why the card was about $100 less than MSI's variant. I say "was" because the price on that card has jumped $50 overnight.


----------



## Synthetickiller

I hate to double post, but I'd rather not just edit my previous comment.
The card is going back. I don't have an option, & it's not because it's a crappy overclocker.

Yesterday in MSI Afterburner, I had the following settings rock-solid stable:

Core Voltage (%): +100
Power Limit (%): 121
Temp. Limit (°C): 92
Core Clock (MHz): +120
Memory Clock (MHz): +400

Today, a +100 MHz core clock isn't stable. Even at stock settings there is tearing during the Heaven benchmark. I originally had no tearing & it was fairly smooth, except when dropping below 60 fps, & even then it ran smoother than now. Temps have never exceeded 46°C, so I know that's not the issue. I've also noticed an increase in coil whine during menus in certain games like Doom when running Vulkan (haven't tried OpenGL). I'm wondering if the card clocks poorly because this one might actually be defective? This is on the stock BIOS, btw.


----------



## Bishop07764

Quote:


> Originally Posted by *Synthetickiller*
> 
> I hate to double post, but I'd rather not just edit my top comment.
> The card is going back. I don't have an option & it's not because it's a crappy overclocker.
> 
> Yesterday in MSI afterburner, I had the following settings rock solid stable:
> 
> Core Voltage (%): +100
> Power Limit (%): 121
> Temp. Limit (°C): 92
> Core Clock (MHz): +120
> Memory Clock (MHz): +400
> Today, +100 MHz core clock isn't stable. Even at stock settings there is tearing during the Heaven benchmark. Originally I had no tearing & it was fairly smooth except when dropping below 60fps, & even then it ran smoother than it does now. Temps have never exceeded 46°C, so I know that's not the issue. I've also noticed an increase in coil whine during menus in certain games like Doom when running Vulkan (haven't tried OpenGL). I'm wondering if this card clocks poorly because it might actually be defective. This is on the stock BIOS, btw.


That stinks. What are you getting as a replacement? It was quite hard to get mine. I had to wait quite a while for availability. Maybe it's better now.


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> That stinks. What are you getting as a replacement? It was quite hard to get mine. I had to wait quite a while for availability. Maybe it's better now.


Anyone is welcome to give their input b/c I'm fairly irritated right now, lol.

Here's the options I have, off the top of my head.

I can simply do a swap for another EK X from Amazon. They are in stock.
I can source the Zotac ArcticStorm & try my luck.
I can buy a Founders Edition card & buy a Heatkiller IV block.

Option 1 is straightforward. Yours has a pretty good OC. If mine had that OC & didn't degrade, I'd be happy. But you know what they say: once bitten, twice shy.
Option 2 was my original choice, but the card was OOS. People here seem to think this model OCs well across the board. I'm not worried about galvanic corrosion; I doubt Zotac would lie about the coating on the aluminum block. The block itself is gorgeous, especially w/ the built-in lights.
Option 3 is the most complicated. I'd want to buy an EVGA card for the warranty. Do I buy the SC model or the FE (the FE is $50 more, but they have the same board)? I've read that FEs clock the highest of all cards when it comes to watercooling. This option is $100+ more than the Zotac option.


----------



## Bishop07764

Quote:


> Originally Posted by *Synthetickiller*
> 
> Anyone is welcome to give their input b/c I'm fairly irritated right now, lol.
> 
> Here's the options I have, off the top of my head.
> 
> I can simply do a swap for another EK X from Amazon. They are in stock.
> I can source the Zotac ArcticStorm & try my luck.
> I can buy a Founders Edition card & buy a Heatkiller IV block.
> 
> Option 1 is straightforward. Yours has a pretty good OC. If mine had that OC & didn't degrade, I'd be happy. But you know what they say: once bitten, twice shy.
> Option 2 was my original choice, but the card was OOS. People here seem to think this model OCs well across the board. I'm not worried about galvanic corrosion; I doubt Zotac would lie about the coating on the aluminum block. The block itself is gorgeous, especially w/ the built-in lights.
> Option 3 is the most complicated. I'd want to buy an EVGA card for the warranty. Do I buy the SC model or the FE (the FE is $50 more, but they have the same board)? I've read that FEs clock the highest of all cards when it comes to watercooling. This option is $100+ more than the Zotac option.


I would lean toward the cheapest option, #2. Not crazy about the aluminum, though. Overclocking is going to be a complete crap shoot with any choice, I think. MSI has done well for me personally; I had a 780 Lightning before this that would do 1.4 GHz, and it went strong in my rig for about 3 years before the 1080 was just too great a temptation. All my MSI cards over the years have had zero coil whine, which I love. But the Arctic Storm looks like an awesome card.


----------



## GreedyMuffin

Weird...

My PPD with a 1080 at 2012 MHz / 900 mV is 750K; at 2138 MHz / 1050 mV it's 900K.

The speed difference is only 6%, but the PPD difference is 20%?!

Does the curve really suck that hard? :/

I was so happy with my 2012 900mv OC. TDP Peak never went over 90%. :/
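The reported numbers do at least check out arithmetically; a quick sanity check in Python (clock and PPD figures taken from the post above, with PPD being the points-per-day as reported):

```python
# Operating points quoted above: (core MHz, reported PPD)
clock_low, ppd_low = 2012, 750_000    # at 900 mV
clock_high, ppd_high = 2138, 900_000  # at 1050 mV

clock_gain = (clock_high - clock_low) / clock_low  # relative clock increase
ppd_gain = (ppd_high - ppd_low) / ppd_low          # relative PPD increase

print(f"clock: +{clock_gain:.1%}, PPD: +{ppd_gain:.1%}")  # clock: +6.3%, PPD: +20.0%
```

So the mismatch is real: PPD scaled roughly three times faster than the core clock between those two points.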


----------



## Koniakki

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I wonder how the stock FE cooler would handle the 1080 at 0.900V and 2GHz. Max TDP peak is currently at 91%, which is about 160 watts.
> 
> Would be fun to test, but then I'd need to drain the loop etc. :I


No need.









In FC4, which hits my TDP hard, with [email protected] at 100% fan it was about 58-60°C.

And that's with high ambient temps. Hope it helps.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Koniakki*
> 
> No need.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In FC4, which hits my TDP hard, with [email protected] at 100% fan it was about 58-60°C.
> 
> And that's with high ambient temps. Hope it helps.


Hi!

Can you test 1900 at 0.900 with the auto fan profile for me? I wonder if the cooler can handle a 900 mV OC and still be quiet.

Cheers!


----------



## Koniakki

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Hi!
> 
> Can you test 1900 at 0.900 with auto fan profile for me? I wonder if the cooler would handle a 900 mv OC, and yet still be quiet.
> 
> Cheers!


As much as I hate "heat", in the name of science I will use auto fan speed.









Pretty sure it will be way over 75°C. Brb.

*Edit:* The FE can't handle its "heat", lol.

Well, [email protected] went over 82°C easily with an 83°C target.

Changed the target to 85°C out of curiosity and it went straight for it. I stopped it at 84°C.

Max fan speed went up to 54% and stayed there.

With Vsync it was a linear climb there. Without Vsync, well, it wasn't so "linear"...

I should note that I have high ambient temps right now, ~29°C.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Koniakki*
> 
> As much as I hate "heat", in the name of science I will use auto fan speed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pretty sure it will be way over 75°C. Brb.
> 
> *Edit:* The FE can't handle its "heat", lol.
> 
> Well, [email protected] went over 82°C easily with an 83°C target.
> 
> Changed the target to 85°C out of curiosity and it went straight for it. I stopped it at 84°C.
> 
> Max fan speed went up to 54% and stayed there.
> 
> With Vsync it was a linear climb there. Without Vsync, well, it wasn't so "linear"...


+Rep!

Thanks for testing! Too bad that the FE cooler sucks so much. When I had a 980 reference it was actually not bad.


----------



## Koniakki

Quote:


> Originally Posted by *GreedyMuffin*
> 
> +Rep!
> 
> Thanks for testing! Too bad that the FE cooler sucks so much. When I had a 980 reference it was actually not bad.


Thanks. Glad to help.

And I wouldn't actually say it's *that* bad. It's bad compared to the AIB cards' coolers, of course.









This is actually the 2nd reference card that I personally own (besides those coming and going for testing/benching/building, and my previous Titan X(M)).

And tbh it's quite good considering it's a ref cooler. I'd swear it's half as loud as the Titan X(M) cooler @ 100% fan.

*Edit:* Hmm, some interesting finds. I re-ran FC4, this time with fixed fan speeds.

I tested again because I was skeptical about those really high temps using the auto fan profile.

Take note that this is in an open TT P5 case.

[email protected] using [email protected]/Ultra/2xTXAA.

60% fixed fan (2400 rpm) = ~73°C max. Barely audible.

70% fixed fan (2800 rpm) = ~66°C max. Slightly audible.

85% fixed fan (3400 rpm) = ~61°C max. Audible, but not annoying.

Tested the exact same way, at the exact same scene. Very weird.

I think it's because the auto profile ramps up really slowly; that 54% auto fan I saw was 2160 rpm, btw.

Those are vastly different results. Need more in-depth testing.
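As a side note, the fan-percent and rpm pairs reported above (54% → 2160, 60% → 2400, 70% → 2800, 85% → 3400) fit a simple linear mapping; a quick check in Python, assuming those readings are accurate:

```python
# (fan %, observed rpm) pairs reported for the FE cooler
samples = [(54, 2160), (60, 2400), (70, 2800), (85, 3400)]

# Every pair lands exactly on rpm = 40 * percent over this range
for pct, rpm in samples:
    assert rpm == 40 * pct, (pct, rpm)

print("FE fan speed ~ 40 rpm per percent in the 54-85% range")
```

Handy for translating between the percentage MSI Afterburner shows and the rpm GPU-Z logs, at least on this cooler.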


----------



## dmcl325i

Joined the club a week ago. Palit gtx 1080 super jetstream here.


----------



## pantsoftime

Anyone following progress on BIOS mods? It's been hard to gauge whether anyone is getting closer to making this a reality. It would be wonderful to even make some basic adjustments such as power limit.


----------



## jleslie246

Hey guys. I'm going back and forth on waiting for a 1080 Ti or just getting the 1080. I need to be able to run a 2K 144Hz monitor at highest settings at a constant 144fps. Will a 1080 do this? See my sig rig for full current specs. And I do not want SLI.

Thank you for your help.


----------



## RJacobs28

Quote:


> Originally Posted by *jleslie246*
> 
> Hey guys. I'm going back and forth on waiting for a 1080 Ti or just getting the 1080. I need to be able to run a 2K 144Hz monitor at highest settings at a constant 144fps. Will a 1080 do this? See my sig rig for full current specs. And I do not want SLI.
> 
> Thank you for your help.


Of course that depends on the titles, but in my experience, it won't. And a 1080 Ti won't either.


----------



## jleslie246

Quote:


> Originally Posted by *RJacobs28*
> 
> Of course that depends on the titles, but in my experience, it won't. And a 1080 Ti won't either.


Battlefield and Call of Duty type games. Would a Titan X (Pascal) do it? Money isn't too much of an issue; I just feel like Titans are priced more for their name than their performance. And I've seen a lot of SLI issues with my 780s, so I really don't want to go that route again. But I still may.


----------



## RJacobs28

I was seeing 60-70fps in the BF1 Beta @ 1440p and 80-100fps @1080p with 1 GTX 1080.
I've (sadly) had my 2nd GTX 1080 switched off most of the time I've had it as the titles I'm playing haven't liked SLI at all.

If it were me, I would wait and see what the 1080ti brings compared to the Titan X. If it's anything like the last generation the x80ti will be the one to get.

I just wouldn't be expecting 144fps maxed out on ALL modern titles.
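To put a constant-144fps target in perspective, the per-frame time budget shrinks quickly as the target rises; a quick illustration in Python:

```python
# Time budget per frame (in ms) if every single frame must hit the target rate
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 100, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
# At 144 fps there are only ~6.94 ms per frame; a single slow frame breaks "constant 144"
```

That's why "average 144" and "never below 144" are such different asks: the latter means the GPU can never spend more than ~7 ms on any frame, even in the worst scene.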


----------



## ShortySmalls

Quote:


> Originally Posted by *RJacobs28*
> 
> I was seeing 60-70fps in the BF1 Beta @ 1440p and 80-100fps @1080p with 1 GTX 1080.
> I've (sadly) had my 2nd GTX 1080 switched off most of the time I've had it as the titles I'm playing haven't liked SLI at all.
> 
> If it were me, I would wait and see what the 1080ti brings compared to the Titan X. If it's anything like the last generation the x80ti will be the one to get.
> 
> I just wouldn't be expecting 144fps maxed out on ALL modern titles.


Hmm, I was playing on Ultra in the BF1 beta with my 4K monitor on a single GTX 1080 and was getting my 60fps Vsync cap; it never moved off it.


----------



## RJacobs28

Quote:


> Originally Posted by *ShortySmalls*
> 
> Hmm, I was playing on Ultra in the BF1 beta with my 4K monitor on a single GTX 1080 and was getting my 60fps Vsync cap; it never moved off it.


Your 6700k @ 4.6 with 3200MHz memory could make the difference.
GPU usage was pegged @ 100%.


----------



## ucode

Anyone tried running Octanebench rendering benchmark?


----------



## smicha

Quote:


> Originally Posted by *ucode*
> 
> Anyone tried running Octanebench rendering benchmark?


Yes - 141 on modified benchmark software - look it up on the Octane forum.


----------



## ucode

Would you be able to run that with CPU max performance and C-States disabled?

Cheers.


----------



## smicha

Quote:


> Originally Posted by *ucode*
> 
> Would you be able to run that with CPU max performance and C-States disabled?
> 
> Cheers.


What do you mean?


----------



## ucode

Disable package C-states and core C-states C3, C6, C7 in BIOS setup. In Windows, set the minimum processor state to 100% (High Performance plan).

I get over 170 with that at 2100/1395 on the GPU. Curious whether it's the same for others, and especially whether it has an effect on earlier Maxwell/Fermi. Is it real, or a bugged benchmark?
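For the Windows half of that, the High Performance switch can be scripted; a sketch using the built-in `powercfg` tool (a config fragment only — the C-state options themselves live in BIOS setup and vary by board):

```shell
:: Switch to the built-in High Performance plan (SCHEME_MIN is its alias,
:: "minimum power savings"); it pins minimum processor state at 100%
powercfg /setactive SCHEME_MIN

:: Verify: minimum processor state should report 100 (0x64)
powercfg /query SCHEME_MIN SUB_PROCESSOR PROCTHROTTLEMIN
```

Worth re-checking after driver installs, since some utilities quietly switch the active plan back to Balanced.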


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> I didn't even realize your card died; mine has been running strong, thank goodness. Did you check the BIOS version and see if it's any different than the original one?


Yeah, it started showing some artifacts even in Windows, and then stopped being recognized at all. So I sent it back to the store and 1 week later I got a replacement. The new card got me two clock bins more in the silicon lottery, and apparently that has something to do with thermal throttling as well: I get no clock-down until 48-49°C. The BIOS number is the same, though. If I could only get it under water and keep it below those 48°C, that would be perfect! But I just don't think it's worth it. I'd only do that if voltages were unlocked and I could get more juice out of it. Not worth the expense for 13-26 MHz.


----------



## Joshwaa

EK waterblock for my FTW should be here today! Hope I have time to put it on and drain and refill the loop.


----------



## whitrzac

It looks like I may be joining this club....

My amazon price mistake MSI 1080 seahawk arrived today.


----------



## Kenshiro 26

Quote:


> Originally Posted by *Joshwaa*
> 
> EK waterblock for my FTW should be here today! Hope I have time to put it on and drain and refill the loop.


Mine shipped but I don't expect delivery until the end of the week.


----------



## dalekdukesboy

Quote:


> Originally Posted by *whitrzac*
> 
> It looks like I may be joining this club....
> 
> My amazon price mistake MSI 1080 seahawk arrived today.


I probably don't want to know, but how much of a price mistake was it? But good job on catching it and capitalizing!


----------



## AllGamer

Quote:


> Originally Posted by *whitrzac*
> 
> It looks like I may be joining this club....
> 
> My amazon price mistake MSI 1080 seahawk arrived today.


seems like they fixed the pricing already

I can't seem to find it


----------



## whitrzac

Quote:


> Originally Posted by *AllGamer*
> 
> seems like they fixed the pricing already
> 
> I can't seem to find it


Price mistake, as in it was live for ~30 min on Friday before they sold out.


----------



## whitrzac

As of right now I don't think I'm going to keep it...

My 980 Ti does 19889 in Firestrike (+50 MHz on the core) and already has an EK waterblock.

This 1080 does 23028 in Firestrike (+135 on the core) and is a hybrid design.

The upgrade price would be ~$150ish; I'm just not seeing the value.


----------



## rintalahri

Quote:


> Originally Posted by *Joshwaa*
> 
> EK waterblock for my FTW should be here today! Hope I have time to put it on and drain and refill the loop.


Yep, mine is coming in the morning.







These are waiting...


----------



## moustang

Quote:


> Originally Posted by *jleslie246*
> 
> Battlefield and Call of Duty type games. Would a Titan X (Pascal) do it? Money isn't too much of an issue; I just feel like Titans are priced more for their name than their performance. And I've seen a lot of SLI issues with my 780s, so I really don't want to go that route again. But I still may.


Even the Titan X can't guarantee those frame rates at that resolution. 2K @ 144hz at all times just isn't possible on any current GPU.


----------



## jleslie246

Quote:


> Originally Posted by *moustang*
> 
> Even the Titan X can't guarantee those frame rates at that resolution. 2K @ 144hz at all times just isn't possible on any current GPU.


I know it will bounce a bit. How about an average fps of 120?


----------



## juniordnz

Quote:


> Originally Posted by *jleslie246*
> 
> I know it will bounce a bit. How about an average fps of 120?


No, just no.


----------



## Synthetickiller

I just ordered the Zotac Arcticstorm. I'll be returning the MSI once I get the Zotac so there's maybe 1 or 2 hours of downtime.
Hopefully, this one is stable.


----------



## TWiST2k

Quote:


> Originally Posted by *jleslie246*
> 
> I know it will bounce a bit. How about an average fps of 120?


I love these generalizations. Do some homework, guys, ugh. If I'm playing Golf With Friends, sure, I'll get a solid 144, but if I'm playing Witcher 3 it's gonna bounce. IMO 1440p is the sweet spot, and even then you're not going to get a solid 144 DEPENDING ON THE GAME! 4K is a total performance killer for what you get visually; the trade-off is not worth it, and you can forget about SLI, what a buggy mess that is. This is all my opinion and I'm sure there are people who will disagree.

In other news, I finally got a good PG279Q monitor and could not be happier with it. It just took a few free Amazon exchanges and I'm in business!


----------



## Koala Bear

rintalahri, what is the name of the water pump/reservoir in your PC?


----------



## Koniakki

Quote:


> Originally Posted by *Koala Bear*
> 
> rintalahri, what is the name of the water pump/reservoir in your PC?


I think that's an Alphacool Eisbecher for D5 if I'm not mistaken.


----------



## Koala Bear

Thank you Koniakki


----------



## wardo3640

Hello all,

Anyone upgraded the fans on their Seahawk X rad yet? I am thinking about running mine in push-pull with some different fans and curious about:

1) How hard/easy is it to swap out the fans?

2) What kind of control will I have over fan speed?

3) Will I need 3-pin or 4-pin fans (PWM/DC)?

I would like to have an idea of the task and parts on hand before I take the thing apart.

Thanks!!!


----------



## alpsie

Hello everyone.

I'm having a weird issue with an HDMI cable that I want to use to connect my Zotac 1080 AMP Extreme edition to my TV (Samsung KS7005).

1. The active HDMI cable is one with the Redmere chips.
2. When hooking the cable up to the TV and PC, the TV shows the boot sequence of the PC, but once the PC has fully booted, the TV won't show anything.
3. The PC shows the TV in the settings, but no matter what settings I use, it won't display on the TV.

4. I've tried a regular passive HDMI cable and that works fine.
5. I've also tried the Redmere HDMI cable between my laptop and the TV, which works without any issue.

Any suggestions on what could be the issue and how to fix it?


----------



## sirleeofroy

Quote:


> Originally Posted by *alpsie*
> 
> Hello everyone.
> 
> I'm having a weird issue with an HDMI cable that I want to use to connect my Zotac 1080 AMP Extreme edition to my TV (Samsung KS7005).
> 
> 1. The active HDMI cable is one with the Redmere chips.
> 2. When hooking the cable up to the TV and PC, the TV shows the boot sequence of the PC, but once the PC has fully booted, the TV won't show anything.
> 3. The PC shows the TV in the settings, but no matter what settings I use, it won't display on the TV.
> 
> 4. I've tried a regular passive HDMI cable and that works fine.
> 5. I've also tried the Redmere HDMI cable between my laptop and the TV, which works without any issue.
> 
> Any suggestions on what could be the issue and how to fix it?


You could be limited by the Redmere chip if you're pushing a 4K signal; there are a handful of Redmere chips: MEA 1689, PRE 1692 (10.2 Gbps), and PRA 1700 (18 Gbps).

To push a 4K signal you'll need a cable with the latter (PRA 1700).
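For what it's worth, the 18 Gbps requirement for 4K60 falls out of the TMDS arithmetic; a rough calculation in Python (this assumes the standard CTA-861 4K60 timing with a 4400x2250 total raster and 8 bits per color channel):

```python
# Total TMDS bandwidth: pixel clock x 3 channels x 10 bits (8b/10b encoding)
def tmds_gbps(h_total: int, v_total: int, refresh_hz: int) -> float:
    pixel_clock = h_total * v_total * refresh_hz  # Hz, including blanking
    return pixel_clock * 3 * 10 / 1e9

uhd60 = tmds_gbps(4400, 2250, 60)  # 3840x2160@60 raster -> 594 MHz pixel clock
print(f"4K60 at 8bpc needs ~{uhd60:.2f} Gbps total")  # ~17.82 Gbps
```

~17.82 Gbps is well past a 10.2 Gbps chip's limit but fits inside HDMI 2.0's 18 Gbps, which matches the cable-chip advice above.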


----------



## alpsie

@sirleeofroy

That makes a lot of sense, since I'm trying to push 4K to it.
The box says the cable supports this, though, which makes it very confusing, but it doesn't specify which version of the chip they used.
It's this cable: https://www.deltaco.se/produkter/deltaco/deltaco-prime/HDMI-2100


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> In other news, I finally got a good PG279Q monitor and could not be happier with it, just took a few free amazon exchanges and I am in business!


What was the problem? Backlight bleeding? Dead pixels? And can you tell if the 4ms response makes any noticeable difference in FPS games?


----------



## rintalahri

Koala Bear, yes, it's the Alphacool Eisbecher for D5.


----------



## sirleeofroy

Quote:


> Originally Posted by *alpsie*
> 
> @sirleeofroy
> 
> That makes a lot of sense, since I'm trying to push 4K to it.
> The box says the cable supports this, though, which makes it very confusing, but it doesn't specify which version of the chip they used.
> It's this cable: https://www.deltaco.se/produkter/deltaco/deltaco-prime/HDMI-2100


That is odd; other listings I find for the cable also state it's 4K capable. My only other guess is that the refresh rate is too high. Have you tried forcing the refresh rate to 60Hz to see if that works? Your TV has HDMI 2.0 inputs, so 60Hz would be the maximum it could take.

Could try 30Hz just to see if you get a picture...


----------



## invincible20xx

Please accept me into the 1080 family.









Do I need to post a photo?

Edit: I'll fill in the template later.


----------



## juniordnz

Quote:


> Originally Posted by *invincible20xx*
> 
> Please accept me into the 1080 family.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do I need to post a photo?
> 
> Edit: I'll fill in the template later.


A Firestrike bench result at stock (out of the box) would be most appreciated. It's always nice to see what people are getting from the silicon lottery.

Congrats on the card, and good luck in the lottery.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> What was the problem? Backlight bleeding? Dead pixels? And can you tell if the 4ms response makes any noticeable difference in FPS games?


It was backlight bleed; I posted all about it in the PG279Q forum. Once I got one without any bleed, it's an amazing monitor!


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> It was backlight bleed; I posted all about it in the PG279Q forum. Once I got one without any bleed, it's an amazing monitor!


And how many returns did it take to get a good one? I believe here in Brazil the only company that would accept so many returns without further complications is Dell, but they don't have anything like the PG279Q yet. It sucks to pay a premium for a panel and get BLB...


----------



## LeonardoHLB

Quote:


> Originally Posted by *juniordnz*
> 
> And how many returns did it take to get a good one? I believe here in Brazil the only company that would accept so many returns without any further complications would be Dell, but they don't have anything like PG279Q yet. It sucks to pay premium for a panel and get BLB...


Actually, they do have one in Brazil, and it's arguably better than the PG279Q: same configuration but without the temperature issues of the Asus. I have one and can say it's excellent; mine already malfunctioned once, but Dell replaced it in 3 days.

http://accessories.la.dell.com/sna/productdetail.aspx?c=br&cs=brdhs1&l=pt&sku=210-AGJR

I bought it through a website bug where the price was 2700.00, with a 5-year warranty.


----------



## tps3443

I have an MSI GTX 1080 FE, and I run it at +235 core and +597 memory, 120% power limit, 92°C temp target, sliders unlocked, and voltage set to 74%.

With a fan profile set to Silent, which is around 30-40% fan. It's quiet!!

It will sit at 90°C with clock speeds of 2088 MHz. This is a bit hot for my taste, but I was so into Fallout 4 I didn't realize it was doing this until I checked the GPU-Z graph. This was just for testing purposes; I like to get to know my video cards like the back of my hand, lol.

After more tinkering, I placed a silent 120mm case fan behind the card, blowing down it, and upped the profile to 58% fan speed once the GPU reaches 70°C. With the silent case fan helping, it now never exceeds 79°C.

I've tinkered with my card so much now. I am actually really impressed with the FE: give it some really good airflow and overclock it to death, and it will stay under 80°C even with a low fan profile.

I like my system quiet, and I've found the GTX 1080 FE is sufficient to pull that off with some tinkering, fine tuning, and experiments with case fans and airflow.

Bear in mind, these temps are tested at 4K, so it utilizes the card to its fullest.

If I run the same testing at 1080p it will never exceed 64°C. And some coil whine to go with it!

This is all I have, so I squeeze the most from it. I can't afford a custom loop as of yet.

25,600 in Firestrike is amazing from my FE, and this is game-stable performance! All day long, under 80°C, and I hardly hear the thing.

I'm not sure if my card is mediocre or better than others, but it's beastly!


----------



## KickAssCop

Got shafted again, guys. Anyone with a spare GOW 4 promo code, hit me up.
Should have waited to purchase the 1080s, I guess.


----------



## invincible20xx

Can I actually expect close to 60fps maxed out (without any AA) in all or most games @ 4K with the 1080 at over 2GHz?


----------



## KickAssCop

No.


----------



## invincible20xx

Quote:


> Originally Posted by *KickAssCop*
> 
> No.


What can I expect then? 45 fps? Also, do you guys think my current processor is going to let me see the full benefit of the card?


----------



## alpsie

Quote:


> Originally Posted by *sirleeofroy*
> 
> That is odd; other listings I find for the cable also state it's 4K capable. My only other guess is that the refresh rate is too high. Have you tried forcing the refresh rate to 60Hz to see if that works? Your TV has HDMI 2.0 inputs, so 60Hz would be the maximum it could take.
> 
> Could try 30Hz just to see if you get a picture...


Thank you for the suggestion, I will try it out later today


----------



## pez

Quote:


> Originally Posted by *invincible20xx*
> 
> What can I expect then? 45 fps? Also, do you guys think my current processor is going to let me see the full benefit of the card?


At 1440p your CPU starts to become less of a factor for gaming. 3440x1440 and 4K are both GPU-heavy resolutions--not to say you won't hit a game with a nasty CPU bottleneck every so often. In short, no, your CPU won't be a limitation, especially at the clock it's at.

To have an enjoyable experience with games at 4K without AA, you're looking at SLI 1080s at a bare minimum. When running 4K with SLI 1080s, I was able to max nearly all titles without AA, with the exception of Crysis 3; I believe I had to turn down shaders as well.


----------



## invincible20xx

Quote:


> Originally Posted by *pez*
> 
> At 1440p your CPU starts to become less of a factor for gaming. 3440x1440 and 4K are both GPU-heavy resolutions--not to say you won't hit a game with a nasty CPU bottleneck every so often. In short, no, your CPU won't be a limitation, especially at the clock it's at.
> 
> To have an enjoyable experience with games at 4K without AA, you're looking at SLI 1080s at a bare minimum. When running 4K with SLI 1080s, I was able to max nearly all titles without AA, with the exception of Crysis 3; I believe I had to turn down shaders as well.


Well, maybe I will end up adding another one down the road. But if they release a 1080 Ti, wouldn't it be better to go for that over SLI 1080s if I'm targeting 4K 60?


----------



## pez

Quote:


> Originally Posted by *invincible20xx*
> 
> Well, maybe I will end up adding another one down the road. But if they release a 1080 Ti, wouldn't it be better to go for that over SLI 1080s if I'm targeting 4K 60?


A potential Ti is going to perform somewhere within the realm of the TXP, which also isn't going to do 4K justice as a single card. Granted, you could do it with a game like Overwatch or CS:GO, but I'm assuming when you say most games, those are not what you have in mind.

Single-card 4K just isn't a thing yet. You either play games that run at 300fps at every other resolution to get 4K60, or you play triple-A titles with so much compromise you might as well play the game on a console in front of a huge TV screen.


----------



## invincible20xx

Quote:


> Originally Posted by *pez*
> 
> A potential Ti is going to perform somewhere within the realm of the TXP, which also isn't going to do 4K justice as a single card. Granted, you could do it with a game like Overwatch or CS:GO, but I'm assuming when you say most games, those are not what you have in mind.
> 
> Single-card 4K just isn't a thing yet. You either play games that run at 300fps at every other resolution to get 4K60, or you play triple-A titles with so much compromise you might as well play the game on a console in front of a huge TV screen.


Actually, I'm using a 50-inch 4K TV as a monitor; I tried to strike a balance between a big-screen experience and image sharpness.









But running that TV at a decent frame rate is proving to be difficult, lol.

So you are saying maybe I should add another one down the road for the optimum 4K experience? My PC was built around having more than one GPU anyway. First let me try it; maybe one GPU is going to somehow cut it for me without AA. I can accept a minimum of 45 FPS, especially if I can lock the whole experience to 45 FPS, if there is a way to do that. I'm also wondering: will I be able to get the full benefit of 2x 1080s if I'm keeping my 4.5 GHz 3770K?


----------



## pez

Quote:


> Originally Posted by *invincible20xx*
> 
> Actually, I'm using a 50-inch 4K TV as a monitor; I tried to strike a balance between a big-screen experience and image sharpness.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But running that TV at a decent frame rate is proving to be difficult, lol.
> 
> So you are saying maybe I should add another one down the road for the optimum 4K experience? My PC was built around having more than one GPU anyway. First let me try it; maybe one GPU is going to somehow cut it for me without AA. I can accept a minimum of 45 FPS, especially if I can lock the whole experience to 45 FPS, if there is a way to do that. I'm also wondering: will I be able to get the full benefit of 2x 1080s if I'm keeping my 4.5 GHz 3770K?


You'll probably start to see the CPU hold things back a bit for SLI purposes. Then you have the whole PCIe lanes thing: with standard platforms (Z77, Z87, Z97, Z170, etc.) you're limited to a lower number of PCIe lanes on the CPU, so you're looking at x8/x8 SLI. These differences are measurable as early as the 980 Tis (I'll post a video at the bottom), but nothing crazy.

That being said, you're going to see a hell of a lot more performance out of SLI'ing the cards on your current rig than from an entire platform upgrade. I don't think it's a bad idea to SLI on your current system whatsoever.


----------



## Spieler4

Quote:


> Originally Posted by *wardo3640*
> 
> Hello all,
> 
> Anyone upgraded the fans on their Seahawk X rad yet? I am thinking about running mine in push-pull with some different fans and curious about:
> 
> 1) How hard/easy is it to swap out the fans?
> 
> 2) What kind of control will I have over fan speed?
> 
> 3) Will I need 3-pin or 4-pin fans (PWM/DC)?
> 
> I would like to have an idea of the task and parts on hand before I take the thing apart.
> 
> Thanks!!!


1) Very easy.
2) Connect them to the motherboard and set fan speed in the BIOS, or use the motherboard software in Windows.
3) Depends on the connector on your fan wire or motherboard.

AFAIK the "Noctua NF-P12 PWM" 120mm includes an extension cable, a cable splitter for 2 fans, and a low-noise resistor cable to reduce rpm (e.g. by ~50% if powered by Molex) in the box.
You can connect most 3/4-pin connectors to Noctua wires.

I tried push/pull on both my Seahawk X cards. The fan runs at 1050 rpm on the radiator, and push/pull lowered temps by 3-5°C. As temps never go higher than 49°C on push alone, I removed the pull fans again, since I couldn't push the core clock higher anyway.


----------



## Bishop07764

Quote:


> Originally Posted by *KickAssCop*
> 
> Got shafted again, guys. Anyone with a spare GOW 4 promo code, hit me up.
> Should have waited to purchase the 1080s, I guess.


Yeah, I guess I wasn't late enough to the party either. Would have liked to score Gears for free.

Is anyone else's card not downclocking anymore? Mine downclocks on the desktop like normal after a driver reinstall, but after the next PC reboot it goes up to 1771 MHz core and stays there on the desktop and in other non-gaming tasks. Mine is hooked up via a DVI-to-HDMI cable to a single 60Hz monitor (don't laugh).


----------



## tps3443

Quote:


> Originally Posted by *invincible20xx*
> 
> can i actually expect close to 60fps maxed out (without AA at all) on all or most games @ 4k with the 1080 @ over 2ghz ?


The GTX 1080 is a 4K/UHD gaming card. If anyone wants to disagree with me, that's fine, but these are my findings from several benchmarks with my card overclocked to its best. A lot of reviewers have found the GTX 1080 to be a very suitable 4K gaming card. It manages 60+ fps in several titles that I play. If you want to play at 4K, overclock it to death and it will do it very well!

Doom 2016 is easy to run; I've never gone below 60 fps with AA on Nightmare.

Fallout 4 without AA manages 60 fps at 4K, and can dip as low as 40 every now and then even overclocked.

Project CARS averages 70 fps without AA at 4K.

Forza 6 averages 100 fps maxed out at 4K.

BF1 beta averages 54 fps with FXAA at 4K.

BF4 averages 75 fps without AA at 4K.

GTA V maxed out with no AA averages 57 fps at 4K.

Overwatch maxed out on Epic at 4K averages 60-75 fps with AA, 100% resolution slider.

Star Wars Battlefront averages 65 fps at 4K maxed out.

My GTX 1080 does phenomenally. It plays 4K very well. Turn off AA, and when it's overclocked it usually plays everything very smoothly. My card usually runs around 2114/11,400.

It is a great card; when you overclock it you can play games at ultra settings at 4K resolution. Some games require minor adjustment, but usually that adjustment doesn't affect image quality.


----------



## boredgunner

Quote:


> Originally Posted by *tps3443*
> 
> Some games require minor adjustment, but usually that adjustment doesn't affect image quality.


This is really the key. Granted it may affect image quality, just not substantially. Be willing to let go of ultra settings and you may be pleasantly surprised.


----------



## tps3443

Yes, but he asked whether it will average close to 60 fps in most games.

The simple answer is: YES!

Every game I play is maxed out without AA, and a lot of titles run beyond 60 fps, while others land in the 50s. In some games you can run AA, like Overwatch, Project CARS, and Doom.

But most of the time everything is very smooth, playing around 60 fps.

The RX 480 and GTX 1060 are supposed to be the 1080p kings, but in Metro 2033 Last Light with AA on they fall on their faces. Still the 1080p kings, lol.

In most titles the GTX 1080 pulls off 50-60 fps at 4K.

So I think 4K is a reasonable resolution, and the 1080 will push it just fine!


----------



## boredgunner

Quote:


> Originally Posted by *tps3443*
> 
> Yes, but he asked whether it will average close to 60 fps in most games.
> 
> The simple answer is: YES!
> 
> Every game I play is maxed out without AA, and a lot of titles run beyond 60 fps, while others land in the 50s. In some games you can run AA, like Overwatch, Project CARS, and Doom.
> 
> But most of the time everything is very smooth, playing around 60 fps.
> 
> The RX 480 and GTX 1060 are supposed to be the 1080p kings, but in Metro 2033 Last Light with AA on they fall on their faces. Still the 1080p kings, lol.
> 
> In most titles the GTX 1080 pulls off 50-60 fps at 4K.
> 
> So I think 4K is a reasonable resolution, and the 1080 will push it just fine!


Metro uses supersampling as well. Nobody can expect to use that in a graphically demanding game. 2x supersampling = rendering the image at double the resolution before applying various filters and then sampling it back down. Huge performance hit, let alone 4x and 8x supersampling.
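A toy sketch of why that is so expensive, assuming "2x" means doubling the resolution in each axis (as the description above reads) and a simple box filter for the downsample; the "scene" here is just a dummy checkerboard shader:

```python
import numpy as np

def shade(h, w):
    """Stand-in pixel shader: a checkerboard as a function of coordinates."""
    ys, xs = np.mgrid[0:h, 0:w]
    return ((xs + ys) % 2).astype(float)

def render_ssaa_2x(target_h, target_w):
    # Shade at double resolution in each axis -> 4x the shading work...
    hi = shade(target_h * 2, target_w * 2)
    # ...then box-filter each 2x2 block back down to one output pixel.
    return hi.reshape(target_h, 2, target_w, 2).mean(axis=(1, 3))

frame = render_ssaa_2x(1080, 1920)
# 2x SSAA shaded 2,073,600 * 4 samples for this 1920x1080 frame -- a 4x
# fill-rate cost, which is why it cripples demanding games.
```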


----------



## nexxusty

Quote:


> Originally Posted by *trivium nate*
> 
> yeah i play at 4k like deus and mirrors edge and stuff like that need more


4K is fine for an AMD CPU. CPU bottlenecks are lessened big time at 4K.


----------



## Synthetickiller

Quote:


> Originally Posted by *Cornerer*
> 
> May I ask what BIOSes you tested so far?


Sorry for missing your post. I only tried the FE BIOS, and it proved to be even more unstable, so I gave up after that. It wasn't worth flashing a ton of BIOSes and running through benchmarks; I just didn't have the time.
The fact that the performance dropped from day 1 to day 2 for no reason was enough for me to return it. That, and the card "only" clocked to about 2075ish, which is a joke for the premium asked. That's why I'm hoping the Zotac lives up to its reputation. With my luck, it won't break 2000, lol. We'll have to see.

At least the block is much more attractive and looks to be more streamlined. The EK block is rather terrible in terms of trapping air. The card currently has no air bubbles, but it takes forever to get everything out.


----------



## whitrzac

I'm no longer a member... I had my card for 2 days.

Sent it to some other caring family.

I'm sticking with my 980ti:thumb:


----------



## istudy92

Hey guys,

*I need help*, and I feel weird asking, since it has been a LONG time since I used that word around here (most issues can already be found by searching).

Here is the problem:

I have 2 new EVGA 1080 cards, an i7 6700K, 16GB RAM, and a Gigabyte Z170X Gaming 7.

1) I ran the Fire Strike Extreme SLI benchmark and got a score of 15000 (is this normal or good?).

2) I played Rise of the Tomb Raider and got 45 FPS average in SLI. I went to the NVIDIA control panel, turned off SLI, and BAM, same 45 FPS average.
I have power connected to both cards and I updated to the latest driver. The cards on my mobo are as follows:

PCIe x1 (sound card), then the PCIe x16 slot has a 1080, then the PCIe x8 slot has the other 1080, then my Wi-Fi card is in the PCIe x4 slot.

They have the bridge (regular one).

At first I thought ShadowPlay was a load of bull giving me bad FPS, but after testing it out with Dota 2 and GTA V, I could see that the second card (or one of the two) was not being utilized.

HOWEVER, when I use the Precision X OC OSD to display GPU info, both cards are running at 1900 MHz.

Please help =/

Note: GPU-Z shows both cards at PCIe 3.0 x8, but maybe they are oversaturated?? IDK
Note #2: I ran the Tomb Raider benchmark, and ShadowPlay showed my system doing 45-60 FPS, sometimes hitting 20 FPS... however the results from the benchmark showed a 95 FPS average.
WHY?


----------



## tps3443

Quote:


> Originally Posted by *whitrzac*
> 
> I'm no longer a member... I had my card for 2 days.
> 
> Sent it to some other caring family.
> 
> I'm sticking with my 980ti:thumb:


$700 is a lot to spend, especially considering you already had a GTX 980 Ti. So I totally get it!

Although a reference GTX 1080 is still quicker, a GTX 980 Ti is still faster on LN2 and clocks nearly as high, lol.

Playing at 1440p doesn't show much of a noticeable difference, though. Both perform great.

I'm definitely not getting rid of mine, though; I don't have a GTX 980 Ti to fall back on!

GTX 980s in general are still top-performing cards!

Wait for the GTX 1080 Ti if you want 15-25% more power than a GTX 1080. Then again, by the time the 1080 Ti is released in 8-10 months, next gen will be here. Vicious cycle all over again!


----------



## tps3443

Quote:


> Originally Posted by *boredgunner*
> 
> Metro uses supersampling as well. Nobody can expect to use that in a graphically demanding game. 2x supersampling = rendering the image at double the resolution before applying various filters and then sampling it back down. Huge performance hit, let alone 4x and 8x supersampling.


I had no idea it used supersampling!

I was wondering why it killed my old RX 480 so badly.

Have you seen the AA called DSX? Project CARS has it; I was playing at 4K, enabled DSX9, lol, and I was getting 20 fps. Never heard of DSX AA. Then I enabled standard FXAA and was back up to 75 fps.


----------



## tps3443

You're getting single-card performance. My graphics score is 25,000 with a single GTX 1080, and my overall is 18,500 with a 6600K. Bear in mind, both components are overclocked substantially.

But it sounds like single-card performance to me.


----------



## istudy92

Quote:


> Originally Posted by *tps3443*
> 
> You're getting single-card performance. My graphics score is 25,000 with a single GTX 1080, and my overall is 18,500 with a 6600K. Bear in mind, both components are overclocked substantially.
> 
> But it sounds like single-card performance to me.


Fire Strike Extreme SLI is 15000, not the regular test.

Anywho, I am going crazy. I removed the sound card and moved my Wi-Fi card to an x1 PCIe slot; nothing. SLI doesn't increase Tomb Raider or GTA V performance, and Dota actually DECREASES in FPS when going to SLI!!!


----------



## tps3443

Quote:


> Originally Posted by *istudy92*
> 
> Fire Strike Extreme SLI is 15000, not the regular test.
> 
> Anywho, I am going crazy. I removed the sound card and moved my Wi-Fi card to an x1 PCIe slot; nothing. SLI doesn't increase Tomb Raider or GTA V performance, and Dota actually DECREASES in FPS when going to SLI!!!


SLI is finicky sometimes; there are games that show worse performance with two cards, while others scale close to double. It's the nature of the beast.

Look at reviews to see if your numbers are in line.
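One quick way to check whether the second card is actually being used (beyond the Precision X OSD) is to log per-GPU utilization with `nvidia-smi --query-gpu=index,utilization.gpu --format=csv -l 1` while the game runs. Here's a small sketch of reading such a log; the sample output below is invented for illustration:

```python
import csv
import io

# Made-up sample of nvidia-smi CSV output: GPU 0 busy, GPU 1 nearly idle.
sample_log = """index, utilization.gpu [%]
0, 98 %
1, 4 %
0, 97 %
1, 6 %
"""

def average_utilization(log_text):
    """Average the logged utilization per GPU index."""
    totals, counts = {}, {}
    for row in csv.DictReader(io.StringIO(log_text), skipinitialspace=True):
        idx = int(row["index"])
        pct = int(row["utilization.gpu [%]"].rstrip(" %"))
        totals[idx] = totals.get(idx, 0) + pct
        counts[idx] = counts.get(idx, 0) + 1
    return {idx: totals[idx] / counts[idx] for idx in totals}

# One card pinned near 100% while the other idles in the single digits is
# the classic signature of a game rendering on only one GPU.
print(average_utilization(sample_log))
```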


----------



## pez

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah. I guess that I wasn't late enough to the party either. Would have liked to score Gears for free.
> 
> Is anyone else's card not downclocking anymore? Mine downclocks on the desktop like normal after a driver reinstall, but after the next PC reboot it goes up to 1771 MHz on the core and stays there on the desktop and in other non-gaming tasks. Mine is hooked up via a DVI-to-HDMI cable to a single monitor that is only 60 Hz. (Don't laugh.)


Are you using Afterburner? I noticed this issue with a beta release (I believe others did too). The newest one available seems to have stopped doing this for me. I don't know what causes it, though, and I feel 372.70 is slightly more aggressive about downclocking my card while gaming.

Quote:


> Originally Posted by *tps3443*
> 
> The GTX 1080 is a 4K/UHD gaming card. If anyone wants to disagree with me, that's fine, but these are my findings from several benchmarks with my card overclocked to its best. A lot of reviewers have found the GTX 1080 to be a very suitable 4K gaming card. It manages 60+ fps in several titles that I play. If you want to play at 4K, overclock it to death and it will do it very well!
> 
> Doom 2016 is easy to run; I've never gone below 60 fps with AA on Nightmare.
> 
> Fallout 4 without AA manages 60 fps at 4K, and can dip as low as 40 every now and then even overclocked.
> 
> Project CARS averages 70 fps without AA at 4K.
> 
> Forza 6 averages 100 fps maxed out at 4K.
> 
> BF1 beta averages 54 fps with FXAA at 4K.
> 
> BF4 averages 75 fps without AA at 4K.
> 
> GTA V maxed out with no AA averages 57 fps at 4K.
> 
> Overwatch maxed out on Epic at 4K averages 60-75 fps with AA, 100% resolution slider.
> 
> Star Wars Battlefront averages 65 fps at 4K maxed out.
> 
> My GTX 1080 does phenomenally. It plays 4K very well. Turn off AA, and when it's overclocked it usually plays everything very smoothly. My card usually runs around 2114/11,400.
> 
> It is a great card; when you overclock it you can play games at ultra settings at 4K resolution. Some games require minor adjustment, but usually that adjustment doesn't affect image quality.


I guess I have just become picky over the years. With G-Sync, dropping below 45 FPS can be pretty distracting and noticeably 'less-than-stellar'; without G-Sync, it is noticeable beyond a doubt and immersion-breaking at that point. Games like GTA V are going to fall well under 60 FPS (even without AA) in many parts (forestation, grass, rendering distance around the desert). It's very doable if you're willing to compromise a bit, for sure, but IMO minimums are vastly more relevant than averages.
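The distinction between minimums and averages is easy to put in numbers. A sketch with invented frame times, comparing the average against a "1% low" (the average of the worst 1% of frames):

```python
# Invented capture: ~990 smooth 60 fps frames plus 10 big 45 ms hitches.
frame_times_ms = [16.7] * 990 + [45.0] * 10

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def one_percent_low_fps(times_ms):
    worst = sorted(times_ms, reverse=True)     # slowest frames first
    n = max(1, len(times_ms) // 100)           # the worst 1% of frames
    return 1000.0 * n / sum(worst[:n])

# The average still reads ~59 fps, but the 1% low sits near 22 fps --
# exactly the gap that feels like stutter even when the average looks fine.
print(avg_fps(frame_times_ms), one_percent_low_fps(frame_times_ms))
```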

As a side note, it's nice to see another RDU/NC user on OCN









----------



## FattysGoneWild

Quote:


> Originally Posted by *tps3443*
> 
> The GTX 1080 is a 4K/UHD gaming card. If anyone wants to disagree with me, that's fine, but these are my findings from several benchmarks with my card overclocked to its best. A lot of reviewers have found the GTX 1080 to be a very suitable 4K gaming card. It manages 60+ fps in several titles that I play. If you want to play at 4K, overclock it to death and it will do it very well!
> 
> Doom 2016 is easy to run; I've never gone below 60 fps with AA on Nightmare.
> 
> Fallout 4 without AA manages 60 fps at 4K, and can dip as low as 40 every now and then even overclocked.
> 
> Project CARS averages 70 fps without AA at 4K.
> 
> Forza 6 averages 100 fps maxed out at 4K.
> 
> BF1 beta averages 54 fps with FXAA at 4K.
> 
> BF4 averages 75 fps without AA at 4K.
> 
> GTA V maxed out with no AA averages 57 fps at 4K.
> 
> Overwatch maxed out on Epic at 4K averages 60-75 fps with AA, 100% resolution slider.
> 
> Star Wars Battlefront averages 65 fps at 4K maxed out.
> 
> My GTX 1080 does phenomenally. It plays 4K very well. Turn off AA, and when it's overclocked it usually plays everything very smoothly. My card usually runs around 2114/11,400.
> 
> It is a great card; when you overclock it you can play games at ultra settings at 4K resolution. Some games require minor adjustment, but usually that adjustment doesn't affect image quality.


Nope. And before I even start: I have a 4790K, 16GB RAM, 1440p G-Sync, and a 1080. You must be hitting the good stuff if you think this is a 4K card. It's not, at max settings. It's a cross between a 1080p and 1440p max-settings @60fps+ card, and The Witcher 3 can't even pull that off consistently at all Ultra with HairWorks on. And the newer games that will really push the 1080 haven't even hit yet; as games get more demanding later this year and early next, 4K numbers will TANK. You are really misleading people into thinking they can game at 4K 60+ fps max settings. It's a 4K card IF you are willing to turn down settings, and as I said earlier, the games coming out are only going to be more demanding. 1440p with a 1070/1080 is where it's at. Volta might change that next year, but until then, yeah.

Not a personal attack, and please don't take it that way. I just strongly disagree with you on this one. I can't be the only one.


----------



## Cornerer

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Nope. And before I even start: I have a 4790K, 16GB RAM, 1440p G-Sync, and a 1080. You must be hitting the good stuff if you think this is a 4K card. It's not, at max settings. It's a cross between a 1080p and 1440p max-settings @60fps+ card, and The Witcher 3 can't even pull that off consistently at all Ultra with HairWorks on. And the newer games that will really push the 1080 haven't even hit yet; as games get more demanding later this year and early next, 4K numbers will TANK. You are really misleading people into thinking they can game at 4K 60+ fps max settings. It's a 4K card IF you are willing to turn down settings, and as I said earlier, the games coming out are only going to be more demanding. 1440p with a 1070/1080 is where it's at. Volta might change that next year, but until then, yeah.
> 
> Not a personal attack, and please don't take it that way. I just strongly disagree with you on this one. I can't be the only one.


+1
It really depends on the individual's needs for graphics quality. AA off is easily recognizable even in highly compressed, appalling YouTube footage, and is entirely unacceptable on screen to my eyes.
It's possible that Vulkan might help things out a huge chunk once future games are optimised for it (like DOOM). Maybe it's finally time to go AMD Vega?


----------



## pez

I was very hard pressed to notice the lack of AA in most games. It was glaringly obvious for me in FO4 even at 4K, but in games like Crysis 3 I had to seriously look for it. The remedy that won't hit you too hard on performance at 4K is usually FXAA. I don't like every game's implementation of it, but it's a viable compromise that never bothered me too much.


----------



## Dry Bonez

Hey OCN, I just bought an EVGA GTX 1080 SC for $430. Is that a good price?


----------



## KickAssCop

No it is not. Send it to me for 250.


----------



## juniordnz

I don't really care at all about FXAA/MSAA/TXAA. Actually, I disable AA in most games I play; I just don't like the blur effect on the edges. I much prefer the sharpness with AA off.

And I play at 1080p! Maybe it's because I sit a good 50cm from the screen, so I don't see the pixels on the edges that much. I don't get how some people rub their noses on the screen; that's not healthy at all, and I'd get a headache within half an hour like that, lol.


----------



## Bishop07764

Quote:


> Originally Posted by *pez*
> 
> Are you using Afterburner? I noticed this issue (I believe others did too) with a beta release. The newest one available seems to have stopped doing this for me. I don't know what it is, though, but I feel 372.70 is slightly more aggressive with downclocking my card while gaming.


Yeah, I'm using the latest Afterburner. I mean, it's not a huge deal, as I don't keep my computer on 24/7; it's more annoying than anything. It doesn't downclock in games unless I'm on a title screen or in a map menu. In gameplay it stays at a solid 2126 on the core. Maybe GPU Boost 3.0 is boosting my card during idle because it idles in the 20s °C.








Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey OCN, I just bought an EVGA GTX 1080 SC for $430. Is that a good price?


I think you just bought a 1070.


----------



## Synthetickiller

For everyone running SLI, how are you guys liking it?
Everything I read about DX12 & SLI says to stay very, very far away, but most people who comment on it are very happy with the implementation. After having 570 SLI and a 690, I feel naked only running one card, lol. And it's a good reason to upgrade to a 1440p 120Hz/144Hz or 4K 60Hz monitor.


----------



## st0necold

Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey OCN, I just bought an EVGA GTX 1080 SC for $430. Is that a good price?


Horrible price..

Are you serious? You can answer your own question very quickly just search Amazon, Newegg, etc.. and compare your price to the ones listed.


----------



## juniordnz

Quote:


> Originally Posted by *st0necold*
> 
> Horrible price..
> 
> Are you serious? You can answer your own question very quickly just search Amazon, Newegg, etc.. and compare your price to the ones listed.


Give him a break, he's just trying to brag about the super duper deal he got and show everyone how cool and superior he is for paying less on hardware.


----------



## Joshwaa

Got the EKWB block on my FTW. Temps never go above 39°C now, even after hours of gaming or folding. Love'n it! The lowest it downclocks to now is 2112MHz. Will do some benching later.


----------



## tps3443

Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey OCN, I just bought an EVGA GTX 1080 SC for $430. Is that a good price?


I paid $280 for mine. One of those once in a lifetime deals lol


----------



## juniordnz

Quote:


> Originally Posted by *tps3443*
> 
> I paid $280 for mine. One of those once in a lifetime deals lol


I paid $1450 for mine; one of those lifetime regrets, lol.


----------



## tps3443

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Nope. And before I even start: I have a 4790K, 16GB RAM, 1440p G-Sync, and a 1080. You must be hitting the good stuff if you think this is a 4K card. It's not, at max settings. It's a cross between a 1080p and 1440p max-settings @60fps+ card, and The Witcher 3 can't even pull that off consistently at all Ultra with HairWorks on. And the newer games that will really push the 1080 haven't even hit yet; as games get more demanding later this year and early next, 4K numbers will TANK. You are really misleading people into thinking they can game at 4K 60+ fps max settings. It's a 4K card IF you are willing to turn down settings, and as I said earlier, the games coming out are only going to be more demanding. 1440p with a 1070/1080 is where it's at. Volta might change that next year, but until then, yeah.
> 
> Not a personal attack, and please don't take it that way. I just strongly disagree with you on this one. I can't be the only one.


An average of 60 and a minimum of 60 are two different things.

Although, in over half of my games it provides that 60/60+ performance.

Like I said, I find it more than capable of playing at 4K. I do not have The Witcher 3, so I cannot account for that, although a few users on here said that maxed out it achieves 45 fps, which is very playable. Hence most games are playable and can achieve close to 60 fps.

I just responded to a question with my findings without AA, and said YES, it will run 4K.

Every game I play handles 4K very well. In some cases I do not run AA, unless it is TAA or FXAA, which show virtually no performance impact.

Yes, some games will drop in frame rate. Fallout 4 will drop to 40, and GTA V may drop to 35 fps. That is going to happen. Although that is a very acceptable minimum frame rate, considering the image quality the card is outputting.

I guess what I'm getting at is that I use my card for actual gaming. It has about 24 hours in Fallout so far, and a lot of use in other games. I am totally OK with an average of 50-60 fps, and even with minimums that dip below 40 sometimes.

My minimum in Doom on Nightmare with 8xAA is 59.

In Overwatch it is 64 minimum. This card runs 4K like a dream, dude.

Star Wars Battlefront runs with the same locked 60 fps performance.

Just because every single game does not provide that 60 minimum and 60 average does not mean it is not an outstanding 4K card. I have a 4K panel, and I believe you would be surprised at how well it performs!

How is this not acceptable performance?

Keep in mind, the keywords are:

"Most games"

"Around 60 fps avg"

"Without AA"

He didn't ask whether the GTX 1080 can provide 4K performance while never ever going below 60 fps with the highest detail, highest AA, highest everything at 4K resolution.

He just asked whether it can average around 60 in most games without AA, lol.

Because I've got ShadowPlay footage that says otherwise.

So, in the end it will provide playable performance.

I've pushed my card to a 25,600 average graphics score in Fire Strike. So it is overclocked, and it needs to be to do this.


----------



## Bishop07764

Quote:


> Originally Posted by *Joshwaa*
> 
> Got the EKWB block on my FTW. Temps never go above 39°C now, even after hours of gaming or folding. Love'n it! The lowest it downclocks to now is 2112MHz. Will do some benching later.


Great temps. Does it still downclock even at 39°C?


----------



## Synthetickiller

Quote:


> Originally Posted by *Joshwaa*
> 
> Got the EKWB block on my FTW. Temps never go above 39°C now, even after hours of gaming or folding. Love'n it! The lowest it downclocks to now is 2112MHz. Will do some benching later.


What size is your loop?
I think you've just proven that I have to return my card (even though it's in the RMA process, you've now confirmed it).
My MSI EK X hits 48°C during Time Spy. It does get the heat from a moderately clocked 4.7GHz 4790K @ 1.3V, though. I've got a dual-180mm rad with 2000 RPM AP182s.

39°C is excellent.


----------



## GreedyMuffin

I've got a 360mm (XTX) in push-pull and an XT240 in push. I get 45-47°C with a good OC.


----------



## rdhrdh

What sort of performance increase are people generally seeing when switching from air to liquid?

I have this cooler attached to my 6700K. Would it be worth getting a GPU block for my Asus A8G 1080 and expanding that loop?

I suppose the question of worth varies by person, but am I likely to see a >5% increase in performance?


----------



## Derpinheimer

Quote:


> Originally Posted by *rdhrdh*
> 
> What sort of performance increase are people generally seeing when switching from air to liquid?
> 
> I have this cooler attached my 6700k. Would it be worth getting a gpu block for my Asus A8G 1080 and expanding that loop?
> 
> I suppose the question of worth varies by person, but will I likely see >5% increase in performance?


No


----------



## rdhrdh

Quote:


> Originally Posted by *Derpinheimer*
> 
> No


Thanks.

And just to clarify: are you saying no as a blanket statement about water cooling vs. air, or because of the cooler/rad/loop I'm looking to expand?


----------



## Derpinheimer

Quote:


> Originally Posted by *rdhrdh*
> 
> Thanks.
> 
> And just to clarify, are you saying no as a blanket statement to all water cooling vs air or because of the cooler/rad/loop i'm looking to expand.


Yep, not worth it on this generation, IMO. You might see 5%, but that's not typical. From air to water my max OC is unchanged on memory, and went from 2113 to 2150 on the core. Factor in the throttling on air and it's more like 2050 vs 2150.


----------



## Joshwaa

Quote:


> Originally Posted by *Bishop07764*
> 
> Great temps. Does it still downclock even at 39°C?


Yes, at 39°C it clocks down one notch from whatever I set the clock to.


----------



## rdhrdh

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yep, not worth it on this generation imo. You might see 5% but not typical. From air to water my max oc is unchanged on memory, and from 2113 to 2150 on core. Factor in the throttling on air and it's more like 2050 vs 2150.


Ahh okay, thank you. Adding a custom block likely voids the warranty too, I assume. It doesn't seem worth it for that small a performance increase.


----------



## Joshwaa

Quote:


> Originally Posted by *Synthetickiller*
> 
> What size is your loop?
> I think you've just proven that I have to return my card (even though it's in the RMA process, you've now confirmed it).
> My MSI EK X hits 48C during Time Spy. It does get the heat from a moderately clocked 4.7ghz 4790k @ 1.3v. I've got a dual 180mm rad w/ 2000 rpm AP182s.
> 
> 39C is excellent.


I have just the 1080 and a 4790K at 4.6GHz. I have an EK XRES 140 with a D5 Vario, a Monsta 80mm-thick 280 (front intake), a 30mm 280 (top exhaust), and a 45mm 140 (back exhaust) in a Corsair 760T. Using Gelid Xtreme on both the EKWB and the EK CPU block. Running all Noctua industrial 3000 RPM PWM fans @ 800 RPM; nice and quiet.


----------



## tps3443

Quote:


> Originally Posted by *Joshwaa*
> 
> Got the EKWB block on my FTW. Temps never go above 39°C now, even after hours of gaming or folding. Love'n it! The lowest it downclocks to now is 2112MHz. Will do some benching later.


What kind of GPU boost does it maintain in gaming? Is this at 4K and it holds at 2,112 GPU clock?


----------



## FattysGoneWild

Quote:


> Originally Posted by *juniordnz*
> 
> Give him a break, he's just trying to brag about the super duper deal he got and show everyone how cool and superior he is for paying less on hardware.


True. All I can think of is the poor sap that needed money so desperately and took a big loss.


----------



## Joshwaa

Quote:


> Originally Posted by *tps3443*
> 
> What kind of GPU boost does it maintain in gaming? Is this at 4K and it holds at 2,112 GPU clock?


I have not tinkered much with it yet, I am sure it could go higher. I believe 2126 or 2128 is my normal clock. Gaming at 1440P. Do not have a 4K monitor.


----------



## juniordnz

What's the secret to getting the Precision X OC OSD to work? I've tried everything; this thing just doesn't show!


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> What's the secret to get PrecisonX OC OSD to work? Tried everything, this thing just doesn't show!


Well, at 4K it doesn't work for me either, only 1080p, lol. Precision X 6.06 and the versions before it have a lot of issues; hardly anything functions correctly. My OSD works about once every week or so. I wish I knew, bud!

It is finicky, lol. One day it works, and the next it doesn't.


----------



## Spieler4

Quote:


> Originally Posted by *Synthetickiller*
> 
> For everyone running SLI, how are you guys liking it?
> Everything I read about DX12 & SLI says to stay very, very far away, but most people who comment on it are very happy with the implementation. After having 570 SLI and a 690, I feel naked only running one card, lol. And it's a good reason to upgrade to a 1440p 120Hz/144Hz or 4K 60Hz monitor.


Been trying dual SLI with GTX 1080 Sea Hawk X cards + the HB bridge for a couple of days now, and I am very happy with it.
Playing games like ACU, GTA, and BF2 at 5K (4.00x DSR factor) on a 1440p G-Sync monitor.

Win 7 vs Win 10:
Win 7 shared memory: 7905 MB
Win 10 shared memory: 8161 MB

The GPUs can OC around +110 in SLI, enough to reach 2025MHz; with one card, +169 in AB.
I cannot OC the VRAM in SLI without crashing; with one card, +500 in AB.

Also, Win 10 utilizes more VRAM in games than Win 7 = more shared memory, I guess.
16GB of RAM is not enough with dual SLI on Win 10 at 5K. Win 7 works better with DirectX 9-11 games for me.

Also, GPU utilization in SLI is lower than with one card = lower temps.
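For anyone puzzling over the "5K via 4.00x DSR" numbers above: the DSR factor multiplies the total pixel count, so each axis scales by the square root of the factor. A quick sketch:

```python
import math

def dsr_resolution(width, height, factor):
    """Render resolution for a given DSR factor (factor scales pixel count)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

assert dsr_resolution(2560, 1440, 4.00) == (5120, 2880)  # 1440p x 4.00 = "5K"
assert dsr_resolution(1920, 1080, 4.00) == (3840, 2160)  # 1080p x 4.00 = 4K
assert dsr_resolution(2560, 1440, 2.25) == (3840, 2160)  # 1440p x 2.25 = 4K
```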


----------



## juniordnz

Quote:


> Originally Posted by *tps3443*
> 
> Well, at 4K it doesn't work for me either, only 1080p, lol. Precision X 6.06 and the versions before it have a lot of issues; hardly anything functions correctly. My OSD works about once every week or so. I wish I knew, bud!
> 
> It is finicky, lol. One day it works, and the next it doesn't.


Only moved to Precision X OC for the FPS cap. Otherwise I would still be using AB. =(


----------



## tin0

I have been fiddling with trying to watercool my MSI 1080 ARMOR OC. I first tried mounting the Corsair GPU bracket combined with an H80 pump, but quickly decided to get a full-cover EK block for it.
I've been trying to install the bastard for days now. First it turned out this block is meant for the GAMING X / Z cards with a backplate, so the screws were too long, since the ARMOR doesn't have a backplate (you could see the block moving up and down because of the gap between the PCB and the screws, not good).
So I ordered an EK TF6 backplate... well... the screws are still too long, ffs. I currently have 11mm ones, with 6mm ones on the way. But I got so tired of it after 4/5 attempts that I ordered a Titan X Pascal instead.

I decided to do one last ultimate mounting attempt with some MX-4 and the stock cooler as a goodbye to the 1080, and I have to say I'm pleasantly surprised by the improvement (OK, room temp is also a lot lower than last time I tried, when it was burning hot outside). Before, I could barely reach a 24K graphics score; now I almost hit 25K, and the GPU never passes 55°C @ 2126MHz steady. This one is a good clocker.


----------



## Fediuld

Quote:


> Originally Posted by *tin0*
> 
> I have been fiddling with trying to watercool my MSI 1080 ARMOR OC. I first tried mounting the Corsair GPU bracket combined with an H80 pump, but quickly decided to get a full-cover EK block for it.
> I've been trying to install the bastard for days now. First it turned out this block is meant for the GAMING X / Z cards with a backplate, so the screws were too long, since the ARMOR doesn't have a backplate (you could see the block moving up and down because of the gap between the PCB and the screws, not good).
> So I ordered an EK TF6 backplate... well... the screws are still too long, ffs. I currently have 11mm ones, with 6mm ones on the way. But I got so tired of it after 4/5 attempts that I ordered a Titan X Pascal instead.
> 
> I decided to do one last ultimate mounting attempt with some MX-4 and the stock cooler as a goodbye to the 1080, and I have to say I'm pleasantly surprised by the improvement (OK, room temp is also a lot lower than last time I tried, when it was burning hot outside). Before, I could barely reach a 24K graphics score; now I almost hit 25K, and the GPU never passes 55°C @ 2126MHz steady. This one is a good clocker.


It looks like all the Armor OCs are good clockers.

Mine works happily at 2177 @ 1.081V under water. 2190 @ 1.093V can be done, but the performance is exactly the same as at 2177.
To improve it, I'd have to apply CLU to the 3 resistors, but I've chickened out for more than a month now.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *Synthetickiller*
> 
> For everyone running SLI, how are you guys liking it?
> Everything I read about DX12 & SLI says to stay very, very far away, but most people who comment on it are very happy with the implementation. After having 570 SLI and a 690, I feel naked only running one card, lol. And it's a good reason to upgrade to a 1440p 120Hz/144Hz or 4K 60Hz monitor.


Stick with 1440p/120hz.

I'm running SLI and 4K/60hz, and while it works on AAA games without much issue, it doesn't work on all of them. I can run GTA5 maxed out on everything but grass at 4K and never drop below 60fps, and both cards don't have to push very hard. I've been playing The Division a lot, and I get the same performance whether I'm using one 1080 or 2x1080s in SLI, hovering around 45fps. Basically the only boost I got from going from an overclocked R9 290 to this GTX 1080 in The Div was turning the settings from low to medium at 4K, but I played in 4K with both cards (yes, I know, it's almost unbelievable that I could play 4K at all with a 290, but I did and was surprised at how small of a jump I experienced in performance.

Apparently Ubisoft hates SLI and generally does a bad job supporting it. But a friend I play with has two 1080s in SLI and sees a big performance bump; we're both running the same drivers, reinstalled Win10 together, and have only slightly different hardware. So who knows.

Battlefield 1 ran buttery-smooth during the beta at 4K, though I didn't have enough time to give it a good testing, since I only had the game for one day before it went off beta. The scaling in 3dmark ultra was almost exactly 2x when adding the second card, for whatever that's worth.

I'm running both under water, and with CPU in the loop on a single (probably overtaxed) 280mm radiator. Usually hover around 45c while gaming, sometimes it can get to the low 50s C, but it remains dead silent the whole time, so working as intended. I'm sure if I added a 360 or 420mm radiator on top, I could do cool and quiet, but I like the way my system looks and sounds now.
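For readers wanting to quantify "almost exactly 2x", SLI scaling is just the dual-card FPS divided by the single-card FPS. A quick sketch of that arithmetic (the FPS numbers below are illustrative placeholders, not measurements from this post):

```python
# Rough SLI scaling check: compare single-card vs dual-card FPS.
# The sample numbers are illustrative placeholders, not benchmarks.

def sli_scaling(fps_single: float, fps_sli: float) -> float:
    """Return the scaling factor: 2.0 is perfect scaling, 1.0 is none."""
    return fps_sli / fps_single

# A 3DMark-style near-perfect result vs a game with a broken SLI profile
print(sli_scaling(30.0, 59.0))  # ~1.97: close to the ~2x seen in 3DMark
print(sli_scaling(45.0, 45.0))  # 1.0: the second card adds nothing
```

Anything much below ~1.5 usually means the game's SLI profile (not the hardware) is the limiting factor.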


----------



## ShortySmalls

Quote:


> Originally Posted by *rdhrdh*
> 
> What sort of performance increase are people generally seeing when switching from air to liquid?
> 
> I have this cooler attached my 6700k. Would it be worth getting a gpu block for my Asus A8G 1080 and expanding that loop?
> 
> I suppose the question of worth varies by person, but will I likely see >5% increase in performance?


I'm running mine on a triple 60mm thick rad, and it runs a lot quieter now than the gigabyte heatsink and fan did. But my overclock is the same. I load around 45-50*c in a gpu/cpu loop with silent fans.


----------



## Bishop07764

Quote:


> Originally Posted by *Joshwaa*
> 
> Yes at 39C it clocks down one notch from whatever I set the clock to.


I get similar temps, about 37C usually. I haven't seen mine downclock; I thought it was closer to 50C when they downclocked. I can't seem to game beyond 2126 without artifacts in Just Cause 3. Mine won't raise the voltage even if I max the slider. I should get around to trying the curve, I suppose.


----------



## tps3443

Quote:


> Originally Posted by *Spieler4*
> 
> Been trying some dual SLI with "gtx 1080 sea hawk x" + hb bridge a couple of days now I am very happy with it
> playing games like ACU, GTA, BF2 in 5K 4.00 X DSR - factor on a 1440p monitor g-sync
> 
> win 7 vs win 10
> win 7 shared mem 7905MB
> win 10 shared mem 8161MB
> 
> Gpu can OC around +110 in SLI. enough to reach 2025mhz. With one card OC +169 in AB
> cannot OC VRAM in SLI with out crashing. With 1 card OC +500 in AB
> 
> Also win 10 utilizes more VRAM in games than win 7 = more shared mem I guess
> 16 gb ram is not enough when dual SLI win 10 in 5K. Win 7 works better with direct x 9 -11 games for me
> 
> Also gpu utilization in SLI is less than 1 card = lower temps


I'm considering adding another. I'm kind of relieved my mini-ITX board died; I'm getting a bigger board, so I can actually run SLI now.

I'm just worried: I hear VR won't get SLI optimization. Is this true?


----------



## Synthetickiller

Quote:


> Originally Posted by *Spieler4*
> 
> Been trying some dual SLI with "gtx 1080 sea hawk x" + hb bridge a couple of days now I am very happy with it
> playing games like ACU, GTA, BF2 in 5K 4.00 X DSR - factor on a 1440p monitor g-sync
> 
> win 7 vs win 10
> win 7 shared mem 7905MB
> win 10 shared mem 8161MB
> 
> Gpu can OC around +110 in SLI. enough to reach 2025mhz. With one card OC +169 in AB
> cannot OC VRAM in SLI with out crashing. With 1 card OC +500 in AB
> 
> Also win 10 utilizes more VRAM in games than win 7 = more shared mem I guess
> 16 gb ram is not enough when dual SLI win 10 in 5K. Win 7 works better with direct x 9 -11 games for me
> 
> Also gpu utilization in SLI is less than 1 card = lower temps


I never had a chance to try the 1080 under Win 7. I've been using Win 10 since the last day the free upgrade was available, lol.
The 5K monitor must be a sight to see. Luckily, I have 32GB, so I could handle it, but I'm not sure my wallet could!

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> Stick with 1440p/120hz.
> 
> I'm running SLI and 4K/60hz, and while it works on AAA games without much issue, it doesn't work on all of them. I can run GTA5 maxed out on everything but grass at 4K and never drop below 60fps, and neither card has to push very hard. I've been playing The Division a lot, and I get the same performance whether I'm using one 1080 or 2x 1080s in SLI, hovering around 45fps. Basically the only boost I got going from an overclocked R9 290 to this GTX 1080 in The Division was turning the settings from low to medium at 4K (yes, I know, it's almost unbelievable that I could play at 4K at all with a 290, but I did, and I was surprised at how small a jump in performance I experienced).
> 
> Apparently Ubisoft hates SLI and generally does a bad job supporting it. But my friend who I play with SLIed 2 1080s and sees a big performance bump, we both are running same drivers, reinstalled win10 together, and only have slightly different hardware. So who knows.
> 
> Battlefield 1 ran buttery-smooth during the beta at 4K, though I didn't have enough time to give it a good testing, since I only had the game for one day before it went off beta. The scaling in 3dmark ultra was almost exactly 2x when adding the second card, for whatever that's worth.
> 
> I'm running both under water, and with CPU in the loop on a single (probably overtaxed) 280mm radiator. Usually hover around 45c while gaming, sometimes it can get to the low 50s C, but it remains dead silent the whole time, so working as intended. I'm sure if I added a 360 or 420mm radiator on top, I could do cool and quiet, but I like the way my system looks and sounds now.


Interesting.
I have to get rid of this DVI monitor setup at some point... deciding between 1440p & 4K is difficult b/c I don't just game on this PC. I also do a lot of reading/research, so my triple 27" 1440p screens are always full & I'm wanting more. I noticed that LG is coming out with a 38" 3840x1600. It's not 21:10, but it's better than 1440p. I used to run dual 2560x1600 & I really miss those extra 160 pixels. That might be a way to gain some reading room without having to go full 4K.

It's amazing how certain GPUs will scale up to 4k & be playable.

Wow, that's amazing. What are your ambient temps? The rental I'm in runs hot (it gets up to 25.5°C), so maybe that's why my temps are higher than other people's. My dual 180mm rad should put your loop to shame, no offense, it's just physics, lol.


----------



## tps3443

I'm about to get this system back together. I siliconed the lid back on and used liquid metal for my TIM after delidding my Silicon Lottery 4.9 6600K; it's really easy! That liquid metal stuff is hard to spread, though! Big daddy snoozing in the bed after his daily beatings. I just bought another cheap Z170 motherboard to get me by, lol; I was too impatient. I'll sell the Gigabyte ITX when it gets back from warranty repair.


----------



## jrcbandit

So I have 30 days left on a potential EVGA Step-Up. I'm currently running a 1070 GTX FTW edition and was thinking about stepping up to the 1080 GTX Gaming ACX 3.0 edition. Is it worth it? I decided to forgo SLI with my 1070 FTW since most EVGA 1070 FTWs now come with Micron memory instead of the Samsung memory found in the launch units (Micron on 1070s are either crappy overclockers or have a voltage bug that still hasn't been fixed in firmware). Also, SLI doesn't look very promising in DX12, which relies on the developers to implement it, so a single 1080 might be a better choice than 1070 SLI. Anyone know if I can ask EVGA for a GoW4 code if I step up, lol?


----------



## tps3443

Quote:


> Originally Posted by *jrcbandit*
> 
> So I have 30 days left on a potential EVGA Step-up, I'm currently running a 1070 GTX FTW edition and was thinking about stepping up to the 1080 GTX Gaming ACX 3.0 edition. Is it worth it? I decided to forgo SLI with my 1070 FTW since most EVGA 1070 FTW now come with Micron memory instead of Samsung memory found in the launch units (Micron on 1070s are either crappy overclockers or have voltage bug that still hasn't been fixed in firmware). Also, SLI doesn't look very promising in DX12 which relies on the developers to implement so a single 1080 might be a better choice than 1070 SLI. Anyone know if I can ask EVGA for GoW4 code if I step up lol?


I just came from a 1070 ACX SC to my 1080 pictured above your post. And it's a nice, refreshing boost! I think it's a great card for encoding, gaming, everything! I love my FE 1080, but I'd love it more if it were an ACX.

You cannot go wrong!


----------



## jleslie246

Quote:


> Originally Posted by *Synthetickiller*
> 
> I never had a chance to try the 1080 under win 7. I've been using Win 10 since the last day the free upgrade was available, lol.
> The 5k monitor must be a sight to see. Luckily, I have 32gb, so I could handle, but i'm not sure my wallet could!
> Interesting.
> I have to get rid of this DVI monitor set up at some point... deciding between 1440p & 4k is difficult b/c I don't just game on this PC. I also do a lot of reading/research, so my triple 27" 1440p screens are always full & i'm wanting more. I noticed that LG is coming out with a 38" 3840x1600. It's not 21:10, but it's better than 1440p. I used to run Dual 2560x1600 & I really miss those extra 180 pixels. That might be a way to gain some reading room without having to go full 4k.
> 
> It's amazing how certain GPUs will scale up to 4k & be playable.
> 
> Wow, that's amazing. What are you ambient temps? The rental I'm in runs hot (it gets up to 25.5°C), so maybe that's why my temps are higher than other people's. My Dual 180mm rad should put your loop to shame, no offense, it's just physics, lol.


I use a 2K 144Hz Dell monitor AND three 1920x1080 IPS 60Hz monitors, all on SLI 780s right now. I'm waiting for the 1080 Ti. Just letting you know you can actually mix things up, if you have a large desk of course.


----------



## tps3443

Quote:


> Originally Posted by *jleslie246*
> 
> I use a 2k 144Hz Dell monitor AND 3 1920x1080 IPS 60Hz monitors. All on SLI 780's right now. I'm waiting for the 1080ti. Just letting you know you can actually mix things up, if you have a large desk of course


The only reason I purchased a GTX 1080 when it was first released: well, for one, I got it really cheap; it was a once-in-a-lifetime type of deal that I just happened upon.

But I wanted to really enjoy its 20+ month life cycle, and I figure I'll upgrade every 2 years. I may buy another at tax time, or a Titan Pascal. Or I might just buy a really nice display and stick with my current GTX 1080.

But with my overclocks, I'm very satisfied I can play 4K with smooth frames. And in those demanding scenes every now and then, you never dip below 30 like the last-gen 900 series did.

If you don't mind turning off vsync and AA, it doesn't even do that. But technically, that's not considered completely maxed out.


----------



## Dry Bonez

Quote:


> Originally Posted by *Joshwaa*
> 
> I have just the 1080 and a 4790K at 4.6. Have EX res 140 w/D5 vario, Monsta 80mm 280(front intake), 30mm 280(top exhaust) and 45mm 140(back exhaust) in a Corsair 760T. Using Gelid Xtreme on both EKWB and EK Cpu block. Running all noctua industrial 3000rpm pwm fans @ 800rpm nice and quiet.


Out of curiosity, what voltage do you use to be stable at 4.6 on your 4790K?


----------



## grimboso

Should anyone come by an extra GOW promo code, PM me and I'll buy it off you!


----------



## RJacobs28

Quote:


> Originally Posted by *Dry Bonez*
> 
> Out of curiosity, what voltage do you use to be stable at 4.6 on your 4790K?


I have to use 1.27v to keep mine stable at 4.6 - oh well.


----------



## Joshwaa

Quote:


> Originally Posted by *Dry Bonez*
> 
> Out of curiosity, what voltage do you use to be stable at 4.6 on your 4790K?


Stock voltage.


----------



## Juub

Did I make a mistake purchasing the Asus Founder's Edition? I was gunning for the Zotac AMP! Extreme but couldn't find it. These cards can't overclock for a lick. I simply increased the clock by 200MHz and they lock up.









Got them in SLI.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *Synthetickiller*
> 
> I never had a chance to try the 1080 under win 7. I've been using Win 10 since the last day the free upgrade was available, lol.
> The 5k monitor must be a sight to see. Luckily, I have 32gb, so I could handle, but i'm not sure my wallet could!
> Interesting.
> I have to get rid of this DVI monitor set up at some point... deciding between 1440p & 4k is difficult b/c I don't just game on this PC. I also do a lot of reading/research, so my triple 27" 1440p screens are always full & i'm wanting more. I noticed that LG is coming out with a 38" 3840x1600. It's not 21:10, but it's better than 1440p. I used to run Dual 2560x1600 & I really miss those extra 180 pixels. That might be a way to gain some reading room without having to go full 4k.
> 
> It's amazing how certain GPUs will scale up to 4k & be playable.
> 
> Wow, that's amazing. What are you ambient temps? The rental I'm in runs hot (it gets up to 25.5°C), so maybe that's why my temps are higher than other people's. My Dual 180mm rad should put your loop to shame, no offense, it's just physics, lol.


Ambients are pretty consistently around 24C/75F. I'm a bit embarrassed by my temps, since I used to run a Black Ice GTX 360 with 2100rpm fans when I had 2x 6950s, but that's a lot of space and noise. This case looks really slick with the top closed up. I know if I ever really want to get to low-low temps, I can always pop the top panels off my case and add another 420 or 360mm up top, which should lower temps pretty dramatically. But I got what I wanted out of my 280mm-cooled system.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *Juub*
> 
> Did I make a mistake purchasing the Asus Founder's Edition? I was gunning for the Zotac Amp! Extreme but couldn't find it. These cards can't overclock for a lick. I simply increased the clock by 200Mhz and they lock up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got them in SLI.


The one thing we've learned is that the silicon lottery is the biggest determinant of clocks on these cards. A +200MHz offset is asking a lot of a single card, though.

What are you seeing for max sustained clock speed on the two cards when running in SLI? Try with no manual overclock but fans at 100%. One of the features of these cards is that GPU Boost 3.0 will over/underclock them with temps and use, which makes it more confusing/difficult to get clocks where you want them. I have two in SLI under water, and on their own, each card would pull over 2100, but in SLI they only go up to 2050. Also, I did manual overclocks with a single card, but with SLI I found it better to just zero everything out in Afterburner and let GPU Boost do its thing. It boosts them up to a mostly-sustained 2050 automatically. I've been fine with this, but I'm sure many people will want to tweak and manually adjust for more performance. As soon as I start trying to increase clocks manually, I quickly run into crashes, despite following everyone's recommendations on overclocking strategy from single-card use.
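For anyone new to GPU Boost 3.0's curve editor, the per-point offset idea can be sketched as a toy model. The voltage points, clocks, offsets, and cap below are invented for illustration, and real GPU Boost also reacts to power and temperature limits, which this ignores:

```python
# Toy model of an Afterburner-style voltage/frequency curve.
# Voltage points (V) -> target clock (MHz); all values are illustrative only.
base_curve = {0.800: 1700, 0.900: 1850, 1.000: 1975, 1.062: 2050, 1.093: 2100}

def apply_offsets(curve, offsets, cap_mhz):
    """Apply per-point MHz offsets, then cap so no point exceeds cap_mhz.

    Real GPU Boost 3.0 additionally flattens the curve at the cap and
    down-clocks with temperature/power, which this toy model ignores.
    """
    return {v: min(clk + offsets.get(v, 0), cap_mhz)
            for v, clk in curve.items()}

# Push only the top points, as people do when the low end is already stable
tuned = apply_offsets(base_curve, {1.062: 76, 1.093: 100}, cap_mhz=2126)
print(tuned)  # the two top points now sit at 2126; lower points unchanged
```

This is why "zeroing everything out" and nudging individual points behave so differently: a flat offset moves every point, while the curve editor lets you raise only the voltages your chip can actually sustain.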


----------



## AllGamer

Hey guys, what is a good game for Benching that is both CPU and GPU intensive.

Like Witcher 3 for example, but I'm looking for a few more titles.


----------



## juniordnz

Quote:


> Originally Posted by *AllGamer*
> 
> Hey guys, what is a good game for Benching that is both CPU and GPU intensive.
> 
> Like Witcher 3 for example, but I'm looking for a few more titles.


GTA V always crashes within a few minutes if my CPU has a bad overclock. Likewise, a bad GPU overclock gives me a lot of artifacts there. I can find out in a few minutes whether my OC is OK. I could pass a full Firestrike stress test and still crash miserably within 2 minutes of gameplay there.

Some say BF4 is also good, but I don't have it installed to check anymore...

Just remember to test without any FPS cap or vsync on.


----------



## AllGamer

Quote:


> Originally Posted by *juniordnz*
> 
> GTA V always crash in a few minutes if my CPU has a bad overclock. Also, bad GPU overclock gives me a lot of artifacts. I can find out in a few minutes if my OC is ok there. I could pass a full firestrike stress test and crash miserably in 2 minutes of gameplay there.
> 
> Some say BF4 is also good. But I don't have that installed to check anymore...
> 
> Just remember to test without any FPS caps nor vsync on.


Exactly! That's why I prefer to test with real games instead of Futuremark benches. It's happened to me before with other video cards and/or CPUs: they pass Futuremark flawlessly, yet crash in a real game shortly after it loads.

But the GTX 1080 is so good, not many games can tax it to its limits.


----------



## juniordnz

Quote:


> Originally Posted by *juniordnz*
> 
> GTA V always crash in a few minutes if my CPU has a bad overclock. Also, bad GPU overclock gives me a lot of artifacts. I can find out in a few minutes if my OC is ok there. I could pass a full firestrike stress test and crash miserably in 2 minutes of gameplay there.
> 
> Some say BF4 is also good. But I don't have that installed to check anymore...


Quote:


> Originally Posted by *AllGamer*
> 
> Exactly! why I prefer to test with real games, instead of Futuremark Benches, it happened to before too with other video cards and/or cpu, they can pass flawless with Futuremark, yet crash with a real game shortly after it loads into the game.
> 
> but the GTX1080 is so good, not many games can tax it to its limits.


I find the Firestrike stress test very useful for finding the maximum overclock at peak voltage. It helped me a lot, and I didn't see any difference in maximum clock between it and real gaming. Things got ugly when I tried to undervolt my card: Firestrike would accept the same clock at 1.031V, but GTA V would crash almost instantly every time until I got it back up to 1.062V.

With a 120fps cap I hit a CPU bottleneck long before I use 100% of the GPU in most games. That's why testing with an fps cap or vsync on is not a good idea: it may hide a bad overclock by not stressing the card to its full potential.

I'm currently at 2138MHz/1.062V rock solid. I'll try to lower it to 1.025V and see if I can get at least 2114MHz with that. That way I'd have 2100MHz stable after the first thermal-throttle clock down.
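The "2100MHz stable after the first thermal throttle clock down" logic relies on GPU Boost dropping the clock in small bins as temperature rises. A toy model of that arithmetic; the ~13MHz bin size matches what Pascal owners commonly report, but the temperature thresholds here are made up for illustration:

```python
# Toy model of Pascal thermal stepping: one ~13 MHz bin dropped per
# temperature threshold crossed. Thresholds are illustrative, not NVIDIA spec.
BIN_MHZ = 13
THRESHOLDS_C = [38, 45, 52, 60]  # made-up example thresholds

def effective_clock(set_clock_mhz: int, gpu_temp_c: float) -> int:
    """Clock after GPU Boost drops one bin per threshold crossed."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if gpu_temp_c >= t)
    return set_clock_mhz - bins_dropped * BIN_MHZ

print(effective_clock(2114, 30))  # cool card: the full 2114
print(effective_clock(2114, 40))  # one bin down: 2101, i.e. "~2100 stable"
```

That's why people aim a bin or two above their real target: the first warm-up step still lands where they want it.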


----------



## GreedyMuffin

My card hits a wall after 2189MHz.

Therefore I run 2012MHz at 900mV.

Low power usage and temps. :-D

I'm really happy with that OC. Mem is at +498.

Luckily, I don't need to worry about a CPU bottleneck. I usually see 20-30% usage in all games, and 99% on the GPU. And that's even at 4000MHz. :-D


----------



## juniordnz

Quote:


> Originally Posted by *GreedyMuffin*
> 
> My card hits a wall after 2189mhz.
> 
> Therefor I run 2012 at 900mv.
> 
> Low power usage and temps. :-D
> 
> I'm really happy with that OC. Mem is on 498+.
> 
> I don't need to care about CPU bottleneck luckily. Usually see 20-30% usage in all games, and 99% on GPU. And that is even on 4000mhz. :-D


That's because you've got a little beast under the hood. Mine is a lazy dog


----------



## Synthetickiller

Got my Zotac ArcticStorm... Night & day difference compared to the MSI Sea Hawk EK X.

Lower temps
Arguably better block (from a design standpoint)
Higher clocks @ lower voltages
Accommodates SLI bridges

I have yet to even hit 1.093V & I'm already stable at 2126MHz @ 1.062V. Seems like I got a winner.

That MSI card ran really hot. I have a feeling the block was not properly seated. It's interesting that MSI included thermal paste; I recall someone else mentioning that too, and I don't remember if anyone got an answer about reseating the block & warranty issues. Anyway, the card was just lackluster all around: poor temps, a poor chip, & honestly, it's not as attractive as the Zotac.

As well, the EK block does a piss-poor job of bleeding air. It just holds onto it; it took almost a day before most of the air was gone. I was able to bleed the loop & the Zotac card very, very quickly.

If anyone's on the fence, this card seems to live up to the hype!


----------



## Benjiw

Quote:


> Originally Posted by *Synthetickiller*
> 
> Got my Zotac ArcticStorm... Night & day difference comapred to the MSI Seahaw EK X.
> 
> Lower Temps
> Arguably better block (from a design standpoint)
> Higher clocks @ lower voltages
> Accommodates SLI bridges
> I have yet to even hit 1.093V & I'm already stable 2126mhz @ 1.062V. Seems like I got a winner.
> 
> That MSI card ran really hot. I have a feeling that the block was not properly seated. It's interesting that MSI included thermal paste. I recall someone else mentioning the inclusion of thermal paste. I don't remember if anyone got an answer to reseating the heatblock & warranty issues. Anyways, the card is got just was lack luster all around. Poor temps, poor chip & honestly, it's not as attractive as the Zotac.
> 
> As well, the EK block does a piss poor job of bleeding air. It just holds onto it for almost a day before most of the air is gone. I was able to bleed the loop & the Zotac card very, very quickly.
> 
> If anyone's on the fence, the card seems to live up to the hype!


Do you know if the block is aluminium or nickel plated copper? It seems to be a concern to a lot of people because of galvanic corrosion (mixed metals).


----------



## Synthetickiller

Quote:


> Originally Posted by *Benjiw*
> 
> Do you know if the block is aluminium or nickel plated copper? It seems to be a concern to a lot of people because of galvanic corrosion (mixed metals).


Off the top of my head, at least 10 pages back, someone contacted Zotac about this.
The end result is that the main block is copper. The micro-channels and the parts of the block that make contact with water are nickel-plated. The rest of the block (I'm assuming it cools the VRMs, chokes & memory) is aluminum with a coating to prevent galvanic corrosion.

I will watch my loop & see what happens.


----------



## GreedyMuffin

My EK block didn't have any bleeding problems, and neither did the 980 Ti and 980 G1 blocks I've had.

I'm pretty good at getting my loop bubble-free when filling.

Yeah, my 5960X is quite a nice CPU. :-D


----------



## pez

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah. I'm using the latest Afterburner. I mean it's not a huge deal as I don't keep my computer on 24/7. It's more annoying than anything. I mean it doesn't downclock in games unless I'm on a title screen or in a map menu. In gameplay it stays at a solid 2126 core. Maybe GPU Boost 3.0 is boosting my card during idle because it idles in the 20's C.
> 
> 
> 
> 
> 
> 
> 
> 
> I think you just bought a 1070.


If you're not on the newest, I'd suggest updating to it. One of the earlier betas of the current version was doing something similar. With the newest one, I'm down to 139/405 at the desktop and in non-gaming situations.


----------



## Synthetickiller

I haven't had a ton of time to play with this card & I have to play with the curve more, but...
2200MHz is solid. I will shoot for 2225+. Temps never broke the low 40s... maybe 43C max.









I'm really loving this card.


----------



## Bishop07764

Quote:


> Originally Posted by *AllGamer*
> 
> Hey guys, what is a good game for Benching that is both CPU and GPU intensive.
> 
> Like Witcher 3 for example, but I'm looking for a few more titles.


You might try Just Cause 3 if you have it. The water on a sunny day in that game will start to show red specks if you are on the ragged edge of your clocks.
Quote:


> Originally Posted by *Synthetickiller*
> 
> Got my Zotac ArcticStorm... Night & day difference comapred to the MSI Seahaw EK X.
> 
> Lower Temps
> Arguably better block (from a design standpoint)
> Higher clocks @ lower voltages
> Accommodates SLI bridges
> I have yet to even hit 1.093V & I'm already stable 2126mhz @ 1.062V. Seems like I got a winner.
> 
> That MSI card ran really hot. I have a feeling that the block was not properly seated. It's interesting that MSI included thermal paste. I recall someone else mentioning the inclusion of thermal paste. I don't remember if anyone got an answer to reseating the heatblock & warranty issues. Anyways, the card is got just was lack luster all around. Poor temps, poor chip & honestly, it's not as attractive as the Zotac.
> 
> As well, the EK block does a piss poor job of bleeding air. It just holds onto it for almost a day before most of the air is gone. I was able to bleed the loop & the Zotac card very, very quickly.
> 
> If anyone's on the fence, the card seems to live up to the hype!


Sounds like you may have won the lottery with this one then. Awesome. You must have really gotten a poor Sea Hawk; mine averages 37C at 2126 and the block bled the air pretty darn quick for me. Can you post your curve? I'm pretty sure I'm doing mine wrong. I can't get the voltage over 1.062.


----------



## boredgunner

Quote:


> Originally Posted by *AllGamer*
> 
> Hey guys, what is a good game for Benching that is both CPU and GPU intensive.
> 
> Like Witcher 3 for example, but I'm looking for a few more titles.


GTA V especially with mods, Star Wars: Battlefront, Metro Redux, Deus Ex: Mankind Divided, Alien: Isolation, Crysis 3, maybe Paragon.


----------



## Bishop07764

Derp. Well, I forgot to max the voltage slider when using the curve. Got Firestrike to run through most of its tests at 2.2GHz before it crashed. Darn. Max temps were 39C. Maybe this winter, then, when I can keep it in the low 30s.


----------



## LiquidHaus

It seems to me that these cards start to develop a few different characteristics when they get watercooled.

Gets me excited to watercool my FTW card.

@Watercool-Jakob anytime now


----------



## Koala Bear

Having purchased the Gigabyte GTX 1080 Xtreme Gaming, I was hoping to see a water block for this card. At this stage there's no hint of any in the pipeline; EK state on their website that they have no plans to make a block for it. It's very interesting that Gigabyte's factory water-cooled version of my card has hit 2,300MHz in testing. I can only hope that Watercool makes a Heatkiller IV block (my preferred option) or another company comes to the party. Having looked at some of the dimensions of these non-reference cards, it may not be profitable to make a block for each of them because every company's layout is so different. I suppose I can only hope that one company will eventually make a block.


----------



## LiquidHaus

Watercool is tentatively considering a Strix block; the FTW comes first. After that, I seriously doubt a Gigabyte card will be next. I don't think I've ever seen a waterblock for a Gigabyte AIB card, so I'm not sure why you thought you'd see one; just look at everyone's track record. I actually see more Strix blocks than any other. EK has really stepped it up in terms of making blocks for cards, but Gigabyte has never been on par with EVGA or Strix cards, even marketing-wise, so it just doesn't make sense for a block maker.

Your best bet is to snag an AIO or a universal block and pair it with a 120mm fan and some adhesive-pad passive heatsinks.


----------



## Fediuld

Quote:


> Originally Posted by *Synthetickiller*
> 
> I haven't had a ton of time to play with this card & I have to play with the curve more, but....
> 2200mhz is solid. I will shoot for 2225+. Temps never broke lower 40s... maybe 43C max.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm really loving this card.


It doesn't matter what speed it states, only the performance. Mine goes to 2190 and 2200 @ 1.093v, but the perf is the same as at 2177 @ 1.081.
Have you used CLU to short the card?


----------



## Jim86

Quote:


> Originally Posted by *Juub*
> 
> Did I make a mistake purchasing the Asus Founder's Edition? I was gunning for the Zotac Amp! Extreme but couldn't find it. These cards can't overclock for a lick. I simply increased the clock by 200Mhz and they lock up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got them in SLI.


I have an Asus 1080 FE and mine is solid at +200/+500, but I also have an EK waterblock.


----------



## MiamiMuscleBoy

Guys, I have a Zotac 1080 AMP Extreme, and from the reviews it seems either the drivers stop responding or it's clocked almost to its max already. I can barely add +50 to the core, which gets it sitting at around 2101. Does that seem like the correct clock speed? Shooting for anything over that results in screen locks.


----------



## Jim86

That seems normal for a custom card; some people are reporting stability issues under 2GHz, so I would be happy if I were you.


----------



## Spieler4

When gaming in SLI, the voltage is not the same on both cards. Should I fix this somehow to increase stability, or is it OK?


----------



## pantsoftime

Quote:


> Originally Posted by *Spieler4*
> 
> When gaming in SLI. Voltage is not the same on both cards. Should I fix this somehow to increase stability or is it ok ?


That's normal. GPU Boost is running the cards at their lowest voltage to achieve a given frequency. It means that one of your chips is better than the other (and would likely overclock better on its own). You shouldn't have to make any adjustments.


----------



## Spieler4

Quote:


> Originally Posted by *pantsoftime*
> 
> That's normal. GPU Boost is running the cards at their lowest voltage to achieve a given frequency. It means that one of your chips is better than the other (and would likely overclock better on its own). You shouldn't have to make any adjustments.


OK, thanks. I was thinking it might be possible to OC a little more if the voltage stayed the same on both cards.


----------



## awesomemcrad

Hey, anybody know if it's possible to fit an EVGA Hybrid kit onto an ASUS GTX 1080 Strix?


----------



## tps3443

Quote:


> Originally Posted by *lifeisshort117*
> 
> It seems to me that these cards start to develop a few different characteristics when they get watercooled.
> 
> Gets me excited to watercool my FTW card.
> 
> @Watercool-Jakob anytime now


You ain't lying about that! I wish I could watercool mine.


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> You might try Just Cause 3 if you have it. The water on a sunny day in that game will start to show red specs if you are on the ragged edge of your clocks.
> Sounds like the like you may have won the lottery with this one then. Awesome. You must have really gotten a poor Seahawk. Mine averages 37c at 2126 and the block bled the air pretty darn quick for me. Can you post your curve? I'm pretty sure that I'm doing mine wrong. I can't get the voltage over 1.062.


I use MSI Afterburner. I haven't bothered with the Zotac software, other than for aesthetics.
I basically dragged the curve using Ctrl at the 1.093V point to keep that as the highest speed, then micro-adjusted each point. Otherwise, I didn't do anything overly complicated.
The stock boost speed on this card was 1949MHz... which was 30+ less than what the MSI did. I guess initial boost clocks mean nothing? I was really disappointed with the card when I saw the stock boost & then started pushing it, lol. Talk about a Christmas present! I've been dealing with a string of bad luck, so I feel extremely fortunate to get this chip.



PM me if you want more details.

Quote:


> Originally Posted by *Fediuld*
> 
> It doesn't matter what the speed it states, but the performance. Mine goes 2190 and 2200 @1.093v but the perf is the same as 2177 @ 1.081.
> Have you used CLU to short the card?


My performance in Heaven scales almost linearly with MHZ increase. I am not seeing a "no gain" in performance when moving 23mhz up or down. I'm getting points that seem to scale a little. It's not perfectly linear by any means, but my score cilmbs a bit. When I was OCing w/ the MSI, higher clocks didn't help. Even though I could run Heaven & pass the benchmark, my scores dropped. I'm not seeing that here.

All scores are at a +500MHz memory offset, i.e. 5505MHz on memory. It seems stable, so I left it there. I don't really know what "ideal" memory clocks are for the card. I haven't played around with it that much, unfortunately.
*Heaven 1440p Extreme scores @ GPU clock*

1857 @ 2189MHz
1864 @ 2202MHz
1865 @ 2214MHz
1875 @ 2227MHz






I haven't messed with tweaking memory speeds or anything. This was just a series of runs to show that I am seeing increased performance.
What I find interesting is that the 12MHz jump from 2202 to 2214 shows arguably no increase in performance, even though 2227 is significant. I wonder if there are "dead zones" where even "significant" MHz increases don't yield benefits, but pushing past those zones shows improvement because the stability is there.
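For the curious, the deltas work out like this (a quick Python scratchpad; the numbers are just the four scores I posted above):

```python
# Heaven 1440p Extreme graphics scores keyed by GPU clock (MHz), from above
scores = {2189: 1857, 2202: 1864, 2214: 1865, 2227: 1875}

clocks = sorted(scores)
for lo, hi in zip(clocks, clocks[1:]):
    gain = scores[hi] - scores[lo]
    step = hi - lo
    print(f"{lo} -> {hi} MHz: +{gain} pts ({gain / step:.2f} pts/MHz)")
```

The near-flat middle step (2202 to 2214) is the "dead zone" I'm talking about; the other two steps gain several points each.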

As for the question about CLU, no, I haven't removed the block or the backplate. As much as it's tempting, my temps are so low I see no point. On desktop I idle at 27/28°C, low 30s in game, creeping up to high 30s. I hit low to mid 40s when the room heats up to 24ish Celsius.


----------



## Spiriva

Quote:


> Originally Posted by *Spieler4*
> 
> When gaming in SLI. Voltage is not the same on both cards. Should I fix this somehow to increase stability or is it ok ?


It doesn't matter, it's the same for me. Different cards just need different voltages. I've got two EVGA FE 1080s that both clock to 2200MHz, but one does it at 1.030v and the other at 1.070v.


----------



## Dry Bonez

I can't wait to get my EVGA SC 1080 today! Well, if it is a 1080 and not a 1070, since I won a bid on eBay for $430. I will post back when the mailman arrives. I'm coming from a GTX 580; I can't wait to be amazed.


----------



## pantsoftime

Quote:


> Originally Posted by *Synthetickiller*
> 
> All scores are at a 500mhz boost clock or 5505mhz on memory. It seems stable, so I left it there. I don't really know what "ideal" memory clocks are for the card. I haven't played around with it that much, unfortunately


One decent way of nailing down a good memory overclock is to pause the Heaven benchmark so that it's pointing at a relatively static image (i.e. no moving objects in the background). You'll find that your FPS counter stays relatively still. I like to find an area of the map where the FPS is sort of middle ground (so CPU limitations aren't possible). At that point, try raising your memory clocks and find the frequency where the FPS is at its highest.

You'll find that the memory speeds have peaks and valleys as you do this. It's a bit interesting to watch, but don't stop raising the clock just because the FPS went down. Push a little further and make sure that you haven't maxed it out (i.e. ECC starts kicking in). Using this method I found my best memory clock was +575. I also found that +490 outperformed +500 along the way. You can usually double-check your results with Firestrike or the Heavensward bench to ensure the gains persist across applications.
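If anyone wants the gist of that sweep as code, here's a rough sketch in Python. `read_fps` is a hypothetical stand-in, not a real API: in practice it means "apply the offset in Afterburner, then read the FPS counter on the paused Heaven scene."

```python
def find_best_mem_offset(read_fps, start=0, stop=700, step=25):
    """Sweep memory offsets and return (best_offset, all_readings).

    read_fps(offset) is a placeholder for: apply the memory offset,
    then read the FPS counter on a paused, static Heaven scene.
    Key point: don't stop at the first FPS dip - there are peaks and
    valleys, and a drop can mean error correction kicking in OR just
    a local valley, so sweep the whole range before picking a winner.
    """
    readings = {}
    for offset in range(start, stop + 1, step):
        readings[offset] = read_fps(offset)
    best = max(readings, key=readings.get)
    return best, readings
```

With a fake `read_fps` that peaks at +575 this returns 575; on real hardware you'd then confirm the winner in Firestrike like described above.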


----------



## Martin778

I've redone the TIM on my 1080 SC ACX today with Thermal Grizzly Kryonaut, and combined with a custom fan curve the temps dropped by 5°C. Now the card boosts to 2000-2012MHz with no additional overclock.
The original TIM was some dried-out grey turd.


----------



## Fediuld

Quote:


> Originally Posted by *Synthetickiller*
> 
> I haven't messed with tweaking memory speeds or anything. This was just a series of runs to show that I am seeing increased performance.
> What I find interesting is that the 12mhz jump from 2202 to 2214 shows arguably no increase in performance, even though 2227 is significant. I wonder if there's "dead zones" where even "significant" mhz increases don't yield benefits, but pushing past those zones shows improvement because the stability is there.
> 
> As for the question about CLU, no, I haven't removed the block or the backplate. As much as it's tempting, my temps are so low I see no point. On desktop I idle at 27/28°C, low 30s in game, creeping up to high 30s. I hit low to mid 40s when the room heats up to 24ish Celsius.


Try running the Time Spy benchmark. It's a DX12 bench and the GPU score is independent of the CPU score; only the overall score is affected by the CPU.
Heaven isn't the best method for measuring raw GPU grunt these days. If you hit an 8500+ GPU score in 3DMark Time Spy then yes, your card is working in the 2190+ range.


----------



## Dry Bonez

Quote:


> Originally Posted by *Martin778*
> 
> I've re done the TIM on my 1080 SC ACX today, used Thermal Grizzly Kryonaut and the temps dropped by 5*C combined with a custom fan curve. Now the card boosts to 2000-2012MHz with no additional overclock.
> The original TIM was some dried out grey turd.


Hey, I should get my EVGA SC 1080 today. Are they good?


----------



## fat4l

Guys...its been months...Still no Bios Tool?


----------



## Benjiw

Quote:


> Originally Posted by *fat4l*
> 
> Guys...its been months...Still no Bios Tool?


I was going to buy a 1080, but some cheap 980 Tis came up. At first I thought, why buy these? Then I remembered there's still no BIOS tool for the 1080, and it'll probably only be out by the time the 1080 Ti launches, so I pulled the trigger.


----------



## Martin778

No BIOS tool yet, afaik.

The SC ACX has the same PCB as the FE but with a factory OC and the ACX 3.0 cooler slapped on it. I'm pretty happy with mine; stock it boosts to 2000-2012MHz, but Pascal is very temperature/power-limit dependent.








The first thing you should do is install EVGA Precision or MSI Afterburner and change the fan curve.
The stock curve is pretty broken: it lets the card heat up to 70°C+, which is totally unnecessary and decreases the core clock by 50-70MHz. The ACX 3.0 is quiet enough to disable the passive mode and let it spin faster.
Best OC I can get out of mine is 2050MHz core and 5400MHz on VRAM.
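For anyone curious what "change the fan curve" actually amounts to: Precision/Afterburner just interpolate between the (temperature, fan %) points you drag around. A toy version, where the points are made-up example values, not my actual curve:

```python
# Example (temp °C, fan %) curve points - illustrative values only,
# not a recommendation for any specific card
CURVE = [(30, 30), (50, 55), (60, 75), (70, 100)]

def fan_percent(temp, curve=CURVE):
    """Linearly interpolate fan % between curve points, clamped at the ends."""
    if temp <= curve[0][0]:
        return float(curve[0][1])
    if temp >= curve[-1][0]:
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp <= t1:
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)

print(fan_percent(55))  # 65.0 - halfway between the 50°C and 60°C points
```

Dragging the points left (lower temps) or up (higher %) is all "aggressive curve" means: the fan ramps sooner, so boost doesn't get clipped by heat.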


----------



## Jim86

Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey,i should get my EVGA SC 1080 today,are they good?


What kind of dumbass question is that? Is a 1080 good?!?!


----------



## Martin778

I think he is not referring to the model but rather the manufacturer


----------



## Synthetickiller

Quote:


> Originally Posted by *pantsoftime*
> 
> One decent way of nailing down a good memory overclock is to pause the Heaven benchmark so that it's pointing at a relatively static image (i.e. no moving objects in the background). You'll find that your FPS counter stays relatively still. I like to find an area of the map where the FPS is sort of middle ground (i.e. not possible to have CPU limitations). At that point try raising your memory clocks and finding a frequency where the FPS is at its highest.
> 
> You'll find that the memory speeds have peaks and valleys as you do this. It's a bit interesting to watch, but don't stop raising the clock just because the FPS went down. Push a little further and make sure that you haven't maxed it out (i.e. ECC starts kicking in). Using this method I found my best memory clock was +575. I also found that +490 outperformed +500 along the way. You can usually double check your results with firestrike or heavensward bench to ensure the gains are persistent across applications.


Thanks for that bit of info.
When I have the time, I will play around with it.

Quote:


> Originally Posted by *Fediuld*
> 
> Try running the Time Spy benchmark. It's a DX12 bench and the GPU score is independent of the CPU score; only the overall score is affected by the CPU.
> Heaven isn't the best method for measuring raw GPU grunt these days. If you hit an 8500+ GPU score in 3DMark Time Spy then yes, your card is working in the 2190+ range.


I'm not getting 8500, period. Maybe it's because I'm running 3 monitors: 1 running Time Spy while the other 2 sit in 2D showing Afterburner, HWMonitor, etc.
Anyway, my scores do scale with MHz...

*Scores @ gpu mhz*

7507 @ 1936MHz (what the card boosted to when reset to stock)
8191 @ 2100MHz
8337 @ 2164MHz
8460 @ 2214MHz

I'd have to graph it, but the score increases look linear vs the GPU MHz increases.
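Or skip the graph and just fit a line. Quick least-squares over those four runs (pure stdlib scratchpad, numbers copied from the list above):

```python
# Time Spy graphics score vs GPU clock (MHz), from the four runs above
data = [(1936, 7507), (2100, 8191), (2164, 8337), (2214, 8460)]

# ordinary least-squares slope: points gained per extra MHz
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(f"roughly {slope:.1f} points per extra MHz")
```

It comes out around 3.5 points per MHz across the whole range, which is why the scaling looks close to linear.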


----------



## Bishop07764

Quote:


> Originally Posted by *Synthetickiller*
> 
> I use MSI afterburner. I haven't bothered with the Zotac software, other than for aesthetics.
> I basically dragged the curve using CTRL at the 1.093 to keep that the highest speed. I would micro adjust each point. Otherwise, I didn't do anything overly complicated.
> The stock boost speed on this card was 1949mhz... which was 30+ less than what the MSI did. I guess initial boost clocks mean nothing? I was really disappointed w/ the card when I saw the stock boost & then started pushing it, lol. Talk about a Christmas present! I've been dealing with a string of bad luck, so I feel extremely fortunate to get this chip.
> 
> 
> 
> PM me if you want more details.


Thanks. Yours sounds like a stellar chip. I had been forgetting to max my voltage slider. It appears that mine will be south of 2.2GHz though, based on a quick test last night. Good to know about the CTRL trick to drag the whole curve.
Quote:


> Originally Posted by *Dry Bonez*
> 
> I cant wait to get my EVGA SC 1080 today! Well, if it is a 1080 and not a 1070,since i won a bid on ebay for $430. I will post back when mailman arrives. I will be coming from a GTX 580,i cant wait to be amazed.


I came from a gtx 780 and noticed a huge difference. It will be massive for you. Enjoy.


----------



## FattysGoneWild

Quote:


> Originally Posted by *Jim86*
> 
> What kind of dumbass question is that is a 1080 good?!?!


Because a Titan X is better dumbass.


----------



## Dry Bonez

Quote:


> Originally Posted by *Jim86*
> 
> What kind of dumbass question is that is a 1080 good?!?!


I was referring to the non-reference brands.

Quote:


> Originally Posted by *Martin778*
> 
> I think he is not referring to the model but rather the manufacturer


Correct. Yay, I've got my card installed right now and have yet to play games on it. I think the first game I'm gonna try is Batman: Arkham Knight!

Quote:


> Originally Posted by *Bishop07764*
> 
> Thanks. Yours sounds like a stellar chip. I had been forgetting to max my voltage slider. It appears that mine will be south of 2.2 GHz though based off a quick test last PM. Good to know about the trick with control to drag the whole curve.
> I came from a gtx 780 and noticed a huge difference. It will be massive for you. Enjoy.


Yay, I am so glad it came in as expected, brand new. I thought I was going to get ripped off since I got it for $430 on an eBay bid and I was the ONLY bidder while all the other cards had bids. Oh well, it came in and it is indeed, as advertised, a GTX 1080 SC by EVGA.

So, how can I know if I have a good card or not?


----------



## Synthetickiller

Ran Time Spy on a single monitor at 2227MHz. Got 8512 as a graphics score. Looks like I'm good to go.

Time Spy & 3DMark seem to stress the GPU less. I might see if it's possible to hit 2250. Edit: can't even get 2240 to be stable, lol. Oh well, dreaming is fun sometimes.


----------



## tps3443

Why is the FPS counter in Heaven BS? It averages in the 200s maxed out with 8xAA at 1080p and never goes below 100, yet when the scene changes and starts over it says the minimum FPS is 39??

This is why 3DMark starts a new test every time the scene changes: technically the min FPS is incorrect.

It affects the overall score because the minimum FPS is false.

If you want to test this, go run Unigine Heaven at 1080p and see what your minimum is. It is totally BS.


----------



## Dry Bonez

Why does my EVGA 1080 Superclocked get so hot? The temps in MSI AB read mid 60s playing Batman AK at 4K, but the GPU itself is super hot.


----------



## tps3443

Quote:


> Originally Posted by *Dry Bonez*
> 
> why does my EVGA 1080 Superclocked get so hot? the temps on MSI AB read mid 60s playing Batman AK at 4k but the gpu itself is super hot


60°C is 140°F; that will burn the hell out of you, yes, but it's nice and cool for a GPU. Mine runs hotter than that, lol. 60°C load during gaming at 4K on an air-cooled GPU is very good! Mine runs around 78-83°C; it's an FE overclocked pretty well, lol.

The backplates get warm; they hold a lot of heat. It's not a bad idea to point a silent case fan at it to blow the air off the back of the GPU. This really helps push some air over the card's 5-phase VRM power delivery near the back; those probably get the hottest.

It goes from burning to the touch without a case fan to feeling barely warm, or room temp, with a fan blowing on or over it. Also, I believe around 108°F is hot enough to burn to the touch, and 60°C is a lot hotter than 108°F.

You're fine though, 60°C is nice and cool! At about 140°F.
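The conversions above check out, for what it's worth - °C to °F is just:

```python
def c_to_f(c):
    """Celsius to Fahrenheit: multiply by 9/5 and add 32."""
    return c * 9 / 5 + 32

print(c_to_f(60))  # 140.0 - painful to touch, but cool for a loaded GPU
print(c_to_f(43))  # 109.4 - right around the ~108°F "burns to the touch" figure above
```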


----------



## tps3443

Quote:


> Originally Posted by *Synthetickiller*
> 
> Ran timespy on a single monitor at 2227mhz. Got 8512 as a graphics score. Looks like I'm good to go.
> 
> Timespy & 3DMark seems to stress the GPU less. I might see if it's possible to hit 2250. Edit: Can't even get 2240 to be stable, lol. Oh well, dreaming is fun sometimes.


Can you tell me what BIOS you're running? I'm trying to get my FE to do a little better; I'm about to do the shunt mod. Any recommendations? I can get a steady 2088-2126MHz in 4K. My temps are under 80°C. I think you've got the best overclocking 1080 on here. I'm hitting a steady 25,100+ in Firestrike, so I'm not sure higher clocks are even scaling.


----------



## tps3443

Not bad for a FE! And a 6600K!


----------



## Synthetickiller

Quote:


> Originally Posted by *tps3443*
> 
> Can you tell me what bios your running? I'm trying to get my FE to do a little better, I'm about to do the shunt mod. But, and recommendations? I can get about a steady 2,088Mhz-2,126mhz in 4K. My temps are under 80C. I think you've got the best overclocking 1080 on here. I'm hitting 25,100 steady in firestrike so, I'm not sure higher clocks are even scaling.


Whatever Bios comes w/ the Arcticstorm. I'm 100% stock. Take into account that I have zero thermal throttling because I'm on water. I never go above 43-44°C.
I'm only on 1440p, sadly.
Take a look at my score. It's a hair higher at almost 25,800. I have no clue how it scales as I haven't really tried lowering the overclock.

Quote:


> Originally Posted by *tps3443*
> 
> Not bad for a FE! And a 6600K!


Dude, that's a nice score. ON AIR? Wow!

Here's mine (I'll gladly take a pic of the screen with my phone, I'm just lazy).
I think this is with 2214mhz on the core, +500 for memory.



Can anyone explain why the card is invalid? Here's my GPU-Z screenshot...


----------



## Bishop07764

Quote:


> Originally Posted by *Synthetickiller*
> 
> Whatever Bios comes w/ the Arcticstorm. I'm 100% stock. Take into account that I have zero thermal throttling because I'm on water. I never go above 43-44°C.
> I'm only on 1440p, sadly.
> Take a look at my score. It's a hair higher at almost 25,800. I have no clue how it scales as I haven't really tried lowering the overclock.


Dang man, you got a winner there. I have experimented briefly with the curve. Firestrike will pass at 2190 core, but my scores are definitely lower than regular overclocking at a 2164 core. I mean, my graphics score is almost 600 points higher with my clocks at 2164.

http://www.3dmark.com/3dm/15058612


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> Dang man. You got a winner there. I have experimented very shortly with the curve. Firestrike will pass at 2190 core but my scores are definitely lower than regular overclocking with a core of 2164. I mean my graphics score is almost 600 points higher with my clocks at 2164.
> 
> http://www.3dmark.com/3dm/15058612


I'm going to play around a little with both the curve & the slider more than I have. Memory can come later.
If I'm busy enough to stay off the PC tomorrow, I might do runs from 2000MHz through 2225MHz in 25MHz increments (or close to that) and see how linearly it scales and at what point the performance drops off.


----------



## k1bih

Is there any way to water cool the strix 1080?


----------



## LiquidHaus

Quote:


> Originally Posted by *k1bih*
> 
> Is there any way to water cool the strix 1080?


yup, two blocks out currently, with Watercool making theirs before the end of the year.

EK
https://www.ekwb.com/news/ek-releases-asus-gtx-1080-strix-full-cover-water-block/

and

Bitspower
http://www.performance-pcs.com/bitspower-nvidia-gtx-1080-rog-strix-acrylic-clear-water-block-for-asus-rog-strix-gtx1080.html


----------



## ucode

Quote:


> Originally Posted by *Bishop07764*
> 
> Firestrike will pass at 2190 core but my scores are definitely lower than regular overclocking with a core of 2164. I mean my graphics score is almost 600 points higher with my clocks at 2164.
> 
> http://www.3dmark.com/3dm/15058612


600 points higher graphics score for 2190 sounds about right.

http://www.3dmark.com/fs/10087349


----------



## GreedyMuffin

I can't get my graphic score over 24000...

Not even with 2176/500+ on mem. :/


----------



## Bogga

Got my cards under water and could raise the clocks some more. Before, while on air, I started at ~2025 but dropped below 2000 after a couple of minutes. Mem was at 5300...

Now I'm at 2101/5400 with no throttling whatsoever. I passed many hours of Heaven at 2114/5550 and hours of GTA V, but those clocks crashed Battlefield, and the gain is so minimal that dropping down is no issue for me. Temps vary but are mostly between 38-40 degrees in game (after hours of Heaven the max was 45).

I know these aren't the best clocks around, but they work for me.


----------



## Menthol

Quote:


> Originally Posted by *Synthetickiller*
> 
> Can anyone explain why the card is invalid? Here's my GPU-Z screenshot...


New drivers - it usually takes several days for Futuremark to approve new drivers.


----------



## Nexosu

I wanted to know if I could get some advice.

I have a Corsair 800D case in which I used to watercool my CPU and GPUs back when I had a pair of 480s. I still use the case, but now I only watercool the CPU. I currently have 2 FE GTX 1080s on air because I got sick of waiting in the step-up queue for a non-reference card. Well, my queue has finally come up for an EVGA 1080 SC and I'm torn on whether to just sell it, or use it as my top card for better temps and sell one of the FEs.

Would the better cooling on the EVGA SC be enough to offset the downside of dumping heat inside the case? I may go back to watercooling eventually as the cards age and I need to get more out of them.


----------



## Synthetickiller

Quote:


> Originally Posted by *Menthol*
> 
> New drivers, usually takes several days for Futuremark to qualify new drivers


I moved back to 372.70 (was running 372.90). I still have the same issue...
I want to tweak & get this card on HWBot for points, but that error is ticking me off, lol.


----------



## juniordnz

Quote:


> Originally Posted by *Spieler4*
> 
> When gaming in SLI. Voltage is not the same on both cards. Should I fix this somehow to increase stability or is it ok ?


which osd is that? me likes


----------



## Synthetickiller

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can't get my graphic score over 24000...
> 
> Not even with 2176/500+ on mem. :/


I ran mine at 2176mhz & 500 on memory. I get 25,730. Hope that gives you a baseline.
Have you tried runs at 2150 & 2125, just to see how the score changes & if it decreases linearly or not?


----------



## Bishop07764

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can't get my graphic score over 24000...
> 
> Not even with 2176/500+ on mem. :/


Have you tried backing off on your memory clock? My score took a serious dive when I had my memory clocked too high. My score for core at 2190 was worse than 2164 too. There seems to be a real point of diminishing returns with maxing the core.


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> Have you tried backing off on your memory clock? My score took a serious dive when I had my memory clocked too high. My score for core at 2190 was worse than 2164 too. There seems to be a real point of diminishing returns with maxing the core.


This is really true.
I tried running Firestrike over and over with the following memory offsets: +475, +500, +550, +600, keeping the GPU core at 2176MHz to match the speed he was aiming for.
+600 scored lower than +475. I still have to dial my memory in, but +500 is almost the sweet spot. I saw at least a 300-point drop, if not more, when OCing too high. Memory overclocking is not as important as GPU core clock, but dialing in the memory speed can give a significant boost.


----------



## davepk

Synthetickiller, Any chance you could upload your arcticstorm BIOS using GPUZ please?

I might give it a try in my Amp Extreme.


----------



## Benjiw

Quote:


> Originally Posted by *tps3443*
> 
> Why is the fps counter in heaven BS? Well it averages in the 200's maxed out 8XAA at 1080p. It never goes below 100? But, yet when the scene changes and starts over it says minimum fps is 39 fps??
> 
> This is why 3dmark starts a new test everytime the scene changes. Technically the min fps is incorrect.
> 
> It affects the overall score because the minimum fps is false.
> 
> If you want to test this, go run unigine heaven at 1080 P. And see what your minimum is. It is totally BS.


Because that limitation is down to your CPU clock speeds, RAM, etc., not your card. It isn't false, it's correct.


----------



## Benjiw

Quote:


> Originally Posted by *lifeisshort117*
> 
> It seems to me that these cards start to develop a few different characteristics when they get watercooled.
> 
> Gets me excited to watercool my FTW card.
> 
> @Watercool-Jakob anytime now


Been saying this since the 9xx cards, the cooler they are the more stable your clocks because GPU boost isn't upping and downing all over the place.


----------



## Spieler4

Quote:


> Originally Posted by *juniordnz*
> 
> which osd is that? me likes


MSI Afterburner + RivaTuner Statistics Server software


----------



## Spiriva

Quote:


> Originally Posted by *juniordnz*
> 
> which osd is that? me likes


I think its the one that comes with EVGA Precision XOC

http://www.evga.com/precisionxoc/


----------



## tps3443

Just did the SHUNT MOD. Here are the average results.

Before: 1900MHz core, 0.900 vcore, 117% TDP power.

After: 2066MHz core, 1.030 vcore, 95% TDP power.

So even with higher vcore and higher clocks, the reported TDP is less after the shunt mod.

It is EASY.

It's pretty drastic! I'm getting a much more stable clock without the core throttling down!

My card is an FE, so my temps are pushing 80°C at 100% fan, but with much better overclocks!

Bear in mind I was pulling a consistent 25,100+ in Firestrike graphics with my FE and a 6600K before doing the shunt mod.

I'm still testing stability at higher clocks, to really see what kind of improvement there is. But TDP is wayyyy lower!
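For anyone wondering why the reported TDP% goes *down* after the mod: the card senses current as a voltage drop across tiny shunt resistors, and the mod lowers their effective resistance, so the card under-reads its own draw. A back-of-envelope sketch - the 0.5 resistance ratio here is purely an assumed, illustrative value, not a measurement:

```python
def actual_tdp_percent(reported_pct, r_ratio):
    """Estimate real power draw (as % of stock limit) after a shunt mod.

    The card reports power proportional to the sensed shunt voltage,
    so reported = actual * (R_modded / R_stock). r_ratio is that
    resistance ratio - an assumption for illustration, since the real
    value depends on how much the liquid metal bridges the shunt.
    """
    return reported_pct / r_ratio

# If the mod roughly halved the sense resistance (assumption!),
# a reported 95% TDP would correspond to about 190% of what the
# stock sensing would have read.
print(actual_tdp_percent(95, 0.5))
```

Point being: the card isn't pulling less power after the mod, it just *thinks* it is, which is exactly why it stops throttling.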


----------



## tps3443

What is the absolute best thermal paste to use on the GPU? Any suggestions?

I would use liquid metal, but it corrodes aluminum. I just reapplied standard thermal grease on reassembly.

Power and clocks are better! But temps are worse, or roughly the same.

I need a recommendation for the absolute best thermal paste there is! Lol


----------



## Koniakki

Quote:


> Originally Posted by *tps3443*
> 
> What is the absolute best thermal gel to use on the GPU? Any suggestions?
> 
> I would use the liquid metal but, it will corrode aluminum. I just reapplied standard thermal grease on reassembly.
> 
> Power, and clocks are better! But, temps are worse, or the same roughly.
> 
> I need a recommendation of the absolute best thermal there is! Lol


I personally recommend Thermal Grizzly Kryonaut.

My most recent use was on a Titan X (Maxwell) reference cooler and it dropped temps 3°C (69 vs 72 max).

My 1080 FE eagerly awaits its turn.


----------



## LiquidHaus

Yeah, I fully plan to get some Kryonaut and also their thermal pads to install on my FTW when the Watercool block comes out.

I want every little thing I can get to help with temps, lol. Even when the waterblock alone would be enough, it's never enough!


----------



## Synthetickiller

Quote:


> Originally Posted by *davepk*
> 
> Synthetickiller, Any chance you could upload your arcticstorm BIOS using GPUZ please?
> 
> I might give it a try in my Amp Extreme.


I just uploaded it to TechPowerUp. I don't have an account for a file-sharing site, and OCN doesn't seem to like me trying to upload any ".rom" files, lol.
I'll upload to one if you give me a link.


----------



## Vellinious

Got the block and backplate installed on the first 1080 FTW today. I've got another pump, pump top, fittings and coolant coming in on Tuesday. Should be ready to get the loop drained, flushed and rebuilt next weekend.


----------



## davepk

Thanks for doing that.

I'll give it a try in my extreme when it becomes available to DL from techpowerup.


----------



## Bishop07764

I just started using Kryonaut as well. Great results so far. Thinking about using it for my Seahawk. I'm probably too lazy to do it though, as my temps are already pretty good. Not sure it's worth it for me even if I get a 2-3°C improvement.


----------



## tps3443

My FE is already a great overclocking card compared to others! I felt like a fool reassembling the card after doing the shunt mod and using this off-brand white thermal grease on my GPU die, lol.

Are any of you guys running the shunt mod?

How were your results?

Once I have a spare $400 lying around I will build a custom loop! The GTX 1080 looks amazing wearing those EK blocks


----------



## lanofsong

Hey GTX1080 owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), enter your passkey, and enter the Team OCN number - 37726

later
lanofsong


----------



## Bishop07764

Quote:


> Originally Posted by *tps3443*
> 
> My FE, is already a great overclocking card compared to others! I felt like a fool reassembling the card, after doing the shunt mod and using this off brand white thermal grease on my GPU die lol.
> 
> Any of you guys running the shunt mod?
> 
> How was your results?
> 
> Once I have a spare $400 laying around I will build a custom loop! The GTX1080 looks amazing wearing those ek blocks


Word of warning if you ever do go the water cooling route: it can be quite addicting, expensive, etc. I just fell off the wagon again and watercooled my wife's HTPC. I couldn't stand the temps I was getting for that small little box. Results are awesome though. Dropped the CPU load temps by about 40c and GPU temps by 30c when my 6-year-old and I game on it. It all started a few years ago with a Corsair AIO CPU cooler. Then I just had to have more.


----------



## Synthetickiller

Quote:


> Originally Posted by *Bishop07764*
> 
> Word of warning if you ever do go the water cooling method, it can be quite addicting, expensive, etc. I just fell of the wagon again and water-cooled my wife's htpc. I couldn't stand the temps I was getting for that small little box. Results are awesome though. Dropped the CPU load temps by about 40c and GPU temps by 30c when my 6 year old and I game on it. It all started a few years ago with a Corsair AIO CPU cooler. Then I just had to have more.


When I have some funds, I plan on watercooling my PFsense router that's based on a J1900 SoC. Overkill? Yes. Fun? OH HELL YEAH!


----------



## ucode

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can't get my graphic score over 24000...
> 
> Not even with 2176/500+ on mem. :/


Clocks can be awkward to read. One driver might only be stable at lower clocks, yet produce the same scores as another driver running 100MHz higher.

I'm not even sure how accurate the readings are. I ran some FS tests at clocks reported as 12.5MHz, 25MHz, and 100MHz in 3DMark, GPU-Z and HWiNFO, and they all scored about the same - with the run reported at 12.5MHz (mathematically 8x slower than 100MHz) taking first place. WTH lol.

http://www.3dmark.com/compare/fs/10274004/fs/10274216/fs/10274433

So moral of the story, don't go chasing clocks, chase performance.


----------



## Juub

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can't get my graphic score over 24000...
> 
> Not even with 2176/500+ on mem. :/


Mine can barely go past 23K. I've got two of them at least, but the performance in SLI is kind of disappointing for a set of cards costing over $1,000. They still struggle mightily at 4K in the latest games. I've heard an HB bridge or two standard bridges deliver better performance in many games, though.


----------



## LiquidHaus

Hey, anyone on here have a Strix 1080 that can give me some insight?

I'm currently discussing this in the 1060 owners thread...

But my girlfriend's 1060 Strix does NOT throttle boost when hitting the heat limits we see when testing our air-cooled 1080s.

Like how all of ours start throttling boost at 55°C and then again at 60°C.

Her 1060 Strix doesn't fluctuate at ALL.

Another user on here said that might be because of the BIOS? Interesting theory. Wondering if any of you guys have tried a Strix BIOS on your card, or have an actual Strix, and can share some insight/experience.


----------



## KickAssCop

I have 2 Strix 1080s. They throttle. The BIOS is a 1080 OC BIOS I got from TechPowerUp. Even with my original AG BIOS, they would throttle.
Don't know why you're asking about the 1060 in a 1080 thread, since the 1060 may have a different boost algorithm with different temperature limits (maybe)? I don't have a clue about the 1060 as a card, lol.


----------



## LiquidHaus

Figured it was worth a try asking for other Pascal Strix experience.


----------



## wardo3640

That would be nice, if you could drop a 1060 BIOS on a 1080 and rid yourself of throttling altogether.


----------



## MiamiMuscleBoy

I have tried Heaven 4.0 at 1440p but the screen just goes black... it's a 4K monitor. Any thoughts? Also, I did manage 5781 on Firestrike 4K Ultra, which is what this card says it can do; reference cards are in the 4900s to 5100 etc., so I guess it's not that bad. It's not like I'm gonna notice those 600 extra points. I'm just aggravated that when we alt-tab out, the NVIDIA drivers drop it down to 30 FPS until we toggle v-sync off and on; even fast v-sync has the same problem.


----------



## Shultzy

Just received my new GTX 1080 FTW from EVGA, and I gotta say the RGB LEDs look pretty sweet. I haven't had much time to play with it yet, but it boosts to 2025 straight out of the box. I already have the FTW block from EK in my possession, but I think I'll do some testing on air first. It is a good-looking card and I will miss the way it looks all lit up in my case.


----------



## Bogga

Quote:


> Originally Posted by *tps3443*
> 
> What is the absolute best thermal gel to use on the GPU? Any suggestions?
> 
> I would use the liquid metal but, it will corrode aluminum. I just reapplied standard thermal grease on reassembly.
> 
> Power, and clocks are better! But, temps are worse, or the same roughly.
> 
> I need a recommendation of the absolute best thermal there is! Lol


I use the Kryonaut as well... since I didn't try any other brand I can't tell if there's any difference, but I'll just believe all the tests out there









The biggest difference I noticed was that it was much harder to spread, and the included stuff wasn't that good. I've always done the spread method - covering the die/IHS - and not the pea method. So I just put some cling film on my finger and spread it that way.


----------



## Fediuld

Another vote for Kryonaut.

I had watercooled the 1080 using EK Ectotherm (similar to Arctic Silver 5).
Replaced it with Kryonaut and dropped idle temps by 13C (to 22C) and full-load temps at 2177 core by 10C (max 35C, down from 45C).


----------



## Dry Bonez

Can someone explain this to me? Some of you may know I bought this EVGA GTX 1080 SC for $430. Anyway, I noticed two things with my card.

1: It gets very hot! Allow me to elaborate on some testing I did. Every single time I play at 4K, let's say BO3 or Batman AK, the temps in MSI AB read about 63C, which is not bad. OK, so I then played the same exact game at 1440p, and even at MAX settings the card reads 42-45C and is cool as hell to the touch. Wth???

2: My card sucks at OC. If I set the core clock beyond +63 I start getting artifacts; the highest I measured with GPU-Z was 2000MHz.


----------



## juniordnz

Quote:


> Originally Posted by *Dry Bonez*
> 
> Can someone explain this to me, alright, some of you may know i bought this EVGA gtx 1080 SC for $430. Well anyway,i noticed 2 things with my card.
> 
> 1: it gets very hot!,allow me to elaborate on some testing i did. EVERY single time i play at 4k,lets say, BO3 or Batman AK,the temps on MSI AB read about 63c which is not bad.So ok, i then played the same exact game and with higher graphical settings on 1440p and even at MAX settings the card reads at 42-45c with MAX settings and when i touch the card its cool as hell. wth???
> 
> 2:My card sucks at OC, if i set the core clock beyond +63,i start getting artifacts,which i measured with gpu z and the highest i got was 2000mhz


What's the GPU utilization and TDP% in each of the situations you mentioned in 1? Those play a big role in temperatures. Maybe 1440p is very light in those games you're testing and the card is not being fully stressed; that's why the temps are lower. At 4K, where the games stress the GPU to its max, temps are higher (which is normal, though).

It seems you just got trolled by the silicon lottery


----------



## Synthetickiller

Quote:


> Originally Posted by *Dry Bonez*
> 
> Can someone explain this to me, alright, some of you may know i bought this EVGA gtx 1080 SC for $430. Well anyway,i noticed 2 things with my card.
> 
> 1: it gets very hot!,allow me to elaborate on some testing i did. EVERY single time i play at 4k,lets say, BO3 or Batman AK,the temps on MSI AB read about 63c which is not bad.So ok, i then played the same exact game and with higher graphical settings on 1440p and even at MAX settings the card reads at 42-45c with MAX settings and when i touch the card its cool as hell. wth???
> 
> 2:My card sucks at OC, if i set the core clock beyond +63,i start getting artifacts,which i measured with gpu z and the highest i got was 2000mhz


At least you know that even without an OC, you got stockish 1080 performance for the price of a 1070, which can never OC to that level of performance.
Since you got the card cheap, you really have two options here... 1, sell it for a significant profit & try your hand at an FE card (legend has it they are binning those chips), or 2, if you have a watercooling loop, buy a block for $110ish & call it a day. You'll never throttle.

About the OC, one option is the shunt mod. I believe the SC is basically an FE with a custom cooler; double-check this before attempting, though.


----------



## Dry Bonez

Quote:


> Originally Posted by *Synthetickiller*
> 
> At least you know that even without your OC, you got 1080 stockish performance for the price of a 1070, which can never OC to that level of performance.
> Since you got the card cheap, you really have two options here.... 1, sell it for a significant profit & try your hand at an FE card (legend has it that they are binning those chips) or 2, if you have a watercooling loop, buy a block for $110ish & call it a day. You'll never throttle.
> 
> About the OC, one option is the shunt mod. I believe that the SC is basically an FE with a custom cooler. Double check this before attempting, though.


I'm not gonna lie, when I bought it I immediately thought about selling it for profit, but instead I used it because I was coming from a dreadful GTX 580 that was driving me insane. As for watercooling, I have a Swiftech H220-X2 Prestige on my 4790K; I don't know how to add a waterblock for the GPU to that thing. Heck, I don't even know what equipment I'd need to do that. I would love to have it, but I'm most likely going to sell the card once we near the GTX 1080 Ti.


----------



## Bishop07764

Quote:


> Originally Posted by *Synthetickiller*
> 
> When I have some funds, I plan on watercooling my PFsense router that's based on a J1900 SoC. Overkill? Yes. Fun? OH HELL YEAH!


Do it!









Quote:


> Originally Posted by *lifeisshort117*
> 
> Hey anyone on here have a Strix 1080 that can give me some insight?
> 
> I'm currently talking about this on the 1060 owners thread...
> 
> But my girlfriend's 1060 Strix does NOT throttle boost when hitting the heat limits we see when we're testing our air cooled 1080s.
> 
> Like how all of ours start throttling boost at 55c and then again at 60c.
> 
> Her 1060 Strix doesn't fluctuate at ALL.
> 
> Another user on here said that might be because of the bios? Interesting theory. Wondering if any of you guys have tried a Strix bios on your card or have an actual Strix to give me some insight/experience.


I have an EVGA SC 1060 in my wife's HTPC. The clocks have jumped around a bit on that even though it only reached a max of the mid-60s while it was on air. I have since watercooled her computer, and it boosts beyond 2GHz at times now anyway. It currently appears to be limited by my wife's CPU, an AMD 6800K APU. 1080s must just be a lot more sensitive to temps.


----------



## Synthetickiller

Quote:


> Originally Posted by *Dry Bonez*
> 
> im not gonna lie, when i bought it, i immediately thought about selling it for profit, but instead i used it because i was coming from a dreadful gtx 580 which was driving me insane. As for watercooling,i have on my 4790k a swiftech h22oX2 prestige, idk how to add a wb to that thing to the gpu. heck,i dont even know what i need in terms of equipment to do that. although i would love to have it but im moat likely gonna sell it once we near the gtx 1080ti


You need a screwdriver. That's it.
If you're going to get the Ti, dump this card before the Ti drops & settle for onboard graphics in the meantime. That should cover about $170 of the Ti's price (I figure you could get $600 for the card).

Quote:


> Originally Posted by *Bishop07764*
> 
> Do it!


Once funds are available my friend.








My PFSense box is sitting in a BitFenix Prodigy. Talk about overkill, lol. The J1900 SoC is passively cooled (BOOOO!!!!!!). I could arguably throw the chip on water, buy a 180mm rad & mount it to the intake fan; all other fans become exhausts. Overkill? Of course, but temps would always stay the same.








The only thing that I don't have experience with is how to attach a generic waterblock to a chip. I also don't have access to the right tools (everything is in storage, sadly) at the moment. Once I have my stuff, I can probably find a source for how to do what I'm doing.


----------



## tps3443

GTX 1080s running 1080p or 1440p are usually pushing 60 to well over 100 fps very easily!

So it's not going to heat them up very much.

When you run 4K on ultra, averaging 35-60, it's a strain!

It's like inviting over four of your buddies, sitting each of them down in front of their own 1080p display, all of them playing the game maxed out, and the only thing powering it all being your single GTX 1080 driving all four displays. That's a lot of pixels! And hats off to a card that can actually do this, lol!


----------



## juniordnz

Don't open unless you got the stomach to. One of the saddest things I've seen so far...


Spoiler: Warning: Spoiler!


----------



## tps3443

Quote:


> Originally Posted by *Dry Bonez*
> 
> Can someone explain this to me, alright, some of you may know i bought this EVGA gtx 1080 SC for $430. Well anyway,i noticed 2 things with my card.
> 
> 1: it gets very hot!,allow me to elaborate on some testing i did. EVERY single time i play at 4k,lets say, BO3 or Batman AK,the temps on MSI AB read about 63c which is not bad.So ok, i then played the same exact game and with higher graphical settings on 1440p and even at MAX settings the card reads at 42-45c with MAX settings and when i touch the card its cool as hell. wth???
> 
> 2:My card sucks at OC, if i set the core clock beyond +63,i start getting artifacts,which i measured with gpu z and the highest i got was 2000mhz


Those are definitely not normal overclocks. The lowest I run my FE at is about +225 core, +525 memory. After the shunt mod I am able to run +250 to +260 core. I have not retested the memory yet, but it is super stable at these speeds!

At the end of the day, overclocking is very addicting, because after you test and run these high overclocks you feel the card is not good enough, or not at its best, without them.

And you have to run it overclocked, or not run it at all!

It's still a GTX 1080 though!


----------



## tps3443

Quote:


> Originally Posted by *juniordnz*
> 
> Don't open unless you got the stomach to. One of the saddest things I've seen so far...
> 
> 
> Spoiler: Warning: Spoiler!


It looks like, from the first pic, they shut a door on it. Like a roll-down door! At least it was only a 1070. It took one for the team! Poor thing!


----------



## Vellinious

Quote:


> Originally Posted by *Synthetickiller*
> 
> At least you know that even without your OC, you got 1080 stockish performance for the price of a 1070, which can never OC to that level of performance.
> Since you got the card cheap, you really have two options here.... 1, sell it for a significant profit & try your hand at an FE card (legend has it that they are binning those chips) or 2, if you have a watercooling loop, buy a block for $110ish & call it a day. You'll never throttle.
> 
> About the OC, one option is the shunt mod. I believe that the SC is basically an FE with a custom cooler. Double check this before attempting, though.


The shunt mod doesn't do anything but override the power limit. If his card is getting artifacts at the offset he's talking about, that's not going to do anything for him.


----------



## tps3443

Quote:


> Originally Posted by *Vellinious*
> 
> The shunt mod doesn't do anything but override the power limit. If his card is getting artifacts at the offset he's talking about, that's not going to do anything for him.


Yes, but the shunt mod will automatically allow more voltage at any given frequency. It could be artifacting because it just doesn't have enough voltage. Usually a higher-ASIC chip gets less voltage at higher speeds, and a lower-ASIC chip gets more voltage at lower speeds; GPU Boost 3.0 determines this automatically.

In my case my GTX 1080 ran 1900MHz at 0.900V, 117% TDP power.

After the shunt mod:

2,000+ MHz, 1.030V, 95% TDP.

So it's possible it may help some.

Although, it does sound like he has a terribly overclocking GTX 1080.

I would still take it over any GTX 1070 any day.


----------



## Vellinious

Quote:


> Originally Posted by *tps3443*
> 
> Yes but, the shunt mod will automatically allow more voltage at any given frequency. It could be artifacting because it just doesn't have enough voltage. But usually the higher the ASIC less voltage at higher speeds, and low ASIC at is more voltage at lower speeds. GPU boost 3.0 automatically determines this.
> 
> In my case my GTX1080 ran 0.900volts 1900Mhz, 117% TDP power.
> 
> After shunt mod
> 
> 2,000+ Mhz, 1.030 volts, 95% TDP.
> 
> So, it's possible it may help some.
> 
> Although, it does sound like he has a terribly overclocking GTX1080.
> 
> I would still take it over any GTX1070 anyday.


It's not going to raise the maximum allowable voltage...the shunt mod doesn't do anything for that. It may allow your card to hit a higher voltage because it's not power limit throttling, but.....shorting the shunts doesn't have any direct impact on voltage.


----------



## tps3443

Quote:


> Originally Posted by *Vellinious*
> 
> It's not going to raise the maximum allowable voltage...the shunt mod doesn't do anything for that. It may allow your card to hit a higher voltage because it's not power limit throttling, but.....shorting the shunts doesn't have any direct impact on voltage.


That is not what I mean. I said it increases and allows more voltage at any given frequency. So if his card runs 0.900V at 1900MHz, it will allow more voltage at that clock, and at every other boost level. Obviously it will still be limited to 1.093V.

Without the shunt, GPU Boost 3.0 would look something like this on his card; the shunt mod makes the card think the TDP is lower than it really is, and therefore allows more voltage at each state:

1600MHz = 0.700V

1800MHz = 0.800V

1900MHz = 0.900V

After the shunt, the voltage is automatically higher at each stage: 1600MHz would be 0.800V, and so on and so forth.

GPU Boost automatically picks the voltage for each clock to save power. Maybe that's not enough for such a poor chip, so the shunt mod may help some. A 60MHz overclock ceiling with artifacts is terrible; if the card is new and is indeed a retail GTX 1080, I would return it or RMA it and say it artifacts out of the box.
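If it helps, here's a rough toy sketch of that shifted curve. The clock/voltage pairs are only the illustrative numbers above, and the flat 0.1V bump per state is my simplification; the real curve lives in the firmware and nobody here has read it out:

```python
# Toy model of the voltage bump at each GPU Boost state after a shunt
# mod. The numbers are the illustrative ones from the post, not values
# read from any real card or BIOS.

vf_curve = {1600: 0.700, 1800: 0.800, 1900: 0.900}  # MHz -> volts

def shifted_curve(curve, volt_step=0.100, vmax=1.093):
    """With the power limit out of the way, assume the card can afford
    one voltage bin more at every clock, capped at the vmax ceiling."""
    return {mhz: min(round(v + volt_step, 3), vmax) for mhz, v in curve.items()}

print(shifted_curve(vf_curve))  # {1600: 0.8, 1800: 0.9, 1900: 1.0}
```

Either way, the ceiling stays 1.093V; the mod only changes which voltage the card is willing to spend at each clock.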


----------



## Dry Bonez

Quote:


> Originally Posted by *tps3443*
> 
> That is definitely not normal overclocks. The lowest I run my FE, is about +225 Core, +525 memory. After the shunt mod I am able to run +250 to +260 core. I have not retested the memory yet. But, it is super stable at these speeds!
> 
> At the end of the day, Overclocking is very addicting. Because after you test and run these high overclocks you feel the card is not good enough, or at its best without them.
> 
> And you have to run it Overclocked, or not run it at all!
> 
> It's still a GTX1080 though!


I mean, no matter how I look at it, it is definitely better than a GTX 580, lol. But allow me to ask you this: what would YOU do if you had a card like this that doesn't OC well?

Quote:


> Originally Posted by *tps3443*
> 
> Yes but, the shunt mod will automatically allow more voltage at any given frequency. It could be artifacting because it just doesn't have enough voltage. But usually the higher the ASIC less voltage at higher speeds, and low ASIC at is more voltage at lower speeds. GPU boost 3.0 automatically determines this.
> 
> In my case my GTX1080 ran 0.900volts 1900Mhz, 117% TDP power.
> 
> After shunt mod
> 
> 2,000+ Mhz, 1.030 volts, 95% TDP.
> 
> So, it's possible it may help some.
> 
> Although, it does sound like he has a terribly overclocking GTX1080.
> 
> I would still take it over any GTX1070 anyday.


If you had this card, what would you do? Get rid of it?


----------



## tps3443

Quote:


> Originally Posted by *Dry Bonez*
> 
> I mean,no matter how i look at it, it is definately better than a gtx 580 lol. But allow me to ask this to you, what would YOU do if you had a card like this that doesnt OC well?
> If you had this card, what would you do? Get rid of it?


I would personally sell it as a like-new GTX 1080 for a lot more money than you paid for it!

You could probably get $550-$600 used.

And you could just buy another one.

What kind of GTX 1080 is this?

Is it already overclocked from the factory?

I've run my GTX 1080 FE a bunch of times stock, with the original auto fan profile. The card will boost around 1733-1797MHz stock, with temps at 83C at 4K and a fan that's barely spinning to keep it quiet.

It is still a beast!

And at 1080p and 1440p it will boost even higher, to around 1860-1900, because it is running cooler with less load.

It's still a GTX 1080, and people buy them all the time without overclocking them, and they are more than satisfied!

But if you want to overclock? Then sell it and buy another one. The next one will overclock.

I'm not gonna lie, my Founders Edition GTX 1080 was usually running at default speeds during the summer months. My ambient was at 80-84F at times. And I was very happy with it!

How does the memory overclock?

Are you sure you're doing everything right?


----------



## Dry Bonez

Quote:


> Originally Posted by *tps3443*
> 
> I would personally sell it as a like new GTX1080 for alot more money than you paid for it!
> 
> You could get probably $550-$600 used.
> 
> And, you could just buy another one.
> 
> What kind of GTX1080 is this?
> 
> Is it already Overclocked from the factory?
> 
> I've ran my GTX1080FE abunch of times stock, and the original auto fan profile. The card will boost around 1733-1797mhz stock. With the temps at 83C at 4K, and a fan that's barely spinning, to keep it quiet.
> 
> It is still a beast!
> 
> And at 1080P, and 1440P it will boost even higher to around 1860-1900 because it is running cooler, with less load.
> 
> It's still a GTX1080, and people buy the all the time and do not Overclock them! And, they are more than satisfied!
> 
> But, if you want to Overclock? Then sell it! And buy another one. The next one will Overclock.
> 
> I'm not gonna lie, my Founders Edition GTX1080 during the summer months was usually running at default speeds. My ambient was at 80-84F at times. And I was very happy with it!
> 
> How does the memory Overclock?
> 
> You sure your doing everything right?


My card is an EVGA Superclocked 1080. I mean, I think I'm doing the right things, lol.


----------



## tps3443

Quote:


> Originally Posted by *Dry Bonez*
> 
> My card is an EVGA Superclocked 1080. i mean,i think im doing the right things, lol.


I'd be curious to try it myself. Want to trade?

I have an MSI GTX 1080 Founders Edition. It's a $700 video card on Newegg, and it overclocks to roughly +250 core, +525 memory. Original retail box and all!


----------



## Vellinious

Quote:


> Originally Posted by *tps3443*
> 
> That is not what I mean. I said it increases and allows more voltage for any given frequency. So, if his card runs 0.900 at 1900 then it will allow more voltage at this Mhz, and every other boost level. Obviously it will still be limited to 1.093 volts.
> 
> GPU boost 3.0 without shunt would be like this on his card, the shunt mod, makes the card think the TDP is lower than it really is, and therefore allowing more voltage at each state.
> 
> 1600Mhz = 0.700 volts
> 
> 1800Mhz = 0.800 volts
> 
> 1900Mhz = 0.900 volts
> 
> Well after the shunt the voltage is automatically higher at each stage!
> 
> 1600mhz would be 0.800 volts, and so on and so forward.
> 
> GPU boost automatically picks voltage, to save power dependant upon each clock. Well maybe it's not enough for such a poor chip. So, the shunt mod may help some. A 60+ Mhz Overclock and artifacts is terrible, if the card is new and is indeed a retail GTX1080, I would return it or RMA. And say it artifacts out of the box or something.


That might be true if there were prescribed power limits for each clock state.....but I highly doubt it. Guess we'll see if we ever get to crack open the bios.


----------



## Synthetickiller

Quote:


> Originally Posted by *juniordnz*
> 
> Don't open unless you got the stomach to. One of the saddest things I've seen so far...
> 
> 
> Spoiler: Warning: Spoiler!


My heart goes out to you... wow.

Talk about taking it to 11 in terms of "effing" the box...

Quote:


> Originally Posted by *Vellinious*
> 
> The shunt mod doesn't do anything but override the power limit. If his card is getting artifacts at the offset he's talking about, that's not going to do anything for him.


Quote:


> Originally Posted by *Vellinious*
> 
> It's not going to raise the maximum allowable voltage...the shunt mod doesn't do anything for that. It may allow your card to hit a higher voltage because it's not power limit throttling, but.....shorting the shunts doesn't have any direct impact on voltage.


Gotcha. I learned something new.
So the shunt mod just lowers the measured TDP? That's it?
Quote:


> Originally Posted by *Dry Bonez*
> 
> If you had this card, what would you do? Get rid of it?


TL;DR answer: yes.
Long answer... if you can make $170ish on it (selling at $600), then I'd seriously consider that. It's not a card that clocks well, so unless you're happy with almost-stock settings, move on.
If you are happy with the price-vs-performance ratio, then keep it. If I could pick up another 1080 for $400ish, I'd consider it for a second rig or SLI, even if it ran stock. Then again, my card is a speed demon. I'd flip that card ASAP & get a 1080 Ti, using the extra $170 towards it, or towards a water block if you were going in that direction.


----------



## Vellinious

It bypasses the prescribed power limit that's set in the bios.


----------



## Synthetickiller

Quote:


> Originally Posted by *Vellinious*
> 
> It bypasses the prescribed power limit that's set in the bios.


Is there a shunt mod for non-reference boards?
I'm not 1337 or daring enough to mess with my card yet... I'm in no way thermal throttling, so it's simply a max-power issue. I'd really like to try even 1.15V, if not the full-blown 1.2V.


----------



## tps3443

Quote:


> Originally Posted by *Synthetickiller*
> 
> Is there a shunt mod for non-reference boards?
> I'm note 1337 or daring enough to mess with my card, yet... I'm in no way thermal throttling, so it's simply a max power issue. I'd really like to try even 1.15v if not the full blow 1.2v


I think so. Look at the pics of my card a page or two back in my shunt mod post, and see if your PCB has the same resistor/regulator on it.

You can see the liquid metal in the 2nd photo.


----------



## MiamiMuscleBoy

I have a Zotac AMP Extreme 1080. Since it's factory clocked at 1772/1911, Afterburner can barely put +50 on the clock, and I have noticed all the cards, even the Founders, seem to hit 2126 or lower before settling in at around 2101, 2088, etc., no matter what the card is. So is there a point to buying a $700 card over a $610 card on Newegg? All the reviews say they all hit this clock, and it's like a lock: they can't go any higher than 2.1GHz...

Now, I heard there's some LN2 BIOS that can reach 2400MHz? Does anyone know anything about that? I thought that was an MSI Lightning BIOS?


----------



## galeonki

I think I caught a very good card.

MSI Gaming X 1080

Core: 2341MHz, 100% stable over 2-3 days of gaming and benchmarks
Memory: 5584MHz, best for me across all games and benches, but still looking for the sweet spot
Voltage: 1.093
Temp: 64C max. I added thermal pads and copper heatsinks to the backplate, 2-3C temp drop overall
I repasted the core with CLU: temps down 7-9C
80% max RPM on the cooler
Ambient temp: 24C

My only issue is the POWER limit; sometimes it throttles my core down by -13 or -26MHz, but not very often and only for 1-2 seconds.

No shunt mod, because after putting CLU on the resistors I had coil whine from hell.

I tried 10 BIOSes and saw zero difference. Only the Asus Strix OC BIOS, without the power limit, pushed my card to 2380 on air, but at 1.15V, and my fps in games and bench results were the same as on the stock Gaming X BIOS.

Any idea how to get rid of the POWER limit? The shunt mod failed for me.


----------



## MrTOOSHORT

@galeonki

Your score looks about right for a 2063MHz GTX 1080, not 2300MHz+

The bios you're using is not very good.


----------



## galeonki

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> @galeonki
> 
> Your score looks about right for a 2063MHz GTX 1080, not 2300MHz+
> 
> The bios you're using is not very good.


I know the FS score is low, but fps in games is much higher, and in the Valley benchmark too.

And I don't have the memory sweet spot yet. It's the stock BIOS. Only this POWER limit... I have no idea how to bypass it...

BTW, I reached 20820 points overall and 24920 graphics with the MSI Hawk EK BIOS, but fps in games was lower by 2-3, and similar with the FE BIOS.

Testing higher clocks now and we will see how far it goes.

Edit: 2354MHz seems OK after 3 benchmarks; I'll stick with the Time Spy graphics result:

Same story: with a different BIOS I get higher bench results but 2-3 fewer frames in games (testing in Fallout 4 and GTA 5).

At 1.000V I've reached a very stable 2151MHz.

PS: sorry for the double post.


----------



## grimboso

Quote:


> Originally Posted by *Vellinious*
> 
> Got the block and backplate installed on the first 1080 FTW today. I've got another pump, pump top, fittings and coolant coming in on Tuesday. Should be ready to get the loop drained, flushed and rebuilt next weekend.


That is one sweet looking backplate!

Also got the FTW block coming in today with the nickel backplate. Must say I like the colored ones much better than the "standard" black one!


----------



## Vellinious

Quote:


> Originally Posted by *grimboso*
> 
> That is one sweet looking backplate!
> 
> Also got the FTW block comming in today with nickel backplate, Must say I like the colored ones much better than the "standard" black one!


I was a little disappointed that it was more of a brushed nickel surface, than a polished one.....but that's easily fixed.


----------



## juniordnz

Quote:


> Originally Posted by *MiamiMuscleBoy*
> 
> I have a zotac amp extreme 1080. , since its clocked at 1772/1911 , afterburner can barely put +50 on the clock and I have noticed all the cards even the founders seem to hi 2126 or lower before they settle in at around 2101 , 2088 etc no matter what the card is , so is there a POINT to buying a $700 card or a $610 dollar card on newegg... all the reviews are saying they all hit this clock and its like a lock they cant go any higher than 2.1ghz....
> 
> Now i heard theres some LN2 thing that can make 2400 ghz? does anyone know about that at all?? I thougth that was an msi lightning bios?


Strange thing: my old 1080 FTW did 2000MHz out of the box and 2126MHz maximum overclock. When I got the new one, I saw it did 2025MHz out of the box, so I guessed it would do 2151MHz overclocked, right? Wrong. It got the same 2126MHz as the old one.

The 2126MHz threshold you mentioned does seem to exist.

The way the card handles those 2126MHz is different, though. The new one flatlines after 1.031V, so it can hold 2126MHz at 1.031V, while the old one needed the whole 1.062V to be stable.


----------



## ucode

Quote:


> Originally Posted by *galeonki*
> 
> Only this POWER limit.... I have no idea how bypass it...


At 121% that's 291W, more than enough to achieve a much better score. Something's not right.
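That 291W comes from assuming the Gaming X's 100% board power limit is 240W, which is an assumption on my part; read your own BIOS to confirm:

```python
# 121% of an assumed 240W base board power limit.
BASE_TDP_W = 240   # assumed 100% limit, not read from a real BIOS
slider_pct = 121   # the power slider setting

limit_w = BASE_TDP_W * slider_pct / 100
print(limit_w)  # 290.4, i.e. roughly 291W
```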


----------



## chiknnwatrmln

Hey guys, I get to join you later today.









The card I bought from another OCN member is capable of 2152 MHz on air, I wonder what it'll do under water.


----------



## Synthetickiller

Quote:


> Originally Posted by *galeonki*
> 
> i know that the score of FS is crap but fps in games are much higher - on valley benchmark too
> 
> 
> 
> 
> 
> 
> 
> and i dont have MEM sweet point yet. Its a stock BIOS. Only this POWER limit.... I have no idea how bypass it...
> 
> btw ive reach 20820 points overall and 24920 graphics but fps in games was lower by 2-3 with MSI Hawk EK bios and similar with FE bios.
> 
> testing now higher clocks and we will see how far it goes
> 
> edit: 2354mhz seems ok after 3 benchmarks - i stick time spy graphics res:
> 
> 
> 
> the same story - with different bios i have higher bench results but 2-3 frames lower in games - testing in fallout4 and gta5
> 
> at 1.000v ive reach very stable 2151mhz
> 
> ps. sorry for double post


The scores don't make sense. We're both having issues, lol.
Mine won't validate (I need to contact 3DMark about the card I'm using)...
Ignore the date; I was lazy and didn't change the post-it note's date, lol.

Anyways, here's 2214MHz with +505 on memory. My scores are around 400 points higher. My fps scales linearly in games (2200MHz is stable; 2214 is probably not stable in Doom).


----------



## Alperen62002

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Hey guys, I get to join you later today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The card I bought from another OCN member is capable of 2152 MHz on air, I wonder what it'll do under water.


If it's capable of 2152MHz on air, it won't do much more under water. With water cooling it will simply hold that 2152MHz because of the lower temps, and be more stable. The boost clock will not throttle; that's why you will get better results in the benchmarks.

I tested this with my GTX 1080 FE: same factory clocks, much better results (over 300 points in Fire Strike). The boost clock was only 38MHz higher with the EK water block.

If anything, the problem will be the power limit, not the temps, as galeonki found in this thread; he's complaining about the core throttling down by -13 or -26MHz.







@galeonki you cannot get rid of the power limit without a shunt mod. In electronics, a shunt is a device which allows electric current to pass around another point in the circuit by creating a low-resistance path. You can solder the shunts if the CLU paste isn't helping. The shunt is there to measure power so the card doesn't damage itself; don't forget that.
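To illustrate the principle with made-up numbers (the shunt resistance, current, and wattage below are just examples, not measured from a real 1080 PCB):

```python
# Toy model of how the card estimates power from the shunt, and why
# lowering the shunt's real resistance makes the reading drop.
# All values here are illustrative only.

R_ASSUMED = 0.005   # ohms; what the controller believes the shunt is
RAIL_V = 12.0       # volts on the 12V input rail

def reported_power(actual_current_a, real_shunt_ohms):
    drop = actual_current_a * real_shunt_ohms  # real voltage drop across shunt
    inferred_current = drop / R_ASSUMED        # controller's current estimate
    return RAIL_V * inferred_current           # watts the card thinks it draws

print(reported_power(20.0, 0.005))   # unmodded: 240.0, the true draw
print(reported_power(20.0, 0.0025))  # CLU halves resistance: reads 120.0
```

So the card never "gets" more power; it just under-reports what it already draws, which is exactly why overdoing the mod is risky.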




Quote:


> Originally Posted by *galeonki*
> 
> I think i catch very good card
> 
> MSI Gaming X 1080
> 
> Core: 2341mhz - 2-3 days gaming and benchmarks 100% stable
> Memory: 5584mhz - best for me for all games and bench's - but still looking sweet point
> Voltage: 1.093
> Temp: 64C max - I add thermal pads and copper radiators on backplate - 2-3C Temp down overall
> I repaste Core by CLU - temps down 7-9C
> 80% max RPM cooler
> Ambient temp: 24C
> 
> My only issue is POWER limit - sometimes throttle my core -13 or -26mhz down - but not very often and only for 1-2 sec
> 
> No shunts mod - because after paste CLU on resistors i had coil whine as hell
> 
> I tried 10 bioses and zero different for me - only Asus Strix OC bios without power limit push my card on 2380 on air but with 1.15 voltage - but my fps in games and bench results was the same like on stock Gaming X bios
> 
> Any idea how to throw away POWER limit? Shunt mod falls


@chiknnwatrmln
You can see the compare of watercooling and Stock cooling of my 1080 here:
http://www.3dmark.com/compare/fs/10298708/fs/9373549


----------



## chiknnwatrmln

What's considered an average overclock for a 1080? How about a good one?


----------



## Alperen62002

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What's considered an average overclock for a 1080? How about a good one?


@chiknnwatrmln You have a good one. If you have a Founders Edition, you have a very good one.

The normal overclocking range for a GTX 1080 FE is 2000-2100MHz; for custom cards (MSI, ASUS, Gigabyte) it's 2100-2200MHz. galeonki caught a very good card; he won the lottery.

If you don't play in 4K, it doesn't matter anyway. Normal clocks can handle it.


----------



## galeonki

Quote:


> Originally Posted by *ucode*
> 
> At 121% that's 291W, more than enough to achieve a much better score. Somethings not right.


I think the power limit is the key here. I shunted only one resistor (the one closest to the voltage controller) and the results are better, but not by much; I still see "POWER" blinking in the RivaTuner stats during benchmarks.
But 400 more points on Time Spy graphics... still downclocking from max core speed because of the PL.
Quote:


> Originally Posted by *Alperen62002*
> 
> At least the problem will be only the power limit, not the temps as galeonki realized in this post. He is whinig because of throttling core -13 or -26mhz down
> 
> 
> 
> 
> 
> 
> 
> .
> @galeonki you can not throw away the power limit without a shunts mod. In electronics, a shunt is a device which allows electric current to pass around another point in the circuit by creating a low resistance path. You can solder the shunts if the CLU Paste is not helpful. The shunt measuring the power to not damage the card


Thanks for the advice. It's much better with 1 resistor shunted; I applied a very thin layer now and get no coil whine. I still have to test the 2nd and 3rd resistors. Maybe that's the key to why the core is throttling down...
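The shunt talk above boils down to simple arithmetic: the card's controller infers current from the voltage drop across a tiny sense resistor, so lowering the effective shunt resistance (e.g. with a thin conductive layer over it, as described here) makes the controller under-read power. A minimal sketch of that math; the resistor value and wattages are illustrative assumptions, not measured from a 1080 PCB:

```python
# Hypothetical illustration of why a shunt mod lowers *reported* power.
# The controller measures the voltage drop across a sense resistor and
# computes current as I = V_drop / R_STOCK, where R_STOCK is the stock
# shunt value it assumes.

R_STOCK = 0.005  # ohms, assumed stock shunt value (illustrative)
V_RAIL = 12.0    # volts, PEG rail

def reported_power(actual_power_w, r_effective):
    """Power the controller *thinks* is drawn when the real shunt
    resistance has been lowered to r_effective (e.g. by a parallel
    conductive path), while it still divides by R_STOCK."""
    actual_current = actual_power_w / V_RAIL
    v_drop = actual_current * r_effective   # real voltage drop
    measured_current = v_drop / R_STOCK     # controller's assumption
    return V_RAIL * measured_current

# Stock shunt: reported power matches actual power (~240 W).
print(reported_power(240.0, R_STOCK))
# Shunt lowered to 80% of stock: the same 240 W real draw is
# reported as ~192 W, so the power limit triggers later.
print(reported_power(240.0, 0.8 * R_STOCK))
```

This is why the mod doesn't add power; it only shifts where the BIOS power target kicks in.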


----------



## Alperen62002

@galeonki I hope you know what you're doing. Do not overdo it with the resistors; the power circuitry or the chip may not be able to handle it.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Alperen62002*
> 
> @chiknnwatrmln You have a good one. If you have a Founders Edition, you have a very good one. The normal overclocking potential for a GTX 1080 FE is 2000-2100MHz; for a custom card (MSI, ASUS, GIGABYTE) it's 2100-2200MHz. Galeonki caught a very good card; he won the lottery there.
> 
> If you don't play in 4K, it doesn't matter. Normal clocks can handle it.


Sweet thanks. It's an FE but it'll be under water in a week or so.


----------



## Alperen62002

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Sweet thanks. It's an FE but it'll be under water in a week or so.


Save the benchmark results of your GTX 1080 now, then compare them with the water-cooled card's benchmarks in a week. I hope to see the build in a week. :thumb:


----------



## chiknnwatrmln

I'll update you guys. Unfortunately (or rather fortunately for my wallet) I'm not going to be upgrading the rest of the system... yet.

3770k @ 4.6 can handle most everything still. I noticed some dips in BF1 beta but we'll see how the full game goes before I make any decisions. 6700k and 7700k don't look like they're worth the extra $$.


----------



## tin0

*delete, wrong forum lol*


----------



## Meaker

My two 1080s:

Under the hood:


----------



## galeonki

Quote:


> Originally Posted by *Alperen62002*
> 
> @galeonki I hope you know what you're doing. Do not overdo it with the resistors; the power circuitry or the chip may not be able to handle it.


2nd resistor "shunted", but there are still some "POWER" peaks in benchmarks and core speed drops, though the results are much better. I still don't think I've found the "sweet spot" for the memory, because one speed gives me more fps in Fallout 4 and a different speed gives me more fps in GTA V (much more, sometimes 4-5 fps), and different settings give me more points in Time Spy or Firestrike.

I think the only way forward is waiting for a BIOS editor, or doing a hard mod like der8auer or Kingpin.

Still, I'm very happy with these scores: 2300MHz+ core, 24h stable, with some room to improve, but the POWER limit drops me all the time... Unlocked BIOSes like the OC Strix or Zotac AMP give me equal fps in games, but the benchmark scores are lower.

This card has very good potential, but without a BIOS editor it's simply locked at some level, I think.

*edited:
I lowered the vcore to 1.025 and the clock to 2215MHz and the results are much better, finally without hitting the POWER limit. So the solution without BIOS modding, for me, is a lower vcore and clock.*

edit2: the Asus BIOS gives me the same results, but no power limit with or without the shunt mod. Higher voltage downclocks me hard, just like the power limit, because temps go above 65.

_edit3: finally reached an 8545 graphics score, 2240MHz at 1.025V stable. Above that, artifacts show up, so 2240MHz is my limit at this voltage._




So far, 3rd place for the 6700K & GTX 1080 combo in the 3DMark results, and 1st in graphics score alone: 8545 (skit and bychus have 8497 max).

Also nice results in FurMark, where I test max stability and max temps with a couple of hours of GPU stress or benchmarks.

If you have any suggestions for what more I can do here, I'd be grateful.


----------



## tps3443

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'll update you guys. Unfortunately (or rather fortunately for my wallet) I'm not going to be upgrading the rest of the system... yet.
> 
> 3770k @ 4.6 can handle most everything still. I noticed some dips in BF1 beta but we'll see how the full game goes before I make any decisions. 6700k and 7700k don't look like they're worth the extra $$.


They're worth the extra money.

I can tell you this: my 6600K at 4.6GHz is outpacing a 3770K in multi-threaded apps lol.

Well, I really run mine at 4.91GHz. So with only 4 threads, it's very impressive that it outdoes the older 8-thread chips.

Now, single-threaded is an entirely different ball game!

I'm seeing upwards of 30-40 more fps in games at 1080p and 1440p with Skylake vs. Ivy Bridge.

Although, unless you find a super deal on a 6600K for like $175 or a 6700K for less than $300, I would just wait a tad bit longer for Kaby Lake and get another 5-10% IPC improvement over Skylake. For even more improvement!

If you play games like Fallout 4, the jump from DDR3 to DDR4 shows like 10-12 more fps from faster memory alone.

That boost is just going from DDR4-2133 to DDR4-2400 lol.

So it would be even greater coming from your current platform.

I play Fallout 4 a lot with my GTX 1080 FE, so I'm considering going with DDR4-4000 just to get even more performance.

Your CPU is still great! But a 6700K will walk it in newer games with more powerful GPUs like the GTX 980, GTX 1080, etc. It's worth upgrading.

I ran my 3570K for well over 3 years at 4.85GHz lol, voltage locked in lol. I bought it in 07/2012 and purchased a 6600K around the end of 2015. I loved it, and that's the chip that moved me to Intel "forever".

But there are plenty of comparisons that will show your current CPU pulling 80+ fps and the NEW CPUs pushing 120+.


----------



## tps3443

Also, I've noticed a lot of you guys' cards running 2175-2300+ and they just are not scaling. My card at around 2088-2114 pulls a 25,000 graphics score, so I would back the clocks down.

Synthetickiller runs his in the very high 2100s to 2200MHz and he's passing me by a good 850 points or more. His card is scaling with the clocks.

But a 24,000 graphics score with a core any higher than 2150 is not scaling properly.

You should easily get 25,000+ WELL below a 2.2GHz core LOL


----------



## galeonki

Quote:


> Originally Posted by *tps3443*
> 
> Also, I've noticed a lot of you guys' cards running 2175-2300+ and they just are not scaling. My card at around 2088-2114 pulls a 25,000 graphics score, so I would back the clocks down.
> 
> Synthetickiller runs his in the very high 2100s to 2200MHz and he's passing me by a good 850 points or more. His card is scaling with the clocks.
> 
> But a 24,000 graphics score with a core any higher than 2150 is not scaling properly.
> 
> You should easily get 25,000+ WELL below a 2.2GHz core LOL


25622 graphics in FS 1.1 and 8545 graphics in Time Spy, so scaling is good, but the power limit throttles and downclocks the card even when the temp is low. This is my main problem, because it could be more.


----------



## tps3443

Quote:


> Originally Posted by *galeonki*
> 
> 25622 graphics in FS 1.1 and 8545 graphics in Time Spy, so scaling is good, but the power limit throttles and downclocks the card even when the temp is low. This is my main problem, because it could be more.


Try the shunt mod; it reduced my TDP by about 20% while increasing voltage and clocks!


----------



## galeonki

Quote:


> Originally Posted by *tps3443*
> 
> Try the shunt mod; it reduced my TDP by about 20% while increasing voltage and clocks!


I tried (in different variations and with different BIOSes) and still get POWER peaks in benchmarks... The OC Strix gives me a clean clock but lower benchmark results, just like the FE BIOS or the Gaming Z.

My last FS: 25831. It could be higher, I know it.


----------



## tps3443

Quote:


> Originally Posted by *galeonki*
> 
> I try (in different variations and different bios'es) and still POWER peak at benchmarks... OC strix give me clear clock but lower results in benchmark like FE bios or Gaming Z
> 
> my last FS - 25831 - could be higher - i know it


You, sir, are at the end of the road with it, lol, just like everyone else.

The only other way to get more power involves soldering and some more in-depth modification. There is a great written guide right here.

This guide goes over how to really bypass the 120% limit and add voltage to the memory and GPU. It is not for the faint of heart. If you are not experienced and, like me, you love your one and only GTX 1080, then don't attempt this. Good luck!

https://xdevs.com/guide/pascal_oc


----------



## galeonki

Quote:


> Originally Posted by *tps3443*
> 
> You, sir, are at the end of the road with it, lol, just like everyone else.
> 
> The only other way to get more power involves soldering and some more in-depth modification. There is a great written guide right here.
> 
> https://xdevs.com/guide/pascal_oc


Thanks for the great guide. I'm not good at PCB soldering, but my friends at Pratt & Whitney could do this. The question is: is it worth it? I don't think so, but I might be wrong, especially on air cooling. Maybe water would be the better solution, but still, we don't know exactly how far we can go with clocks on water or air.

Maybe it's better to wait for a Pascal BIOS editor?


----------



## ucode

Quote:


> Originally Posted by *galeonki*
> 
> I think the power limit is the key here. I shunted only 1 resistor (the one closest to the voltage controller) and the results are better, but not by much; I still see the POWER flag blinking in RivaTuner stats during benchmarks.
> Still, about 400 more Time Spy graphics points... and it's still downclocking from max core speed because of the PL.


You say you tried the Strix OC VBIOS. If it was the T4 version, then there are no power limits set, and it also enables up to 1.2V, although the curve needs to be adjusted for this; by default it flatlines below that. While it may give lower clock-for-clock performance, it should still be way better than what was posted.

Have you tried an earlier driver such as 372.54?

Or perhaps it's just downclocking more than you realize. It's still a great card, but something is stopping it from putting up those 'wow' benchmark scores.


----------



## DStealth

There is a BIOS without power limits. I can match your TS score at 200MHz lower...

Ah, I just saw you're not running 2350 or so in TS but 2240-50. Anyway, the Strix T4 is not slower clock for clock than any other BIOS...


----------



## grimboso

Quote:


> Originally Posted by *Vellinious*
> 
> I was a little disappointed that it was more of a brushed nickel surface, than a polished one.....but that's easily fixed.


I always got the impression that it was polished as well, at least from looking at the pictures. Did you polish yours? I might just have to polish mine


----------



## Derko1

I just got a second 1080 and my regular overclock for when I only had the one card isn't working right. If I use the curve setting for the core clock in afterburner, it just always maxes out at the same core clock (1987 mhz) at a different voltage. Am I doing something wrong now that I have SLI on?


----------



## Meaker

Here are the scores I have had out of my notebook so far.


----------



## Martin778

I changed the TIM and did der8auer's power limit mod for the GTX 1080 by shorting the resistor closest to the 8-pin PEG connector.
As a result, I don't even have to touch the power limit in XOC/Afterburner anymore, and the card boosts to 2012MHz (1080 SC ACX).

GPU-Z now reports up to 55-57% TDP under FurMark.

There seems to be a side effect to this mod: GPU-Z reports a lot of Pwr perfcap when the card is idling. I haven't noticed any suspicious behaviour yet.

I don't think the mod was really worth it; my top OC remained unchanged at around 2050MHz. I'd have to try the Strix BIOS again to see if I can push the voltage higher than the stock maximum of 1.093V.


----------



## wimo

I did exactly the same. 1080 shunt mod + EK WB :










My cpu is holding me down in Timespy


----------



## DStealth

Quote:


> Originally Posted by *wimo*
> 
> My cpu is holding me down in Timespy


What is it?
I highly doubt a 1440p DX12 benchmark can be CPU limited... You can try FS Ultra at 4K; there should be no difference in GPU scores from a Celeron to an i7.


----------



## ROKUGAN

Quote:


> Originally Posted by *MiamiMuscleBoy*
> 
> I have a Zotac AMP Extreme 1080. Since it's clocked at 1772/1911, Afterburner can barely put +50 on the clock, and I have noticed all the cards, even the Founders, seem to hit 2126 or lower before settling in at around 2101, 2088, etc., no matter what the card is. So is there a POINT to buying a $700 card over a $610 card on Newegg? All the reviews say they all hit this clock, and it's like a lock; they can't go any higher than 2.1GHz...
> 
> Now, I heard there's some LN2 thing that can hit 2400MHz? Does anyone know about that at all? I thought that was an MSI Lightning BIOS?


See my previous post on this card.

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/4580#post_25414181

The point of buying this card is that the temps will be in the 60s vs. the 80s on the FE. It's already well known that the OC limit is 99% lottery with Pascal, so there's not much point in buying one model over another (as it seems the BIOS is not going to get modded), but temps are also important to avoid throttling.


----------



## tps3443

Quote:


> Originally Posted by *Meaker*
> 
> Here are the scores I have had out of my notebook so far.


That's a powerful laptop! A new revolution started when they finally began offering desktop GPUs in laptops, with the 900 series.

I probably would have gone with just (1) single GTX 1080, though. The price steps up so much going with (2) in SLI.

Definitely a powerful laptop.


----------



## Juub

I had heard reports that 2 flexible SLI bridges improved scaling in games like TW3 or The Division. I tried them and noticed no difference whatsoever. Did I miss anything?


----------



## Meaker

Quote:


> Originally Posted by *tps3443*
> 
> That's a powerful laptop! There started a new revolution when they finally started offering a desktop GPU in a laptop, since the 900 series.
> 
> I probably would have just gone with (1) single gtx1080 though. The price steps up so much going with (2) in SLI.
> 
> Definitely a powerful laptop.


Quote:


> Originally Posted by *Juub*
> 
> I had heard reports 2 flexible SLI bridges improved scaling in games like TW3 or The Division. I tried them and noticed no difference whatsoever. Did I miss anything?


Cheers.

It will depend on the res/settings.


----------



## Juub

Quote:


> Originally Posted by *Meaker*
> 
> Cheers.
> 
> It will depend on the res/settings.


I tried 4K at max settings. Watched this video:

This guy is playing at 4K/60fps and max settings. I go to the same area at night and get 52-53 fps with the same GPU and a 4790K OC'd to 4.5.


----------



## toncij

So, how high have you guys been able to run a 1080 stably on water?


----------



## Synthetickiller

Quote:


> Originally Posted by *toncij*
> 
> So, how high have you guys been able to stable run 1080 on water?


You're looking to get the FE?
I have the Zotac ArcticStorm and it pushes 2200+ easily (I can game at 2214MHz currently). Temps rarely break 40°C.
Water is going to help stop thermal throttling, but it's not like back in the day when you could get much higher clocks. It's more about noise/temp/throttling control than pure overclocking, although lower temps never, ever hurt.


----------



## toncij

Quote:


> Originally Posted by *Synthetickiller*
> 
> You're looking to get the FE?
> I have the Zotac ArcticStorm and it pushes 2200+ easily (I can game at 2214MHz currently). Temps rarely break 40°C.
> Water is going to help stop thermal throttling, but it's not like back in the day when you could get much higher clocks. It's more about noise/temp/throttling control than pure overclocking, although lower temps never, ever hurt.


I know, but I'm asking because I need to hear whether anyone is able to get 2250 or more on water. I'm almost certain it's impossible without a hardware mod for voltage, power, etc. I've tested 7 1080s and none could go past the low 2.2s on water, which is just barely above the 2.1x you see on air.


----------



## Synthetickiller

The highest I've gone for benchmarking purposes is 2227. My card throttles down if I push 2230.
I rarely see anything posted about 2250+ unless we are talking hw mods & exotic cooling like you mentioned.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> I know, but I'm asking because I need to hear whether anyone is able to get 2250 or more on water. I'm almost certain it's impossible without a hardware mod for voltage, power, etc. I've tested 7 1080s and none could go past the low 2.2s on water, which is just barely above the 2.1x you see on air.


I've posted this before, and this was on air. This card will be flying here later.


----------



## chiknnwatrmln

Is that the card I bought from you?

Edit: it is. It'll be under water this Friday. I'm psyched to see what she can do!


----------



## Nicklas0912

Hello.

I have an EVGA GTX 1080 ACX 3.0.

Can anyone make a BIOS to raise the power target to 140% instead of 120%?


----------



## Martin778

No BIOS mods possible yet.


----------



## Nicklas0912

Quote:


> Originally Posted by *Martin778*
> 
> No BIOS mods possible yet.


What a shame...

I'm hitting 120% TDP at only 1970MHz.

It's this card: http://www.evga.com/Products/Product.aspx?pn=08G-P4-6181-KR

What if I flash the card to the Superclocked version? It's the same board.

Does anyone know if that has a higher TDP?


----------



## Vellinious

Quote:


> Originally Posted by *Nicklas0912*
> 
> What a shame...
> 
> I'm hitting 120% TDP at only 1970MHz.
> 
> It's this card: http://www.evga.com/Products/Product.aspx?pn=08G-P4-6181-KR
> 
> What if I flash the card to the Superclocked version? It's the same board.
> 
> Does anyone know if that has a higher TDP?


No, they don't.


----------



## Benjiw

I can't believe that after all this time people still don't understand how these cards work with GPU Boost, and why their speeds are all over the place or their scores are lower than they should be.


----------



## Synthetickiller

Quote:


> Originally Posted by *Benjiw*
> 
> I can't believe that after all this time people are still not understanding how these cards work with GPU boost and why their speeds etc are all over or scores are lower than they should be.


It's fairly intuitive, at least to me & I've never even owned a GPU with GPU Boost before now.


----------



## Juub

Is it me, or does this card just suck in SLI? My 980 SCs were great in SLI. One 1080 is supposed to come close to a pair of 980s, which can handle almost anything at max settings at 1440p/60fps. You'd think that with two 1080s, 4K/60fps would be easily achievable, but they struggle a lot to do that.


----------



## ucode

Quote:


> Originally Posted by *Benjiw*
> 
> I can't believe that after all this time people are still not understanding how these cards work with GPU boost and why their speeds etc are all over or scores are lower than they should be.


Yep, I'm one of those people who don't understand everything. Please explain *why* memory clocks are designed to run well below the manufacturer's spec when running GPU computational tasks.


----------



## Martin778

Well, my opinion on this is that the whole turbo/boost idea is nothing short of stupid and used only for marketing purposes.

The 1080 with 1 resistor bypassed and massively increased fan RPM seems to stay above 2000MHz all the time.


----------



## Nicklas0912

Quote:


> Originally Posted by *Vellinious*
> 
> No, they don't.


My card can only do +150 on core and +670 on memory, and only if I have the fan speed at 50%. If I take it to 60% or more, it will crash because it hits the power target limit.

I'm really disappointed with this card. I've always had the Classified version, and now I know for sure I'll never try the smaller versions again.

I will sell these 2 cards and get Titan XP SLI instead.


----------



## juniordnz

Why does everybody keep saying "my card does +150 on core, is that good?"

It doesn't matter how much you can add on core; all that matters is what you achieve after that. My FTW is 2025MHz out of the box and I can get +89 to work without any crashes. That adds up to 2114MHz, and that's what really matters. If you can add +150 and get 2100MHz, that doesn't mean you have a better card than mine just because you can add more to it.

Just stop saying "my card can do +WHATEVER on core"; it means nothing.
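The arithmetic behind that point can be made concrete. Only the resulting clock (factory boost plus offset) is comparable between cards; the factory boost figures below are illustrative examples, not vendor specs:

```python
# Offset means nothing by itself: final clock = factory boost + offset.
# (GPU Boost then shifts the real clock around this by temp/power bins,
# but for comparing two cards the resulting clock is what counts.)

cards = {
    # name: (factory boost MHz, stable offset MHz) - illustrative numbers
    "EVGA FTW": (2025, 89),
    "Founders Edition": (1950, 150),
}

for name, (boost, offset) in cards.items():
    print(f"{name}: +{offset} offset -> {boost + offset} MHz")
```

With these numbers the FTW ends up at 2114MHz and the FE at 2100MHz: the bigger "+150" belongs to the slower card.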


----------



## Nicklas0912

Quote:


> Originally Posted by *juniordnz*
> 
> Why does everybody keep saying "my card does +150 on core, is that good?"
> 
> It doesn't matter how much you can add on core; all that matters is what you achieve after that. My FTW is 2025MHz out of the box and I can get +89 to work without any crashes. That adds up to 2114MHz, and that's what really matters. If you can add +150 and get 2100MHz, that doesn't mean you have a better card than mine just because you can add more to it.
> 
> Just stop saying "my card can do +WHATEVER on core"; it means nothing.


The problem is, my card at +150 is only 2025/2000MHz.

It can't do more before it hits the power limit. I have tried another card, the same model, that can do 2100 without hitting the power target limit.

It's funny that one of the cards hits the power target limit at a lower clock and the other does not.


----------



## toncij

Quote:


> Originally Posted by *Nicklas0912*
> 
> My card can only do +150 on core and +670 on memory, and only if I have the fan speed at 50%. If I take it to 60% or more, it will crash because it hits the power target limit.
> 
> I'm really disappointed with this card. I've always had the Classified version, and now I know for sure I'll never try the smaller versions again.
> 
> I will sell these 2 cards and get Titan XP SLI instead.


You won't have better overclocking on a Titan XP, au contraire, you'll have worse. TXPs are pretty much power and voltage starved cards with a lot of lost potential.


----------



## KillerBee33


Quote:


> Originally Posted by *toncij*
> 
> You won't have better overclocking on a Titan XP, au contraire, you'll have worse. TXPs are pretty much power and voltage starved cards with a lot of lost potential.


How so? It boosts from 1417MHz to 2100MHz, against the GTX 1080's 1607MHz to 2100MHz.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> How so? It boosts from 1417MHz to 2100MHz, against the GTX 1080's 1607MHz to 2100MHz.


It's the same architecture, no worries. The base clock is irrelevant. The lower base clock is there only because of the power limits.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> It's the same architecture, no worries. The base clock is irrelevant. The lower base clock is there only because of the power limits.


But where do you find it worse? On its own, a TXP runs at around 1858MHz.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> But where do you find it worse? On its own, a TXP runs at around 1858MHz.


Well, I've been able to overclock 1080s (5 out of 6) to 2179 or 2205. I've only tried 2 TXPs, but I haven't seen one that does that much.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Well, I've been able to overclock 1080s (5 out of 6) to 2179 or 2205. I've only tried 2 TXPs, but I haven't seen one that does that much.


Mine ran at 2126 with a Hybrid kit, at around 60 degrees. Better cooling is on its way, and the TXP's memory, unlike the 1080's, is well worth clocking higher than +500.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Mine ran at 2126 with a Hybrid kit, at around 60 degrees. Better cooling is on its way, and the TXP's memory, unlike the 1080's, is well worth clocking higher than +500.


I can overclock the memory to 600 or 650, but I'm starving the GPU for power that way. Power limits are even worse on the TXP. You should start seeing worse performance if you go high.


----------



## pez

If going on base clock to final OC, then yes, from what I've seen, the Titan XP generally OC's better (percentage-wise) than the 1080.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> I can overclock the memory to 600 or 650, but I'm starving the GPU for power that way. Power limits are even worse on the TXP. You should start seeing worse performance if you go high.


It's a single VRel from start to end, just like the 1080 was. I just don't see how one can say it's worse, is all. Also, as I've seen on the 1080, going over +500 on the memory will only cap the performance.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> It's a single VRel from start to end, just like the 1080 was. I just don't see how one can say it's worse, is all. Also, as I've seen on the 1080, going over +500 on the memory will only cap the performance.


True, but some aftermarket 1080s work better toward that end. Have you seen a TXP running 2.2? I guess not.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> True, but some aftermarket 1080s work better toward that end. Have you seen a TXP running 2.2? I guess not.


Have you ever seen ANY Titan clocking higher than the lower-end GPUs?


----------



## GreedyMuffin

I remember the days when you could add a little extra voltage without needing to worry about the power limit... You're almost forced to run the card at stock voltage.

Seems like 2200 at 1.093V might work. Not a bad card, but it could have been better, I guess, if the power limit were a non-issue.


----------



## juniordnz

Voltage does nothing except add heat and raise TDP. My FTW's wall is 2126MHz; 2138MHz is not stable at 1.062V or 1.093V.


----------



## GreedyMuffin

I guess I'll go back to 2088 at 0.975V or 2012 at 0.900V. My core clock was not consistent at all at 2202, not even at 2138/1.050V. Disappointed.


----------



## rakesh27

Hey guys,

I'm a little late to the party. Anyway, I recently got a Zotac AMP Edition 1080, and I was overclocking using MSI Afterburner and the latest NVIDIA drivers. Is this normal: if I push the GPU past 2100, I get in-game crashes? Memory is at +450 so far, not fully tested.

I've got voltage and power turned all the way up. Should I use a different overclocking tool than the usual MSI Afterburner?

Thanks all..


----------



## DStealth

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I guess I'll go back to 2088 at 0.975V or 2012 at 0.900V. My core clock was not consistent at all with 2202, not even at 2138/1.050V. Disappointed.


Yep.
Less than 2.2GHz is quite disappointing...


----------



## juniordnz

Quote:


> Originally Posted by *rakesh27*
> 
> Hey guys,
> 
> Im alittle late to the party, well anyways i recently got a Zotac Amp Edition 1080, and i was overclocking using msi and latest nvidia drivers. Is this normal... if i push gpu past 2100 i get in game crashes and memory not tested so far is at 450....
> 
> ive got voltages and power turned all the way up, should i use a different overclocking tool then the usual msi afterburner ?
> 
> Thanks all..


You just got a mediocre card; that's normal. Not all cards will do 2100+ core. Also, adding voltage does nothing good for stability; just assess your maximum overclock at the stock 1.062V.

I recommend GTA V to test for stability; no other game would crash my overclocks like that. Stable at 2138MHz in Firestrike, but I had to lower it to 2114MHz to be GTA V stable. Mems go up to +600, but performance deteriorates after +575.
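That "deteriorates after +575" pattern is commonly attributed to GDDR5X error detection and retry: past a point, the memory keeps running but spends cycles retransmitting, so scores fall even though nothing crashes. A hedged sketch of how you'd find the sweet spot empirically; `run_benchmark` is a placeholder for whatever benchmark loop you actually use:

```python
# Sketch: sweep memory offsets and keep the one with the best score.
# Past the sweet spot, error-retry overhead makes scores *fall* even
# though the card appears stable, so you search for the peak score,
# not the highest offset that doesn't crash.

def run_benchmark(mem_offset_mhz):
    # Placeholder: in reality, apply the offset (e.g. in Afterburner)
    # and run Firestrike or a game loop, returning its score.
    ...

def find_sweet_spot(offsets, scores):
    """Return the offset with the highest score from a completed sweep."""
    best_score, best_offset = max(zip(scores, offsets))
    return best_offset

# Example sweep shaped like the behaviour described above:
offsets = [0, 200, 400, 500, 575, 600]
scores = [25000, 25300, 25550, 25650, 25700, 25500]  # drops past +575
print(find_sweet_spot(offsets, scores))  # -> 575
```

The numbers here are invented to mirror the post; the point is the search strategy, not the specific offsets.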


----------



## rakesh27

Thanks dude...

Oh well, I was a little pissed when I bought the 1080. Even though I got a really good deal, they just announced the 1080 Ti, and that's the card I really want... On the 1080 I got, I stuck a Kraken G10 and a Corsair H75 cooler, and temps are really low. I did the same before with my EVGA 980 Ti hybrid card, and I loved not having to worry about temps.

Thanks for helping. I'm happy so far; I'll just wait for the 1080 Ti hybrids, then sell the whole lot.


----------



## juniordnz

Wait, what?

The 1080ti has been announced? Where? When?


----------



## Tdbeisn554

I just bit the bullet and bought a 1080 Classified. The Ti is really tempting, but the 1080 is already 800€ here, so that's kinda my max anyway. Coming from a GTX 770, I think I'll be more than satisfied. Plus, I'm still on 1080p (gonna upgrade the monitor probably this year, just not sure whether 1440p or 4K).


----------



## Fediuld

Quote:


> Originally Posted by *juniordnz*
> 
> Wait, what?
> 
> The 1080ti has been announced? Where? When?


Nah, rumour mill.

NV just said that next year they're going to refresh the Pascals (aka rebadge them), and Volta was pushed to 2018.


----------



## chiknnwatrmln

I'm really not impressed with the power delivery on the reference design.

For a $700 video card, it should be able to push over 200 watts; the missing phase makes me feel like NVIDIA skimped a bit on the 1080.

It seems the lack of power is what's limiting many overclocks from going higher.


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm really not impressed with the power delivery on the reference design.
> 
> For a $700 video card it should be able to push over 200 watts - the missing phase makes me feel like Nvidia skimped a bit on the 1080.
> 
> It seems that the lack of power is what's limiting many overclocks from going higher.


The only limiting factor in power delivery is in the BIOS... and yes, it's a limiting factor in most 1080 overclocks. The other limiting factor, and probably the most important one, is core temps.
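The core-temp point matches the "-13 or -26MHz" throttles reported earlier in the thread: Pascal's GPU Boost steps the clock down in roughly 13MHz bins as the core crosses temperature thresholds. A rough model of that binning; the threshold temperatures here are illustrative assumptions, not NVIDIA's exact table:

```python
# Rough model of Pascal GPU Boost thermal binning: for each temperature
# threshold crossed, the boost clock steps down by one ~13 MHz bin.
# Threshold values are illustrative assumptions.

BIN_MHZ = 13
THRESHOLDS_C = [38, 45, 52, 60, 68, 76, 84]  # illustrative

def boosted_clock(max_boost_mhz, core_temp_c):
    """Boost clock after dropping one bin per threshold crossed."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if core_temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2100, 35))  # -> 2100 (water-cooled, no bins lost)
print(boosted_clock(2100, 65))  # -> 2048 (air, 4 bins = 52 MHz down)
```

This is why water cooling holds the boosted clock even when it can't raise the maximum stable clock.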


----------



## Rikuo

Quote:


> Originally Posted by *Vellinious*
> 
> The only limiting factor in power delivery, is in the bios....and yes, it's a limiting factor in most 1080s overclocks. The other limiting factor, and probably the most important one, is core temps.


35C temps here.

Can't go over 2150MHz.

/e LOL, if you type in F M L, it sensors it on OCN? DAFUQ?


----------



## Jim86

Sensors?
Quote:


> Originally Posted by *Archang3l*
> 
> I just bit the bullet, and bought a 1080 Classified, the Ti is really tempting but 1080 is 800€ here already, so that is kinda my max anyway. And coming from a GTX 770 I think I will be more than satisfied. + I am still on 1080P (Gonna upgrade monitor probably this year, just not sure 1440P or 4K)


I'd go with 1440p. I think 4K is pretty dumb on PC monitors; for TVs it's great, but it's not really needed on screens this small. Plus, you'd need some serious GPU horsepower to drive 144Hz.


----------



## Iceman2733

I'm looking for some advice. I'm going to buy two 1080s and want to know which would be the best choice to put under water. I hate to spend a bunch on a cooler and RGB lighting just to tear it right off. I was also thinking about the Seahawk, but it's cheaper to do it all myself. I would like to jump to EVGA; the trade-up program seems amazing. I have MSI right now and don't think I want another MSI product, to be honest, so I'm not sure where that leaves me.

Sent from my SM-N930V using Tapatalk


----------



## tps3443

Quote:


> Originally Posted by *Iceman2733*
> 
> I am looking for some advice going to buy two 1080s and wanting to know what would the best choice of one to put under water? I hate to spend a bunch on a cooler and rgb lighting to tear it right off. Also was thinking about the Seahawk but cheaper to do it all myself. I would like to jump to EVGA the trade up program seems amazing I have MSI right now and don't think I want another MSI product to be honest so not sure where that leaves me.
> 
> Sent from my SM-N930V using Tapatalk


I would buy a GTX 1080 FE and install an EK block.

The board partner is not such a big deal. I purchased an EVGA GTX 1070 SC ACX, and it had several issues.

Now I have an MSI GTX 1080 Founders Edition, and I love it!

They all overclock about the same, too, so if you're going to watercool, I would personally get a reference FE model, just because waterblocks are widely available.

There are a few people with custom-board cards with (2) 8-pins, etc., and no one has released a compatible water block yet.

Not to mention a reference FE cooler can easily be sold for $80-$100, so you practically get the waterblock for free.


----------



## Rikuo

Quote:


> Originally Posted by *tps3443*
> 
> I would buy a GTX 1080 FE and install an EK block.
> 
> The board partner is not such a big deal. I purchased an EVGA GTX 1070 SC ACX, and it had several issues.
> 
> And now I have an MSI GTX 1080 Founders Edition. And I love it!
> 
> They all overclock the same too, so if you're going to watercool it, I would personally get a reference FE model, just because water blocks for it are widely available.
> 
> There are a few people who have custom-board cards with (2) 8-pins etc., and no one has released a compatible water block yet.
> 
> Not to mention a reference FE cooler can easily be sold for $80-$100. So you practically get the water block for free.


EK has blocks out for pretty much every brand of non-reference 1080.

Currently they all OC the same because of BIOS limitations. When that gets fixed, the FE will be limited by its single 8-pin.


----------



## Vellinious

Quote:


> Originally Posted by *Rikuo*
> 
> 35c temps here
> 
> Cant go over 2150mhz
> 
> /e LOL if you type in F M L, It sensors it in OCN? DAFUQ?


Sounds about right... you got a decent clocker, but not a great one. At 2150 on the reference board, you're most likely hitting the power limit perf cap anyway. Which, again, is a BIOS problem, not a power delivery problem.

Quote:


> Originally Posted by *Rikuo*
> 
> EK has blocks out for pretty much every brand of non-reference 1080.
> 
> Currently they all OC the same because of BIOS limitations. When that gets fixed, the FE will be limited by its single 8-pin.


No....with a good PSU, you can pull 300 watts from a single 8 pin. Pascal isn't going to be limited by a single 8 pin. = )


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Vellinious*
> 
> Sounds about right... you got a decent clocker, but not a great one. At 2150 on the reference board, you're most likely hitting the power limit perf cap anyway. Which, again, is a BIOS problem, not a power delivery problem.
> No....with a good PSU, you can pull 300 watts from a single 8 pin. Pascal isn't going to be limited by a single 8 pin. = )


Do you think the power delivery system on the reference board could safely handle more than 120% TDP, assuming temperatures are kept under control?


----------



## KillerBee33

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Do you think the power delivery system on the reference board could safely handle more than 120% TDP, assuming temperatures are kept under control?


Here's another way of looking at Pascal:
GTX 980: 1127MHz + 35% **BIOS CLOCK** = 1554MHz in MOST CASES
GTX 1080: 1601MHz + 35% SOFT CLOCK = 2150MHz in MOST CASES
Basically nVidia took away the FUN of DO IT YOURSELF, but gave us Boost 3.0, which isn't great but not bad either, with the same performance.


----------



## juniordnz

I just really miss the ability to bake the overclock and fan profiles into the BIOS. It was nice not having to deal with Afterburner. It's kind of messy if you try to change anything on the curve instead of just adding a clock offset: the first time you apply it, it may be OK, but after a restart the curve will always shift up or down; it doesn't follow EXACTLY what you told it to do.

It was nice setting each clock's voltage in MBT, and that also worked great.


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Do you think the power delivery system on the reference board could safely handle more than 120% TDP, assuming temperatures are kept under control?


There's no doubt in my mind that it'll handle it just fine. My bigger concern would be with people using crap PSUs to push that much power to it.

That said, I'd be absolutely shocked to see anyone with above-ambient cooling push a Pascal to the point that they need anything more than 280 watts set as the power limit. Time will tell.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Vellinious*
> 
> There's no doubt in my mind that it'll handle it just fine. My bigger concern would be with people using crap PSUs to push that much power to it.
> 
> That said, I'd be absolutely shocked to see anyone with above-ambient cooling push a Pascal to the point that they need anything more than 280 watts set as the power limit. Time will tell.


Can't wait for them BIOS mods


----------



## juniordnz

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Can't wait for them BIOS mods


Don't hold your breath, we're probably never getting it.


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Can't wait for them BIOS mods


I'm still hopeful, but.....starting to get a little concerned that it's not going to happen.


----------



## chiknnwatrmln

Huh. I looked into the shunt mod, specifically the one where people put CLU on the resistor, but I don't know how comfortable I am having runny liquid metal sitting on my video card's PCB.

The things we go through for that last 50 MHz


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Huh. I looked into the shunt mod, specifically the one where people put CLU on the resistor, but I don't know how comfortable I am having runny liquid metal sitting on my video card's PCB.
> 
> The things we go through for that last 50 MHz


Yeah, I'm not messing with that either. Worst comes to worst, I sell my FTWs and get a couple of Classifieds. They get a 245 watt power limit, as opposed to the 215 of the FTW.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah, I'm not messing with that either. Worst comes to worst, I sell my FTWs and get a couple of Classifieds. They get a 245 watt power limit, as opposed to the 215 of the FTW.


I don't believe that would be a wise move; it's just not necessary.

We all know that voltage does nothing to improve stability on these cards. The 280W (215W stock + 30% power limit on the slave BIOS) we get on the FTW is more than enough to hold a stable 1.062V through every imaginable stress situation.
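The 280W figure is just the stock board power scaled by the power slider, e.g. (a minimal sketch; the 215W stock TDP and 130% cap are the FTW numbers quoted in this thread, and the linear scaling is how Afterburner/Precision X present the slider, not a verified BIOS detail):

```python
# Sketch of the power-slider arithmetic: the slider is a percentage of
# the stock board power. Assumption: linear scaling, as the OC tools
# present it; exact behavior is BIOS-dependent.
def power_limit_watts(stock_tdp_w: float, slider_percent: float) -> float:
    """Board power limit in watts for a given power-slider setting."""
    return stock_tdp_w * slider_percent / 100.0

# FTW slave BIOS: 215 W stock with the slider maxed at 130%
print(round(power_limit_watts(215, 130)))   # 280
```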


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> I don't believe that would be a wise move; it's just not necessary.
> 
> We all know that voltage does nothing to improve stability on these cards. The 280W (215W stock + 30% power limit on the slave BIOS) we get on the FTW is more than enough to hold a stable 1.062V through every imaginable stress situation.


Mine hit the power limit under heavy loads... I never mentioned anything about voltage; I don't much care about voltage, because I'm not running sub-ambient. I'm just looking for more power limit without doing a hard mod. The Classy gets it there.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Mine hit the power limit under heavy loads.... I never mentioned anything about voltages....don't much care about voltage, because I'm not running sub-ambient. I'm just looking for more power limit, without doing a hard mod. The Classy gets it there.


I just mentioned voltages because I thought you might say: "but at 1.093V I'm hitting power limit...". Anyway, I could never go past 120% with anything I would throw at it @1.062V. Did you hit the limit with synthetics or any other 3d application? I'm just curious, really.


----------



## chiknnwatrmln

I hit the limit on my FE playing games such as Fallout 4. Granted I'm also at 1.093v but at lower voltages the card is not stable at higher clocks.


----------



## Rikuo

Quote:


> Originally Posted by *Vellinious*
> 
> Sounds about right... you got a decent clocker, but not a great one. At 2150 on the reference board, you're most likely hitting the power limit perf cap anyway. Which, again, is a BIOS problem, not a power delivery problem.
> No....with a good PSU, you can pull 300 watts from a single 8 pin. Pascal isn't going to be limited by a single 8 pin. = )


With a modded BIOS, sure.

But the max wattage according to the spec is 150W for the 8-pin and 75W for the PCI-E slot. So 225W total.

Also, I have non-reference MSI cards.
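For what it's worth, the in-spec budget is simple addition (a sketch; the per-connector figures are the PCI-E specification ratings cited above, and as the thread points out they are spec ratings, not hard electrical limits):

```python
# Spec power budget for a graphics card: the slot plus auxiliary
# connectors. These are PCI-E specification ratings; good connectors
# and PSUs can deliver well beyond them in practice.
PCIE_SLOT_W = 75      # PCI-E x16 slot, per spec
PIN8_W = 150          # 8-pin PEG connector, per spec
PIN6_W = 75           # 6-pin PEG connector, per spec

def board_budget(n_8pin: int = 0, n_6pin: int = 0) -> int:
    """Total in-spec board power: slot plus auxiliary connectors."""
    return PCIE_SLOT_W + n_8pin * PIN8_W + n_6pin * PIN6_W

print(board_budget(n_8pin=1))   # reference GTX 1080: 225 W
print(board_budget(n_8pin=2))   # dual 8-pin custom boards: 375 W
```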


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> I just mentioned voltages because I thought you might say: "but at 1.093V I'm hitting power limit...". Anyway, I could never go past 120% with anything I would throw at it @1.062V. Did you hit the limit with synthetics or any other 3d application? I'm just curious, really.


4k synthetics. And it wasn't constant, but enough to notice. The extra 30 watts at 100% would probably get rid of it entirely.


----------



## Vellinious

Quote:


> Originally Posted by *Rikuo*
> 
> With a modded BIOS, sure.
> 
> But the max wattage according to the spec is 150W for the 8-pin and 75W for the PCI-E slot. So 225W total.
> 
> Also, I have non-reference MSI cards.


I was responding to this portion of your statement, where you said: "Currently they all OC the same because of bios limitations, When that gets fixed FE will be limited by the 1x 8pin." Which isn't true. We all know the limitations of the stock bios.


----------



## wardo3640

Quote:


> Originally Posted by *juniordnz*
> 
> Wait, what?
> 
> The 1080ti has been announced? Where? When?


http://www.fool.com/investing/2016/09/15/nvidia-corporation-geforce-gtx-1080-ti-specs-leak.aspx

Tada!!!


----------



## AllGamer

Quote:


> Originally Posted by *wardo3640*
> 
> http://www.fool.com/investing/2016/09/15/nvidia-corporation-geforce-gtx-1080-ti-specs-leak.aspx
> 
> Tada!!!


I'm not surprised to see it, as it has been like that with nvidia for many years.

However, the news is just kind of _*too soon*_.

They pretty much released the Titan version of the GTX1080 back to back, and now they even want to throw in the Ti version of the GTX1080... that's a dumb way to do business IMO.

It's most likely a prototype. In the past, the Ti version comes out after something new has been released (say, for example, a GTX1180), and then they put out the Ti version of the previous model, the GTX1080Ti.

So, this time, not having something in the middle before the Ti version is a bit unexpected.


----------



## LiquidHaus

If anyone is interested in trying out new thermal paste for their air-cooled cards, I'd highly recommend Thermal Grizzly's Kryonaut.

Dropped temps by 6-9°C.


----------



## Fediuld

Quote:


> Originally Posted by *lifeisshort117*
> 
> If anyone was interested in trying out new thermal paste for their air cooled cards, I'd highly recommend Thermal Grizzly's Kryonaut.
> 
> 
> 
> Dropped temps by 6-9c


Yep, writing the same here. I dropped 10°C at idle and 13°C under load after repasting the EK Ectotherm.
(watercooled)


----------



## GreedyMuffin

Quote:


> Originally Posted by *Fediuld*
> 
> Yep, writing the same here. I dropped 10°C at idle and 13°C under load after repasting the EK Ectotherm.
> (watercooled)


Perhaps I should replace mine as well, as mine runs a tad too hot.


----------



## pantsoftime

Quote:


> Originally Posted by *AllGamer*
> 
> It's most likely a prototype. in the past Ti versions comes out after another new release of Something new has been release, say for example GTX1180, then they put out the Ti version of the previous model GTX1080Ti.


I have no idea what gave you that impression.

The 780Ti launched on Nov 7 2013 and the 980 came out on Sep 18 2014
The 980Ti launched on Jun 2 2015 and the 1080 came out on May 27 2016

Is it too soon for a 1080Ti? Yes. Does the 1180 have to launch first? No.
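A quick check of those gaps from the dates above (a sketch; 30.44 days is just the average month length):

```python
# Launch-gap arithmetic for the dates listed in the post above.
from datetime import date

launches = {
    "780 Ti": date(2013, 11, 7),
    "980":    date(2014, 9, 18),
    "980 Ti": date(2015, 6, 2),
    "1080":   date(2016, 5, 27),
}

def months_between(a: date, b: date) -> float:
    """Approximate months between two dates (30.44 = average month)."""
    return (b - a).days / 30.44

print(round(months_between(launches["780 Ti"], launches["980"]), 1))   # ~10 months
print(round(months_between(launches["980 Ti"], launches["1080"]), 1))  # ~12 months
```

So the Ti cards lived roughly ten to twelve months before the next x80 arrived.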


----------



## max883

I used Thermal Grizzly Kryonaut and the thermal mod with an updated BIOS on my EVGA GTX 1080 ACX 3.0 SC. Fan on a custom curve in MSI Afterburner, 120% power and +50 GPU.

Before, I had an MSI 980 Ti Gaming. Temps were 75°C at 66% fan speed.

Now, with the EVGA GTX 1080 ACX 3.0 SC, temps are 60°C at 45% fan speed!


----------



## tps3443

Quote:


> Originally Posted by *Rikuo*
> 
> EK has blocks out for pretty much every brand of non-reference 1080.
> Currently they all OC the same because of BIOS limitations. When that gets fixed, the FE will be limited by its single 8-pin.


I doubt it will be limited. The PCI-E slot provides 75 watts per spec and the 8-pin another 150 watts, and both can actually deliver more than that.

So you get 225 watts in spec by default, and can pull even more.

My card draws about 205 watts overclocked in 4K. So we've got plenty of room!


----------



## tps3443

Quote:


> Originally Posted by *pantsoftime*
> 
> I have no idea what gave you that impression.
> 
> The 780Ti launched on Nov 7 2013 and the 980 came out on Sep 18 2014
> The 980Ti launched on Jun 2 2015 and the 1080 came out on May 27 2016
> 
> Is it too soon for a 1080Ti? Yes. Does the 1180 have to launch first? No.


The GTX 980 was out for about 20 months before the GTX 1080 came out, and roughly 2 years before there was real availability.

It looks like they are releasing a Ti version about 6 months after the GTX 1080 was released. So it will have a longer life span.

The GTX 980 Ti was only out for about 12 months before here came the GTX 1080.


----------



## tps3443

Anyways, the GTX 1080 Ti looks pretty promising. I think it will offer 25-30% over my GTX 1080, default vs. default.

It's supposedly releasing in very early January. So, should I sell my GTX 1080 in December?

I'd like to get around $525-$550, add the difference, and buy a GTX 1080 Ti.

Think it is worth it?


----------



## AllGamer

Quote:


> Originally Posted by *tps3443*
> 
> The gtx980 was out for 22 months before gtx1080 came out. And roughly 2 years before there was availability really.
> 
> Looks like they are releasing a Ti version 6 months after gtx1080 was released. So, it will have a longer life span.
> 
> The gtx980Ti was only out for about 10 months. And here comes gtx1080.


Yup, that's exactly what I was thinking about.

Well, in that aspect I guess it does make sense, if they want to give the GTX1080Ti a longer life span before the next big thing comes around and overshadows it, like the GTX1080 is doing to the GTX980Ti right now.


----------



## Juub

Quote:


> Originally Posted by *tps3443*
> 
> Anyways, the GTX1080Ti looks pretty promising. I think it will offer 25% - 30% over my GTX1080. Default vs. default.
> 
> They are released in very early January. So, should I sell my GTX1080 in December?
> 
> I'd like to get around $525-$550. And, I'm going to add the difference, and buy a GTX1080Ti.
> 
> Think it is worth it?


Thinking about the same. Although the GTX 1080s are very good, they fall a bit short of running every game at 4K/60fps. A 25-30% boost would really make 4K/60fps a reality for all current games and would leave some headroom for upcoming games in the next year or two. Unless they do something about the voltage and power delivery to allow for higher overclocks, the 1080 even in SLI won't be quite good enough for no-compromises 4K in the future (it already isn't).


----------



## tps3443

Quote:


> Originally Posted by *max883*
> 
> I used Thermal Grizzly Kryonaut on my EVGA GTX 1080 ACX 3.0 SC, and my fan speed never goes above 40%.
> 
> Fan on auto in Precision X, 120% power, +50 GPU and +250 mem.
> 
> Before, I had an MSI 980 Ti Gaming. Temps were 75°C at 66% fan speed.
> 
> Now, with the EVGA 1080 ACX SC, temps are 76°C at 38% fan speed!
> 
> With 60% fan speed, temps drop to 61°C.


I'm going to order some of this for my GTX 1080 Founders Edition. I'm hoping to see a big difference.

The FE cooler's contact area over the die is terrible. The coolers still work sufficiently well, and they're really not all that bad; I'm a big fan of the FE cards.

But the heatsink is very rough, with gaps and holes where it touches the die. I imagine lapping it and using Thermal Grizzly would improve temps by at least 10°C.


----------



## tps3443

Quote:


> Originally Posted by *Juub*
> 
> Thinking about the same. Although the GTX 1080's are very good, they fall a bit short of running every game at 4K/60fps. A 25-30% boost would really make 4K/60fps a reality for all current games and would leave some headroom for upcoming games in the next year or two. Unless they do something about the voltage and power delivery to allow for higher overclocks, the 1080 even in SLI won't be quite good enough for 4K with no compromises in the future(it already isn't).


I'm not really sure if it will be worth it. The Ti will be $900.00, so once first released they will be out of stock and there will be price gouging.

If I could get $575 for my card and add $325, that would be a little better.

I think Nvidia is doing something different with the Ti, maybe a huge power limiter, like 135% or even 150%.

And they'll run at a solid 2.4GHz lol.

That would be worth it.


----------



## Juub

Quote:


> Originally Posted by *tps3443*
> 
> I'm not really sure if it will be worth it. The Ti will be $900.00, so once first released they will be out of stock and there will be price gouging.
> 
> If I could get $575 for my card and add $325, that would be a little better.
> 
> I think Nvidia is doing something different with the Ti, maybe a huge power limiter, like 135% or even 150%.
> 
> And they'll run at a solid 2.4GHz lol.
> 
> That would be worth it.


Why would the 1080 Ti be 900$? It'll be 699$ tops. Pascal is 30$ more expensive than Maxwell for the same model. There's no way you're selling a 900$ card. That's Titan's territory. They won't go above 749$ if they wanna sell anything.


----------



## VSG

Quote:


> Originally Posted by *Juub*
> 
> Why would the 1080 Ti be 900$? It'll be 699$ tops. Pascal is 30$ more expensive than Maxwell for the same model. There's no way you're selling a 900$ card. That's Titan's territory. They won't go above 749$ if they wanna sell anything.


Find me a GTX 1080 that costs $30 more than what the GTX 980 did, which in turn was priced higher than the GTX 680 to begin with.


----------



## Vellinious

I agree... at the price point the 1080 released at, I believe the 1080 Ti will be around $850-$900 for the Founders Edition.


----------



## Juub

Quote:


> Originally Posted by *geggeg*
> 
> Find me a GTX 1080 that costs $30 more than what the GTX 980 did, which in turn was priced higher than the GTX 680 to begin with.


Quote:


> Originally Posted by *Vellinious*
> 
> I agree...at the price point that the 1080 released at, I believe the 1080ti will be around $850 - $900 for the Founder's Edition.


Sorry, I meant 50$, not 30$. I was talking about the MSRP; can't do much about sellers price gouging us. The 1080 is supposed to retail for 599$, but the cheapest I've seen was 629$, I believe. I believe the 1080 Ti will launch at an MSRP of 799$ for the Founders Edition and 699$ for non-reference models. With retail price hikes, then yeah, we're likely to see some hit 900$.

I hope NVIDIA will drop the 1080 to 549$ for the non-reference and 649$ for the reference. It would leave some headroom to price the 1080 Ti at 699$. Would be even better if they don't pull that FE pricing crap again.


----------



## jleslie246

I'll just leave this here.

Ti Classified


----------



## Vellinious

Quote:


> Originally Posted by *Juub*
> 
> Sorry, I meant 50$, not 30$. I was talking about the MSRP; can't do much about sellers price gouging us. The 1080 is supposed to retail for 599$, but the cheapest I've seen was 629$, I believe. I believe the 1080 Ti will launch at an MSRP of 799$ for the Founders Edition and 699$ for non-reference models. With retail price hikes, then yeah, we're likely to see some hit 900$.
> 
> I hope NVIDIA will drop the 1080 to 549$ for the non-reference and 649$ for the reference. It would leave some headroom to price the 1080 Ti at 699$. Would be even better if they don't pull that FE pricing crap again.


Uh, no. The FE opened at $699 retail. It's still selling at that price on the NVIDIA website.

http://www.geforce.com/hardware/10series/geforce-gtx-1080


----------



## Juub

Quote:


> Originally Posted by *Vellinious*
> 
> Uh, no. The FE opened at $699 retail. It's still selling at that price on the NVIDIA website.
> 
> http://www.geforce.com/hardware/10series/geforce-gtx-1080


Yeah, but the non-FE MSRP is 599$, hence why I said the MSRP is supposed to be 599$.


----------



## moustang

Quote:


> Originally Posted by *Juub*
> 
> Why would the 1080 Ti be 900$? It'll be 699$ tops. Pascal is 30$ more expensive than Maxwell for the same model. There's no way you're selling a 900$ card. That's Titan's territory. They won't go above 749$ if they wanna sell anything.


You actually think the 1080 Ti is going to launch for $100 cheaper than the 980 Ti launch price, despite the fact that the 1080 launched for $50 more than the 980?

Keep on dreaming. The $900 estimate is probably spot on.


----------



## VSG

Quote:


> Originally Posted by *Juub*
> 
> Yeah, but the non-FE MSRP is 599$, hence why I said the MSRP is supposed to be 599$.


True, but no card has launched at that supposed MSRP of $599.


----------



## Juub

Quote:


> Originally Posted by *moustang*
> 
> You actually think the 1080 Ti is going to launch for $100 cheaper than the 980 Ti launch price, despite the fact that the 1080 launched for $50 more than the 980?
> 
> Keep on dreaming. The $900 estimate is probably spot on.


No. The 980 Ti MSRP was 649$. I expect the 1080 Ti to be around 679-699$ MSRP for non-FE models; the FE will be 769$-799$. The problem with that is it would make the 1080 Ti the same price as the FE 1080, which is almost impossible. Here's hoping NVIDIA knocks 50$ off the 1080 and brings the non-FE to 549$ and the FE to 649$. It'd be crazy if the 1080 Ti non-FE was 799$ and the FE 899$. Don't think people would flock to them.


----------



## Vellinious

Quote:


> Originally Posted by *Juub*
> 
> No. The 980 Ti MSRP was 649$. I expect the 1080 Ti to be around 679-699$ MSRP for non-FE models; the FE will be 769$-799$. The problem with that is it would make the 1080 Ti the same price as the FE 1080, which is almost impossible. Here's hoping NVIDIA knocks 50$ off the 1080 and brings the non-FE to 549$ and the FE to 649$. It'd be crazy if the 1080 Ti non-FE was 799$ and the FE 899$. Don't think people would flock to them.


We'll see when they release. My bet is on 850-900 for the FE.


----------



## istudy92

Just wanted to post my new 1080s inside the new EVGA case


----------



## RJacobs28

Quote:


> Originally Posted by *istudy92*
> 
> 
> 
> Just wanted to post my new 1080s inside the new EVGA case


That looks absolutely mint!!


----------



## Barterlos

Hi guys, I sometimes get strange behaviour from my GTX 1080 FE when my PC has been running for 2-3 days without shutting down... the Power Target just goes wild.


----------



## PasK1234Xw

Quote:


> Originally Posted by *istudy92*
> 
> 
> 
> Just wanted to post my new 1080s inside the new EVGA case


Would look so much better without all the branding. EVGA really needs to chill. It will look even more obnoxious when you throw in the EVGA SLI bridge.


----------



## Tdbeisn554

A bit unfortunate that you can't select cards other than the reference/Founders Edition. Is this going to change?


----------



## tps3443

Quote:


> Originally Posted by *Juub*
> 
> Why would the 1080 Ti be 900$? It'll be 699$ tops. Pascal is 30$ more expensive than Maxwell for the same model. There's no way you're selling a 900$ card. That's Titan's territory. They won't go above 749$ if they wanna sell anything.


They said it will be $899.00 MSRP, and the GTX 1080 Ti is releasing very early January 2017.

It will be very similar to the Titan X Pascal, which is why it is more expensive: same memory size as the Titan X Pascal, with only around 3300-something CUDA cores. The Ti is more expensive this time around.

Yes, it will be $899.

That is $300 less than the Titan X Pascal.

Now, the old Titan was only $1000, so the GTX 980 Ti was like $699 or $749.99.

The GTX 1080 Ti is more money, although it is very close to the Titan X Pascal, with only 256 fewer CUDA cores.

The Ti is usually released a year after the first card; this time it looks like 6 months after the GTX 1080 launched.

Either way, it's gonna be a beast. But it will be expensive: $200 more than last time, just like the Titan X Pascal.


----------



## xartic1

Quote:


> Originally Posted by *tps3443*
> 
> They said it will be $899.00 MSRP, and the GTX 1080 Ti is releasing very early January 2017.
> 
> It will be very similar to the Titan X Pascal, which is why it is more expensive: same memory size as the Titan X Pascal, with only around 3300-something CUDA cores. The Ti is more expensive this time around.
> 
> Yes, it will be $899.
> 
> That is $300 less than the Titan X Pascal.
> 
> Now, the old Titan was only $1000, so the GTX 980 Ti was like $699 or $749.99.
> 
> The GTX 1080 Ti is more money, although it is very close to the Titan X Pascal, with only 256 fewer CUDA cores.
> 
> The Ti is usually released a year after the first card; this time it looks like 6 months after the GTX 1080 launched.
> 
> Either way, it's gonna be a beast. But it will be expensive: $200 more than last time, just like the Titan X Pascal.


I have to stop you and ask where the $899 price came from


----------



## tps3443

Quote:


> Originally Posted by *xartic1*
> 
> I have to stop you and ask where the $899 price came from


Think about the pricing for a minute. The GTX 1080 has only been out for almost 5 months now, and the Titan X Pascal price has increased 20% over last generation. Why do you think they did that?

If a GTX 1070 is $449, a GTX 1080 is $699, and a GTX Titan X Pascal is $1200, what falls between the GTX 1080 and the Titan, the slot we have had filled every generation?

Something that costs $300 more than a GTX 1080 but $300 less than the Titan. It will also use the same memory size as the Titan, unlike previous models.

Call me a fool all you want, but it will be $899.00; this is only reasonable common sense. The pricing fits in with it perfectly.

PCGamer, TweakTown, and WCCFTech have all reported the leaked information ahead of CES in January 2017.

http://www.tweaktown.com/news/54165/nvidia-geforce-gtx-1080-ti-teased-january-2017/index.html


----------



## tps3443

http://www.tweaktown.com/news/54165/nvidia-geforce-gtx-1080-ti-teased-january-2017/index.html

Gtx1080Ti $899


----------



## Juub

Quote:


> Originally Posted by *tps3443*
> 
> http://www.tweaktown.com/news/54165/nvidia-geforce-gtx-1080-ti-teased-january-2017/index.html
> 
> Gtx1080Ti $899


Did you read the article?


----------



## danjal

Quote:


> Originally Posted by *tps3443*
> 
> They said it will be $899.00 MSRP, and the GTX 1080 Ti is releasing very early January 2017.
> 
> It will be very similar to the Titan X Pascal, which is why it is more expensive: same memory size as the Titan X Pascal, with only around 3300-something CUDA cores. The Ti is more expensive this time around.
> 
> Yes, it will be $899.
> 
> That is $300 less than the Titan X Pascal.
> 
> Now, the old Titan was only $1000, so the GTX 980 Ti was like $699 or $749.99.
> 
> The GTX 1080 Ti is more money, although it is very close to the Titan X Pascal, with only 256 fewer CUDA cores.
> 
> The Ti is usually released a year after the first card; this time it looks like 6 months after the GTX 1080 launched.
> 
> Either way, it's gonna be a beast. But it will be expensive: $200 more than last time, just like the Titan X Pascal.


How much faster would the Ti be than the normal 1080?


----------



## tps3443

Quote:


> Originally Posted by *Juub*
> 
> Did you read the article?


Yep, I read it. A price drop to compete with Vega.

So an FE will be $649.99 and the FE Ti will be $849.99.

It's gonna be around $900, and I've never heard of Nvidia ever dropping prices on a GTX Titan card. They never have, so I believe that last statement is kinda silly.

I'm buying a GTX 1080 Ti either way, so the cheaper the better. But I have $900 on standby until then.

I personally believe Nvidia's pricing is more than fair right now.


----------



## tps3443

Quote:


> Originally Posted by *danjal*
> 
> How much faster would the ti be than the normal 1080?


It looks like the difference will be about like a GTX 1070 vs. a GTX 1080, maybe a little less.

A GTX 1080 can handle 4K really well; in a lot of games it averages 60, but the minimums are 45-48. The GTX 1080 Ti should achieve about 9-10 FPS higher minimums.

At minimum it will be 15-20% faster. But because of the 384-bit interface and 480GB/s of memory bandwidth, it could be up to 30-35% faster in some cases with good overclocking.

So, I would say at least 15-25% faster.

A well-overclocked GTX 1080 is gonna come close to a GTX 1080 Ti, but overclocking the GTX 1080 Ti is where it will really start to pull ahead.

Once you overclock a GTX 1080 Ti to 11.0GHz-effective memory, it will have about 528GB/s of memory bandwidth. That roughly doubles a GTX 1070's bandwidth.

It will be a force to be reckoned with. Pretty powerful.
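As a quick check of the memory-bandwidth arithmetic, peak bandwidth is the bus width in bytes times the per-pin data rate (a sketch; the 384-bit Ti figures are the rumored specs being discussed here, not confirmed ones):

```python
# Peak memory bandwidth: (bus width in bits / 8) * per-pin data rate.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 8))    # GTX 1070, 8 Gbps GDDR5   -> 256 GB/s
print(bandwidth_gbs(256, 10))   # GTX 1080, 10 Gbps GDDR5X -> 320 GB/s
print(bandwidth_gbs(384, 10))   # rumored Ti, 384-bit      -> 480 GB/s
print(bandwidth_gbs(384, 11))   # Ti memory OC'd to 11 Gbps -> 528 GB/s
```

So an 11 Gbps overclock on a 384-bit bus lands at 528 GB/s, a little over double the 1070's 256 GB/s.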


----------



## Juub

Quote:


> Originally Posted by *tps3443*
> 
> Yep, I read it. A price drop to compete with Vega.
> 
> So an FE will be $649.99 and the FE Ti will be $849.99.
> 
> It's gonna be around $900, and I've never heard of Nvidia ever dropping prices on a GTX Titan card. They never have, so I believe that last statement is kinda silly.
> 
> I'm buying a GTX 1080 Ti either way, so the cheaper the better. But I have $900 on standby until then.
> 
> I personally believe Nvidia's pricing is more than fair right now.


Is English your first language? That article is nothing more than rumors, and the 900$ is what the guy who wrote it thinks. None of that is based on facts.


----------



## tps3443

Quote:


> Originally Posted by *Juub*
> 
> Is English your first language? That article is nothing more than rumors and the 900$ is what the guy who wrote it thinks. None of that is based on facts.


Yes, I read it. Price lowering to compete with Vega...

All I am saying is, there will be a Pascal V2, and that is where a lot of the price drops will come into play.

I just cannot see a GTX 1080 Ti being almost 50% cheaper than a Titan X Pascal.

Previous models were roughly 25-30% cheaper than the Titan variant.

And with the same memory size as the Titan X Pascal, it makes even more sense.

Introducing the GTX 1080 Ti! The same thing as a Titan X Pascal other than 256 fewer CUDA cores, for almost half price?

And my first language is English; I'm using a half beat-to-death cell phone. You must be getting angry, or reality is sinking in that Nvidia loves charging more, because now you're trying to insult me.

I'll bet you that the GTX 1080 Ti is $849-900. I'm so sure of it, I'd mail you my GTX 1080 if I were wrong.


----------



## Martin778

wrong thread


----------



## Juub

Quote:


> Originally Posted by *tps3443*
> 
> Yes, I read it. Price lowering to compete with Vega...
> 
> All I am saying is, there will be a Pascal V2, and that is where a lot of the price drops will come into play.
> 
> I just cannot see a GTX 1080 Ti being almost 50% cheaper than a Titan X Pascal.
> 
> Previous models were roughly 25-30% cheaper than the Titan variant.
> 
> And with the same memory size as the Titan X Pascal, it makes even more sense.
> 
> Introducing the GTX 1080 Ti! The same thing as a Titan X Pascal other than 256 fewer CUDA cores, for almost half price?
> 
> And my first language is English; I'm using a half beat-to-death cell phone. You must be getting angry, or reality is sinking in that Nvidia loves charging more, because now you're trying to insult me.
> 
> I'll bet you that the GTX 1080 Ti is $849-900. I'm so sure of it, I'd mail you my GTX 1080 if I were wrong.


I'm really not trying to insult you. You linked an article and said "899$" like it was a fact, when all it was was the guy who wrote it saying he thinks it's gonna be 899$. You may very well be right; it may cost 900$, though I hope not. At that price point NVIDIA will lose tons of sales. It's way too far above the 650-700$ price tag we are used to, dangerously close to 999$, and it's just a single gaming GPU, not a dual-GPU card or a "professional" one like the GTX 690 or OG Titan.


----------



## TWiST2k

Quote:


> Originally Posted by *Juub*
> 
> I'm really not trying to insult you. You linked an article and said "$899" like it was fact, when it was just the author saying he thinks it's going to be $899. You may very well be right; it may cost $900, though I hope not. At that price point NVIDIA will lose tons of sales. It's way too far above the $650-700 price tag we're used to, dangerously close to $999, and it's just a single gaming GPU, not a dual gaming GPU or a single "professional" GPU like the GTX 690 or the original Titan.


And what, a paper launch at CES? So we MAYBE see them coming out around March, and after stock is legit and the price gougers go away, we'll be able to pick one up in April, LOL. I cannot believe there is so much hype about this **** already. I agree with you all the way, man; it's all conjecture at this point.

Quote:


> Originally Posted by *tps3443*
> 
> Yes. I read it. Price lowering to compete with Vega..
> 
> All I am saying is, there will be a Pascal V2 this is where alot of the price drops will come in to play.
> 
> I just cannot see a GTX1080 Ti being almost 50% cheaper than a Titan X Pascal.
> 
> Previous models are roughly 25-30% cheaper than the Titan variant
> 
> And with the same memory size as the Titan P it makes even more sense.
> 
> Introducing the GTX 1080 Ti! The same thing as a Titan P other than 256 fewer CUDA cores, for almost half the price!
> 
> And my first language is English; I'm just typing on a half-beaten-to-death cell phone. You must be getting angry, or reality is sinking in that NVIDIA loves charging more, because now you're trying to insult me.
> 
> I'll bet you that the GTX 1080 Ti lands at $849-900. I'm so sure of it, I'd mail you my GTX 1080 if I were wrong.


This is the [Official] NVIDIA GTX 1080 Owner's Club, I am sure there is a 1080Ti rumor thread for you to go hang out in bro.


----------



## feznz

Quote:


> Originally Posted by *TWiST2k*
> 
> And a what paper launch at CES? So we MAYBE see them coming out around March and after they get legit stock and price gougers go away we will be able to pick one up in April LOL. I cannot believe there is so much hype about this **** already. I agree with you all the way man, it is all conjecture at this point.
> *This is the [Official] NVIDIA GTX 1080 Owner's Club, I am sure there is a 1080Ti rumor thread for you to go hang out in bro.*


I've been seriously thinking about getting an ASUS Strix 1080. I noticed it has the Hotwire solder points and a 6+8-pin power setup, so I'm just wondering if anyone has had a good go at overclocking this card. It might be the extra ammo to get this thing to 2300 MHz.


----------



## juniordnz

Now that's some seriously worthless discussion... how about we wait and see instead of arguing about rumors and fighting over an educated guess?

Let the *GTX 1080 thread* continue


----------



## ralphi59

Great response juniordnz !


----------



## juniordnz

Guys,

So the best non-conductive TIM we can get today is Kryonaut? I have some CLU lying around, but I'm not comfortable at all using it on the GPU with all those SMDs around the die. I believe one microscopic drop of CLU could wreck the card completely if it got onto those SMDs.

If I could drop temperatures by 5-10°C, my card would lose only one clock bin under stress with the stock cooler.


----------



## AllGamer

Hmm... with the GTX 1080 Ti coming around the corner...

maybe it's time to start selling off the GTX 1080s.

I've still got the 2x MSI GTX 1080 EK X Sea Hawk editions in the box, waiting for the remaining water-cooling parts to arrive.

Does anyone know if there will be a GTX 1080 *Ti* Sea Hawk edition?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *AllGamer*
> 
> hmm... with the GTX1080Ti coming around the corner...
> 
> maybe is time to start selling out the GTX1080
> 
> I still got the 2x MSI GTX1080 EK X Sea Hawk edition in the box, waiting for the remaining water cooling parts to arrive.
> 
> Does anyone know if there will be a GTX1080*Ti* sea hawk edition ?


There will probably be a Sea Hawk EK 1080 Ti.

I'd just keep your 1080s and go SLI since you have them; those 1080 Sea Hawks are nice cards. I had one for a week, but returned it when the Titan XP was announced. I'm a single-card user and like to bench, so I went Titan XP. Nothing but positive things to say about the 1080 EK X; it's a nice-looking card.


----------



## LiquidHaus

Quote:


> Originally Posted by *feznz*
> 
> I've been seriously thinking about getting an ASUS Strix 1080. I noticed it has the Hotwire solder points and a 6+8-pin power setup, so I'm just wondering if anyone has had a good go at overclocking this card. It might be the extra ammo to get this thing to 2300 MHz.


Interesting. Care to elaborate on the Strix's potential?


----------



## galeonki

Quote:


> Originally Posted by *DStealth*
> 
> There is a BIOS w/o Power limits. I can match your TS score with 200mhz lower...
> 
> Ah just saw you're not running 2350 or so in TS but 2240-50 anyway Strix T4 is not slower clock per clock than any other BIOS...


25,943 graphics in Fire Strike with the new drivers, and a personal record in Furmark. I lowered the clock and made the curve more aggressive.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *juniordnz*
> 
> Guys,
> 
> So the best non-conductive TIM we get today is Kryonaut? I have some CLU laying around but I'm not confortable at all to use that on the GPU with all those smd around the die. I believe 1 microscopic drop of CLU could wreck the card completely if it get in contact with those smds.
> 
> If I could drop anything between 5-10ºC that would allow my card to drop only 1 clock under stress with the stock cooler.


I looked around and it seems like Kryonaut is some of the best currently available. I ordered 11g of it and will be installing my block with it tomorrow, I'll let you know how it goes!


----------



## feznz

Quote:


> Originally Posted by *lifeisshort117*
> 
> Interesting. Care to elaborate about the Strix potential?


I was hoping a member here could tell me about their experiences with a 1080 Strix.

I can only go on my ASUS GTX 770 experience with Hotwire: simply put, I out-benched every other quad-core 770 SLI rig in benches like 3DMark and smashed other benches like Valley.
I almost got the top single-card spot, except another member had a golden ASUS card that could bench at 1500 MHz+ on water.

I'm really torn, since I can get a GTX Titan here for the price of two GTX 1080 Strix cards.
I have noticed that a few games scale negatively with GTX 1080 SLI.
So I'm torn between getting two GTX 1080 Strix now, getting a GTX Titan, or waiting a little longer until the 1080 Ti comes out and probably pushes 1080 prices down a bit.
One thing is sure: I'm definitely at a point where a GPU upgrade is worth doing.


----------



## juniordnz

Free PowerLink for 1080/1070/1060 EVGA cards:

http://www.evga.com/articles/01058/evga-powerlink-promotion/

That little thing helps cable management a lot.


----------



## AllGamer

Quote:


> Originally Posted by *juniordnz*
> 
> Free PowerLink for 1080/1070/1060 EVGA cards:
> 
> http://www.evga.com/articles/01058/evga-powerlink-promotion/
> 
> That little thing helps cable management a lot.


Damn! That's great!

I hope they make more of them for the other brands too: ASUS, MSI, Gigabyte, etc.


----------



## Vellinious

I submitted my request. Very cool.


----------



## SauronTheGreat

I upgraded to a Zotac 1080 AMP Extreme and am very impressed with how quiet its fans are.


----------



## TWiST2k

Quote:


> Originally Posted by *AllGamer*
> 
> hmm... with the GTX1080Ti coming around the corner...
> 
> maybe is time to start selling out the GTX1080
> 
> I still got the 2x MSI GTX1080 EK X Sea Hawk edition in the box, waiting for the remaining water cooling parts to arrive.
> 
> Does anyone know if there will be a GTX1080*Ti* sea hawk edition ?


So getting back on topic lasted two posts; glad to see people can still read.


----------



## Juub

Quote:


> Originally Posted by *galeonki*
> 
> 25943 graphics with new drivers on firestrike and my personal record on furmark. I lower clock and change the curve to more agressive.


Whoa nice score. The most I managed was 23K.


----------



## nexxusty

Quote:


> Originally Posted by *juniordnz*
> 
> Free PowerLink for 1080/1070/1060 EVGA cards:
> 
> http://www.evga.com/articles/01058/evga-powerlink-promotion/
> 
> That little thing helps cable management a lot.


Thanks bro. Sent my request in.

Running a Core X9, so it won't look as good as some people's. However, it's free and has capacitors. I'm game.


----------



## LiquidHaus

Quote:


> Originally Posted by *galeonki*
> 
> 25943 graphics with new drivers on firestrike and my personal record on furmark. I lower clock and change the curve to more agressive.
> 
> 
> 
> *snip*


Super impressive. Mind telling me what you're running?


----------



## galeonki

Quote:


> Originally Posted by *lifeisshort117*
> 
> super impressive. mind telling me what your running?


Newest: 26,035 graphics. Lowering the voltage and uncore on the CPU gives me a better graphics score. I think my 600 W be quiet! PSU is too weak for a stable OC on both CPU and GPU; lowering the CPU always gives me higher stable clocks on the GPU (13-26 MHz). I think my PSU's 12 V line isn't good. Lowering the voltage on the GPU gives me the same results and a lower temp (50°C). Strange; I thought higher voltage would give me some extra MHz.

MSI Gaming X 1080
FS graphics score: 26,053
Clock: 2265 MHz
Mem: 11,200 MHz
Voltage: 1.093 V (1.2 V is too high because temps rise and the clock throttles down)
FE BIOS / ASUS Strix OC T4 BIOS

The key is the curve, I think... and the main problem is my weak PSU.


----------



## ucode

Wouldn't be surprised to see a 1080 Ti FE at $1050, with MSRP cards to follow at $900. It's a supplier's market for NVIDIA right now.

That EVGA PowerLink looks tidy. Not sure about the extra caps, though; is the onboard regulation on the EVGA cards lacking? Also, adding extra connectors increases line resistance, but hopefully that's not an issue.

@galeonki congrats on 26k+. Did you try an earlier driver?

FWIW, for me the last two drivers allow higher GPU clocks but give lower scores. The T4 VBIOS scores a little less clock-for-clock than my default FE VBIOS, perhaps memory-related. However, the T4 does allow higher voltage, which in turn allows higher clocks, albeit a small gain for the amount of voltage used. This, coupled with no power limit, lets the T4 VBIOS give me higher scores than the FE VBIOS. Results seem to vary from person to person, so YMMV. Would be nice to see a proper VBIOS mod.


----------



## galeonki

Quote:


> Originally Posted by *ucode*
> 
> Wouldn't be surprised to see a 1080Ti FE at $1050 with MSRP to follow at $900. Suppliers market for now for nVidia.
> 
> That EVGA powerlink looks tidy. Not sure on the extra caps though, is the onboard regulation on the EVGA cards lacking? Also adding extra connectors means increasing line resistance but hopefully not an issue.
> 
> @galeonki congrats on 26k+. Did you try an earlier driver?
> 
> FWIW for me the last two drivers allow higher GPU clocks but give lower scores. The T4 VBIOS scores a little less clock for clock than my default FE VBIOS, perhaps memory related. However the T4 does allow higher voltage which in turn allows higher clocks albeit a small gain for the amount of voltage used. This coupled with no power limit allows the T4 VBIOS to give me higher scores than the FE VBIOS. Results seem to
> vary from person to person so results may be YMMV type results. Would be nice to see a proper VBIOS mod.


Yeah, five scores above 26K; that's regular now. I will try to push more, but I must change the PSU first and stabilize the incoming voltage.

Yeah, for me the ASUS BIOS is better too, because there's no more power-limit "blinking" in games or benchmarks. Still working on voltage, core, and temps; I think this is the end of the road for me on air. We'll see tomorrow when the new PSU comes. But 1.2 V gives me only +13 MHz stable and +10°C. Maybe water cooling...

About drivers: the newest .90 scores very similarly to .70 for me. I tried the older .20; no difference.

Still searching, still trying, still clueless about the curve and Boost 3.0.


----------



## ucode

Running FS Ultra, HWiNFO sees a minimum of 285 W and over 330 W in the first graphics test at 2.2 GHz and 1.2 V, probably a little high for this FE card. Measuring the 12 V rails, the HWiNFO power readings come very close to multimeter readings.

I find the memory strange. As soon as I go over 1.4 GHz (11.2 GT/s), scores drop to those of the default memory clock. Increasing clocks further sees some gain again. It's like some new memory timing takes effect at 1400 MHz...
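
The 1.4 GHz → 11.2 GT/s relationship comes from GDDR5X moving 8 bits per pin per memory-controller clock. A quick back-of-the-envelope sketch (plain arithmetic, not from any tool in this thread; the function names are mine) of how memory clock maps to transfer rate and bandwidth on the 1080's 256-bit bus:

```python
def gddr5x_data_rate_mts(mem_clock_mhz):
    """GDDR5X transfers 8 bits per pin per memory-controller clock,
    so the effective rate in MT/s is clock (MHz) * 8."""
    return mem_clock_mhz * 8

def bandwidth_gbps(mem_clock_mhz, bus_width_bits=256):
    """Theoretical bandwidth in GB/s for a given bus width."""
    return gddr5x_data_rate_mts(mem_clock_mhz) * bus_width_bits / 8 / 1000

# GTX 1080 stock memory clock is 1251 MHz -> ~10 GT/s
print(bandwidth_gbps(1251))            # 320.256
# The 1.4 GHz point discussed above:
print(gddr5x_data_rate_mts(1400))      # 11200 MT/s = 11.2 GT/s
print(bandwidth_gbps(1400))            # 358.4
```

So 1.4 GHz is only about 12% more raw bandwidth than stock, a gain small enough to be eaten by the timing change described above, which would explain the score drop at exactly that point.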


----------



## galeonki

Quote:


> Originally Posted by *ucode*
> 
> Running FS Ultra and HWiNFO see's a minimum of 285W and over 330W in the first graphics test at 2.2GHz and 1.2V, probably a little high for this FE card. Measurements on the 12V rails finds the HWiNFO power readings very close to multimeter readings.
> 
> I find the memory strange. As soon as I go over 1.4GHz (11.2GT/s) then scores drop to that of default memory clock. Increasing clocks further see's some increase in gain again. It's like some new memory timing takes effect at 1400MHz...


So maybe not the FE but some non-reference card will be the key when we get a BIOS editor? I don't know...

About memory: yeah, I saw the same strange behavior from the memory controller. +600 gives me the best performance, +700 and +800 are lower, but +900 is better than +600! Not stable on the ASUS BIOS, very stable on the FE BIOS, but the FE BIOS gives me a power-limit error. It sucks.

PS: I removed the CLU from the shunt resistors, and I think the card behaves more stably after this, or maybe that was placebo.

Too many questions, honestly, and a lot of unknowns.

PS2: What PSU do you have?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *galeonki*
> 
> So maybe not FE but some non reference card will be the key when we get bios editor? I dont know...
> 
> About memory - yeah i saw the same strange behavior of mem controller... +600 give me best performance, +700 and +800 lower, but +900 give me better than +600! Not stable on Asus Bios, very stable on FE bios, but FE bios give me Power Limit error - it sucks
> 
> ps. i remove CLU from shunt resistors - and i think card behave more stable after this, or maybe this was placebo.
> 
> Too many questions honestly
> 
> and a lot of unknows
> 
> ps2. what PSU U have?


+700 and over trigger looser timings, and only around +900 does the extra clock finally outweigh +600 with its tighter timings. That's my guess.
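
That guess can be put into a toy model: the effective score tracks the memory clock, scaled by a timing-efficiency factor that steps down once the controller switches to a looser timing set. All the numbers here (threshold, penalty, base clock) are invented for illustration; the actual timing tables aren't public:

```python
def effective_score(mem_offset_mhz, base_clock_mhz=1251,
                    loose_threshold=700, loose_penalty=0.90):
    """Toy model: score ~ memory clock, scaled down by a timing penalty
    once the offset crosses a (hypothetical) loose-timing threshold."""
    clock = base_clock_mhz + mem_offset_mhz
    efficiency = loose_penalty if mem_offset_mhz >= loose_threshold else 1.0
    return clock * efficiency

for off in (600, 700, 800, 900):
    print(off, round(effective_score(off), 1))
# +600 beats +700 and +800, and only around +900 does raw clock
# win back the timing penalty, matching the behavior reported above
```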


----------



## feznz

Quote:


> Originally Posted by *galeonki*
> 
> So maybe not FE but some non reference card will be the key when we get bios editor? I dont know...
> 
> About memory - yeah i saw the same strange behavior of mem controller... +600 give me best performance, +700 and +800 lower, but +900 give me better than +600! Not stable on Asus Bios, very stable on FE bios, but FE bios give me Power Limit error - it sucks
> 
> ps. i remove CLU from shunt resistors - and i think card behave more stable after this, or maybe this was placebo.
> 
> Too many questions honestly
> 
> and a lot of unknows
> 
> ps2. what PSU U have?


I'm sceptical of the CLU shunt mod; there's simply not enough conductive material to bypass the resistors.

When I did my 770s, I put a big blob of solder on to ensure it was the path of least resistance.
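
That skepticism boils down to parallel resistance: the card senses current as the voltage drop across the shunt, so bridging material only fools the power limit if its resistance is low compared with the shunt's. A hedged sketch of the math (the 5 mΩ shunt and the bridge values are illustrative assumptions, not measured figures for any specific board):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power_fraction(r_shunt, r_bridge):
    """Fraction of the true power the card reports after bridging the shunt:
    the sensed voltage drop (hence reported current) scales with R_eff / R_shunt."""
    return parallel(r_shunt, r_bridge) / r_shunt

R_SHUNT = 0.005  # 5 milliohm shunt, an assumed typical value

# A solid solder blob (~0.5 milliohm assumed): power limit largely defeated
print(reported_power_fraction(R_SHUNT, 0.0005))  # ~0.09
# A thin smear of liquid metal (~50 milliohm assumed): barely any effect
print(reported_power_fraction(R_SHUNT, 0.05))    # ~0.91
```

This is why a thin CLU film over the shunt may do almost nothing while a fat solder bridge works: the bridge has to be well below the shunt's few milliohms to matter.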


----------



## galeonki

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> +700 and over give looser timings. Getting to +900 will finally benefit than over +600 with tighter timings. Thats my guess.


Yeah, like you wrote. I can set +900 on the FE BIOS, but this crappy power limit... A BIOS editor will be the key, like it was on the 9xx cards.
Quote:


> Originally Posted by *feznz*
> 
> I am sceptical of the CLU shunt mod simply not enough conductive material to bypass the resistors.
> 
> When I done my 770s I put a big blob of solder to ensure that was the path of least resistance.


I did the same as you, but without success at OC.

BTW, new personal record: 26,164. Lowered the clock to 2252, but with a much more aggressive curve vs. stock.

And my Time Spy record: 8679.

So I think on my card the aggressive curve is the key.

Still don't know how to raise the memory; maybe the hard mod by Kingpin? Has anyone tested it?


----------



## juniordnz

Just bought an 11-gram tube of Kryonaut. Hopefully it will help cut some of the temps on air; I'm expecting a 5-10°C improvement, worst to best case.

It's coming from Great Britain, though, the cheapest place I could find at £24.70. It will take at least a month to get here.


----------



## Vellinious

Quote:


> Originally Posted by *galeonki*
> 
> Yeah like U wrote. I can set +900 on FE bios but this sh**ty power limit... Bios editor will be the key like was on 9xx cards.
> Ive made the same like U but without success at OC
> 
> btw new personal record 26 164 - lowered clock to 2252 but much more agressive curve vs stock
> 
> 
> 
> and my timespy record 8679:
> 
> 
> 
> so i think on my card - agressive curve is the key
> 
> still dont know how to raise up memory - maybe hard mod by kingpin? Anyone test it?


What are you using for cooling, to get those kinds of clocks?


----------



## galeonki

Quote:


> Originally Posted by *Vellinious*
> 
> What are you using for cooling, to get those kinds of clocks?


The stock MSI Twin Frozr air cooler. I repasted the core with CLU and put 8 small RAM heatsinks on the backplate. Nothing else.

Edit: changing the PSU to an EVGA 1000 P2 gives me 13 MHz more, stable, and new personal records in Time Spy. This card is a beast.


----------



## Vellinious

Quote:


> Originally Posted by *galeonki*
> 
> Stock MSI Twin Frozr air cooler
> 
> 
> 
> 
> 
> 
> 
> . Ive made core repaste to CLU and 8 small ram radiators on backplate. Nothing else.
> 
> edit: changin PSU to EVGA 1000 P2 give me 13mhz more stable and new personal records on timespy
> 
> 
> 
> 
> 
> 
> 
> . This card is a beast


What kind of temps are you seeing, peak, during FS runs? And...what's your ambient?


----------



## galeonki

Quote:


> Originally Posted by *Vellinious*
> 
> What kind of temps are you seeing, peak, during FS runs? And...what's your ambient?


Ambient: 23-24°C
Peak temp: 64°C
Load temp: 61-62°C

When gaming for more than 3 hours, the temp sometimes reaches 65°C and the card downclocks by 13 MHz, but that's still very good on air. I tried higher voltage, but it downclocks very fast because temps go above 70°C with peaks of 75°C, so I stay at 1.093 V.
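
The 13 MHz step mentioned here is GPU Boost 3.0's clock granularity: Pascal moves the core in ~13 MHz bins and sheds one bin at a time as temperature crosses internal thresholds. A rough illustration of that behavior (the threshold temperatures below are made up for the example; the real table is BIOS-dependent and not public):

```python
BIN_MHZ = 13  # Pascal's boost clock step size

def throttled_clock(base_boost_mhz, temp_c, thresholds=(65, 72, 78, 84)):
    """Drop one 13 MHz bin per temperature threshold crossed.
    The thresholds here are illustrative, not NVIDIA's actual values."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - bins_dropped * BIN_MHZ

print(throttled_clock(2265, 60))  # 2265: below every threshold
print(throttled_clock(2265, 65))  # 2252: one bin shed, i.e. the -13 MHz above
```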


----------



## danjal

Quote:


> Originally Posted by *SauronTheGreat*
> 
> I upgraded to a Zotac 1080 AMP Extreme and very impressed how quiet its fans are


I have the AMP Edition 1080; it sits an arm's length away from me on my desk, and I can't hear my machine running even when the fans are at 85%. It overclocks decently too, and the build quality is as good as any. Very satisfied with my Zotac.


----------



## Darkboomhoney

Hi, I saw a test from Hardwareluxx of the GTX 1080 Classified. They used a tool for overclocking that lets you push the voltage to 1.25 V.
Does anybody know the name of this tool? I need it for my Classified to see how high I can push my clocks. Currently I get 2164 MHz core and +490 memory game-stable... but I need more ^^
Sorry, my English is not the best...

I found the test, though only on a German site: http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/40215-evga-geforce-gtx-1080-classified-im-test.html?start=19


----------



## Tdbeisn554

My classy is coming in tomorrow

Super stoked!


----------



## Vellinious

Quote:


> Originally Posted by *galeonki*
> 
> Ambient 23-24
> Peak temp 64
> Load temp 61-62
> 
> When gaming more than 3h temp sometimes reach 65 and downclock -13mhz core but still this is very good on air. I try higher voltage but downclocking very fast bcz temps go above 70 and peak 75. So i stay on 1.093v .


And you get up to 2250+ on the core with those temps?


----------



## galeonki

Quote:


> Originally Posted by *Vellinious*
> 
> And you get up to 2250+ on the core with those temps?


Check my results in TS vs. others with the same config (CPU + GPU):

http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/spy/P/2005/1085/8200?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K

and FS:

http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/fs/P/2005/1085/20821?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K (click "1 GPU" on the quick filter, because between the top entry and me there are many fellas with SLI)

So finally it's not only a higher clock but higher FPS, not like a couple of days ago when I started.


----------



## Vellinious

Quote:


> Originally Posted by *galeonki*
> 
> catch my results on TS vs others in the same conf (cpu+gpu):
> 
> http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/spy/P/2005/1085/8200?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K
> 
> and FS:
> 
> http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/fs/P/2005/1085/20821?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K (click on 1 GPU on quick filter bcz between 1st skit and me is many fellas with sli)
> 
> so finally it is not only a higher clock but higher fps - not like couple days ago when i start


That's gotta be the best 1080 I've ever seen or heard of......ya got lucky. Congrats! = )


----------



## juniordnz

Sell it to some hardcore professional bencher and make some profit off it.


----------



## Synthetickiller

Quote:


> Originally Posted by *galeonki*
> 
> newest - 26 035 graphics - lowering voltage and uncore on cpu give me better graphics score. I think my 600W BeQ PSU is too weak for stable OC on CPU and GPU. Lowering CPU give me always better higher clocks on GPU (13-26mhz). I think my PSU 12v line isnt good. Lowering voltage on GPU give me the same results and lower temp (50C). Strange - i thought that higher voltage give me some extra mhz.
> 
> MSI Gaming X 1080
> FS graphics score: 26053
> Clock: 2265mhz
> Mem: 11200mhz
> Voltage: 1.093 (1.2 is too high bcz temp raise and clock throttle down)
> FE Bios/Asus OC Strix T4 bios
> 
> 
> 
> 
> 
> The key is the curve i think... and the main problem is my bad PSU


Quote:


> Originally Posted by *galeonki*
> 
> Yeah like U wrote. I can set +900 on FE bios but this sh**ty power limit... Bios editor will be the key like was on 9xx cards.
> Ive made the same like U but without success at OC
> 
> btw new personal record 26 164 - lowered clock to 2252 but much more agressive curve vs stock
> 
> 
> 
> and my timespy record 8679:
> 
> 
> 
> so i think on my card - agressive curve is the key
> 
> still dont know how to raise up memory - maybe hard mod by kingpin? Anyone test it?


Thanks for posting the curve. I was setting up a very linear curve, and I want to try this; it might be the best way to break the 2214 MHz wall I've hit for stability. I know I can't catch you, but it's fun trying.

As for the card itself, that's one hell of a card, and on air, no less!
I agree with juniordnz: sell it to the highest bidder and upgrade to a 1080 Ti/Titan XP, unless you just want to benchmark. I'd be curious to see how that score stacks up against Titan XP scores. I know I can look it up, but I don't want to be tempted to buy one, lol.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> Just bought a 11gram tube of Kryonaut. Hopefully it will help decrease some of the temp on air. Expecting 5-10ºC worst-best improvement.
> 
> It's coming from great britain, though. Cheapest place I could find, 24,70GBP. Will take a month to get here at least


I replaced my FTW's paste with either Noctua or MX-4; I can't recall which at this point. It ended up being about the same temp-wise. I have some Kryonaut here as well that I use on my 6700K and have thought about swapping it onto my 1080. Let us all know how your temps fare when you slap that **** on!


----------



## Synthetickiller

Big shout-out to galeonki. I tried his method of forcing higher clocks with a very non-linear core clock curve, and I gained over 60 MHz.

I need to play around with the clocks a little more. I've done absolutely no game testing, but I will do a little tonight.
I broke 2300 MHz (2303 MHz), but 2316 is in no way stable.

A question for everyone here: is there any way to force core clocks? I cannot set any speed between 2278 MHz and 2290 MHz. No matter where I set the curve, the clock moves either up or down depending on which speed it is closer to. Does anyone know anything about this?


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> I replaced my FTW's paste with either Noctua or MX-4, I can't recall at this time. But it ended up being about the same temp wise, I have some Kryonaut here as well that I use on my 6700k but thought about swapping it out on my 1080, let us all know how your temps fair when you slap that **** on!


I will. It will just take a while, lol. If it helps me lose only one clock bin to temp throttling, I'll be satisfied.
Quote:


> Originally Posted by *Synthetickiller*
> 
> Big shout out to galeonki. I tried his method to force higher clocks with a very non-linear core clock curve. I gained over 60mhz.
> 
> I need to play around with clocks a little more. I've done absolutely no game testing, but I will do a little tonight.
> I broke 2300mhz (2303mhz), but 2316 is in no way stable.
> 
> A question for everyone here, is there any way to force core clocks? I cannot set any speed between 2278mhz & 2290mhz. No matter where I set the curve, the clock either moves up or down in speed, depending on how close they are to either speed. Does anyone know anything about this?


Now I'm just gonna have to try that...


----------



## LiquidHaus

Quote:


> Originally Posted by *Synthetickiller*
> 
> Big shout out to galeonki. I tried his method to force higher clocks with a very non-linear core clock curve. I gained over 60mhz.
> 
> I need to play around with clocks a little more. I've done absolutely no game testing, but I will do a little tonight.
> I broke 2300mhz (2303mhz), but 2316 is in no way stable.
> 
> A question for everyone here, is there any way to force core clocks? I cannot set any speed between 2278mhz & 2290mhz. No matter where I set the curve, the clock either moves up or down in speed, depending on how close they are to either speed. Does anyone know anything about this?


Okay, just to recap: how exactly do these curves work?

Does this only work on a certain BIOS?

It seems crazy to me that as soon as the voltage hits 1.093 V you'd force the core way up to 2300 MHz and the core is actually okay with it.

Both of your cards were high clockers from the beginning, weren't they?


----------



## Synthetickiller

Quote:


> Originally Posted by *lifeisshort117*
> 
> Okay just to recap - how exactly are these curves working?
> 
> Is it a certain bios only that this works on?
> 
> It seems crazy to me that as soon as voltage hits 1.093, you'd force the core way up to 2300mhz and the core actually being okay with it.
> 
> Both of your guys' cards were high clockers from the beginning weren't they?


My guess is that the BIOS "wants" to drop the voltage as low as possible while staying stable, to reduce heat. I was able to run Fire Strike at 2278 MHz @ 1.050 V, but only when setting the curve with the big MHz jump. I tried running Doom at 2290 MHz and it was stable for about 7 or 8 minutes, then locked up. 2240 MHz seems stable, but I haven't had time to run it for long. If I went back to a more "natural", far more linear curve, it wouldn't be stable. I don't know why, but I'm surprised by it.

My card was stable at 2214 MHz+ before this. 2278 MHz is the highest fully stable clock, so that's a 64 MHz increase along with my highest Fire Strike scores. 2290 and 2303 run, but scores drop, so there's very, very minor instability. At most I get 90 MHz "stable" in benchmarks with this setup. Take into account that I'm water-cooled and never broke 40°C, so I have no issues with the BIOS's built-in thermal throttling. As for the BIOS, I'm on whatever Zotac shipped with my ArcticStorm.
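
The "aggressive curve" trick amounts to reshaping the voltage/frequency table so the target clock appears at the lowest voltage point that can hold it, instead of applying a uniform offset. A sketch of the idea as plain data (the curve points are invented, and this is not MSI Afterburner's API; in practice you drag the per-point curve in Afterburner's Ctrl+F editor):

```python
# (voltage_V, clock_MHz) points loosely shaped like a Pascal V/F curve (invented)
stock_curve = [(0.80, 1911), (0.90, 2012), (1.00, 2100), (1.05, 2164), (1.093, 2202)]

def aggressive_curve(curve, target_mhz, knee_voltage):
    """Pin every point at or above knee_voltage to target_mhz, so the card
    requests the target clock at the lowest voltage that can hold it,
    cutting heat and power draw versus a uniform offset."""
    return [(v, target_mhz if v >= knee_voltage else mhz) for v, mhz in curve]

flattened = aggressive_curve(stock_curve, target_mhz=2278, knee_voltage=1.05)
print(flattened)
# the 1.05 V and 1.093 V points now both request 2278 MHz
```

The flat top would also fit the bin-snapping described above: GPU Boost quantizes whatever the curve requests to its ~13 MHz steps, so clocks between two bins are unreachable.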


----------



## Vellinious

Quote:


> Originally Posted by *galeonki*
> 
> catch my results on TS vs others in the same conf (cpu+gpu):
> 
> http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/spy/P/2005/1085/8200?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K
> 
> and FS:
> 
> http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/fs/P/2005/1085/20821?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K (click on 1 GPU on quick filter bcz between 1st skit and me is many fellas with sli)
> 
> so finally it is not only a higher clock but higher fps - not like couple days ago when i start


You don't happen to have a GPU-Z sensors-tab screenshot of what the GPU is actually doing during your FS runs, do you? I'm really curious to see what's going on with the voltages/clocks during the runs.

EDIT: I tried playing around with this... it didn't do a thing for the GPU in my rig right now. Still stuck around 2163. I might drop the other one in this weekend and try again; it was a slightly better overclocker.


----------



## Swolern

Anyone having issues with a GTX 1080 and a 144 Hz monitor causing horizontal flickering? It doesn't happen at 120 Hz. I think it's the monitor, but I have seen some threads about the GTX 1080 causing flickering.


----------



## FattysGoneWild

Quote:


> Originally Posted by *Swolern*
> 
> Anyone having issues with a GTX 1080 and a 144hz monitor causing horizontal flickering? Doesnt happen at 120hz. I think it's the monitor, but I have seen some threads with the GTX 1080 causing flickering.


Drivers. There are many flickering reports; NVIDIA is aware of it and trying to address it. The issue obviously exists; I've seen videos people posted showing it. I've never had the issue myself, though, with my Dell S2716DG and GTX 1080.


----------



## juniordnz

Just spent the last hour trying that crazy last-clock-spike method with no success.

Got my card up to 2189 MHz (a 65 MHz increase) but with a decrease in performance compared to 2114 MHz with the normal offset method (600 points less in Fire Strike Graphics, to be specific).

At least for my card, it's a no-go.


----------



## chiknnwatrmln

Well, got my card blocked. She stays under 40°C at full load.

Still get tons of clock fluctuations though, all the way from 2202 down to 2000 MHz... We really need a way to push more power to the card.

http://www.3dmark.com/3dm/15251058

Vs my old 290x + 290 CF setup

http://www.3dmark.com/compare/fs/10373885/fs/8207585


----------



## Swolern

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Drivers. Many flickering reports. Nvidia is trying to address it and aware of it. The issue exist obviously. I have seen videos people posted showing it. I have never had the issue though with my Dell S2716DG and GTX 1080.


Thanks for the reply; that's the same monitor I'm trying to help my buddy figure out.

+ Rep.


----------



## galeonki

Quote:


> Originally Posted by *Vellinious*
> 
> That's gotta be the best 1080 I've ever seen or heard of......ya got lucky. Congrats! = )


It's just the lottery. I had two 1080s from the Polish Komputronik shop: one of them is a beast, the second was crap (coil whine like hell, so I RMA'd it). And of course it's all about how you overclock. When I started posting my high clocks above 2200 here, I always wondered why my performance didn't increase linearly with core clock. Now I know that the stupid Boost 3.0 and the curve are the key, not just the highest clock. And of course the big mystery is the memory; I think without a vRAM mod or a BIOS editor I'm stuck there.
Quote:


> Originally Posted by *juniordnz*
> 
> sell it to some hardcore professional bencher and make some profit of it


Yeah, probably I will, because the Titan X tempts me. But I think my wife would kill me if she knew how much I spent on the last card. She doesn't know about the other one.





Quote:


> Originally Posted by *Synthetickiller*
> 
> As for the card itself, that's one hell of a card & on air, no less!
> I agree with juniordnz, sell it to the highest bidder & upgrade to a 1080Ti/Titan XP unless you just want to benchmark. I would be curious to see how that score stacks up against Titan XP scores. I know I can look it up, but I don't want to be tempted to buy one, lol.
> 
> Thanks for posting the curve. I was setting up a very linear curve. I want to try this. It might be the best way to break this 2214MHz wall that I've hit for stable. I know I can't catch you, but it's fun trying.


I changed the curve a little more and the results are higher, so it depends on the card, I think; try to find the sweet spot between stability and max performance. The last 3 days I've been doing only this: the curve. And the performance increased day after day, until yesterday when I hit the wall. Maybe with a BIOS editor and/or water cooling I can go a little further.
Quote:


> Originally Posted by *lifeisshort117*
> 
> Okay, just to recap: how exactly are these curves working?
> 
> Is it a certain BIOS only that this works on?
> 
> It seems crazy to me that as soon as voltage hits 1.093, you'd force the core way up to 2300MHz and the core is actually okay with it.
> 
> Both of your cards were high clockers from the beginning, weren't they?


On the first day I set 1.000V and a stable 2151MHz core clock using only the sliders, without changing the curve, just locking the voltage with "L".

And that was the first sign this card might be good.
Quote:


> Originally Posted by *Vellinious*
> 
> You don't happen to have a GPUz sensors tab screenshot of what the GPU is actually doing during your FS runs, do you? I'm really curious to see what's going on with the voltages / clocks during the runs.
> 
> EDIT: I tried playing around with this....didn't do a thing for the GPU in my rig right now. Still stuck around 2163. I might drop the other one in this weekend and try again. It was a little better overclocker.


Yeah, I always run GPU-Z in the background. Here's the log:

http://www13.zippyshare.com/v/66hdsVBb/file.html

Tested on Time Spy, +8700 graphics a couple of minutes ago.
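For anyone digging through these GPU-Z sensor logs by hand: the log is plain comma-separated text, so a few lines of Python can summarize the clock/voltage behaviour over a run. The column names and sample rows below are hypothetical (they vary between GPU-Z versions and cards), so adjust them to match your own log file:

```python
import csv
import io
import statistics

# Hypothetical excerpt of a GPU-Z sensor log; real logs use the same
# comma-separated layout but column names differ by version/card.
SAMPLE = """Date , GPU Core Clock [MHz] , GPU Voltage [V] , GPU Temperature [C]
2016-10-05 20:01:01 , 2151.0 , 1.0000 , 38.0
2016-10-05 20:01:02 , 2138.5 , 0.9930 , 39.0
2016-10-05 20:01:03 , 2151.0 , 1.0000 , 39.0
"""

def summarize(log_text):
    """Return min/max/average core clock and peak voltage from a GPU-Z log."""
    reader = csv.reader(io.StringIO(log_text))
    header = [h.strip() for h in next(reader)]
    clk = header.index("GPU Core Clock [MHz]")
    volt = header.index("GPU Voltage [V]")
    clocks, volts = [], []
    for row in reader:
        if len(row) <= max(clk, volt):
            continue  # skip truncated lines at the end of the log
        clocks.append(float(row[clk]))
        volts.append(float(row[volt]))
    return {
        "min_clock": min(clocks),
        "max_clock": max(clocks),
        "avg_clock": statistics.mean(clocks),
        "max_voltage": max(volts),
    }

print(summarize(SAMPLE))
```

Pointing `summarize()` at a full log makes it easy to spot how often the card drops off its peak bin during a benchmark.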


----------



## feznz

Quote:


> Originally Posted by *Darkboomhoney*
> 
> Hi, I saw a test from Hardwareluxx of the GTX 1080 Classified. They use a tool for overclocking that lets you push the voltage to 1.25V.
> Does anybody know the name of this tool? I need it for my Classified to see how high I can push my clock. Currently I get a 2164MHz clock and +490 memory game-stable... but I need more ^^
> Sorry, my English is not the best...
> 
> I found the test only on a German site: http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/40215-evga-geforce-gtx-1080-classified-im-test.html?start=19


It would be the EVBot.
http://www.evga.com/articles/00521/

Good luck getting one; they were discontinued. I believe it was a conflict with NVIDIA over its ability to unlock voltage on the 7xx series of cards without voiding the warranty, but that's my speculation.
Google Translate always works wonders.


----------



## Darkboomhoney

I have the EVBot, but the display is broken... -.-
I think I expressed myself wrong... they use a software tool like the Classified Controller. Is there already a new version out?


----------



## DStealth

Quote:


> Originally Posted by *Synthetickiller*
> 
> Big shout out to galeonki. I tried his method to force higher clocks with a very non-linear core clock curve. I gained over 60MHz.
> 
> I need to play around with clocks a little more. I've done absolutely no game testing, but I will do a little tonight.
> I broke 2300MHz (2303MHz), but 2316 is in no way stable.
> 
> A question for everyone here: is there any way to force core clocks? I cannot set any speed between 2278MHz & 2290MHz. No matter where I set the curve, the clock moves either up or down in speed, depending on how close it is to either speed. Does anyone know anything about this?


What's wrong with your score? 24k is the area where these cards land bare stock, boosting close to 2GHz, not 2300... something is definitely wrong with your result, or the reported clocks and the real ones differ wildly.


----------



## feznz

Quote:


> Originally Posted by *Darkboomhoney*
> 
> i have the evbot but the display is broken...... -.-
> I think I have expressed myself wrong....they use a software tool like the Classified Controller , is already a new version out there?


Not too sure, but I found this BIOS for the Strix with the power limit completely removed.
Others are also reporting that this BIOS works on FE, Zotac and EVGA cards.
Of course, flash it at your own risk.

http://forum.hwbot.org/showthread.php?t=159025


----------



## galeonki

Quote:


> Originally Posted by *DStealth*
> 
> What's wrong with your score? 24k is the area where these cards land bare stock, boosting close to 2GHz, not 2300... something is definitely wrong with your result, or the reported clocks and the real ones differ wildly.


I had the same problem a couple of days ago; it's only a matter of adjusting the curve properly and setting the highest-performing memory clocks.


----------



## destiNATION1337

delete me..ignored anyway


----------



## Darkboomhoney

Quote:


> Originally Posted by *feznz*
> 
> Not too sure, but I found this BIOS for the Strix with the power limit completely removed.
> Others are also reporting that this BIOS works on FE, Zotac and EVGA cards.
> Of course, flash it at your own risk.
> 
> http://forum.hwbot.org/showthread.php?t=159025


thx for help








The problem with this BIOS (Strix T4) is the performance... lower FPS and bench results despite higher clocks...
I think the only way to push the voltage is to wait for a Pascal tweaker tool...


----------



## jedimasterben

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Well, got my card blocked. She's under 40C at full load.
> 
> Still get tons of clock fluctuations though, all the way from 2202 down to 2000MHz... We really need a way to push more power to the card.


We already have one that works flawlessly


----------



## jedimasterben

Quote:


> Originally Posted by *DStealth*
> 
> What's wrong with your score...24k is the area where these cards are crossing bare stock boosting close to 2ghz not 2300...something is definitely wrong with your result or the clocks reported and the real ones highly deffer


They're likely using the Asus Strix XOC BIOS without having an Asus Strix OC card; clock speeds are higher, but performance is much lower on any other card.


----------



## juniordnz

Always nice to remember: we chase performance, not clocks.

I couldn't care less about my card being able to do 2189MHz core if that ends up giving me less performance than my stable 2114MHz OC.
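A quick sketch of why this happens: if a higher offset makes the card bounce off its power limit, the *average* clock over a run (a rough proxy for GPU-bound performance) can end up below a lower but rock-steady OC. The traces below are made-up illustrative numbers, not measurements from any card:

```python
# Why a higher peak clock can lose to a lower steady clock: throttling
# pulls the average effective clock below the stable setting.
def avg_clock(samples):
    """Mean of per-second core-clock samples over a benchmark run (MHz)."""
    return sum(samples) / len(samples)

# Hypothetical traces: a 2189MHz peak that keeps hitting the power limit
# vs a rock-steady 2114MHz run.
spiky  = [2189] * 3 + [2000] * 4 + [2050] * 3   # throttling run
steady = [2114] * 10                            # stable run

print(avg_clock(spiky))   # well below 2114 despite the higher peak
print(avg_clock(steady))
```

So a Fire Strike graphics score tracking the spiky trace can come in lower even though the peak clock reads 75MHz higher.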


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jedimasterben*
> 
> We already have one that works flawlessly


How so?


----------



## jedimasterben

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> How so?


Add small capacitors to resistors RS1, RS2, and RS3, or add some liquid metal over them to reduce their resistance. This will effectively remove power limit from the equation.
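For the curious, the arithmetic behind why this works: the card infers current from the voltage drop across a known shunt resistance, so lowering the effective resistance makes the reported power smaller than the real power. The resistor values below are order-of-magnitude assumptions for illustration, not the actual values on any 1080 PCB:

```python
# Shunt-mod math: the controller measures V_drop across the shunt and
# computes current as V_drop / R_assumed. Adding a parallel conductive
# path lowers the real resistance, so the same current produces a
# smaller drop and the card under-reports its power draw.
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel (ohms)."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005   # 5 milliohm shunt (assumed, illustrative)
R_MOD   = 0.005   # added parallel path, e.g. liquid metal bridge (assumed)

real_current = 30.0                                  # amps actually flowing
v_drop = real_current * parallel(R_SHUNT, R_MOD)     # what the sense line sees
reported_current = v_drop / R_SHUNT                  # controller assumes 5 mOhm
print(reported_current)  # 15.0 A: reported power is halved
```

This is also why going too far is dangerous: the limiter still enforces the same *reported* ceiling, so the real power can be double (or more) what the card thinks it is drawing.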


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jedimasterben*
> 
> Add small capacitors to resistors RS1, RS2, and RS3, or add some liquid metal over them to reduce their resistance. This will effectively remove power limit from the equation.


I thought you were referring to a software method. I've heard about the shunt mod, but I'm not comfortable doing any hardware mods on my new card... yet.


----------



## jedimasterben

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I thought you were referring to a software way. I've heard about the shunt mod but I'm not comfortable doing any hardware mods on my new card... yet


It's really not scary, and it's pretty easy to remove, so there isn't any real downside, which is good because this is the only way to mod Pascal.


----------



## Spieler4

Quote:


> Originally Posted by *Swolern*
> 
> Anyone having issues with a GTX 1080 and a 144Hz monitor causing horizontal flickering? It doesn't happen at 120Hz. I think it's the monitor, but I have seen some threads about the GTX 1080 causing flickering.


Had some vertical issues in SLI on a 144Hz monitor with G-Sync.
What fixed it for me:
Nvidia 3D settings: Preferred refresh rate -> Highest available.
In-game settings: lock FPS at 160 instead of 144. Looks better even if the game only runs at 100-120fps.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jedimasterben*
> 
> It's really not scary and it's pretty easy to remove, so there isn't any real downside, which is good because this is the only way to mod Pascal


My concern is the CLU or whatever dripping onto the PCB. I wonder if a drop of solder would serve the same purpose..


----------



## jedimasterben

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My concern is the CLU or whatever dripping onto the PCB. I wonder if a drop of solder would serve the same purpose..


I soldered a piece of 16AWG solid wire over mine. From what I understand, if the resistance is lowered by too much, the card is assumed to be in an error state and locks clock speeds at their minimums. This info came out after I'd done mine, so I got lucky.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jedimasterben*
> 
> I soldered a piece of 16AWG solid wire over mine. From what I understand, if the resistance is lowered by too much, the card is assumed to be in an error state and locks clock speeds at their minimums. This info came out after I'd done mine, so I got lucky.


I might have to tear apart my card and drip some solder on the resistors.

I just got my block together last night


----------



## Synthetickiller

Quote:


> Originally Posted by *DStealth*
> 
> What's wrong with your score? 24k is the area where these cards land bare stock, boosting close to 2GHz, not 2300... something is definitely wrong with your result, or the reported clocks and the real ones differ wildly.


I've been having driver issues. 372.70 works fine, but I can't even run the benchmark with the 372.90 driver. I need to wipe the drivers & start over.

That could be part of the issue. 3DMark was also not even recognizing what card I was using & now magically does. I'm sort of perplexed.

Quote:


> Originally Posted by *galeonki*
> 
> I had the same problem couple days ago, its only matter of adjusting properly the curve and setting the highest performance memory clocks.


I don't believe I pushed my RAM up for the runs I did; it just sat at stock. Not sure about the last run I did. I did about 10 runs & it gets monotonous, lol.

Quote:


> Originally Posted by *jedimasterben*
> 
> They're likely using the Asus Strix XOC BIOS without having an Asus Strix OC card, clock speeds are higher but performance is much lower on any other card.


No. I'm on whatever bios Zotac ships w/ this card. I haven't flashed the card.


----------



## jedimasterben

Quote:


> Originally Posted by *Synthetickiller*
> 
> No. I'm on whatever bios Zotac ships w/ this card. I haven't flashed the card.


Gotcha, my mistake


----------



## Synthetickiller

It's alright.
I'm confused by it. Like people say, I should have higher scores. My FPS in-game are higher, so something is "working." As for how all of this works, I have no clue.
I removed 372.70 using DDU in safe mode & installed the 372.90 drivers. Same issue.

Any ideas?

Edit:
I reset all overclocks in MSI afterburner.
The card boosts to 1936mhz & the memory is set to 5005mhz.
Firestrike 1.1 on 372.90 is giving me the following scores:
Graphics: 22,817
Physics: 12,665
Combined: 7,942
Final score: 17,450


----------



## moustang

Quote:


> Originally Posted by *Synthetickiller*
> 
> It's alright.
> I'm confused by it. Like people say, I should have higher scores. My FPS in-game are higher, so something is "working." As for how all of this works, I have no clue.
> I removed 372.70 using DDU in safe mode & installed the 372.90 drivers. Same issue.
> 
> Any ideas?
> 
> Edit:
> I reset all overclocks in MSI afterburner.
> The card boosts to 1936mhz & the memory is set to 5005mhz.
> Firestrike 1.1 on 372.90 is giving me the following scores:
> Graphics: 22,817
> Physics: 12,665
> Combined: 7,942
> Final score: 17,450


I think the people telling you that you should have higher scores don't know what they're talking about. Yours are right about where they should be if that's the Main Rig in your signature. I think people are either overestimating the CPU, or simply don't understand how it limits the score. Very few 4790Ks exceed 19,000 without massive tweaking to tailor their systems just for this benchmark, or some really exotic cooling to reach far higher overclocks than most 4790K users can touch.

In fact, I just checked, and the highest score for a 4790K with a single GTX 1080 is 20,043.

To achieve that score their CPU is clocked at 5.1GHz and the GPU at 2190MHz with the memory bus at 1387MHz (a 5548MHz Afterburner setting). Some pretty incredible overclocking there.

I've got almost exactly the same system as you do: same CPU, same motherboard. I've only got 16GB of RAM compared to your 32GB, but it's G.Skill Trident X 2400MHz RAM (faster RAM, and I'm running it at CAS 9).

With my CPU at 4.5GHz, my GPU at 2113MHz and the graphics memory bus at 1377MHz (5508 according to Afterburner), I'm only getting a final score of 18,141 in Fire Strike 1.1.

So your scores really are right about where they should be for your system and settings.

As for your scores when you're overclocking the GPU so high... I'll bet it's simply throttling the GPU back at those speeds. What temp is your GPU hitting under full load at 2300MHz?
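Side note on those memory figures: Afterburner's 5548/5508 readings are just the memory bus clock times four, since GDDR5X transfers four data words per bus clock. A trivial sketch of the conversion:

```python
# GDDR5X moves four data words per memory-bus clock, so the "effective"
# figure tools like Afterburner display is 4x the actual bus clock.
def effective_mhz(bus_clock_mhz, words_per_clock=4):
    """Convert a memory bus clock to the effective data-rate figure."""
    return bus_clock_mhz * words_per_clock

print(effective_mhz(1387))  # 5548, matching the Afterburner reading above
print(effective_mhz(1377))  # 5508
```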


----------



## DStealth

Quote:


> Originally Posted by *jedimasterben*
> 
> They're likely using the Asus Strix XOC BIOS without having an Asus Strix OC card, clock speeds are higher but performance is much lower on any other card.


The T4 Strix BIOS is very efficient clock-for-clock compared to the FE and other cards' BIOSes. Here's how mine is behaving at my 24/7 OC of 2126 core.


----------



## Vellinious

With the clock his GPU was running, he was getting pretty close to the same scores my GPU gets at 2164. At 2300-ish on the GPU, the graphics score should be higher.

Nobody here cares about the overall score... compare graphics scores.
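To see why the graphics sub-score is the number to watch: 3DMark combines the sub-scores with a weighted harmonic mean, so the heavily weighted graphics component dominates while a weak CPU-bound physics score still drags the total down. The weights below are illustrative assumptions, not 3DMark's published coefficients:

```python
# Weighted harmonic mean: low components drag the total down hard, which
# is why a CPU-limited physics score caps the overall number even when
# the graphics score is excellent.
def weighted_harmonic(scores, weights):
    assert len(scores) == len(weights)
    return sum(weights) / sum(w / s for w, s in zip(weights, scores))

# Sub-scores from Synthetickiller's run above; weights are illustrative,
# NOT 3DMark's exact published coefficients.
graphics, physics, combined = 22817, 12665, 7942
overall = weighted_harmonic([graphics, physics, combined], [0.85, 0.10, 0.05])
print(round(overall))
```

With any sensible weighting, the overall lands between the weakest and strongest sub-scores, so two rigs with the same GPU OC can post very different totals purely from the CPU side. Hence: compare graphics scores.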


----------



## feznz

Quote:


> Originally Posted by *Vellinious*
> 
> With the clock on the GPU he was running, he was getting pretty close to the same scores my GPU is at 2164. At 2300ish on the GPU, the graphics score should be higher.
> 
> Nobody cares about the overall score in here.....compare graphics scores.


Would have to agree with that. I was doing some comparisons between a 3570K, 3770K, 6700K, 6850K, 6900K and 6950X, and I quickly lost the CPU-upgrade itch.
Unsurprisingly, there was only a minimal difference with a single GTX 1080 in the Time Spy benchmark when looking only at the GPU FPS, but the overall score was a whole different story.


----------



## galeonki

Quote:


> Originally Posted by *moustang*
> 
> I think the people telling you that you should have higher scores don't know what they're talking about. Yours are right about where they should be if that's the Main Rig in your signature. I think people are either overestimating the CPU, or simply don't understand how it limits the score. Very few 4790Ks exceed 19,000 without massive tweaking to tailor their systems just for this benchmark, or some really exotic cooling to reach far higher overclocks than most 4790K users can touch.
> 
> In fact, I just checked, and the highest score for a 4790K with a single GTX 1080 is 20,043.
> 
> To achieve that score their CPU is clocked at 5.1GHz and the GPU at 2190MHz with the memory bus at 1387MHz (a 5548MHz Afterburner setting). Some pretty incredible overclocking there.
> 
> I've got almost exactly the same system as you do: same CPU, same motherboard. I've only got 16GB of RAM compared to your 32GB, but it's G.Skill Trident X 2400MHz RAM (faster RAM, and I'm running it at CAS 9).
> 
> With my CPU at 4.5GHz, my GPU at 2113MHz and the graphics memory bus at 1377MHz (5508 according to Afterburner), I'm only getting a final score of 18,141 in Fire Strike 1.1.
> 
> So your scores really are right about where they should be for your system and settings.
> 
> As for your scores when you're overclocking the GPU so high... I'll bet it's simply throttling the GPU back at those speeds. What temp is your GPU hitting under full load at 2300MHz?


Like Vellinious said, the graphics score is the key when you OC your card, not the overall or physics. Physics and Combined depend on many things, not only the CPU. My CPU gets better results on a different motherboard, and when I put in faster DDR4 RAM my overall scores are higher too. Look at graphics only: his is too low, like mine was a couple of days ago.

I know many people hate FurMark, but for me it's the first tool for stability testing. When FurMark (GPU stress test and benchmark) gives me stable results, it's time to fire up 3DMark and games. When I change something in the curve, same story: FurMark first, then 3DMark and games.


----------



## galeonki

Quote:


> Originally Posted by *DStealth*
> 
> The T4 Strix BIOS is very efficient clock-for-clock compared to the FE and other cards' BIOSes. Here's how mine is behaving at my 24/7 OC of 2126 core.


Good score, but an invalid one, like the Russian guys on HWBot.


----------



## destiNATION1337

So after reading a lot of topics on several sites, I recently bought a Palit 1080 JetStream!
Sadly, I couldn't find a real answer to a question I still have.

I know that Gainward/Palit/GameRock are basically all the same, so I'm thinking about flashing a Super JetStream BIOS onto my JetStream card.

1. Is this a good idea, or should I try OCing manually?
2. Are there any more benefits from the Super JetStream BIOS (not just higher clocks, e.g. the fan curve)?
3. Any other suggestions? ^^


----------



## galeonki

Quote:


> Originally Posted by *destiNATION1337*
> 
> So after reading a lot of topics on several sites, I recently bought a Palit 1080 JetStream!
> Sadly, I couldn't find a real answer to a question I still have.
> 
> I know that Gainward/Palit/GameRock are basically all the same, so I'm thinking about flashing a Super JetStream BIOS onto my JetStream card.
> 
> 1. Is this a good idea, or should I try OCing manually?
> 2. Are there any more benefits from the Super JetStream BIOS (not just higher clocks, e.g. the fan curve)?
> 3. Any other suggestions? ^^


1. Yeah, you can flash it without risk (if you know how to do it).
2. I don't think so; you will still hit the power limit wall. But you should try. Sometimes a different BIOS gives you a couple more FPS at the same core clock and voltage. For me the best is the FE BIOS, but I use the Asus Strix OC BIOS because it has no power limit.


----------



## sirleeofroy

Quote:


> Originally Posted by *destiNATION1337*
> 
> So after reading a lot of topics on several sites, I recently bought a Palit 1080 JetStream!
> Sadly, I couldn't find a real answer to a question I still have.
> 
> I know that Gainward/Palit/GameRock are basically all the same, so I'm thinking about flashing a Super JetStream BIOS onto my JetStream card.
> 
> 1. Is this a good idea, or should I try OCing manually?
> 2. Are there any more benefits from the Super JetStream BIOS (not just higher clocks, e.g. the fan curve)?
> 3. Any other suggestions? ^^


Thought I would chime in here since I have the Gainward GTX 1080 GLH Edition.

The bottom line is, the equivalent Palit and Gainward cards are exactly the same (e.g. Palit Super JetStream = Gainward GLH, Palit JetStream = Gainward GS): same boards, same clocks, same coolers, etc. The only differences are the cooler shrouds and the naming in the BIOS.

I know a few GS (Golden Sample) owners have flashed the GLH BIOS and gained on clocks, but as far as I know that is the only difference. One would assume the Super JetStream and GLH models are binned to guarantee the higher clocks, so whilst you might well be OK flashing the lower-clocked model, you might reduce your OC headroom or even run into artifacts because the chip isn't quite good enough for the increased clocks.

That said, they have dual BIOS, so you can easily go back if it doesn't work out. I say give it a go!


----------



## destiNATION1337

Quote:


> Originally Posted by *sirleeofroy*
> 
> Thought I would chime in here since I have the Gainward GTX 1080 GLH Edition.
> 
> The bottom line is, the equivalent Palit and Gainward cards are exactly the same (eg; Palit Super Jetsream = Gainward GLH........ Palit Jetstream = Gainward GS). Same boards, same clocks, same coolers etc etc.. The only difference are the cooler shrouds and the naming in the BIOS.
> 
> I know a few GS (Golden Sample) owners have flashed the GLH BIOS and gained on clocks but as far as I know, that is the only difference. One would assume that the Super Jetstream and the GLH models are binned and guarantee the higher clocks, whilst you might well be ok flashing the lower clocked model, you might reduce your OC headroom or even run into artifacts due to the chip not being quite good enough for the increased clocks.
> 
> That said, they have dual BIOS so you can easily go back if it doesn't work out. I say give it a go!


Well... I guess I'll give it a try.

But the warranty is void then, even with dual BIOS, I guess? Isn't it possible to reflash the stock BIOS before claiming warranty, though? ^^

Anyway, did you realise Palit released a new BIOS a few days ago?
http://palit.com/palit/vgapro.php?id=2619&lang=de&pn=NEB1080S15P2-1040J&tab=do

Sadly "only" for the GameRock Premium and Super JetStream, but with my plan of flashing one of them onto my JetStream it would be awesome.

Guess this new BIOS addresses the Micron chip "bug".


----------



## sirleeofroy

Quote:


> Originally Posted by *destiNATION1337*
> 
> Well... I guess I'll give it a try.
> 
> But the warranty is void then, even with dual BIOS, I guess? Isn't it possible to reflash the stock BIOS before claiming warranty, though? ^^
> 
> Anyway, did you realise Palit released a new BIOS a few days ago?
> http://palit.com/palit/vgapro.php?id=2619&lang=de&pn=NEB1080S15P2-1040J&tab=do
> 
> Sadly "only" for the GameRock Premium and Super JetStream, but with my plan of flashing one of them onto my JetStream it would be awesome.
> 
> Guess this new BIOS addresses the Micron chip "bug".


I think your warranty would only be void if you sent it back with an incorrect BIOS; the best thing would be to extract your current BIOS (both of them, to be sure) so you can flash back to stock.

I didn't know there was an updated vBIOS. I just checked the Gainward site and it looks like the BIOS for my card was updated about a week before the Palit one, so I'll extract my current BIOS and try the new one when I get home from work. I hadn't heard of the Micron chip "bug", though...
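For anyone following along, the usual back-up-then-flash sequence with NVIDIA's nvflash tool looks roughly like the below. The exact flags vary between nvflash builds and the filenames are placeholders, so double-check against your version's help output before flashing anything:

```shell
nvflash --save stock_backup.rom   # dump the current vBIOS to disk first
nvflash --protectoff              # disable the EEPROM write protection
nvflash -6 new_bios.rom           # flash; -6 overrides a PCI subsystem ID mismatch
```

Keep that `stock_backup.rom` somewhere safe: on a dual-BIOS card the second BIOS is a fallback, but having your own dump is what lets you return to true stock before any warranty claim.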


----------



## juniordnz

Anyone here successfully flashed the Strix T4 BIOS to a 1080 FTW? I tried long ago and it bricked my card; now I'm not sure I was flashing the correct one.

If any FTW owner has done it successfully, would you be so kind as to link me to the correct BIOS?


----------



## GreedyMuffin

Where's the link to the latest T4 BIOS? I want to flash my FE again.

At 2139 / +500 on mem I'm only getting 24K on graphics score. What's wrong? My CPU is probably not the issue.


----------



## Tdbeisn554

My baby came in the mail yesterday


----------



## juniordnz

Quote:


> Originally Posted by *Archang3l*
> 
> My baby came in the mail yesterday


What a beautiful view, congratz







Please post your "out of the box" max clock and max overclock. Always nice to hear how these little monsters behave


----------



## EDGERRIES

(Attachment: rig.PNG)

Hey guys, can I join the club? 2x Nvidia Founders Edition GTX 1080s. They really do move in 4K, these puppies!


----------



## Tdbeisn554

Quote:


> Originally Posted by *juniordnz*
> 
> What a beautiful view, congratz
> 
> 
> 
> 
> 
> 
> 
> Please post your "out of the box" max clock and max overclock. Always nice to hear how these little monsters behave


Yeah, sure.

I'll probably install it tomorrow and run some tests.


----------



## DStealth

Quote:


> Originally Posted by *galeonki*
> 
> Good score, but an invalid one, like the Russian guys on HWBot.


I don't have a valid key for this benchmark...
Here's a valid Time Spy run where it isn't taken into account: http://www.3dmark.com/spy/430038
I'll try some benching without a key one of these days, but those demos are annoying... just to validate the score...

Edit: Oh, forgot to mention it was made on a non-Strix card... a Palit JS one...


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> Always nice to remember: we chase performance, not clocks.
> 
> I couldn't care less about my card being able to do 2189MHz core if that ends up giving me less performance than my stable 2114MHz OC.


Absolutely, just about there. Fire Strike clocks steady, with what looks like no drops, thermal or otherwise.


Spoiler: Fire Strike rock solid clocks


Hahaha, seriously though, something funky is going on with Pascal clock reporting: running with over ten times the reported GPU/video and memory clocks only gives a little over twice the score.

@galeonki I have a Corsair RM850, bro.


----------



## juniordnz

New driver seems good. Gained about +80 on Fire Strike graphics; getting 25390 @ 2114/+575.

*Anyone on this?*
Quote:


> Originally Posted by *juniordnz*
> 
> Anyone here successfully flashed the Strix T4 BIOS to a 1080 FTW? I tried long ago and it bricked my card; now I'm not sure I was flashing the correct one.
> 
> If any FTW owner has done it successfully, would you be so kind as to link me to the correct BIOS?


----------



## chiknnwatrmln

So wait, what's the deal with the Strix BIOS?

It has no power limit but gives less performance? Is this right?


----------



## DStealth

@juniordnz I have very nearly identical scores with them; here's a comparison, 372.90 vs 373.09.
Tried your efficiency test with the Strix T4 [email protected], but unfortunately my video memory decreases the score past the +535-540 range in this benchmark... seems on par with yours clock-for-clock, I assume.


----------



## juniordnz

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> So wait, what's the deal with the Strix BIOS?
> 
> It has no power limit but gives less performance? Is this right?


Some say it's better clock-for-clock on performance. I wanted to try it, but I haven't seen anyone with an FTW using it.
Quote:


> Originally Posted by *DStealth*
> 
> @juniordnz I have very nearly identical scores with them; here's a comparison, 372.90 vs 373.09.
> Tried your efficiency test with the Strix T4 [email protected], but unfortunately my video memory decreases the score past the +535-540 range in this benchmark... seems on par with yours clock-for-clock, I assume.


Which 1080 do you have, mate? Your sig is outdated.


----------



## DStealth

Palit JetStream flashed with Asus Strix BIOS...


----------



## chiknnwatrmln

I might do some testing later today and compare the standard FE BIOS to the Strix BIOS.

I just wish I could run FS without the demo; it takes so much longer that way.

Maybe I'll try 3DMark 11...


----------



## juniordnz

Quote:


> Originally Posted by *DStealth*
> 
> Palit JetStream flashed with Asus Strix BIOS...


Would you mind posting the T4 here, mate?


----------



## Derek1

Quote:


> Originally Posted by *Synthetickiller*
> 
> Big shout out to galeonki. I tried his method to force higher clocks with a very non-linear core clock curve. I gained over 60MHz.
> 
> I need to play around with clocks a little more. I've done absolutely no game testing, but I will do a little tonight.
> I broke 2300MHz (2303MHz), but 2316 is in no way stable.
> 
> A question for everyone here: is there any way to force core clocks? I cannot set any speed between 2278MHz & 2290MHz. No matter where I set the curve, the clock moves either up or down in speed, depending on how close it is to either speed. Does anyone know anything about this?


This is interesting. You adjusted the voltage curve as galeonki suggested and got clocks over 2300, but the graphics score here is about the same as my FTW set at +125/+400 with the basic curve and voltage set to +75% in Precision X.

Call me confused; galeonki must have God's card. I'm not seeing the relationship between voltage, TDP, clock, memory and performance here. My FTW seems to perform best at +125-135 on the clock and +400-500 on mem with the voltage running around 1.08. My temps never go over 65C on a custom fan curve; anything above those parameters and the drivers start to fail.

(ETA: the temp stuff)


----------



## 66racer

Hi guys,

What's the word on coil whine? I have an EVGA 1080 FTW on 1080p/120Hz (planning on 1440p, but not soon) and I get coil whine at higher FPS. I'm not sure if I should return it for another or talk to EVGA about it first, but wanted to ask your thoughts here. I used a 1080 FE at the same resolution and don't recall hearing the whine, though that was only benching, not gaming. I think my main problem is that the rest of the system is SOOoooo quiet. Can't tell if it's golden yet, as I've only pushed it to 2050MHz; I hear they all seem to do 2100MHz, I'm just low on time to find the max this GPU can do.

So does this sound normal for an FTW 1080? I can still return it. I know it's to be expected with a card turning out such high FPS in games like Overwatch, but it's a little distracting... maybe I should just turn up the game volume, lol.

Thanks!


----------



## GreedyMuffin

Windows Defender is blocking my nvflash...

How can I disable the damn thing?
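Rather than disabling Defender outright, you can whitelist the nvflash folder from an elevated PowerShell prompt. The path below is a placeholder for wherever you unzipped nvflash:

```powershell
# Add-MpPreference / Remove-MpPreference are the built-in Defender cmdlets.
Add-MpPreference -ExclusionPath "C:\Tools\nvflash"
# ...do the flash, then remove the exclusion again:
Remove-MpPreference -ExclusionPath "C:\Tools\nvflash"
```

Removing the exclusion afterwards keeps Defender watching that folder again once you're done flashing.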


----------



## Derek1

My FTW is quiet as a mouse.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *66racer*
> 
> Hi guys,
> 
> What's the word on coil whine? I have an EVGA 1080 FTW on 1080p/120Hz (planning on 1440p, but not soon) and I get coil whine at higher FPS. I'm not sure if I should return it for another or talk to EVGA about it first, but wanted to ask your thoughts here. I used a 1080 FE at the same resolution and don't recall hearing the whine, though that was only benching, not gaming. I think my main problem is that the rest of the system is SOOoooo quiet. Can't tell if it's golden yet, as I've only pushed it to 2050MHz; I hear they all seem to do 2100MHz, I'm just low on time to find the max this GPU can do.
> 
> So does this sound normal for an FTW 1080? I can still return it. I know it's to be expected with a card turning out such high FPS in games like Overwatch, but it's a little distracting... maybe I should just turn up the game volume, lol.
> 
> Thanks!


My FE has a tiny bit of whine at high FPS and an almost inaudible amount at 60fps.

If my volume is off and I listen closely I can hear it, but coming from two 290Xs it's damn near silent.

With the stock blower I couldn't hear the whine at all.


----------



## fjordiales

Quote:


> Originally Posted by *juniordnz*
> 
> Would you mind posting the t4 here mate?


I'm not sure if you want the t4 bios but I have it bookmarked. Here you go.

https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803


----------



## galeonki

Quote:


> Originally Posted by *Derek1*
> 
> This is interesting. You adjusted the voltage curve as galeonki suggested and got clocks over 2300 but graphics score here is about the same as my FTW set at +125, +400 with basic curve voltage set at +75% in Precision X.
> 
> 
> 
> Call me confused. galeonki must have God's card. I am not seeing the relationship between voltage-TDP-Clock-Mem-Performance here. My FTW seems to perform best at +125-135 on the Clock and +400-500 on Mem with the voltage running around 1.8. My Temps are never over 65C on a custom fan curve. Anything above those parameters and drivers start to fail.
> 
> (ETA the Temp stuff)


Another step forward: a 13MHz lower peak clock, but the curve starts higher and is flatter = +200 points on the graphics score, and 2-3 more FPS in games.

26,342 sounds nice, but honestly I see potential to break 26,400.







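To make the "starts higher and more flat" idea concrete, here's a tiny Python sketch. The voltage points and clocks are invented numbers, not read from any real card, and `clock_at` is just a stand-in for how Boost 3.0 picks a point on the curve:

```python
# Illustrative only: instead of one big offset that mostly helps at the
# top voltage point, a "flatter" curve raises the clocks at the lower
# and middle voltage points, so the card keeps more speed when Boost 3.0
# drops the voltage under sustained load. All numbers are made up.

default_curve = {0.900: 1911, 0.975: 2000, 1.043: 2088, 1.093: 2151}
flat_curve    = {0.900: 2025, 0.975: 2063, 1.043: 2100, 1.093: 2138}

def clock_at(curve, voltage):
    """Clock at the highest curve point the card can reach at this voltage."""
    usable = [v for v in curve if v <= voltage]
    return curve[max(usable)]

# At the full 1.093 V the default curve is 13 MHz faster...
print(clock_at(default_curve, 1.093), clock_at(flat_curve, 1.093))  # 2151 2138

# ...but when Boost pulls the voltage down to ~0.975 V under load, the
# flat curve holds a noticeably higher clock -- that sustained clock is
# what shows up as a better graphics score.
print(clock_at(default_curve, 0.975), clock_at(flat_curve, 0.975))  # 2000 2063
```

The trade is the one described above: give up a few MHz at the top point to gain a higher floor at the voltages the card actually sits at during a benchmark run.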


----------



## Dry Bonez

Well, depending on how this Hurricane Matthew goes, I might sell my GTX 1080. I put it somewhere safe because I know it has value and I might need to sell it. I'm probably not even supposed to mention a storm here, but it is GTX 1080 related... how many other 1080 owners are about to go through this storm? I'm a bit scared, not going to lie.


----------



## TWiST2k

Quote:


> Originally Posted by *Dry Bonez*
> 
> well,depending how this hurricane matthew goes,i might sell my gtx 1080,i placed it safe cuz i know it has value and i might need to sell it. im probably not even supposed to mention anything about a storm but it has gtx 1080 related...how many other 1080 owners are about to go thru this storm? Im a bit scared not going to lie.


Hang in there man, stay away from windows and let us know how it goes and keep those bonez dry heh.


----------



## Dry Bonez

Quote:


> Originally Posted by *TWiST2k*
> 
> Hang in there man, stay away from windows and let us know how it goes and keep those bonez dry heh.


Hey, thanks for those words. I boarded up the windows today, but I'm not sure our roof will handle it, and I don't know what to do if it leaks or gives out. Any recommendations?


----------



## DStealth

Quote:


> Originally Posted by *juniordnz*
> 
> Would you mind posting the t4 here mate?


Here you go... from the original thread.
Direct link
Been using it for two months now... actually since day one of having this card.


----------



## galeonki

Quote:


> Originally Posted by *ucode*
> 
> Absolutely, just about there. Fire Strike clocks steady with what looks like no drops thermal or otherwise.
> 
> 
> Spoiler: Fire Strike rock solid clocks
> 
> 
> 
> 
> 
> 
> 
> Hahaha, seriously though something funky with Pascal clocks. Running with over ten times the GPU/Video and memory clocks reported only gives a little over twice the score.
> 
> @galeonki Have Corsair RM850 bro.


Hahahaha, nice score. I think you hit the max here.

Thanks for the info; no coil whine with the Corsair PSU?
Quote:


> Originally Posted by *Dry Bonez*
> 
> hey, thanks fror those words. I boarded the windows today,but im not sure if our roof will handle it and idk what to do if it leaks or gives out. Any recommendations?


I'll keep my fingers crossed for you.


----------



## artemis2307

Got mine brand new for $520.
NVIDIA sure knows how to make sexy hardware.


----------



## destiNATION1337

So... anyone here who owns a 1080 Palit JetStream?

Can I flash a Super JetStream BIOS onto it?


----------



## juniordnz

Quote:


> Originally Posted by *DStealth*
> 
> Here you go...from the original thread
> Direct link
> Using it since 2 months from now...actually since day one i have this card


I'll try that, since I can't remember if it was the T4 or the Classy BIOS that bricked my card. Only one way to find out.

God bless the dual BIOS.


----------



## DStealth

Quote:


> Originally Posted by *destiNATION1337*
> 
> So..someone here who owns a 1080 Palit JetStream?
> 
> Can i flash a SuperJetStream Bios?


Yes


----------



## Koniakki

Quote:


> Originally Posted by *destiNATION1337*
> 
> So..someone here who owns a 1080 Palit JetStream?
> 
> Can i flash a SuperJetStream Bios?


As DStealth said above, yes, and I'll add that I have also flashed a GameRock Premium BIOS to a Super JetStream.

That was a GTX 1070, though, but I'm quite positive it would be even "safer" on a GTX 1080, since all of them use the same Micron RAM.

All running smoothly.


----------



## destiNATION1337

Quote:


> Originally Posted by *Koniakki*
> 
> As DStealth said above, yes and I want to add that I have also flashed GameRock Premium bios to a Super Jetstream.
> 
> It was a GTX 1070 tho but I'm quite positive it would be even "safer" for GTX 1080 since all are using same Micron ram.
> 
> All running smoothly.


Well... according to some investigation, the GameRock version uses 8 phases for the GPU and 2 phases for the memory, so I am not quite sure this is "safe"...


----------



## bloddy

This is what I can get..

http://www.3dmark.com/3dm/15295867


----------



## Vellinious

Quote:


> Originally Posted by *bloddy*
> 
> This is what I can get..
> 
> http://www.3dmark.com/3dm/15295867


That's right about where most of the 1080s are on air.


----------



## Koniakki

Quote:


> Originally Posted by *destiNATION1337*
> 
> Well..according to some investigation, the Gamrock Versionen uses 8 Phase for GPU and 2 Phase for Memory, so i am not quite sure if this is "safe"...


Well, again, I'm quite positive that's just marketing copy.

Even the Palit website advertises 8 phases for both the Super JetStream and the GameRock Premium.

The following is quoted from Palit:

Palit GTX 1080 GameRock Premium:
"The 8 phases for those 2560 cores provides unlimited power and minimized current loading for each phase to stabilize the voltage level , the overall efficiency can be improved, the choke noise and EMI noise also be reduced."

Palit GTX 1080 Super JetStream:
"Unique 8-Phase PWM design reduces max current load for each phase to stabilize the voltage level and improve overclocking ability. It also provides another 30% current capacity"

Now compare both PCBs and tell me which is which.










Source


Source


----------



## bloddy

Quote:


> Originally Posted by *Vellinious*
> 
> That's right about where most of the 1080s are on air.


Yeah, but the issue is that I'm on water, so... I can go over 2200, but it won't stay reliable even at 1.1V.


----------



## LiquidHaus

I'm pretty sure the highest-clocking cards in this thread are watercooled, regardless of whatever BIOS they're running. They were clocking nicely before they were flashed and before they were waterblocked.

This sudden surge of interest in flashing BIOSes happened like 400 pages ago too. It's not going to be a cure-all.

The first thing anyone should do with a Pascal card is watercool it. Temperature is the biggest factor; it causes instability in sustained clocks as well as driver crashes.

Personally, I plan to watercool my 1080 FTW before anything else.

These cards act much differently under water, so I will see how the card behaves once it's under water. After all that, maybe I'll consider the Strix T4 BIOS flash.


----------



## chiknnwatrmln

Watercooling my card made no difference whatsoever in performance.

The card runs a hell of a lot cooler (40C as opposed to 85C) and is silent, but it still hits the power limit and throttles down from its max clock.
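A toy model of what that power-limit throttling looks like: Boost steps the clock down in small bins until the card is back under its power cap. The 13MHz bin size, the linear draw-vs-clock assumption, and every number here are invented for illustration only:

```python
# Toy throttle model: step the clock down in 13 MHz bins until the
# (crudely, linearly) estimated power draw is back under the limit.

def throttled_clock(target_mhz, draw_at_target_w, limit_w, bin_mhz=13):
    clock, draw = target_mhz, draw_at_target_w
    while draw > limit_w:
        clock -= bin_mhz
        # crude assumption: draw scales linearly with clock
        draw = draw_at_target_w * clock / target_mhz
    return clock

# A card asking for 2100 MHz / 200 W against a 180 W cap settles well below it:
print(throttled_clock(2100, 200, 180))  # 1879
# Already under the cap? No throttling at all:
print(throttled_clock(2100, 170, 180))  # 2100
```

Which is why a cooler card can still lose clocks: lower temps remove the thermal cap, but the power cap is a separate ceiling.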


----------



## bloddy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Watercooling my card made no difference whatsoever in performance.
> 
> The card runs a hell of a lot cooler (40c as opposed to 85c) and is silent but it's still power limit and throttle down from its max clock.


30C for me, so at least it's cooling properly; 37C at full load. I've tried the Strix BIOS, but it crashes like hell. And I don't like that it disables one DP output, so I can't do Surround anymore...


----------



## Shogoki

Sorry if this has been asked before, but I have an i5 4460; should I expect to be CPU limited if I get a GTX 1080?
Also, has this card found its match in a game that doesn't run at 100+ FPS at 1080p?


----------



## rdhrdh

I'd love to try out this curve, but unfortunately I can't quite make out your exact placements. Any chance you could map out the main points? It would be greatly appreciated.
Quote:


> Originally Posted by *galeonki*
> 
> Another step forward - lower clock -13mhz but curve starts higher and more flat = +200 points extra on graphics score, and 2-3 fps more in games
> 
> 
> 
> 
> 
> 
> 
> .
> 26 342 sounds nice but truly i see potential to break 26 400
> 
> 
> 
> 
> 
> 
> 
> .


----------



## Vellinious

Quote:


> Originally Posted by *rdhrdh*
> 
> I'd love to try our this curve but unfortunately I can't quite make out your exact placements, any chance you could map out the main parts? It would be greatly appreciated.


The curve is going to be different for every card....copying exactly what someone else has, has a high probability of not working.


----------



## Vellinious

Quote:


> Originally Posted by *bloddy*
> 
> Yea but the issue is that I'm on water so
> 
> 
> 
> 
> 
> 
> 
> ) i can go over 2200 but will not stay reliable even at 1.1V


Check the GPU-Z Sensors tab to make sure you're not hitting the power limit. That will certainly bring performance down as well.


----------



## bloddy

Quote:


> Originally Posted by *Vellinious*
> 
> Check GPUz sensors tab, to make sure you're not hitting the power limit. That will certainly bring performance down as well.


I hit it, and VREL too, but the issue is that it's not stable, not that I'm hitting the power limit... stability is the biggest problem. My card freezes at 1.1V/2200MHz.


----------



## juniordnz

I believe GPU-Z doesn't understand Pascal yet. The VREL we see should be THRML. We are throttling due to temperature, not voltage.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> I believe GPU-Z don't understand pascal yet. The VREL we see should be THRML. We are throttling due to temperature, not voltages.


^^This


----------



## shilka

After going back and forth between the GTX 1070 and GTX 1080, I have decided to go with the GTX 1080 IF I can sell my second GTX 970 for a fair price.
It probably won't be for three weeks, but I am already looking at which GTX 1080 to get.

So far I think my first choice is going to be the EVGA GTX 1080 FTW, both because I think it's the best-looking card and because I want to try an EVGA card this time around.

I am also looking at the Gigabyte G1 Gaming GTX 1080; despite being the cheapest of the GTX 1080s I'm looking at, it's also the most boring and plain-looking one.
The Gigabyte Xtreme Gaming GTX 1080 really turns me off, both because of its price and because the cooler is so huge I'm worried the card will sag over time.

I've been looking at the Asus Strix as well, but that one doesn't appeal to me that much.
Long story short: is the EVGA GTX 1080 FTW a good choice or not?

I own an Asus PG279Q 165Hz NVIDIA G-Sync monitor, so what I'm looking for is a card that can do 165 FPS at 1440p.
It doesn't need to be at the highest settings; I don't play the newest games in their first year anyway.

I can live with lower, but I really want to use the full power of the monitor.
From what I have seen, a GTX 1070 is simply not powerful enough for what I want, which is a shame, as the price is really good.

Edit: I am NOT doing SLI ever again, as my experience with SLI has been terrible over the last two years, so I need ONE card, even if I have to lower settings.


----------



## bloddy

You can get any of them; I don't think the BIOS is mature on any card, even on the Founders Edition... I'm still waiting for the release of some modified BIOS for the FE...


----------



## juniordnz

Quote:


> Originally Posted by *shilka*
> 
> After going back and forth between the GTX 1070 and GTX 1080 i have decided to go with the GTX 1080 IF i can sell my second GTX 970 for a fair price
> Probably wont be for 3 weeks but i am already looking at which GTX 1080 to get
> 
> So far i think my first choice is going to be the EVGA GTX 1080 FTW both because i think its the best looking card but also because i want to try an EVGA card this time around
> 
> I am also looking at the Gigabyte G1 Gaming GTX 1080 and despite the fact its the cheapest of the GTX 1080 i am looking at its also the most boring and plain looking one
> The Gigabyte Extreme Gaming GTX 1080 really turns me off both because of its price and to the fact that the cooler is so huge i am worried that the card will sag over time
> 
> Been looking at the Asus Strix as well but that one does not appeal to me that much
> Long story short is the EVGA GTX 1080 FTW a good choice or not?
> 
> I own an Asus PG279Q 165Hz Nvidia G-Sync monitor so what i am looking for is a card that can do 165 FPS in 1440P
> Does not need to be at the highest settings not that i play the newest games the first year they are out anyway
> 
> I can live with lower but i really want to use the full power of the monitor
> From what i have seen a GTX 1070 is simply not powerfull enough for what i want, which is shame as the price is really good
> 
> Edit: i am NOT doing SLI ever again as my experience with SLI has terrible over the last two years so i need ONE card even if i have to lower settings.


Not even a single 1080 will be able to push 165fps@1440p on new AAA titles unless you compromise a lot on filters/effects. Also, you will need A LOT of CPU power to push a stable 165fps.


----------



## shilka

Quote:


> Originally Posted by *juniordnz*
> 
> Not even a single 1080 will be able to push 165fps@1440p on new AAA titles, unless you compromise a lot on filters/effects. Also, you will need A LOT of CPU power to push stable 165fps.


I believe I stated that I am not playing the newest games and that I am willing to lower settings a bit.
Let's say I want to replay Far Cry 2, 3, Blood Dragon, and 4: would a GTX 1080 be able to do 165 FPS at 1440p in those games?

My CPU is a 6850K and I have 32GB of 3000MHz RAM, so those are not a problem.


----------



## Juub

Quote:


> Originally Posted by *shilka*
> 
> I believe i already said i am not playing the newest games and that i am willing to lower settings a bit
> Lets say i want to re-play Far Cry 2-3-Blood Dragon-4 would a GTX 1080 be able to do 165 FPS in 1440P
> 
> My CPU is a 6850K and i have 32 GB of 3000 MHz RAM so those are not a problem.


Far Cry 2, most likely. Far Cry 3 probably, if you drop some settings. It's still a very demanding game at 1440p.


----------



## shilka

Quote:


> Originally Posted by *Juub*
> 
> Far Cry 2 most likely. Far Cry 3 if you drop down some settings probably. It's still a very demanding game at 1440p.


165 FPS is not a must; even 100 FPS would be way better than the old 60Hz monitor I was stuck with before.
Just Cause 3 is the main game I want to play, and with my old GTX 970 SLI setup I could just hit 50 FPS, which is totally unacceptable for me.

A GTX 1080 can do JC3 at 1440p at about 80 to 100 FPS, which is way better than the 40-50 I was stuck with before.


----------



## Tdbeisn554

I installed and tested my 1080 Classified today, and I am actually a bit disappointed. I ran the Heaven benchmark first to see what my out-of-the-box clocks were, but then the card started to whine really loudly, so I tried Mankind Divided: again, really loud coil whine. Clocks in Heaven were around 1936MHz... I was expecting at least 2100 or so, even after pushing the voltage limit to the max, the power limit to 122%, adding a 100MHz offset to the core, and setting the fan speed to 100%. Ran Heaven again: 1936-ish MHz. It got to 2000 sometimes but would not stay there long.

I was expecting a lot more of this card, to be honest. It is EVGA's top-end GTX 1080 at the moment, with a super heavy power delivery system (14+3 phases, so super overkill), and then I get 1936.

I know about the silicon lottery, and that the cards are not binned, but still.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> I installed and tested my 1080 classified today, and I am actually a bit disappointing, I ran Heaven benchmark first to see what my out of the box clocks where, but then the card started to whine really loud, so I tried mankind divided --> again really loud coil whine. Clock in heaven were around 1936... I was expecting at least 2100 or something even pushed voltage limit to the max, power limit to 122%, added a 100Mhz offset to the core and set the fan speed to 100%. Ran heaven --> 1936-ish Mhz clock... It got to 2000 sometimes but would not stay there long..
> I was expecting a lot more of this card to be honest, it is like EVGA's top end Gtx 1080 at the moment with a super heavy power delivery system (14 + 3 phases,... So super overkill) and then I get 1936
> 
> 
> 
> 
> 
> 
> 
> I know silicon lottery and that the cards are not binned but still


I also got my new 1080 Classified yesterday.

Out of the box, with no adjustments, in Heaven 4.0 (at 2560x1440) it boosts to 1974.

a) If I turn the fan up to 100%, the temperature drops to about 47-49C and the boost then automatically stays at 1999 (2000 reported in Heaven).

b) If I then set the voltage to 100%, the voltage increases to 1.093V and it boosts to 2012-2015MHz at 47-49C.

c) If I then also add a power target of 30% and increase the temp target to 92C, the boost does not increase (since Heaven doesn't push the card anywhere near the Classified's 320W power limit) -- _*Heaven 4.0 is only drawing an average of 176W, so increasing the power limit will not help.*_

d) If I add 150 to the clock offset, it then boosts to 2151MHz and stays there at 50C
while drawing about 10W more (power draw measured with HWiNFO64).

e) [Final step] If I reset the clock offset to 0 and instead use the voltage point/clock speed curve to set boost offsets at each voltage point for a finer-tuned overclock, I am able to get 2200 with a little trial and error. I haven't even sat and played with the voltage curve for long yet; I got the 2200 with only about 30 minutes of playing around.

This was all done with the BIOS switch set to 'slave'.

Archang3l, if you follow the same routine as I did above, do you not get similar results? Are you in a hotter country? I am in the UK, where ambient temperatures are very mild.
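The "trial and error" in step e) is basically a search for the highest offset that still passes a stability run. A rough Python sketch of that loop (the `is_stable` callback is a stand-in for actually running Heaven and watching for artifacts; nothing here talks to a real card, and the step/limit values are arbitrary):

```python
# Hypothetical sketch of the step-e search: raise the offset in small
# steps until the benchmark fails, then keep the last good value.

def find_max_offset(is_stable, start=0, step=25, limit=300):
    """Return the highest tested offset for which is_stable() held, else None."""
    best = None
    offset = start
    while offset <= limit:
        if not is_stable(offset):
            break          # artifacts/crash: stop, keep the last good offset
        best = offset
        offset += step
    return best

# Pretend this particular card starts artifacting above +150:
print(find_max_offset(lambda off: off <= 150))  # 150
```

In practice you'd repeat this per voltage point on the curve, which is exactly why the fine-tuned curve takes longer than a single global offset but buys the last 50MHz or so.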


----------



## NTME9

Quote:


> Originally Posted by *Archang3l*
> 
> I installed and tested my 1080 classified today, and I am actually a bit disappointing, I ran Heaven benchmark first to see what my out of the box clocks where, but then the card started to whine really loud, so I tried mankind divided --> again really loud coil whine. Clock in heaven were around 1936... I was expecting at least 2100 or something even pushed voltage limit to the max, power limit to 122%, added a 100Mhz offset to the core and set the fan speed to 100%. Ran heaven --> 1936-ish Mhz clock... It got to 2000 sometimes but would not stay there long..
> I was expecting a lot more of this card to be honest, it is like EVGA's top end Gtx 1080 at the moment with a super heavy power delivery system (14 + 3 phases,... So super overkill) and then I get 1936
> 
> 
> 
> 
> 
> 
> 
> I know silicon lottery and that the cards are not binned but still


I haven't had a chance to push my cards, but what you describe sounds just like how mine run right out of the box; I was seeing them pop to 2000 for short periods too. I originally was going to go with the Classys, but after reading a few reviews of both, the OC results were basically the same. Like you say, though, one would hope that with all those fancy power delivery bells and whistles you would get better results.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> I'll try that since I can't remember if it was the T4 or Classy that bricked my card, only one way to know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> God bless the dual BIOS


The T4 worked fine for me; the Classy crashed my FTW as well. After reading about your card dying, I actually stopped using the T4 BIOS and went back to the 130% FTW BIOS for fear of them being related, haha.


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> I believe GPU-Z don't understand pascal yet. The VREL we see should be THRML. We are throttling due to temperature, not voltages.


I believe Pascal doesn't understand Pascal yet.

I've had power and thermal limit flags shown at 35C and under 10W. It seems the clocks can read wrong too.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> I also got my new 1080 classified yesterday.
> 
> Out of the box with no adjustments Heaven 4.0 (at 2560x1440) it is boosting to 1974.
> 
> a) If I turn the fan up to 100% temperature will go down to about 47c-49c and boost will then automatically stay at 1999 (2000 reported in Heaven).
> 
> b) If I then set the voltage to 100% the voltage will increase to 1.093v and it then boosts to 2012mhz-2015mhz at temps 47c-49c.
> 
> c) And also If I then add a power target of 30% and increase temp target to 92c the boost does not increase (since Heaven doesn't push the card to anywhere near the Classified's 320w power limit) -- _*Heaven 4.0 is only drawing an avg. 176w so increasing power limit will not help *_
> 
> d) If I add 150 to the clock offset it will then boost to 2151mhz and stay there at 50c
> while drawing about 10w more (power draw measured using HWiNFO64).
> 
> e) [Final Step] If reset the clock offset to 0 then use the voltage point/clock speed curve instead, to set boost offsets at each voltage point for a "finer tuned" overclock I am then able to get 2200 with a little trial and error. I haven't even sat and played around with the voltage curve for long yet -- the 2200 I got with only about 30 minutes "play around" time.
> 
> This was all done with the BIOS switch set to 'slave'.
> 
> Archang3l if you follow the same routine as I have above do you not get similar results? Are you in a hotter country? I am in the U.K where ambient temperatures are very mild.


Gonna try this.

And I am from Belgium, so it's not really hot here...


----------



## tatne

Quote:


> Originally Posted by *juniordnz*
> 
> Just bought a 11gram tube of Kryonaut. Hopefully it will help decrease some of the temp on air. Expecting 5-10ºC worst-best improvement.
> 
> It's coming from great britain, though. Cheapest place I could find, 24,70GBP. Will take a month to get here at least


Did you get better temps? I have the same thermal compound lying around somewhere.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> I also got my new 1080 classified yesterday.
> 
> Out of the box with no adjustments Heaven 4.0 (at 2560x1440) it is boosting to 1974.
> 
> a) If I turn the fan up to 100% temperature will go down to about 47c-49c and boost will then automatically stay at 1999 (2000 reported in Heaven).
> 
> b) If I then set the voltage to 100% the voltage will increase to 1.093v and it then boosts to 2012mhz-2015mhz at temps 47c-49c.
> 
> c) And also If I then add a power target of 30% and increase temp target to 92c the boost does not increase (since Heaven doesn't push the card to anywhere near the Classified's 320w power limit) -- _*Heaven 4.0 is only drawing an avg. 176w so increasing power limit will not help *_
> 
> d) If I add 150 to the clock offset it will then boost to 2151mhz and stay there at 50c
> while drawing about 10w more (power draw measured using HWiNFO64).
> 
> e) [Final Step] If reset the clock offset to 0 then use the voltage point/clock speed curve instead, to set boost offsets at each voltage point for a "finer tuned" overclock I am then able to get 2200 with a little trial and error. I haven't even sat and played around with the voltage curve for long yet -- the 2200 I got with only about 30 minutes "play around" time.
> 
> This was all done with the BIOS switch set to 'slave'.
> 
> Archang3l if you follow the same routine as I have above do you not get similar results? Are you in a hotter country? I am in the U.K where ambient temperatures are very mild.


The moment I go to 2050MHz or more, I get a lot of artifacts and/or Heaven crashes. I mean, for a top-end card like the Classified (which has a huge cooler), 2050 is kind of low... especially when you know some Founders Editions with 5+1 phases, a single 8-pin, and the stock cooler go to 2100+...


----------



## GreedyMuffin

I run 2139 on my 1080 FE on stock 1.050V. Tihi.


----------



## Derek1

Just out of curiosity, does anyone here have the Strix O8G? That's the card I originally ordered back in July, but after waiting two months without that unicorn being seen on the North American continent, I bought the FTW instead. The aggravating part was that ASUS had in the interim released the A8G, which I didn't understand at all. Finally, though, I now see that Newegg here in Canada has the O8G in stock, but they're limiting it to one per customer. I'm interested to see the numbers on the O8G, because it had the highest advertised rated boost out of the box at 1936, followed by the AMP Extreme.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> The moment I go 2050Mhz or more I get a lot of artifacts and/or Heaven crashes frown.gif I mean for a top end card like the Classified (which has a huge cooler) 2050 is kinda low... especially when you know some founders editions with 5+1 phases, single 8 pin and stock cooler go to 2100+ ...


Where are you reading the speed from?


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> I also got my new 1080 classified yesterday.
> 
> Out of the box with no adjustments Heaven 4.0 (at 2560x1440) it is boosting to 1974.
> 
> a) If I turn the fan up to 100% temperature will go down to about 47c-49c and boost will then automatically stay at 1999 (2000 reported in Heaven).
> 
> b) If I then set the voltage to 100% the voltage will increase to 1.093v and it then boosts to 2012mhz-2015mhz at temps 47c-49c.
> 
> c) And also If I then add a power target of 30% and increase temp target to 92c the boost does not increase (since Heaven doesn't push the card to anywhere near the Classified's 320w power limit) -- _*Heaven 4.0 is only drawing an avg. 176w so increasing power limit will not help *_
> 
> d) If I add 150 to the clock offset it will then boost to 2151mhz and stay there at 50c
> while drawing about 10w more (power draw measured using HWiNFO64).
> 
> e) [Final Step] If reset the clock offset to 0 then use the voltage point/clock speed curve instead, to set boost offsets at each voltage point for a "finer tuned" overclock I am then able to get 2200 with a little trial and error. I haven't even sat and played around with the voltage curve for long yet -- the 2200 I got with only about 30 minutes "play around" time.
> 
> This was all done with the BIOS switch set to 'slave'.
> 
> Archang3l if you follow the same routine as I have above do you not get similar results? Are you in a hotter country? I am in the U.K where ambient temperatures are very mild.


The moment I go to 2050MHz or more, I get a lot of artifacts and/or Heaven crashes. I mean, for a top-end card like the Classified (which has a huge cooler), 2050 is kind of low... especially when you know some Founders Editions with 5+1 phases, a single 8-pin, and the stock cooler go to 2100+...
Quote:


> Originally Posted by *nrpeyton*
> 
> Where are you reading the speed from?


EVGA's Precision X, and its OSD overlay in Heaven...


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> The moment I go 2050Mhz or more I get a lot of artifacts and/or Heaven crashes frown.gif I mean for a top end card like the Classified (which has a huge cooler) 2050 is kinda low... especially when you know some founders editions with 5+1 phases, single 8 pin and stock cooler go to 2100+ ...
> EVGA's PrecisionX and the OSD Overlay from precision in Heaven...


I've done a lot of research and even successfully flashed a Strix BIOS to my new Classified in the last few days (I've spent days tinkering and comparing and only 5 minutes playing games), so I'll share my findings as soon as I get in later (I'll have more time to type then). In the meantime, if you *REALLY* want to see the difference between our Classifieds and an FE, and 100% of the other cards out there, *in terms of the power delivery system* (which I can see is important to you): download HWiNFO64 (Google it) and the EVGA OC Scanner from the EVGA site, run the first FurMark GPU core burner from within OC Scanner, and watch the GPU's power draw in HWiNFO. Make sure you're on the 'slave' (aka LN2) BIOS and set the power limit to +30%. Also download GPU-Z and watch 'PerfCap' in the Sensors tab; you'll see that your card will not throttle until it is drawing 320W. This is where our Classifieds will really beat an FE, which is capped at 180W. It really depends on what game you're playing and the environment at the time.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> I've done a lot of research and even successfully flashed a STRIX bios to my new classified in the last few days (I've spent days tinkering/comparing and only 5 minutes playing games) so I'll share ma findings as soon as I get in later (I'll have more time to type then). In the meantime if you *REALLY* want to see the difference between our classified's and a FE edition and also 100% of other cards out there *in terms of its power delivery system* which I can see this is important to you -- download "HWINFO64" - goole search and also the "EVGA OC Scanner" from EVGA site and run the first furmark GPU core burner from within the OC scanner and watch the GPU's power in HWINFO. Make sure your on the "slave" / aka LN2 bios and set power limit to 30%. Also download GPU-z and watch the "PerfCap" in 'sensors' tab and you'll see how your card will not throttle until it is drawing 320W - -this is where our classified's will really beat a FE which is capped to 180w. It really depends on what game your playing and the environment at the time.


Well, I don't really care that much about the power delivery. It's just a shame, and really unfortunate, that an FE with its stock blower-style cooler, 180W TDP, single 8-pin, and a really simple power delivery solution (in comparison with the Classy, that is) beats my Classy... I'm really thinking about RMA'ing it, because it whines the whole time when I game, while the rest of my system is almost dead silent...


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Well I do not really care that much about the power delivery, it is just a shame and really unfortunate that the FE with their stock blower style cooler, 180w TDP, single 8 pin and a really simple (in comparison with the Classy that is) power delivery solution beat my classy... I am really thinking about RMA'ing it cause it whines the whole time when I game while the rest of my system is almost dead silent...


Your Classified could be clocking at 1960MHz and drawing 320W in FurMark, doing 100 FPS.

The FE could be clocked at 1970 and only drawing 180W in FurMark, doing 80 FPS due to power throttling.

Your Classified still wins.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Well I do not really care that much about the power delivery, it is just a shame and really unfortunate that the FE with their stock blower style cooler, 180w TDP, single 8 pin and a really simple (in comparison with the Classy that is) power delivery solution beat my classy... I am really thinking about RMA'ing it cause it whines the whole time when I game while the rest of my system is almost dead silent...


Also, the clock rate is only part of what makes up overall performance; the difference between 2000MHz and 2100MHz on Pascal is probably 1-2 FPS.

If you'd bought a FE and had the same luck in the silicon lottery, you'd have a 2000MHz max-clock FE with a 180W power cap instead of a 2000MHz max-clock Classified with a 320W power limit. I know which one I'd rather have.


----------



## nrpeyton

Anyone with a GTX *Classified 1080* could you tell me what your fan RPM is at 100% ?

Mine is at about 2800rpm -- just want to check it isn't running slower than it should.


----------



## ucode

Yep, if you need to run furmark at 340W+ then FE isn't the card to have.


----------



## dVeLoPe

My 6181 EVGA ACX boosts to 1911MHz stock; with +200 on the core it runs at 2100MHz stable, everything stock except 100% fan speed.


----------



## nrpeyton

I don't disagree, and I have nothing against the FE, but if you took your GPU off the Classified board and repackaged it as a FE it wouldn't perform half as well as it does now.

You could have been just as "unlucky" in the silicon lottery on a FE; with the FE cooler and 5-phase VRM your card wouldn't be getting to where it is now. And even if you put your FE under water, you're still limited by the 180W power limit. I was pulling nearly 260W in a DirectX 12 benchmark the other day.

If you took 10,000 random Classifieds and 10,000 random FEs and compared the averages, the Classifieds would win hands down -- and there is nothing wrong with that, and no reason for a FE owner to refuse to hear that their £600 card isn't as good as a custom 1080, because that custom 1080 owner may have spent a lot more money.

I don't know about America, but here in the U.K. that difference is £150-200, which translates to $186-248.


----------



## jlp0209

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone with a GTX *Classified 1080* could you tell me what your fan RPM is at 100% ?
> 
> Mine is at about 2800rpm -- just want to check it isn't running slower than it should.


My max fan rpm is the same, you're all good.

I see you mention more than once to set the power limit to 30% -- what do you mean? The lowest the power target slider goes in Precision is 36%. Are you really saying 130%? No matter what I adjust in settings, my max stable OC is always 2088MHz. I'm pretty sure if someone ever cracks GPU Boost 3.0 I'll be stable at 2100MHz+. When my voltage dips below 1.093V due to temps going over 60C it can't maintain 2100MHz. It'll either crash (if I force the core clock up to stay at 2100MHz, the voltage won't increase along with it ---> crash) or downclock to 2088. So annoying, but it's a beast of a card nonetheless and I'm happy with 2088MHz.


----------



## galeonki

Quote:


> Originally Posted by *nrpeyton*
> 
> Also, the clock rate is only 'part' of what makes up the entire architecture, the difference between 2000mhz and 2100 on pascal is probably 1-2FPS


I do not agree with you -- more so at 1080p and 1440p.

For example, in Fallout 4 at 1440p, +100MHz on the core is about 7-8 FPS more.


----------



## nrpeyton

Quote:


> Originally Posted by *jlp0209*
> 
> My max fan rpm is the same, you're all good.
> 
> I see you mention more than once to set power limit to 30%- what do you mean? The lowest the power target slider goes in Precision is 36%? Are you really saying 130%? No matter what I adjust in settings my max stable OC is always 2088mhz. I'm pretty sure if someone ever cracks GPU boost 3.0 I will be stable at 2100mhz+. When my voltage dips below 1.093v due to temps going over 60C it can't maintain 2100mhz. It'll either crash (if I force the core clock up to stay at 2100mhz, the voltage won't increase along with it---> crash) or downclock to 2088. So annoying, but beast of a card nonetheless and I'm happy with 2088mhz.


Yes, sorry -- I meant 130%.

Also, I just read somewhere that MSI Afterburner now allows you to lock the voltage at each point on the curve, so you can test for stability point by point while building your curve.

I can't wait to try it myself and see how it affects games/FPS. I hope I read it right lol... because to be honest, what I just described should now be a "basic" in these overclocking programmes when you consider GPU Boost 3.0.

"If you always do what you always did you'll always get what you always got".


----------



## nrpeyton

Quote:


> Originally Posted by *galeonki*
> 
> I do not agree with you -- more so at 1080p and 1440p.
> 
> For example, in Fallout 4 at 1440p, +100MHz on the core is about 7-8 FPS more.


I hear what you're saying, mate, but if you are over 100 FPS anyway you won't notice the difference lol.

Where it "matters" is when you are playing at *4K at 35-45 FPS*, where it is indeed only 1 or 2 frames' difference.


----------



## Tdbeisn554

I also have kinda loud coil whine... so that ruins it a bit too... Probably gonna do an RMA for that; plus, I doubt I'd get a replacement chip that clocks even lower than the one I have, so with a bit of luck my replacement will do 2100+.


----------



## nrpeyton

Anyone already tried this? How did you get on??  \/ \/ \/

Just noticed that with MSI Afterburner 4.3 Beta 14 you can now lock your voltage and core frequency at any point along your curve, to test for stability at each voltage point while building your overclock.

Officially only FE and MSI cards are supported, but I just tried it on my EVGA Classified 1080 and it's working 100%.

Here is a quote from the page:

"You may press <L> after selecting any point on the curve with the mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to the stability testing usage scenario, MSI Afterburner allows you to save a curve with a locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling"

And the link: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

There is no button in MSI Afterburner yet to access the voltage curve; you have to press Ctrl+F to bring the window up.

This should let us squeeze every last drop of performance out of our cards by fine-tuning them to the absolute limit.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone already tried this? How did you get on??  \/ \/ \/
> 
> Just noticed with 'MSI Afterburner' 4.3, Beta 14 you can now actually lock your voltage and core frequency at any point along your curve to test for stability at each voltage point to build your overclock.
> 
> Officially they only support FE and MSI but I just tried it on my EVGA Classified 1080 and its working 100%
> 
> Here is a quote from the page:
> 
> "You may press after selecting any point on the curve with mouse cursor to disable GPU dynamic voltage/frequency adjustment and lock the voltage and core clock frequency to a state defined by the target point. This feature allows you to test graphics card stability independently for each voltage/frequency point of the curve using real 3D applications or any stress test of your choice. In addition to stability testing usage scenario, MSI Afterburner allows you to save a curve with locked point setting to a profile, so you may easily switch between dynamic voltage/frequency management and fixed voltage/frequency settings in realtime (e.g. to achieve the maximum performance during benchmarking). Please take a note that fixed voltage and frequency settings do not allow you to disable power and thermal throttling"
> 
> And link: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> There is no button on MSI Afterburner yet to access the voltage curve you have to press CNTRL-F to get the window up.
> 
> This should let us squeeze every last drip of performance out of our cards by fine tuning them to the absolute limit.


Why can't you do that in Precision in Manual mode?


----------



## GreedyMuffin

-delete-


----------



## galeonki

Quote:


> Originally Posted by *nrpeyton*
> 
> I hear what your saying mate but if you are over 100 FPS anyway you won't notice the difference lol.
> 
> Where it "matters" is when you are playing *4k at 35-45fps* where it is indeed only 1 or 2 frames difference.


A couple of minutes ago I tested on a 4K Dell 60Hz monitor: +113MHz gives me 3-4 FPS more on average in Fallout and GTA 5. There's always something. At 4K it's always hard to gain more free FPS on ultra settings, but an extra +100MHz is always welcome.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Why can't you do that in Precision in Manual mode?


The voltage and clock markings leave a LOT to be desired. With MSI AB you can fine-tune a lot more than you can with PCX. I had been using PCX, but switched to the AB beta. It's much better.


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> The voltage and clock markings leave a LOT to be desired. With MSI AB you can fine-tune a lot more than you can with PCX. I had been using PCX, but switched to the AB beta. It's much better.


Ok thanks.


----------



## PasK1234Xw

Looks like EVGA released a reference 1080 Hybrid model.

If anyone gets one, could you please post the BIOS? TIA


----------



## qLiixz

Hey guys, just wanted to share my results from OC'ing my 1080:
http://www.3dmark.com/fs/10422725


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Why can't you do that in Precision in Manual mode?


Good question, mate -- maybe we should start one of those massive threads that everyone signs and supports, to try to persuade EVGA to update Precision X.

This kind of thing should be a "basic" now with Boost 3.0.

*Does anyone disagree that this level of control should be a "given" in any mainstream overclocking utility since Boost 3.0?*


----------



## nrpeyton

Mind you, if MSI gets more sales as a result of being the first company to implement this in their sponsored overclocking utility, that would actually be a good thing -- they deserve the credit.

And even more credit to them for not locking the function out for other manufacturers' cards -- I'm sure Gigabyte would probably pull something like that... I went with EVGA after it took Gigabyte 7 days just to answer one email; EVGA got back to me in 3 hours.

I'm getting quite fond of EVGA since they actually show they care -- I really hope they implement this control in Precision X.


----------



## Tdbeisn554

@nrpeyton

What are your clocks?


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> @nrpeyton
> 
> What are your clocks?


Just begun working on my voltage/clock curve tonight...

Stress-testing each point as I go with Heaven 4.0.

I'm nowhere near finished yet, but after the first few hours here is where I've got to with MSI Afterburner:

800mv +200mhz - 1797mhz (Heaven 4.0 stable)
850mv +185mhz - 1898mhz (Heaven 4.0 stable)
875mv +165mhz - 1911mhz (Heaven 4.0 stable)
881mv +180mhz - 1961mhz *<---- just crashed, so I need to lower this one to +165mhz - 1936mhz* and re-test

Obviously everyone's +offset will be different, depending on what their base clock is for their model/manufacturer/BIOS.
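If anyone wants to keep their points in something other than a notepad, here's a rough Python sketch. The numbers are the ones from my table above (with the crashed 881mv point already lowered to 1936mhz); `max_stable_clock` is just a made-up helper name, and nothing here actually talks to the GPU -- it's only bookkeeping:

```python
# Measured stable points from manual testing: (millivolts, +offset MHz, resulting core MHz).
points = [
    (800, 200, 1797),
    (850, 185, 1898),
    (875, 165, 1911),
    (881, 165, 1936),  # lowered from +180 after a crash
]

def max_stable_clock(points, mv):
    """Highest clock tested stable at or below a given voltage, or None if untested."""
    eligible = [clk for (v, _off, clk) in points if v <= mv]
    return max(eligible) if eligible else None

# Sanity check: a sane V/F curve should never lose clock as voltage rises.
assert all(a[2] <= b[2] for a, b in zip(points, points[1:]))

print(max_stable_clock(points, 875))  # -> 1911
```

Handy for spotting a point that's out of line with its neighbours before you waste a night re-testing it.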


----------



## nrpeyton

One thing I am very confused about -- and I don't know if anyone here can help -- is that I *never artifact.*

The card runs perfectly right up to the bleeding edge of its maximum at any voltage point, then is either 100% stable or crashes the driver. Not one artifact -- ever!

This is painfully annoying because there is NO warning that I am nearing my limit until my computer freezes or the game shuts down and dumps me back to the desktop with an error.


----------



## Juub

Quote:


> Originally Posted by *nrpeyton*
> 
> One thing I am very confused about -- and I don't know if anyone here can help -- is I *never artifact.*
> 
> It runs perfectly right up to the bleeding edge of its maximum at any voltage point then is either 100% stable or crashes driver. Not one artifact -- ever!
> 
> This is painfully annoying because there is NO warning I am nearing my limit until my computer freezes or the game shuts down and returns me to desktop with error.


You don't always get artifacts. Driver crashes are a telltale sign that your OC is too high.


----------



## chiknnwatrmln

My card is the same. No artifacts, just driver crashes.


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> *Anyone disagree this level of control should be a "given" in any mainstream overclocking utility since boost 3.0??*


On the contrary, personally I'd say the level of control is lacking even in AB.

If you want to see artifacts try high memory clocks.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> Just begun working on my Voltage/Clock curve tonight...
> 
> Stress testing each point as I go with Heaven 4.0
> 
> I'm no where near finished yet but so far after first few hours I've got to here with MSI Afterburner:
> 
> 800mv +200mhz - 1797mhz (heaven 4.0 stable)
> 850mv +185mhz - 1898mhz (heaven 4.0 stable)
> 875mv +165mhz - 1911mhz (heaven 4.0 stable)
> 881mv +180mhz - 1961mhz *<---- just crashed so need to lower this one to +165mhz - 1936mhz* then re-test
> 
> Obviously everyone's +offset will be different depending on what their base is with their own model/manufacturer/BIOS.


I meant, what is your max clock?


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> Just begun working on my Voltage/Clock curve tonight...
> 
> Stress testing each point as I go with Heaven 4.0
> 
> I'm no where near finished yet but so far after first few hours I've got to here with MSI Afterburner:
> 
> 800mv +200mhz - 1797mhz (heaven 4.0 stable)
> 850mv +185mhz - 1898mhz (heaven 4.0 stable)
> 875mv +165mhz - 1911mhz (heaven 4.0 stable)
> 881mv +180mhz - 1961mhz *<---- just crashed so need to lower this one to +165mhz - 1936mhz* then re-test
> 
> Obviously everyone's +offset will be different depending on what their base is with their own model/manufacturer/BIOS.


Please post back with the results; I'd love to see what you get with all that. Although I wouldn't use Heaven as a stress test -- I find Firestrike graphics tests 1 and 2 a nice starting point for stability, then moving to GTA V with mods for the ultimate test. If I'm stable in modded GTA V, I'm probably stable everywhere.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Please post back with the results. I'd love to see what you get with all that. Although I wouldn't use heaven as a stresser, I find Firestrike graphics test 1 and 2 to be nice starting point to test stability, then moving to GTA V with mods to get that ultimate test. If I'm stable on modded GTA V, I'm probably stable everywhere.


*I will...*

*So far I am here now:*

800mv - 200 - 1797
850mv - 185 - 1898
875mv - 165 - 1911
881mv - 180 - 1949
893mv - 160 - 1949
900mv - 160 - 1961
912mv - 150 - 1961
925mv - 135 - 1974
931mv - 150 - 1987
943mv - 160 - 2025
950mv - 165 - 2037 < need to re-check this one

Will be continuing tonight...

One question, though -- don't know if anyone can help.

Can you lose points in 3DMark, for example, by being "right at the edge of your limit"... not crashing, but just before it? *(Reason I ask is that I read something somewhere about error correction.)*


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> *I will...*
> 
> *So far I am here now:*
> 
> 800mv - 200 - 1797
> 850mv - 185 - 1898
> 875mv - 165 - 1911
> 881mv - 180 - 1949
> 893mv - 160 - 1949
> 900mv - 160 - 1961
> 912mv - 150 - 1961
> 925mv - 135 - 1974
> 931mv - 150 - 1987
> 943mv - 160 - 2025
> 950mv - 165 - 2037 < need to re-check this one
> 
> Will be continuing tonight..
> 
> One question though don't know if anyone can help.
> 
> Can you lose points in 3dmark for example... being "right at the edge of your limit".. not crashing.. but just before it? *(Reason I ask is I read something somewhere about error-correcting)??*


You won't lose performance being on the edge, but there are some clocks that simply don't work well. I could get my card past +575 on memory, but performance would start decreasing before that. Also, I could get my card running 2180MHz by manipulating the curve, but the performance would be much lower than at 2114MHz using the offset method. You just gotta test it out. To get a good baseline, I'd suggest you find your max stable clock the old offset way and write down its performance in Firestrike graphics. Then you'll have a score to compare against and can see whether you're actually gaining performance or just numbers. Looking only at the clock can be misleading...
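If you want to compare runs without eyeballing it, a quick Python sketch of the idea -- pick the run with the best *score*, not the highest clock. The 2114MHz score is my real one from earlier; the 2180MHz score is a made-up example of a run that clocks higher but benches lower:

```python
# Firestrike graphics score per run, keyed by core clock in MHz.
runs = {
    2114: 25380,  # offset method, stable (real score from this thread)
    2180: 24900,  # curve pushed too far (hypothetical "higher clock, lower score" run)
}

def best_run(runs):
    """Return (clock, score) of the highest-scoring run -- not the highest-clocked one."""
    clock = max(runs, key=runs.get)
    return clock, runs[clock]

print(best_run(runs))  # -> (2114, 25380)
```

Silly-simple, but it makes the point: the "winner" here is the lower clock.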


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday the 17th through the 19th, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us -- see the attached link.

October Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) -- needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## nrpeyton

Quote:


> Originally Posted by *lanofsong*
> 
> Hey GTX 1080 owners,
> 
> We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> October Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


What is a foldathon?

Right, I've read up on it a bit, so I assume my computer takes on a bit of a puzzle, works it out, then uploads its part of the "sum" back? How does it do that, and have I read it correctly?


----------



## lanofsong

Quote:


> Originally Posted by *nrpeyton*
> 
> What is a foldathon?
> 
> Right I've read up on it a bit so I assume my computer takes on a bit of a puzzle and works it out then re-uploads part of the "sum" ? How does it do that and have I read it correctly?


This is correct. You download the Folding@home client to your computer; it will contact the server and download a work unit for your hardware to crunch. When the unit has completed, it will be uploaded and a new WU will be sent to you. If you successfully return the WU, you will be given points -- the faster you fold a unit, the more points you get. Note: too high an OC on your hardware will produce errors and may fail a unit, for which you will receive little to no points.


----------



## nrpeyton

Quote:


> Originally Posted by *lanofsong*
> 
> This is correct. You download the Folding@home client to your computer, it will contact the server and download a work unit for hardware to crunch. When the unit has completed, it will be uploaded and a new WU will be sent to you - If you successfully returned the WU, you will be given points. The faster you fold a unit, the more points you will be given. Note, too high an OC on your hardware will produce errors and may fail a unit for which you will receive little to no points.


I see -- is it CPU-only, or GPU too?

Before I download it -- is it possible to "throttle" it so I can still use my PC, or set it up to only run at full load when I'm AFK? I.e. the moment it detects mouse/keyboard input it throttles right down and only uses otherwise-idle CPU time?


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> You won't lose performance being on the edge. But there are some clocks that simply don't work well. I could get my card past +575 on memory but performance would decrease before that. Also, I could get my card running 2180mhz manipulating the curve but the performance would be much lower than 2114mhz using the offset method. You just gotta test it out. To get a good base to compare, I'd suggest you to find out your max stable clock using the old offset way and write down it's performance on firestrike graphics. Then you'll get a score to compare and see if your actually gaining performance or just numbers. Looking only to the clock can be misleading...


Okay, I have downloaded 3DMark and got the Advanced Edition.

I ran some tests and got the following (the last column is the Firestrike Extreme Graphics 1 & 2 score, demo off), so it looks like I'm going to need to revisit every damn voltage point before 950mv lol. (Last night was effectively wasted.)

And to make matters worse -- what makes this *even harder* is that the boost also changes slightly with each +offset you set as the temperature changes!! *head going to explode*

875mv +165 - 1911mhz
881mv +180 - 1949mhz
893mv +160 - 1949mhz
900mv +160 - 1961mhz
912mv +150 - 1961mhz
925mv +135 - 1974mhz
931mv +150 - 1987mhz
943mv +160 - 2025mhz
950mv +135 - 2012mhz - *12505*
+125 - 1999mhz - *12521*
+110 - 1987mhz - *12513*

I am going to have another look and see if I can just lock the voltage at 1.093V and find my highest score there -- this might be the easy option, and it kind of defeats the purpose of Nvidia giving us control with Boost 3.0, but I will have a tinker about and report back...

P.S. I do still plan on completing my curve, but it might take a few days + the girlfriend is moaning like hell about the amount of time I'm spending on this lol


----------



## lanofsong

Quote:


> Originally Posted by *nrpeyton*
> 
> I see -- is it just CPU or GPU too?
> 
> Before I do the download it -- is it possible to "throttle" it so I can still use my PC or set it up to only run at full load when I'm AFK? I.E the moment it detects mouse/keyboard input it throttles right down then only uses the "non used CPU time"?


Your 1080 will put out about 700K+ PPD (probably 800K), whereas your CPU may put out 30-50K PPD, so I would not bother folding on the CPU.

If you leave it on the Medium setting, your GPU should fold only when the computer is idle.
The Full setting will fold flat-out no matter what you are doing on the computer.


----------



## nrpeyton

Quote:


> Originally Posted by *lanofsong*
> 
> Your 1080 will put out about 700K+ PPD (probably 800K) whereas your CPU may put out 30-50K PPD so I would not bother folding on the CPU.
> 
> If you leave it on Medium setting, your GPU should activate (fold) only when computer is idle.
> Full setting will fold flat out no matter what you are doing on the computer


What's the science behind my CPU being so weak at this?


----------



## lanofsong

Quote:


> Originally Posted by *nrpeyton*
> 
> Whats the science behind my CPU being so weak with this?


It is not that your CPU is weak -- it is just that your GPU is so powerful.


----------



## nrpeyton

Quote:


> Originally Posted by *lanofsong*
> 
> It is not that your CPU is weak -- it is just that your GPU is so powerful.


But isn't it doing calculations? I thought the CPU was better at that, and the GPU only better at graphics (for instance rendering scenes in games or movie-making)?


----------



## OZrevhead

Guys, I am looking for a good bios for my (soon to be) water cooled 1080 SC, is there a thread for this stuff? Or can you guys help me?

Any help would be great.


----------



## artemis2307

My 1080 FE is doing 2038-2050MHz at 1.030V; might play with the curve a bit to shave off a bit of heat and power.


----------



## GanGstaOne

Quote:


> Originally Posted by *Derek1*
> 
> Why can't you do that in Precision in Manual mode?


You can; it's called KBoost. But to use it in the new Precision X OC you need to enter an EVGA serial number -- they locked the feature to EVGA owners only. With the old Precision X you could use KBoost on any card, but the old program doesn't support Pascal GPUs.


----------



## moustang

Quote:


> Originally Posted by *nrpeyton*
> 
> But isn't it calculations its doing? I thought the CPU was better at that? And the GPU was better at only graphics? (for instance rendering scenes in gaming or making movies)


It's not just calculations, it's the specific type of calculations that matter.

To make it as simple as possible, Folding is very Floating Point calculation intensive, and GPUs specialize in Floating Point calculations. They can't do everything a CPU can do as fast as a CPU can do it, but they blow CPUs out of the water when it comes to Floating Point calculations because that's precisely what 3D rendering relies on. Floating Point is the whole reason GPUs exist at all.
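You can see the size of the gap with simple back-of-envelope math: peak throughput = execution units x clock x FLOPs per unit per cycle. The figures below are the usual published ballpark specs (2560 CUDA cores at ~1.73GHz boost for a GTX 1080, with FMA counting as 2 FLOPs; a 4-core CPU at 4GHz issuing 16 FP32 FLOPs per core per cycle with AVX + FMA) -- treat them as rough assumptions, not gospel:

```python
# Back-of-envelope peak FP32 throughput in GFLOPS.
def peak_fp32_gflops(units, clock_ghz, flops_per_cycle):
    # peak = execution units x clock (GHz) x FLOPs issued per unit per cycle
    return units * clock_ghz * flops_per_cycle

gpu = peak_fp32_gflops(2560, 1.73, 2)   # GTX 1080: ~8858 GFLOPS (FMA = 2 FLOPs/cycle)
cpu = peak_fp32_gflops(4, 4.0, 16)      # quad-core AVX+FMA CPU: 256 GFLOPS

print(round(gpu), round(cpu), round(gpu / cpu))  # -> 8858 256 35
```

So even against a generous CPU estimate, the GPU has roughly 35x the raw FP32 throughput -- which is why folding work units land on the GPU.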


----------



## Koniakki

/DP


----------



## Koniakki

Quote:


> Originally Posted by *moustang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> But isn't it calculations its doing? I thought the CPU was better at that? And the GPU was better at only graphics? (for instance rendering scenes in gaming or making movies)
> 
> 
> 
> It's not just calculations, it's the specific type of calculations that matter.
> 
> To make it as simple as possible, Folding is very Floating Point calculation intensive, and GPUs specialize in Floating Point calculations. They can't do everything a CPU can do as fast as a CPU can do it, but they blow CPUs out of the water when it comes to Floating Point calculations because that's precisely what 3D rendering relies on. Floating Point is the whole reason GPUs exist at all.


----------



## artemis2307

Messed a bit with the curve and this is what I get.
1080 FE

Before curve: 220 core / 600 mem

[screenshot]

After curve

[screenshot]

Temps have gone down 3-4 Celsius: was 81-82, now 78 max. Ignore the "before" temps because that's when the A/C was on and the case was open.
Is this a good result, guys?


----------



## nrpeyton

Quote:


> Originally Posted by *moustang*
> 
> It's not just calculations, it's the specific type of calculations that matter.
> 
> To make it as simple as possible, Folding is very Floating Point calculation intensive, and GPUs specialize in Floating Point calculations. They can't do everything a CPU can do as fast as a CPU can do it, but they blow CPUs out of the water when it comes to Floating Point calculations because that's precisely what 3D rendering relies on. Floating Point is the whole reason GPUs exist at all.


I understand -- thank you


----------



## nrpeyton

Quote:


> Originally Posted by *GanGstaOne*
> 
> You can; it's called KBoost. But to use it in the new Precision X OC you need to enter an EVGA serial number -- they locked the feature to EVGA owners only. With the old Precision X you could use KBoost on any card, but the old program doesn't support Pascal GPUs.


I have an EVGA 1080 Classified. KBoost in Precision XOC does *not* allow you to lock specific voltages; it only lets you lock the frequency at the highest factory boost clock, so it isn't good for building your curve. With MSI AB Beta 14 you can lock any voltage and frequency offset to your heart's content -- great for tweaking your curve and testing stability at every single voltage & frequency point. They only officially support MSI and FE, but it works perfectly on my EVGA 1080 Classified as long as I reset to defaults between locking each new voltage.


----------



## Derpinheimer

Quote:


> Originally Posted by *nrpeyton*
> 
> One question though don't know if anyone can help.
> 
> Can you lose points in 3dmark for example... being "right at the edge of your limit".. not crashing.. but just before it? *(Reason I ask is I read something somewhere about error-correcting)??*


Memory overclocks can reduce performance when they're in an unstable zone -- I believe because more time is spent on error correction than is gained from the increased bandwidth. The worst part is that it's not easy to verify, and it's probably temperature-dependent too. I believe some people have found that certain core-to-memory clock ratios perform better, especially in the days of Bitcoin mining, which further complicates things.

Increasing core clocks (I think) will never lower performance, even when unstable; the losses you see are probably just measurement noise.


----------



## juniordnz

Well, just out of curiosity, I flashed the T4 BIOS on my FTW. It bricked it. I switched to the backup BIOS, flashed stock over it, and now everything is fine again.

I'm officially done with cross-flashing. I'm happy with the 25,380 points in Firestrike my FTW BIOS is giving me.

I just had that itch, you know?


----------



## Tdbeisn554

Quote:


> Originally Posted by *juniordnz*
> 
> Well, just out of curiosity, I flashed the T4 BIOS on my FTW. It bricked it. I switched to the backup BIOS, flashed stock over it, and now everything is fine again.
> 
> I'm officially done with cross-flashing. I'm happy with the 25,380 points in Firestrike my FTW BIOS is giving me.
> 
> I just had that itch, you know?


What is your max clock if I may ask?


----------



## outofmyheadyo

The Palit GameRock is 615€, the Palit JetStream is 625€ and the Zotac AMP Extreme is 700€. I don't think the Zotac is worth paying 75-85€ more for.


----------



## juniordnz

Quote:


> Originally Posted by *Archang3l*
> 
> What is your max clock if I may ask?


Core: 2114MHz (+89)
Mem: 11160MHz (+575)
1.062V
130% TDP

http://www.3dmark.com/fs/10388708


----------



## greg1184

Anyone have the Bitspower waterblock for the reference card? I'm thinking of putting my card under water, and it's the first time I'm doing this.


----------



## Jim86

I am using an EK waterblock on my FE card, as are most people; it works great and is easy to install, no issues. I haven't seen anybody with a BP block.


----------



## Koniakki

Quote:


> Originally Posted by *artemis2307*
> 
> Messed a bit with the curve and this is what I get.
> 1080 FE
> 
> Before curve: 220 core / 600 mem
> 
> [screenshot]
> 
> After curve
> 
> [screenshot]
> 
> Temps have gone down 3-4 Celsius: was 81-82, now 78 max. Ignore the "before" temps because that's when the A/C was on and the case was open.
> Is this a good result, guys?


Try finding a sweet spot where your minimum FPS is higher. I've tested ROTR extensively and the minimums are quite sensitive to core and memory clocks -- I'd say more sensitive to memory clocks than core clocks. Finding a sweet spot on the clocks will give quite consistently high minimum FPS (and good average FPS too).

For example, +520MHz mem in my case will almost always give me high minimum FPS, while adding even +10MHz more will drop them heavily.

I'm referring specifically to the internal benchmark, since I haven't actually started the game yet, so I can't say for sure whether this holds in real gameplay too.

GameRock default BIOS max OC:

[screenshot]

Strix T4 BIOS max OC:

[screenshot]
Quote:


> Originally Posted by *juniordnz*
> 
> Well, just out of curiosity I flashed T4 BIOS on my FTW. It bricked it. Switched to backup BIOS, flashed stock over it and now everything is fine again.
> 
> I'm officially done with cross flashing. I'm happy with the 25.380 points on firestrike my FTW BIOS is giving me.
> 
> I just had that itch, you know?


I tried the T4 vbios on my Palit Gamerock. Tested it for a few hours. No problems of any sort, and voltage, at least as reported in AB, was working fine.

Didn't gain much performance-wise, actually, and temps weren't that bad over the stock BIOS, staying just over 60°C with the fan at 3000ish RPM.

Stock BIOS 100% fan speed is ~2600RPM, which translates to ~70% duty on the Strix T4 BIOS.

Reverted back to the stock Gamerock BIOS and performance was about the same.
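That RPM-to-duty conversion is easy to sanity-check if you assume fan RPM scales roughly linearly with duty cycle (an approximation for real GPU fans) and take ~3700RPM as the T4 ceiling reported elsewhere in the thread:

```python
# Rough sanity check: express one BIOS's max fan RPM as a duty-cycle
# percentage of another BIOS's (higher) RPM ceiling.
# Assumes RPM scales roughly linearly with duty cycle.

def rpm_to_duty(rpm: float, max_rpm: float) -> float:
    """Return the duty cycle (%) needed to hit `rpm` given a `max_rpm` ceiling."""
    return 100.0 * rpm / max_rpm

stock_max = 2600   # ~max RPM on the stock Gamerock BIOS (from the post)
t4_max = 3700      # ~max RPM reported under the Strix T4 BIOS

print(f"{rpm_to_duty(stock_max, t4_max):.0f}%")  # ~70%
```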


----------



## LiquidHaus

Interesting to see how many people are using the 'slave' bios on their EVGA cards.

I haven't even tried the 'slave' bios yet but I think today I will do that.


----------



## Tdbeisn554

Quote:


> Originally Posted by *juniordnz*
> 
> Core 2.114mhz(+89)
> Mem 11.160mhz(+575)
> 1.062V
> 130%TDP
> 
> http://www.3dmark.com/fs/10388708


Your FTW is beating my Classy to a pulp


----------



## juniordnz

Quote:


> Originally Posted by *lifeisshort117*
> 
> Interesting to see how many people are using the 'slave' bios on their EVGA cards.
> 
> I haven't even tried the 'slave' bios yet but I think today I will do that.


The slave BIOS is a good thing because it will keep you from hitting TDP on stressful situations. I've never seen mine reach 130% on any game I played.
Quote:


> Originally Posted by *Archang3l*
> 
> Your FTW is beating my Classy to a pulp


How's your classy doing? Had time to find its max stable OC yet? I'd suggest finding the memory OC first, then moving to the core clock. These cards scale very well with VRAM OC.


----------



## Tdbeisn554

Quote:


> Originally Posted by *juniordnz*
> 
> The slave BIOS is a good thing because it will keep you from hitting TDP on stressful situations. I've never seen mine reach 130% on any game I played.
> How's your classy doing? Had time to find it's max stable OC yet? I'd suggest you find memory OC first then moving to core clock. These cards scale very well with VRAM OC .


Everything above 2050 is artifacting like hell or crashing Heaven... 2050 is getting downclocked to around 2000, so... yeah.
But I am RMA'ing the card, as mine has loud coil whine. So hopefully my replacement will have a better chip (I'm gonna cry if it's a full-fledged potato...). I really want at least 2100 on a Classified; I mean, what's the point if a cheap 1080 with a simple blower-style cooler clocks better than a huge card with an over-the-top power delivery system and huge fans?


----------



## juniordnz

Quote:


> Originally Posted by *Archang3l*
> 
> Everything above 2050 is artifacting like hell or crashing Heaven... 2050 is getting downclocked to around 2000, so... yeah.
> But I am RMA'ing the card, as mine has loud coil whine. So hopefully my replacement will have a better chip (I'm gonna cry if it's a full-fledged potato...). I really want at least 2100 on a Classified; I mean, what's the point if a cheap 1080 with a simple blower-style cooler clocks better than a huge card with an over-the-top power delivery system and huge fans?


I get your frustration...and you're not the first frustrated classy owner we've seen here.

Wish you luck on the next draw on the silicon lottery, mate!


----------



## Tdbeisn554

Quote:


> Originally Posted by *juniordnz*
> 
> I get your frustration...and you're not the first frustrated classy owner we've seen here.
> 
> Wish you luck on the next draw on the silicon lottery, mate!


Thanks buddy! I wish they were binned; fine, keep the 2300+ chips and the super-OC silicon for the Kingpins. But if you buy a Classy with all the bells and whistles, it's not much fun to get a GTX 1080 Potatofied instead of a Classified.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Well, just out of curiosity I flashed T4 BIOS on my FTW. It bricked it. Switched to backup BIOS, flashed stock over it and now everything is fine again.
> 
> I'm officially done with cross flashing. I'm happy with the 25.380 points on firestrike my FTW BIOS is giving me.
> 
> I just had that itch, you know?


I flashed the T4 BIOS to my 1080 Classified and it did not brick it. But due to the "removed" power limit I was not able to draw more than about 110W from the card (normally I can pull up to 320W).
I think it's because the card didn't know how to deal with the removed power limit. I tried cross flashing loads of other BIOSes and they all worked perfectly. I think if the power limit had simply been "increased" instead of "removed" it would have worked; it seemed to default to a failsafe low-power mode. I did see 1.1V at one moment, if only for a brief period. Again, if the power limit had been intact I believe I'd have seen a lot more voltage, even up to 1.2.

I've left a message on a forum some of the top LN2 guys communicate on (need to wait for approval just to get it posted) and I mentioned the power limit problem. If they get back to me with anything useful I'll share it here.

One thing worth noting with the T4 BIOS is that I was able to get an extra 1000 fan RPM (up to 3800RPM), which suggests EVGA is artificially limiting our max fan speed on the Classified 1080 (probably because many reviewers are so critical of fan noise). I'd much prefer to have the *CHOICE*!! Or maybe they could even add a "fan overclock" feature to PRECISION X.

Again, if the power limit were intact on the T4 BIOS, not only would the LN2 newbies get extra voltage, but the rest of us would still have had extra voltage and extra cooling. I know the voltage isn't doing much, but imagine your card was *RIGHT* on the bleeding edge of *JUST* falling short of 2200 or 2100; the extra cooling and voltage might have helped some guys hit a few extra MHz to reach their 'even' one-hundred or two-hundred number.


----------



## juniordnz

So if the fan can spin at 3700RPM, does that mean it won't harm the fan?

After you flash it, do you just restart the computer normally and everything goes fine? I got a black screen after flashing it, but the card is lit up and the fans are spinning.


----------



## MACH1NE

Hey guys, I'm looking at buying a 1080. What's the best bang-for-buck card to go for, and have any BIOS mods been released that let us unlock its full potential (on air)?


----------



## juniordnz

God, I'm so dumb. The T4 BIOS doesn't work with the first DP on the FTW. Changed to the DP in the middle and everything works fine.

Same overclock as FTW BIOS got me 200+ points on firestrike graphics, now let's try pushing it up!


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> @nrpeyton
> 
> What are your clocks?


With no curve adjustment and only changing core clock setting my max stable is:

+150 = 2151mhz

On 'The Witcher 3' if I do a 'before' and after this equates to a 3FPS difference (76 to 79 avg.)

I've not touched memory yet.

I get no artifacts at all -- ever -- it either runs or crashes.


----------



## juniordnz

Just got this fiddling with T4 BIOS: http://www.3dmark.com/fs/10445175

A nice 360 points increase.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Just got this fiddling with T4 BIOS: http://www.3dmark.com/fs/10445175
> 
> A nice 360 points increase.


360 points increase on the T4 bios? Not bad mate. At least you went back and tried again lol 

I suffered a loss due to not being able to pull the usual power from the card on this BIOS. Did you have the same problem, and are the power controls disabled for you?

I monitor my power usage in watts using HWiNFO64 and validate it (roughly) with an energy meter plugged in between the computer and the wall socket. It shows the current watts in real time; it only shows the full system draw, but by comparing idle and load readings I can roughly validate HWiNFO64 and know it isn't giving me a false reading.
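If you want to replicate that wall-meter cross-check, the arithmetic is just the idle-to-load wall delta scaled by PSU efficiency. A rough sketch; the 0.88 efficiency is an assumed figure for a Gold-rated PSU, and the readings below are made-up examples, not numbers from this post:

```python
# Estimate how much of a wall-power increase is attributable to the GPU.
# Wall watts include PSU conversion losses, so multiply by an assumed
# PSU efficiency to get DC-side watts. CPU/fan load changes are ignored,
# so this only loosely validates a software reading like HWiNFO64's.

PSU_EFFICIENCY = 0.88  # assumed; check your PSU's 80 Plus rating

def estimated_gpu_watts(wall_idle: float, wall_load: float,
                        efficiency: float = PSU_EFFICIENCY) -> float:
    """DC-side watts added under load, from two wall-meter readings."""
    return (wall_load - wall_idle) * efficiency

# Hypothetical example readings:
print(round(estimated_gpu_watts(wall_idle=120, wall_load=480)))  # ~317
```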

With the T4 BIOS, even on FURMARK I couldn't seem to draw more than 110W from the card. I tried it with The Witcher and I think I lost about 10 FPS although it was late at night and I wasn't paying too much attention.

Hmm, I'm confused now about how you got extra points. Can you give me any more details? Was your card boosting higher on this BIOS, or were you able to set your max overclock higher? How did you achieve it?

...the T4 BIOS did work for me booted up perfectly etc. etc. but just didn't seem to give me any results.

My max stable overclock on my classified is +150 = 2151mhz but this only gives me 3 FPS.

I wonder if there is something I am doing wrong...

I flashed it using *nvflash_5.319.0-win* with command nvflash --overridesub strix1080xoc.rom
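For anyone repeating this, the general cross-flash routine with that nvflash build looks roughly like this. A sketch, not gospel: the backup and --protectoff steps are standard nvflash practice rather than anything specific to this card, and flashing a mismatched ROM can brick things, as the posts above show.

```shell
:: Run from an elevated command prompt in the nvflash_5.319.0-win folder.
:: 1) Back up the current VBIOS first -- this is the only way back
::    if the new ROM misbehaves.
nvflash --save backup_stock.rom

:: 2) (May be needed on some cards) disarm the flash write protection.
nvflash --protectoff

:: 3) Flash the new ROM. --overridesub skips the subsystem-ID match check,
::    which is what allows flashing a Strix ROM onto a non-Strix card.
nvflash --overridesub strix1080xoc.rom

:: 4) Reboot, then verify in GPU-Z that the expected BIOS version loaded.
```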


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> So if the fan can spin at 3700rpm it doesn't mean it will harm the fan?
> 
> After you flash it you jest restart the computer normally and everything goes fine? I got a black screen after flashing it, but the card is lit up and the fans are spinning.


I never hit 130% TDP either... unless I run EVGA OC Scanner X (with the FurMark GPU core burn selected). When I monitor HWiNFO64 my power draw goes up to a max of about 322W, and GPU-Z's PerfCap reason switches between power and reliability voltage. In games I never get above about 220W (although I've only really been playing The Witcher 3 so far).

Regarding the fan spinning up to 3700 not doing damage: don't quote me on this, but I wouldn't think so. Most fans are rated at 12V; I've never found a fan controller that lets you go above 12V, or any way to "overclock" a fan. Which leads me to believe EVGA are probably fitting higher-rated fans on their GPUs, capable of higher speeds, but running them at lower speeds for longevity/reliability and less noise. (Running a fan at 50-70% duty for 5000 hours will probably be more reliable than running it at 70-100% duty for the same period.) That seems like the best explanation to me.


----------



## artemis2307

I won't be messing with BIOS flashes on my FE though.








But yeah, +550 mem is a bit unstable, so back to +500 for now. Haven't benched yet.


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> I flashed it using *nvflash_5.319.0-win* with command nvflash --overridesub strix1080xoc.rom


IIRC strix1080xoc was an older VBIOS, not the T4 that can be downloaded from TPU. Again IIRC, the older XOC VBIOS's video clock topped out at 1708.5MHz while the T4's didn't, and the video clock can make a difference in performance.

Yes, there are no power or temperature settings, and it draws over 300W on an FE card.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> God, I'm so dumb. The T4 BIOS don't work with the first DP on the FTW. Changed to the DP in the middle and everything works fine.
> 
> Same overclock as FTW BIOS got me 200+ points on firestrike graphics, now let's try pushing it up!


Haha sorry bro, I just now saw your PM and then I caught up on the thread









The main thing I loved about the T4 BIOS was the fan speed increase; I also gained a tiny bit of performance. I ended up switching back to the 130 FTW BIOS when you said your card died a while back; I was afraid it might be related to the T4 lol.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> I never hit 130% TDP either.. unless I run the EVGA OC Scanner X (with furmark GPU core burn selected)... when I monitor HWINFO64 my power draw goes up to a max of about 322W and GPU-Z perfCap reason switches between Power and Reliable voltage for reason code. In games I never get above about 220W. (Although I've only really been playing The Witcher 3 so far).
> 
> 360 points increase on the T4 bios? Not bad mate. At least you went back and tried again lol
> 
> I suffered a loss due to not being able to pull the usual power from the card on this BIOS. Did you have the same problem, and are the power controls disabled for you?
> 
> I monitor my power usage in watts using HWiNFO64 and validate it (roughly) with an energy meter plugged in between the computer and the wall socket. It shows the current watts in real time; it only shows the full system draw, but by comparing idle and load readings I can roughly validate HWiNFO64 and know it isn't giving me a false reading.
> 
> With the T4 BIOS, even on FURMARK I couldn't seem to draw more than 110W from the card. I tried it with The Witcher and I think I lost about 10 FPS although it was late at night and I wasn't paying too much attention.
> 
> Hmm, I'm confused now about how you got extra points. Can you give me any more details? Was your card boosting higher on this BIOS, or were you able to set your max overclock higher? How did you achieve it?
> 
> ...the T4 BIOS did work for me booted up perfectly etc. etc. but just didn't seem to give me any results.
> 
> My max stable overclock on my classified is +150 = 2151mhz but this only gives me 3 FPS.
> 
> I wonder if there is something I am doing wrong...
> 
> I flashed it using *nvflash_5.319.0-win* with command nvflash --overridesub strix1080xoc.rom


Well well well... I just ditched the T4 BIOS after playing around with it. For me, ~350 points (roughly a 1% increase) is not worth the "risks" of running a BIOS not made for your card. Also, the "unlimited" wattage/temperature kinda freaks me out a bit. Anyway, I just don't think it's worth it.

Before, I thought my max OC on the FTW's slave BIOS was 2114MHz/1.062V (offset mode). But after some fiddling with the curve I could get it up to a stable 2151MHz/1.093V (offset + curve). What I did was take the curve I got from the offset method and then just mess with the clocks from 1.062V and up. All that got me 100 points more than my old OC. In the real world, that's plain simple NOTHING, but it's nice to set a new max OC. It's also nice saying you have a 2150MHz card lol

One thing I noticed: the way the card handles the temperature throttle changes when you increase voltage. At 1.062V my card would have its first throttle-down at 54-55°C. With 1.093V it happens at 59-60°C. I don't know if that 5°C of headroom is lost to the added heat from the +31mV; I'll have to check into that.

So, T4 was fun and all, but I'm sticking with the stock BIOS. And, boy, the fan is LOUD at 3700RPM. It was able to top my H100i at full speed (and we all know Corsair's fans are jet engines...)
Quote:


> Originally Posted by *TWiST2k*
> 
> Haha sorry bro, I just now saw your PM and then I caught up on the thread
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The main thing I loved about the T4 BIOS was the fan speed increase, I also gained a tiny bit of performance. I ended up switching back to the 130 FTW BIOS when you said your card died a while back, I was afraid it might be related to the T4 lol.


It wasn't related at all. I could never get the T4 to work with that old card; I thought it had bricked it lol. I used it mostly with the stock BIOS at 1.063V.

It was a faulty VRAM thing.

Anyway, it's strange how Firestrike scores change over time. Yesterday I got 25.387 and today 25.290 with the exact same overclock settings.


----------



## MACH1NE

Just purchased the EVGA 1080 SC Gaming ACX 3.0 from Newegg, $900 delivered with a free GoW 4 code. Kind of a side-grade from my EVGA 980 Ti Classy, which I'll be selling to my cousin.


----------



## artemis2307

Quote:


> Originally Posted by *MACH1NE*
> 
> Just purchased the evga 1080 sc gaming acx 3.0 from
> Newegg $900 delivered with free gow 4 code kind of a side grade from my evga 980ti classy which I'll be selling to my cousin


$900 is a very high price; I got mine for $520.
A 1080 at max OC is still 15% faster than a 980 Ti at max OC though, plus 2GB more VRAM.


----------



## MACH1NE

$900 au


----------



## artemis2307

Quote:


> Originally Posted by *MACH1NE*
> 
> $900 au


then it's an ok price


----------



## OZrevhead

Guys, I am looking for a BIOS for my (soon to be) water-cooled EVGA 1080 SC; has anyone got one that's suitable for 1.20V or so? Has anyone tried the Strix BIOS on an FE-based card?


----------



## ucode

T4 VBIOS works okay with my FE card, although fan speed was reduced to ~3600RPM IIRC, from 4000.

Personally, I'd test that the card is okay on the stock VBIOS first, in case it needs to be sent back under warranty.


----------



## OZrevhead

I have already tested it with an FE BIOS; it doesn't appear to have any issues. But yes, I will evaluate it under water with the SC BIOS before I go nuts with a custom BIOS.

What voltage does that give you? Did the clock ceiling reflect the change?


----------



## ucode

1.2V (needs curve editing to reduce flatlining).

Higher clocks, but voltage scaling isn't great, so the returns are small.


----------



## KickAssCop

Which is the best BIOS for STRIX cards? I'm losing track here.


----------



## bloot

Hi! Has anyone tried Palit's official newer BIOS for the GameRock Premium/Super JetStream? If so, would you mind sharing some impressions? Have the fan bug, the Micron memory bug, etc. been addressed?

Thanks!


----------



## Derpinheimer

Quote:


> Originally Posted by *MACH1NE*
> 
> $900 au


Only us Americans are allowed to say a price without specifying a currency on the internet


----------



## artemis2307

Quote:


> Originally Posted by *Koniakki*
> 
> Try finding a sweetspot where your minimum FPS will be higher. I've tested ROTR extensively and the minimum are quite sensitive to core and memory clocks.
> 
> I would say more sensitive to memory clocks than core clocks. Finding a sweetspot on the clocks will give quite consistent high minimum FPS(and good average FPS too).
> 
> Example +520MHz mem in my case will almost constantly give me high minimum FPS while adding even +10MHz will drop them heavily.
> 
> Referring specifically to the internal benchmark since I haven't even started the game yet so I can't say for sure if this goes for the actual gameplay too.
> 
> Gamerock default bios max OC.
> 
> 
> 
> Strix T4 bios max OC
> 
> 
> I tried the T4 vbios on my Palit Gamerock. Tested it for a few hours. No problems of any sort, and voltage, at least as reported in AB, was working fine.
> 
> Didn't gain much performance-wise, actually, and temps weren't that bad over the stock BIOS, staying just over 60°C with the fan at 3000ish RPM.
> 
> Stock BIOS 100% fan speed is ~2600RPM, which translates to ~70% duty on the Strix T4 BIOS.
> 
> Reverted back to the stock Gamerock BIOS and performance was about the same.


You're right, backing the mem clock down to +500 does help with min FPS.


----------



## Koniakki

Quote:


> Originally Posted by *artemis2307*
> 
> You're right, backed down the mem clock to +500 does help with min fps


Nice avg!









If you wish, you can fill in your rig details. I don't know if my minimums are higher because of my high RAM speed or my CPU, since I don't know your specs.

I have seen up to 120ish, 80ish and 100ish on the minimums respectively for the 3 test scenes. On average I would say about 115ish, 75ish and 95ish.

So far from what I have seen, I would consider 110-120ish, 70-80ish and 90-100ish minimum FPS for the 3 scenes good for an OC'ed GTX 1080.

Also, I forgot to mention that I'm using the SMAA/Ultra preset.

For the average FPS, I would consider anything from 148-150+ FPS, together with the above minimums, good for a GTX 1080 at 2000+ core / 5500+ mem.

But obviously, because of different OS setups, optimization and hardware specifications, results will vary.


----------



## artemis2307

Quote:


> Originally Posted by *Koniakki*
> 
> Nice avg!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you wish, you can fill in you Rig details. I don't know if my minimum are higher because of my high ram speed or cpu since I don't know your specs.
> .


Xeon 1231v3 @ 3.8
8GB 1600 cl7
windows 10 pro 64bit
The game is stored on a 5400RPM 2.5" drive, so maybe that's the cause of the low mins, or maybe my RAM is not fast enough.


----------



## juniordnz

I'm back at my stock offset overclock. I just found out that although 2151MHz seems nice, it results in lower min and max FPS; the average stays the same, though.

Overclocking these cards is not for those who lack patience...


----------



## artemis2307

Pascal chips are heavily (maybe hardware-level) power limited.
Even though some models have an additional 6- or 8-pin, they don't go any higher than even the FE.
Man... when will we see Maxwell levels of OC again?








Or maybe that's intentional; otherwise nobody would buy a 1080 or 1080 Ti.


----------



## Vellinious

Quote:


> Originally Posted by *artemis2307*
> 
> pascal chips are heavily (maybe hardware-level) power limited
> even though some model have an additional 6-8pin it's not going anywhere higher than even the FE
> man... when will we see maxwell-level of OC cards again
> 
> 
> 
> 
> 
> 
> 
> 
> or may be that's intentional, otherwise nobody would buy a 1080 or 1080ti


It's a setting in the bios, pretty sure of it. We just don't have a pascal bios editor yet. May not ever get one.....jury is out.


----------



## xartic1

Quote:


> Originally Posted by *artemis2307*
> 
> pascal chips are heavily (maybe hardware-level) power limited
> even though some model have an additional 6-8pin it's not going anywhere higher than even the FE
> man... when will we see maxwell-level of OC cards again
> 
> 
> 
> 
> 
> 
> 
> 
> or may be that's intentional, otherwise nobody would buy a 1080 or 1080ti


It's very sad to see the 10xx series all overclock about the same. Going from a 1080 FE to a 1080 HOF, the only differences I've noticed are cooler temps, a better looking card overall, and lower power consumption at the same frequencies. The FE could hold 2038 until thermals came to crash the party; my HOF holds 2051 solid since thermals aren't an issue.

Quote:


> Originally Posted by *Vellinious*
> 
> It's a setting in the bios, pretty sure of it. We just don't have a pascal bios editor yet. May not ever get one.....jury is out.


We better get one eventually! My HOF is ready to scarf down some voltage.


----------



## OZrevhead

Where is sky? His previous BIOSes worked very well. We will have to wait, I guess... then soon the 1080 Ti build-up begins and it all starts again.


----------



## Vellinious

Quote:


> Originally Posted by *OZrevhead*
> 
> Where is sky? His previous bioses worked very well. We will have to wait I guess .... then soon will be 1080Ti build up and it all starts again . .


If NVIDIA cuts off BIOS modding altogether, I'll be looking to move to AMD... hopefully they get something at least in the neighborhood of "enthusiast" at some point in the near future.


----------



## JoeDirt

My best so far with the T4 BIOS on my ASUS Strix Gaming.


----------



## Vellinious

Quote:


> Originally Posted by *JoeDirt*
> 
> My best so far with the T4 BIOS on my ASUS Strix Gaming.


That's a pretty nice score, man. What kind of clocks are you running on the core / memory to see that kind of graphics score?


----------



## JoeDirt

Quote:


> Originally Posted by *Vellinious*
> 
> That's a pretty nice score, man. What kind of clocks are you running on the core / memory to see that kind of graphics score?


Thanks. 2114MHz / 1382MHz:


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> I'm back at my stock offset overclock. Just found out that although 2151mhz seems nice, it results in lower min and max fps, average stays the same though.
> 
> Overclocking this cards are not for those who lack patience...


You can say that again lol -- my head is going to explode.

I'm gaining more with memory than I am with frequency. Frequency won't give me more than 3FPS at a push.


----------



## artemis2307

My friend's 1080 fe on water is doing 2150mhz fine. So bios limiter is real.
Nvidia sure is shady af lol


----------



## TWiST2k

Quote:


> Originally Posted by *artemis2307*
> 
> My friend's 1080 fe on water is doing 2150mhz fine. So bios limiter is real.
> Nvidia sure is shady af lol


Did someone say 970 4GB RAM? lol


----------



## GreedyMuffin

I can't get more than 24K, not even at 2200 core with +472 on mem... Tested 2100 as well. It won't go over 24K graphics score.


----------



## artemis2307

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can't get more than 24K. Not even with 2200/472+ on mem.. Testet 2100 as well. Won't go over 24K on graphic score.


give it a rest you greedy muffin


----------



## OZrevhead

I have a 19k score too, gpu at 2101mhz with 24391 graphics score:

19252 http://www.3dmark.com/3dm/15216090?


----------



## Fixxxer696

After much tinkering, I've got an 18442 score, GPU at 2113MHz with 24938 graphics on an FE on air, +203/+503. Not too shabby?

http://www.3dmark.com/fs/10458723


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> You can say that again lol -- my head is going to explode.
> 
> I'm gaining more with memory than I am with frequency. Frequency won't give me more than 3FPS at a push.


Same here. That's why I set the memory overclock first, then moved to the core clock. Since core and VRAM overclocks hinder each other, I'd rather stick with the one that pays out more.

vram > core
Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can't get more than 24K. Not even with 2200/472+ on mem.. Testet 2100 as well. Won't go over 24K on graphic score.


Can't you get that VRAM higher? Not even overclocking it first (before messing with the core clock)? IME, after the core clock reaches a certain number its performance gain stops or even starts to deteriorate, while the VRAM clock will go up to +575MHz with positive results.


----------



## iiydro

Submitted the form, but when it asked about cooling I selected liquid, thinking it meant my CPU.
I resubmitted for air!


----------



## fat4l

Still no BIOS tools, guys?
I'm glad I did the TDP hard mod... now I can enjoy stable MHz 24/7...


----------



## x7007

Has anyone had an issue with Tomb Raider and Mafia III, and maybe Quantum Break DX11 on Steam, where performance that was once 89+ FPS is now barely 55-60? I've had the same weird issue in Skyrim since I installed Windows 10 and couldn't fix it; in Windows 7 it worked fine. It works fine on other people's Windows 10 installs that I've tested, just not on mine.

I can't figure it out. I've seen 80-95 FPS in Tomb Raider, same settings, same everything; then something happened and now I barely get 55-60, and only 75 in the main menu, where I once got 100+.

What on earth changes in those specific games?

The WEIRD thing is that no matter if I change the resolution or quality, I don't gain as much performance as expected! It's almost the same, or just doesn't change at all!
What could this be?


----------



## Koniakki

Quote:


> Originally Posted by *x7007*
> 
> Did anyone had an issue with Tomb Raider and Mafia III ? and maybe Quantum Break Dx11 Steam , that the performance once I had 89+ Fps or so and now I barely have 55-60. I have the same weird issue in Skyrim since I installed windows 10 and I couldn't fix it... in windows 7 it worked fine. other people it works fine in windows 10 that I have tested but not with my version, only theirs.
> 
> I can't figure, I've seen 95-80 Fps in Tomb Raider, same settings , same everything, and after something happened and I get only barely 55-60 and in the main menu only 75 , once I did get 100+ .
> 
> what bloody changes in those specific games.
> 
> The WEIRD thing is no matter if I change the resolution or quality, I don't gain as much as performance expected ! it is almost the same or just doesn't change at all !
> WHAT could this be ?


Open CMD (command prompt) with admin rights/privileges,
then paste "*bcdedit /deletevalue useplatformclock*" without the quotes.

Press Enter; it should say success or something similar. Restart. Test FPS/performance again. Report back.

This will disable HPET (High Precision Event Timer) for Windows 10.

Edit: If it doesn't help, use CMD again and paste "*bcdedit /set useplatformclock true*" without the quotes to enable it again and keep troubleshooting.
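To see whether the value is even set before deleting it, you can query the boot configuration first. A small sketch (elevated command prompt assumed):

```shell
:: Show the current boot entry and filter for the HPET-related value.
:: If nothing prints, useplatformclock was never set, and the
:: /deletevalue step above will just report that the element is missing.
bcdedit /enum {current} | findstr /i useplatformclock
```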


----------



## x7007

Quote:


> Originally Posted by *Koniakki*
> 
> Open CMD (command prompt) with admin rights/privileges,
> then paste "*bcdedit /deletevalue useplatformclock*" without the quotes.
> 
> Press Enter; it should say success or something similar. Restart. Test again. Report back.
> 
> This will disable HPET (High Precision Event Timer) for Windows 10.
> 
> Edit: If it doesn't help, use CMD again and paste "*bcdedit /set useplatformclock true*" without the quotes to enable it again and keep troubleshooting.


I checked it; there is no such line.

I even have a fresh Windows install on another SSD. I only tried Mafia III there, which worked fine right after the patch, at the same time Tomb Raider worked fine; then suddenly both games stopped working right, no matter the settings.


----------



## Koniakki

Quote:


> Originally Posted by *x7007*
> 
> I checked it, there is no line
> 
> I even have fresh window installed on other SSD and I only tried Mafia III which worked fine right after the patch as same time Tomb Raider worked fine, and then straight both games doesn't work right, no matter what setting.


So the problem occurs only after the W10 1607/Anniversary patch/update?


----------



## x7007

Quote:


> Originally Posted by *Koniakki*
> 
> So the problem occurs only after the W10 1607/Anniversary patch/update?


From what I can tell, it started with Skyrim, which I couldn't play anymore in Windows 10. Then I did everything I could, and somehow all games started working perfectly: maximum FPS, no stuttering, no slowdowns. Then after a couple of days it dropped to 50%, just like Skyrim, for those two games. Tomb Raider is on an SSD and Mafia III is on a RAID 0 of three Samsung HDDs, so it can't be the drives. No matter if I change resolution or graphics settings, the FPS doesn't change... but if I go higher on resolution the FPS drops further, down to 15.


----------



## Koniakki

Quote:


> Originally Posted by *x7007*
> 
> From what happens , it started with skyrim which I couldn't play anymore in windows 10. then I did everything I could , and somehow all games started working just perfect, maximum fps, no stuttering no slowdowns, and then after couple of days it went to 50% just like skyrim for 2 games. and Tomb Raider is on SSD , Mafia III is on Raid0 x3 Samsung HDD , so it can't be any of this. No matter if I change resolution or graphics settings, the FPS doesn't change.. but if I got higher on resolution the FPS goes down more to 15 fps..


I had a similar scenario(FC4 went down from 90-100FPS at my usual test scene, to 60-80FPS) and the Disable HPET command helped greatly.

I had it enabled btw only because I had done a few x265 HWBot runs and it was the culprit for me.

But your case seems to need more in-depth troubleshooting, and possibly more info is needed to assist (like CPU/GPU usage, TDP usage, trying different benches/games to see if performance is down in general rather than in specific games, etc.).

Don't worry tho. I'm sure we will get to the bottom of it in the end.


----------



## x7007

Quote:


> Originally Posted by *Koniakki*
> 
> I had a similar scenario(FC4 went down from 90-100FPS at my usual test scene, to 60-80FPS) and the Disable HPET command helped greatly.
> 
> I had it enabled btw only because I had done a few x265 HWBot runs and it was the culprit for me.
> 
> But your case seems to need more in-depth troubleshoot and possibly more info is needed for assisting( like cpu/gpu usage, tdp usage, trying different benches/games to see if the performance is down in general instead of specific games, etc etc).
> 
> Don't worry tho. I'm sure we will get to the bottom of it in the end.


Good to hear. This really ruined my weekend, just when I finally wanted to start playing Quantum Break.

CPU usage is 50-70%; it goes higher on some occasions but doesn't stay there. The 3770K is overclocked to 4.3GHz, more than enough to run most games without issues at DSR 2560x1440.
GPU usage is 99%.
TDP usage I'm not sure about. Do I look in GPU-Z, NVIDIA Inspector, or just MSI Afterburner? Does it matter much which one? I'd like to know where to check it from now on.

Just Cause 3 works the same as before, fine, no issues.
Heaven Benchmark 4.0 works fine, though on the very first run I had a terrible freeze that dropped my minimum to only 9.5 FPS. After that I cycled through the areas again to the very same spot (the shadowed door it walks through) and it didn't freeze like before. I checked YouTube and everything looks the same as on other GTX 1080s.

I can't understand why Tomb Raider worked fine before, and I can't understand why I can't play Skyrim anymore in Windows 10. I even reinstalled the game and it was the same: in town I get only 45 FPS if I look straight down the stairs from the main keep tower. It doesn't matter if it's TriDef S3D (real 3D), the FPS is exactly the same.

Could it be the power cable to the GPU? I can't point to anything in the BIOS or Windows, but why would only a couple of games stop working when they worked before? And Skyrim worked fine on Windows 7 but not on Windows 10. The main issue is that changing the settings doesn't change the FPS. What could cause that: a piece of software, a driver, a setting, a Windows update?

I had this issue before, randomly; it would come and go, and I never really looked into it much. But for a couple of days everything just worked so well, the best you can imagine: always 60 FPS or more, never less. It was like a dream come true, exactly what I wanted, and I finally wanted to play, but now this happens. The worst imaginable.

Are there still people here on Z77 (Intel 7 Series)?
I ask because I had this service installed for a long time, and somehow it disappeared... so that's one thing to look for: maybe it's not something I installed, but something that was there and then wasn't.

Installing XTU brought it back.

https://communities.intel.com/thread/33910
That's quoted from the marked Correct Answer by Diego_Intel on Sep 13, 2013:

"Intel(R) Integrated Clock Controller Service - Intel(R) ICCS" is a service used for accessing the integrated clock controller in the PCH to adjust the clocks to the CPU (BCLK, DPCLK, and DPNSCLK). The graphics driver uses this service to adjust the graphics clocks (DPCLK & DPNSCLK) to perform clock bending. Clock bending adjusts the display clock frequencies to reduce screen flicker. Originally access to the ICC registers was only available internally to the PCH's embedded controller (ME) so the registers were exposed to host through the HECI interface. On Intel® 8 Series PCHs and beyond, the HW has changed allowing the graphics driver to directly access the display clock registers, and the "Intel(R) Integrated Clock Controller Service" should not be necessary with those chipsets.
In addition the "Intel(R) Integrated Clock Controller Service" is used by the Intel eXtreme Tuning Utility (XTU) to perform overclocking. Overclocking is more complicated with its larger frequency range and dynamic configuration, so the PCH's embedded controller and SW service are used to abstract the ICC implementation.
Disabling "Intel(R) Integrated Clock Controller Service - Intel(R) ICCS" on Intel 8 Series PCHs will only impact the ability to do runtime overclocking with the XTU. With older chipsets, it will also disable the ability to do clock bending (meaning you may get additional screen flicker).

EDIT: Looking at other GTX 1080 gameplay at 2560x1440, it seems they get the same performance I do. Maybe VXAO wasn't enabled the one time I tested? But then how come Mafia III worked well? I can't understand that.


----------



## x7007

Something weird.

When using MSI Afterburner it shows me "LIM: Voltage". Does that mean it has something to do with the voltage limit?

Now for some reason I tried Skyrim and the game runs well; it's no longer stuck at 45 FPS. From the same save I get 89+ FPS, and I can even reach 150-250 if I look at the sky. In town it's 79-80, where before it was 45 FPS max! Could it be items I had in my inventory? I use the inventory mod and the mouse input lag fix mod.

Maybe something changed because I previously ran Skyrim on another SSD with a fresh Windows 10 install? Why is this happening?


----------



## juniordnz

This is so beautiful...


----------



## Derpinheimer

Quote:


> Originally Posted by *juniordnz*
> 
> This is so beautiful...


I thought so too until I realized the orange was plexiglass and not copper.. Or is it copper and clear plexi?


----------



## juniordnz

Quote:


> Originally Posted by *Derpinheimer*
> 
> I thought so too until I realized the orange was plexiglass and not copper.. Or is it copper and clear plexi?


I believe it's plexi with orange LEDs. You can see the clear white plexiglass in the pictures where the LEDs are off.

Whether the rest of it is full copper, I don't know...


----------



## Noirgheos

So I'm looking at buying a 1080. I've narrowed it down to the Asus Strix A8G (ASUS recalled the O8G, so I'm good on that front) for $870 CAD, or the Gigabyte G1 Gaming 1080 for $830 CAD.

After my past 980 Ti G1 Gaming, I'm reluctant to go Gigabyte again; it feels cheap and flimsy, though then again it is the cheapest 1080 around. In terms of RMA they both seem quite terrible, but ASUS has a depot three hours away from me, so I'm leaning towards them. Thoughts?


----------



## jleslie246

Evga


----------



## Noirgheos

Quote:


> Originally Posted by *jleslie246*
> 
> Evga


Sorry, but the FTW is too expensive, and I don't feel like having it explode or black screen, as noted multiple times on Reddit.


----------



## jleslie246

Quote:


> Originally Posted by *Noirgheos*
> 
> Sorry, but the FTW is too expensive, and I don't feel like having it explode or black screen, as noted multiple times on Reddit.


Their customer service is phenomenal.


----------



## Noirgheos

Quote:


> Originally Posted by *jleslie246*
> 
> Their customer service is phenomenal.


That is true, and that's why I got an 850 P2 from them. Still, the FTW is $950, much more than the Strix and the G1 Gaming.


----------



## Derek1

Quote:


> Originally Posted by *jleslie246*
> 
> Their customer service is phenomenal.


+1:thumb:
I had a minor question regarding the slave BIOS, filed a ticket, and received a reply to my email within 5 minutes. Awesome customer support.
I have the FTW and am more than happy with it.
For the extra $30 I would go with the EVGA FTW over the Asus A8G.
I ordered the Asus O8G back in July, waited 2 months for it, and received absolutely no indication (or any reply to my repeated emails and texts to Asus) as to when it would be delivered, so after 2 months I finally decided to go with the EVGA FTW.
Best decision I made.

ETA: Right now the FTW is $920:
http://www.canadacomputers.com/product_info.php?cPath=43_1200_557_559&item_id=096784


----------



## Noirgheos

Quote:


> Originally Posted by *Derek1*
> 
> +1:thumb:
> I had a minor question regarding the Slave Bios and filed a ticket and received a reply to my email within 5 minutes. Awesome customer support.
> I have the FTW and am more than happy with it.
> For the extra 30$ I would go with the EVGA FTW over the Asus A8G.
> I ordered the Asus O8G back in July and waited 2 months for it and received absolutely no indication or reply to my repeated emails and texts to Asus as to when it would be delivered, and finally after waiting 2 months decided to go with the EVGA FTW.
> Best decision I made.


I'm ordering from Newegg, and it is in stock. Also, the difference is $80.


----------



## Derek1

Quote:


> Originally Posted by *Noirgheos*
> 
> I'm ordering from Newegg, and it is in stock. Also, the difference is $80.


Split the diff, 50$


----------



## Noirgheos

Quote:


> Originally Posted by *Derek1*
> 
> Split the diff, 50$


I don't follow...

Anyway, my choices are the G1 and A8G Strix. Which would you guys pick? Sadly I'm skipping EVGA this time around.


----------



## jleslie246

Quote:


> Originally Posted by *Derek1*
> 
> Split the diff, 50$


Quote:


> Originally Posted by *Noirgheos*
> 
> I'm ordering from Newegg, and it is in stock. Also, the difference is $80.


Try the EVGA website. It's worth the extra, IMO. Asus would be my next choice. If you make enough noise on the forums you will get customer service.


----------



## Noirgheos

Quote:


> Originally Posted by *jleslie246*
> 
> Try evga website. It's worth the extra imo. Asus would be my next choice. If you make enough noise on the forums you will get customer service.


Ordering from EVGA's site will force me to pay in Canadian dollars as well as pay duties, putting it over 1K CAD. Guess I'll grab the Strix. Also, I could just drive to the depot and drop it off really.


----------



## Derek1

Quote:


> Originally Posted by *jleslie246*
> 
> Try evga website. It's worth the extra imo. Asus would be my next choice. If you make enough noise on the forums you will get customer service.


Ya, the EVGA website is pricing the FTW at $679 US, which works out to $893 Canadian. That beats my Canada Computers link at $920.
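For anyone checking the arithmetic, the CAD figure is just the USD price times the exchange rate of the day. With an assumed rate of about 1.315 USD→CAD (my guess at the late-2016 rate, not an official figure) the numbers line up:

```python
def usd_to_cad(usd, rate=1.315):
    """Convert a USD price to CAD; the default rate is an assumed late-2016 spot rate."""
    return round(usd * rate)

# 679 USD at ~1.315 comes out to roughly 893 CAD, matching the post above
```

Duties and taxes come on top of that, which is where the "over 1K CAD" concern comes from.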


----------



## Noirgheos

Quote:


> Originally Posted by *Derek1*
> 
> Ya, the EVGA website is pricing the FTW at $679 US, which works out to $893 Canadian. That beats my Canada Computers link at $920.


I'll have to pay duties, which will put it over 1K.


----------



## outofmyheadyo

MSI Gaming X
Zotac AMP Extreme
EVGA FTW

Which one would you pick? It must be quiet with no coil whine, and if it overclocks like hell I'd be happy too. Prices are the same.


----------



## Noirgheos

Quote:


> Originally Posted by *outofmyheadyo*
> 
> MSI Gaming X
> Zotac AMP Extreme
> EVGA FTW
> 
> Which one would you pick? It must be quiet with no coil whine, and if it overclocks like hell I'd be happy too. Prices are the same.


So anyway, I just ordered the Strix, but you didn't list it. Out of the three you listed, I'd eliminate the MSI: clearly too expensive everywhere. EVGA is nice if you don't get a card with the black-screen issue or one that blows up, but their support is great, so they should take care of you. Zotac has the best cooling of the three; support is meh, but they get it done.

Whichever is cheaper out of the Zotac and EVGA is my pick. They can't all be the same price.


----------



## feznz

Quote:


> Originally Posted by *Noirgheos*
> 
> So I'm looking at buying a 1080. Currently, I've settled between the Asus Strix A8G (ASUS recalled the O8G, so I'm good on that front) for $870 CAD. Other one is the Gigabyte G1 Gaming 1080 for $830 CAD.
> 
> After my past 980 Ti G1 Gaming, I'm reluctant to go Gigabyte again. Feels cheap and flimsy, though then again, it is the cheapest 1080 around. In terms of RMA they both seem to be quite terrible, but ASUS has a depot 3 hours away from me, so I'm leaning towards them. Thoughts?


I have always had good components from Asus. I helped my friend put his Gigabyte GTX 1080 G1 in the other day, and all I can say is the Gigabyte card feels nice to handle and clocked to 2100MHz right off the bat.

On the other hand, I prefer the design of the Asus; the illuminated logo makes me dribble.

Besides, my choice of brand has already been made; it's just a question of the Strix 1080 now or waiting for the 1080 Ti.


----------



## Noirgheos

Quote:


> Originally Posted by *feznz*
> 
> I have always had good components from Asus. I helped my friend put his Gigabyte GTX 1080 G1 in the other day, and all I can say is the Gigabyte card feels nice to handle and clocked to 2100MHz right off the bat.
> 
> On the other hand, I prefer the design of the Asus; the illuminated logo makes me dribble.
> 
> Besides, my choice of brand has already been made; it's just a question of the Strix 1080 now or waiting for the 1080 Ti.


Ordered the Strix. Excited. Now I'm broke.


----------



## feznz

Quote:


> Originally Posted by *Noirgheos*
> 
> Ordered the Strix. Excited. Now I'm broke.


----------



## Thetbrett

Quote:


> Originally Posted by *rakesh27*
> 
> Hey guys,
> 
> I'm a little late to the party. Anyway, I recently got a Zotac AMP Edition 1080 and was overclocking using MSI Afterburner and the latest NVIDIA drivers. Is this normal: if I push the GPU past 2100 I get in-game crashes, and memory (not fully tested so far) is at +450?
> 
> I've got voltage and power turned all the way up. Should I use a different overclocking tool than the usual MSI Afterburner?
> 
> Thanks all..


I have an AMP and it's about the same. I add +104 to core and +400 to memory. It starts off OK, over 2100, but due to thermal downclocking it settles at 2038-2054. If I leave the case open and add a fan blowing more air on it, it will sit at 2090-2103, but I don't do that all the time; it needs to stay cool. Adding more than 25% voltage does nothing but add heat, so I leave it at 25%. The AMP is not a bad card: the same as the AMP Extreme except for a different cooler, at a much better price. I am perfectly happy with the performance. Quiet, too.
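That "settles at 2038-2054" behaviour is GPU Boost stepping the clock down one bin at a time (roughly 13 MHz per bin on Pascal) as the core heats up. Here's a toy model of the stepping; the temperature thresholds are made up for illustration, not NVIDIA's actual boost tables:

```python
BIN_MHZ = 13  # Pascal boost bins are roughly 13 MHz apart

def boosted_clock(peak_mhz, temp_c, throttle_start_c=37, c_per_bin=5):
    """Illustrative model: drop one boost bin per `c_per_bin` degrees above
    the throttle-start temperature. Thresholds are assumptions, not official."""
    if temp_c <= throttle_start_c:
        return peak_mhz
    bins_dropped = (temp_c - throttle_start_c) // c_per_bin + 1
    return peak_mhz - bins_dropped * BIN_MHZ
```

With a 2100 peak and a core in the low 60s, this toy model lands around 2035, in the same ballpark as the settled clocks described above, which is why extra case airflow buys a bin or two back.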


----------



## OZrevhead

Guys, do any factory bioses have higher voltage limits (and are compatible with my SC)?


----------



## TWiST2k

Quote:


> Originally Posted by *OZrevhead*
> 
> Guys, do any factory bioses have higher voltage limits (and are compatible with my SC)?


*Sigh*


----------



## OZrevhead

Does that mean no?


----------



## Fediuld

Quote:


> Originally Posted by *OZrevhead*
> 
> Does that mean no?


All 1080s are limited to 1.093V, and all of them start throttling from 32C.


----------



## Reefer

Guys, I read some posts saying some games are performing worse...

I had the same with some games... Check the NVIDIA GeForce Experience app. It set some of my games to max quality, meaning some games were at 200% rendering; that brought some games down to 50 FPS instead of over 100.

Good luck solving it.


----------



## OZrevhead

@Fediuld - Well, that sucks... is anyone working on a solution? Where can we find skyn3t?


----------



## sirleeofroy

Hi Guys

I have a Gainward GTX 1080 GLH which recently had an updated BIOS released (21/09/2016); however, the BIOS updater from Gainward hangs at 99% and never completes (I left it overnight, no dice).

I have also tried the updated BIOS for the Palit GameRock Premium (same card and same BIOS), but NVFlash fails due to a mismatch of IDs.

I've tried extracting the .rom from the exe, but NVFlash doesn't like it, even after changing the extension to ".rom".

Of course, the card is disabled, and I have tried the above in safe mode, to no avail.









Anyone have any ideas?


----------



## ucode

@OZrevhead only the special Asus Strix VBIOS allows over 1.093V (up to 1.2V), AFAIK. IMHO it's a great move by Asus: users of competitors' cards only have this option as a firmware solution, so if it's cross-flashed to a competitor's board, any super-high benches run with it will report it as an Asus card in the results. Many of those looking at the results will believe the Asus Strix to be the best-performing GTX 1080 on the planet, even though it may actually be another manufacturer's card.

@sirleeofroy the VBIOS files in the exe already come with the .rom extension. I've extracted them for you here.

11259-G10803BM0N1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.66, 90W, 230W, 276W
11260-G10803BM0N1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.67, 90W, 230W, 276W
11261-G10803BM0G1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.68, 90W, 230W, 276W


----------



## Koniakki

Quote:


> Originally Posted by *sirleeofroy*
> 
> Hi Guys
> 
> I have a Gainward GTX 1080 GLH which recently had an updated BIOS released (21/09/2016); however, the BIOS updater from Gainward hangs at 99% and never completes (I left it overnight, no dice).
> 
> I have also tried the updated BIOS for the Palit GameRock Premium (same card and same BIOS), but NVFlash fails due to a mismatch of IDs.
> 
> I've tried extracting the .rom from the exe, but NVFlash doesn't like it, even after changing the extension to ".rom".
> 
> Of course, the card is disabled, and I have tried the above in safe mode, to no avail.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone have any ideas?


What exact command did you use for the flash? Have you tried the -6 command?

E.g. "nvflash -6 x.rom" *or* "nvflash -5 -6 x.rom"? Or using "nvflash --protectoff" before doing the flash? Don't forget the "nvflash --protecton" command if/after the flash is successful.
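To keep those variants straight, here's a tiny helper that just assembles the command lines discussed in this thread. The flags are the ones named above; whether they're right for your card is another matter, and nothing here actually runs nvflash:

```python
def nvflash_cmd(rom, force_pci=False, override_sub=False, protect=None):
    """Build an nvflash argument list.

    force_pci    -> adds -5 -6 (the "force past ID mismatch" switches from this thread)
    override_sub -> adds --overridesub
    protect      -> "off"/"on" builds the separate --protectoff / --protecton step instead
    """
    if protect is not None:
        return ["nvflash", f"--protect{protect}"]
    cmd = ["nvflash"]
    if force_pci:
        cmd += ["-5", "-6"]
    if override_sub:
        cmd.append("--overridesub")
    cmd.append(rom)
    return cmd
```

You'd hand the result to something like subprocess.run only once you're sure about the ROM; the usual "--protectoff, flash, --protecton" sequence is three separate invocations.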

Quote:


> Originally Posted by *x7007*
> 
> Something weird.
> 
> When using MSI Afterburner it shows me "LIM: Voltage". Does that mean it has something to do with the voltage limit?
> 
> Now for some reason I tried Skyrim and the game runs well; it's no longer stuck at 45 FPS. From the same save I get 89+ FPS, and I can even reach 150-250 if I look at the sky. In town it's 79-80, where before it was 45 FPS max! Could it be items I had in my inventory? I use the inventory mod and the mouse input lag fix mod.
> 
> Maybe something changed because I previously ran Skyrim on another SSD with a fresh Windows 10 install? Why is this happening?


What's the update? I hope it's solved.


----------



## bloot

Quote:


> Originally Posted by *sirleeofroy*
> 
> Hi Guys
> 
> I have a Gainward GTX 1080 GLH which recently had an updated BIOS released (21/09/2016); however, the BIOS updater from Gainward hangs at 99% and never completes (I left it overnight, no dice).
> 
> I have also tried the updated BIOS for the Palit GameRock Premium (same card and same BIOS), but NVFlash fails due to a mismatch of IDs.
> 
> I've tried extracting the .rom from the exe, but NVFlash doesn't like it, even after changing the extension to ".rom".
> 
> Of course, the card is disabled, and I have tried the above in safe mode, to no avail.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone have any ideas?


Hi,

Had the exact same problem with the official BIOS updater on a Palit GTX 1080 Super JetStream; no problem with nvflash though, just make sure you use a version with the certificate check bypassed.


----------



## sirleeofroy

Quote:


> Originally Posted by *Koniakki*
> 
> What exact command did you use for the flash? Have you tried the -6 command?
> 
> E.g: "nvflash -6 x.rom" *or* "nvflash -5 -6 x.rom"? Or using the "nvflash --protectoff" before doing the flash? Don't forget using "nvflash --protecton" command if/after flash is succesful.


I've tried the commands above but get the error below:

"ERROR: can't find support list magic number"

Maybe an issue with the extracted ROM?


----------



## sirleeofroy

Quote:


> Originally Posted by *ucode*
> 
> @OZrevhead only the special Asus Strix VBIOS allows over 1.093V (up to 1.2V), AFAIK. IMHO it's a great move by Asus: users of competitors' cards only have this option as a firmware solution, so if it's cross-flashed to a competitor's board, any super-high benches run with it will report it as an Asus card in the results. Many of those looking at the results will believe the Asus Strix to be the best-performing GTX 1080 on the planet, even though it may actually be another manufacturer's card.
> 
> @sirleeofroy the VBIOS files in the exe already come with the .rom extension. Have extracted for you here.
> 
> 11259-G10803BM0N1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.66, 90W, 230W, 276W
> 11260-G10803BM0N1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.67, 90W, 230W, 276W
> 11261-G10803BM0G1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.68, 90W, 230W, 276W


Many thanks for your reply and the link; I've now updated the BIOS. I don't get why I couldn't find the .rom file within the exe, though.

Will get to some testing and see if there are any improvements in performance/overclocking.


----------



## nrpeyton

Quote:


> Originally Posted by *bloot*
> 
> Hi,
> 
> Had the exact same problem with the official BIOS updater on a Palit GTX 1080 Super JetStream, had no problem with nvflash though, just make sure you use a version with certificate checks bypassed.


The version with 'certificates bypassed' *doesn't* work with 'signed' BIOSes. The STRIX/T4 (volt-modified BIOS) was originally supplied to one of the top LN2 world champions by ASUS specifically for record breaking. The LN2 guy was nice enough to release the BIOS to the public. But since it *is* still an ASUS-signed BIOS, you don't need to use the "certificates bypassed" version of NVFLASH for it to work.

In fact, the 'bypassed' version will *NOT* work with official, 'signed' BIOSes.

Use the standard version with the following command: nvflash --overridesub [filename.rom] (without the [ ]'s)

Nick


----------



## nrpeyton

Are there any other volt-modified BIOSes apart from the STRIX/T4? Even if they are *not* signed, I *still* want to try them, as the STRIX/T4 one is not compatible with my EVGA Classified 1080. I am desperately looking for another volt-modified BIOS to try. I am actually thinking of sending my card back and getting an ASUS one. :-(


----------



## bloot

Quote:


> Originally Posted by *nrpeyton*
> 
> The version with 'certificates bypassed' *doesn't* work with 'signed' BIOSes. The STRIX/T4 (volt-modified BIOS) was originally supplied to one of the top LN2 world champions by ASUS specifically for record breaking. The LN2 guy was nice enough to release the BIOS to the public. But since it *is* still an ASUS-signed BIOS, you don't need to use the "certificates bypassed" version of NVFLASH for it to work.
> 
> In fact the 'bypassed' version will *NOT* work with official, 'signed' BIOS's.
> 
> Use the standard version with the following command: nvflash --overridesub [filename.rom] without the [ ]'s
> 
> Nick


Thanks for the info, I didn't know that.


----------



## Fediuld

Quote:


> Originally Posted by *ucode*
> 
> @OZrevhead only special Asus strix VBIOS allows over 1.093V (up to 1.2V) AFAIK. IMHO great move by Asus as users of competitors cards only have this option as a firmware solution so if it's cross flashed to a competitors board then any super high benches run with it will report it as an Asus card in the results. Many of those looking at the results will believe the Asus Strix to be the best performing GTX 1080 on the planet even though it may actually be another manufacturers card.
> 
> @sirleeofroy the VBIOS files in the exe already come with the .rom extension. Have extracted for you here.
> 
> 11259-G10803BM0N1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.66, 90W, 230W, 276W
> 11260-G10803BM0N1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.67, 90W, 230W, 276W
> 11261-G10803BM0G1.rom,10DE:1B80, GeForce GTX 1080 VGA BIOS, 86.04.3B.00.68, 90W, 230W, 276W


However, the performance doesn't improve. Even the best watercooled Strix, showing 2200+ core and the 1.2V limit, has the same real-world performance as 2177/2164 at 1.081V/1.093V.


----------



## khemist

https://imageshack.com/i/poJMgOSIj

Managed to bodge this onto my 1080 FE; temps are hardly hitting 50C now!


----------



## ZeroSeventy

I've been a happy owner of Palit's Super JetStream, but I ran into a problem while flashing the newest BIOS released by Palit: no matter what I do, be it in safe mode, with admin rights or without, the damn flashing tool doesn't go beyond 99%.

Could anyone help out, extract the BIOS .rom file for the GTX 1080 Super JetStream from Palit's tool, and upload it anywhere for me? I would be eternally grateful!

I've been trying to do it myself but couldn't get past the password-protected archive. I've seen people who have done it, though; there are apparently 6 BIOS files, 4 for GTX 1070 models and 2 for the 1080 Super JetStream and GameRock Premium models.
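If you can get at the raw bytes of the updater (say, a memory dump or the unpacked archive), VBIOS images can usually be located by the standard PCI option-ROM header: 0x55 0xAA at the start, with the image length in 512-byte blocks at offset 2. A rough scanner along those lines; the offsets come from the option-ROM format, but treat any hit as a candidate to inspect, not a guaranteed flashable image:

```python
def find_option_roms(blob: bytes):
    """Return (offset, size_in_bytes) candidates for PCI option-ROM images."""
    hits = []
    i = 0
    while True:
        i = blob.find(b"\x55\xAA", i)          # option-ROM signature
        if i == -1 or i + 3 > len(blob):       # no match, or size byte missing
            break
        size = blob[i + 2] * 512               # length field, in 512-byte units
        if size:
            hits.append((i, size))
        i += 2
    return hits
```

Anything you carve out this way should still be sanity-checked in a BIOS editor before flashing; the safer route is usually a ready-made dump from the TechPowerUp VGA BIOS database.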


----------



## OZrevhead

Good work khemist, some ghetto mods are the best mods.










@ucode - thanks mate, will the Strix/T4 BIOS have a negative effect on my SC?









Quote:


> Originally Posted by *Fediuld*
> 
> However, the performance doesn't improve. Even the best watercooled Strix, showing 2200+ core and the 1.2V limit, has the same real-world performance as 2177/2164 at 1.081V/1.093V.


Yes, but I am 100MHz off that; if I can bump the volts and get 2150+, I would be happy.


----------



## sirleeofroy

Quote:


> Originally Posted by *ZeroSeventy*
> 
> I've been a happy owner of Palit's Super JetStream. But I ran into a problem while flashing the newest bios released by Palit, no matter what I do be it in safe mode, with admin rights or without, the damn flashing tool doesn't go beyond 99%.
> 
> Could anyone help out and extract the bios .rom file for GTX1080 Super JetStream from the Palit's tool and upload in anywhere for me? I would be eternally grateful!
> 
> I've been trying to do it myself, but couldnt go past the password protected archive, yet seen people who has done it, there is apparently 6 bios files, 4 for GTX1070 models and 2 for 1080 Super JetStream and Gamerock Premium models.


See here, buddy: https://www.techpowerup.com/vgabios/186382/palit-superjetstream-8192-160921

Sounds like you've had the exact same issue as me; I also got to that same protected archive and couldn't figure out how to get to the ROM files. Fortunately a member here uploaded them for me, as the link I've given you above does not have updated files for my card. It was my first port of call!


----------



## ZeroSeventy

Quote:


> Originally Posted by *sirleeofroy*
> 
> See here buddy - https://www.techpowerup.com/vgabios/186382/palit-superjetstream-8192-160921
> 
> Sounds like you've had the exact same issue as me, I also got to that same protected archive and couldn't figure out how to get to the ROM files. Fortunately a member here uploaded them for me as the link I've give you above does not have updated files for my card. It was my first port of call!


You're a lifesaver, man! I'd been looking on TechPowerUp, but instead of checking all the listings I just checked the GTX 1080 models, and there only the GameRock Premium is updated.

Thanks a bunch!


----------



## khemist

Quote:


> Originally Posted by *OZrevhead*
> 
> Good work khemist, some ghetto mods are the best mods.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @ucode - thanks mate, will the Strix/T4 BIOS have a negative effect on my SC?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes but I am 100mhz off that, if I can bump the volts and get 2150+ I would be happy.


I say bodge because I'm using some of the mounting hardware from both the AMD and NVIDIA fitting kits, just whatever worked; this cooler has been out for many years.


----------



## ucode

Quote:


> Originally Posted by *Fediuld*
> 
> However, the performance doesn't improve. Even the best watercooled Strix, showing 2200+ core and the 1.2V limit, has the same real-world performance as 2177/2164 at 1.081V/1.093V.


If the card is already at the end of its voltage scaling, then extra voltage isn't going to help.

AFAIK galeonki used the T4 to reach a Fire Strike graphics score of 26,337
http://www.3dmark.com/fs/10391228

Could you please provide link(s) for cards scoring the same or more at 1.093V / 2177? I'd like to know about them.


----------



## Bishop07764

Quote:


> Originally Posted by *ucode*
> 
> If the card is already at the end of its voltage scaling, then extra voltage isn't going to help.
> 
> AFAIK galeonki used the T4 to reach a Fire Strike graphics score of 26,337
> http://www.3dmark.com/fs/10391228
> 
> Please would you provide link(s) for cards scoring the same or more at 1.093V / 2177, I'd like to know about them.


Wow, that's quite a good Fire Strike graphics score. I bet the memory helped more than anything on that one. I just discovered that I accidentally had my memory overclocked maybe 25-50MHz too high; my score jumped almost 600 points from fixing that alone. http://www.3dmark.com/3dm/15466953


----------



## Ale10

Hi guys. I'm looking to buy an Asus 1080, but I don't know which model to get. Are the A8G and O8G cards the same? I've been waiting about 2 months for the O8G, so I'm planning to get an A8G now.

Are the cards the same, or are the O8G chips binned by Asus as the luckiest in the chip lottery?

I'm planning to OC, and I know I can flash the O8G BIOS to the A8G, but will it be stable at the O8G default clocks?

Thanks


----------



## Tdbeisn554

Quote:


> Originally Posted by *Ale10*
> 
> Hi guys. I'm looking to buy an Asus 1080, but I don't know which model to get. Are the A8G and O8G cards the same? I've been waiting about 2 months for the O8G, so I'm planning to get an A8G now.
> 
> Are the cards the same, or are the O8G chips binned by Asus as the luckiest in the chip lottery?
> 
> I'm planning to OC, and I know I can flash the O8G BIOS to the A8G, but will it be stable at the O8G default clocks?
> 
> Thanks


The O8G has higher stock clocks, so those cards will at least hit those speeds. You can have bad luck and have the stock speed as their max, or be lucky and get a super good OC chip. Same with the A8G: with a bit of luck the A8G will clock higher than the O8G, but with bad luck the card will only reach its stock speed (which is a bit lower than the O8G's).
But I am pretty sure both cards will perform about the same, at least 1900+ MHz.

Whether they are binned by Asus, I'm not really sure. But the O8G cards will all hit at least 1936MHz boost, while the others will have a lower boost speed. Since you want to overclock, I don't think there will be a huge difference between the two cards.


----------



## SauronTheGreat

Dear fellow gamers, I've recently run into a bit of a problem. When I'm playing GTA V Online, sometimes my game crashes, a black screen comes up, and the display goes off. I tried uninstalling the driver with DDU and reinstalling it, then ran Unigine Heaven for almost 80 minutes, then played GTA V for almost 2 hours, and my game crashed again. Then I reverted to the old driver and ran Unigine Heaven again; this time Heaven stopped working, but the display did not turn off, it was just an application crash. So I restarted my computer and switched my motherboard BIOS back to its default settings, then ran Unigine Heaven for almost 30 minutes and later played GTA V for more than an hour. Please also note I have no overclocking software enabled; the first game to crash was Mafia III. I've had no crashes since using the default motherboard BIOS, but what should I do next if it crashes again? I plan to run Unigine Heaven for a few hours. I have Windows 10 64-bit Education.


----------



## Tdbeisn554

So, people with 1080 Classys: does the white light on top of the EVGA logo bother anyone else?


----------



## Ale10

Quote:


> Originally Posted by *Archang3l*
> 
> The O8G has higher stock clocks, so those cards will at least hit those speeds. You can have bad luck and have the stock speed as their max, or be lucky and get a super good OC chip. Same with the A8G: with a bit of luck the A8G will clock higher than the O8G, but with bad luck the card will only reach its stock speed (which is a bit lower than the O8G's).
> But I am pretty sure both cards will perform about the same, at least 1900+ MHz.
> 
> Whether they are binned by Asus, I'm not really sure. But the O8G cards will all hit at least 1936MHz boost, while the others will have a lower boost speed. Since you want to overclock, I don't think there will be a huge difference between the two cards.


Thanks! I'll wait another 1-2 weeks, and if the O8G still isn't available I'll take the A8G.


----------



## Tdbeisn554

Quote:


> Originally Posted by *Ale10*
> 
> Thanks! I'll wait another 1-2 weeks, and if the O8G still isn't available I'll take the A8G.


Pretty sure both cards will perform about the same, but there is a chance the O8G will have a slightly better chip.

(You can have a potato too... my Classified, with all its overkill power phases, etc., can't go above 2050 without artifacting like crazy.)


----------



## GreedyMuffin

Will be getting a 4790K that can do 4.9 GHz at 1.350 V. It's replacing my 5960X setup (I needed the money for the outboard I wanted).

I guess a 4790K at 4.9 GHz should perform better than a 5960X at 4.5 GHz in games?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Will be getting a 4790K that can do 4.9 GHz at 1.350 V. It's replacing my 5960X setup (I needed the money for the outboard I wanted).
> 
> I guess a 4790K at 4.9 GHz should perform better than a 5960X at 4.5 GHz in games?


Yes, except in games that can use the extra cores.


----------



## Derek1

Quote:


> Originally Posted by *SauronTheGreat*
> 
> Dear fellow gamers, I've recently run into a bit of a problem. When I'm playing GTA V Online, sometimes my game crashes, a black screen appears, and the display goes off. I uninstalled the driver with DDU and reinstalled it, then ran Unigine Heaven for almost 80 minutes and played GTA V for almost two hours, and my game crashed again. I then reverted to the old driver and ran Heaven again; this time Heaven stopped working, but the display didn't turn off, it was just an application crash. So I restarted my computer and switched to my motherboard's default BIOS settings, then ran Heaven for almost 30 minutes and later played GTA V for more than an hour. Please also note that I have no overclocking software enabled; the first game to crash was Mafia III. I've had no crashes since using the default motherboard BIOS, but what should I do next if there's another crash? I plan to run Heaven for a few hours. I'm on Windows 10 64-bit Education.


Do you get a message that the display driver has stopped working but has recovered?

If you do not see the message go into the Event Viewer to see what has happened.


----------



## SauronTheGreat

Quote:


> Originally Posted by *Derek1*
> 
> Do you get a message that the display driver has stopped working but has recovered?
> 
> If you do not see the message go into the Event Viewer to see what has happened.


No, I never got a message like that. In all the crashes, the event was logged as critical because the system rebooted without being cleanly shut down first. I saw a GPU adapter error in the Event Viewer; all I saw was that critical error you mentioned and an Ethernet link disconnection warning.


----------



## Derek1

Quote:


> Originally Posted by *SauronTheGreat*
> 
> No, I never got a message like that. In all the crashes, the event was logged as critical because the system rebooted without being cleanly shut down first. I saw a GPU adapter error in the Event Viewer; all I saw was that critical error you mentioned and an Ethernet link disconnection warning.


Then this is an error in Windows. It times out or something. Google the error or go to Microsoft and search for "Display driver has stopped responding but has recovered."

I had this error while using Steam on the 372.90 driver and went back to the previous version. I haven't had it on the newest driver though.

ETA: I can't be sure, but I believe I was attempting to optimize my games through the Nvidia Experience stuff and changing settings in the 3D panel of the Nvidia Control Panel at the time. I remember resetting all of that to default.
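For what it's worth, that "display driver stopped responding" message is Windows' Timeout Detection and Recovery (TDR) kicking in. One commonly suggested workaround (only a sketch, at your own risk; registry edits can cause other problems, and this masks the symptom rather than fixing an unstable clock or driver) is to raise the timeout via the `TdrDelay` registry value:

```shell
:: Raise Windows' GPU Timeout Detection and Recovery (TDR) delay to 10 seconds
:: (the default is 2 seconds; 10 is an arbitrary example value).
:: Run from an elevated Command Prompt, then reboot for the change to apply.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrDelay /t REG_DWORD /d 10 /f

:: To revert to the default behaviour later:
reg delete "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrDelay /f
```

If the crashes keep happening with a longer delay, the card or driver is genuinely unstable and the timeout isn't the issue.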


----------



## Lays

I saw someone in another thread mentioning there's a BIOS floating around for MSI Gaming X with 1.1v available, but I can't seem to find it literally anywhere. Does anyone know where I can get it?

I have an MSI EK Seahawk X, so it's just a Gaming X with an EK Block on it from the factory. It seems to do 2164 no problem with the 1.062v the stock BIOS allows, was hopeful that if I can get the 1.1v BIOS I can maybe do 2185 or something.


----------



## nrpeyton

Quote:


> Originally Posted by *khemist*
> 
> https://imageshack.com/i/poJMgOSIj
> 
> Managed to bodge this on to my 1080 FE, temps hardly hitting 50c now!.


What exactly is it that you have attached to your card?

And what about cooling your VRM & memory? Or is there a water block underneath that too, and the fans are actually just cooling a heatsink attached to your VRM/memory?


----------



## khemist

Gelid Icy Vision Rev 2 cooler, and I have heatsinks attached to the memory and VRM etc.

I have an EVGA SC cooler on the way; I'll use whichever cools best.


----------



## feznz

Quote:


> Originally Posted by *Ale10*
> 
> Hi guys. I'm looking to buy an Asus 1080, but I don't know which model to take. Are the A8G and O8G cards the same? I've been waiting about 2 months for the O8G, so I'm planning to take an A8G now.
> 
> Are the cards the same? Or are the O8Gs selected by Asus as the luckiest in the chip lottery?
> 
> I'm planning to OC, so I know that I can flash the O8G BIOS to the A8G, but will it be stable at the O8G default clocks?
> 
> Thanks


I would assume the chips are binned to a certain extent, and if I am correct there will be another series of 1080s to come with the highest-binned chips.
Just going by the 980 Ti series, there were the Gold edition and the Matrix, released way after the Strix, that had mad clocks on them.
That said, it is a hefty premium to pay for a better chip.


----------



## KickAssCop

Quote:


> Originally Posted by *Ale10*
> 
> Hi guys. I'm looking to buy an Asus 1080, but I don't know which model to take. Are the A8G and O8G cards the same? I've been waiting about 2 months for the O8G, so I'm planning to take an A8G now.
> 
> Are the cards the same? Or are the O8Gs selected by Asus as the luckiest in the chip lottery?
> 
> I'm planning to OC, so I know that I can flash the O8G BIOS to the A8G, but will it be stable at the O8G default clocks?
> 
> Thanks


I also waited but bought the A8G. I flashed both my A8G cards to O8G clocks, so they now boost to 2025 by default. I have only managed to max them out at 2050 after extended hours of gaming (they start at 2076 but throttle down two notches).

Honestly, apart from some ePENOS marks there is no benefit to an O8G vs. an A8G in real-world gameplay. It is like a 1-2 fps difference, which could also be margin of error.


----------



## nrpeyton

Quote:


> Originally Posted by *KickAssCop*
> 
> I also waited but bought the A8G. I flashed both my A8G cards to O8G clocks, so they now boost to 2025 by default. I have only managed to max them out at 2050 after extended hours of gaming (they start at 2076 but throttle down two notches).
> 
> Honestly, apart from some ePENOS marks there is no benefit to an O8G vs. an A8G in real-world gameplay. It is like a 1-2 fps difference, which could also be margin of error.


When you say your A8Gs maxed out at 2050-2076 with the OC BIOS -- is that their max DEFAULT boost or the max OVERCLOCK? If you overclock them manually, what is your max stable boost then?


----------



## rdhrdh

Quote:


> Originally Posted by *nrpeyton*
> 
> When you say your A8Gs maxed out at 2050-2076 with the OC BIOS -- is that their max DEFAULT boost or the max OVERCLOCK? If you overclock them manually, what is your max stable boost then?


I was in the same boat, debating between the A8G and O8G. Ended up with the A8G; the first card had artifacting in the stock OC mode, so I sent it back. The second card OCs like a champ: I get a stable 2110-2150 while gaming, with temps in the mid-to-high 50s on stock cooling. I've hit 25.5k in Firestrike a couple of times, with 24.8-25k being the average. Overall I'm really happy with the card.


----------



## nrpeyton

Quote:


> Originally Posted by *rdhrdh*
> 
> I was in the same boat, debating between the A8G and O8G. Ended up with the A8G; the first card had artifacting in the stock OC mode, so I sent it back. The second card OCs like a champ: I get a stable 2110-2150 while gaming, with temps in the mid-to-high 50s on stock cooling. I've hit 25.5k in Firestrike a couple of times, with 24.8-25k being the average. Overall I'm really happy with the card.


Did you have to pay a re-stocking fee?


----------



## nrpeyton

Quote:


> Originally Posted by *khemist*
> 
> Gelid Icy Vision Rev 2 cooler, and I have heatsinks attached to the memory and VRM etc.
> 
> I have an EVGA SC cooler on the way; I'll use whichever cools best.


So you took the old cooler assembly off the circuit board, and this aftermarket one just attaches onto the board with its own card-sized heatsink and a copper plate for the GPU chip? Hmm, never knew you could do that with air.

I get max temps of 55°C on my 1080 Classified, and I'm trying to decide whether it would be worthwhile going for a universal block. I can't see temps getting down to 30°C, and if they only reach 40°C it's hardly worthwhile for a 10°C difference... especially as the VRM & memory would no longer be properly cooled either.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> So you took the old cooler assembly off the circuit board, and this aftermarket one just attaches onto the board with its own card-sized heatsink and a copper plate for the GPU chip? Hmm, never knew you could do that with air.
> 
> I get max temps of 55°C on my 1080 Classified, and I'm trying to decide whether it would be worthwhile going for a universal block. I can't see temps getting down to 30°C, and if they only reach 40°C it's hardly worthwhile for a 10°C difference... especially as the VRM & memory would no longer be properly cooled either.


Your VRAM and VRM are already not properly cooled. The ACX 3.0 design blows hot air that has already passed through the heatsink fins onto the plate that (supposedly) makes contact with the VRM and VRAM modules. A dedicated GPU block plus something blowing cool air on the front plate should actually cool the VRM and VRAM better.

I'd totally go for a universal GPU block if I had easy and cheap access to one. I'll probably just hook up an AIO cooler with a G10 (90mm fan blowing cool air on the VRM/front plate).


----------



## Derek1

Do you have the FTW Junior? Is the G10 compatible?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Do you have the FTW Junior? Is the G10 compatible?


Yes, I do. And yes, it is. The only problem is that, because of the front plate, most AIO coolers like Corsair's can't make contact with the GPU die, so you need to put a copper shim between them. IME a 2mm-thick copper shim is enough. I currently have a G10 sitting here that I'll have to mod to make it compatible with the new Corsair coolers (H100i V2). But if you use an older version like the H75, H90, or H105 it is pretty easy to install.

Once I get everything sorted out I'll attach an H100i V2 to the FTW, and hopefully everything will work fine.


----------



## rdhrdh

Quote:


> Originally Posted by *nrpeyton*
> 
> Did you have to pay a re-stocking fee?


Negative. It was within 30 days, so it was a no-nonsense exchange.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Yes, I do. And yes, it is. The only problem is that, because of the front plate, most AIO coolers like Corsair's can't make contact with the GPU die, so you need to put a copper shim between them. IME a 2mm-thick copper shim is enough. I currently have a G10 sitting here that I'll have to mod to make it compatible with the new Corsair coolers (H100i V2). But if you use an older version like the H75, H90, or H105 it is pretty easy to install.
> 
> Once I get everything sorted out I'll attach an H100i V2 to the FTW, and hopefully everything will work fine.


I suppose that includes EVGA's own AIO as well, because when I asked support whether their kit was compatible with the FTW they said no: it "has a custom PCB with a different layout and size. The Hybrid cooler is only designed for the reference design boards."
But perhaps they were only referring to the bracket assembly rather than the block/pump head unit. So possibly no shim would be needed for that?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> I suppose that includes EVGA's own AIO as well. Because when I asked support if their kit was compatible with the FTW they said No because it "has a custom PCB with a different layout and size. The Hybrid cooler is only designed for the reference design boards."
> But perhaps they were only referring to the bracket assembly rather than the block/pump head unit. So possibly may not need a shim for that?


The EVGA Hybrid kit is also made by Asetek, so the water block size should be the same as other brands like Corsair's. If that's the case, it won't fit without a shim either. The problem is not the PCB or the screw holes or whatever; it's the front plate, which has 4 "fingers" near the GPU die, and those won't let the water block get close enough to make contact with the die.

I can assure you that ANY Asetek cooler + a Kraken G10 will fit the FTW if a 2mm or thicker shim is used.


----------



## x-apoc

I felt uneasy about my temps on 1080 ftw and bought better thermal pads and replaced them. This is what I found.

Before.



After.



My room temp is about 75°F.
GPU idle temp is 40°C without the fans spinning.

While gaming it's 58-60°C on auto fan, without adjusting fan speed, and with a bit of an overclock (+100 core, +500 mem).


----------



## KickAssCop

Quote:


> Originally Posted by *nrpeyton*
> 
> When you say your A8Gs maxed out at 2050-2076 with the OC BIOS -- is that their max DEFAULT boost or the max OVERCLOCK? If you overclock them manually, what is your max stable boost then?


After the OC BIOS they stabilize at 2025. I can add a max of +45 on the core before the display driver stops working. Net after an hour of gaming is stable at 2050 for me, which is what I report in my sig.


----------



## nrpeyton

Hmm, I'm going to have to find a good universal block for my Classified 1080.

That's the easy bit.

Not sure what I'm going to do about the VRM/memory etc. It will be the first time I've even taken one apart.

Getting a max of 55-58°C in games anyway, so I also need to decide whether it's even worthwhile.

It has a 14+3 phase VRM.

I definitely also want to buy a water chiller if I see a good difference initially with water.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm I'm going to have to find a good universal block for my Classified 1080.
> 
> That's the easy bit.
> 
> Not sure what I'm going to do about VRM/memory etc. Will be the first time I've even taken one apart.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Getting max of 55 - 58c in games anyway so also need to decide if it even worthwhile.
> 
> Has a 14+3 phase VRM.
> 
> I definitely also want to buy a waterchiller if I see a good difference initially with water.


I have universal blocks; the problem with them is there's no card support, so the card ends up drooping down at the end.
Next time around I will be throwing my universals in the bin and buying a full-cover block. Admittedly these blocks have done two sets of cards, but they are more hassle than they are worth. I was lucky with my MSI GTX 580s: they had a full-cover base heatsink for the memory and VRMs. My Asus GTX 770s have had no memory heatsinks, as it was simply too hard to fit them afterwards, but to be honest there is no need to actively cool the memory.
I looked at heatsink solutions, and really I was better off going full cover in the end. The initial cost is a little more, but the benefits outweigh the cost savings, IMHO.

I might also add that chances are a GTX 1080 will go a long way, to the stage where you might even skip the 11xx series. I skipped the 9xx series simply because there was a hefty premium for a 50% gain; now that I am looking at 100% with one card, I simply feel compelled to upgrade.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> I have universal blocks; the problem with them is there's no card support, so the card ends up drooping down at the end.
> Next time around I will be throwing my universals in the bin and buying a full-cover block. Admittedly these blocks have done two sets of cards, but they are more hassle than they are worth. I was lucky with my MSI GTX 580s: they had a full-cover base heatsink for the memory and VRMs. My Asus GTX 770s have had no memory heatsinks, as it was simply too hard to fit them afterwards, but to be honest there is no need to actively cool the memory.
> I looked at heatsink solutions, and really I was better off going full cover in the end. The initial cost is a little more, but the benefits outweigh the cost savings, IMHO.
> 
> I might also add that chances are a GTX 1080 will go a long way, to the stage where you might even skip the 11xx series. I skipped the 9xx series simply because there was a hefty premium for a 50% gain; now that I am looking at 100% with one card, I simply feel compelled to upgrade.


I would love to go full water block, but they're not available for the 1080 Classified. EK nearly did one, but they cancelled it completely.

Hmm, I see. I upgraded from 2x SLI 980s to a single 1080 Classified;

Could have skipped this generation, but my excitement got the better of me lol.

Sold both for £450, so I only really paid £250 for the 1080. This way I also spread the cost of my next upgrade. (In 12 months my 1080 will still be worth £££.)

I will *not SLI again*... the support is pathetic. With the exception of 'The Witcher 3', which was fantastic in SLI.

My single 1080 is giving me about 5 FPS extra "average" at 4K over my twin 980s. The "minimum" is where I see the biggest difference.
The workload & temps on my CPU are also *much lower*.

*Thanks for the information about your water experience* -- what do you mean by "droop"? What were your temps like with the universal block? My 1080 Classified normally pulls 250 watts (gaming) and never exceeds 55-58°C.


----------



## Tdbeisn554

My replacement for my classy is on the way







Let's hope the card will have no coil whine and a better chip which will OC to at least 2100...


----------



## chiknnwatrmln

I went from 290X CF to a 1080... couldn't be happier. Even though raw performance is a little lower (28k FS vs 24.5k), so many games don't support multi-GPU that it makes it worth it.

Fallout 4, DayZ, Skyrim, Forza Apex, etc. The list goes on. It's a shame really; developers should focus more on multi-GPU support.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> My replacement for my classy is on the way
> 
> 
> 
> 
> 
> 
> 
> Let's hope the card will have no coil whine and a better chip which will OC to at least 2100...


I was in the same boat mate. But I was disappointed that I couldn't get to 2200. But what I was even more annoyed about is how ASUS owners get to take advantage of voltage up to the HARD limit. All other manufacturers are limited by BIOS.

Tomorrow is my last day to 'change my mind'. Distance selling regulations here are 14 days. But a 10% restock fee applies. (Since there was nothing wrong with card).

Lack of full waterblock annoyed me too. But it wasn't a deal breaker. The BIOS was though.

*However --* in the end I refuse to pay £70 restocking fee *and* have the worry that the new STRIX OC won't clock at least as high as my Classified. I could swap it and end up with a card that only boosts to 1936mhz (factory OC for STRIX OC 1080). -- Not a risk I'm willing to take.

Anyway I *have my fingers crossed for you mate*. I am interested to see how you get on. Keep us posted 

Nick Peyton


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> I was in the same boat mate. But I was disappointed that I couldn't get to 2200. But what I was even more annoyed about is how ASUS owners get to take advantage of voltage up to the HARD limit. All other manufacturers are limited by BIOS.
> 
> Tomorrow is my last day to 'change my mind'. Distance selling regulations here are 14 days. But a 10% restock fee applies. (Since there was nothing wrong with card).
> 
> Lack of full waterblock annoyed me too. But it wasn't a deal breaker. The BIOS was though.
> 
> *However --* in the end I refuse to pay £70 restocking fee *and* have the worry that the new STRIX OC won't clock at least as high as my Classified. I could swap it and end up with a card that only boosts to 1936mhz (factory OC for STRIX OC 1080). -- Not a risk I'm willing to take.
> 
> Anyway I *have my fingers crossed for you mate*. I am interested to see how you get on. Keep us posted
> 
> Nick Peyton


What is your max OC now?
And yeah, £70 is a lot of money for a slight chance of a better chip. If you are unlucky and get a potato... then you've just wasted a relatively good card and £70.

Anyway, I'm pretty sure the new card will beat the 1936 stock boost and 2050 OC. I could still have bad luck, but I'm staying positive.
I used the advanced RMA option, so I will get the new one first.








Gonna run some SLI benchmarks and test with 2 classy's for sure!
And do some OC on the new cards and post my results probably around end of the week!


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> The EVGA Hybrid kit is also made by Asetek, so the water block size should be the same as other brands like Corsair's. If that's the case, it won't fit without a shim either. The problem is not the PCB or the screw holes or whatever; it's the front plate, which has 4 "fingers" near the GPU die, and those won't let the water block get close enough to make contact with the die.
> 
> I can assure you that ANY Asetek cooler + a Kraken G10 will fit the FTW if a 2mm or thicker shim is used.


I sent EVGA Joseph (or whatever his name is) a Twitter message, and he said they are still working on an AIO for the FTW but don't have a release date at this time.


----------



## Derek1

Quote:


> Originally Posted by *TWiST2k*
> 
> I sent EVGA Joseph (or whatever his name is) a Twitter message, and he said they are still working on an AIO for the FTW but don't have a release date at this time.


I asked Tech Support the same question and they told me that the Design team is keeping it confidential/don't trust them with that kind of info. lol


----------



## TWiST2k

Quote:


> Originally Posted by *Derek1*
> 
> I asked Tech Support the same question and they told me that the Design team is keeping it confidential/don't trust them with that kind of info. lol


I actually went back and my initial tweet was from August 11th lol, so I sent a follow up today asking if there has been any movement in 2 months.


----------



## Derek1

Quote:


> Originally Posted by *TWiST2k*
> 
> I actually went back and my initial tweet was from August 11th lol, so I sent a follow up today asking if there has been any movement in 2 months.


If not, then that would make them full of it.

I tried to pry info out of them about a Ti as well but got no info regarding that either.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> I tried to pry info out of them about a Ti as well but got no info regarding that either.


lol


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> What is your max OC now?
> And yeah, £70 is a lot of money for a slight chance of a better chip. If you are unlucky and get a potato... then you've just wasted a relatively good card and £70.
> 
> Anyway, I'm pretty sure the new card will beat the 1936 stock boost and 2050 OC. I could still have bad luck, but I'm staying positive.
> I used the advanced RMA option, so I will get the new one first
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna run some SLI benchmarks and test with 2 classy's for sure!
> And do some OC on the new cards and post my results probably around end of the week!


Max now is 2189. Reported in 3dmark online comparison as *2190mhz*.

I've finished my entire voltage curve now.. will post it up tomorrow. But the last few entries are:

1043mV: +160 - 2151MHz ~~ passed 3DMark stress test: Firestrike Ultra at 99.2% frame stability
1050mV: +160 - 2151MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.5% frame stability
1062mV: +165 - 2176MHz ~~ passed 3DMark stress test: Firestrike Ultra at 97.8% frame stability
1075mV: +165 - 2189MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.7% frame stability
1081mV: +175 - 2189MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.0% frame stability
1093mV: +175 - 2189MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.5% frame stability

/\ /\ /\ /\ imagine if I even had that *one extra voltage step* I keep moaning about on EVERY forum lol..... :-(

Just tried the last entry with a +530 memory and it passed too 

I'm hoping once I input the entire curve GPU boost 3.0 will reward me with 2202mhz in-game lol...

P.S. It's taken me 2 weeks to complete that curve and stress test it at every single voltage point.

If I simply enter a +150MHz core clock offset (boosts to 2151MHz) it crashes nearly every time. I think Pascal is about giving it the "right voltage at the *right time*". 

.
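To make the idea concrete: the table above is basically a hand-built voltage/frequency lookup. A quick Python sketch of that idea (the helper names are my own and hypothetical, nothing Afterburner actually exposes; the numbers are just the last few entries I posted):

```python
# Sketch of a manually tuned voltage/frequency (V/F) curve, in the spirit of
# GPU Boost 3.0's per-voltage-point offsets. Numbers are illustrative only.

# (voltage_mV, offset_MHz, resulting_clock_MHz) -- stress-tested one point at a time
CURVE = [
    (1043, 160, 2151),
    (1050, 160, 2151),
    (1062, 165, 2176),
    (1075, 165, 2189),
    (1081, 175, 2189),
    (1093, 175, 2189),
]

def max_stable_clock(voltage_mv: int) -> int:
    """Return the highest tested-stable clock at or below a given voltage."""
    eligible = [clk for v, _off, clk in CURVE if v <= voltage_mv]
    if not eligible:
        raise ValueError(f"no tested point at or below {voltage_mv} mV")
    return max(eligible)

def flat_offset_overshoot(offset_mhz: int) -> list[tuple[int, int]]:
    """Points in this table that a single flat offset would push past their
    individually tested offset. (On the full curve, untested lower-voltage
    points are the usual culprits when a blanket offset crashes while the
    per-point curve is stable.)"""
    return [(v, off) for v, off, _clk in CURVE if offset_mhz > off]

print(max_stable_clock(1062))        # highest stable clock with <= 1062 mV
print(flat_offset_overshoot(170))    # voltage points a flat +170 would over-shoot
```

The point of the per-voltage tuning is exactly that second function: a flat offset applies everywhere on the curve, including points that were never tested at that offset, which is (I believe) why +150 flat crashes while the hand-built curve passes.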


----------



## zipper17

Quote:


> Originally Posted by *nrpeyton*
> 
> Max now is 2189. Reported in 3dmark online comparison as *2190mhz*.
> 
> I've finished my entire voltage curve now.. will post it up tomorrow. But the last few entries are:
> 
> 1043mV: +160 - 2151MHz ~~ passed 3DMark stress test: Firestrike Ultra at 99.2% frame stability
> 1050mV: +160 - 2151MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.5% frame stability
> 1062mV: +165 - 2176MHz ~~ passed 3DMark stress test: Firestrike Ultra at 97.8% frame stability
> 1075mV: +165 - 2189MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.7% frame stability
> 1081mV: +175 - 2189MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.0% frame stability
> 1093mV: +175 - 2189MHz ~~ passed 3DMark stress test: Firestrike Ultra at 98.5% frame stability
> 
> /\ /\ /\ /\ imagine if I even had that *one extra voltage step* I keep moaning about on EVERY forum lol..... :-(
> 
> Just tried the last entry with a +530 memory and it passed too
> 
> I'm hoping once I input the entire curve GPU boost 3.0 will reward me with 2202mhz in-game lol...
> 
> P.S. It's taken me 2 weeks to complete that curve and stress test it at every single voltage point.
> 
> If I simply enter a +150MHz core clock offset (boosts to 2151MHz) it crashes nearly every time. I think Pascal is about giving it the "right voltage at the *right time*".
> 
> .


Interesting observation; this may also help other 10-series overclocking. I'm also getting crashes every time.
Do you mean locking the voltage (Ctrl+L) at a certain core speed? How did you do it? Any screenshot of the curve?


----------



## Iceman2733

OK guys, I need a little input. I got rid of both of my watercooled 980 Tis and picked up two EVGA 1080 FTWs. I got one today, the other will be here Thursday, and the water blocks arrive sometime next week. Anyway, I have been pointed to a reddit thread and quite a few others claiming the EVGA 1080 FTW is plagued with issues, a major and common one being a "black screen and 100% fan speed". I wanted to see if you guys had any info on this, or whether it has been fixed or EVGA has even addressed it. I have read through quite a few posts, and it seems like all of the reports are negative; people just keep exchanging the cards or switching manufacturer. I am going to test on air, but quite a few people have stated the problems just occur, and I don't want to deal with that if I don't have to. Anyway, looking for some advice and maybe a little more information.


----------



## TWiST2k

Quote:


> Originally Posted by *Derek1*
> 
> If not, then that would make them full of it.


Got a reply from him today.


__ https://twitter.com/i/web/status/788124680225816576


----------



## Derek1

Quote:


> Originally Posted by *Iceman2733*
> 
> OK guys, I need a little input. I got rid of both of my watercooled 980 Tis and picked up two EVGA 1080 FTWs. I got one today, the other will be here Thursday, and the water blocks arrive sometime next week. Anyway, I have been pointed to a reddit thread and quite a few others claiming the EVGA 1080 FTW is plagued with issues, a major and common one being a "black screen and 100% fan speed". I wanted to see if you guys had any info on this, or whether it has been fixed or EVGA has even addressed it. I have read through quite a few posts, and it seems like all of the reports are negative; people just keep exchanging the cards or switching manufacturer. I am going to test on air, but quite a few people have stated the problems just occur, and I don't want to deal with that if I don't have to. Anyway, looking for some advice and maybe a little more information.


I have the FTW, have had it a month, and never once had a problem with it.
OCs just fine: got 24K in FS graphics at 2135 with +400 on the mem, on air, never over 65°C. Check my Valley and Heaven scores in my sig. I am getting another soon for SLI.
Haven't tried any BIOS flashes or the voltage curve nrpeyton talks about above.


----------



## Derek1

Quote:


> Originally Posted by *TWiST2k*
> 
> Got a reply from him today.
> 
> 
> __ https://twitter.com/i/web/status/788124680225816576


Hopefully soon.


----------



## TWiST2k

Quote:


> Originally Posted by *Derek1*
> 
> Hopefully soon.


----------



## rintalahri

Quote:


> Originally Posted by *x-apoc*
> 
> I felt uneasy about my temps on 1080 ftw and bought better thermal pads and replaced them. This is what I found.
> 
> Before.
> 
> 
> 
> After.
> 
> 
> 
> My room temp is about 75°F.
> GPU idle temp is 40°C without the fans spinning.
> 
> While gaming it's 58-60°C on auto fan, without adjusting fan speed, and with a bit of an overclock (+100 core, +500 mem).


LOL!! Both my FTWs have normal pads, like your pic no. 2.
The pads were in the right places and everything was good. In your pic no. 1 it looks like they just threw the pads onto the card.


----------



## rintalahri

Has anyone found a custom BIOS for the 1080 FTW?
It would be awesome if someone made one, like the EVGA GTX 980
had a BIOS for water (maybe it was "nolimits" or something). [email protected] 24/7 and it always works.


----------



## Derek1

Quote:


> Originally Posted by *rintalahri*
> 
> Has anyone found a custom BIOS for the 1080 FTW?
> It would be awesome if someone made one, like the EVGA GTX 980
> had a BIOS for water (maybe it was "nolimits" or something). [email protected] 24/7 and it always works.




__ https://twitter.com/i/web/status/788063798737276929
Maybe a 1080 bios update is in the works very soon.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> I would love to go full water block, but they're not available for the 1080 Classified. EK nearly did one, but they cancelled it completely.
> 
> Hmm, I see. I upgraded from 2x SLI 980s to a single 1080 Classified;
> 
> Could have skipped this generation, but my excitement got the better of me lol.
> 
> Sold both for £450, so I only really paid £250 for the 1080. This way I also spread the cost of my next upgrade. (In 12 months my 1080 will still be worth £££.)
> 
> I will *not SLI again*... the support is pathetic. With the exception of 'The Witcher 3', which was fantastic in SLI.
> 
> My single 1080 is giving me about 5 FPS extra "average" at 4K over my twin 980s. The "minimum" is where I see the biggest difference.
> The workload & temps on my CPU are also *much lower*.
> 
> *Thanks for the information about your water experience* -- what do you mean by "droop"? What were your temps like with the universal block? My 1080 Classified normally pulls 250 watts (gaming) and never exceeds 55-58°C.


The back plate doesn't give enough support to keep the card flat; my 770s droop down about 15mm at the power end.

Bugger about the full-cover Classified water block being cancelled; hopefully another company will release one. That's the same reason I also went with universal water blocks.

My temps max out at 56°C on the GPU on the hottest days of about 30°C, but that is down to the architecture; at the same clocks I would expect 80°C if it were on air. I would expect sub-45°C with a GTX 1080.


----------



## Koniakki

Quote:


> Originally Posted by *nrpeyton*
> 
> Max now is 2189. Reported in 3dmark online comparison as *2190mhz*.
> 
> I've finished my entire voltage curve now.. will post it up tomorrow. But the last few entries are:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1043mv: +160 - 2151mhz ~~ passed 3dmark stress test: Firestrike Ultra at *99.2% frame stable
> 1050mv: +160 - 2151mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.5 frame stable
> 1062mv: +165 - 2176mhz ~~ passed 3dmark stress test: Firestrike Ultra at *97.8 frame stable
> 1075mv: +165 - 2189mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.7 frame stable
> 1081mv: +175 - 2189mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.0 frame stable
> 1093mv: +175 - 2189mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.5 frame stable
> 
> /\ /\ /\ /\ imagine if I even had that *one extra voltage step* I keep moaning about on EVERY forum lol..... :-(
> 
> Just tried the last entry with a +530 memory and it passed too
> 
> I'm hoping once I input the entire curve GPU boost 3.0 will reward me with 2202mhz in-game lol...
> 
> P.S. its taken me 2 weeks to complete that curve and stress test it at every single voltage point.
> 
> If I simply enter a +150mhz on the core clock offset (boosts to 2151mhz) it crashes nearly every time. I think Pascal is about giving it the "right voltage at the *right time*".
> 
> .


Would like to see it too.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> Max now is 2189. Reported in 3dmark online comparison as *2190mhz*.
> 
> I've finished my entire voltage curve now.. will post it up tomorrow. But the last few entries are:
> 
> 1043mv: +160 - 2151mhz ~~ passed 3dmark stress test: Firestrike Ultra at *99.2% frame stable
> 1050mv: +160 - 2151mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.5 frame stable
> 1062mv: +165 - 2176mhz ~~ passed 3dmark stress test: Firestrike Ultra at *97.8 frame stable
> 1075mv: +165 - 2189mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.7 frame stable
> 1081mv: +175 - 2189mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.0 frame stable
> 1093mv: +175 - 2189mhz ~~ passed 3dmark stress test: Firestrike Ultra at *98.5 frame stable
> 
> /\ /\ /\ /\ imagine if I even had that *one extra voltage step* I keep moaning about on EVERY forum lol..... :-(
> 
> Just tried the last entry with a +530 memory and it passed too
> 
> I'm hoping once I input the entire curve GPU boost 3.0 will reward me with 2202mhz in-game lol...
> 
> P.S. its taken me 2 weeks to complete that curve and stress test it at every single voltage point.
> 
> If I simply enter a +150mhz on the core clock offset (boosts to 2151mhz) it crashes nearly every time. I think Pascal is about giving it the "right voltage at the *right time*".
> 
> .


That's very interesting. Have you tested that curve, without the lock on clock/voltage, in real games rather than Firestrike? Also, have you compared the graphics score from that curve to the one you'd get with the normal "max offset" OC? I found I could get my card up to 2175mhz, but the scores would be lower than with the normal offset OC.


----------



## nrpeyton

Quote:


> Originally Posted by *zipper17*
> 
> interesting observation, this may also help other 10 series overclocking. I'm also getting crashes everytime.
> do you mean locked the voltage (Ctrl+L) at certain corespeed? how did you do this? any ss to curve?


Yes. You can "lock" the voltage at any point along the curve.

You need MSI Afterburner Beta 14.

There are no buttons in the beta yet to enter the new voltage curve menu.

*The voltage curve menu in MSI Afterburner Beta 14 is more precise/flexible than any other offering.*

1) Press Ctrl+F.

This brings up the voltage curve window.

2) Select a voltage point and press L.

A vertical yellow line will appear to mark your voltage point selection.

3) Plot the frequency offset you desire for that voltage. (Note: the voltage *will* stay the same, but the actual boost frequency will depend on temperature as well as the offset.)

4) Click the "tick" in the MSI Afterburner main window to apply.

If you want to lock a voltage higher than about 1050mv (I think it was 1050, might be 1063mv) you still have to increase the voltage slider in the main window.

5) Use your preferred stress tester to test for stability at each "locked voltage" and "locked offset".

Best to turn the fan up to 100% when stress testing, because there are different frequency steps for each offset and these are temperature-dependent.
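
Purely as an illustration of the bookkeeping in the steps above (this is not an Afterburner API; the helper function is hypothetical), the per-voltage-point stress-test results can be tracked like this, using the offsets posted earlier in the thread:

```python
# Illustrative sketch only: record the max stable core offset found at each
# locked voltage point, as per the tuning procedure described above.

curve = {}  # voltage (mV) -> best stable core offset (MHz)

def record_stable(voltage_mv, offset_mhz):
    """Keep only the highest offset that passed the stress test at this point."""
    if offset_mhz > curve.get(voltage_mv, float("-inf")):
        curve[voltage_mv] = offset_mhz

# Results from stress-testing each locked point (thread's example values):
record_stable(1043, 160)
record_stable(1050, 160)
record_stable(1062, 165)
record_stable(1075, 165)

# The finished curve, sorted by voltage, is what gets plotted back into the tool:
for mv in sorted(curve):
    print(f"{mv} mV -> +{curve[mv]} MHz")
```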


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes. You can "lock" the voltage at any voltage along the curve.
> 
> You need MSI Afterburner Beta 14.
> 
> There are no buttons on the beta yet to enter the new voltage curve menu.
> 
> *The voltage curve menu on MSI Afterburner Beta 14 is more precise/flexible than any other offerings*
> 
> 1) So you need to press ctrl. and F
> 
> This brings up the voltage curve window.
> 
> 2) Select a voltage point and press L
> 
> A vertical "yellow" line will show to mark your voltage point selection.
> 
> 3) Plot the frequency offset you desire for that voltage. (Note: the voltage *will* stay the same but the actual boost frequency will depend on temperature as well as offset.
> 
> 3)click the "tick" in MSI Afterburner main window to "apply".
> 
> If you want to lock a voltage higher than about 1050mv (think it was 1050 might be 1063mv) you still have to increase voltage overclock slider on main window.
> 
> 4) use your desired stress tester to test for stability at each "locked voltage" and "locked offset".
> 
> Best to turn fan up to 100% when stress testing because there are different frequency steps for each offset - these are temp dependant.


Ok, just for clarity.

From reading the thread on the EVGA forum, and info here, it seems that 1.063v is the last point at which there is any beneficial increase in clocks or performance. Going above that to 1.07 or the max 1.093 will not get you a stable increase in OC? Is that correct?

So it becomes a matter of just adjusting the clock along the 1.063 axis until you get a stable max clock speed? Does this sound right? And then adjusting the voltage curve in a nonlinear fashion down to base speeds? Did you also find that a flatter curve is best? Or was that someone else's finding?

What about the relationship of memory clock to core clock? Did I read somewhere to set the memory clock first, getting its max stabilized before attempting core/voltage adjustment?

Thanks.
Keep up the good work!


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Ok, just for clarity.
> 
> From reading the thread on the EVGA forum, and info here, it seems that 1.063v is the point at which there is any beneficial increase in clocks or performance.Going above that to 1.7 or the max 1.093 will not get you stable increase in oc? Is that correct?
> 
> So it becomes a matter of just adjusting clock along the 1.063 axis until you get a stable max clock speed? Does this sound right? And then adjust the voltage curve in a nonlinear fashion down to base speeds? Did you also find that a flatter curve is best? Or was that someone else's finding?
> 
> What about the relationship of Mem clock to Core Clock? Did I read somewhere to set Mem Clock first? Getting its max stabilized before attempting Core/Voltage adjustment?
> 
> Thanks.
> Keep up the good work!


IME, going up to 1.093V does bring more room for overclocking the core, and performance follows. It's just a very delicate job to find the sweet spot; one point on the curve too low or too high and your performance will decrease. I find that the extra heat, and the thermal throttling it causes, is not worth the increase to 1.093V, so I stick with the max offset overclock I can get at 1.062V. But increasing voltage works; you just have to handle the heat.

Memory overclock scales very well in performance. On mine I hit a performance wall at +575mhz; after that, every 10mhz the performance would decrease. So, at least for my card, +575mhz is the sweet spot. I also found that it has a slight effect on maximum core clock, so I preferred to max the VRAM first and then move to the core clock, since +575mhz alone gave almost 1000 points in Firestrike graphics.

It's all a matter of finding out what works with your card. On mine it was: VRAM first until performance starts to decrease, then move to the maximum core clock possible @1.062V.


----------



## Djreversal

I have 2 SeaHawk EK water-blocked 1080s. They run an 8-pin and a 6-pin, so I should be able to pull whatever power I need. I want to get some more out of these. I haven't really played much with the stock overclock potential, but I would like a BIOS I could flash for some reliable, stable gains. Any idea how long until this type of option will be available?

Thanks


----------



## outofmyheadyo

Nobody knows, could be never.


----------



## juniordnz

Quote:


> Originally Posted by *Djreversal*
> 
> I have 2 SeaHawk EK water blocked 1080's.. They run the 8 pin and 6 pin.. Which I should be able to pull whatever power out that I need.. I want to get some more out of this.. I haven't really played much with stock overclock potential but I would like to get a bios that I could flash and maybe get some reliable stable gains out of it.. Any ideas how long till this type of option will be available.?
> 
> Thanks


Try the T4 bios. No temperature or power cap. It got me my highest ever Firestrike graphics score (25750). But you gotta be able to handle the heat (you can) and make sure your card has solid power delivery to handle all those watts.

If you search the thread you'll find the link to it a few pages back!


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> The back plate doesn't give enough support to keep the card flat see my 770s droop down about 15mm at the power end.
> 
> Bugger about the full classified waterblock being cancelled hopefully another company might release one. the same reason I also went with universal water blocks.
> 
> my temps max out at 56°C on GPU on the hottest day of about 30°C but that is to do with the architecture with the same clocks I would expect 80°C if were on air I would expect sub 45ºC with GTX 1080.


Hmm, that begs the question of whether it's even worth it for me when I'm already only hitting about 55-58°C. I suppose they're not expensive though, so it can't hurt to try it and see 

Did you lose your warranty taking it apart to fit your block?


----------



## LiquidHaus

To refresh my memory - didn't someone brick their FTW card when they flashed to the T4 bios?


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> That's very interesting. Have you tested that curve without the lock on clock/voltge on real games rather than firestrike? Also, have you compared your graphics score from that curve to the one you'd get with the normal "max offset" OC? I found out I could get my card up to 2175mhz but the scores would be lower than the normal offset OC.


Not yet mate.. only finished the curve late last night so not really had a chance to do any testing yet. Will hopefully be having a good look tonight though


----------



## juniordnz

Quote:


> Originally Posted by *lifeisshort117*
> 
> To refresh my memory - didn't someone brick their FTW card when they flashed to the T4 bios?


I had reported that, but I'll take it back.

I was using a display port that the T4 BIOS disables, so I would get no video. Instead of trying to change the DP, I assumed it was bricked and went through the whole recovery steps.

I'm sorry if I misled someone here, just don't use the display port on the rightmost part of the IO panel.


----------



## LiquidHaus

Quote:


> Originally Posted by *juniordnz*
> 
> I had reported that, but I'll take it back.
> 
> I was using a display port that the T4 BIOS disables, so I would get no video. Instead of trying to change the DP, I assumed it was bricked and went through the whole recovery steps.
> 
> I'm sorry if I misled someone here, just don't use the display port on the rightmost part of the IO panel.


Ah yes thank you for refreshing my memory. My FTW seems to favor that rightmost DP too. I'm sure the card will be fine when I switch it up.

I still plan on trying this T4 bios as soon as it's watercooled. Thanks.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm begs the question if its even worth it for me when I'm already only hitting about 55-58. I suppose they're not expensive though so can't hurt to try it and see
> 
> Did you lose your warranty taking it apart to fit your block?


One would assume you've already got a custom water loop, so then it isn't too much expense: a couple of extra fittings, a block, a bit of tube and coolant.
Best not to mention going on water for warranty purposes. Honestly, I'm not really sure; whether a claim is accepted differs between manufacturers/retailers.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Ok, just for clarity.
> 
> From reading the thread on the EVGA forum, and info here, it seems that 1.063v is the point at which there is any beneficial increase in clocks or performance.Going above that to 1.7 or the max 1.093 will not get you stable increase in oc? Is that correct?:


Not exactly, but after 1.063v your frequency won't scale up as well as you increase voltage. It really depends on your card.
1063mv gets me 2176mhz but 1075mv gets me 2189mhz.
1081mv and 1093mv still don't get me past 2189mhz, but I'm pretty sure if I could modify the BIOS to allow 1100mv (1.100v) I could get to 2200.
There were other spots lower down my voltage/frequency ladder where I found myself "stuck" at the same frequency over 2 different voltages.

Also, the cooler you can get your card, the more voltage it can take before it goes unstable. The scale on Pascal is about 100mhz per 50 degrees C. (The colder the microscopic pathways in your chip, the better they conduct; therefore fewer electrons leak into adjacent pathways -- this leakage is what causes instability.)

It's like being at 5GHz on a CPU and needing a much bigger voltage boost just to get to 5.1GHz than you would need to get from 4GHz to 4.1GHz... 
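
That rule of thumb (roughly 100 MHz of stable clock per 50°C on Pascal) can be turned into a toy calculation. The linear model and function name below are my own illustration of the claim, not a measured curve:

```python
# Toy estimate only, based on the thread's rule of thumb: roughly 100 MHz of
# extra stable clock per 50 degrees C drop in GPU temperature on Pascal.

def estimated_stable_clock(clock_at_ref_mhz, ref_temp_c, actual_temp_c):
    """Linearly adjust a known-stable clock for a different GPU temperature."""
    mhz_per_degree = 100 / 50  # the thread's 100 MHz per 50 C rule
    return clock_at_ref_mhz + (ref_temp_c - actual_temp_c) * mhz_per_degree

# e.g. a card stable at 2189 MHz at 55 C, cooled down to 30 C:
print(estimated_stable_clock(2189, 55, 30))  # -> 2239.0
```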
Quote:


> Originally Posted by *Derek1*
> 
> So it becomes a matter of just adjusting clock along the 1.063 axis until you get a stable max clock speed? Does this sound right?


Yes that's it exactly  Stress test each frequency until you find your *maximum* frequency +offset for that voltage.
Then move onto the next voltage. So after you have found your max stable offset for 1063v, write it down, then repeat for 1075v.
Quote:


> Originally Posted by *Derek1*
> 
> What about the relationship of Mem clock to Core Clock? Did I read somewhere to set Mem Clock first? Getting its max stabilized before attempting Core/Voltage adjustment?


As juniordnz rightly pointed out in the last message, memory scales very well, so YES, I would absolutely recommend finding your best-performing memory overclock first, before you begin working on your curve. 

I never did it that way -- *wish I had now*. But a few last-minute tests last night at my maximum 2189mhz with +530mhz memory *still passed*, so hopefully I haven't ruined all my hard work by not doing the memory first.. lol 

Nick Peyton


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> I had reported that, but I'll take it back.
> 
> I was using a display port that the T4 BIOS disables, so I would get no video. Instead of trying to change the DP, I assumed it was bricked and went through the whole recovery steps.
> 
> I'm sorry if I misled someone here, just don't use the display port on the rightmost part of the IO panel.


Your EVGA 1080 FTW works with the STRIX/T4?

When I flash that BIOS to my EVGA 1080 Classified it actually draws *less* watts. I get less FPS and a smaller score.

Really interested in how your FTW works with that BIOS yet my CLASSIFIED doesn't -- since we're both EVGA.. odd. :-(

I'd kill to get it working on my card lol

Starting to wish I'd just got the FTW. With full waterblock support I could have thrown *anything* at it. It would probably have handled even T4/Furmark lol.

If only I could get T4 working on my card I'd be happy. Seriously thinking of hard-modding the power limit to see if that fixes the T4 problem!


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Your EVGA, 1080 FTW works with the STRIX/T4?
> 
> When I flash that BIOS to my EVGA 1080 Classified it actually draws *less* watts. I get less FPS and a smaller score.
> 
> Really interested in how your FTW works with that BIOS yet my CLASSIFIED doesn't -- since we're both EVGA.. odd. :-(
> 
> I'd kill to get it working on my card lol


I'm curious....does the Classy voltage tool allow for higher voltages?


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> Your EVGA, 1080 FTW works with the STRIX/T4?
> 
> When I flash that BIOS to my EVGA 1080 Classified it actually draws *less* watts. I get less FPS and a smaller score.
> 
> Really interested in how your FTW works with that BIOS yet my CLASSIFIED doesn't -- since we're both EVGA.. odd. :-(
> 
> I'd kill to get it working on my card lol


Very different cards, I guess. The Classy's BIOS bricks my FTW, hard! Yet T4 works extremely well. I got my best results with it, but the lack of monitoring of the power draw, and the fact that it is "unlimited", freaked me out a bit. I imagined my card pulling 500W, starting a fire, burning my whole building down and me being held responsible for multiple murders...

Jokes aside, I'll definitely give it a try once I get my card watercooled. I won't put it through 1.093V on air.


----------



## khemist

https://imageshack.com/i/plU9N6ZLj

New cooler arrived, not got around to fitting it yet.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'm curious....does the Classy voltage tool allow for higher voltages?


What? Where? How? When?? lol

Classy voltage tool?? that sounds like the solution to all my problems lol.... what is that mate? I have honestly never heard of it...?

Do you have a link by any chance? 

EDIT:
Just found this on a different thread. Just going to test the latest version now....

EDIT 2:
The link for download is down :-(


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> What? Where? How? When?? lol
> 
> Classy voltage tool?? that sounds like the solution to all my problems lol.... what is that mate? I have honestly never heard of it...?
> 
> Do you have a link by any chance?
> 
> EDIT:
> Just found this on a different thread. Just going to test the latest version now....
> 
> EDIT 2:
> The link for download is down :-(


I have a copy of an old one for Maxwell, but I doubt it works with Pascal, unless they used the exact same controller.

Or, you can try this one....

http://www.overclock.net/t/1411500/official-evga-classified-k-ngp-n-owners-club


----------



## fat4l

We defo need some voltage tool.....
My card is sitting at about ~35C under load with 2190MHz ...I need more volts to heat it up


----------



## chiknnwatrmln

So is the T4 BIOS the best one? I'm not necessarily looking to push more voltage but I want to be able to draw more power on my FE. However, I've heard vague reports that the T4 BIOS yields lower scores per clock? Is this true or nonsense?


----------



## juniordnz

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> So is the T4 BIOS the best one? I'm not necessarily looking to push more voltage but I want to be able to draw more power on my FE. However, I've heard vague reports that the T4 BIOS yields lower scores per clock? Is this true or nonsense?


au contraire

it yields HIGHER scores clock per clock.

May not be optimal, but it's the only thing we have in hand so far.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I have a copy of an old one for Maxwell, but I doubt it works with pascal, unless they used the exact same controller.
> 
> Or, you can try this one....
> 
> http://www.overclock.net/t/1411500/official-evga-classified-k-ngp-n-owners-club


That's where I went, mate; I keep getting an error message. Do you need privileges to download on here?

Nick


----------



## nrpeyton

Any of you guys have experience with universal GPU blocks? My custom loop equipment arrives this weekend and I have everything picked out *except* a universal GPU block for my 1080 Classified.

The EK-Thermosphere apparently isn't compatible due to the increased "height" of the Classified... (if it wasn't bad enough already that EK cancelled the full water block for it)...

The thought of taking it apart etc. is actually scaring me a little.

I took apart an old 'ATI HD 5770' tonight just to get a feel for dismantling them lol...


----------



## chiknnwatrmln

Quote:


> Originally Posted by *juniordnz*
> 
> au contraire
> 
> it yields HIGHER scores clock per clock.
> 
> May not be optimal, but it's the only thing we have in hand so far.


Huh, might have to give this a try Saturday when I get some time. Thanks


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Any of you guys have experience with universal GPU blocks? My custom loop equipment arrives this weekend and I have everything picked out *except* a universal GPU block for my 1080 Classified.
> 
> The EK-Thermosphere apparently isn't compatible due to the increased "height" of the Classified... (if it wasn't bad enough already that EK cancelled the full water block for it)...
> 
> The thought of taking it apart e.t.c is actually scaring me a little.
> 
> I took apart an old 'ATI HD 5770' tonight just to get a feel for dismantling them lol...


Word is, that the 780 Classy blocks will work for the 1080 Classy. Couple of the guys on the EVGA forums got them working.

http://forums.evga.com/Waterblock-for-1080-Classified-m2485335-p7.aspx


----------



## turtletrax

Got my FE Asus 1080's under the heatkiller blocks I got. Took me a month to get the time to polish them to a respectable luster. Turned out pretty awesome, and love the led glow.

Seems I can only get 2100mhz at 35c sustained. Not too shabby, but hoped for more. Might try T4, but probably won't notice in anything but benches


----------



## fat4l

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> So is the T4 BIOS the best one? I'm not necessarily looking to push more voltage but I want to be able to draw more power on my FE. However, I've heard vague reports that the T4 BIOS yields lower scores per clock? Is this true or nonsense?


Quote:


> Originally Posted by *juniordnz*
> 
> au contraire
> 
> it yields HIGHER scores clock per clock.
> 
> May not be optimal, but it's the only thing we have in hand so far.


No.

The true answer is:

The T4 bios has an unlimited TDP, so your card will not TDP throttle; thus you get more performance than on the stock bios (in most cases), because on the stock bios your 1080 will hit the TDP limit.
However, if you have an increased TDP (like me, done by hardmod), then the T4 bios yields LOWER scores than the original Nvidia bios. So clock for clock = lower scores with T4, if the Nvidia bios has an unlocked TDP.
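
A toy model of why an uncapped BIOS can sustain higher clocks: once estimated board power hits the cap, the card drops clocks to fit under it. The function and all numbers below are made up for illustration; GPU Boost's real behaviour is far more complex:

```python
# Simplified, assumed model of power-limit throttling (NOT NVIDIA's actual
# boost algorithm): clamp the clock so estimated power stays under the cap.

def throttled_clock(target_mhz, watts_per_mhz, power_cap_w):
    demanded_w = target_mhz * watts_per_mhz
    if demanded_w <= power_cap_w:
        return target_mhz                    # cap never reached: full clocks
    return power_cap_w / watts_per_mhz       # clamp to what the cap allows

# Illustrative numbers only: a capped BIOS throttles, an "unlimited" one doesn't.
stock_cap = throttled_clock(2100, 0.10, 180)
no_cap = throttled_clock(2100, 0.10, 10**9)
print(stock_cap, no_cap)
```

This is why, clock for clock, a card that never hits its power limit sees no benefit from the uncapped BIOS, matching fat4l's observation.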


----------



## ucode

Woot! If it's TDP throttling then clocks are dropping and it's no longer running clock for clock.

Yes, T4 has less performance clock for clock than FE for my FE card. The extra power available and extra voltage that allows an increase in clocks (unless you have a lemon) with the T4 means that it can outperform the FE VBIOS simply by running higher clocks and allowing greater power consumption.


----------



## juniordnz

Quote:


> Originally Posted by *fat4l*
> 
> No.
> 
> The true answer is.
> 
> T4 bios has unlimited TDP so your card will not TDP throttle, thus = more performance than stock bios(in most cases) because on stock bios your nvidia 1080 will hit the TDP limit.
> However, if you have increased TDP(like me, done by hardmode) then T4 bios yelds LOWER scores than original Nvidia bios. So clock per clock scores = lower scores with T4, if nvidia bios has unlocked TDP.


I'll have to disagree with you, mate. The Firestrike bench won't even get close to the 130% TDP limit allowed by the FTW's slave BIOS (280W). I can go through the whole thing without a single drop in clocks (on cold days lol). In that situation, with 2114mhz stable throughout the whole test, T4 got better results at the same clock/voltage. What you said might be true for cards that reach the TDP limit easily, but not for the FTW.

The difference was like 200 points clock for clock, and 350 points maxed out on the T4 BIOS. But since I couldn't handle the heat, I went back to stock.


----------



## OZrevhead

Ucode do you have t4 on your FE or do you mean comparing strix to your FE?


----------



## ucode

T4 VBIOS cross flashed to my FE. It's okay for squeezing the last drop out of benches but nothing personally that would be really missed. YMMV

Something else to watch out for is the video clock. Some cards with higher base and boost clocks may have a lower video clock. The original 1.2V Strix VBIOS video clock was limited to something like 1708.5MHz, while the T4 video clock goes a fair bit higher. If you want to see the effect of the video clock, try forcing a P-State such as P0 or P2, and compare with normal clocks at the same GPU and memory clocks. IIRC elmor said the T4 gained something like 400 points over the previous Strix 1.2V VBIOS at the same GPU and mem clocks in Firestrike.


----------



## fat4l

Quote:


> Originally Posted by *juniordnz*
> 
> I'll have to disagree with you, mate. Firestrike Bench won't even get close to the 130% TDP limit allowed by FTW's slave BIOS (280W). I can go through the whole thing without a single drop in clocks (on cold days lol). On that situation, with 2114mhz stable throughout the whole test, T4 got better results at the same clock/voltage. What you said might be true for cards that reaches the TDP limit easily, but not for the FTW.
> 
> Difference was like 200 points clock for clock, and 350 points maxed out on T4 BIOS. But since I couldn't handle the heat, I went back to stock.


yep, im talking about FE


----------



## Vellinious

Power limit throttling, not TDP. Just sayin.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Power limit throttling, not TDP. Just sayin.


Care to elaborate, mate? I'm not sure what argument you're trying to make.


----------



## Tdbeisn554

I just put my little girls to sleep...


----------



## nrpeyton

*EVGA Classified 1080 Owners:*

*I've been emailing all known suppliers of water blocks etc. trying to see if we could get some water-block support going for the CLASSIFIED 1080, and it looks like my email to Alphacool paid off *

This is the first time I've heard about block support for 1080 Classified *anywhere*

See below:

*=====================================================*
Hi Nicholas,

We are working already on a block for this card which will be available at our shop within the *next 5 to 6 weeks.*

The description of the block is : Eiswolf GPX-N 1080pro-M05

Have a great day.

Bitte beachten Sie dass wir uns 10 Tage an unsere Angebote gebunden fühlen.
Please note that we shall remain bound by our offer for 10 days Mit freundlichen Grüßen / Best Regards

Taner Demirci
Alphacool International GmbH
Marienberger Str. 1
D-38122 Braunschweig/Germany

Telefon/Phone: +49 531 28874-0
Durchwahl/Direct dial: +49 531 28874-0
Fax: +49 531 28874-22

E-Mail: [email protected]
MSN:
Homepage: http://www.alphacool.de / http://www.alphacool.com


*============================================================*

Ursprüngliche Nachricht

Von: [email protected] [mailto:[email protected]]
Gesendet: Dienstag, 18. Oktober 2016 22:46
An: [email protected]
Betreff: Inquiry by contact form

Kunde: Mr. Nicholas Peyton
eMail: [email protected]
Telefon: 077290######
Betreff: Compatible waterblock - EVGA CLASSIFIED 1080

Kommentar:

*Dear Alphacool,*

I am trying to find a compatible waterblock for the EVGA CLASSIFIED 1080 Graphics Card.

EK was initially meant to be making a full-cover waterblock for this card (as they did with all previous EVGA Classified models) however they recently cancelled their plans to make a water block for this model.

You can see from the link below:
https://www.ekwb.com/news/official-list-of-ek-water-blocks-for-gtx-1080-series/

There is a growing movement online urging a company to make a waterblock for this card.

Due to the greater height of the Classified, some of the most recent universal waterblocks are not even compatible (for example, the "EK-Thermosphere").

Because of the lack of a full-cover block AND the lack of universal block support for the EVGA 1080 CLASSIFIED, I believe there is a very big gap in the market for the CLASSIFIED.

The Classified 1080 is also EVGA's flagship 1080 model.

I firmly believe if you manufactured a waterblock the enthusiast PC community would definitely remember this.

I recently watched a video by "J's 2 cents" saying Alphacool was "reaching out" to the community so I thought it would be a good idea to get in contact.

I am very active on many enthusiast forums on the GPU scene.

Thanks for your time; I look forward to hearing your reply.

Nick Peyton


----------



## fat4l

Anyone wanna beat my scores ?









http://www.3dmark.com/3dm/15528227?

*8 629 Graphics Score.*
GTX 1080 FE - 2202/11016MHz (TDP increased)
i7 4790K @5100MHz
The rest in the sig.


----------



## khemist

https://imageshack.com/i/poLrFODkj

https://imageshack.com/i/plwPeNHxj

https://imageshack.com/i/poyc9P1yj

Got the new cooler installed.


----------



## juniordnz

@khemist

That's what? A founders edition?


----------



## ucode

Quote:


> Originally Posted by *fat4l*
> 
> Anyone wanna beat my scores ?


W7 here









What about something a little different

GPUPI 2.3.4

1B digits ,GPU CUDA 8.0


----------



## Derek1

Quote:


> Originally Posted by *fat4l*
> 
> Anyone wanna beat my scores ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15528227?
> 
> *8 629 Graphics Score.*
> GTX 1080 FE - 2202/11016MHz (TDP increased)
> i7 4790K @5100MHz
> The rest in the sig.


5.1 @ 1.348v? Am I reading that right?


----------



## khemist

Quote:


> Originally Posted by *juniordnz*
> 
> @khemist
> 
> That's what? A founders edition?


It is.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Care to elaborate, mate? I'm not sure what argument you're trying to make.


Saying that the power limit and the TDP are different. You'll get throttling from the power limit, not from the TDP. Semantics, I know, but... they are altogether different.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Saying that the power limit and the TDP are different. You'll get throttling from the power limit, not from the TDP. Semantics, I know, but....they are all together different.


Oh, I get it now. That's what I meant, I just used the wrong words hehe (not a native speaker of the language...)


----------



## fat4l

Quote:


> Originally Posted by *Vellinious*
> 
> Saying that the power limit and the TDP are different. You'll get throttling from the power limit, not from the TDP. Semantics, I know, but....they are all together different.


Ah yes. of course








FE is being limited by POWER LIMIT.


----------



## fat4l

Quote:


> Originally Posted by *Derek1*
> 
> 5.1 @ 1.348v? Am I reading that right?


Of course you are







That's my 24/7 setting..








Quote:


> Originally Posted by *ucode*
> 
> W7 here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What about something a little different
> 
> GPUPI 2.3.4
> 
> 1B digits ,GPU CUDA 8.0


Bah







It's asking for something: "Error: High Precision Event Timer for time measurement not found!"
What are your scores on this one? And... is it a light bench? Can you OC higher than usual with this test?


----------



## Koniakki

Quote:


> Originally Posted by *Derek1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> Anyone wanna beat my scores ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15528227?
> 
> *8 629 Graphics Score.*
> GTX 1080 FE - 2202/11016MHz (TDP increased)
> i7 4790K @5100MHz
> The rest in the sig.
> 
> 
> 
> 
> 
> 5.1 @ 1.348v? Am I reading that right?

Quote:


> Originally Posted by *fat4l*
> 
> Of course you are
> 
> 
> 
> 
> 
> 
> 
> That's my *24/7* setting..




From SiliconLottery or pure blessing from the HW gods?


----------



## nrpeyton

Quote:


> Originally Posted by *khemist*
> 
> https://imageshack.com/i/poLrFODkj
> 
> https://imageshack.com/i/plwPeNHxj
> 
> https://imageshack.com/i/poyc9P1yj
> 
> Got the new cooler installed.


The inside of your case is immaculate, lol. I like it.


----------



## Derek1

Quote:


> Originally Posted by *fat4l*
> 
> Of course you are
> 
> 
> 
> 
> 
> 
> 
> That's my 24/7 setting..
> 
> 
> 
> 
> 
> 
> 
> 
> Bah
> 
> 
> 
> 
> 
> 
> 
> it's asking for some "Error: High Precision Event Timer for time measurement not found!"
> what's your scores on this one and ...... is it light bench ? Can you OC higher than usual with this test ?


What are you using to cool it with, if I may ask?


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> 
> I just put my little girls to sleep...


**** sake, lol ha


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> **** sake, lol ha


Sorry







but I just could not miss this opportunity haha


----------



## ucode

Quote:


> Originally Posted by *fat4l*
> 
> Of course you are
> 
> 
> 
> 
> 
> 
> 
> That's my 24/7 setting..
> 
> 
> 
> 
> 
> 
> 
> 
> Bah
> 
> 
> 
> 
> 
> 
> 
> it's asking for some "Error: High Precision Event Timer for time measurement not found!"
> what's your scores on this one and ...... is it light bench ? Can you OC higher than usual with this test ?


Quote:


> Originally Posted by *the link*
> 
> *Error: High Precision Event Timer for time measurement not found!*
> 
> GPUPI needs a timer with a very high resolution to ensure that the time measurement for a benchmark run is precise. Therefor you need to have the High Precision Event Timer (HPET) enabled in your BIOS settings and your system settings. To check the status of the latter, open up a command prompt with administration rights and run:
> 
> SHELL:
> 
> bcdedit /enum
> 
> The value of useplatformclock should be "Yes". If it's not, you can fix this by running:
> 
> SHELL:
> 
> bcdedit /set useplatformclock yes
> 
> A reboot might be necessary afterwards.


Some scores can be found here; ignore the Titan / LN2 ones.

It can be a little bursty, and you may find the GPU can clock higher. Start off with lower clocks to get a feel for it.
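For anyone hitting that HPET error, the fix in the quote above boils down to two standard `bcdedit` commands (Windows only, run from an elevated Command Prompt, and reboot afterwards):

```shell
REM Check whether Windows is using the platform clock (HPET);
REM look for "useplatformclock    Yes" in the output.
bcdedit /enum | findstr /i useplatformclock

REM If it's missing or set to "No", enable it, then reboot:
bcdedit /set useplatformclock yes
```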


----------



## nrpeyton

It's only 9°C outside here; I'm thinking of unplugging the PC and taking it outside for a minute on an extension lead, to see if I can get to 2200MHz. My highest max right now is 2189MHz.

Am I mad, or sad, or should I just not bother?

Maybe I'll at least get a feel for what a water block would do for my card.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Sorry
> 
> 
> 
> 
> 
> 
> 
> but I just could not miss this opportunity haha


lol, what's your max clock then, mate? Have you installed it yet?


----------



## OZrevhead

Quote:


> Originally Posted by *ucode*
> 
> T4 VBIOS cross flashed to my FE. It's okay for squeezing the last drop out of benches but nothing personally that would be really missed. YMMV
> 
> Something else to watch out for is the video clock. Some cards with higher base and boost clocks may have a lower video clock. The original 1.2V strix VBIOS video clock was limited to something like 1708.5MHz while the T4 video clock goes a fair bit higher. If you want to see the effect of video clock try forcing a P-State such as P0 or P2. and compare with normal clocks with same GPU and Memory clocks. IIRC elmor said the T4 gained something like 400 points over the previous strix 1.2V VBIOS with same GPU and Mem clocks in firestrike.


So how much more core clock did you get with t4 on your FE?


----------



## T3MP3R3D

Can someone tell me what this is on my card? Mold, rust, or what?


----------



## nrpeyton

Quote:


> Originally Posted by *OZrevhead*
> 
> So how much more core clock did you get with t4 on your FE?


also interested to know this


----------



## Vellinious

Quote:


> Originally Posted by *taskforce809*
> 
> Can someone tell me what is this on my card? mold or rust or what?


Hard to tell exactly what it is from the pictures, but....it sure looks like the nickel plating is corroding.

What kinds of fittings, blocks, rads, etc are you running in your loop? And...what coolant is that?


----------



## T3MP3R3D

Quote:


> Originally Posted by *Vellinious*
> 
> Hard to tell exactly what it is from the pictures, but....it sure looks like the nickel plating is corroding.
> 
> What kinds of fittings, blocks, rads, etc are you running in your loop? And...what coolant is that?


Fittings: Bitspower, plus two 45-degree EK fittings
Radiators: 1x Aquacomputer 360, 1x EK 360 and 1x Alphacool 280
Tubing: 12mm
Coolant: Primochill UV Brite Green

So, what does this mean? Drain everything, replace the cards, or what? I'm kind of worried now.


----------



## x-apoc

oxidation


----------



## galeonki

Quote:


> Originally Posted by *fat4l*
> 
> Anyone wanna beat my scores ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15528227?
> 
> *8 629 Graphics Score.*
> GTX 1080 FE - 2202/11016MHz (TDP increased)
> i7 4790K @5100MHz
> The rest in the sig.


Yep, catch

http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/spy/P/2005/1085/8200?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K


----------



## Vellinious

Quote:


> Originally Posted by *taskforce809*
> 
> Fittings: Bitspower and 2 45 degrees EK
> Radiators: 1 aquacomputer 360, 1 EK 360 and 1 Alphacool 280
> Tubing: 12mm
> Coolant: Primochill uv brite green
> 
> So, This means what?? Drain everything, replace the cards or what?? Am kind of worry now


That's the MSI Seahawk GPU, yes? I'd email their customer service with those pics.....it looks to me like the blocks were poorly plated. They'll also probably want to know what components are in your loop.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> lol whats your max clock then mate you installed it yet?


Yeah, installed it, and it was kind of a horror story too: I now have some light bleed on the "E" of the EVGA logo. So I went to EVGA chat and talked to an agent. He told me to swap the coolers from the RMA card to the new one, so I started (wish I hadn't). All the screws were no problem, but when I took off the backplate it was a bit gluey (no idea from what), and then removing the cooler's LED and fan connectors was a pain! Especially the fan one; it was secured by the hand of god. I was tinkering, trying to get it loose, and after my nails almost broke and I cut my hand, it finally came free (the connector's plastic is a bit damaged now). Then the 2nd card... same problem, only that fan cable was probably glued or soldered by god. This time I spent at least 20 minutes trying to get it loose, used a screwdriver, pliers... (cut my hand again, plus I was shaking from the nerves), then gave up and reassembled, with problems here and there. I was really afraid I'd murdered both cards... and I'm pretty sure there's blood on them too... I tested them, the fans still spin and the LEDs work, so both cards are OK (I hope).

Now the good part: this 1080 is kind of good (other than the light bleed). I got to 2150MHz (no idea how, tbh, and I'm pretty sure I could go higher too, I just have no idea how). I've just tinkered with Precision a bit, but I actually have no idea how to dial in an overclock. When I set the GPU clock offset to 150, the card just goes to 2000 max (mostly even around 1980), and when I set it to 175 it freezes... like, what? When I set a 150 offset and set a manual curve at the 150 offset, I get around 2150, but then it drops to 2121 or something (again, no idea why). Fan speed is at 100% with a 130% power target and voltage maxed.

So if anyone has some tips/tricks or other help to overclock this beast, please let me know.


----------



## Lays

Same thing happened to my Seahawk, except ALL the plating came off inside the block. I'd RMA it, but it doesn't really affect anything, and I've already rinsed the loop out, so eh...


----------



## Vellinious

Quote:


> Originally Posted by *Lays*
> 
> Same thing happened to my Seahawk, cept ALL the plating came off inside the block. I'd rma but it doesn't really effect anything, already rinsed loop out so eh..


Didn't EK make those blocks? Makes me wonder if they went back to skimping on nickel plating.....they had issues with that a few years back.


----------



## fat4l

Quote:


> Originally Posted by *Koniakki*
> 
> 
> 
> From SiliconLottery or pure blessing from the HW gods?


SL of course








Quote:


> Originally Posted by *ucode*
> 
> Some scores can be found here, ignore the Titan / LN2 ones.
> 
> It can be a little bursty and you may find GPU can clock higher. Start of with lower clocks to get a feel for it.


Ok thanks. Will look into it if I have time








Quote:


> Originally Posted by *Derek1*
> 
> What are you using to cool it with if I may ask?


Custom water cooling: EK Supremacy EVO nickel, liquid metal paste, naked die (no IHS), 2x EK D5 pumps, lots of rads... etc.








Quote:


> Originally Posted by *galeonki*
> 
> Yep, catch
> 
> http://www.3dmark.com/search#/?url=/proxycon/ajax/search/cpugpu/spy/P/2005/1085/8200?minScore=0&gpuName=NVIDIA_GeForce_GTX_1080&mode=advanced&cpuName=Intel_Core_i7-6700K


Dat core clock...not possible here


----------



## outofmyheadyo

Got my hands on an MSI Gaming X 1080 and this thing coil whines like it's its job; it's unbearable. Does MSI offer RMA for unusable cards like that? I mean, it wasn't cheap and I can't use it without going crazy. If it's a waste of time to RMA, I'll just have to sell it.
My PSU is a SuperNOVA G2 750, so I'm sure it's not caused by that. I also tried my computer in different rooms and power outlets in my apartment, and nothing helped.


----------



## T3MP3R3D

Quote:


> Originally Posted by *Vellinious*
> 
> That's the MSI Seahawk GPU, yes? I'd email their customer service with those pics.....it looks to me like the blocks were poorly plated. They'll also probably want to know what components are in your loop.


Yes, MSI Seahawk. It's funny, because I just opened an RMA due to coil whine on both cards. Thanks for the info.


----------



## T3MP3R3D

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Got my hands on msi gaming x 1080 and this thing coil whines like it's his job, its unbearable. Does msi offer rma for unusable cards like that? I mean it wasnt cheap and I cant use it without going crazy. If it's a waste of time to rma, i'll just have to sell it.
> My psu is supernova G2 750 so im sure it's not cqused by that, I also tried my comp in different rooms and poweroutlets in my apartment and nothing helped.


Are all MSI cards like that? This is my first set of MSI cards and I'm going through the same **** with my Seahawks; I've already opened an RMA.


----------



## partypoison25

Two msi gaming x here no coil whine at all.


----------



## Dry Bonez

So, coming from a GTX 580 to an EVGA SC 1080: my card will NOT, I repeat, NOT go beyond 2000MHz. Is this something to be upset about? I feel inferior to everyone when I read all these comments about achieving 2100+. So, as I said, is this something I should be worried about?


----------



## jleslie246

Does anyone know why EK canceled their water block for the 1080 Classified?


----------



## ucode

Quote:


> Originally Posted by *OZrevhead*
> 
> So how much more core clock did you get with t4 on your FE?


Going from 1.094V to 1.2V, I saw a little over 100MHz of increase. For my card that's about 2.2GHz, depending on the driver. It would be interesting to see what 1.3V could achieve, just out of curiosity.

Quote:


> Originally Posted by *Archang3l*
> 
> I was really afraid I murdered both cards... and I am pretty sure there is blood on them too... I tested them, fans still spin and leds work, both cards are ok (I hope) .


That's no way to treat the girls; sounds like child welfare needs a call.









Quote:


> Originally Posted by *fat4l*
> 
> Ok thanks. Will look into it if I have time


That's okay, I understand. Nothing wrong with being afraid.









Quote:


> Originally Posted by *jleslie246*
> 
> Does anyone know why EK canceled their water block for the 1080 Classified?


Usually it comes down to money: profit and loss. Maybe it's not worth it if the card already runs fairly cool as is and demand would be low.


----------



## TWiST2k

Quote:


> Originally Posted by *jleslie246*
> 
> Does anyone know why EK canceled their water block for the 1080 Classified?


IMO, because this round the Classified is a waste of money, and producing a waterblock for it wouldn't pay off in the sales they'd receive. I have a 980 Ti Classified and it's a great card, but it was all about the FTW for the 1080s.


----------



## Vellinious

Quote:


> Originally Posted by *TWiST2k*
> 
> IMO, because this round the Classified is a waste of money, and them producing a waterblock for it would not equate to the amount of sales they would receive. I have a 980 Ti Classifed and its a great card, but it was all about the FTW for the 1080s.


The higher power limit is nice, and if the classy voltage tool works to increase voltage past that 1.093v barrier, then there's something else. If there were a pascal bios editor, I'd tend to agree with you, unless you were going sub-ambient, but.....without the ability to dial in a bios, the classy isn't looking like that bad of an option.


----------



## OZrevhead

I'm testing the T4 BIOS now, then going back to a 980 Ti, as it's fast enough for me and they're half the price here.

Actually, my T4 BIOS is only showing 1.131V. How do I get to 1.25V?

Cheers.


----------



## fat4l

Quote:


> Originally Posted by *ucode*
> 
> That's okay, I understand. Nothing wrong with being afraid.


Afraid of what? Being killed in a war? That I would understand, but... afraid of submitting scores? Really?


----------



## Spiriva

Quote:


> Originally Posted by *OZrevhead*
> 
> Im testing T4 bios now then going back to a 980Ti as its fast enough for me and they are half the price here.
> 
> Actually my T4 bios is only showing 1.131v, how do I get to 1.25v ?
> 
> Cheers.


You need to use the curve (graph) overclocking method to get the voltage up: Ctrl+F if you're using MSI Afterburner.


----------



## TWiST2k

Quote:


> Originally Posted by *Vellinious*
> 
> The higher power limit is nice, and if the classy voltage tool works to increase voltage past that 1.093v barrier, then there's something else. If there were a pascal bios editor, I'd tend to agree with you, unless you were going sub-ambient, but.....without the ability to dial in a bios, the classy isn't looking like that bad of an option.


I hear ya man, just my take on the matter, but we all have a different outlook.


----------



## Lays

Everybody be talkin about 1.093v and here I am trying to wonder why I can't get past 1.062v









I miss my 980 ti matrix sometimes, lol.


----------



## OZrevhead

Quote:


> Originally Posted by *Spiriva*
> 
> You need to use the graph overclocking way to get the volt up. CTRL +F if you are using msi afterburner.


I got the graph up and tuned it, but the GPU clocks don't follow the curve, or I get no overclock or voltage increase. I'm doing something wrong; any ideas what?










I hit 20k in Fire Strike, so I'm happy anyway (1.113V, 2139MHz).


----------



## juniordnz

Quote:


> Originally Posted by *OZrevhead*
> 
> I got the graph up, tuned it but the gpu clocks dont follow the curve or I get no overclock or voltage increase. I am doing something wrong , any ideas what?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hit 20k Fire Strike so Im happy anyway (1.113v, 2139MHz)


What's your graphics score with that OC? That's the one that tells the story here...

All that voltage for 2139MHz seems like a lot, BTW. I'd also advise starting with the memory OC and moving to the core afterwards; that yields more performance than the last 13-26MHz on the core.
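As a back-of-the-envelope illustration of why the memory OC can matter more than the last core bin (using the GTX 1080's public specs: 256-bit bus, 10 Gbps effective GDDR5X; the +1 Gbps OC figure is just an example):

```python
# Rough sketch: memory bandwidth scales linearly with the effective data rate.
# GTX 1080 stock: 256-bit bus at 10 Gbps effective per pin = 320 GB/s.
def bandwidth_gb_s(effective_gbps: float, bus_width_bits: int = 256) -> float:
    """Memory bandwidth in GB/s for a given effective per-pin data rate."""
    return effective_gbps * bus_width_bits / 8

print(bandwidth_gb_s(10.0))  # 320.0 GB/s at stock
print(bandwidth_gb_s(11.0))  # 352.0 GB/s with a ~+500 MHz (+1 Gbps) memory OC
```

A 10% memory OC buys roughly 10% more bandwidth, while one more 13MHz core bin on a ~2.1GHz core is well under 1%.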


----------



## Tdbeisn554

Quote:


> Originally Posted by *ucode*
> 
> That's no way to treat the girls, sounds like child welfare need a call.
> 
> 
> 
> 
> 
> 
> 
> 
> .


Well I know I feel really bad...


----------



## ucode

Quote:


> Originally Posted by *fat4l*
> 
> Being affraid of what ? Being killed in a war or what? Then I would understand it but ....being affraid of submitting scores ? rly


lol. Just playing with you.


----------



## T3MP3R3D

Will the FTW backplate fit on the Classified?


----------



## Casper123123123

Hi Guys,

Could you please let me know where I can download the ASUS GTX 1080 VBIOS for the Founders Edition?


----------



## nrpeyton

Quote:


> Originally Posted by *ucode*
> 
> From 1.094V to 1.2V saw a little on the wrong side of 100MHz increase. For my card that's about 2.2GHz depending on driver. Would be interesting to see what 1.3V could achieve, just out of curiosity.
> That's no way to treat the girls, sounds like child welfare need a call.
> 
> 
> 
> 
> 
> 
> 
> 
> That's okay, I understand. Nothing wrong with being afraid.
> 
> 
> 
> 
> 
> 
> 
> 
> Usually comes down to money, profit / loss. Maybe not worth it if it already runs fairly cool as is and demand would be low.


How did you get to 1.2V?

I have the Classified 1080 and I can't get past 1.093V despite trying everything I can think of, short of a physical voltage hardware mod, which I have no idea how to do.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I have a copy of an old one for Maxwell, but I doubt it works with pascal, unless they used the exact same controller.
> 
> Or, you can try this one....
> 
> http://www.overclock.net/t/1411500/official-evga-classified-k-ngp-n-owners-club


Tried both versions, even ran them as administrator, and also checked whether it was maybe working but just not being reported in MSI AB. But nope, no change at all.

Disappointed :-(


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Yeah installed it, kinda a horror story too: I now have some lightbleed on the "E" from the EVGA logo. So I went to the EVGA chat and talked to the agent. he told me to swap the coolers from the to rma card to the new one, so I started (wish I had not done that). all the screws no problem, then when I took off the backplate it was a bit gluey(no idea what) and then the cooler removing the led and fan connectors were a pain! especially the fan one it was secured by the hand of go I was tinkering and trying to get it loose and after my nails almost broke and I cut my hand it was finally loose (connector's plastic is a bit damaged now). Then the 2nd card.. same problem only that that fan cable was probably glued or soldered by god this time I spend at least 20min trying to get it loose used a screwdriver, pliers,... (cut my hand again plus I was shaking from the nerves) and I gave up and re assembled while having problems here and there. I was really afraid I murdered both cards... and I am pretty sure there is blood on them too... I tested them, fans still spin and leds work, both cards are ok (I hope) .
> 
> Now the good part, this 1080 is kinda good (other than the lightbleed then) I got to 2150MHz (No idea how I did that tbh, and pretty sure I could go higher too just no idea how) Just tinkered with Precision a bit but I actually have no idea how to dial in an overclock. When I set GPU clock offset to 150card just goes to 2000 max (mostly even around 1980) and when I set it to 175 It freezes.. like what? When I set 150 offset and set manual curve on 150 offset I get around 2150 but then drops to 2121 or something (again no idea why) Fan speed is at 100% with 130% power target and max voltage thing
> 
> So if anyone has some tips/tricks or other help to overclock this beast please let me know


What you could try, if you want to find your max stable clock without too much hassle: download MSI Afterburner Beta 14 and then do this:

1. Press Ctrl+F to bring up the voltage curve window.
2. Click the plot point for 1.093V (the highest we can go).
3. Press "L" (you'll see a yellow line appear on screen after you press L).
4. Plot a point on the graph for 1.093V (say, +150).
5. Raise voltage, power and fan to 100% in the main window.
6. Click the "tick" in the main window to confirm everything.
7. Test for stability.
8. Repeat with +165, +175, +180, etc. until you find your highest stable offset.

By doing it this way you are "locking" onto one voltage, so it won't jump around. The max clock might still jump a bit, but only by one or two "13 MHz steps", which is temperature dependent.
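To make the "13 MHz steps" idea concrete, here's a tiny sketch. The 13MHz bin size is the commonly reported Pascal boost step, and the clock numbers are made-up examples, not measured values:

```python
# Illustrative sketch: Pascal boost clocks move in discrete ~13 MHz bins,
# so a requested base + offset is effectively snapped down to a bin boundary.
# BIN_MHZ and the example clocks below are assumptions for illustration only.
BIN_MHZ = 13

def snapped_clock(base_mhz: int, offset_mhz: int, bin_mhz: int = BIN_MHZ) -> int:
    """Snap base + offset down to the nearest bin multiple."""
    return ((base_mhz + offset_mhz) // bin_mhz) * bin_mhz

# A +150 offset on a 1987 MHz boost clock requests 2137 MHz,
# which snaps down to the bin boundary at 2132 MHz:
print(snapped_clock(1987, 150))  # 2132
```

That's why an observed clock can sit a step or two below the requested offset, and why temperature nudging the card into the next bin looks like a sudden 13MHz drop.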


----------



## nrpeyton

Quote:


> Originally Posted by *Casper123123123*
> 
> Hi Guys,
> 
> Could you please let me know where I can download Asus GTX 1080 VBIOS for Founders Edition?


https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803

Be careful: this BIOS removes the power limit *and temp limit* and enables voltage up to 1.2V. If you're not careful and don't monitor temps, you could destroy your card. Because the temp/TDP limit is completely removed, the fan doesn't always ramp up automatically as the card heats up.

Wish I could use it with my Classified, but it's not compatible :-(


----------



## nrpeyton

Anyone got experience with RMA'ing?

I bought my Classified from Scan.co.uk a few weeks ago.

Distance selling regulations here give 14 days, so I made sure I sent off an RMA request the day before yesterday (just in case I change my mind). That gives me another 10 days to send the card to them (and make my decision), as long as you've notified them in time. I've been issued a code and everything.

*But... obviously I have opened it*, so the plastic packaging is ripped, and while the original box is in perfect condition and not torn, there are still slight marks where I've peeled the tape off... etc.

In other words, it's *not exactly* how it was when sold, because it's been opened.

I'm just not happy, because everyone else seems to be taking advantage of the STRIX/T4 BIOS... but I can't... and I actually have the *ONE CARD THAT SHOULD*, more than any other card, be able to use it. It's the Classified; it's meant to have overclocking in its blood... yet everyone else can use the T4 for extra voltage *EXCEPT* me.

I don't want to hear any more about how voltage doesn't scale as well on Pascal... it *does STILL* scale, just not as much as it used to, and that trend will only get worse, not better; so there's no point reading another post about it. I just read a post where someone got an extra 100MHz between 1.093V and 1.2V. So there we have it.

*Anyway, I have never RMA'd anything before. Can anyone give me advice on what to expect?*

I'm in the UK, btw... Scotland, to be exact 

The Classified cost me £710, which is equivalent to exactly $870.80 US.

Thanks!


----------



## nrpeyton

Quote:


> Originally Posted by *jleslie246*
> 
> Does anyone know why EK canceled their water block for the 1080 Classified?


Not sure -- but *Alphacool are making one*. It will be ready in 5 weeks.

Just got an email from them yesterday confirming this:

Here is a link to a post I made about it on another forum (emails included)

https://www.techpowerup.com/forums/threads/evga-classified-1080-owners-waterblock.226959/#post-3541516


----------



## nrpeyton

Quote:


> Originally Posted by *TWiST2k*
> 
> IMO, because this round the Classified is a waste of money, and them producing a waterblock for it would not equate to the amount of sales they would receive. I have a 980 Ti Classifed and its a great card, but it was all about the FTW for the 1080s.


What would you have had EVGA do that they didn't? You say the Classified is a waste of money this round, but what would you have had EVGA do differently?

I'm interested because I have a Classified 1080 and I'm thinking of RMA'ing it and getting something else, but I'm struggling to decide.

What did EVGA miss?


----------



## chiknnwatrmln

Holy septuple post. Use the edit feature.


----------



## nrpeyton

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Holy septuple post. Use the edit feature.


If you look at my posts, they are all quotes/replies to different posts discussing different issues.

I've just come on after being off for 24 hours, so I had a lot to catch up on. I could have waited between posts, I suppose. Sorry, lol.


----------



## outofmyheadyo

Why does it take 5 weeks to manufacture a waterblock?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *nrpeyton*
> 
> If you look at my posts they are all quotes/replies to different threads discussing different issues.
> 
> I've just came on after being off for 24hrs so had a lot to catch up on. I could have waited between posts I suppose. sorry. lol.


Hit multi after each post you want to quote and then hit quote after the last post you want to respond to.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *outofmyheadyo*
> 
> why does it take 5 weeks to manufacture a waterblock ?


Well, first you have to CAD it up and do the engineering, making sure all the tolerances are correct, etc. Then make a prototype or two (or five...) to make sure it works, then send it out for high-volume production, then get the parts in, then assemble and package them. Then they're ready to go.

At least that's my guess. If the engineers are good enough, they might get it right on the first iteration.


----------



## OZrevhead

Junior: I got about 25,300 IIRC. 1.13V is what the T4 gives me before touching the curve; once I touch it, my results get worse.

:lol:


----------



## Derek1

Hopefully there won't be much of a stampede for this.










https://twitter.com/i/web/status/789161236520771588
Line starts behind me!









ETA: Damn! I thought it meant the Hybrid kits, but it's only the Hydro Copper block. Can the kits be far behind?


----------



## juniordnz

Quote:


> Originally Posted by *OZrevhead*
> 
> Junior - I got about 25300 iirc, 1.13 is what the t4 gives me before touching the curve, once I touch it my results get worse.
> 
> :lol:


That's a lot of voltage for 25300 points, mate. I get 25400 with stock 1.062V. There must be something wrong with your curve, try fiddling with it a little bit more to see if you can find a sweet spot.


----------



## smicha

Could I join the club?


----------



## juniordnz

Quote:


> Originally Posted by *smicha*
> 
> If I could join the club ?


Holy Christ! Wish I could see some Minecraft running on that rig!


----------



## OZrevhead


Quote:


> Originally Posted by *juniordnz*
> 
> That's a lot of voltage for 25300 points, mate. I get 25400 with stock 1.062V. There must be something wrong with your curve, try fiddling with it a little bit more to see if you can find a sweet spot.


This is the third BIOS I've tried; I think the FE BIOS scored over 25k too. I'm not fussed, I traded down to a 980 Ti to get some cash back, and there are plenty of BIOS mods for them too.


----------



## midlan

I have a question for owners of the EVGA Classified: do you experience coil whine?

I'm on my 2nd FTW and it has lots of coil whine, even louder than the fans on it. Same as the 1st. So I'm wondering about getting a Classified, if it doesn't have coil whine.


----------



## nrpeyton

Quote:


> Originally Posted by *midlan*
> 
> I have question for owners of EVGA CLASSIFIED. Do you experiencing coil whine issue?
> 
> I have my 2nd FTW and it has lots of coil whine, even louder than fans on it. Same as the 1st. So I am wondering about getting CLASSIFIED if it hasn't coil whine.


I've had mine (a 1080 Classified) for two weeks with no coil whine. Then today, suddenly, I can hear a weird humming noise, but it's coming from my speakers... it has the same characteristics as coil whine.

It only does it when the graphics card is benching/gaming.

With Prime95, I hear nothing.

Quote:


> Originally Posted by *OZrevhead*
> 
> This is the third bios I have tried, I think the FE bios scored over 25k too. Im not fussed, I down traded for a 980Ti to get some cash back and bios mods a plenty for them too.


We might see more become available for Pascal after the 1080 Ti, as the elite guys who code all that stuff begin to upgrade.


----------



## juniordnz

Quote:


> Originally Posted by *midlan*
> 
> I have question for owners of EVGA CLASSIFIED. Do you experiencing coil whine issue?
> 
> I have my 2nd FTW and it has lots of coil whine, even louder than fans on it. Same as the 1st. So I am wondering about getting CLASSIFIED if it hasn't coil whine.


I think you got a bad one. I'm on my second FTW and I can't hear a thing besides the fans spinning.


----------



## midlan

nrpeyton: when I connected speakers to my computer while gaming, they were receiving exactly the same tones of coil whine as the card was producing. It's audible in the headphones too, but not as much. But affecting my audio is not the main issue; that can be fixed with an external audio card.

My build is super silent (passive PSU, Noctuas running at 300-600 RPM), so I can hear the coil whine clearly, directly from the card.

EDIT: the noise I can hear is exactly the same as this card's:


----------



## nrpeyton

Quote:


> Originally Posted by *midlan*
> 
> nrpeyton: when i connected speakers to my competer while gaming, they were recieving the exactly the same tones of coil whine as the card was producing. In the headphones it is audible too, but not as much. But affecting my audio is not main issue. It can be fixed with external audio card.
> 
> My build is super silent (passive PSU, noctuas running on 300-600 RPM) so I can hear the coil whine clearly directly from the card.


Hmm, the fact that I can hear it through the speakers is excruciatingly annoying, and it gets worse as I turn the volume up. I'm the same: headphones aren't so bad, but it's still there.

If you can actually hear it with the audio switched off, I can't even bear to imagine how annoying that must be for you :-(

If you can't learn to live with it, you'll need to RMA.

I would have asked if you've got anything plugged into the 'line in' on your motherboard's sound, because for me that makes it 10x worse, but I'm not sure that'll help you.


----------



## chiknnwatrmln

All the cards I've owned (granted, not many: a 670 FTW, 7950, 290X, 290, and now a 1080) have had some sort of whine or buzzing noise. The AMD ones were the worst, but in my experience every card is going to make some noise. I can't see myself RMA'ing card after card, or purchasing another $700 video card, because of a quiet whine.

As for nrpeyton: it sounds like you're getting some feedback or something. As someone else said, an external sound card may help.


----------



## Vellinious

Ya know, a lot of the time when there's buzzing in the sound, it's due to a bad / low-quality PSU or a bad / low-quality motherboard causing feedback into the audio components.

Coil whine comes from the GPU and doesn't get transmitted anywhere, as it's a physical trait: the vibration of the coils resonating at a frequency the human ear can hear. It usually happens when the GPU is pushing higher frame rates, and will likely get worse as frame rates increase.

If the noise is coming from your speakers, I would look at other areas before the GPU. Just a thought.


----------



## Derek1

Just for clarity's sake, 'feedback' is not the proper term in this case. Feedback occurs when audio from the speakers is picked up by a mic and fed back on itself, typically as a high-pitched shriek, though it can also occur at lower frequencies.

Electronic 'buzzing' or noise of that sort would more closely be termed a phase-shift problem.


----------



## midlan

Vellinious: that's the theory; in fact I hear the same tones from the audio and the card. I have a high-end PSU (Seasonic Platinum 520W Fanless) and a high-end motherboard (ASUS MAXIMUS VIII GENE), so I don't think that is the problem.


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> Ya know, a lot of times when there's buzzing in the sound it's due to a bad / low quality PSU or a bad / low quality motherboard, and they cause feedback into the audio components..
> 
> Coil whine comes from the GPU, and doesn't get transmitted anywhere, as it's a physical trait....the vibration in the coils that resonates at a frequency the human ear can hear. Usually happening when the GPU is pushing higher frame rates, and will likely get worse as frame rates increase.
> 
> If the noise is coming from your speakers, I would look at other areas before the GPU. Just a thought.


I think what you are describing is a problem with poorly shielded wiring bleeding across lines or components on the pcb's or improper grounding.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> I think what you are describing is a problem with poorly shielded wiring bleeding across lines or components on the pcb's or improper grounding.


^^This

I apologize that I wasn't able to articulate it very well. I work in the evenings, and browse the forums during my "brain breaks". I'm not always on top of my game.... At least someone was able to understand what I was trying to say. lol


----------



## outofmyheadyo

Did a few benchmark runs with my MSI 1080 Gaming X: 6700K @ 4.6 GHz + 3200 14-14-14-34 RAM, GPU @ +175 (2152) +500 mem

Fire Strike - 19 971 - Graphics Score 25 621 - Physics Score 14 731 - Combined Score 9 419

Time Spy - 7 867 - Graphics Score 8 355 - CPU Score 5 911

Pretty happy with the performance so far. It would be nice to break a 20k score in Fire Strike and 8k in Time Spy, but I'm afraid that's all I can muster with the 6700K.

Fire Strike Ultra Stress Test 99%


----------



## juniordnz

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Did a few benchmarks run with my MSI 1080 GamingX 6700K @ 4,6 + 3200 14-14-14-34 ram,GPU @ +175 (2152) +500 mem
> 
> firestrike - 19 971 - Graphics Score 25 621 - Physics Score 14 731 - Combined Score 9 419
> 
> timespy - 7867 - Graphics Score 8 355 - CPU Score 5 911
> 
> Pretty happy with the performance so far, would be nice to break 20k score on firestrike and 8k on timespy but I`m afraid thats all I can muster with the 6700K.
> 
> Firestrike Ultra Stress Test 99%


That's a nice graphics score!









Did you try maxing your VRAM first and then moving to the core? If not, give it a try out of curiosity... that approach worked best for me!


----------



## outofmyheadyo

Haven't really tried to max it out yet, will do when I have more time. Ran Extreme and Ultra as well, just for comparison:

Fire Strike Ultra - 5 991 - Graphics Score 6 007 - Physics Score 14 714 - Combined Score 3 138

Fire Strike Extreme - 10 967 - Graphics Score 12 007 - Physics Score 14 759 - Combined Score 5 390


----------



## steeludder

Hello guys, I haven't been following this for a while... but did I understand right that the Asus T4 BIOS can be flashed onto an FE card? I've got a Palit FE 1080 that I've modded with conductive liquid metal to bypass the power limit... I've got it under water (and I'm hooking up a chiller this weekend), which means voltage is the only real limiter at present.

Please tell me the T4 BIOS is OK to flash onto my 1080... The last thing I wanna do is brick it!


----------



## TWiST2k

Quote:


> Originally Posted by *steeludder*
> 
> Hello guys, hadn't been following this for a while... but did I understand this right that the Asus T4 bios can be flashed onto a FE card? I've got a Palit FE 1080 that I've modded with conductive liquid metal to bypass the power limit... I've got it under water (and hooking up a chiller this weekend) which means voltage is the only real limiter at present.
> 
> Please tell me the T4 bios is ok to flash onto my 1080... Last thing I wanna do is brick it!


Why not use the Pascal BIOS Editor we just got? It's a couple pages back, bro; just keep reading and you will find out.


----------



## steeludder

Quote:


> Originally Posted by *TWiST2k*
> 
> Why not use the Pascal BIOS Editor we just got, its a couple pages back bro, just keep reading and you will find out.


Don't play with my sanity, man... There is no such thing as a Pascal BIOS editor... argh


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> Why not use the Pascal BIOS Editor we just got, its a couple pages back bro, just keep reading and you will find out.


hahaha epic









on-topic: do you guys think a 5.0GHz 4790K will be able to run 120fps @ 1080p without any bottleneck? I'm bottlenecking like crazy with this 4690K at 4.4, and I just bought a binned 4790K to see if it can handle the bottleneck and keep me away from Sky/Kaby.


----------



## steeludder

Quote:


> Originally Posted by *juniordnz*
> 
> hahaha epic


...but I did not fall for it!









I have been reading quite a few pages back on several occasions, however, and I've seen mentions of the T4 BIOS in combination with the two letters "FE". So no need to be patronising; I'm just trying to get a proper understanding and avoid breaking my card.


----------



## juniordnz

Quote:


> Originally Posted by *steeludder*
> 
> ...but I did not fall for it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been reading quite a few pages back on several occasions however. I've seen mentions of the T4 bios in combination with the two letters "FE". So no need to be patronising, I'm just trying to get proper understanding and avoid breaking my card.


Meant no offense, bro. And I believe twist2k meant none either. It was just a silly joke, have a good laugh and move on.

Btw, T4 is the "best" we have so far. I got my best results with it, so you should give it a try if you're not afraid of cross-flashing. Just be aware that you'll need to be able to dissipate the extra heat, otherwise it's a moot point.

Happy OCing


----------



## feznz

Quote:


> Originally Posted by *steeludder*
> 
> ...but I did not fall for it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been reading quite a few pages back on several occasions however. I've seen mentions of the T4 bios in combination with the two letters "FE". So no need to be patronising, I'm just trying to get proper understanding and avoid breaking my card.


I've been wondering if a raised power limit would net any worthwhile gains; 2100MHz seems to be the point where the performance scaling drops off significantly.

http://videocardz.com/60923/galax-overclocks-gtx-1080-to-2-2-ghz-on-air-2-5-ghz-with-ln2


----------



## steeludder

Quote:


> Originally Posted by *juniordnz*
> 
> Meant no offense, bro. And I believe twist2k meant none either. It was just a silly joke, have a good laugh and move on.
> 
> Btw, T4 is the "best" we have so far. Got my best results with it so you should give it a try if you're not afraid of cross-flashing. Just be aware the you'll need to be able to dissipate the extra heat, otherwise it's a moot point
> 
> Happy OCing


It's cool man, I know how it feels to discuss issues profusely just to have a noob come in and say "can u phlash mah bios plox?"

But this is a little trickier: the thread is 729 pages long, time is limited, and the topic is very much experience-based. As you said, cross-flashing isn't a safe exercise... and that's largely why I was asking, because I've seen people mention "successful" flashes of that T4 BIOS onto FE cards.

FE cards all have the same design, so if it works for one, it should work for all (in theory). I was just trying to make sure I understood this correctly.









EDIT: I have the card under H2O with an EK waterblock. I'm hooking up a chiller this weekend so cooling is not an issue... I'd really like to be able to push voltage!


----------



## steeludder

Quote:


> Originally Posted by *feznz*
> 
> I've been wondering if a raised power limiter would net any worthwhile gains 2100Mhz seems to be the point where the performance scaling drops off significantly
> 
> http://videocardz.com/60923/galax-overclocks-gtx-1080-to-2-2-ghz-on-air-2-5-ghz-with-ln2


The power limit mod I did helped with OC stability, but didn't give more OC room. My Palit FE is topping out around 2164 under water. On the FE cards, the temp, voltage and power limits are very close to each other; if you alleviate one of them, you'll run into the next pronto. That's why you need to get rid of all three if you want any hope of going further.


----------



## juniordnz

Quote:


> Originally Posted by *steeludder*
> 
> It's cool man, I know how it feels to discuss issues profusely just to have a noob come in and say "can u phlash mah bios plox?"
> 
> But this is a little trickier, the thread is 729 pages long, time is limited and the topic is very much only experience-based. As you said, cross-flashing isn't a safe exercise... and that's a lot why I was asking. Because I've seen people mention "successful" flashes of that T4 bios onto FE cards.
> 
> FE cards all have the same design so if it works for one, it should work for all (in theory). I was just trying to make sure I understood this correctly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: I have the card under H2O with an EK waterblock. I'm hooking up a chiller this weekend so cooling is not an issue... I'd really like to be able to push voltage!


I "successfully" flashed T4 to my FTW (if you can call losing one DisplayPort + no TDP monitoring successful). Got my best results so far (25.750 Fire Strike graphics).

Stock BIOS: 25.400 @ 1.062V
T4 BIOS: 25.750 @ 1.093V

Is the ~1.35% performance gain worth the extra 0.031V, running a BIOS not meant for the card, with no TDP monitoring? IMO, it's not.
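For anyone who wants to sanity-check numbers like these, a quick back-of-the-envelope in Python (scores and voltages taken straight from the post above; the gain works out to roughly 1.4%):

```python
# Fire Strike graphics scores and voltages quoted above (stock vs. T4 BIOS).
stock_score, t4_score = 25400, 25750
stock_v, t4_v = 1.062, 1.093

gain_pct = (t4_score - stock_score) / stock_score * 100
extra_v = t4_v - stock_v

print(f"+{gain_pct:.2f}% performance for +{extra_v:.3f} V")
```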


----------



## steeludder

Quote:


> Originally Posted by *juniordnz*
> 
> I "successfully" flashed T4 to my FTW (If you can call losing one DP + no TDP monitoring successful). Got my best results so far (25.750 firestrike graphics).
> 
> Stock BIOS: 25.400 @ 1.062V
> T4 BIOS: 25.750 @ 1.093V
> 
> Is the 1,35% performance gain worth the extra 0.031V, running a BIOS not meant for the card, no TDP monitoring? IMO, it's not.


I agree... Thanks for sharing.
The FTW and the FE are different PCBs though, so I'm not going to give it a shot just yet without more info.


----------



## juniordnz

Quote:


> Originally Posted by *steeludder*
> 
> I agree... Thx for sharing.
> The FTW and the FE are different PCBs though so I'm not gonna give it a shot just yet without more info.


I would only run the T4 BIOS on a card that has a serious power delivery system. Something like an AMP! Extreme, Classified, HOF, etc. But that's just me...

The truth is that these Pascals are so evenly matched, and scale so badly with higher voltage, that cross-flashing will hardly be worth it for most of us.


----------



## steeludder

Quote:


> Originally Posted by *juniordnz*
> 
> I would only run the T4 BIOS on a card that has a serious power system. Something like a AMP!Extreme, Classy, HOF, etc. But that's just me...
> 
> The truth is that these pascals are so leveled and scale so bad with higher voltage that cross-flashing will hardly be worth it for most of us.


I wouldn't consider cross-flashing normally, but with the water chiller coming on and the relatively small voltage bump I would actually apply to the GPU, I think it would definitely be worth a try.


----------



## juniordnz

Quote:


> Originally Posted by *steeludder*
> 
> I wouldn't consider cross-flashing normally, but with the water chiller coming on and the relatively small voltage bump I would actually apply to the GPU, I think it would definitely be worth a try.


So please do, and report back. It's always nice to hear about other people's experiences with OCing.


----------



## steeludder

I like your style!


----------



## chiknnwatrmln

Definitely post back; if all goes well, I'll probably flash the T4 to my FE this weekend. I'm tired of hitting the power limit and throttling from 2189 to ~2100MHz. This chip should score more than 24.5k in FS.


----------



## juniordnz

So, what's your insight on this?

https://www.reddit.com/r/58hnzb/word_of_warning_the_evga_1080_blackscreenfan_bug/

Also, how much did you use (in size) to cover all the VRAM and VRM modules? What size of pad should I get? Thanks!


----------



## Coopiklaani

Flashed my GTX 1080 FE with the Strix T4 BIOS and played with the curve a little bit. I'm able to hit around 2265MHz on the core with 1.193V.
Then I decided to run some FurMark to push my VRM to the very limit. HWiNFO reported just over 400W on a 180W TDP card.

Just over 200% TDP, nothing to worry about.
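For scale, assuming the HWiNFO reading is even roughly right, that draw versus the FE's rated 180W TDP works out like this:

```python
# Numbers from the post above: ~400 W reported by HWiNFO on a card
# rated for 180 W TDP (reading accuracy at that load is uncertain).
rated_tdp_w = 180
reported_w = 400

pct_of_tdp = reported_w / rated_tdp_w * 100
print(f"~{pct_of_tdp:.0f}% of rated TDP")
```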


----------



## NeoandGeo

Should most 1080 FTW cards be able to reach ~2100MHz? I seem to have hit a wall at 2050MHz (equals +50 on my core), so I have started to OC my memory and have gotten it to +450, but I can't seem to break the 24k barrier in 3DMark. I have tried the slave BIOS and basic voltage-slider adjustments, but they don't seem to make a big difference other than max voltage/TDP when benchmarking. I have a fan curve that keeps me below 60C, and I'm using AB 4.3.0 Beta 14.

Haven't played with the advanced voltage tweaking yet, as I'm not too sure how it fully works. Any tips on where to start on getting my OC closer to 2100?


----------



## Coopiklaani

Quote:


> Originally Posted by *NeoandGeo*
> 
> Should most 1080FTW cards be able to reach ~2100Mhz? I seem to have hit a wall at 2,050Mhz (Equals +50 on my core so I have started to OC my memory, have gotten it to +450, but can't seem to break the 24k barrier in 3DMark. I have tried the slave BIOS and basic voltage slider adjustments, but it doesn't seem to make a big difference other than max voltage/TDP when benchmarking. I have a fan curve that keeps me below 60c and using AB 4.3.0 Beta 14.
> 
> Haven't played with the advanced voltage tweaking yet as I'm not too sure of how it fully works. Any tips on where to start in on getting my OC closer to 2100?


No, FTW cards are not binned. They are just as good as an FE card for overclocking, just quieter and cooler. Try the T4 BIOS and see if the extra voltage helps.


----------



## steeludder

Quote:


> Originally Posted by *Coopiklaani*
> 
> flashed my GTX1080 FE with strix t4 bios, played with the curve a little bit. I'm able to hit around 2265MHz on the core with 1.193v.
> Then I decided to run some furmark to push my vrm to the very limit. HWinfo reported just over 400w of a 180w TDP card.
> 
> 
> 
> 
> 
> 
> 
> just over 200% TDP, nothing to worry about.


LOL, epic.
What temps & cooling?
What brand FE?


----------



## x-apoc

This is the pad: ARCTIC Thermal Pad (145 x 145 x 1.5 mm)
https://www.amazon.com/ARCTIC-Thermal-Pad-145-1-5/dp/B00UYTU6Z6/ref=sr_1_1?ie=UTF8&qid=1477062229&sr=8-1&keywords=ARCTIC+Thermal+Pad+%28145+x+145+x+1.5+mm%29

I used like 6" of slack; got plenty left to cover a few more video cards.


----------



## NeoandGeo

Quote:


> Originally Posted by *Coopiklaani*
> 
> No FTW cards are not binned. They are just as good as a FE card for overclocking, just quieter, cooler. Try t4 bios, see if that extra voltage could help.


I will try that, and report back. Thanks!


----------



## juniordnz

Quote:


> Originally Posted by *NeoandGeo*
> 
> Should most 1080FTW cards be able to reach ~2100Mhz? I seem to have hit a wall at 2,050Mhz (Equals +50 on my core so I have started to OC my memory, have gotten it to +450, but can't seem to break the 24k barrier in 3DMark. I have tried the slave BIOS and basic voltage slider adjustments, but it doesn't seem to make a big difference other than max voltage/TDP when benchmarking. I have a fan curve that keeps me below 60c and using AB 4.3.0 Beta 14.
> 
> Haven't played with the advanced voltage tweaking yet as I'm not too sure of how it fully works. Any tips on where to start in on getting my OC closer to 2100?


Both FTW I had could go past 2100mhz. But, of course, there will always be some duds amongst good cards.
Quote:


> Originally Posted by *x-apoc*
> 
> This is the pad ARCTIC Thermal Pad (145 x 145 x 1.5 mm)
> https://www.amazon.com/ARCTIC-Thermal-Pad-145-1-5/dp/B00UYTU6Z6/ref=sr_1_1?ie=UTF8&qid=1477062229&sr=8-1&keywords=ARCTIC+Thermal+Pad+%28145+x+145+x+1.5+mm%29
> 
> I used like 6" slack, got plenty left to cover few more video cards.


Thanks, but what would that be in square area? I'm looking at some Fujipoly 100x15x1.0mm.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Coopiklaani*
> 
> flashed my GTX1080 FE with strix t4 bios, played with the curve a little bit. I'm able to hit around 2265MHz on the core with 1.193v.
> Then I decided to run some furmark to push my vrm to the very limit. HWinfo reported just over 400w of a 180w TDP card.
> 
> 
> 
> 
> 
> 
> 
> just over 200% TDP, nothing to worry about.


Damn son. I wonder how accurate that reading is at that point. Regardless, I'm gonna try it; here's hoping the stock power delivery system can handle it!

What were your clocks with the stock BIOS at 1.093V?


----------



## juniordnz

What I *really* wonder is whether everybody is testing the performance gain on those overclocks or just bragging about numbers. My card could reach 2189MHz, but I couldn't care less about that, since my 2114MHz overclock yields more performance.

Clocks, meh...


----------



## x-apoc

Quote:


> Originally Posted by *juniordnz*
> 
> Both FTW I had could go past 2100mhz. But, of course, there will always be some duds amongst good cards.
> Thanks, but what would that be in square area? I'm lookin at some fujipoly 100X15x1,0mm


Not sure off the top of my head. Around 180 millimeters of area.
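For what it's worth, the square areas follow directly from the dimensions listed in the two posts (Arctic 145 x 145 x 1.5 mm sheet vs. Fujipoly 100 x 15 x 1.0 mm strips; note they also differ in thickness):

```python
# Pad dimensions as listed above (length x width, in mm).
arctic_area_mm2 = 145 * 145    # one Arctic sheet
fujipoly_area_mm2 = 100 * 15   # one Fujipoly strip

strips_per_sheet = arctic_area_mm2 / fujipoly_area_mm2
print(f"Arctic sheet: {arctic_area_mm2} mm^2; "
      f"about {strips_per_sheet:.0f} Fujipoly strips of equivalent area")
```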


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> flashed my GTX1080 FE with strix t4 bios, played with the curve a little bit. I'm able to hit around 2265MHz on the core with 1.193v.
> Then I decided to run some furmark to push my vrm to the very limit. HWinfo reported just over 400w of a 180w TDP card.
> 
> 
> 
> 
> 
> 
> 
> just over 200% TDP, nothing to worry about.


How the hell are all these basic Founders Editions (with a 5-phase VRM) beating all the other manufacturers so consistently? I keep seeing FE owners mentioning their clocks, and most are over 2200, while the rest of us with custom cards are lucky to get to 2150.

Doesn't make sense.

I have the Classified ($850 here) with a 14-phase VRM, and from the posts I've read across different forums, other Classified owners seem to be "one of the lucky ones" if they can merely get to 2150. Doesn't make sense....

I wonder how my chip would perform if I put it on an FE circuit board. Hmm....

*wonders if NVIDIA keeps the best clockers for the FEs.


----------



## bloot

I'm the proud owner of a Palit Super GTX 1080 JetStream for a week now. The memory on this thing overclocks pretty well; not so much the core, though.

The cooler is really fantastic: the stock profile is ultra silent and it never goes above 70ºC.


----------



## Krzych04650

Thinking about picking up two 1080s, but I am not sure about that. Prices are pretty nice for some models already, even in Poland, where prices are usually pathetic and the 1080 released at $930. But now I can get something like the MSI Armor for 2799 PLN ($700 including VAT), so it basically matches the US price (except we earn four times less in Poland than people in the US). To give you some comparison, the GTX 980's stabilized price was around 2400 PLN; for the 980 Ti it was around 3200. So 2799 seems sensible, and I cannot really see it going significantly lower even after the 1080 Ti/Vega release. It may reach that 2400 point somewhere in the future, after 6 months or so, but that is far too much waiting just to save 400 PLN, equivalent to $100.

I am just not sure if investing in a 1080 and later SLI is a good way to go, especially considering the costs and the fact that I don't exactly have all of this money right now; I would have to borrow for the second card and repay it over a few months. Whether it is a good idea depends on the Q1 2017 releases, I guess, but I am not really fond of waiting. I got a 3440x1440 monitor in February and have basically been waiting since then, playing less demanding games in the meantime.

I am thinking about 1080 SLI because after seeing benchmarks for Titan XP, I don't feel that single 1080 Ti will be enough for me, and even if it will for now, it won't last a year just like 980 Ti didn't. 1080 SLI with reasonable scaling would be perfect, with good headroom for future, which I cannot say about 1080 Ti, I mean it will be able to just cut it for now, but it will fall short after a year again.

8 GB of memory should be enough for those 2 years I plan to keep this 1080 SLI for, 3440x1440 requires significantly less than 4K.

DX12/Vulkan are both going to be adopted more and more in games, and devs will use more and more of their features, including multi-GPU support, so I cannot really imagine too many AAA games not supporting SLI in the future; and for less demanding games, SLI won't be needed. There may not be SLI support at release, but imagine paying 150-200 $/€ for a game and you will have an idea of what it is like to buy a game on release in Poland, and DRM is pretty intensive at the moment, so there is not much hope of playing new games at release anyway.

Like I said, I have waited too long and I am just tired of this already, but I don't want to be impatient and make a hasty decision. Realistically though, for full availability and sensible prices on the 1080 Ti/Vega I will have to wait another 4-6 months, and I just don't want to do that. Not much hope of AMD bringing anything revolutionary with Vega either, so why wait.

I have moved my PC to the attic above me and I am connected with 3m USB/DP cables, so I don't hear my PC at all even at crazy fan speeds, so getting a bit lower grade model like MSI armor won't be a problem. Also MSI is going wider instead of thicker with cooler so those Armors will do well for SLI, especially in my case with side and bottom intake (Define R5).

I won't be able to afford 1080 Ti SLI, so it is either 1080 SLI or single 1080 Ti.

So to sum things up, price is quite good, SLI support in future should be good, whatever releases in Q1-Q2 2017 it won't be powerful enough for me if we talk about single card, 8 GB of memory will be enough, after using both AMD and Nvidia cards I like Nvidia more, I won't be able to afford 1080 Ti SLI, games and series I like are supporting SLI and vast majority of them are supported by Nvidia. Everything sounds sensible.

For those who somehow got through this wall of text, what do you think?


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> How the hell are all these basic Founder's Edition (with 5 phase VRM) beating all other manufacturers consistently. I keep seeing FE owners mentioning their clocks and most are over 2200. While the rest of us with custom cards are lucky to get to 2150.?
> 
> Doesn't make sense.
> 
> I have the Classified ($850 here) with 14 phase VRM -- and from the posts I've read across different forums other Classified owners seem to be "one of the lucky ones" if they can merely get to 2150. Doesn't make sense....
> 
> I wonder how my "chip" would perform if I put it on a FE circuit board. hmm....
> 
> *wonders if nvidia keep the best clockers for FE's.


Well yeah, that is exactly my point too... The Classified has a 14+3 phase design, dual 8-pin, 245W TDP, a giant custom cooler... and an FE has a 5+1 phase design, single 8-pin, 180W TDP and a simple blower-style cooler, yet it beats a lot of high-end (overkill) cards like the Classified in clocks, and sometimes even in temps too...

I actually wouldn't be surprised if NVIDIA indeed keeps the best chips for the FEs... I mean, it would make sense, since almost all custom editions are better in some way: better cooling, better VRMs, better components...


----------



## chiknnwatrmln

Quote:


> Originally Posted by *nrpeyton*
> 
> How the hell are all these basic Founder's Edition (with 5 phase VRM) beating all other manufacturers consistently. I keep seeing FE owners mentioning their clocks and most are over 2200. While the rest of us with custom cards are lucky to get to 2150.?
> 
> Doesn't make sense.
> 
> I have the Classified ($850 here) with 14 phase VRM -- and from the posts I've read across different forums other Classified owners seem to be "one of the lucky ones" if they can merely get to 2150. Doesn't make sense....
> 
> I wonder how my "chip" would perform if I put it on a FE circuit board. hmm....
> 
> *wonders if nvidia keep the best clockers for FE's.


It seems Pascal overclocking is limited more by the chip than by the actual power delivery system or PCB.

My FE can clock up to 2200MHz for benches on air and be stable. Under water I get no improvement in clock speed, but temperatures are much lower.

On the other hand, real-world performance is lower on the FE due to hitting the power limit at those clocks. So while my chip can do 2200MHz, the card can't sustain it.


----------



## Tdbeisn554

Quote:


> Originally Posted by *Krzych04650*
> 
> Thinking about picking two 1080s, but I am not sure about that. Prices are pretty nice for some models already, even in Poland where prices are usually pathetic and 1080 released at $930. But now I can get something like MSI Armor for 2799 PLN ($700 including VAT, so it basically matches US price, well except we are earning 4 times less in Poland than people in the US). To give you some comparison, GTX 980 stabilized price was around 2400 PLN. For 980 Ti it was around 3200. So 2799 seems sensible, I cannot really see it going significantly lower even after 1080Ti/Vega release. May reach this 2400 point somewhere in future, after 6 months or so, but this is far too much waiting just to save 400 PLN, equivalent to $100.
> 
> I am just not sure if investing in 1080 and later SLI is a good way to go, especially considering the costs and a fact that I don't exactly have all of this money right now, I would have to borrow for second card and repay it in few months. If it is good idea depends on Q1 2017 releases I guess, but I am not really fancy of waiting, I got 3440x1440 monitor in February and I am basically waiting since then, playing less demanding games in meantime.
> 
> I am thinking about 1080 SLI because after seeing benchmarks for Titan XP, I don't feel that single 1080 Ti will be enough for me, and even if it will for now, it won't last a year just like 980 Ti didn't. 1080 SLI with reasonable scaling would be perfect, with good headroom for future, which I cannot say about 1080 Ti, I mean it will be able to just cut it for now, but it will fall short after a year again.
> 
> 8 GB of memory should be enough for those 2 years I plan to keep this 1080 SLI for, 3440x1440 requires significantly less than 4K.
> 
> DX12/Vulkan both are going to be adapted more and more in games, and devs will use more and more features from them, including multi-GPU support, so I cannot really imagine too many AAA games not supporting SLI in future, and for less demanding games SLI won't be needed. There may not be SLI support at release, but imagine paying 150-200 $/€ for a game and you will have an idea how it is like to buy game on release in Poland, and DRM is pretty intensive at the moment, so not much hope of playing new games at release anyway.
> 
> Like I said I have waited for too long, I am just tired of this already, but I don't want to be inpatient and make hasty decision. But realistically looking, for full availability and sensible prices for 1080 Ti/Vega I will have to wait another 4-6 months, I just don't want to do that. Not much hope in AMD bringing anything revolutionary with Vega either, so why wait.
> 
> I have moved my PC to the attic above me and I am connected with 3m USB/DP cables, so I don't hear my PC at all even at crazy fan speeds, so getting a bit lower grade model like MSI armor won't be a problem. Also MSI is going wider instead of thicker with cooler so those Armors will do well for SLI, especially in my case with side and bottom intake (Define R5).
> 
> I won't be able to afford 1080 Ti SLI, so it is either 1080 SLI or single 1080 Ti.
> 
> So to sum things up, price is quite good, SLI support in future should be good, whatever releases in Q1-Q2 2017 it won't be powerful enough for me if we talk about single card, 8 GB of memory will be enough, after using both AMD and Nvidia cards I like Nvidia more, I won't be able to afford 1080 Ti SLI, games and series I like are supporting SLI and vast majority of them are supported by Nvidia. Everything sounds sensible.
> 
> For those who somehow got through this wall of text, what do you think?


Well, a single card is almost always better in my opinion: lower system temps, lower power draw, and a lot of games don't even have (good) SLI support (so you pay like $800 for a card that does nothing...), plus SLI problems overall.
I would go for the 1080 Ti if I were you, unless you really need the extra performance.
1080 SLI has great performance (when it works...), but a 1080 Ti will probably have Titan X performance, which is really powerful too, and you are guaranteed to get all that power in all games. With an SLI setup, you are sure about the power of one 1080; the other one depends on whether the game has SLI support, and even then you are still not sure about the scaling you will get.
But this is just my opinion. One card is, most of the time, more interesting than an SLI setup (unless you really need the extra performance).


----------



## Krzych04650

Quote:


> Originally Posted by *Archang3l*
> 
> Well a single card is almost always better in my opinion. Lower system temps, lower power draw, a lot of games don't even have (good) SLI support (so then you pay like 800$ for a card that does nothing...), SLI problems overall.
> I would go for the 1080Ti if I were you, unless you really need the extra performance .
> 1080 SLI has great performance (if it works..) but a 1080Ti will probably have Titan X performance which is really powerful too and you will have a guarantee you will get all that power in all games. With an SLI setup you are sure about the power of 1 1080, the other one depends if the game has SLI support and then you are still not sure about the scaling you will have.
> But this is just my opinion. 1 card is most of the times more interesting than an SLI setup (unless you really need the extra performance)


Thats the whole point of this, that I need extra performance, if I didn't then I wouldn't even think about SLI.

Everybody knows that buying a second 1080 is just unjustified enthusiast purchase, but what if you are one?

Also what you say about SLI is mostly true for pairing two mid-range cards and comparing against high-end card that is two times more powerful, so you get the same performance in theory, but in reality high-end card wins because two mid-range cards can match it only with perfect SLI support. Here we are talking about pairing two almost high-end cards against one true high-end, so you are getting way bigger theoretical performance from two cards than one. If you get SLI support then you fly away, and if you don't its not the end of the world, like it would be if I for example considered two 1070s against 1080 Ti. I am still left with powerful card at the end, even if it is only one. 1070 SLI vs 1080 Ti is rather idiotic, but 1080 SLI vs 1080 Ti makes potential gains in performance worth the issues I think.

There is one main question to answer for yourself: how many games will be demanding enough to need SLI but not support it, given that you won't play on day one? Very few, I think, especially if you avoid broken ports and have some quality standards for a game like I do. And at the same time there are many great games that benefit greatly from SLI.

That is the whole point for me: you don't buy SLI for all games, just for the few that make you want it.
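The trade-off being weighed here boils down to a simple scaling calculation. A quick sketch (all figures are illustrative assumptions, not benchmarks -- one 1080 is taken as 1.0x, a single 1080 Ti guessed at ~1.35x, and SLI scaling varies per game):

```python
# Rough SLI-vs-single-card estimate. All numbers are assumptions for
# illustration: relative throughput of one GTX 1080 = 1.0, a hypothetical
# 1080 Ti ~ 1.35x, and per-game SLI scaling anywhere from 0% to 90%.

def effective_perf(single: float = 1.0, scaling: float = 0.0) -> float:
    """Relative performance of two cards at a given SLI scaling factor."""
    return single * (1.0 + scaling)

TI = 1.35  # assumed single 1080 Ti relative to a single 1080

for scaling in (0.0, 0.5, 0.7, 0.9):
    sli = effective_perf(scaling=scaling)
    verdict = "SLI ahead" if sli > TI else "Ti ahead"
    print(f"scaling {scaling:.0%}: 1080 SLI = {sli:.2f}x vs Ti {TI:.2f}x -> {verdict}")
```

Under those assumed numbers the break-even is only ~35% scaling: anything better and the pair wins, anything worse (or no support at all) and you fall back to single-1080 performance.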

PS

*Does anyone have a PNY GeForce GTX 1080 XLR8?*


----------



## nrpeyton

Quote:


> Originally Posted by *Krzych04650*
> 
> Thinking about picking two 1080s, but I am not sure about that. Prices are pretty nice for some models already, even in Poland where prices are usually pathetic and 1080 released at $930. But now I can get something like MSI Armor for 2799 PLN ($700 including VAT, so it basically matches US price, well except we are earning 4 times less in Poland than people in the US). To give you some comparison, GTX 980 stabilized price was around 2400 PLN. For 980 Ti it was around 3200. So 2799 seems sensible, I cannot really see it going significantly lower even after 1080Ti/Vega release. May reach this 2400 point somewhere in future, after 6 months or so, but this is far too much waiting just to save 400 PLN, equivalent to $100.
> 
> I am just not sure if investing in 1080 and later SLI is a good way to go, especially considering the costs and a fact that I don't exactly have all of this money right now, I would have to borrow for second card and repay it in few months. If it is good idea depends on Q1 2017 releases I guess, but I am not really fancy of waiting, I got 3440x1440 monitor in February and I am basically waiting since then, playing less demanding games in meantime.
> 
> I am thinking about 1080 SLI because after seeing benchmarks for Titan XP, I don't feel that single 1080 Ti will be enough for me, and even if it will for now, it won't last a year just like 980 Ti didn't. 1080 SLI with reasonable scaling would be perfect, with good headroom for future, which I cannot say about 1080 Ti, I mean it will be able to just cut it for now, but it will fall short after a year again.
> 
> 8 GB of memory should be enough for those 2 years I plan to keep this 1080 SLI for, 3440x1440 requires significantly less than 4K.
> 
> DX12/Vulkan both are going to be adapted more and more in games, and devs will use more and more features from them, including multi-GPU support, so I cannot really imagine too many AAA games not supporting SLI in future, and for less demanding games SLI won't be needed. There may not be SLI support at release, but imagine paying 150-200 $/€ for a game and you will have an idea how it is like to buy game on release in Poland, and DRM is pretty intensive at the moment, so not much hope of playing new games at release anyway.
> 
> /\ typed that fast and mixed it up a bit and can't be bothered editing to fix it but you got the picture lol..
> 
> Like I said I have waited for too long, I am just tired of this already, but I don't want to be inpatient and make hasty decision. But realistically looking, for full availability and sensible prices for 1080 Ti/Vega I will have to wait another 4-6 months, I just don't want to do that. Not much hope in AMD bringing anything revolutionary with Vega either, so why wait.
> 
> I have moved my PC to the attic above me and I am connected with 3m USB/DP cables, so I don't hear my PC at all even at crazy fan speeds, so getting a bit lower grade model like MSI armor won't be a problem. Also MSI is going wider instead of thicker with cooler so those Armors will do well for SLI, especially in my case with side and bottom intake (Define R5).
> 
> I won't be able to afford 1080 Ti SLI, so it is either 1080 SLI or single 1080 Ti.
> 
> So to sum things up, price is quite good, SLI support in future should be good, whatever releases in Q1-Q2 2017 it won't be powerful enough for me if we talk about single card, 8 GB of memory will be enough, after using both AMD and Nvidia cards I like Nvidia more, I won't be able to afford 1080 Ti SLI, games and series I like are supporting SLI and vast majority of them are supported by Nvidia. Everything sounds sensible.
> 
> For those who somehow got through this wall of text, what do you think?


When you say you're an "enthusiast", what does that mean? I've had my card for over two weeks and I've spent the entire time "tinkering" with settings: trying new BIOSes, overclocking, editing my voltage curve, thinking of new ways to push it even harder, stress testing, and benching. And almost zero time actually playing any games, apart from a five-minute session to compare frame rates to my old setup.

When you say a 1080 Ti won't be enough for you -- why is that exactly? A 1080 will play most games at 4K at slightly more than acceptable frame rates, and a 1080 Ti will most definitely do it.
VR is also new, and people are saying the "hype" around it doesn't even do it justice. When you actually get VR, you will be far more blown away than you ever thought you would be. If you are truly a gaming enthusiast then you should be looking at VR, and a 1080 is more than capable of it; a 1080 Ti definitely will be. VR is even coming to consoles, and a single 1080 is about six times the power of the latest-generation console!

/\ /\ that was all my afterthoughts.

My *INITIAL* thoughts when I read your post were that I feel very lucky to live where I do, with the opportunities I have. And reading your posts, I am extremely angry at companies like Nvidia. If they can charge £200 for a graphics card, how can they charge £700 for a card that in "real terms" is just a "slightly faster" version of the exact same technology? (Consider the bigger picture -- think of a car, which lasts 10 years -- the technology is SO similar.)

It probably doesn't actually cost much more to produce than the cheaper version.

They are too greedy.
I'm all for rewarding success and creating incentives for companies and individuals to do well -- I'm rather conservative that way -- but these price differences for technology that is much the same are far, far too greedy, and there should be stronger anti-monopoly laws protecting the consumer. AMD really needs to catch up, because if it can (and hopefully it will) we might see things become a bit fairer for the budget buyer.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> It seems Pascal overclocking is limited more so by the chip than the actual power delivery system or PCB.
> 
> My FE can clock up to 2200MHz for benches on air and be stable. Under water I get no improvement in clock speed, but temperatures are much lower.
> 
> On the other hand, real-world performance is lower on the FE due to hitting the power limit at those clocks. So while my chip can do 2200MHz, the card can't sustain it.


Before I picked my 1080 I spent ages reading reviews and comparisons -- and even though I was warned that a Classified was a waste of time this round, I still couldn't resist. I thought "why not just pay an extra £100 or so and find out for myself?" I wanted *the best of the best*, because if I was spending *that much money* on a card anyway, I might as well go the full mile. I just didn't listen, and now I wish I had.

But even if I had, I'd still always have wondered what it would be like to own a "Classified" -- was there something I was missing out on? I've never had a card with a 14-phase VRM before, etc.

I'm still not sure if I've learnt my lesson... It's my first "proper" informed enthusiast purchase, and I've ended up seriously disappointed. Maybe if I'd owned a 980 Ti I'd not have made the same mistake... or maybe I would have made an even bigger one and got an even more expensive card, lol... time will tell...

I could RMA it and grab an FTW, but what if I get one that doesn't clock as high as my Classified? How will I feel then? lol. And I've never RMA'd anything before, so I don't know if I'd be making a mistake. Something could go wrong and I could lose the entire card!


----------



## Derek1

deleted


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Well yeah, that is exactly my point too... The Classified has a 14+3 phase design, dual 8-pin, 245W TDP, giant custom cooler, ... and an FE has a 5+1 phase design, single 8-pin, 180W TDP and a simple blower-style cooler, yet it beats a lot of high-end (overkill) cards like the Classified in clocks and sometimes even in temps too...
> 
> I actually wouldn't be surprised if nvidia indeed keeps the best chips for the FE's.... I mean it would make sense since almost all custom editions are better in some ways: better cooling, better VRM's, better components, ...


How did you get on with the max clock on your second new card then, after you got all the other issues out of the way?

Quote:


> Originally Posted by *Derek1*
> 
> deleted


Too late I already read it lol -- and it was actually a good post -- interesting.. you shouldn't have deleted it.

I think the reason some of us (like me) are annoyed is that we've spent extra money on extra features that haven't actually helped. Yet if we hadn't, and we'd had a really hard time with the silicon lottery on an FE (for example), then we'd be wishing we had got the better card -- so either way we lose... if there really is no gain in a Classified this round, then they never should have produced it.

I have *the one card* that should let me play about with voltage etc. to find out for myself (bear in mind that voltage has got SOME people an extra 100MHz -- I read a post only a few pages back with an example of this), yet my card seems to be the *ONLY* one on these forums that is unable to use the T4. So now I'm wishing I'd got an FTW. And seriously -- I've been paid again since buying my Classified, so I'm not even skint anymore -- so I should *not* be wishing I owned a cheaper card.

To be honest it's not even cheaper anymore -- people who've bought these types of cards in the past, who have a lot more experience than me, and who had the sense to actually listen, lol, have decided not to bother -- and prices have now come down. You can now get a Classified in many online shops at the same price as an FTW.....

Once you pair the FTW with the T4 it beats a Classified -- even if the Classified was under water.

If we had BIOS tweaks on PASCAL things would maybe be different.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> How did you get on with max clock on your 2nd new card then after you got all the other issues you had out the way?
> Too late I already read it lol -- and it was actually a good post -- interesting.. you shouldn't have deleted it.
> 
> I think the reason some of us (like me) why we are annoyed is that we've spent extra money on extra features on our cards when they haven't actually really helped. Yet if we hadn't and we'd still had a really hard time with silicon lottery on a FE (for example) then we'd be wishing we had got the better card -- so either way we lose... if there is really no gain in a Classified this round then they never should have produced it.
> 
> I have *the one card* that I should be able to play about with voltage etc to find out for myself (now bear in mind that voltage has gotten SOME people an extra 100mhz (I read a post only a few pages back with an example of this) yet my card seems the *ONLY* one on these forums that is unable to use the T4. So now I'm wishing I'd got an FTW.


LOL The formatting went to hell! It looked so much better in the editor.

I didn't save the screen shot as I was making multiple runs all afternoon and just writing down notes.

The bottom line is that after finding the sweet spot for my card at default voltage (1.062 V), I just used the slider to go to 100%, or 1.093 V, with the slave BIOS at 130%. My clocks went up, sure, but performance in Valley stayed about the same, while my Fire Strike overall score dropped from 17566 to 15206 and the graphics score from 24501 to 24125; FPS was not really affected.

I figure it is about average for the FTW's. Clock starts out the run at 2139-2150 and settles nicely into 2114 by the end.

The FEs on average are probably lower; we tend to notice the outliers, good or bad, without accounting for the variance. On the whole I am happy with my card, especially considering mine shipped before Aug 31, and the fact that I am coming from an HD 7950 that clocked at 1025, lol.


----------



## Agoniizing

The GTX 1080 FTW I just got is a lemon, just like my SC 1080.

Out of the box it boosts to 1974MHz, and I can't even OC it past 2GHz without crashing. I don't know why I keep getting bad 1080s.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *nrpeyton*
> 
> Before I picked my 1080 I spent ages reading reviews and comparisons -- and even though I was warned that a Classified was a waste of time this round -- I still couldn't resist -- I couldn't help myself.. I thought "why not just pay an extra £100 or so and "find out for myself" -- I wanted *the best of the best* because if I was spending *that much money* on a card anyway I thought I may as well go the full mile.. I just didn't listen.. and now I wish I had listened.
> 
> But even if I had -- then I'd still always have wondered what it would be like to own a "Classified" -- was there something I was missing out on?? "I've never had a card with a 14-phase VRM before etc etc...
> 
> I'm still not sure if I've learnt my lesson... Its my first "proper" informed enthusiast purchase... and I've ended up seriously disappointed. Maybe if I'd owned a 980ti I'd not of made the same mistake.. or maybe I would of made an even bigger one and got an even more expensive card.. lol... time will tell...
> 
> I could RMA it and grab an FTW but what if I get one that doesn't clock as high as my Classified. How will I feel then? lol And I've never ever RMA'd anything before so I don't know if I'd be making a mistake. Something could go wrong and I could lose the entire card!


In my opinion, high-end cards make the most sense if you plan on staying with air cooling or going cold, like LN2. For standard water cooling, reference is my first choice: block availability is the best and the power delivery system is (usually) good enough to push the chip as far as it'll go.

However, if and when I end up going back to air cooling, I'll get one of those fancy Classified Gaming FTW Strixes that everybody keeps talking about.


----------



## Vellinious

Quote:


> Originally Posted by *Agoniizing*
> 
> The GTX 1080 FTW I just got is a lemon, just like my SC 1080. Out of the box it boosts to 1974MHz, and I can't even OC it past 2GHz without crashing. I don't know why I keep getting bad 1080s.


That's some terrible luck. All five 1080s I've tested (3 FTWs and 2 ACX 3.0s) boost to at least 2053 out of the box, and two of them would go well above 2160 with some minor tweaking.


----------



## Agoniizing

Quote:


> Originally Posted by *Vellinious*
> 
> That's some terrible luck. All five 1080s I've tested (3 FTWs and 2 ACX 3.0s) boost to at least 2053 out of the box, and two of them would go well above 2160 with some minor tweaking.


It's just bad luck, right? At first I thought maybe it was my PSU, but I have an EVGA 650 G2. I would have been happy with at least a constant 2000MHz clock. Are there any custom BIOSes out there that I can try?


----------



## Nelly.

SERIOUS ISSUE with EVGA cards -- have any of you EVGA owners seen this? >>

https://www.reddit.com/r/580b8w/welp_my_evga_1070_ftw_just_killed_itself/


----------



## TWiST2k

Quote:


> Originally Posted by *nrpeyton*
> 
> How did you get on with max clock on your 2nd new card then after you got all the other issues you had out the way?
> Too late I already read it lol -- and it was actually a good post -- interesting.. you shouldn't have deleted it.
> 
> I think the reason some of us (like me) why we are annoyed is that we've spent extra money on extra features on our cards when they haven't actually really helped. Yet if we hadn't and we'd still had a really hard time with silicon lottery on a FE (for example) then we'd be wishing we had got the better card -- so either way we lose... if there is really no gain in a Classified this round then they never should have produced it.
> 
> I have *the one card* that I should be able to play about with voltage etc to find out for myself (now bear in mind that voltage has gotten SOME people an extra 100mhz (I read a post only a few pages back with an example of this) yet my card seems the *ONLY* one on these forums that is unable to use the T4. So now I'm wishing I'd got an FTW. Now seriously -- I've been paid again since buying my Classified so I'm not even skint anymore -- so I should *not* be wishing I actually owned a cheaper card.
> 
> To be honest its not even cheaper anymore -- people who've bought these types of cards in the past -- who have a lot more experience than me -- and who had the sense to actually listen, lol -- have decided not to bother -- and prices have now came down -- you can now get a classified in many online shops at the same price as an FTW.....
> 
> Once you pair the FTW with the T4 it beats a Classified -- even if the Classified was under water.
> 
> If we had BIOS tweaks on PASCAL things would maybe be different.


This guy gets it!!!!


----------



## Koniakki

Okay, I believe I have spent more than enough time troubleshooting this (almost a week)...

Can someone explain, or does anyone know, why out of the blue after restarting my PC, my iGPU, which was disabled in the BIOS, got enabled, even though the BIOS still shows it as disabled (although a couple of iGPU options below it, which disappear when it is disabled, are now shown)?

And while the HDMI is still connected to my GTX 1080 Gamerock, I get NO SIGNAL until the Windows logon screen, then everything is fine. The iGPU still shows as enabled in Device Manager and I can disable it without issues. But if the HDMI is connected to my GTX 1080 I cannot access the mobo BIOS at boot.

I plugged in my Zotac FE just to make sure and voila, the iGPU stays disabled. Plug in the Palit and the iGPU gets enabled on its own, and I get no BIOS screen until the Windows logon appears.

Btw, if I plug the HDMI into the mobo HDMI port I see the BIOS screen while booting just fine.

Is this the infamous black screen issue, guys?

Also, after logon everything works fine, e.g. games, benchmarks, settings etc. Weird.


----------



## wardo3640

I just got my system up and running this week and was tweaking the cards to bump them up a bit, and it looks like I went too far. I can't get into Windows for more than a few seconds now. How do I turn these things back down?


----------



## Koniakki

Quote:


> Originally Posted by *wardo3640*
> 
> I just got my system up and running this week and was tweaking the cards to bump them up a bit, and it looks like I went too far. I can't get into Windows for more than a few seconds now. How do I turn these things back down?


If I understand this correctly, you mean you selected "apply OC at boot/startup"?

If so, start in SAFE MODE and either delete the profiles in MSI AB or PX, or uninstall the overclocking application completely.


----------



## wardo3640

Yes, you are correct, that button is checked. It already had a mild overclock on it, and when I started tweaking it tonight I forgot to uncheck it.


----------



## Koniakki

Quote:


> Originally Posted by *wardo3640*
> 
> Yes, you are correct, that button is checked. It already had a mild overclock on it, and when I started tweaking it tonight I forgot to uncheck it.


A simple problem, don't worry. Just press F8 while your computer is booting to access the SAFE MODE option, and either disable the startup of the overclocking application, uninstall it, or just delete its install folder in Program Files (x86).

Let us know how it works out.
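For anyone who lands here with the same boot loop, the "delete the profiles from Safe Mode" step can be sketched like this. The Afterburner profile path below is just the usual default install location -- an assumption, adjust it to your setup (PrecisionX keeps its profiles in its own folder):

```python
# Minimal sketch of resetting a saved overclock from Safe Mode: move every
# profile .cfg aside (instead of deleting outright) so no OC is reapplied
# at the next boot. The default path is an assumption -- adjust as needed.
import shutil
from pathlib import Path

AB_PROFILES = Path(r"C:\Program Files (x86)\MSI Afterburner\Profiles")

def reset_oc_profiles(profile_dir: Path) -> list:
    """Move every profile .cfg into a backup subfolder; return the names moved."""
    backup = profile_dir / "backup"
    backup.mkdir(exist_ok=True)
    moved = []
    for cfg in profile_dir.glob("*.cfg"):
        shutil.move(str(cfg), str(backup / cfg.name))
        moved.append(cfg.name)
    return moved

if __name__ == "__main__":
    print(reset_oc_profiles(AB_PROFILES))
```

Moving rather than deleting means you can restore the profiles once you've dialed the clocks back down.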


----------



## Krzych04650

Ok, I decided to get one card for now. I have enough games to play through until I get money for the second card, games that I will be able to play on a single 1080 (basically 99.5% of games, let's be honest) and that I cannot play right now on the Fury. At 3440x1440 the Fury basically maxes out on games from around 2011, apart from some beautifully running titles like Mad Max, where I got around 64 FPS on average.

Anyway, here is a list of 1080s inside reasonable price point, what would you recommend? https://www.morele.net/komputery/podzespoly-komputerowe/karty-graficzne-12/,3100.00,,,,,,p,0,,8143O1100886/1/#product_list

Asus is not an option. Thick cards are also not an option, because I need the space if I want to SLI later, so Palit/Gainward is out.

Basically I can get the MSI Armor or the dual-fan Zotac. The non-AMP Zotac looks interesting: it is well priced at 2946 PLN and has far better build quality than the MSI Armor, and most likely better cooling too, because the heatsink on the Armor looks really tiny. Plus there is no backplate on the Armor, and with my luck the card would brick on first touch.

Does anyone have this Zotac? Or the MSI Armor? I know they are reviewed at TechPowerUp, but some real-world info would be nice.


----------



## SweWiking

Krzych04650: I wouldn't bother getting a second card, tbh. I had two 1080s, but a lot of games don't even support SLI (Mafia 3 etc.), so I sold one of the cards, and I aim to get the 1080 Ti and sell my other 1080 when the Ti launches.


----------



## Derek1

Quote:


> Originally Posted by *SweWiking*
> 
> Krzych04650: I wouldn't bother getting a second card, tbh. I had two 1080s, but a lot of games don't even support SLI (Mafia 3 etc.), so I sold one of the cards, and I aim to get the 1080 Ti and sell my other 1080 when the Ti launches.


That is another option. Wait until the Ti launches and you will be able to get two 1080s for the price of one Ti, or less. As the marketplace becomes flooded with people wanting to sell and upgrade, the prices are gonna be dirt cheap.


----------



## Krzych04650

Quote:


> Originally Posted by *SweWiking*
> 
> Krzych04650: I wouldn't bother getting a second card, tbh. I had two 1080s, but a lot of games don't even support SLI (Mafia 3 etc.), so I sold one of the cards, and I aim to get the 1080 Ti and sell my other 1080 when the Ti launches.


I am definitely not going to play crap ports that cannot hold 60 FPS at 1080p on a GTX 1070/980 Ti on LOW SETTINGS, like Mafia III. I won't pay a broken penny to a developer who doesn't respect me as a customer and tries to sell broken crap to me, so don't worry about that. The already-released games I intend to use SLI for support it, and future releases will support SLI sooner or later; I don't plan to play day one anyway, I never do.

I am aware that a second 1080 is not a reasonable purchase by any means, but I know my needs, and I am also aware of the risk; if it turns out to be a mistake, I will pay for it. The most powerful single GPU wasn't enough for me last generation and, looking at Titan XP benchmarks, it won't be now either. Since I can't afford two 1080 Tis, 1080 SLI is what is optimal, at least in theory. Choosing 1080 SLI carries a much bigger risk, but the potential gains are worth it. Time will tell.


----------



## Krzych04650

Quote:


> Originally Posted by *Derek1*
> 
> That is another option. Wait until the Ti launches and you will be able to get two 1080s for the price of one Ti, or less. As the marketplace becomes flooded with people wanting to sell and upgrade, the prices are gonna be dirt cheap.


I am getting one for now and a second one in early 2017, so the second 1080 should be cheaper. Or maybe I will change my mind and won't go SLI. I don't know; for now I am getting one 1080 because I am not going to wait another six months on a card that won't even let me play four-year-old games.

Sorry for double post.


----------



## Derek1

Quote:


> Originally Posted by *Krzych04650*
> 
> I am getting one for now, and second one in early 2017, so second 1080 should be cheaper. Or maybe I will change my mind and won't get SLI. I don't know, for now I am getting one 1080 because I am not going to wait another 6 months on a card that won't even let me to play 4 year old games.
> 
> Sorry for double post.


My remark was meant as more of a joke, but is probably close to the truth.

My concern is that the famous announcement is only speculation at this point. People are assuming the announcement is going to be about a 1080 Ti; I am not so sure that is what it will be about. The 1050 Ti is out, or soon to be. I think we may see the 1080 Ti released/announced *before* January.


----------



## Krzych04650

Quote:


> Originally Posted by *Derek1*
> 
> My remark was meant as more of a joke, but is probably close to the truth.
> 
> My concern is that the famous announcement is only speculation at this point. People are assuming the announcement is going to be about a 1080 Ti; I am not so sure that is what it will be about. The 1050 Ti is out, or soon to be. I think we may see the 1080 Ti released/announced *before* January.


I don't think so. There is not much reason to release a 1080 Ti when you can milk people with $700+ for the 1080 (well, there are finally some reasonably priced models available) and $1200 for a melting Titan, because there is absolutely no competition. I think we can quite safely predict that AMD won't be early with Vega. I don't believe in 1080 Ti/Vega releases in late Q4/early Q1; Nvidia will be first, but only when Vega is close. Add availability issues and horrific prices for about two months, and you won't be able to "effectively" get a 1080 Ti/Vega until late Q1, and that is the optimistic scenario.


----------



## ucode

Gotta build the hype up first to get people to drop their old cards: "the next best thing since sliced bread, coming to yah soon".

The old trick seems to be to give enough incentive to make you want one, but not so much that you stop wanting the next one after that too. And so on...


----------



## Groo21

I agree with @Krzych04650.

I don't see any incentive for nVidia to release an upgraded SKU until AMD has something that can compete with the 1080, or is at least about to launch something that could.

In the meantime, they can improve Pascal yield and keep developing whatever pascalNext is.


----------



## firegrass

Hi all.
This is a great thread for information, so I thought I would contribute.
The MSI ARMOR 8GB OC was the only 1080 available to me at the time; it was not really what I wanted.
Straight away I was VERY unimpressed by the high temps and the merely standard boost.
After a bit of research on GPU Boost 3.0 I realised that I would need to get the temperatures down.

At the time there were not many cooling solutions available yet, but I read that the NZXT Kraken G10 for the 980 might fit.
It did!
I attached a Corsair H55 with push/pull fans, and with a lot of reading online and experimentation this is what I've achieved.

I also put some heatsinks on the memory and added a Corsair AF120 fan in a PCI fan mount, which blows straight onto the card.

It's extremely stable and I'm extremely happy.

EDIT: I was going to try the MSI GAMING Z BIOS, but I don't think there would be any benefit. If there is, please tell me.
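For anyone wondering why the temps matter so much: with GPU Boost 3.0, the boost clock steps down roughly one ~13 MHz bin each time the core crosses a temperature threshold. A toy model of that behaviour (the thresholds and bin size here are illustrative guesses, not Nvidia's actual tables):

```python
# Toy model of GPU Boost 3.0 temperature binning: the sustained boost
# clock drops one ~13 MHz bin as the core crosses successive temperature
# thresholds. Thresholds and bin size are illustrative assumptions only.

BIN_MHZ = 13
THRESHOLDS_C = [37, 46, 54, 63, 71, 80]  # assumed step-down points

def boost_clock(max_boost_mhz: int, temp_c: float) -> int:
    """Estimated sustained boost clock at a given core temperature."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_lost * BIN_MHZ

for temp in (35, 50, 65, 82):
    print(f"{temp} C -> ~{boost_clock(2088, temp)} MHz")
```

The point of the model: dropping from the low 80s to the mid 30s recovers every lost bin, which is why an AIO swap raises sustained clocks without touching any slider.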


----------



## Fediuld

The 1080 Armor OC is a very underrated card. Mine is running under a full waterblock at 2190.


----------



## Krzych04650

Quote:


> Originally Posted by *Fediuld*
> 
> The 1080 Armor OC is very underrated card. Mine is running under full waterblock at 2190.


Well, it's not an MSI Armor anymore if you changed the cooling. That is the whole point: this card has very pared-down cooling, no backplate, etc., which is why it is rated quite low. Past a certain price point such cost-cutting is not well received. I thought about getting this card, but after some reports I don't look at it anymore. What you got under watercooling you could get on any other card, depending only on the silicon lottery. What makes the card is cooler efficiency and build/PCB quality.

Although this is one of the cheapest 1080s with something more than the reference PCB and blower cooler, so I guess some compromises are justified. It depends on the market: in Poland, where the Armor is 2799 and the Gaming X is 3349, it makes sense, because the price difference is very big.


----------



## firegrass

With the Kraken G10 plus the H55 cooler it was still cheaper than the GAMING X here. I'll probably get another one, but then I'll also need another case to fit them in.


----------



## Krzych04650

I don't think I am going to be so frugal with the first one, especially since everything in the budget I set myself has limited cooling, which may be OK for a single card but will fall short with SLI. I have a bit more money than I thought, so at this price point I can get the Zotac AMP Extreme; it has by far the best cooling potential and should work as well as my Fury Nitro (that's one amazing cooler). But this Zotac is also bigger, or thicker I should say, and I plan to SLI later. They should fit, more or less.

But I think I will still be better off with something like that than with much smaller, inefficient coolers? I have a lot of space in my case (Define R5) and I can push a lot of air through it, including side and bottom intake, and since my PC is in a different room from the monitor I can ramp the fans up a lot, so it should be OK, I think? I just need to think ahead a bit here so I don't end up with a broken setup.

Or I will just buy the second card from a different brand with different, much thinner cooling, and put it on top with this Zotac behemoth at the bottom to make some space between them.


----------



## Whitechap3l

Hi guys long time not checking the forum.
I read some guys hit 2200+mhz on water. Which cards and bios did you use?


----------



## wardo3640

Ty @Koniakki. I got into safe mode, found the profile cfg files, and manually edited them. Next boot was A-OK!! Thanks.


----------



## NeoandGeo

The T4 BIOS helped me finally surpass the 21k barrier in 3DMark, but with the same wall of 2050MHz and +400 on the memory. Can't complain though; I haven't had any problems in games.


----------



## Spiriva

Quote:


> Originally Posted by *Whitechap3l*
> 
> Hi guys long time not checking the forum.
> I read some guys hit 2200+mhz on water. Which cards and bios did you use?


I use an EVGA FE, stable at 2210MHz using the EVGA SC BIOS, with an EK waterblock.
I tried the "T4" BIOS, and I could push the core some more, to around 2250MHz, but didn't see any gains at all.


----------



## aerial

What is the best BIOS for the 1080 FE if I want to be able to lower the fan RPM below the stock minimum of 1100?
I don't care about increased performance; I am just looking for a BIOS that offers similar clocks, is as compatible with the FE card as possible, and has a wide RPM range (below the 1100 minimum, and not too low a cap on max RPM). From what I see, some BIOSes cap the max RPM at 2.5k, which is too low for the FE.


----------



## Koniakki

Quote:


> Originally Posted by *wardo3640*
> 
> Ty @Koniakki I got into safe mode and found the profile cfg files and manually edited them. Next boot was AOK!! Thanks.


Glad to hear!


----------



## Dragonsyph

Sick of waiting for a 1080 Ti. Do you guys think a GTX 1080 FTW at, say, 2100MHz will handle a 3440x1440 monitor at 60-100 FPS? Or regular 1440p at 100+ FPS? Or am I going to need two GTX 1080s, and if so, do 1080s in SLI have any stuttering?
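A rough way to reason about this is plain pixel-count scaling from a resolution you already have numbers for. It ignores CPU limits and per-game variance, so treat it as a rule of thumb, not a benchmark (the 100 FPS starting point below is an assumed example, not a measurement):

```python
# Estimate FPS at other resolutions from a known 2560x1440 result,
# assuming performance scales inversely with pixels pushed. This ignores
# CPU limits and per-game variance -- a rule of thumb, not a benchmark.

RES = {"1440p": 2560 * 1440, "3440x1440": 3440 * 1440, "4K": 3840 * 2160}

def scaled_fps(known_fps: float, known_res: str, target_res: str) -> float:
    """Project FPS from one resolution to another by pixel-count ratio."""
    return known_fps * RES[known_res] / RES[target_res]

# e.g. a game an overclocked 1080 runs at an assumed 100 FPS at 1440p:
for target in ("3440x1440", "4K"):
    print(f"{target}: ~{scaled_fps(100, '1440p', target):.0f} FPS")
```

By that estimate, 100 FPS at regular 1440p maps to roughly 74 FPS at 3440x1440, i.e. inside the 60-100 FPS target for many (not all) titles on a single well-clocked 1080.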


----------



## Coopiklaani

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Damn son. I wonder how accurate that reading is at that point. Regardless I'm gonna try it, here's hoping the stock power delivery system can handle it!
> 
> What were your clocks with stock BIOS at 1.093v?


BTW, I'm using a EKFC waterblock, so VRMs are well taken care of. I could hit around 2150 @ 1.093v with the stock BIOS. Pascal doesn't scale well with voltage.


----------



## Coopiklaani

Quote:


> Originally Posted by *steeludder*
> 
> LOL epic.
> What temps & cooling?
> What brnad FE?


Custom loop, EKFC waterblock. It hits 55C max with 15min of furmark. EVGA FE. But I think all FE cards are equal.


----------



## Lays

Anyone tried "T4" bios on MSI Gaming X?


----------



## jleslie246

Quote:


> Originally Posted by *Dragonsyph*
> 
> Sick of waiting for a 1080 TI, do you guys think a gtx 1080 FTW, at say 2100mhz will handle a 1440x3440 monitor at 60-100 fps? Or a reg 1440p at 100fps +. Or am i gonna need two gtx 1080s, and if so do the 1080s in SLI have any stuttering?


I'm waiting also. From what I have found/seen, we should keep waiting. I thought about SLI 980ti's also. You can get used ones really cheap. But i am trying to just hold out.


----------



## Dragonsyph

Quote:


> Originally Posted by *jleslie246*
> 
> I'm waiting also. From what I have found/seen, we should keep waiting. I thought about SLI 980ti's also. You can get used ones really cheap. But i am trying to just hold out.


Ya, I was gonna wait it out, but one of my 290's took a ****; 2 GPUs this month have died, bad month for me I guess. Stuck with a single 290 and a screen that likes to flicker sometimes. All my stuff is so damn old it's all dying. I am a game addict and I need my high-res gameplay, and right now I'm not getting my fix, bro. The next two checks are all going to a new 1080 or two 1070s. I just don't wanna buy 1080 SLI for 1400 dollars when 1070 SLI for 800 will get me 100+ fps at 1440p. Guess 1080 SLI is more future-proof and would last longer.

Tell you one thing though, I'm never buying a damn AMD GPU again: 5970, 7850, 7870, 280X, 290, ALL DEAD from black screens and vertical lines on the ones where the VRAM failed. NEVER again.


----------



## jleslie246

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ya i was gonna wait it out but one of my 290's took a ****, 2 gpus this month have died, bad month for me i guess. Stuck with a single 290 and a screen that likes to flicker sometimes. All my stuff is so damn old its all dying. I am a game addict and i need my his res game play and right now im not getting my fix bro. Next two checks all going too a new 1080 or two 1070s. I just dont wanna buy 1080 SLI for 1400 dollars when 1070 SLI for 800 will get me 100+ fps at 1440p. Guess 1080 SLI is more future proof and would last longer.
> 
> Tell you one thing though im never buying a damn AMD gpu again, 5970, 7850, 7870, 280x, 290 ALL DEAD FROM black screens and vertical lines on ones where the vram failed. NEVER again.


Watch a few videos comparing SLI 980 Ti's with a 1080. It's pretty tempting. I can buy two 980 Ti's for $680.


----------



## Groo21

Quote:


> Originally Posted by *Coopiklaani*
> 
> Custom loop, EKFC waterblock. It hits 55C max with 15min of furmark. EVGA FE. But I think all FE cards are equal.


I don't think all FE cards come with the "Y" power adapter that the EVGA FE does.

Also, the VBIOS' of the other manufacturers appear to have different power limits.


----------



## Agoniizing

I flashed my lemon 1080 FTW with the T4 bios, and so far im stable at 2012mhz. I haven't tried pushing it further, but the extra voltage helped. This card cant even go past 1987mhz on stock volts.


----------



## caenlen

Quote:


> Originally Posted by *Agoniizing*
> 
> I flashed my lemon 1080 FTW with the T4 bios, and so far im stable at 2012mhz. I haven't tried pushing it further, but the extra voltage helped. This card cant even go past 1987mhz on stock volts.


I didn't know there were lemons... My 1080 has never dropped below 2.08ghz, I leave fan profile at 100% when gaming though, the extra sound is not that loud compared to older blower cards, its actually kind of quiet for a 100% blower speed... very impressed with Nvidia FE.


----------



## Spiriva

Quote:


> Originally Posted by *Spiriva*
> 
> I use Evga FE, stable at 2210mhz using the Evga SC bios, EK waterblock.
> I tried the "T4" bios, and i could push the core somemore, around 2250mhz but didnt see any gains at all.


I have to correct myself: it was the older Strix BIOS I had tried, which reached 2010 MHz. I just now flashed the "T4" BIOS and now the card runs at 2250 MHz at 1.181 V (with the original BIOS it runs at 1.031 V).


----------



## Agoniizing

Quote:


> Originally Posted by *caenlen*
> 
> I didn't know there were lemons... My 1080 has never dropped below 2.08ghz, I leave fan profile at 100% when gaming though, the extra sound is not that loud compared to older blower cards, its actually kind of quiet for a 100% blower speed... very impressed with Nvidia FE.


Mine is a lemon, with stock volts it can't even go past 1987mhz.


----------



## Tdbeisn554

Seems like I have a bad hardware year... My replacement GTX 1080 Classified has some LED bleed on the E of the EVGA logo, which kinda bothers me, and the coil whine is really annoying. I'm getting a shipping label from EVGA to return it... I really hope the third time is a charm... If I get a card with no (or minimal) coil whine and a clock of 2100 I would be super happy. Every time I play a game the card whines like it is dying, and it is kinda loud too. Do any of you guys have coil whine on your card??

And since I am still on a 1080p screen I really want to upgrade to a new monitor (a GTX 1080 is really overpowered for a 1080p screen). But I am not really sure if I should go 1440 or 4K... I mean 4K is kinda the future, and I will probably keep the monitor for 5-10 years. All media is going to 4K (movies, games, series, ...) and only games are 1440. But if I only get 30 fps in my games... then yeah, it's not really fun either.
I just don't really know what I should look at/buy. I do want at least an IPS panel with G-Sync and 1440p or 4K. I have my eye on the ASUS PG279Q and ASUS PG279AQ. Any tips or ideas would be welcome.


----------



## Krzych04650

Quote:


> Originally Posted by *Krzych04650*
> 
> I think I am not going to be so saving with getting first one, especially if everything in budget I set to myself has limited cooling, which may be ok for single card, but will fall short with SLI. I got a bit more money than I thought so I can get Zotac AMP Extreme in this price point, it has by far the best cooling potential, should work as amazing as my Fury Nitro (thats one amazing cooler). But this Zotac is also bigger, or thicker I should say, and I plan to SLI later. They will fit, more or less like that:
> 
> 
> 
> But I think I will still be better off with something like that than with much smaller and inefficient coolers? I have a lot of space in my care (Define R5) and I can push a lot of air through, including side and bottom intake, and also I have PC in different room than monitor, so I can ramp fans up a lot, so it should be okay, I think? I just need to think forward a bit here not to end with broken setup.
> 
> Or will just buy second card with different cooling/brand, much thinner than this one and put this on the top and this Zotac behemoth at the bottom to make some space between them.


Okay I found someone with two of them on youtube, he said that the cards are sandwiched but temps are okay, max 75 C on top card, but the difference between top and bottom one reaches 15-20 C. So I will get watercooled card on the top and Zotac on the bottom. I cannot get two watercooled because their price right now is horrendous.
Quote:


> Originally Posted by *Archang3l*
> 
> Seems like I have a bad hardware year... My replacement GTX 1080 Classified has some led bleed on the E from the EVGA logo, kinda bothers me and the coil whine is really annoying. I'm getting a shipping label from EVGA to retun it... I really hope third time is a charm... If I have a card with no (or minimal coil whine) and a clock of 2100 I would be super happy. Every time I play a game the card whines like it is dying and it is kinda loud too. Any of you guys have coil whine on their card??
> 
> And since I am still on a 1080p screen I really want to upgrade to a new monitor (a GTX 1080 is really overpowered for a 1080p screen). But I am not really sure if I should go 1440 or 4K... I mean 4K is kinda the future, and I will keep the monitor probably for 5-10 years. All media is going to 4K (movies, games, series, ...) and only games are 1440. But if I only get 30 fps in my games... then yeah, it's not really fun either.
> I just don't really know what I should look at/ buy. I do want at least an IPS panel with G-sync and 1440p or 4K. I have my eye on the ASUS PG279Q and ASUS PG279AQ. Any tips or ideas would be welcome


From what I can tell you, every card out of the 6 or 7 I had whined as fcuk (3 different power supplies, and it is a myth that the PSU helps). Even the one (MSI 980 Ti) that wasn't whining at the beginning, or only a little, started to whine like crazy after about 8 months. You cannot get away from coil whine. I'd suggest doing what I did: move your PC away from you, preferably to a completely different room, and pass display and USB cables through the wall/ceiling/floor. This way you are relieved from any kind of noise issue once and for all.

As for 1440p vs 4K, don't ruin your performance for no reason. As for which model/brand to get, I'd rather suggest looking at serious manufacturers, or your bad hardware year will continue into another year, if only one.


----------



## juniordnz

Guys, would you say that Arctic's thermal pads (6 W/mK) are better than the stock ones that come on the FTW? I'm pretty worried about those VRAM/VRM temps and the growing reports of broken cards due to overheating in those areas. Especially because I'm in a hot country with no AC in my room (last week we got 42°C here).

I believe they must use some cheap, low-quality pads like the ones EK ships with their waterblocks. Arctic's are the only ones available here in Brazil; otherwise I would have to import some 14 W/mK Fujipoly, and that would cost me a lot.

Thoughts?
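To put those W/mK numbers in perspective: steady-state conduction through a flat pad follows Fourier's law, Q = k·A·ΔT/t, so conductance scales linearly with the rated conductivity and inversely with thickness. A back-of-envelope sketch follows; the pad area and thickness are assumed illustrative values, not the FTW's actual pad dimensions.

```python
# Back-of-envelope comparison of thermal pad conductance: Q = k*A*dT/t.
# Pad area and thickness below are assumed illustrative values, not the
# FTW's actual pad dimensions.

def heat_flow_w(k_w_per_mk, area_m2, delta_t_k, thickness_m):
    """Steady-state conduction through a flat pad (Fourier's law)."""
    return k_w_per_mk * area_m2 * delta_t_k / thickness_m

area = 10e-3 * 10e-3       # 10 mm x 10 mm pad
thickness = 1.0e-3         # 1.0 mm
delta_t = 20.0             # 20 K across the pad

q_arctic = heat_flow_w(6.0, area, delta_t, thickness)
q_fujipoly = heat_flow_w(14.0, area, delta_t, thickness)

print(f"6 W/mK pad:  {q_arctic:.1f} W")
print(f"14 W/mK pad: {q_fujipoly:.1f} W")
print(f"ratio: {q_fujipoly / q_arctic:.2f}x")
```

By this estimate, the 14 W/mK pad moves about 2.3x the heat of the 6 W/mK pad at the same thickness, or equivalently runs a proportionally smaller temperature drop across the pad. In practice, contact quality and pad thickness matter just as much as the rating.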


----------



## Tdbeisn554

Quote:


> Originally Posted by *Krzych04650*
> 
> From what I can tell you, every card out of 6 or 7 I had whined as fcuk (3 different power supplies, and it is a myth that PSU helps). Even if one (MSI 980 TI) wasn't whining at the beginning or only a little, it started to whine as crazy after like 8 months. You cannot get away from coil whine. I'd suggest doing what I did, move you PC away from you, preferably to completely different room and pass some display and usb cable through the wall/ceiling/floor. This way you are relieved from any kind of noise issues once and for all.
> 
> As for 1440p vs 4K, don't ruin your performance for no reason. As for which model/brand to get, I'd rather suggest looking at serious manufacturers or your bad hardware year will continue to another year, if only one.


What do you mean by serious manufacturers then?

I have thought about placing my PC in another room and working with a USB-C dock or something, but I would need a cable that is at least 15 m long then...


----------



## SauronTheGreat

Guys, I need some help with these two 1080s. Which one is more powerful out of the box?

i) Gigabyte 1080 Xtreme gaming

http://www.gigabyte.com/products/product-page.aspx?pid=5920&dl=#sp

ii ) Zotac 1080 AMP! Extreme

https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-extreme#spec


----------



## Iceman2733

Quote:


> Originally Posted by *SauronTheGreat*
> 
> guys i need some help in these two 1080s which one is more powerful out of the box ?
> 
> i) Gigabyte 1080 Xtreme gaming
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5920&dl=#sp
> 
> ii ) Zotac 1080 AMP! Extreme
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-extreme#spec


I would go Zotac; it seems like the Gigabyte cards are prone to coil whine a lot. Even the 980 Ti versions were known to have a lot of coil whine. I have read lots of good stuff about the Zotac; the only complaint seems to be the color scheme.

Sent from my Pixel XL using Tapatalk


----------



## SauronTheGreat

Quote:


> Originally Posted by *Iceman2733*
> 
> I would go Zotac it seems like the gigabyte are prone to coil whine a lot. Even the 980ti were known to have a lot of coil whine, I have read lots of good stuff about the Zotac only complaint seems to be the color scheme
> 
> Sent from my Pixel XL using Tapatalk


Yes indeed, I own the AMP Extreme and there are very few colour options. It's just that I bought the Zotac 1080 AMP Extreme and now the Gigabyte 1080 Xtreme Gaming is in my local market... I guess I should keep the Zotac AMP Extreme; the Gigabyte supplier over here is also selling the 1080 Xtreme very expensive...


----------



## galeonki

Quote:


> Originally Posted by *Lays*
> 
> Anyone tried "T4" bios on MSI Gaming X?


Yep, best results and stable operation on the T4 BIOS. You should try it; I have the same card. Remember: the T4 BIOS gives you only 2 DisplayPort outputs and different fan speeds, so after flashing, change the fan curve in Afterburner a little.
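For reference, the fan-curve adjustment mentioned above boils down to linear interpolation between temperature/speed points. A minimal sketch of that mapping; the curve points below are made-up illustrative values, not a recommendation for the T4 BIOS.

```python
# Illustrative sketch of how a fan curve like Afterburner's maps GPU
# temperature to fan speed: linear interpolation between user-set points.
# The curve points below are made-up examples, not recommended values.

CURVE = [(30, 30), (50, 45), (65, 65), (80, 100)]  # (temp degC, fan %)

def fan_speed(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: floor speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the last point: pinned at max
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))   # halfway between the 30% and 45% points -> 37.5
print(fan_speed(90))   # past the last point: pinned at 100
```

Shifting the points left (lower temps) or raising the speeds is all a "more aggressive" curve means.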


----------



## Krzych04650

Quote:


> Originally Posted by *Archang3l*
> 
> What do you mean with serious manufacturers then?
> 
> I have thought about placing my pc in another room and working with a usb C dock or something but i would need a cable that is at least 15m then...


15m? I meant a different room, not a different house.

As for manufacturers, if I say serious then I basically mean no Asus or Acer or other joke manufacturers for fanboys. But the problem is, if you necessarily need 1440p at 144 Hz then the only quality option on the market is the Dell S2716DG, which is TN.

Do what you think is good for you, but I am just saying that if you already think that you have very bad hardware year then buying Acer or Asus monitor is not the best idea, because another 3+ exchanges are awaiting you, if you are lucky.


----------



## Tdbeisn554

Quote:


> Originally Posted by *Krzych04650*
> 
> 15m? I meant a different room, not a different house.
> 
> As for manufacturers, if I say serious then I basically mean no Asus or Acer or other joke manufacturers for fanboys. But the problem is, if you necessarily need 1440p at 144 Hz then the only quality option on the market is the Dell S2716DG, which is TN.
> 
> Do what you think is good for you, but I am just saying that if you already think that you have very bad hardware year then buying Acer or Asus monitor is not the best idea, because another 3+ exchanges are awaiting you, if you are lucky.


I do not really think placing my pc in the bathroom is a good idea...







and the other room is against the other wall from where my desk is so...

As for manufacturers... I had a lot of problems with Corsair, an AX860i PSU (really expensive and their highest-end series...), and Corsair is a serious name in the business. The RMA service was super good, not complaining about that.
And now with my EVGA GTX 1080, and EVGA is one of the better brands for Nvidia-based cards, so...

I really do want IPS. I have a TN panel now and I really miss the better colors and viewing angles.
And why do you think it will ruin my performance? OK, sure, FPS will take a hit, but I do not play competitively, and in recent years it's been more and more RPGs and other single-player games for the story and world. In my opinion a nice IPS with sharper visuals would really help my immersion and the look of the games, compared with a 1 ms TN panel at 144 fps.

But why do you hate Acer and ASUS?


----------



## Krzych04650

Quote:


> Originally Posted by *Archang3l*
> 
> I really do want IPS, I have a TN panel now and I really miss the better colors en angles of it.
> And why do you think it will ruin your performance? Ok sure FPS will take a hit but I do not play competitive and last years more and more RPG's and other singleplay games for the story and world. In my opinion a nice IPS with a sharper visuals would really help my immersion and the looks of the games in comparison with a 1ms TN panel with 144fps.
> 
> But why do you hate Acer and ASUS?


I understand, I play the same way, for story, world and visuals, so IPS is my way to go, I am just saying that 1440p IPS G-sync is almost not doable as nobody serious makes it.

I don't hate anyone or anything, just judging by quality and customer support. I would love it if they were good manufacturers, this would be the best for everyone, the more good products the better, but well, they are not, so...


----------



## Tdbeisn554

Quote:


> Originally Posted by *Krzych04650*
> 
> I understand, I play the same way, for story, world and visuals, so IPS is my way to go, I am just saying that 1440p IPS G-sync is almost not doable as nobody serious makes it.
> 
> I don't hate anyone or anything, just judging by quality and customer support. I would love it if they were good manufacturers, this would be the best for everyone, the more good products the better, but well, they are not, so...


Oh yeah, now I get your point.

Well, I now have a new laptop with an IPS display, and I actually prefer my laptop screen over my desktop 27" 1 ms screen for pictures and games (only the visual part, then).
I had an LG IPS screen some years back, but I got a 27" 144 Hz for free and 27" was superior to 21", so...
But now I really want an IPS panel back, with at least 1440p or 4K resolution (sharper visuals and more room on the screen, which is a big plus too since I am studying IT now).


----------



## Krzych04650

Quote:


> Originally Posted by *Archang3l*
> 
> Oh yeah now I get your point
> 
> 
> 
> 
> 
> 
> 
> 
> Well I now have a new laptop with an IPS display and I actually prefer my laptop screen over my desktop 27" 1ms screen for pictures and games (only visual part then )
> I had an LG IPS screen some years back but I got a 27" 144hz for free and 27" was superior to 21" so..
> But now I really want back an IPS panel, with at least 1440p or 4K resolution (sharper visuals and more place on the screen, which is a big plus too since I am studying IT now)


How about a 3440x1440 IPS monitor? That's what I bought after all, and I am very happy things turned out that way. Although that doesn't make the quality concerns any smaller; most ultrawides are as faulty as those Asus/Acer IPS gaming screens. But there are some quality ones, even for less than $700 if you don't want a curve. If you do, then around $800.


----------



## ralphi59

Grab an Acer XB321HK.
It's a dream with one GTX 1080.


----------



## Derpinheimer

Quote:


> Originally Posted by *Agoniizing*
> 
> I flashed my lemon 1080 FTW with the T4 bios, and so far im stable at 2012mhz. I haven't tried pushing it further, but the extra voltage helped. This card cant even go past 1987mhz on stock volts.


Mine boosts to 2025 on its own, does yours not?
I'm not asking that in a douchey way intentionally, I just mean... doesn't it crash if you don't lower the core, since it boosts too high by default?


----------



## Derek1

Quote:


> Originally Posted by *SauronTheGreat*
> 
> guys i need some help in these two 1080s which one is more powerful out of the box ?
> 
> i) Gigabyte 1080 Xtreme gaming
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5920&dl=#sp
> 
> ii ) Zotac 1080 AMP! Extreme
> 
> https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-amp-extreme#spec


I think you basically have to cross your fingers and hope that you get a good chip regardless of maker, or even between versions within a maker's product line.

If you have been following this thread you must have seen how much the lottery comes into effect here.

Box ratings are not a good indicator of what you might receive.

My FTW boosted to 1987 out of the box, no OCing, though it was only rated to do 1860.

Best advice: flip a coin. At least it will tell you what you want.

ETA: And that advertised rating may be a marketing ploy anyway. As in my case, I am a lot happier with the 1987 than I would have been with 1860, which might just be some arbitrary "at least" number they use. Who knows? They are such weasels in marketing. lol

ETA2: Also, higher clocks do not guarantee significantly better FPS.


----------



## Krzych04650

Yay, my Zotac order just got postponed by 4 days to next week... I don't get these shops. Why are they listing things they don't have? If they list something then they should have at least one on hand ready to send, especially huge shops like the one I ordered from. Huge stocks and fast deliveries are the only reasons they are attractive. And now they not only don't have things but also lie about delivery time, wasting your time. I would cancel this order and get my money back, but they have by far the best price, even better than in Germany, and that's before taking more expensive delivery and currency exchange costs into account, so I don't really have much choice.


----------



## caenlen

Quote:


> Originally Posted by *Archang3l*
> 
> Seems like I have a bad hardware year... My replacement GTX 1080 Classified has some led bleed on the E from the EVGA logo, kinda bothers me and the coil whine is really annoying. I'm getting a shipping label from EVGA to retun it... I really hope third time is a charm... If I have a card with no (or minimal coil whine) and a clock of 2100 I would be super happy. Every time I play a game the card whines like it is dying and it is kinda loud too. Any of you guys have coil whine on their card??
> 
> And since I am still on a 1080p screen I really want to upgrade to a new monitor (a GTX 1080 is really overpowered for a 1080p screen). But I am not really sure if I should go 1440 or 4K... I mean 4K is kinda the future, and I will keep the monitor probably for 5-10 years. All media is going to 4K (movies, games, series, ...) and only games are 1440. But if I only get 30 fps in my games... then yeah, it's not really fun either.
> I just don't really know what I should look at/ buy. I do want at least an IPS panel with G-sync and 1440p or 4K. I have my eye on the ASUS PG279Q and ASUS PG279AQ. Any tips or ideas would be welcome


FYI, I would argue the GTX 1080 is not overpowered for 1080p at 240 Hz; that new Asus 240 Hz 1080p monitor is what I am considering buying... 240 Hz is heaven... I can tell a difference from 144 Hz, no motion blur :3


----------



## Tdbeisn554

Quote:


> Originally Posted by *caenlen*
> 
> fyi I would argue the gtx 1080 is not powered for 1080p 240hz, that new Asus 240hz 1080p monitor i am considering buying... 240hz is heaven... i can tell a difference from 144hz, no motion blur :3


Haha, yeah, sure, for 240 Hz it will be a good card.

But what are you playing?
I want a monitor with really nice colors, and a higher resolution to get extra sharp visuals with great colors etc. I'm playing Witcher 3, Skyrim, RPGs in general. MP games like BF and Titanfall here and there, not my everyday games.


----------



## caenlen

Quote:


> Originally Posted by *Archang3l*
> 
> Haha yeah sure for 240Hz it will be a good card
> 
> 
> 
> 
> 
> 
> 
> 
> But what are you playing?
> I want a monitor with really nice colors, and a higher resolution to get extra sharp visuals with great colors etc. I'm playing Witcher 3, Skyrim, rpg's in general. MP games like BF and Titanfall,.. here and there, not my everyday games


I play Black Ops 2 multi with some friends; it's a lot of fun at high refresh rates. We don't play any other CoD though.


----------



## Tdbeisn554

Quote:


> Originally Posted by *caenlen*
> 
> I play black ops 2 multi with some friends, its a lot of fun at high refresh rates. we don't play any other cod though


Not saying high refresh rates are dumb; my TN is 144 Hz too, but I would prefer an IPS with better colors over 144 Hz... But if I can have a 1440 or 4K screen WITH 120+ Hz, that would be amazing.


----------



## Iceman2733

Quote:


> Originally Posted by *Archang3l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *caenlen*
> 
> I play black ops 2 multi with some friends, its a lot of fun at high refresh rates. we don't play any other cod though
> 
> 
> 
> Not saying high refresh rates are dumb
> 
> 
> 
> 
> 
> 
> 
> my TN is 144Hz too, but I would prefer an IPS with better colors over 144Hz.. But if I can have a 1440 or 4K screen WITH 120+ Hz that would be amazing

All you need is a little IPS, 144 Hz and 1440.

Sent from my Pixel XL using Tapatalk


----------



## Tdbeisn554

Quote:


> Originally Posted by *Iceman2733*
> 
> all you need is a little IPS 144hz and 1440
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my Pixel XL using Tapatalk


Haha, what is a little IPS? A 2" IPS panel?


----------



## Derek1

Here is the link to the EVGA request page for the replacement thermal pads for anyone using a card with the ACX 3.0 cooler.

http://www.evga.com/thermalmod/


----------



## KedarWolf

Which do you think would be the best option for SLI on air?

1080 Zotac Amp Extreme or a Zotac FE?

Edit: Or wait for one 1080 Ti. I won't have the cash until the spring anyhow.


----------



## ucode

Quote:


> Originally Posted by *Derek1*
> 
> Here is the link to the EVGA request page for the replacement thermal pads for anyone using a card with the ACX 3.0 cooler.
> 
> http://www.evga.com/thermalmod/


What's next! If General Motors forgot to install cylinder head gaskets, or used the wrong ones, would they just send out free gaskets to their customers (which aren't really free, because they should have been there in the first place) and leave the customers to install them themselves?

IMHO if people want to install the pads themselves, fine, but at least offer to do the work for those that don't.


----------



## Dragonsyph

Was gonna get a 1080 FTW, but I don't think I want to after finding this stuff out; reading the Tom's Hardware review showing the FTW reaching 108°C on the VRMs makes me think the card won't last long.

What brand of card would you guys suggest?

I was looking at the Gigabyte Xtreme Gaming, and it has a 4-year warranty, which sounds good.


----------



## Iceman2733

Well, let me give a little info to other users. I have 2x EVGA 1080 FTW cards, and they run amazing so far (knock on wood). I noticed that one of my cards would idle in the 50°C range, which was odd; the other card was in the high-30s to low-40s. Well, today my EK blocks came in and I got to tearing them down. One of my cards was MISSING a memory thermal pad on one of the memory modules!!!

I didn't have the card long and didn't notice any huge issues, but this couldn't have been healthy for the card; glad I didn't try to OC it on air first. Needless to say I am quite disappointed in EVGA; for the money these cards cost there shouldn't be issues like this. On the plate you can see where one might have been stuck, but it never touched the memory module; upon disassembly the module had NO thermal pad residue and was super clean compared to the rest! The way I removed the plate, there was NO way the pad came off as I took it apart; I noticed it as I was separating the plate from the board.

I am not telling people to avoid EVGA, but check your cards, even if it means tearing a new card down. I will be calling EVGA in the morning to complain, if anything to let them know they have an issue. I was looking to upgrade to the X99 platform and my original plan was to use their Classified board; now I am second-guessing that.
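Comparing idle temps across cards, as described above, is easy to script. Here is a sketch that parses `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader`-style output and flags the odd card out; the sample text is made-up stand-in data, not real readings.

```python
# Sketch: spotting an idle-temp outlier across multiple cards by parsing
# `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader`
# output. The sample text below is a made-up stand-in for real output.

SAMPLE = """\
0, 39
1, 52
"""

def parse_temps(text):
    """Map GPU index -> temperature (degC) from CSV-ish nvidia-smi output."""
    temps = {}
    for line in text.strip().splitlines():
        idx, temp = (field.strip() for field in line.split(","))
        temps[int(idx)] = int(temp)
    return temps

def outliers(temps, margin_c=8):
    """Return GPU indices idling more than margin_c above the coolest card."""
    coolest = min(temps.values())
    return [i for i, t in temps.items() if t - coolest > margin_c]

temps = parse_temps(SAMPLE)
print(temps)            # {0: 39, 1: 52}
print(outliers(temps))  # [1] -- card 1 idles 13 degC hotter
```

A persistent idle gap like that between otherwise identical cards is exactly the kind of symptom a missing or badly seated thermal pad can produce, though fan curves and case airflow can also explain it.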


----------



## turtletrax

Quote:


> Originally Posted by *Iceman2733*
> 
> Well let me give a little info to other users I have 2ea Evga 1080 FTW cards, they run amazing so far (knock on wood). I noticed that one of my cards would idle in the 50c range which was odd the other card was in the high 30-low 40ish. Well today my EK blocks come in and get to tearing the bone out. One of my cards was MISSING a memory thermal pad on one of the memory modules!!! I didn't have the card long and didn't notice any huge issues but this couldn't have been that healthy for the card glad I didn't try to OC it on air first. Needless to say I am quite dissapointed in EVGA for the money these cards cost there shouldn't be issues like this. On the plate you can see that one might have been stuck there but it never touched the memory module upon installation the module had NO thermal pad residue and was super clean compared to the rest! The way I removed it there was NO way it came off when I took it apart I noticed it as I was separating this plate from the board. I am not telling people to avoid EVGA but check your cards even if it means tearing a new card down. I will be calling to complain to EVGA in the morning if anything to let them know they have an issue. I was looking to upgrade to the X99 platform and my original plan was to use there Classified board now I am second guessing that.


This is why I think it should be a right to remove your stock cooler. My Asus FE cards had the sticker on the GPU screws, and apparently they do consider removing it warranty-voiding. I used needle-nose pliers to get them off without too much fuss, but it kinda pissed me off. Who wants to go through an RMA that is completely avoidable by giving everything a quick check and scraping off the crap TIM these things come with??


----------



## Krzych04650

Quote:


> Originally Posted by *KedarWolf*
> 
> Which do you think would be the best option for SLI on air?
> 
> 1080 Zotac Amp Extreme or a Zotac FE?
> 
> Edit: Or wait for one 1080 Ti. I won't have the cash until the spring anyhow.


Depends on how much space you have between PCI-E slots. If you have 3 slots between them then there will be about 2 mm of space between the top and bottom Zotac Extreme, so not too good (they fcuked this up in my opinion; this card is already very long and very wide, which by itself gives an enormous heatsink, so why expand it downward as well and make it 2.5-slot instead of 2? Nonsense). If you have 4 slots then you can get them. They are the best option on air because they are simply the coolest while keeping good noise levels, and you have huge cooling potential left if you want to speed up the fans. So for 3 slots: Zotac Extreme at the bottom and, on top, either some hybrid or some thin but good card, like the Gaming X. However, the top card will be much hotter than the Zotac. For 4 slots get 2 Extremes; cooling them should be very easy provided enough space between them and a good case.

Here is a picture of what it would look like with 3-slot spacing; you would need to fit the second card into this red PCI-E slot, not really a good idea:

There is one guy on YouTube, Known_Only_Once, who had two Extremes in SLI, so you can look at his channel and see how it looks. He said that the temp difference between the top and bottom card was 15-20°C, but the top, hottest one was only 75°C max. So theoretically it will work.


----------



## TWiST2k

I have been looking at some FujiPoly pads for my 1080 FTW and was wondering what the thickness should be? I see the FujiPoly as 1mm or 1.5mm on Amazon.


----------



## x-apoc

Quote:


> Originally Posted by *TWiST2k*
> 
> I have been looking at some FujiPoly pads for my 1080 FTW and was wondering what the thickness should be? I see the FujiPoly as 1mm or 1.5mm on Amazon.


I'm using 1.5 mm. Current idle temp is 38-40°C @ 0 fan spin on the FTW; this is with a 75°F / 23.8°C room temp.


----------



## rakesh27

Guys,

I bought the Zotac AMP Edition 1080; I realise now I should have waited for the Ti edition to come out, oh well.

Anyway, the best thing I did was convert it to water. I bought the Kraken G10 and a Corsair H75 (I think) cooler, stuck it on the 1080, and I think it's the best mod you can do; it's very easy to do.

My temps are super low and the framerates and overclock are high...

The problem with these high-end cards is temps; they get hot very quickly. The good thing about water is that temps and noise are very low.

Try it, well worth doing.


----------



## Casper123123123

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> I bought the Zotac AMP Edition 1080. I realise now I should have waited for the Ti edition to come out, oh well.
>
> Anyway, the best thing I did was convert it to water: I bought the Kraken G10 and (I think) a Corsair H75 cooler, stuck it on the 1080, and I think it's the best mod you can do. It's very easy.
>
> My temps are super low and the framerates and overclock are high...
>
> The problem with these high-end cards is temps; they get hot very quickly. The good thing about water is that temps and noise are very low.
>
> Try it, well worth doing.


Could you please provide temps from old cooler and new liquid one?


----------



## rakesh27

OK, here we go:

Air idle: 45 °C
Water idle: 36 °C

Air load: 75-85 °C
Water load: 45-55 °C

Big difference. Now you can really overclock the card with no thermal throttling. Trust me, all you do is take off the old cooler and stick on the Kraken G10 (plus the included VRM fan); make sure you get an adapter from the VGA fan connector to a normal 3/4-pin cable so the VRMs get cooled.

On the pump side, put some small memory heatsinks on so things stay cool, and you're done.

You will never have to worry about temps again, and you know your card will perform at 100% in whatever you do.

I also did push/pull on my rad and connected both fans together, then to one fan header on the mobo; the pump goes to another fan/pump header on the mobo, and the fan included with the Kraken G10 connects to the video card's fan header with the adapter cable you buy for this.

For my PowerColor 295X2 and EVGA 980 Ti SC I connected the pump + 3 fans all to the video card's fan header and it all worked fine, not one problem; this time I wasn't sure the 1080 could cope with all that.

Try it, Google it, and see the results... a very easy mod to do.

Google your card's PCB/GPU board so you can see whether the VRMs need small heatsinks attached when you do the conversion. I was lucky, as my card had its own heatsink attached to the VRM, so I just left it on and carried on with the conversion.

If you have a plate on the GPU side that covers the memory and VRM (or just the VRM), leave it on, attach everything else, and you should be fine, as the fan will keep it cool anyway.

Good luck


----------



## Haunebu

16 Phase GTX 1080


----------



## KedarWolf

Quote:


> Originally Posted by *Haunebu*
> 
> 
> 
> 16 Phase GTX 1080


Yeah, just saw that in Google News. It's a two-slot full water block for a custom loop, with a backplate (very heavy according to the review), 16 phases and two 8-pin power connectors.









Edit: http://techreport.com/news/30862/zotac-and-thermaltake-join-forces-for-a-liquid-cooled-gtx-1080


----------



## Koniakki

Quote:


> Originally Posted by *Haunebu*
> 
> 
> 
> 16 Phase GTX 1080


Old news.









It was posted a while back. It's the elusive ZOTAC GTX 1080 PGF.


----------



## ssgtnubb

Quote:


> Originally Posted by *Dragonsyph*
> 
> Was gonna get the 1080 FTW, but I don't think I want to after finding this stuff out, and reading the Tom's Hardware review showing the FTW reaching 108 °C on the VRMs makes me think the card won't last long.
>
> What brand of card would you guys suggest?
>
> I was looking at the GIGABYTE XTREME Gaming, and it has a 4-year warranty, which sounds good.


I'm running a Gigabyte 1080 Xtreme and can attest at how amazing it is as a card. No coil whine and temps are very good with the cooler setup. It's rather mesmerizing to look at horizontally in my S8S.


----------



## MACH1NE

Are the thermal pads missing on all manufactured ACX 1080s, or did they fix this after a certain batch?


----------



## Lays

Quote:


> Originally Posted by *Haunebu*
> 
> 
> 
> 16 Phase GTX 1080


Not like it will matter, as we've seen extra voltage barely helps at all, and almost all cards clock identically.


----------



## KedarWolf

Quote:


> Originally Posted by *Lays*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Haunebu*
> 
> 
> 
> 16 Phase GTX 1080
> 
> 
> 
> Not like it will matter, as we've seen extra voltage barely helps at all, and almost all cards clock identically.
Click to expand...

If they raise the voltage and power limits on this card with its 16 phases, I'm sure you'd be able to get more consistent and better results, as long as they aren't friggin' Micron.


----------



## Dragonsyph

Quote:


> Originally Posted by *ssgtnubb*
> 
> I'm running a Gigabyte 1080 Xtreme and can attest at how amazing it is as a card. No coil whine and temps are very good with the cooler setup. It's rather mesmerizing to look at horizontally in my S8S.


Yeah, it does look pretty sick, and from the reviews I looked at it's one of the coolest-running cards, with really good VRM temps too.


----------



## Thetbrett

Quote:


> Originally Posted by *KedarWolf*
> 
> Which do you think would be the best option for SLI on air?
> 
> 1080 Zotac Amp Extreme or a Zotac FE?
> 
> Edit: Or wait for one 1080 Ti. I won't have the cash until the spring anyhow.


Get the AMP version. Same as the Extreme, just a different cooler. Save yourself some money.


----------



## KickAssCop

AMP version runs about 5-7 C hotter than AMP extreme. I would just buy AMP extreme if you are not going SLI.
If you are going SLI on air then I suggest ASUS STRIX. My cards hover around 75 C top and about 65 C bottom at full load. Usually in games it is even lower around 70 and 60 respectively.


----------



## KedarWolf

Quote:


> Originally Posted by *KickAssCop*
> 
> AMP version runs about 5-7 C hotter than AMP extreme. I would just buy AMP extreme if you are not going SLI.
> If you are going SLI on air then I suggest ASUS STRIX. My cards hover around 75 C top and about 65 C bottom at full load. Usually in games it is even lower around 70 and 60 respectively.


My ASUS X99-A II motherboard has an extra PCI-E slot between the two cards, so I may do okay with two Extremes.


----------



## ROKUGAN

Quote:


> Originally Posted by *KickAssCop*
> 
> AMP version runs about 5-7 C hotter than AMP extreme. I would just buy AMP extreme if you are not going SLI.
> If you are going SLI on air then I suggest ASUS STRIX. My cards hover around 75 C top and about 65 C bottom at full load. Usually in games it is even lower around 70 and 60 respectively.


I had both versions, and in my case the AMP ran 15-20 °C hotter than the AMP Extreme @ 4K. The AMP cooler is not capable of keeping the card under 80 °C at 4K, while the Extreme will stay in the mid 60s. The AMP will also run slightly lower frequencies (around 50 MHz, roughly 1 FPS @ 4K, nothing serious). I did a more detailed comparison a while ago. I would definitely go with the Extreme; the cooler is really awesome, albeit huge. Nothing wrong with the AMP though: it reached 2063 MHz stable, but I wasn't completely happy seeing temps at 84 °C.


----------



## Krzych04650

Quote:


> Originally Posted by *Thetbrett*
> 
> Get the AMP version. Same as the Extreme, just a different cooler. Save yourself some money.


The point is that this "just different cooler" determines whether he gets thermal throttling or not, assuming he wants good noise levels. Don't spread misinformation; you make it sound like there is no major difference between them, when in fact the AMP Extreme is capable of efficient and quiet cooling while the AMP is not. And that's for a single card; he asked about SLI on air, and two AMPs wouldn't last 15 minutes under load.


----------



## Vellinious

Messed around a little bit with the rig after I got the waterblocks on last night. I'm pretty impressed. I'll add some clock to the CPU this weekend and give it a real go at it.

http://www.3dmark.com/spy/639346


----------



## kikibgd

Hey guys, I'm about to pull the trigger on a 1080. The G1 and Strix were suggested; I really don't want anything from MSI (never ever again).

So which one has fewer problems? Note: I hate coil whine...


----------



## juniordnz

Quote:


> Originally Posted by *kikibgd*
> 
> Hey guys, I'm about to pull the trigger on a 1080. The G1 and Strix were suggested; I really don't want anything from MSI (never ever again).
>
> So which one has fewer problems? Note: I hate coil whine...


I'd go for the strix if you don't care about the awful customer support


----------



## kikibgd

I'm about to buy their famous 279Q Swift, so I guess I'm going ballz deep









but still waiting for peoples opinions


----------



## ValSidalv21

Quote:


> Originally Posted by *kikibgd*
> 
> Hey guys, I'm about to pull the trigger on a 1080. The G1 and Strix were suggested; I really don't want anything from MSI (never ever again).
>
> So which one has fewer problems? Note: I hate coil whine...


If I was to buy one now, I'd go for the Gigabyte Xtreme Gaming. Both the G1 and the Strix look pretty average compared to it.


----------



## keikei

Hey guys, what is the minimum OC on these cards on non-reference air? What sort of frames should I expect in BF1 @ 4K? Thanks. Mine's supposed to be delivered sometime today.


----------



## ROKUGAN

Quote:


> Originally Posted by *keikei*
> 
> Hey guys, what is the minimum OC on these cards on non-reference air? What sort of frames should I expect in BF1 @ 4K? Thanks. Mine's supposed to be delivered sometime today.


Any 1080 (non-OC) should reach 60 FPS in BF1 @ 4K; there are lots of benchmarks/reviews on the web:

http://www.guru3d.com/articles_pages/battlefield_1_pc_graphics_benchmark_review,7.html


----------



## Reefer

Quote:


> Originally Posted by *KedarWolf*
> 
> Which do you think would be the best option for SLI on air?
> 
> 1080 Zotac Amp Extreme or a Zotac FE?
> 
> Edit: Or wait for one 1080 Ti. I won't have the cash until the spring anyhow.


I have the AMP edition, and with a custom fan profile running the fans at 70% @ 70 °C, my temps in games are between 65 and 72 °C...

I still haven't found out why other cards run so hot.
Mine is stable clocked at a 2073 boost, and the mem runs 5610 stable.

BTW I am using the Extreme BIOS on the card. It makes no difference for temps.


----------



## KedarWolf

So,

For SLI with an extra slot between the cards, what 1080s do you think would be best for overclocking and low temps?


----------



## arrow0309

My new "small" big case Enthoo Mini XL & 1080 Strix @Bits setup:


----------



## Krzych04650

Quote:


> Originally Posted by *KedarWolf*
> 
> So,
> 
> For SLI with an extra slot between the cards, what 1080s do you think would be best for overclocking and low temps?


If you have enough space between them and you want air, then you need to look at serious coolers like the Zotac AMP Extreme or Gigabyte Xtreme Gaming. You basically have to buy a card with a stupid name









But SLI is where hybrid/watercooled cards start to make sense, so if you have money to spend and you are not bothered by pump noise, then AIO watercooled cards like the EVGA Hybrid or Gigabyte WaterForce will give you the best results, because each card's cooling runs basically independent of the other.


----------



## greg1184

Just got this gem for my first gpu under water project.


----------



## Koniakki

Quote:


> Originally Posted by *KedarWolf*
> 
> So,
> 
> For SLI with an extra slot between the cards, what 1080s do you think would be best for overclocking and low temps?


The ones mentioned by other members, like the GB XG and the ZOTAC AE, are great too, but let's not forget the other good contenders.

My GameRock playing heavily graphics-modded [email protected] with 2xMSAA/Ultra.
Yeah, 90% fan speed, but honestly it's not that loud at all.

Max temp as shown: 48 °C.


----------



## bloot

My GTX 1080 Super JetStream never goes above 65 °C and usually stays at 63-64 °C with fans at 50% playing The Witcher 3 (vsync off); it's super silent too. I think Palit did a really great job with their coolers on the GTX 1080.

25 °C ambient, btw.

Greetings.


----------



## Krzych04650

Quote:


> Originally Posted by *Koniakki*
> 
> The ones mentioned by other members like GB XG and ZOTAC AE are great too but lets not forget other good contenders.
> 
> My GameRock playing heavily graphics modded [email protected] with 2xMSAA/Ultra.
> Yeah, 90% fan speed but it's not that loud at all tbh.
> 
> Max temp as shown: 48'C.


Quote:


> Originally Posted by *bloot*
> 
> My GTX 1080 Super JetStream never goes above 65ºC and is usually staying at 63-64 with fans at 50% playing the Witcher 3 (vsync off), it's super silent also. I think Palit did a really great job with their coolers on the GTX 1080.
> 
> 25ºC ambient btw.
> 
> Greetings.


I agree that Palit should be mentioned; they did a great job. It's another 2.5-slot cooler though, which is not very optimal for anything more than a single-GPU config. I would like to have an option like the Sapphire Fury cooler: something like 2.1 slots, exceeding two slots by only about 2 mm, yet a very efficient and quiet triple-fan cooler. You can easily fit two of those in 3-slot-spaced SLI, which you cannot say about 2.5-slot coolers.


----------



## Derpinheimer

Quote:


> Originally Posted by *Koniakki*
> 
> The ones mentioned by other members like GB XG and ZOTAC AE are great too but lets not forget other good contenders.
> 
> My GameRock playing heavily graphics modded [email protected] with 2xMSAA/Ultra.
> Yeah, 90% fan speed but it's not that loud at all tbh.
> 
> Max temp as shown: 48'C.


I'm sure the cooler is good, but... your framerate appears capped at 50? And GPU usage never exceeded 74%.


----------



## Koniakki

Quote:


> Originally Posted by *Derpinheimer*
> 
> I'm sure the cooler is good, but.. Your framerate appears capped at 50? And gpu usage never exceeded 74%


Yeah, I was a bit bored and was messing (again) with the Hz (50 vs 59 vs 60 Hz) on my Sammy 40D5005 TV while re-adjusting (again) the PQ settings...


----------



## greg1184

Very nice block. Easy to install, my first time doing this. Phanteks provides very nice documentation.


----------



## ROKUGAN

Quote:


> Originally Posted by *Reefer*
> 
> I have the AMP edition, and with a custom fan profile running the fans at 70% @ 70 °C, my temps in games are between 65 and 72 °C...
>
> I still haven't found out why other cards run so hot.
> Mine is stable clocked at a 2073 boost, and the mem runs 5610 stable.
>
> BTW I am using the Extreme BIOS on the card. It makes no difference for temps.


Either you're very lucky with your unit or you're playing at low resolution and not stressing your GPU much, as the AMP's temp problems have been extensively discussed:

https://www.reddit.com/r/4sgwg8/zotac_1080_amp_edition_owners_temperature_and/

I even changed the thermal paste in my AMP and lowered the temps by 3 °C on average, but they were still edging around 80 °C in most demanding games @ 4K (fans @ 100%), so I finally bought the Extreme.

Nothing wrong with the AMP, but I just want to make clear that the temps you are reporting are much lower than what average users see with that card.


----------



## juniordnz

Can anyone tell me the thickness of the pads used on the FTW? I plan on putting pads on the frontplate over the MOSFETs and VRAM modules, and on the backplate where the MOSFETs and VRAM modules sit as well.

Of course they need to make contact with the heatplate in the front and the backplate in the back; should 1.5 mm thick be enough?


----------



## Menthol

Quote:


> Originally Posted by *juniordnz*
> 
> I'd go for the strix if you don't care about the awful customer support


^^^ This is the correct answer IMO. FTW if you want peace of mind with EVGA support. I'd stay away from 2.5/3-slot cards; Pascal doesn't need that much cooler and it blocks the next slot. I had ASUS 680 cards with 2.5-slot coolers, never again.

The FTW's cooler is quieter than the Strix's, at least on the cards I have. The Strix looks nice on an ASUS board with matching RGB lighting if that's your thing, but both are quiet and clock about the same; the Strix is a little longer than the FTW if your case has limitations.


----------



## Menthol

Quote:


> Originally Posted by *KedarWolf*
> 
> So,
> 
> For SLI with an extra slot between the cards, what 1080s do you think would be best for overclocking and low temps?


Make sure to get the correct HB SLI bridge; it's a little confusing how the slot spacing is advertised.
I have an FE, an FTW and a Strix myself, and all are good cards; obviously the FE is louder than the others. The LED lighting is nice on both the FTW and the Strix, but the Strix has the ROG logo on the backplate, which really looks nice. The Strix boosts higher at stock, but the FTW is real close, and all boost about the same if you push them and can keep them cool enough. I used a universal block when I benched the FE and FTW; I haven't gotten around to benching the Strix.


----------



## Menthol

Quote:


> Originally Posted by *juniordnz*
> 
> Can anyone tell me the thickness of the pads used on the FTW? I plan on putting pads on the frontplate over the mosfets and VRAM modules and on the backplate also where the mosfets and VRAM modules are located.
> 
> Of course it needs to make contact with the heatplate in the front and backplate in the back, 1,5mm thick should be enough?


Have you placed your request for the free pads from EVGA?


----------



## Reefer

Quote:


> Originally Posted by *ROKUGAN*
> 
> Either you're very lucky with your unit or you're playing at
> 
> Nothing wrong with the AMP, but just want to make clear that the temps you are reporting are much lower than average users with that card.


Hi,

Yes, I have seen all those posts before. So weird/lucky that my card does not reach those temperatures.

I play my games @ 1440p, all maxed out. The case is an Antec 900 v2; I changed the paste as well, but that did not help a lot.

Guess I am lucky then. I always game OC'd at around 2072 MHz, 1.093 V, mem @ 5605.


----------



## juniordnz

Quote:


> Originally Posted by *Menthol*
> 
> Have you placed your request for the free pads from EVGA,


Yes, I did it yesterday. But I live in Brazil, and I'm not sure how long it will take to get here, or whether they will ship here at all. It would take me 30-45 days to get some from eBay...

Guess I'll get in contact with EVGA's rep here in Brazil to see if they have some info on the matter...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Yes, did it yesterday. But I live in Brazil, and I'm not sure how long will it take to get here or if they are going to ship here at all. It would take me 30-45 days to get some from ebay...
> 
> Guess I'll get in contact with EVGA's rep here in Brazil to see if they have some info on that matter...


Junior, did you also order the FTW hybrid kit?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Junior, did you also order the FTW hybrid kit?


No, I didn't. I have a modded Kraken G10 + H100i V2 on hold here. I will wait for the pads so I can do everything just once. I also need to wait for some Kryonaut and a screw set bought from eBay. Hopefully I get to do it all before 2017









EVGA's hybrid kit is not available here. Plus, I think that small thin rad wouldn't be able to handle the summer here...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> No, I didn't. I have a modded Kraken G10 + H100i V2 on the hold here. Will wait for the pads so I can do everything just once. Also need to wait for some Kryonaut and Screw Set bought from ebay. Hopefully I get to do it all before 2017
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EVGA's hybrid kit is not avaiable here. Plus, I think that small thin rad wouldn't be able to handle the summer here...


I was going to do the same thing (G10), but cancelled my order the minute I saw the release of the FTW kit 2 days ago. I will be waiting for the pads too and, like you, get it all done at the same time. Going for AS5 here myself. Did you get a shim, or are you going to cut the tabs off?

It is not available here yet either, and probably won't be until, like you say, 2017, and I am not prepared to wait. I ordered direct from the EVGA store, and with the exchange rate, duty, taxes etc. I am looking at CAD $300 for it. That will teach me for not looking closer and buying the FTW Hybrid the first time.


----------



## FireDragon

I attempted to sign up for the 1080 owner's club, but the sign-up form rejects my GPU-Z ID, which is 4qkrs. Furthermore, the form only has "Founders" and "Reference" editions as options. I have the Gigabyte watercooled version.


----------



## Krzych04650

And my Zotac Extreme order got postponed again, first from the 27th to the 31st and now to 04.11. So I told them to fcuk off and give me my money back, and I ordered the Palit (Gainward in the US) Jetstream. It should be here tomorrow. 3004 zł, compared to 3330 for the Zotac. We'll see how good those Palits/Gainwards really are; I can always return it if I don't like it. I thought about the MSI Gaming X, but it would cost 3450, so...

Is this Palit Jetstream a custom or reference PCB?


----------



## sirleeofroy

Quote:


> Originally Posted by *Krzych04650*
> 
> And my Zotac Extreme order got postponed again, first from 27 to 31 and now to 04.11. So I told them to fcuk off and give me money back and I ordered Palit, or Gainward in the Us, Jetstream. Should be here tomorrow. 3004 ZL compared to 3330 for Zotac. Will see how good those Palits/Gainwards really are. I can always return if I don't like it. Thought about MSI Gaming X but it would cost 3450 so...
> 
> Is this Palit Jetstream custom or reference PCB?


I have the Gainward GLH, which is the same as the Palit Super Jetstream; they all use the same cooler, which has to be one of the best air coolers out there. It's a custom board, BTW.

Mine boosts to 2063 MHz out of the box (stock settings, no tweaks).


----------



## boredgunner

Quote:


> Originally Posted by *sirleeofroy*
> 
> Mine boosts to 2063MHz out of the box (stock settings, no tweaks).


That's insane. That's my max constant overclock on my Armor 8G OC due to its crappy cooler.


----------



## Bishop07764

Quote:


> Originally Posted by *FireDragon*
> 
> I attempted to sign up for the 1080 owner's club, but the signin form rejects my gpu-z id which is 4qkrs. Further, the form only has "Founders" and "Reference" editions as an option. I have the GigaByte watercooled version.


The form's a bit outdated. I have an MSI Gaming X EK. I finally tried again and got mine to validate. You post the link to your GPU-Z validation in the last field on the form; it should automatically take it then.


----------



## EDGERRIES

Quote:


> Originally Posted by *KedarWolf*
> 
> So,
> 
> For SLI with an extra slot between the cards, what 1080s do you think would be best for overclocking and low temps?


I have been running SLI configs since the 8800 GTXs. My rule of thumb: if you've got one slot between cards, or they are sandwiched, always go for a reference/blower-type card (barring watercooling).

Blowers manage to exhaust all the air out of the case instead of blowing it around the case like the non-blower aftermarket coolers do. I have always gotten better temps in SLI using blower cards close together than with the EVGA ACX, ASUS Strix and MSI dragon coolers I've used trying to better my SLI experience.

Temps have always been higher than with the blower cards I have used. I am currently running 2x 1080 FE in SLI with one-slot spacing, and the blowers are doing a brilliant job of keeping the cards under throttling temps. When I've used non-blower cards, the top card always seems to get to the high 80s-90 °C+ while benching and gaming.

The best air cooling solution for me in SLI is blower coolers, and then obviously the best experience for SLI with regards to temps and performance is watercooling.









Just thought i'd add my experience over the years with Sli configs.


----------



## Bishop07764

Well, I made the jump to a 27" 1440p 144 Hz G-Sync monitor from my old dinosaur Hanns.G 28" 1080p 60 Hz monitor. Wow, you guys aren't kidding; the difference is amazing even on the desktop.







I did go with an "inferior" Dell TN panel, but I just didn't want to play the IPS lottery. I got it on sale at Best Buy, brand new, for less than 500 bones US too. I can't believe I'm asking this, but how can I limit my fps to 144 at 1440p? Do you guys use RivaTuner? I want to keep it in the G-Sync range. I was shooting up to 200 fps at times in Doom, even with everything set on Nightmare. At least it's a good problem to have.


----------



## mndx

Quote:


> Originally Posted by *sirleeofroy*
> 
> I have the Gainward GLH which is the same as the Palit Super Jetsream, they all use the same cooler which has to be one of the best air coolers out there. It's a custom board BTW.
> 
> Mine boosts to 2063MHz out of the box (stock settings, no tweaks).


I got the GLH too, but I think the comparable Palit would be the GameRock series; the JetStream is the cheaper one.


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> I was going to do the same thing (G10), but cancelled my order for it the minute I saw the release of the FTW kit 2 days ago. I will be waiting for the pads also and like you get it all done at the same time. Going for AS 5 here myself. Did you get a shim or are you going to cut the tabs off?
> 
> It is not available here yet either and probably won't be until like you say 2017 and I am not prepared to wait. I ordered direct from EVGA store and with the exchange rate and duty and taxes etc am looking at $300 C for it. That will teach me for not looking closer and buying the FTW hybrid the first time.


I'm going to use a 1.5 mm shim; I'm not going to void the warranty for that. I'm just not sure whether to use Kryonaut or CLU. The idea of having liquid metal near all those SMDs freaks me out. And having a shim does complicate things a little: I'll have to make a super-thin TIM layer on both sides to get good heat transfer. Maybe I'll lap the shim to a mirror finish for better transfer, idk...

EVGA doesn't even ship to Brazil lol, so the FTW Hybrid kit was definitely a no-go here. And Corsair's 5-year national warranty was very important too... I just like the peace of mind...

My advice to everyone: don't ever, EVER, EVEEEEER, buy a G10 thinking you'll adapt it easily to 5th-gen Asetek coolers. The amount of work it took to make it fit on the H100i V2 is something I don't wish on my enemies...


----------



## sirleeofroy

Quote:


> Originally Posted by *mndx*
> 
> I got the GLH too, but i think the comparable Palit would be the Gamerock Series. JetStream is the cheaper one.


My bad, you're absolutely right. The Super Jetstream is equivalent to the Gainward Golden Sample.

The GLH is indeed the same card as the Palit GameRock edition...

On that note, any idea if the "G-Panel" works with the Gainward cards? Technically it should, as the cards are the same; I just kinda want that panel...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> I'm going to use a 1,5mm shim, not going to void warranty for that. I'm just no sure if I use Kryonaut or CLU. It freaks me out the idea of having liquid metal near all those SMDs. And having a shim does complicate things a little. I'll have to make a super thin TIM layer on both sides to get a good heat transfer. Maybe I'll lap the shim to a mirror finish to get better heat transfer, idk...
> 
> EVGA don't even ship to Brazil lol so here the FTW Hybrid kit was definetely a no go. And corsair's 5 year national warranty was very important too...I just like the peace of mind...
> 
> My advice to everyone: don't ever, EVER , EVEEEEER, buy a G10 thinking you'll adapt it easily to 5gen asetek's coolers. The amount of work it took to make it fit on the H100i V2 is something I don't wish on my enemies..


Maybe use the CLU between the shim and the copper cold plate on the pump, and then the Grizzly between the GPU and the shim?


----------



## bloot

Quote:


> Originally Posted by *mndx*
> 
> I got the GLH too, but i think the comparable Palit would be the Gamerock Series. JetStream is the cheaper one.


PCB design, cooler, power phases and everything else are the same on all three models. The GameRock is even cheaper than the JetStream, at least here. They only change the shroud and color scheme on the cooler, and the core and memory clocks; the rest is identical.

Greetings.


----------



## Krzych04650

Thanks for answers about Palit.

I ordered it today after I got the message about my Zotac order being postponed, and it is already on the way; it will be delivered tomorrow. I shouldn't have ordered that Zotac in the first place, especially considering the 10% price difference.

Reviews are very good for the Palit, so there is no reason to pay more, I guess. Those cards run super cool at stupidly low fan speeds, so I should be able to get really low temps.

We'll see; I will report on the card tomorrow.


----------



## FireDragon

Thanks, that worked. I was posting the validation CODE and not the URL. The form is very unclear on that.


----------



## OZrevhead

Quote:


> Originally Posted by *juniordnz*
> 
> I'm going to use a 1,5mm shim, not going to void warranty for that. I'm just no sure if I use Kryonaut or CLU. It freaks me out the idea of having liquid metal near all those SMDs. And having a shim does complicate things a little. I'll have to make a super thin TIM layer on both sides to get a good heat transfer. Maybe I'll lap the shim to a mirror finish to get better heat transfer, idk...
> 
> EVGA don't even ship to Brazil lol so here the FTW Hybrid kit was definetely a no go. And corsair's 5 year national warranty was very important too...I just like the peace of mind...
> 
> My advice to everyone: don't ever, EVER , EVEEEEER, buy a G10 thinking you'll adapt it easily to 5gen asetek's coolers. The amount of work it took to make it fit on the H100i V2 is something I don't wish on my enemies..


Shims reduce the effectiveness too.

Use a freight forwarder; there are plenty (I use Shipito). That's how we get EVGA store stuff sent to Australia too.


----------



## juniordnz

Quote:


> Originally Posted by *OZrevhead*
> 
> Shims reduce the effectiveness too
> 
> Use a freight forwarder, there are plenty (I use shipito). That's how we get evga store stuff sent to Australia too.


It would still cost too much considering the currency difference. Also, shipping costs are insane! It may be worth it there, but here it definitely is not. Also, I'm not a huge fan of EVGA's hybrid kit and their tiny radiators. Plus, we get great customer support from Corsair here in Brazil: a 5-year warranty on water coolers, and they replace a faulty one within 3-5 days. That made me go for two H100i V2s on my rig, one for the CPU and the other for the GPU.


----------



## kx11

I have a pair of 1080 Strix GPUs + an HB SLI bridge (3-slot spacing).

If anyone is interested, I'm willing to sell them.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> Can anyone tell me the thickness of the pads used on the FTW? I plan on putting pads on the frontplate over the mosfets and VRAM modules and on the backplate also where the mosfets and VRAM modules are located.
> 
> Of course it needs to make contact with the heatplate in the front and backplate in the back, 1,5mm thick should be enough?


I am trying to find this out as well, man, haha! You got me looking at those FujiPoly pads; I had never heard of them until you mentioned it.


----------



## DStealth

New driver closing in on 8.5k GPU in Time Spy...
http://www.3dmark.com/3dm/15694338

Interesting finding: my card is stable @ 1 V / 2114. Pleased with it so far; it cannot exceed 50 °C while playing with a custom fan profile... wonderful for the stock cooler on the Palit Jetstream.


----------



## Krzych04650

Okay, got the Palit Jetstream about an hour ago.

The card is quite well built; it looks nice from the top and bottom but quite raw from the side (which is how you see it for the most part), because there is just bare heatsink there. This is not an issue for me, I want performance.

The card is beefy, very thick.

The card boosts to 1860 out of the box.

As for cooling, I have had Valley running for about 20 minutes now and it settled at 64 °C with 30% fan speed, which is 750 RPM, stupidly low; but my PC is in a very well-ventilated case and in a different room with a 12.5 °C ambient temp. Still, the card is surely dead quiet even at a 25 °C ambient.

As for coil whine: every single GPU in the world whines regardless of anything, but in this case the whining is quite gentle. All cards start to whine right after you connect them, they don't even need load, and under load the whine usually gets horrible; but this time the whine is more or less the same at idle and under load. You are not getting anything better than this in times when stupid coil whine rules the industry.

Another amazing thing is power draw. It's just crazy how power-efficient this card is. A stock 980 Ti was drawing about 350 W from the wall, and a stock Fury 360 W. The GTX 1080 draws... 240 W.

Final OC results need time, so I will post about them later.

So far so good. The only thing I don't like is the size; I prefer longer cards to thicker ones.

EDIT: 47 °C at 50% fan speed, which is 1250 RPM. What a cooler...

EDIT 2: The card's boost is actually 1911; it only goes there under very heavy load. Valley or even Fire Strike doesn't push it there, only the combined test from Fire Strike does.


----------



## keikei

Anyone play MGSV? I'm getting reboots every time I play the game. I've read the game doesn't play nicely with overlays, but that doesn't seem to be the case, as I closed Steam and the MSI frame counter before playing.


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> I am trying to find this out as well man haha! You got me looking at those FujiPoly pads, I had never heard of them until you mentioned it.


Problem is I think I'll have to buy different thickness pads for each part of the card. Thicker ones for the backplate; there's something like a 2-3mm gap between the PCB and the backplate. Also, the memory modules sit higher than the mosfets, so probably 1.5mm for the mosfets and 0.5-1mm for the VRAM modules.

Never handled those pads myself; do they compress well? I mean, if we put in a thick one and clamp it with the heatplate, will it compress and fill the gap?

What a pain this is...thanks EVGA


----------



## Krzych04650

Played a bit with overclocking on the Palit Jetstream, and it looks like anything at or under 2101 MHz is stable (assuming a small overvoltage), while everything above will eventually crash.

Memory got up to 5500 MHz easily, a +500 overclock; no need to try more.

So I ended up with a +200/+500 overclock. The clock fluctuates between 2050 and 2101 depending on the application and load, a nice improvement over the stock 1860-1911. But I don't really understand these clocks, to be honest; it's like they depend on the card's own bias more than on what I tell them to do, and I don't have full control over them.

Performance improvement from overclocking is 9-11%, which is not a lot, but still a noticeable and worthwhile gain. My previous card, a Sapphire Fury Nitro, managed a 5% improvement with huge pain and needed +60mV for it, and everyone who has had an AMD card knows what that means: basically +25% on top of already high power draw and +10 C on temps. Here the temperature went up by 7 C (from 43 to 50) and power draw from the wall still didn't reach 290W, with an overclocked i5 4690K at 4.5 GHz 1.35V.

Also I am very positively surprised by performance; I expected less. The card manages around 60 FPS in The Witcher 3 Ultra without Hairworks in an area as demanding as Crookback Bog, one of the most demanding areas in the game. I was getting 35 FPS on the Fury there when testing the same settings; now it's 56 FPS at stock and 61 after OC.

So I will have to reconsider my SLI plans, depending on what comes out in January and how (actual availability, price, performance, etc.)
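For what it's worth, a quick sanity check of those FPS figures in Python (the numbers are the ones quoted above; nothing else is measured):

```python
def pct_gain(before, after):
    """Percentage improvement going from `before` to `after` FPS."""
    return (after - before) / before * 100

# Figures quoted above: 35 FPS on the Fury, 56 stock, 61 after OC
print(f"Fury -> stock 1080: +{pct_gain(35, 56):.0f}%")  # +60%
print(f"stock -> OC:        +{pct_gain(56, 61):.0f}%")  # +9%
```

So the ~9% OC gain lines up with the 9-11% quoted for the synthetic tests.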


----------



## bloot

I knew you wouldn't be disappointed congrats and enjoy @Krzych04650!


----------



## Krzych04650

Quote:


> Originally Posted by *bloot*
> 
> I knew you wouldn't be disappointed congrats and enjoy @Krzych04650!


Thanks. Definitely a recommended card; I can't see what those 20% more expensive models could have that justifies the price difference.


----------



## ejohnson

1080 ordered!


----------



## Fediuld

Would the Strix BIOS work on an Armor OC?
It seems to work on the Palit one.


----------



## Derek1

Must Read!!
Was posted today.
http://www.gamersnexus.net/news-pc/2661-evga-mosfet-failure-possible-from-thermal-runaway-scenario


----------



## keikei

Nvidia Drivers WHQL 375.70


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Must Read!!
> Was posted today.
> http://www.gamersnexus.net/news-pc/2661-evga-mosfet-failure-possible-from-thermal-runaway-scenario


Each day I get more convinced that this thermal pad mod is a must. It'll just take too long to get those pads here in Brazil, and I'm not really sure what quality EVGA will be sending us. After all this mess, don't even try to push those cheap 1W/mK pads on me...

I'm close to buying some Arctic 6W/mK pads myself. The only thing is I don't know the necessary thickness. I can only find pads up to 1.5mm thick, and I believe the gap between the PCB and backplate is bigger than that.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Each day I get more convinced that this thermal pad mod is a must. It'll just take too long to get those pads here in brazil and I'm not really sure what quality EVGA will be sending us. After all this mess don't even try to push those 1w/mK chep pads on me...
> 
> I'm almost buying some Arctic 6w/mK myself. Only thing is that I don't know the thickness necessary. I can only find up to 1,5mm thick pads and I believe the gap between PCB and backplate is bigger than that.


Didn't someone find that they needed 3mm there?

Not sure what to tell you, Junior. The whole ordeal has been a PITA!
And only now is EVGA offering to replace the cards for those who don't want to attempt the thermal pad mod on their own.
Too late for me, as I received my Hybrid kit today at a cost of C$275 after currency conversion, shipping and duty/taxes.
Total price to date for my FTW Gaming: C$1300.


----------



## GAEVULK

very very happy with my 1080


----------



## Derek1

Nice and Clean!


----------



## x-apoc

Great looking system.

I would use this coolant instead if it was my pc. http://www.frozencpu.com/products/17691/ex-liq-272/Mayhems_Aurora_Coolant_Concentrate_-_250mL_-_Nebula_Blue_.html


----------



## ucode

This is what I was talking about earlier: can displayed clocks be trusted?
Quote:


> Overclocking on both our GTX 1050 and GTX 1050 Ti cards was more problematic than on previous products. It looks as though NVIDIA has capped memory overclocking to 2002 MHz. GPU overclocking is limited too, to a maximum boost clock of 1911 MHz, but the cap doesn't seem to be implemented correctly *as clocks beyond that will provide more GPU performance with the displayed number staying at 1911 MHz*.


----------



## feznz

Quote:


> Originally Posted by *x-apoc*
> 
> Great looking system.
> 
> I would use this coolant instead if it was my pc. http://www.frozencpu.com/products/17691/ex-liq-272/Mayhems_Aurora_Coolant_Concentrate_-_250mL_-_Nebula_Blue_.html


Problem being it's sold as a showcase fluid, lasting from a few days to maybe a month before the Aurora particles drop out, and from what I've heard it's an absolute nightmare to clean out afterwards, to the point of disassembling the blocks and throwing out the rads.


----------



## x-apoc

Real shame if that's the case. It looks amazing.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Each day I get more convinced that this thermal pad mod is a must. It'll just take too long to get those pads here in brazil and I'm not really sure what quality EVGA will be sending us. After all this mess don't even try to push those 1w/mK chep pads on me...
> 
> I'm almost buying some Arctic 6w/mK myself. Only thing is that I don't know the thickness necessary. I can only find up to 1,5mm thick pads and I believe the gap between PCB and backplate is bigger than that.


Hey Junior, the Hybrid kit for the FTW includes a strip of thermal pad that's supposed to be attached between the back of the PCB and the backplate. It is 2mm thick.
I don't know if that's the right thickness, the way things have been going with EVGA recently, but it's a start anyway.


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Hey Junior, in the Hybrid Kit for the FTW they include a strip of thermal pad that is supposed to be attached to the back of the PCB and back plate. It is 2mm thick.
> I don't know if that is going to be the right thickness the way things have been going with EVGA recently, but its a start anyway.


That's really for the backplate side of the PCB? If so, it's going to be a pain finding 2mm thick thermal pads; I've only seen 0.5, 1.0 and 1.5mm ones.

Does it come with pads for the mosfets also?

Thanks for the info









Already set my card back to a 215W limit instead of the 280W it was at before. Shouldn't be too much of an issue, because nothing I'm playing right now stresses the card that much. But I'll certainly do the mod when I get the H100i installed.


----------



## ssgwright

any great bios come out recently?


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> That's really for the backplate side of the pcb? If it is, it's going to be a pain finding 2mm thick thermal pads. I've seen only 0,5 1,0 and 1,5mm thick ones.
> 
> Does it come with pads for the mosfets also?
> 
> Thanks for the info
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Already set my card back to 215W limit instead of 280W it was before. Shouldn't be too much of an issue because all I'm playing right now doesn't stress the card that much. But I'll certainly do that mod when I get the H100i installed.


Here ya go, take your pick: http://www.frozencpu.com/cat/l2/g8/c487/list/p1/Thermal_Interface-Thermal_Pads_Tape.html

Yes, it says to be attached in between the PCB and backplate. The kit has a heatsink with a fan that extends to just over the mosfets, and along the underside of the sink is a strip of pad to cover them. I will be replacing that with the Fujipoly.

The Fujipoly pads come in 2.0, 2.5 and 3.0mm. Expensive, I know, but I'm in so deep now, what's another $30?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Here ya go, take your pick. http://www.frozencpu.com/cat/l2/g8/c487/list/p1/Thermal_Interface-Thermal_Pads_Tape.html
> 
> Yes it says to be attached inbetween the pcb and backplate. The kit has a heat sink with fan that extends to just over the mosfets and along the underside of the sink is a strip of pad to cover them. I will be replacing that with the fujipoly.
> 
> The fujipoly come in 2.0, 2.5 and 3.0. Expensive I know but I am in so deep now what is another $30.


So it should be something like 2.0mm for everything on the backplate, around 1.5mm for the mosfets on the frontplate, and since the VRAM modules sit a tad higher, maybe 1.0mm? God, I'm going crazy with this... I'm in the same boat, buddy. I'm in so deep with this card that I MUST do everything right. It'll be a great relief when I get the pads + watercooling settled...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> So it should be something like 2.0mm for everything on the backplate and something like 1,5mm for mosfets on the frontplate? VRAM modules are a tad bit higher so I guess 1,0mm? God, I'm going crazy with this...I'm the same boat, buddy. I'm in so deep with this card that I MUST do everything right. It'll be just a great relief when I get pads + watercooling settled...


Yes, 2mm for the backplate/PCB, I think. The strip on the bottom of the heatsink that covers the mosfets is only 1mm. The pads on the underside of the copper plate that go over the VRAM chips are 1mm.
Check out this teardown video of the FTW Hybrid:
http://www.gamersnexus.net/guides/2581-evga-gtx-1080-ftw-hybrid-tear-down

There is no frontplate, as you will see, so I'm unsure what thickness you'd need there.
Better slightly too thick than too thin, so don't be afraid to overestimate.

ETA: The R22 inductors look to be taller than the mosfets.


----------



## SIDWULF

What board partner should I get and why?


----------



## boredgunner

Quote:


> Originally Posted by *SIDWULF*
> 
> What board partner should I get and why?


Will you be water cooling or no? If not, Zotac AMP Extreme, Gigabyte XTREME Gaming, or MSI GAMING/GAMING X. These all have great coolers. All GTX 1080s overclock essentially the same due to BIOS limitations.


----------



## SIDWULF

Quote:


> Originally Posted by *boredgunner*
> 
> Will you be water cooling or no? If not, Zotac AMP Extreme, Gigabyte XTREME Gaming, or MSI GAMING/GAMING X. These all have great coolers. All GTX 1080s overclock essentially the same due to BIOS limitations.


No water cooling. I heard that the GTX 1080's clock frequency spikes a lot, which causes microstutter. Have you noticed this? It seems to happen with custom cooler solutions as well.

http://videocardz.com/60838/msi-geforce-gtx-1080-gaming-x-is-much-better-than-founders-edition

http://www.hardocp.com/article/2016/06/27/asus_rog_gtx_1080_strix_gaming_video_card_review/4#.WBVXVeCp7qB

The MSI card does not appear to suffer from these frequency spikes at all, unlike the ASUS Strix and Founders Edition above.


----------



## boredgunner

Quote:


> Originally Posted by *SIDWULF*
> 
> No water cooling. I heard with the GTX 1080's that the clock frequency spikes alot which causes microstutter. Have you noticed this? Seems to happen with custom cooler solutions also.
> 
> http://videocardz.com/60838/msi-geforce-gtx-1080-gaming-x-is-much-better-than-founders-edition
> 
> http://www.hardocp.com/article/2016/06/27/asus_rog_gtx_1080_strix_gaming_video_card_review/4#.WBVXVeCp7qB
> 
> The MSI card does not appear to suffer from these frequency spikes at all like the ASUS Strix and Founders Edition do above.


I experience no such stutters, although that HardOCP review suggests it's related to the power limit, which makes sense. I've always maxed mine out.


----------



## SIDWULF

Quote:


> Originally Posted by *boredgunner*
> 
> I experience no such stutters, although that HardOCP review suggests it is related to the power limit which makes sense. I've always maxed mine out.


Why would ASUS ship the cards like that? They are clearly not keeping a stable frequency at default settings; GPU Boost keeps hitting a limit, causing the spikes and microstutter. I will go with MSI, because it seems like they actually tested their cards!

Even worse, the Founders Edition's clock stability is a mess!

Why would anyone buy these cards knowing this?


----------



## Benjiw

Quote:


> Originally Posted by *SIDWULF*
> 
> Why would asus ship the cards like that? They are clearly not keeping a stable frequency at default settings and gpu boost is hitting a limit continuously causing the spikes and microstutter. I will go with MSI cause it seems like they acctually tested their cards!
> 
> Even worse for the founders edition its clock stability is a mess!
> 
> Why would anyone buy a these cards knowing this?


ASUS Strix cards have been power limited since the 9xx series, so it wouldn't surprise me if the new 10x0 cards are the same.


----------



## Vellinious

Pushing the GPUs a bit. Love these things.

1080 FTWs @ 2164 / 5499

http://www.3dmark.com/3dm/15733297


----------



## kx11

you're lucky those FTWs didn't burn on you


----------



## Vellinious

Quote:


> Originally Posted by *kx11*
> 
> you're lucky those FTWs didn't burn on you


They're under water....the VRMs stay plenty cool. lol


----------



## kx11

That's good. My friend bought 2 FTWs and their temps never go above 20c, yet they freakin' shut down every 5 minutes on him.









Bad VRMs and broken sensors on both cards; talk about bad luck.


----------



## Vellinious

The temps shown in GPU-Z and the overclocking programs are for the core. If you want to measure how hot the VRMs run, you need to use an infrared thermometer.

I'd wager that if he replaced the thermal pads, they'd be fine. I ran both of my FTWs at 2100+ on air and had no issues whatsoever. But... I had replaced the thermal pads.


----------



## MACH1NE

Hey guys, I purchased the EVGA 1080 SC. I was wondering if we can flash a modded BIOS on these that disables the boost behavior and keeps a steady overclock. IIRC this was possible on my EVGA 980 Ti Classy.


----------



## Vellinious

No


----------



## ralphi59

Hi,
Use a fixed point on the Afterburner voltage/frequency curve.


----------



## Benjiw

Quote:


> Originally Posted by *Vellinious*
> 
> Pushing the GPUs a bit. Love these things.
> 
> 1080 FTWs @ 2164 / 5499
> 
> http://www.3dmark.com/3dm/15733297


Great score. I got my i7 6700K the other day for my 980 Tis and got 9k in Time Spy, but I've not flashed them or overclocked the i7.


----------



## keikei

Quote:


> Originally Posted by *boredgunner*
> 
> Will you be water cooling or no? If not, Zotac AMP Extreme, Gigabyte XTREME Gaming, or MSI GAMING/GAMING X. These all have great coolers. All GTX 1080s overclock essentially the same due to BIOS limitations.


Why the zotac amp extreme?


----------



## kikibgd

Tomorrow I go pick up a Gigabyte XTREME Gaming; hope I get a good one.
Going to be quite the lottery day, as I also need to pick up an Asus PG279Q Swift.


----------



## keikei

Quote:


> Originally Posted by *kikibgd*
> 
> Tomorrow I go pick up Gigabyte XTREME Gaming hope I get good one.
> Going to be some lottery day need to pick up Asus 279q swift also


Let us know how it goes. I plan on returning my EVGA; I'm not just gonna wait for it to die on me. The Gigabyte is definitely on my list. The MSI version looks super cool, but won't fit my case.


----------



## Spiriva

Palit has a new BIOS out for the "Palit 1080 GameRock Premium"

*New:*
Board power limit
*Target: 230.0 W
Limit: 276.0 W
Adj. Range: -61%, +20%*
Thermal limits
Rated: 83.0C
Max: 92.0C

*Old:*
Board power limit
*Target: 200.0 W
Limit: 240.0 W
Adj. Range: -55%, +20%*
Thermal limits
Rated: 83.0C
Max: 92.0C

http://www.palit.com/palit/vgapro.php?id=2614&lang=en&pn=NEB1080H15P2-1040G&tab=do
(added 2016 10 28)


----------



## bloot

Quote:


> Originally Posted by *Spiriva*
> 
> Palit have a new bios out for the "Palit 1080 GameRock Premium"
> 
> *New:*
> Board power limit
> *Target: 230.0 W
> Limit: 276.0 W
> Adj. Range: -61%, +20%*
> Thermal limits
> Rated: 83.0C
> Max: 92.0C
> 
> *Old:*
> Board power limit
> *Target: 200.0 W
> Limit: 240.0 W
> Adj. Range: -55%, +20%*
> Thermal limits
> Rated: 83.0C
> Max: 92.0C
> 
> http://www.palit.com/palit/vgapro.php?id=2614&lang=en&pn=NEB1080H15P2-1040G&tab=do
> (added 2016 10 28)


It's also available for the Super JetStream http://www.palit.com/palit/vgapro.php?id=2619&lang=en&pn=NEB1080S15P2-1040J&tab=do

I flashed it and it's the same one they released on September 21 (86.04.3B.00.66). I don't know why they changed the date; the zip name is VGA_BIOS_Upgrade_1024F, which may indicate the date as it did with the previous one (VGA_BIOS_Upgrade_0921).

Trying to flash it using the included exe, it says there's no need to update, so no doubt it's the same, at least the Super JetStream one.


----------



## Krzych04650

Quote:


> Originally Posted by *Spiriva*
> 
> Palit have a new bios out for the "Palit 1080 GameRock Premium"
> 
> *New:*
> Board power limit
> *Target: 230.0 W
> Limit: 276.0 W
> Adj. Range: -61%, +20%*
> Thermal limits
> Rated: 83.0C
> Max: 92.0C
> 
> *Old:*
> Board power limit
> *Target: 200.0 W
> Limit: 240.0 W
> Adj. Range: -55%, +20%*
> Thermal limits
> Rated: 83.0C
> Max: 92.0C
> 
> http://www.palit.com/palit/vgapro.php?id=2614&lang=en&pn=NEB1080H15P2-1040G&tab=do
> (added 2016 10 28)


I would like to see this for the JetStream too. The card hits its max power target quite often under heavy load, in Time Spy for example, and throttles a bit. That's only a problem if your card can go over 2050 MHz, but still.


----------



## arrow0309

Quote:


> Originally Posted by *GAEVULK*
> 
> 
> 
> 
> 
> very very happy with my 1080


+Rep!
Nice rig and wc setup you got there Gaetano!















The temps are OK








Btw, did you see my new system as well?









http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/7410#post_25609792


----------



## JoeDirt

Is it ugly? Yes. Does it work, and was it free? Yes. Asus Strix 1080 mated with an H105. The junk fans will soon be replaced. Lowered temps by 15c. Could be better, but I'm not going to screw with it anymore.


----------



## Dragonsyph

Quote:


> Originally Posted by *JoeDirt*
> 
> Is it ugly? Yes. Does it work and was it free? Yes. Asus Strix 1080 mated with a H105. Junk fans will soon be replaced. Lowered temps by 15c. Could be better but I'm not going to screw with it anymore.


15c is still good. For better temps with that cooler you're gonna have to get rid of the concave cooler head; lap it flat or something.


----------



## JoeDirt

Quote:


> Originally Posted by *Dragonsyph*
> 
> 15c is still good, for better temps using that cooler you are gonna have to get rid of the concave cooler head. Lap it to flat or something.


That's what I was thinking too. But too much work, and I just don't care that much anymore.


----------



## Koniakki

Quote:


> Originally Posted by *Spiriva*
> 
> Palit have a new bios out for the "Palit 1080 GameRock Premium"
> 
> *New:*
> Board power limit
> *Target: 230.0 W
> Limit: 276.0 W
> Adj. Range: -61%, +20%*
> Thermal limits
> Rated: 83.0C
> Max: 92.0C
> 
> *Old:*
> Board power limit
> *Target: 200.0 W
> Limit: 240.0 W
> Adj. Range: -55%, +20%*
> Thermal limits
> Rated: 83.0C
> Max: 92.0C
> 
> http://www.palit.com/palit/vgapro.php?id=2614&lang=en&pn=NEB1080H15P2-1040G&tab=do
> (added 2016 10 28)


Just flashed it on my GTX 1080 GameRock (non-Premium). On to testing now.

The report below for my card confirms the 230/276W PL.









==============NVSMI LOG==============

Timestamp : Sun Oct 30 22:13:10 2016
Driver Version : 373.06

Attached GPUs : 1
GPU 0000:01:00.0
Power Readings
Power Management : Supported
Power Draw : 8.17 W
Power Limit : 230.00 W
Default Power Limit : 230.00 W
Enforced Power Limit : 230.00 W
Min Power Limit : 90.00 W
Max Power Limit : 276.00 W
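If anyone wants to pull those numbers out of the log programmatically instead of eyeballing it, here's a rough Python sketch that parses the wattage fields from `nvidia-smi -q -d POWER` style output. The sample string is just the power section of the log above; the parsing is my own, not anything official:

```python
import re

# Sample: the power section of the NVSMI log above, verbatim.
NVSMI_LOG = """\
Power Readings
Power Management : Supported
Power Draw : 8.17 W
Power Limit : 230.00 W
Default Power Limit : 230.00 W
Enforced Power Limit : 230.00 W
Min Power Limit : 90.00 W
Max Power Limit : 276.00 W
"""

def parse_power(log):
    """Collect every 'Name : <number> W' field into a dict of floats."""
    fields = {}
    for line in log.splitlines():
        m = re.match(r"\s*(.+?)\s*:\s*([\d.]+)\s*W\s*$", line)
        if m:
            fields[m.group(1)] = float(m.group(2))
    return fields

power = parse_power(NVSMI_LOG)
print(power["Power Limit"], power["Max Power Limit"])  # 230.0 276.0
```

Handy if you want to log the enforced limit over time while testing a new BIOS.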


----------



## Vellinious

Got both GPUs to run 2202 through Time Spy... the score dropped off a tad, though, so I'll back the clocks off until I get the T4 BIOS flashed and can feed them a little more voltage.


----------



## ucode

Nice


----------



## Dragonsyph

Decided to get the EVGA GeForce GTX 1080 FTW HYBRID GAMING; it's just gonna take 2 more weeks of paychecks to save up for it, rofl. God, I can't wait. Hope I get a good overclocker, 2165 or so.


----------



## SIDWULF

I'm trying to choose a board partner but I have noticed something peculiar going on between the founders edition and some custom solutions.

http://videocardz.com/60838/msi-geforce-gtx-1080-gaming-x-is-much-better-than-founders-edition

"those spikes cause micro-stuttering, which negatively affects gaming experience"

The MSI 1080 Gaming X seems to be the only card that has stable frequency across the board.

https://nl.hardware.info/reviews/6765/17/msi-geforce-gtx-1080-gaming-x-review-de-eerste-custom-gtx-1080-duurtest

http://www.hardocp.com/article/2016/09/26/msi_geforce_gtx_1080_gaming_x_8g_video_card_review/5#.WBQtohmp7qA

I would like to buy the ASUS Strix (it seems like the most popular) but it also seems to have clock stability issues just like the Founders Edition:

https://nl.hardware.info/reviews/6784/17/asus-geforce-gtx-1080-strix-review-kleurrijk-alternatief-duurtest

http://www.hardocp.com/article/2016/06/27/asus_rog_gtx_1080_strix_gaming_video_card_review/4#.WBQtkuCp7qC


----------



## Dragonsyph

Quote:


> Originally Posted by *SIDWULF*
> 
> I'm trying to choose a board partner but I have noticed something peculiar going on between the founders edition and some custom solutions.
> 
> http://videocardz.com/60838/msi-geforce-gtx-1080-gaming-x-is-much-better-than-founders-edition
> 
> "those spikes cause micro-stuttering, which negatively affects gaming experience"
> 
> The MSI 1080 Gaming X seems to be the only card that has stable frequency across the board.
> 
> https://nl.hardware.info/reviews/6765/17/msi-geforce-gtx-1080-gaming-x-review-de-eerste-custom-gtx-1080-duurtest
> 
> http://www.hardocp.com/article/2016/09/26/msi_geforce_gtx_1080_gaming_x_8g_video_card_review/5#.WBQtohmp7qA
> 
> I would like to buy the ASUS Strix (it seems like the most popular) but it also seems to have clock stability issues just like the Founders Edition:
> 
> https://nl.hardware.info/reviews/6784/17/asus-geforce-gtx-1080-strix-review-kleurrijk-alternatief-duurtest
> 
> http://www.hardocp.com/article/2016/06/27/asus_rog_gtx_1080_strix_gaming_video_card_review/4#.WBQtkuCp7qC
> 
> This is frustrating


Most cards do that because of how GPU Boost 3.0 works. They will downclock anytime the card thinks it can get away with it.


----------



## SIDWULF

Quote:


> Originally Posted by *Dragonsyph*
> 
> Most cards do that because of how GPU boost 3.0 works. They will down clock anytime the card thinks it can get away with it.


Some custom cards keep a stable frequency in the same tests, so I don't think this is necessarily how GPU Boost is supposed to work; constant frequency spikes would cause microstutter and frame latency issues. Something is wrong with the cards mentioned.


----------



## juniordnz

Stable frequency is achieved through low temps and a good power delivery system. Choose a card that has both and you're good to go.

If you're staying on air I'd suggest the AMP! Extreme.


----------



## Vellinious

Quote:


> Originally Posted by *SIDWULF*
> 
> Some custom cards have stable frequency on the same tests, so I don't think it's necessarily the way gpu boost is supposed to work, with constant frequency spikes which would cause micro stutter and frame latency issues. Something is wrong with the cards mentioned.


That all comes down to temps, and when/if it's hitting the thermal clock-down points. Some coolers are a tad better than others, and some silicon is a tad better. On air with Pascal, your best bet (not counting the silicon lottery, which is a huge factor in overclockability) is to find the card with the best cooling solution.

The hotter the GPU runs, the less efficient it is, and the more voltage it'll need for a given clock. When it hits those thermal points, it's either going to up the voltage or drop the clock a step... depending, of course, on how the voltage/frequency curve is set.

All in all, if you want a card whose clocks don't jump around, the Pascal GPUs aren't for you, unless you're willing to put in the time to create a custom voltage/frequency curve that gives you your desired results... which you'll have to do on pretty much any of them.

G'luck with your selections.
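To make the "drop the clock a step" behavior concrete, here's a toy Python model. The ~13 MHz bin size is what Pascal owners generally report; the temperature thresholds here are invented purely for illustration, since the real ones vary per card and curve:

```python
BIN_MHZ = 13  # Pascal boost bins are roughly 13 MHz apart

def boost_clock(base_boost_mhz, temp_c, thresholds=(37, 48, 59)):
    """Drop one ~13 MHz bin for each temperature threshold crossed.
    The threshold values are made up for illustration only."""
    bins_dropped = sum(temp_c >= t for t in thresholds)
    return base_boost_mhz - bins_dropped * BIN_MHZ

print(boost_clock(2100, 30))  # 2100 - below every threshold, no offset
print(boost_clock(2100, 50))  # 2074 - two thresholds crossed, two bins dropped
```

That's why the same card will show different "stable" clocks at 40 C under water versus 70 C on air.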


----------



## SIDWULF

Quote:


> Originally Posted by *Vellinious*
> 
> That all comes down to temps, and when / if it's hitting the thermal clock down points. Some coolers are a tad better than others. Some silicon is a tad better. On air cooling with pascal, your best bet (not counting the silicon lottery, which will be a huge player in overclockability), is to find the GPU with the best cooling solution.
> 
> The hotter the GPU runs, the less efficient it is, and the more voltage it'll need for a specific clock. When it hits those thermal points, it's either going to up the voltage or drop the clock a step...depending of course on how the voltage / frequency curve is set.
> 
> All in all, if you're wanting a card that's not going to have the clocks jump around, the pascal GPUs aren't for you, unless you're willing to put the time in and create a custom voltage / frequency curve to give you your desired results...which you'll have to do on pretty much any of them.
> 
> G'luck with your selections.


I know how it works... There is clearly something wrong with the Founders Edition and ASUS Strix cards here; their clock frequency is all over the place compared to MSI's.


----------



## Vellinious

Quote:


> Originally Posted by *SIDWULF*
> 
> I know how it works...There is clearly something wrong with the founders edition and asus strix cards here...their clock frequency is all over the place compared to MSI


Lottery... they need more voltage, so the curve bounces the clocks around more. /shrug

I don't think there's anything "wrong with them". They're just working the way Boost 3.0 was intended to work.


----------



## SIDWULF

Quote:


> Originally Posted by *Vellinious*
> 
> Lottery....they need more voltage, so the curve bounces the clocks around more. /shrug
> 
> I don't think there's anything "wrong with them". Just working the way boost 3.0 was intended to work.


There's a clear pattern across the reviews: the ASUS custom cards' and Founders Edition's frequency is all over the place. So it can't be the silicon lottery; it's probably the choice of components on the boards themselves.


----------



## Vellinious

Quote:


> Originally Posted by *SIDWULF*
> 
> There's clearly a pattern between reviews and the asus custom cards and founder edition frequency is all over the place. So it cant be the silicon lottery. Probabaly the choice of components on the boards themselves.


The FEs have a terrible cooler... thus the temps are higher.

lol, you clearly have it figured out, so... why did you even ask?

Good evening. /wink


----------



## SIDWULF

Quote:


> Originally Posted by *Vellinious*
> 
> The FE's have a terrible cooler....thus the temps are higher.
> 
> lol, you clearly have it figured out., so.....why did you even ask?
> 
> Good evening. /wink


What about the asus strix? Why does it have a similar problem as the Founders Edition and the MSI doesn't?


----------



## chiknnwatrmln

The FE's clocks bounce around because of temperature and power limit, not the silicon lottery.


----------



## Dragonsyph

Some cards' BIOSes have higher power limits, like 110%, 120% or 130%. And all FE cards get hot fast and hit the thermal limit set in software, at which point they down-volt or downclock to reduce heat.

That's why I'm going with the EVGA GeForce GTX 1080 FTW HYBRID GAMING: it has a dual BIOS with one at 130% power, and core temps sit around 41C. The few reviews of it out there have seen a 2165 OC.


----------



## Jim86

My FE is clocked at 2126 and running at 49 degrees under full load, usually pretty stable with no crashes yet; I'll try 2151 soon.


----------



## wardo3640

Quote:


> Originally Posted by *Vellinious*
> 
> Pushing the GPUs a bit. Love these things.
> 
> 1080 FTWs @ 2164 / 5499
> 
> http://www.3dmark.com/3dm/15733297


Very nice numbers.

These are mine.


The most I can manage with my pair of Seahawk Xs is 2050 and 5500, and a Time Spy graphics score of 15,500. Thinking of repasting and putting some better pads on them.

Any recommendations on a BIOS for me?


----------



## Vellinious

Quote:


> Originally Posted by *wardo3640*
> 
> Very nice numbers.
> 
> These are mine.
> 
> 
> The most I can manage with my pair of Seahawk X's is 2050 and 5500 and a time spy gfx score of 15,500. Thinking of repasting and putting some better pads on it.
> 
> Any recommendations on a bios for me?


Is there a trick to getting the numbers from a pic of some AIO units? Or....


----------



## ROKUGAN

Quote:


> Originally Posted by *keikei*
> 
> Why the zotac amp extreme?


Temps. This review pretty much answers it (even though his unit wasn't a good overclocker).


----------



## 66racer

Hello,

I'm curious: my 1080 FTW doesn't down-clock at idle (with two 1080p monitors), so it stays at 17xx MHz. I remember this being an issue with dual monitors a while ago, and I think even with 4K monitors, but I disconnected the second monitor and it still does it, even after a reboot. It just stays at 17xx MHz, though it does boost like normal to a peak of around 1967 MHz. I'm not overly concerned, but since it's an ITX chassis I'd like it to idle if there's anything I can do about it. (Edit: just thought of something; on a single monitor, the back of the GPU still has the HDMI cable for my second screen connected. I may try disconnecting that as well when I'm back home.)

Anything I can try? Windows 10 Pro, and GeForce Experience is installed.

Also, I can't wait to get the thermal pads from EVGA to see if she'll OC a little better; my peak seems to be around 2050 MHz, so I just keep it stock. The good thing is my ITX chassis blows fresh air directly onto the GPU, or I'd be a little more concerned about the VRM issue.

Thanks!


----------



## juniordnz

Quote:


> Originally Posted by *66racer*
> 
> Hello,
> 
> Im curious, my 1080ftw doesnt down clock during idle (with two 1080p monitors) so it stays at 17xx mhz. I remember this being an issue with dual monitors a while ago and think even 4k monitors but I disconnected the second monitor and it still does it; even after a reboot. It just stays at 17xx mhz but does boost like normal to a peak of 1967ish mhz. Im not overly concerned but since it is an ITX chassis I would like it to idle if there is anything I can do about it. (edit: just thought of something, on single monitor the back of the gpu still has the HDMI cable for my second screen connected, I may try to disconnect that as well when Im back home)
> 
> Anything I can try? Windows 10 pro and Geforce experience is installed.
> 
> Also, cant wait to get the thermal pads from EVGA to see if she will OC a little better, my peak seems to be around 2050mhz so I just keep it stock. Good thing is my ITX chassis blows fresh air directly onto the GPU or I may be a little more concerned about the VRM issue.
> 
> Thanks!


That's weird. Mine down-clocks like it should when there's no 3D app running.

Do you have Chrome running? Did you disable hardware acceleration in Chrome? I had to do that with my 970 because it would constantly boost to 3D clocks when browsing in Chrome.

Take a good look at the apps that are running while you're idling; something is telling the card to boost.


----------



## x-apoc

Quote:


> Originally Posted by *66racer*
> 
> Hello,
> 
> Im curious, my 1080ftw doesnt down clock during idle (with two 1080p monitors) so it stays at 17xx mhz. I remember this being an issue with dual monitors a while ago and think even 4k monitors but I disconnected the second monitor and it still does it; even after a reboot. It just stays at 17xx mhz but does boost like normal to a peak of 1967ish mhz. Im not overly concerned but since it is an ITX chassis I would like it to idle if there is anything I can do about it. (edit: just thought of something, on single monitor the back of the gpu still has the HDMI cable for my second screen connected, I may try to disconnect that as well when Im back home)
> 
> Anything I can try? Windows 10 pro and Geforce experience is installed.
> 
> Also, cant wait to get the thermal pads from EVGA to see if she will OC a little better, my peak seems to be around 2050mhz so I just keep it stock. Good thing is my ITX chassis blows fresh air directly onto the GPU or I may be a little more concerned about the VRM issue.
> 
> Thanks!


Confirming mine also down-clocks.

Do you happen to have KBoost enabled? See if it idles once that's off.


----------



## 66racer

Quote:


> Originally Posted by *juniordnz*
> 
> That's weird. Mine downclocks like it should when there's no 3d app running.
> 
> Do you have chrome running? Did you disable hardware acceleration on chrome? Had to do that with my 970 because it would constantly boost to 3d clocks when browsing on chrome.
> 
> Take a good look at the apps that are running when you're Idling, theres something telling the cards to boost.


Ah, of course, I forgot about Chrome... I think I had hardware acceleration on because of an older device, so I bet it carried that over to the desktop too. I will check that out!

thanks


----------



## kikibgd

Bought a GTX 1080 Xtreme Gaming. It's a good card: no problems with it, no coil whine, no bent parts. It does run a bit hot in my case, though... I guess I need a new case.

Goes over 2050 on its own.


----------



## Dragonsyph

Quote:


> Originally Posted by *kikibgd*
> 
> bought GTX 1080 Xtreme Gaming its good card got no problems with it no coil whine no bent parts, its bit more hot in my case.. i guess i need new case...
> 
> goes over 2050 alone


Nice, and grats, 8))). That was one of the cards I was looking at; it seemed like a pretty well-built 1080. What temps are you getting? I think most reviews got around 65C.


----------



## Thetbrett

Quote:


> Originally Posted by *Krzych04650*
> 
> Point is that this "just different cooler" determines if he will get thermal throttling or not, assuming that he wants good noise levels. Don't spread misinformation, you are saying like there is no major difference between them, while in fact AMP Extreme is capable of efficient and quiet cooling while AMP is not. And we are talking about single card, and he asked about SLI on air, two AMPs wouldn't last 15 minutes of load.


Read the reviews; there's not much difference in temps. Certainly not worth the extra 200 dollars. I did my research.


----------



## x-apoc

After messing around with thermal pads, I was able to bring my EVGA 1080's temps down to 50-60C at 80% fan speed and 99% GPU load; VRAM usage was 7.4 GB. It's not looking pretty, but seeing the results, I'll fix that next, lol. Also, on less GPU-demanding games the temp was in the high 40s/low 50s. On the die I used GC-Extreme thermal paste.


----------



## kikibgd

Quote:


> Originally Posted by *Dragonsyph*
> 
> Nice and grats, 8))). That was one of the cards i was looking at, seemed like a pretty good built 1080. What are the temps you are getting? Think most reviews got around 65c.


71C with 29C in the room (air conditioning), but this case is very, very bad for air cooling.

I'm thinking of modding the front panel and putting in 2x 140mm fans.


----------



## 66racer

Quote:


> Originally Posted by *x-apoc*
> 
> After messing around with thermal pads, I was able to bring down 1080 evga temp to 50-60c at 80% fan speed at 99% gpu load / vram usage was 7.4gb. Its not looking pretty, but seeing results I will fix that next lol. Also on less gpu demanding games temp was high 40s low 50s. For the die I use GC-Extreme tp.


I'm looking forward to doing the thermal pad install as soon as mine get here.


----------



## NYU87

Just picked up a MSI GTX 1080 Gaming X. Ran Firestrike with a quick overclock, offset +160/450.

Graphics score of 25,134. Not bad.


----------



## Dragonsyph

Quote:


> Originally Posted by *NYU87*
> 
> Just picked up a MSI GTX 1080 Gaming X. Ran Firestrike with a quick overclock, offset +160/450.
> 
> Graphics score of 25,134. Not bad.


With +160 to the core what does it boost to?


----------



## juniordnz

For God's sake, please, PLEASE stop posting offsets. Post the actual core/mem clock.

Each card has a different base/boost/mem clock. Your offset tells us nothing.
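To illustrate why: the same offset lands on different final clocks depending on the card's factory boost. A quick sketch (the boost clocks below are just example numbers, not a spec table):

```python
# The reported clock is roughly factory boost + offset (before GPU Boost
# bins it further), so the same offset means different things per card.
factory_boost = {              # example factory boost clocks, MHz
    "FE": 1733,
    "MSI Gaming X (OC mode)": 1911,
}

offset = 160  # the "+160" someone might post

for card, boost in factory_boost.items():
    print(f"{card}: +{offset} -> ~{boost + offset} MHz")
```

Same "+160", roughly a 180 MHz difference in what the cards actually run at; hence the request for real core/mem clocks.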


----------



## Dragonsyph

Quote:


> Originally Posted by *juniordnz*
> 
> For God's sake, please, PLEASE stop posting offsets. Post the actual core/mem clock.
> 
> Each card has different base/bost/mem clock. Your offset tells us nothing.


----------



## Koniakki

Quote:


> Originally Posted by *juniordnz*
> 
> For God's sake, please, PLEASE stop posting offsets. Post the actual core/mem clock.
> 
> Each card has different base/bost/mem clock. Your offset tells us nothing.


+1


----------



## NYU87

Quote:


> Originally Posted by *Dragonsyph*
> 
> With +160 to the core what does it boost to?


Quote:


> Originally Posted by *juniordnz*
> 
> For God's sake, please, PLEASE stop posting offsets. Post the actual core/mem clock.
> 
> Each card has different base/bost/mem clock. Your offset tells us nothing.


Haha my bad.

Core was around 2117MHz and memory was 5500MHz.


----------



## wirefox

HAHA, 1/6/04 --- guess I haven't logged into 3DMark in a long time...



Zotac ArcticStorm

24,955

2088 / 5528 (where I run for BF1; could go higher on both but don't)


----------



## x-apoc

I don't have screenies, but I ran Fire Strike last night on my 1080 FTW: got a GPU score of ~23,500 and a combined of ~17,600 at default clocks. At 90% fan speed it never broke 60C; at 80% fan, 61C was the highest; at 100% I don't remember. The clock boosted to 1987 MHz, and once the temp went up it dropped to 1964 MHz.

I haven't figured out my max overclock yet, but I've been seeing artifacts when I try to exceed +100 MHz on the core with only +300 MHz on the memory.


----------



## Bishop07764

Quote:


> Originally Posted by *66racer*
> 
> Hello,
> 
> Im curious, my 1080ftw doesnt down clock during idle (with two 1080p monitors) so it stays at 17xx mhz. I remember this being an issue with dual monitors a while ago and think even 4k monitors but I disconnected the second monitor and it still does it; even after a reboot. It just stays at 17xx mhz but does boost like normal to a peak of 1967ish mhz. Im not overly concerned but since it is an ITX chassis I would like it to idle if there is anything I can do about it. (edit: just thought of something, on single monitor the back of the gpu still has the HDMI cable for my second screen connected, I may try to disconnect that as well when Im back home)
> 
> Anything I can try? Windows 10 pro and Geforce experience is installed.
> 
> Also, cant wait to get the thermal pads from EVGA to see if she will OC a little better, my peak seems to be around 2050mhz so I just keep it stock. Good thing is my ITX chassis blows fresh air directly onto the GPU or I may be a little more concerned about the VRM issue.
> 
> Thanks!


I have an MSI Gaming X EK, and I have the same issue. Mine down-clocks for a while after I install a new driver, then it goes right back to idling at 1700+. It does it just sitting on the desktop with nothing happening. Not a huge deal, but it is annoying. It still does it after I upgraded my monitor to a 1440p G-Sync, too. If you find a fix, please let me know.


----------



## NYU87

Playing around with MSI Afterburner, got an overclock of 2139-2126MHz on core and 1.09GHz on memory.

GPU score: 25,195

http://www.3dmark.com/fs/10653587


----------



## steeludder

[email protected] (1.46V!)
1080 @ 2215/5556 (power modded, no volt mod, i.e. severely volt-starved)

Time Spy Result: *8519*
http://www.3dmark.com/spy/668684

CPU/GPU/VRM on chilled water (~13C)

I'm targeting top spot for my CPU+GPU combo, but I struggle to extract more out of this box. I need 60 more points. Gonna experiment some more with ram timings, currently running my Gskill CL15 3000 kit at 3200(15-15-15-35 1T).

Any suggestion welcome!


----------



## Lays

Quote:


> Originally Posted by *steeludder*
> 
> [email protected] (1.46V!)
> 1080 @ 2215/5556 (power modded, no volt mod, i.e. severely volt-starved)
> 
> Time Spy Result: *8519*
> http://www.3dmark.com/spy/668684
> 
> CPU/GPU/VRM on chilled water (~13C)
> 
> I'm targeting top spot for my CPU+GPU combo, but I struggle to extract more out of this box. I need 60 more points. Gonna experiment some more with ram timings, currently running my Gskill CL15 3000 kit at 3200(15-15-15-35 1T).
> 
> Any suggestion welcome!


Run it on a fresh win 10 or win 8 install that's been stripped if you really want the bragging rights of having the "fastest" for your hardware category.

That 1080 is probably not volt starved, more volts will probably only slightly help a few mhz. You've gotta be subzero on these cards to really see voltage do anything at all for clock speeds.


----------



## steeludder

Quote:


> Originally Posted by *Lays*
> 
> Run it on a fresh win 10 or win 8 install that's been stripped if you really want the bragging rights of having the "fastest" for your hardware category.
> 
> That 1080 is probably not volt starved, more volts will probably only slightly help a few mhz. You've gotta be subzero on these cards to really see voltage do anything at all for clock speeds.


Well, the voltage limit in Afterburner is a solid line at "1" throughout the entire benchmark run so I would venture a guess that a few mV more wouldn't hurt. Especially since the GPU core temp shows a max of 22C under full load. Might not do wonders, but 50MHz on the core would probably give me the 60 points I need for absolute bragging rights!









But then again, I'm not gonna do a hard volt mod on this card, so it's kinda moot. I could attempt a cross-flash (the Strix T4 BIOS looks potent) but I'm wary of the consequences; I just don't wanna brick it. Meh.


----------



## Spiriva

Is the T4 BIOS the best BIOS out there yet for the 1080s? Or is there another one to try?


----------



## WoWScoty

Quote:


> Originally Posted by *Koniakki*
> 
> Just flashed it on my GTX 1080 GamerRock(non Premium). On to testing now.
> 
> The report below for my card confirms the 230/276w PL.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ==============NVSMI LOG==============
> 
> Timestamp : Sun Oct 30 22:13:10 2016
> Driver Version : 373.06
> 
> Attached GPUs : 1
> GPU 0000:01:00.0
> Power Readings
> Power Management : Supported
> Power Draw : 8.17 W
> Power Limit : 230.00 W
> Default Power Limit : 230.00 W
> Enforced Power Limit : 230.00 W
> Min Power Limit : 90.00 W
> Max Power Limit : 276.00 W


Hello,

I also have the GameRock (non-Premium) edition card and I have already flashed the old BIOS upgrade. However, when I try to flash using the Palit tool it says "No need to update your VGA BIOS! (or not support for your VGA card)". I have tried extracting the .rom file from the Dest_BIN.exe, but it is password protected. Can you share the new (10.28.2016) BIOS?
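As an aside, the NVSMI log format quoted above is easy to check programmatically; here's a minimal sketch that pulls the power limits out of such a report, assuming the `Key : Value` layout shown in the quote (the report text is copied from the quoted output):

```python
# Parse "Key : Value" lines from an nvidia-smi -q style power report.
report = """\
Power Limit                 : 230.00 W
Default Power Limit         : 230.00 W
Min Power Limit             : 90.00 W
Max Power Limit             : 276.00 W
"""

limits = {}
for line in report.splitlines():
    key, _, value = line.partition(":")
    limits[key.strip()] = float(value.strip().rstrip(" W"))

print(limits["Max Power Limit"])   # 276.0
```

Handy for confirming a flashed BIOS actually raised the 230/276 W limits without eyeballing the whole log.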


----------



## Koniakki

Quote:


> Originally Posted by *WoWScoty*
> 
> Hello,
> 
> I also have the GameRock(not Premium) edition card and I have already flashed the old bios upgrade. However when I try to flash using the Palit tool it says "No need to update your VGA BIOS!\n\n(or not support for your VGA card)" I have tried extracting the .rom file from the Dest_BIN.exe, but it is password protected. Can you share the new (10.28.2016) BIOS ?


I should clarify that this is the "updated" BIOS for the GameRock Premium from TP, dated 21/9, which some say is the same as the one released on 28/10. This is not the newest 28/10 BIOS.

And to be honest, I have my doubts that it's the same, for the simple reason that Palit's website download options for the GameRock Premium already had that 21/9 BIOS until it was replaced with the newest one on 28/10.

So it doesn't seem logical for Palit to re-upload the same BIOS with just a different upload date as the only difference.

Or it is indeed the same, and that's the reason it says no need to update (since we already "have" the updated BIOS), or we simply can't fool Palit's tool into thinking this is a GameRock Premium and not another model.

So, as WoWScoty requested, it would be great if someone who flashed the latest 28/10 GameRock Premium BIOS could share it, please.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Seems like I have a bad hardware year... My replacement GTX 1080 Classified has some led bleed on the E from the EVGA logo, kinda bothers me and the coil whine is really annoying. I'm getting a shipping label from EVGA to retun it... I really hope third time is a charm... If I have a card with no (or minimal coil whine) and a clock of 2100 I would be super happy. Every time I play a game the card whines like it is dying and it is kinda loud too. Any of you guys have coil whine on their card??
> 
> And since I am still on a 1080p screen I really want to upgrade to a new monitor (GTX 1080 is really overpowered for 1080p screen
> 
> 
> 
> 
> 
> 
> 
> )But I am not really sure if I should go 1440 or 4K... I mean 4K is kinda the future, and I will keep the monitor probably for 5-10 years. All media is going to 4K (movies, games, series,...) and only games are 1440. But if I only get 30fps in my games... then yeah it's not really fun too.
> I just don't really know what I should look at/ buy. I do want at least an IPS panel with G-sync and 1440p or 4K. I have my eye on the ASUS PG279Q and ASUS PG279AQ. Any tips or ideas would be welcome


I have the ASUS PG278Q; it was the first monitor ever to feature G-SYNC. I've never had any problems with it.

The only annoying thing, if you like to leave your PC on overnight (always running) like I do, is that the monitor switches itself on... and if you use your surround sound system as your alarm, you get the "duh duh" Windows sound. When I check to see why it's switching itself on, all I can see is a little message in the bottom right corner saying "G-SYNC monitor detected". To be fair, they probably *all* do it.

I also have a 4K TV. I paid over 1000 for it last year (when 4K TVs first came out). The detail is phenomenal. Amazing... and it's so big, and the detail is *still* there at this enormous size, that it almost looks 3D (until the novelty wears off)... *but* you have to get over the fact that it's not G-SYNC anymore; in fact, it's not even a gaming monitor, and playing on the TV you *really notice it*. So I'm constantly left with a choice between the faster 1440 monitor and the bigger/more detailed 4K TV. At times I wish I just had a normal 1080p TV and a 4K gaming monitor. But I suppose when more film content begins to be released in 4K, that opinion may change.

Also, 1080p on a 1080 card is ridiculous. Completely wasted. (But you did say that.)


----------



## bloot

Quote:


> Originally Posted by *Koniakki*
> 
> I should clarify that this is the "updated" bios for the GameRock Premium from TP, datng 21/9 which some say its the same as the one released on 28/10. This is not the newest 28/10 bios.
> 
> And to be honest I have my doubts if its the same, just for the reason that over Palit's website download options for the GameRock Premium there was already that 21/9 released bios until it was replaced with the newest one on 28/10.
> 
> So it doesnt seem logical for Palit to re-upload the same bios with just different upload dates as the only difference.
> 
> Or its indeed the same and that's the reason why it says no need to update since we already "have" the updated bios or simply we can't fool Palit from thinking this is indeed a GameRock Premium and not another model.
> 
> So as WoWScoty requested, it would be great if someone who flashed the latest 28/10 GameRock Premium bios can share it please.


It's the same, only change is nvflash version https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=https%3A%2F%2Fforums.overclockers.ru%2Fviewtopic.php%3Fp%3D14223255%23p14223255&edit-text=


----------



## Koniakki

Quote:


> Originally Posted by *bloot*
> 
> It's the same, only change is nvflash version https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=https%3A%2F%2Fforums.overclockers.ru%2Fviewtopic.php%3Fp%3D14223255%23p14223255&edit-text=


Updated Nvflash. Now it makes sense.


----------



## 0gata

Quote:


> Originally Posted by *tin0*
> 
> As promised I'm sharing the MSI GeForce GTX 1080 GAMING Z BIOS. This BIOS has higher clocks and higher TDP limits (stock boost to 1911MHz, depending on your card will result in well over 2000MHz default effective core clock). The .rar file attached contains a batch file which you need to run in order to flash (also see included flash guide). Running the batch file you will be asked whether you want to flash the GAMING Z BIOS with OC mode or GAMING mode enabled by default.
> 
> 
> 
> *Flashing is at your own risk, I am in no way responsible for possible damage to your card(s). To avoid strange behaviour and/or lower scores, I suggest to use this BIOS only on MSI GAMING PCB based graphics cards for now (MSI GAMING Z, GAMING X, GAMING, ARMOR, Sea Hawk EK).
> 
> When I get home later, I will try it on my MSI GTX 1080 ARMOR 8G OC. Let me know how it works out for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX_1080_GAMING_Z_8G_602-V336-09S_vbios.zip 2987k .zip file


Well, I had the Gaming X on the stock OC BIOS; by default it reached 2025 MHz, stabilizing at 1987 MHz in 3DMark/Heaven.

I didn't do much testing, then updated to the BIOS you posted.

Here are 2 screenshots ;p

http://prnt.sc/d1vqrt

http://prnt.sc/d1wy1q

Now I have a Gaming Z card which sits at 2075 on the stock OC and stabilizes at 2063 in games and 2050 in benchmarks (I added +12 MHz, so it's now 2087).

In benchmarks with the fan profile on, it never goes above 61-63C (so it holds 2050 MHz), and in my games it doesn't even break 45C, though I haven't played a lot yet.

(BTW, my resolution is just 1200p and I use Vsync (60 FPS cap).)

Anyway, thanks for that BIOS; it's just one click in the MSI Gaming App to get an unnecessarily fast card now.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> I have the ASUS PG278Q; it was the first "ever" monitor to feature G-SYNC. I've never had any problems with it.
> 
> Only annoying thing is if you like to leave your PC on overnight (always running) like I do; it switches it's self on (the monitor).. and if you use your surround sound system as your alarm you get the "duh duh" windows sound. I check to see why its switching its self on and all I can see is a little message on the bottom right corner saying "g-sync monitor detected". To be fair they probably *all* do it.
> 
> I also have a 4k TV. But I paid over 1000 for it last year (when 4k TV's first came out). The detail is phenominal. Amazing.. and its so big and the detail is "still" there at this enourmous size it almost makes it look 3D (until the novely wares off).... *but* you have to "get over" the fact its not g-sync anymore, in fact its not even a gaming monitor... and playing on the TV you *really notice it*. So I'm constantly left with a choice between the faster 1440 or the bigger/more detailed 4k TV. At times I wish I just had a normal 1080 TV and a 4k gaming monitor. But I suppose when more film content begins to be released in 4k that opinion may change.
> 
> Also 1080p on a 1080 card is ridiculous. Completely wasted. (But you did say that)


Yeah, I play everything at 100+ FPS (unless the game is locked, or is called Mafia 3...), everything on max, etc. But I would love an IPS panel with better colors (my new laptop has an IPS panel, and I kinda like its colors more than my current desktop screen) and a higher res. That would be amazing. 4K looks awesome, I'm sure, but 4K is a bit too much for a 1080 now, and new/upcoming games won't run at 60 at all then. On lower resolutions or older games the max refresh would still be 60, so I'm kinda looking at a 144Hz 1440p IPS screen.


----------



## nrpeyton

.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Got both GPUs to run 2202 through Timespy...the score dropped off a tad, though, so I'll back the clocks back off until I get the T4 bios flashed and can feed the clocks a little more voltage.


How did u get ahold of *two* cards that *both* overclock to 2200?

Chances of that must be 1 in 1000


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> How did u get ahold of *two* cards that *both* overclock to 2200?
> 
> Chances of that must be 1 in 1000


I went through 5 to find 2 that would.


----------



## Dragonsyph

I've read through quite a few posts, and I'm just wondering if the version of Windows gives any performance gains with a GTX 1080, e.g. Windows 7 vs 8.1 vs 10.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Iv read through quite a few posts and im just wondering if the version of windows gives any performance gains with a gtx 1080? Like windows 7 vs 8.1 vs 10.


Would depend on the benchmark / game you're playing and which version of direct x they're using. If you're playing older games that are running DX9, your best bet would be 7. If you're playing newer games, or plan on playing titles in DX12, you'll want to use Windows 10. DX11 games run pretty well on all 3, really....
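His rule of thumb can be tabulated as a toy lookup (the pairings are taken from the post above; purely a convenience, not an exhaustive compatibility matrix):

```python
# Rough "best OS" per DirectX generation, per the advice in this thread.
best_os = {
    "DX9":  "Windows 7",
    "DX11": "any of 7 / 8.1 / 10",   # runs well on all three
    "DX12": "Windows 10",            # DX12 requires Windows 10
}

print(best_os["DX12"])   # Windows 10
```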


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> Would depend on the benchmark / game you're playing and which version of direct x they're using. If you're playing older games that are running DX9, your best bet would be 7. If you're playing newer games, or plan on playing titles in DX12, you'll want to use Windows 10. DX11 games run pretty well on all 3, really....


Thanks for the information.


----------



## Vellinious

Decided to put the boots to my rig a little bit tonight. It warmed up a bit too fast, though; I need to start with lower ambient temps.

I'd just love to break a 48k graphics score and 33k overall.



http://www.3dmark.com/fs/10662253


----------



## DStealth

Quote:


> Originally Posted by *steeludder*
> 
> [email protected] (1.46V!)
> 1080 @ 2215/5556 (power modded, no volt mod, i.e. severely volt-starved)
> 
> Time Spy Result: *8519*
> http://www.3dmark.com/spy/668684
> 
> CPU/GPU/VRM on chilled water (~13C)
> 
> I'm targeting top spot for my CPU+GPU combo, but I struggle to extract more out of this box. I need 60 more points. Gonna experiment some more with ram timings, currently running my Gskill CL15 3000 kit at 3200(15-15-15-35 1T).
> 
> Any suggestion welcome!


Good score... Try the T4 BIOS if you're not on it already... anyway, 13C is nice.








Here's my best run so far with the stock cooler:
@2164, GPU score 8483
http://www.3dmark.com/spy/658129
There was a guy with an air-cooled card exceeding 8700 points... so we're not the luckiest, especially yours with the 13C chilled loop...

Edit: Just realized you're not talking about the GPU score but the total one... your 8240 GPU score is terrible for those temperatures and 2200+ clocks on the core...


----------



## 66racer

The other day I mentioned my EVGA 1080 FTW was not entering the idle state (2 monitors, but it does it with only one too... both 1080p). Anyway, I installed yesterday's hotfix driver and now it idles fine. Man, it did this through two drivers, so I wonder what they fixed; the notes were vague.


----------



## steeludder

Quote:


> Originally Posted by *DStealth*
> 
> Good score... Try T4 BIOS if you're not with it ..anyway 13C are nice
> 
> 
> 
> 
> 
> 
> 
> 
> Here's my best run so far with the stock cooler
> @2164 GPU 8483
> http://www.3dmark.com/spy/658129
> There was a guy with air cooled card exceeding 8700 points ...so we're not the luckiest especially yours with 13C chilled one ...
> 
> Edit: Just realized you're not talking about the GPU score but the total one...your 8240 GPU is terrible for the temperatures and the clocks 2200+ on the core...


Well, I have no idea why that is, but that gives me hope that I can get a higher score. Any idea why my gpu score would be lower despite higher clocks? I had another few runs last night, water at 8C (!): 8547 (graphics score @ 8251): http://www.3dmark.com/spy/673723
Any suggestion welcome!

EDIT: also, bumping memory speed above ~11175 only lowers the score, despite not artifacting at all.


----------



## DStealth

Driver settings, background services, power limit... many possible reasons. Your score seems right for 2100/11k; with 2200 on the core it should exceed 8.5k GPU for sure, even 8.6... Keep pushing, m8.

Are you overclocking the core via the curve or with a direct offset? With the curve, the actual speed may differ from the one shown in monitoring programs.


----------



## steeludder

Quote:


> Originally Posted by *DStealth*
> 
> Driver settings, background services, power limit...many possible reasons...your score seem right for 2100/11k with 2200 on the core it should exceed 8.5k GPU for sure..even 8.6...Keep pushing m8
> 
> 
> 
> 
> 
> 
> 
> 
> How you're overclocking the core via curve or with direct offset? While the actual speed may defer from the one seen in monitoring programs with the curve


Looking at the 6 scores ahead of mine in the FM database (for the 5960X + 1080 combo), their graphics scores are all comparable to mine, so I'm curious what could be holding it back.

I o/c using the curve in Afterburner, but realistically I only bump up the 1.093V dot to the highest frequency the GPU will take to pass the benchmark. I could just as well use an offset (since I power-modded the card with liquid metal, thermal throttling is out of the question at 19C load).

I'm closing most open services in the lower right taskbar, not sure if I can/should close anything else. Driver-wise, I use the suggested profile settings for 3DMark.
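The curve approach he describes, dragging one voltage point up to a target frequency, can be sketched abstractly. This is purely illustrative (real curves live in Afterburner's editor, and the numbers below are invented):

```python
# An Afterburner-style V/F curve modeled as {voltage_V: frequency_MHz}.
# "Bumping the 1.093 V dot" pins a new frequency at that voltage; any
# lower-frequency points at higher voltage are raised so the curve
# stays non-decreasing, mimicking how the editor flattens the tail.

def bump_point(curve, voltage, new_freq):
    out = dict(curve)                 # don't mutate the caller's curve
    out[voltage] = new_freq
    for v, f in out.items():
        if v > voltage and f < new_freq:
            out[v] = new_freq         # flatten points past the bumped dot
    return out

curve = {1.000: 1987, 1.050: 2050, 1.093: 2100}
print(bump_point(curve, 1.093, 2215)[1.093])   # 2215
```

With a direct offset, by contrast, every point shifts by the same amount, which is why the two methods can report different effective clocks.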


----------



## fat4l

Ok so here is how my GTX 1080 ended up in my new X71 case.
2189/11000MHz_1.094v 24/7 OC! Temps 34-36C under load lol!

The build was finished some time ago, so there may be some dust...

Cooling:
EK Supremacy EVO Full Nickel
EK XE 360 rad push pull(bottom)
EK SE 360 rad push(top)
EK D5 X-top dual pump in series
Mo-Ra 3 420 with 20cm fans, push pull
Thermal Grizzly Conductonaut(Liquid Metal) paste
SP 120 Quiet fans, AF140 quiet fans, Bitfenix Spectre Pro 230/200mm fans.

(The small LCD display is there temporarily to monitor the RAM temps; the red color on one RAM module has chipped, so I need to repaint it. Also, the red looks "orange" in the pics, while in reality it's a deep, nice red!)











Spoiler: Warning: MORE PICS HERE! :)

See more pics above!

See the back of the case: there are tubes routed out for the Mo-Ra 3. There's also a switch (5V and 12V) that I use to power up all my fans. The PC is dead silent in Windows, as all the fans are at 5V and the pumps at 1300 rpm.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I went through 5 to find 2 that would.


Where are you buying from? In the UK, the websites' return policy pages all say they check returns to see if they are genuinely faulty, and if not, the card is returned to you at your own expense, with delivery cost.

If you don't want it back, they *may* still take it, but with a 10% restocking fee.


----------



## DStealth

Quote:


> Originally Posted by *steeludder*
> 
> I'm closing most open services in the lower right taskbar, not sure if I can/should close anything else. Driver-wise, I use the suggested profile settings for 3DMark.


Ok, try an offset, and set the NV control panel options to maximum performance; you can use this tutorial if you're not familiar with them, in the "nVIDIA Tweaks (Provided by justandoldman) nVIDIA Tweaks (Click to hide)" section.
Quote:


> Originally Posted by *fat4l*
> 
> Ok so here is how my GTX 1080 ended up in my new X71 case.
> 2189/11000MHz_*1.94v* 24/7 OC! T


Does it burn?


----------



## steeludder

Quote:


> Originally Posted by *DStealth*
> 
> Ok, try offset and NV control panel options set to maximum performance, you can use this tutorial if you're not common with them in "nVIDIA Tweaks (Provided by justandoldman ) nVIDIA Tweaks (Click to hide)" section
> Does it burn ?


Gentleman!
I will take a good look at those... Thanks!









EDIT: The tweaks are for Unigine Valley, but I suppose they apply to 3DMark just as well...


----------



## nrpeyton

Quote:


> Originally Posted by *66racer*
> 
> The other day I mentioned my EVGA 1080 FTW was not entering the Idle state (2 monitors but does it with only one too....both 1080p)....anyways installed yesterdays hotfix driver and now it idles fine. Man did this through two drivers too so wonder what they fixed, the notes were vague.


That's good then, mate, glad you got it fixed. Especially after the money you must have spent on not one but two 1080s.


----------



## PasK1234Xw

Anyone have EVGA reference model 1080 hybrid? If so mind sharing the vBIOS?

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6188-KR


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Where u buying from? In the U.K the website return policy pages all say they check returns to see if they are genuinely faulty and if not; are returned to you at your own expense with delivery cost.
> 
> If you don't want it back they *may* still take it but at a 10% restocking fee.


I sold them, I didn't return them.

I'd get them in, test them on air running 2151 @ 1.032v (pretty sure that's it), and if they'd run it, I kept it. If not, they got sold.


----------



## ssgtnubb

Looking to buy another 1080 xtreme if anyone knows of anyone selling. http://www.overclock.net/t/1615220/wtb-gigabyte-gtx-1080-xtreme/0_50


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I sold them, I didn't return them.
> 
> I'd get them in, test them on air running 2151 @ 1.032v (pretty sure that's it), and if they'd run it, I kept it. If not, they got sold.


Lol
Ebay charges 10% too. Plus paypal fees. Maybe even postage insurance?

How much money did u lose all together?


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> Ok so here is how my GTX 1080 ended up in my new X71 case.
> 2189/11000MHz_1.94v 24/7 OC! Temps 34-36C under load lol!


Am I reading this correctly? Your post makes it look like your running your 1080 at 1.94v ??

Also are you running conductanaught on your gpu too or just CPU? And how would u rate it? How is that going for you?


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> Am I reading this correctly? Your post makes it look like your running your 1080 at 1.94v ??
> 
> Also are you running conductanaught on your gpu too or just CPU? And how would u rate it? How is that going for you?


1.094v









Yeah both, cpu and gpu. I love the paste. CL pro/ultra would do the same job tho.


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> 1.094v
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah both, cpu and gpu. I love the paste. CL pro/ultra would do the same job tho.


Lol. Of course. I should have realised you meant 1.094. My bad.

I'm thinking of grabbing Conductonaut. Thermal conductivity is rated at 70-odd W/mK, compared to most liquid metals at 30-40 and pastes at about 12.


----------



## juniordnz

Quote:


> Originally Posted by *fat4l*
> 
> 1.094v
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah both, cpu and gpu. I love the paste. CL pro/ultra would do the same job tho.


What did you use to cover the SMDs around the GPU die?

I'll have to use a copper shim on my card, and I'm afraid the Conductonaut won't work well with the convex cold plate on the H100i. I'll probably have to use Conductonaut on the die-to-shim interface and Kryonaut on the shim-to-block interface.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> What did you use to cover the SMDs around the GPU die?
> 
> I'll have to use a copper shim on my card, and I'm afraind the conductonaut won't work well with the convex cold plate on the H100i. Will probably have to use conductonaut on the die-shim and kryonaut on shim-block


Wouldn't that be like mixing them together? Or am I just not understanding what you posted?

You mean using Conductonaut on the chip and Kryonaut on the block?


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> Wouldn't that be like mixing them together? Or am I just not understanding what u posted?
> 
> Ur meaning using conductanaught on chip and kryonaut on block?


I'll be using a copper shim between the die and block. I won't be mixing them. It'll probably have to be liquid metal between the die and the copper shim, and kryonaut between the copper shim and the water block.

This is a copper shim:


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol
> Ebay charges 10% too. Plus paypal fees. Maybe even postage insurance?
> 
> How much money did u lose all together?


Lost about $10 per card. They all sold in facebook groups. Same place I sold my Titan X pascal and a couple of 1080 SCs I got in the step up from EVGA for a couple of 980ti Classys.


----------



## Kylar182

GP104.zip 149k .zip file
 Need someone to do Max Voltage (unmodded PCB) and 150% Power and 95c temp target. Please and thank you.


----------



## juniordnz

Quote:


> Originally Posted by *Kylar182*
> 
> GP104.zip 149k .zip file
> Need someone to do Max Voltage (unmodded PCB) and 150% Power and 95c temp target. Please and thank you.


we all do...


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> we all do...


Hahaha, no kidding


----------



## fat4l

Quote:


> Originally Posted by *juniordnz*
> 
> What did you use to cover the SMDs around the GPU die?
> 
> I'll have to use a copper shim on my card, and I'm afraind the conductonaut won't work well with the convex cold plate on the H100i. Will probably have to use conductonaut on the die-shim and kryonaut on shim-block


I just used some normal paste around the die. You can also use "liquid tape" or some clear nail lacquer..


----------



## max883

I own an EVGA GTX 1080 SC ACX 3.0. I removed the backplate, used Thermal Grizzly Kryonaut as paste, and updated the BIOS. Now my card never goes over 65C at 50% fan speed. This is how it was meant to be in the first place.


----------



## TWiST2k

Quote:


> Originally Posted by *Kylar182*
> 
> GP104.zip 149k .zip file
> Need someone to do Max Voltage (unmodded PCB) and 150% Power and 95c temp target. Please and thank you.


Are you joking, or just another of the TLDR generation...


----------



## nexxusty

Quote:


> Originally Posted by *Kylar182*
> 
> GP104.zip 149k .zip file
> Need someone to do Max Voltage (unmodded PCB) and 150% Power and 95c temp target. Please and thank you.


The eff are you talking about?

First of all... not possible. Second of all....

Who do you think you are?

If it was possible, do it yourself. We aren't your slaves.
Quote:


> Originally Posted by *wirefox*
> 
> HAHA 1/6/04 --- guess I haven't logged into 3mark in a long time...
> 
> 
> 
> Zotac articstorm
> 
> 24,955
> 
> 2088 / 5528 (where I run for bf1 - could go higher on both but don't)


No way... you had a Ti 4600?

Man... I've never even seen one. Everyone had Ti 4200's.

What did it do OC wise? Had a Ti 4200 that did 300/600. I think the only difference was clock speeds on the GeForce 4's.

Blast from the past man....


----------



## Koniakki

He did ask politely and said please and thank you. Give the guy a break guys!









He's obviously either kidding or plainly didn't know.


----------



## nexxusty

Quote:


> Originally Posted by *Koniakki*
> 
> He did ask politely and said please and thank you. Give the guy a break guys!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He's either kidding or plainly didn't know.


I'd rather not give people like that a break.

This isn't supposed to be a place where you ask others to mod a BIOS for you.

If tools are available, it's not hard. I take exception to people not willing to put in the time I and many others have because they feel their time is too important.

My view on this will never change.


----------



## Kylar182

Quote:


> Originally Posted by *Koniakki*
> 
> He did ask politely and said please and thank you. Give the guy a break guys!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He's obviously either kidding or plainly didn't know.


You're correct, had no idea there was no Pascal bios tweaker yet.
Quote:


> Originally Posted by *nexxusty*
> 
> The eff are you talking about?
> 
> First of all... not possible. Second of all....
> 
> Who do you think you are?
> 
> If it was possible, do it yourself. We aren't your slaves.


This is some serious salt, on the Titan X forum people are nice and actually help mod if asked. What a Douche


----------



## jleslie246

Quote:


> Originally Posted by *Kylar182*
> 
> GP104.zip 149k .zip file
> Need someone to do Max Voltage (unmodded PCB) and 150% Power and 95c temp target. Please and thank you.


I'm on it.


----------



## Kylar182

Quote:


> Originally Posted by *jleslie246*
> 
> I'm on it.


At least you're funny vs salty. So, is this BIOS perma-locked? I'm on Maxwell so I don't know much about this; it was for a friend. I did notice there's no KBoost option in Precision X, so I'm assuming it's locked. Think they got tired of doing warranty returns on us?


----------



## bloot

Quote:


> Originally Posted by *Kylar182*
> 
> At least you're funny vs Salty. So is this bios perma-locked? I'm Maxwell so I don't know much about this, was for a friend. I did notice that there's no KBoost option in Precision X so I'm assuming it's locked. Think they got tired of doing Warranty returns on us?


Pascal BIOSes are encrypted and can't be modded for now.


----------



## OccamRazor

Quote:


> Originally Posted by *nexxusty*
> 
> I'd rather not give people like that a break.
> 
> This isn't supposed to be a place where you ask others to mod a BIOS for you.
> 
> If tools are available, it's not hard. I take exception to people not willing to put in the time I and many others have because they feel their time is too important.
> 
> My view on this will never change.


Some people don't learn as easily as others and need a break; it's up to us to help, and that's part of what these forums are for. Unfortunately, there are people who come here from time to time just to "grab and go"...

Quote:


> Originally Posted by *Koniakki*
> 
> He did ask politely and said please and thank you. Give the guy a break guys!
> 
> 
> 
> 
> 
> 
> 
> 
> He's obviously either kidding or plainly didn't know.


Wazzup Kon?









Quote:


> Originally Posted by *jleslie246*
> 
> I'm on it.


Why not?









Cheers all

Occamrazor


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> ....
> 
> Wazzup Kon?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ....


Occam buddy! Long time!









I see you've been away too... a little.









Nothing much besides life and playing with hardware and gpu's I can't really afford!









What are you up to? All is good I hope?


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> Occam buddy! Long time!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I see you been away too.. a little.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing much besides life and playing with hardware and gpu's I can't really afford!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What are you up? All is good I hope?


Yeah, just a little...








All is good Bro, hope you are too!
I've been working sluggishly on this BIOS over time and found some interesting things, including a way to increase the power target, but the way the new boost is applied and the voltage is controlled is a real puzzle and renders it useless... (but no fear, there are "others" far ahead of me in this...







)

Cheers

Occamrazor


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> Yeah, just a little...
> 
> 
> 
> 
> 
> 
> 
> 
> All is good Bro, hope you are too!
> Have been working sluggishly on this bios over time, found some interesting things and found a way to increase power target but the way the new boost is applied and the voltage controlled is really like a puzzle and render it useless... (but no fear, there are "others" far ahead of me in this...
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Cheers
> 
> Occamrazor


With you and the others here I have no fears at all.









Already feeling a lot better for my GTX 1080 ownership just by having you here buddy!









Cheers!


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> With you and the others here I have no fears at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Already feeling a lot better for my GTX 1080 ownership just by having you here buddy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!


Thanks!


----------



## TK421

So is every component marked R005 a shunt resistor on nvidia pascal cards?

I have one near the VRM bank 1 (core) and bank 2 (pll/mem)


----------



## juniordnz

Just playing the waiting game on some fujipoly thermal pads, copper shim and conductonaut now...


----------



## OccamRazor

Quote:


> Originally Posted by *TK421*
> 
> So is every component marked R005 a shunt resistor on nvidia pascal cards?
> 
> I have one near the VRM bank 1 (core) and bank 2 (pll/mem)


By lowering the resistance, you increase the power limit: drop to half the resistance and the limit doubles. (The exact resistance will differ from card to card; R005 = 0.005 Ohm, and there are two of them, so roughly 10 mOhm in play.) Note that a normal multimeter can't measure resistances under ~0.5 Ohm because of the meter's own internal resistance, so you'll have to find one that can...











Have fun!









Cheers

Occamrazor

Edit: there are actually 3 resistors, two marked R005 (5 mOhm) and one marked R002 (2 mOhm), so it's 12 mOhm!
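For anyone who wants the math behind the shunt talk spelled out, here's a rough sketch of how a shunt-based power sensor "sees" current and why lowering the shunt resistance raises the effective power limit. The 180 W figure and shunt values are illustrative assumptions, not measurements from any real card:

```python
# Rough sketch of shunt-based power sensing, and why lowering the shunt
# resistance raises the effective power limit. All values are illustrative.

def sensed_current(v_shunt_mv, r_shunt_mohm):
    """Current the controller infers from the shunt's voltage drop (I = V / R)."""
    return v_shunt_mv / r_shunt_mohm  # mV / mOhm = A

def effective_power_limit(stock_limit_w, r_stock_mohm, r_actual_mohm):
    """If the firmware still assumes the stock shunt value, the real power the
    card can draw before throttling scales by r_stock / r_actual."""
    return stock_limit_w * (r_stock_mohm / r_actual_mohm)

# A hypothetical 180 W limit sensed through a 5 mOhm shunt, with the
# resistance halved to 2.5 mOhm:
print(effective_power_limit(180, 5.0, 2.5))  # -> 360.0, halving R doubles the limit
```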


----------



## steeludder

Quote:


> Originally Posted by *OccamRazor*
> 
> Edit: there are 3 resistors, (2) 5MO and (1) R002, so its 12 mOhm!


I only had to bridge the upper one on my Palit 1080FE, fyi. Power limit no longer an issue!









Voltage however...









Anyone figured out a pencil mod yet to increase voltage to the chip (while we're waiting for the bios editor/mod)? The hardcore volt-mod is kinda out of the question for a low-end modder like me


----------



## wirefox

I bought the Zotac ArcticStorm with their block. It's amazing, as the block covers the VRAM on both sides.

I was thinking of maybe getting a 1080 Ti, but I'm having such good success with this card I may not.

With my current rig, playing BF1 with all post-processing and lighting on high, AA off and everything else on ultra, it's butter smooth.

My 3930 is getting tasked at 70-90%+ usage on various cores. HT is running on most cores, but oddly sometimes threads 10-12 are not, or only a little.

Question for the peanut gallery: will my current rig and CPU bottleneck with a second 1080?

It feels like it may, given the current CPU load (mainly asking about BF1).

wire


----------



## OccamRazor

Quote:


> Originally Posted by *steeludder*
> 
> I only had to bridge the upper one on my Palit 1080FE, fyi. Power limit no longer an issue!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Voltage however...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone figured out a pencil mod yet to increase voltage to the chip (while we're waiting for the bios editor/mod)? The hardcore volt-mod is kinda out of the question for a low-end modder like me


It's the same principle: you apply graphite layers to the resistor, lowering its resistance!









OR:

You just need:

2 × 1 KΩ multi-turn VRs (trim pots) for memory and GPU vmod.
1 × 1 MΩ multi-turn VR (trim pots) for 1.8V vmod.
1 × 100 KΩ multi-turn VR (trim pot) for PLL mod.










Cheers

Occamrazor


----------



## steeludder

Quote:


> Originally Posted by *OccamRazor*
> 
> Its the same principle, you apply graphite layers to the resistor lowering the resistance!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


I would think so too... Would you care to point out which resistor that would be on a 1080 FE PCB? It's just funny I haven't seen anyone mention it anywhere online yet...


----------



## TK421

Quote:


> Originally Posted by *OccamRazor*
> 
> By lowering the resistance, (The exact resistance will differ from card to card on both resistors (R005=0.005 Ohm, there are 2 so it will be a resistance of +-10 mOhm)) it will increase the power limit to the double just by dropping to half the resistance, but a normal multimeter can't measure resistances of under ~0.5 Ohm because of the internal resistance of the multimeter, you have to find one that can...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have fun!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor
> 
> Edit: there are 3 resistors, (2) 5MO and (1) R002, so its 12 mOhm!


I have a non-reference PCB, used for both the 1080 and 1070.

I only spot two R005 resistors.

Any other markings I should look for besides R005, 5MO and R002?


----------



## OccamRazor

Quote:


> Originally Posted by *TK421*
> 
> I have a not reference pcb, used for 1080 and 1070
> 
> I only spot two R005 resistors.
> 
> Any other marking I should look for other than R005, 5MO, R002?


No need, have a look:



Cheers

Occamrazor


----------



## OccamRazor

Quote:


> Originally Posted by *steeludder*
> 
> *I only had to bridge the upper one* on my Palit 1080FE, fyi. Power limit no longer an issue!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Voltage however...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone figured out a pencil mod yet to increase voltage to the chip (while we're waiting for the bios editor/mod)? The hardcore volt-mod is kinda out of the question for a low-end modder like me


And if you short all 3 resistors, the card will lock at 135mhz...









Cheers

Occamrazor


----------



## TK421

Quote:


> Originally Posted by *OccamRazor*
> 
> No need, have a look:
> 
> 
> 
> Cheers
> 
> Occamrazor


Ah, it's not for my 1080 AMP; that one comes with a ridiculously high power limit, so shorting isn't required.

It's for a laptop PCB, so I'm asking if there's a rule of thumb when looking for resistors to short.


----------



## OccamRazor

Quote:


> Originally Posted by *TK421*
> 
> Ah it's not my 1080 amp, this one comes with ridiculously high limit on the power so shorting is not required
> 
> It's for a laptop pcb, so I'm asking if there should be a rule of thumb when looking for resistors to short


If it's a 1080 there's only ~10-12 mOhm total, unlike the high shunt resistance on its predecessors; so it will be 2-3 resistors, 5 mOhm and 2 mOhm!

Cheers

Occamrazor


----------



## OccamRazor

Quote:


> Originally Posted by *steeludder*
> 
> I would think so too... Would you care pointing out which resistor that would be on a 1080 FE PCB? It's just funny I haven't seen anyone mentioning it online anywhere yet...


It's more than that, unfortunately; it requires soldering a trimpot to the voltage controller and removing a resistor, so bye-bye warranty...








Unless you have someone with an electronics workshop to solder the resistor back and clean up the PCB if something goes awry with the card...

Cheers

Occamrazor


----------



## OccamRazor

Quote:


> Originally Posted by *steeludder*
> 
> *I only had to bridge the upper one* on my Palit 1080FE, fyi. Power limit no longer an issue!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Voltage however...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone figured out a pencil mod yet to increase voltage to the chip (while we're waiting for the bios editor/mod)? The hardcore volt-mod is kinda out of the question for a low-end modder like me


Quote:


> Originally Posted by *OccamRazor*
> 
> *And if you short all 3 resistors, the card will lock at 135mhz*...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Quote:


> Originally Posted by *TK421*
> 
> Ah it's not my 1080 amp, this one comes with ridiculously high limit on the power so shorting is not required
> 
> It's for a laptop pcb, so I'm asking if there should be a rule of thumb when looking for *resistors to short*


DO NOT SHORT THE RESISTORS! Just lower their resistance, otherwise the card will lock!









Cheers

Occamrazor
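A quick sketch of why "lower, don't short" matters: paralleling a mod resistance across a 5 mOhm shunt drops the equivalent resistance, so the controller under-reads power in proportion. The mod values below are my own illustrative picks, not recommendations for any specific card:

```python
# Effect of paralleling a "mod" resistance across a 5 mOhm shunt: the
# equivalent resistance drops, so the controller under-reads power.
# Values are illustrative only.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

STOCK = 5.0  # mOhm, the R005 shunt

for r_mod in (50.0, 20.0, 5.0):
    r_eff = parallel(STOCK, r_mod)
    print(f"mod {r_mod} mOhm -> effective {r_eff:.2f} mOhm, "
          f"power reads as {r_eff / STOCK:.0%} of reality")

# A dead short is the r_mod -> 0 limit: the shunt drop collapses to 0 V,
# the controller reads nonsense, and the card falls back to its safe clock.
```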


----------



## steeludder

Quote:


> Originally Posted by *OccamRazor*
> 
> Its more than that unfortunately and it requires soldering a trimpot to the voltage controller and removing a resistor, so, bye bye warranty...
> 
> 
> 
> 
> 
> 
> 
> 
> Unless you have someone that has a eletronic workshop to solder back the resistor and clean up the PCB if something goes awry with the card...
> 
> Cheers
> 
> Occamrazor


Ah yes, I've seen the full-blown mod. Unfortunately nothing for me.
I'll just have to be (im)patiently waiting for a software-based solution...


----------



## TK421

Quote:


> Originally Posted by *OccamRazor*
> 
> If its a 1080 there is only ~10/12 mOhm unlike their predecessors high GPU resistance, so, it will be 2/3 resistors 5 mOhm and 2 mOhm!
> 
> Cheers
> 
> Occamrazor


So R005 and R002?

How about 1070?

I'll look for more.


----------



## OccamRazor

Quote:


> Originally Posted by *TK421*
> 
> So R005 and R002?
> 
> How about 1070?
> 
> I'll look for more.


If you mean the GTX 1070 card, yes, it's the same resistors; same card, same chip, same PCB, only fewer or different VRM sections, as it's the cut-down GP104-200.

Cheers

Occamrazor


----------



## OccamRazor

Quote:


> Originally Posted by *steeludder*
> 
> Ah yes, I've seen the full-blown mod. Unfortunately nothing for me.
> I'll just have to be (im)patiently waiting for a software-based solution...


Don't hold your breath! If you're all waiting for a big OC from a BIOS unlock, it's better to wake up and smell the coffee!









We all have to accept that Pascal IS just a Maxwell refresh with a die shrink, and with it a huge jump in frequencies, BUT it's the end of the line for this architecture as it is (stretched almost to its MHz limit already). That's easy to see: a modded 1080 under liquid nitrogen at 1.4V reaches 2500mhz, and with a good die, a few mhz over that...
It also hides the fact that the 980Ti under liquid nitrogen is still faster due to more shaders and better clock-for-clock performance...
And there is also the temperature-to-frequency dependency, where a voltage increase doesn't gain much under air/water...
Not as fun to OC as it was with Kepler/Maxwell!
Let's just hope Volta brings back some excitement!









Cheers

Occamrazor


----------



## TK421

Ah yes, to partially short the resistor. Dunno what the proper term is.


----------



## jleslie246

Quote:


> Originally Posted by *OccamRazor*
> 
> Dont hold your breath! If all of you are waiting for big OC with bios unlock, its better to wake up and smell the coffee!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We all have to accept the fact that Pascal IS just a Maxwell refresh with a die shrink and with it a huge jump in frequencies BUT it is the end on the line on this architecture as it is (stretched to almost the mhz limit already) easily understood as the modded 1080 under liquid nitrogen with 1,4V reaches 2500mhz and with a good die, a few mhz over that...
> It also hides the fact that the 980Ti under liquid nitrogen is still faster due to more shaders and better CPC performance...
> And there is also the temperature to frequency dependency, where voltage increase does not render much under air/water...
> Not fun anymore to OC as it was with Kepler/Maxwell!
> Lets just hope that Volta will bring back some excitement!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Will the 1080ti be different, being 102 chip?


----------



## steeludder

Quote:


> Originally Posted by *OccamRazor*
> 
> Dont hold your breath! If all of you are waiting for big OC with bios unlock, its better to wake up and smell the coffee!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We all have to accept the fact that Pascal IS just a Maxwell refresh with a die shrink and with it a huge jump in frequencies BUT it is the end on the line on this architecture as it is (stretched to almost the mhz limit already) easily understood as the modded 1080 under liquid nitrogen with 1,4V reaches 2500mhz and with a good die, a few mhz over that...
> It also hides the fact that the 980Ti under liquid nitrogen is still faster due to more shaders and better CPC performance...
> And there is also the temperature to frequency dependency, where voltage increase does not render much under air/water...
> Not fun anymore to OC as it was with Kepler/Maxwell!
> Lets just hope that Volta will bring back some excitement!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


I know.
I've got a water chiller so I can push my gpu a bit more than usual. Pretty sure the gpu would do another 25-50Mhz on top of the 2216 I can reach now when benching if I could raise the V to, say, 1.2...


----------



## Vellinious

Quote:


> Originally Posted by *steeludder*
> 
> I know.
> I've got a water chiller so I can push my gpu a bit more than usual. Pretty sure the gpu would do another 25-50Mhz on top of the 2216 I can reach now when benching if I could raise the V to, say, 1.2...


What kind of scores are you seeing at those clocks, though? Speaking in terms of.....Firestrike or Timespy?


----------



## steeludder

Quote:


> Originally Posted by *Vellinious*
> 
> What kind of scores are you seeing at those clocks, though? Speaking in terms of.....Firestrike or Timespy?


Look a couple of pages back... Not high enough for my liking, I need to sweep a bit, driver & tweak-wise. Sitting around 8250 gfx score in Time Spy iirc.


----------



## Vellinious

Quote:


> Originally Posted by *steeludder*
> 
> Look a couple of pages back... Not high enough for my liking, I need to sweep a bit, driver & tweak-wise. Sitting around 8250 gfx score in Time Spy iirc.


May want to try lowering the clock a tad. Higher core clocks don't always mean higher graphics scores. Just a thought... both of mine will run 2227 pretty easily, but the scores start to suffer past 2193. /shrug


----------



## steeludder

Quote:


> Originally Posted by *Vellinious*
> 
> May want to try lowering the clock a tad. Higher core clocks don't always mean higher graphics scores. Just a thought....as both of mine will run 2227 pretty easy, but, the scores start to suffer once past 2193. /shrug


Is that on stock bios or Asus T4?
I'm gonna tinker a bit when I have time. Sweet spot on VRAM seems to be around 11100 (+550 offset). After that, scores start to drop.


----------



## Webster200x

Quote:


> Originally Posted by *jleslie246*
> 
> I'm on it.


Lol, that was a good one. But back on topic: when will we get that bloody BIOS editor....


----------



## Vellinious

Quote:


> Originally Posted by *steeludder*
> 
> Is that on stock bios or Asus T4?
> I'm gonna tinker a bit when I have time. Sweetspot on VRAM seems to be around 11100 (+550 offset). After that, scores start to drop.


Stock. I haven't tried the T4 bios yet. Not sure if I will.....they run pretty hard the way it is.

Neither of my cards, when I was testing, liked anything over 11000. The score started to drop just a tiny bit with anything above that. Not really impressed with the ability to overclock the memory this gen, but...eh.


----------



## juniordnz

Quote:


> Originally Posted by *steeludder*
> 
> Is that on stock bios or Asus T4?
> I'm gonna tinker a bit when I have time. Sweetspot on VRAM seems to be around 11100 (+550 offset). After that, scores start to drop.


Quote:


> Originally Posted by *Vellinious*
> 
> Stock. I haven't tried the T4 bios yet. Not sure if I will.....they run pretty hard the way it is.
> 
> Neither of my cards, when I was testing, liked anything over 11000. The score started to drop just a tiny bit with anything above that. Not really impressed with the ability to overclock the memory this gen, but...eh.


So this must vary from card to card then... mine scales well up to +575; after that the performance goes downhill (even though it could do +625 without a single artifact).


----------



## OccamRazor

Quote:


> Originally Posted by *jleslie246*
> 
> Will the 1080ti be different, being 102 chip?


More processing power: stream processors, tessellation units, texture units, ROPs... More powerful, but the same low OC headroom!









Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *steeludder*
> 
> I know.
> I've got a water chiller so I can push my gpu a bit more than usual. Pretty sure the gpu would do another 25-50Mhz on top of the 2216 I can reach now when benching if I could raise the V to, say, 1.2...


steeludder: I'm thinking of grabbing a water chiller too. Did you do a 'before' and 'after' with yours, i.e. your max clock at ambient versus your max clock with the chiller set to its lowest temp?
Quote:


> Originally Posted by *nexxusty*
> 
> I'd rather not give people like that a break.
> 
> This isn't supposed to be a place where you ask others to mod a BIOS for you.
> 
> If tools are available, it's not hard. I take exception to people not willing to put in the time I and many others have because they feel their time is too important.
> 
> My view on this will never change.


If I knew how to mod a Pascal BIOS I would probably have happily done it for him (no joke intended). Or, at the very least, pointed him to a tutorial (had one existed).

And I'd have enjoyed helping him. So I'd have to jump to his defense too.

Chill out. lol.

Some people have computing degrees in programming; others just play games, enjoy what's available to them, help where they can, and occasionally ask for things too.

Someone who's done it before, or who works in the industry, might take a few weeks or months to develop the tweaker, but someone without that background could take years to develop the skills and still never get there. So how can you assume he thought his time was more precious?

I don't know what it was about his post that made you think it was directed at you alone. He's got nearly 200 posts on the forum, so you can't say he hasn't contributed in the past, or that he wouldn't have become part of the group here; although I'm pretty sure that now he never will be, as you've probably scared him off with your unfriendly post about someone you don't even know.
Quote:


> Originally Posted by *juniordnz*
> 
> I'll be using a copper shim between the die and block. I won't be mixing them. It'll probably have to be liquid metal between the die and the copper shim, and kryonaut between the copper shim and the water block.
> 
> This is a copper shim:


Ahh, I see; thanks for clearing that up for me, makes perfect sense now. Except wouldn't you get slightly better temps without the shim?

Quote:


> Originally Posted by *Vellinious*
> 
> Lost about $10 per card. They all sold in facebook groups. Same place I sold my Titan X pascal and a couple of 1080 SCs I got in the step up from EVGA for a couple of 980ti Classys.


I see -- glad it worked out for you then. I'm surprised you were able to sell them privately at pretty much the same price you paid, but at least you've shown it can be done. Personally, I don't know a single person offline (not on the internet or in these groups) who is into computing like I am, so I'm stuck with the silicon lottery, and if I end up with a total lemon: eBay.


----------



## Koniakki

We all have our good days and bad days. I think that much can be said about, well, pretty much everyone.

We're all here to pass time, help, learn, improve and many just to make their cards "sweat" their worth out..


----------



## bloot

My card's GDDR5X can get to 11880 in Fire Strike and 12000 in Heaven. Max stable in everything else is +925 (11850) without losing performance at all. Core is not as good though, only 2088-2100 max.
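Side note on the offset numbers being thrown around: the tools here seem to apply the memory offset to the double-data-rate figure, so it counts twice on top of the 1080's 10,000 MHz effective stock rate. A tiny sketch; the doubling convention is inferred from the numbers posted in this thread, not from any official spec:

```python
# How a memory "offset" maps to the effective GDDR5X clock quoted here:
# the GTX 1080 ships at 10,000 MHz effective, and the offset appears to
# count twice against that double-data-rate figure. Inferred from the
# numbers in this thread, not from a spec.

STOCK_EFFECTIVE_MHZ = 10_000

def effective_clock(offset_mhz):
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_clock(550))  # -> 11100 (the +550 sweet spot mentioned earlier)
print(effective_clock(925))  # -> 11850 (the +925 max stable above)
```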


----------



## Koniakki

Quote:


> Originally Posted by *bloot*
> 
> My card's GDDR5X can get to 11880 in Fire Strike and 12000 in Heaven. Max stable in everything else is +925 (11850) without losing performance at all. Core is not as good though, only 2088-2100 max.


Holy memory OC! Did you blast it with any Gamma rays or something?









Nice!


----------



## bloot

Quote:


> Originally Posted by *Koniakki*
> 
> Holy memory OC! Did you blast it with any Gamma rays or something?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice!


I was quite disappointed with the low oc on the core but the memory oc compensated it a bit









http://www.3dmark.com/fs/10530712


----------



## nrpeyton

Quote:


> Originally Posted by *bloot*
> 
> I was quite disappointed with the low oc on the core but the memory oc compensated it a bit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/10530712


2088-2100 is nothing to be ashamed of. It's still well above average, considering the stock boost for a 1080 is 1733 and your card has a factory O/C to 1847. You're reaching up to 2100, so that's arguably a 253-367MHz O/C -- and we don't even have voltage tweaks yet 

I have an EVGA 1080 CLASSIFIED (the most over-built beast of a 1080 money can buy) and I can barely go 50-60MHz faster than you.

At least you can pass a 3DMark test with 2100MHz on the validation page; thousands probably can't


----------



## JoeDirt

Quote:


> Originally Posted by *Dragonsyph*
> 
> 15c is still good, for better temps using that cooler you are gonna have to get rid of the concave cooler head. Lap it to flat or something.


Got my SP120's in yesterday. Temps max at 40c now. That's another 10c drop. Was not expecting that at all.


----------



## bloot

Quote:


> Originally Posted by *nrpeyton*
> 
> 2088-2100 is nothing to be ashamed of. Its still well above average. Considering the stock boost for a 1080 is 1733. And your card has a factory O/C at 1847. And your reaching up to 2100. Thats argueably a 253-367mhz O/C-- and we don't even have voltage tweaks yet
> 
> I have a EVGA 1080 CLASSIFIED (the most over-powered; beast of a 1080 money can buy) and I can barely go 50-60mhz faster than you.
> 
> At least you can pass a 3dmark test with 2100MHZ on the validation page, thousands probably can't


Not that much, stock boost is 1987-2000, so I can only squeeze 101-113MHz, but I'm quite happy with the card. It's cool and quiet and performs pretty well; I only push it that far for benchmarking.

Greetings!


----------



## nrpeyton

15 degrees C is about what we can expect going water then - maybe another 10c going chilled water... so 25c-30c max for the enthusiast with a good budget who won't go more extreme (i.e. phase change/chill box).

That still translates to up to about 50MHz MAX more stability for said person(s).

Edit: Sorry, should have said if you are FE, going water could result in much better temp gains; i.e. 40c 

Starting to wish I had gone FE; I could have tweaked with it more lol 

Quote:


> Originally Posted by *bloot*
> 
> Not that much, stock boost is 1987-2000 so I can only squeeze 101-113MHz, but I'm quite happy with the card, It's cool and quiet and performs pretty well, I only push it that far for benchmarking.
> 
> Greetings!


My stock boost reaches exactly the same as yours mate, i.e. with no manual O/C I get to 1987-2001. Card is rated 1860 factory though (pretty much same as your 1847).

You could try MSI Afterburner Beta 14: Ctrl+F opens the voltage curve window. Press L after selecting a voltage to "lock" onto that voltage. Increase to 100% voltage in the main window, plot a higher clock point for 1.093v in the curve window, then hit apply in the main window and see if that gets you any higher. I can only get +149 if I O/C manually, but if I use the method above I can get 29MHz more.
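
Conceptually, what that curve trick does (this is NOT an Afterburner API, just an illustration; the curve numbers below are hypothetical):

```python
# Illustration of the Afterburner voltage/frequency curve trick.
# The curve maps voltage points to clock targets; "locking" 1.093 V and
# plotting a higher clock there means the card holds that single point
# instead of walking the whole curve. All numbers are made up.

curve = {0.900: 1911, 1.000: 2025, 1.050: 2088, 1.093: 2126}

def lock_point(curve, voltage, extra_mhz):
    """Return a new curve pinned to one voltage with a raised clock."""
    return {voltage: curve[voltage] + extra_mhz}

locked = lock_point(curve, 1.093, 29)
print(locked)  # {1.093: 2155} -- the extra ~29 MHz mentioned above
```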


----------



## JoeDirt

New personal best gfx score: 25931
Says the 375.76 driver is not valid. Said that with the last set of drivers too. Oh well, still counts for me.


----------



## bloot

Quote:


> Originally Posted by *nrpeyton*
> 
> 15 degrees C is about what we can expect going water then - maybe another 10c going chilled water... so 25c -30c max for the enthusiast with a good budget who won't go more extreme (I.E phase change/chill box).
> 
> That still translates to up to about 50mhz MAX more stability for said person(s).
> 
> Edit: Sorry should of said if you are FE going water could result in much better temp gains; i.e. 40c
> 
> Starting to wish I had went FE; I could of tweaked with it more lol
> My stock boost reaches exactly the same as yours mate. I.E. with no manual O/C I get to 1987-2001. Card is rated 1860 factory though (pretty much same as your 1847).
> 
> You could try MSI Afterburner Beta 14; cntrl F (voltage curve window). PRESS L after selecting voltage to "lock" onto that voltage. Increase to 100% voltage in main window. Plot a higher clock point for 1.093v in curve window. Then hit apply in main window and see if that gets you any higher. I can only get a +149 if I O/C maually but if I use method above I can get 29mhz more.


Thanks, will try it!


----------



## khemist

https://imageshack.com/i/pnepCYuSj

New FE cooler on the way! lol, that's the fourth cooler I've had for it... the Heatkiller block should do the job.




https://imageshack.com/i/poJMgOSIj


----------



## chiknnwatrmln

Power throttling is really killing me. My card is stable at 2189 MHz but throttles under heavy use and only scores 24.8k on FS.


----------



## Dragonsyph

Quote:


> Originally Posted by *JoeDirt*
> 
> Got my SP120's in yesterday. Temps max at 40c now. That's another 10c drop. Was not expecting that at all.


Very nice bro. I've heard the same thing from videos of the EVGA FTW Hybrid: replacing the fan dropped the temps from like 52c to 42c or somewhere around there. I think the video I saw replaced the fan with an EK Vardar.


----------



## TWiST2k

Quote:


> Originally Posted by *Kylar182*
> 
> At least you're funny vs Salty. So is this bios perma-locked? I'm Maxwell so I don't know much about this, was for a friend. I did notice that there's no KBoost option in Precision X so I'm assuming it's locked. Think they got tired of doing Warranty returns on us?


IMO, you know what the issue is Kylar? It is the fact that you came to this thread that was well over 750 pages, did zero reading or research, and then asked for something that does not even exist. If this were the Nvidia GTX 1000 cards custom bios (upon request) thread, then I could understand MAYBE coming in a bit blind, but even then, it would be nice to read what the requirements are for assistance. This has nothing to do with friendly helpful people; I personally LOVE to help people who have an interest in learning. But when someone just wants things handed to them and has no interest in learning anything about it, well, they can piss off. And you coming in here asking for something that nobody has the means to accomplish at this time is really just the pinnacle of that.

We live in an age of infinite knowledge at our fingertips, literally just a few words and clicks away with the advent of the modern search engine, and yet there are more lazy and uneducated people now than before we even had the internet. It is atrocious. You can drill down with Google and get results from any time period you want, or even just pick a site to search via Google, and you will get better results than from its own built-in search engine most of the time.

It really does not matter what career, hobbies and interests you have in life; taking the time to do a bit of homework on something you have decided to become involved with will only benefit you.
Quote:


> Originally Posted by *juniordnz*
> 
> Just playing the waiting game on some fujipoly thermal pads, copper shim and conductonaut now...


Dude, how awesome! You are gonna have to give us the low down on how you slap all that together and how it pans out! I was getting mentally ready to take the plunge with a 360 Predator, but held off cause of all the leaking and it has been a let down ever since lol.

I received my thermal pad set from EVGA today for my FTW; they really got it shipped out quick. I don't understand why everybody is so up in arms about it. I have had zero problems with mine, but I just like the peace of mind of making the cooler better. I also have a Corsair Carbide Air 540 case with all Noctua 3k fans, as I do not expect seriously powerful electronic devices to be able to run silent and still stay at an optimal thermal level; I try to keep my expectations realistic.

Here is a pic of the package and contents from EVGA;


----------



## ucode

Quote:


> Originally Posted by *TWiST2k*
> 
> Here is a pic of the package and contents from EVGA;


Can't see too clearly there. Did they also send the proper tools for disassembly/reassembly of the card? You know, a right-sized screwdriver / nut spinner or whatever?


----------



## TWiST2k

Quote:


> Originally Posted by *ucode*
> 
> Can't see too clearly there. Did they also send the proper tools for disassembly/reassembly of the card? You know, right sized screwdriver / nut spinner or whatever ?


No, no screwdrivers or tools included. If you install computer components in your PC I would assume you would have the most basic of screwdrivers.

You can click on the picture for a larger version, it is a bit blurry, but its just a bag with the pads, the envelope it came in and the little tube of thermal paste.


----------



## steeludder

Quote:


> Originally Posted by *nrpeyton*
> 
> steeludder; I am thinking of grabbing a water chiller too; did you do a 'before' and 'after' with your water chiller. I.E. your max clock at ambient then your max clock with the chiller temp set to lowest?


Quite exactly 52MHz.
2164 max benching freq on normal water, 2216 on chiller, with water between 8 and 10C.
That's at 1.093V, and power limit removed, stock FE bios.
Note that it was a cool and dry evening, dew point was 7C. Don't wanna go beyond that. Tbh that's also about as low as the chiller would go I reckon, considering the heat a 5960x, VRM, 2 D5 pumps and a 1080 dump into the water at full load.
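
That dew-point check is the key number for anyone chilling below ambient: set the water below the dew point and you get condensation on the blocks. A quick Magnus-formula sketch for estimating it from room temperature and relative humidity (standard approximation constants, nothing specific to this setup):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point via the Magnus formula."""
    a, b = 17.62, 243.12  # standard Magnus constants for water vapor
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A cool, dry evening: a ~20 C room at ~43% RH puts the dew point
# around 7 C, so 8-10 C water stays on the safe side.
print(round(dew_point_c(20.0, 43.0), 1))
```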


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> Dude, how awesome! You are gonna have to give us the low down on how you slap all that together and how it pans out! I was getting mentally ready to take the plunge with a 360 Predator, but held off cause of all the leaking and it has been a let down ever since lol.
> 
> I received my thermal pad set from EVGA today for my FTW, they really got it shipped out quick, I don't understand why everybody is so up in arms about it. I have had zero problems with mine, but just like the peace of mind of the cooler the better. I also have a Corsair Carbide Air 540 case with all Noctua 3k fans as I do not expect seriously powerful electronic devices to be able to run silent and still be at an optimal thermal level, I try to keep my expectations realistic.


I'll try to take pics of every step when I do it. It'll be a pain, mainly because the Kraken isn't made for 5th gen asetek's coolers, so I'll have to go "mcgyver" on it to make it work, fingers crossed...It'll still take some time though, international shipping takes a life to get here...

Really, did you read negative feedback on the Predator? I was in love with it since it launched, but had to give it up because a Predator 360 + block for the FTW would cost me a little more than twice what I paid for two H100i V2 + Kraken G10 + thermal paste + thermal pads to get both CPU and GPU under water. Also, Corsair has great CS and RMA here in Brazil, so it's five years of warranty that actually works.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Just playing the waiting game on some fujipoly thermal pads, copper shim and conductonaut now...


I'm with ya brother.
Playing the waiting game here too.
Waiting on the Fuji, Grizzly and sinks.


----------



## nrpeyton

Quote:


> Originally Posted by *steeludder*
> 
> Quite exactly 52MHz.
> 2164 max benching freq on normal water, 2216 on chiller, with water between 8 and 10C.
> That's at 1.093V, and power limit removed, stock FE bios.
> Note that it was a cool and dry evening, dew point was 7C. Don't wanna go beyond that. Tbh that's also about as low as the chiller would go I reckon, considering the heat a 5960x, VRM, 2 D5 pumps and a 1080 dump into the water at full load.


I see, thanks for the info. Very useful information to me. 50MHz is about what I anticipated too, so it's great to hear that this is indeed what someone with a water chiller actually got.

*The LN2 guys say you get 100MHz of stability for every 50c on PASCAL* -- so it would appear the same is true whether you are *subzero or not.*
50 degrees C is 50 degrees C whether you are at -100 or +50!
I suppose silicon doesn't care! The idea of a "zero freezing point" is a human thing (as water freezes at that point and we think about winter).

Very, very useful mate, thanks for replying; this is very useful information to a lot of people who may be thinking of taking the plunge into more risky cooling ventures (minus extreme).

It's not a lot, I know, but the fact you still got "something", and *it's even* a nice "rounded" figure like 50, is great. Encouraging for me because I want one lol.
What is the rating of your chiller if you don't mind me asking? Is it a "Hailea" one by any chance?
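
Putting that rule of thumb into numbers (a purely linear estimate, so treat it as a rough sketch, not a guarantee -- chips and limits vary):

```python
# Back-of-envelope version of the "100 MHz per 50 C" rule of thumb
# quoted above for Pascal. Linear model only.

MHZ_PER_DEGREE = 100 / 50  # 2 MHz of extra stability per 1 C dropped

def expected_gain_mhz(temp_drop_c: float) -> float:
    return MHZ_PER_DEGREE * temp_drop_c

# A chiller pulling water temps down roughly 25 C from a normal loop
# would predict ~50 MHz -- right in line with the 52 MHz reported above.
print(expected_gain_mhz(25))
```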


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> I with ya brother.
> Playing the waiting game here too.
> Waiting on the Fuji, Grizzly and sinks.


Wow, those are some very nice fans over there, mate. And with a H110i? Killer!

I can imagine your anxiety seeing everything just sitting there lol mine are shoved deep down the closet









keep us updated


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Wow, those are some very nice fans over there, mate. And with a H110i? Killer!
> 
> I can imagine your anxiety seeing everything just sitting there lol mine are shoved deep down the closet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> keep us updated


Thnx

Not enough closet space here to do the out of sight out of mind thing.








Will post some pics when I do the Hybrid kit sink mod I am gonna do, probably over in the Overheating thread though.
Ya I went with the pretty lights. lol


----------



## Derek1

Quote:


> Originally Posted by *TWiST2k*
> 
> I received my thermal pad set from EVGA today for my FTW, they really got it shipped out quick, I don't understand why everybody is so up in arms about it. I have had zero problems with mine, but just like the peace of mind of the cooler the better. I also have a Corsair Carbide Air 540 case with all Noctua 3k fans as I do not expect seriously powerful electronic devices to be able to run silent and still be at an optimal thermal level, I try to keep my expectations realistic.
> 
> Here is a pic of the package and contents from EVGA;


I guess you got yours before they decided to add the VRAM pads as well?
Did you check for gaps there?


----------



## Vellinious

Quote:


> Originally Posted by *steeludder*
> 
> Quite exactly 52MHz.
> 2164 max benching freq on normal water, 2216 on chiller, with water between 8 and 10C.
> That's at 1.093V, and power limit removed, stock FE bios.
> Note that it was a cool and dry evening, dew point was 7C. Don't wanna go beyond that. Tbh that's also about as low as the chiller would go I reckon, considering the heat a 5960x, VRM, 2 D5 pumps and a 1080 dump into the water at full load.


Dual loop time? If you're really into benchmarking it might be worth it....if you're just playin around, I wouldn't bother.


----------



## steeludder

Quote:


> Originally Posted by *nrpeyton*
> 
> I see thanks for info. Very useful information to me. 50mhz is about what I anticipated too so its great to hear that this is indeed what someone with a water chiller actually got.
> 
> *The LN2 guys say you get 100mhz of stability for every 50c degrees on PASCAL* -- so it would appear the same is true whether you are *subzero or not.*
> 50 degrees C is 50 degrees C whether you are at -100 or + 50!
> I suppose silicon doesn't care! The idea of a "zero freezing point" is a human thing (as water freezes at this point and we think about winter).
> 
> Very very useful mate thanks for replying; this is very useful information to a lot of people who may be thinking of taking the plunge into more risky cooling ventures (minus extreme).
> 
> Its not a lot I know' but the fact you still got "something" and *its even* a nice "rounded" figure like 50 is great. Encouraging for me because I want one lol.
> What is the rating of your chiller if you don't mind me asking? Is it a "Hailea" one by any chance?


It is.
Hailea HC-500A (1/2hp). You can read my whole build log/thread here if you're interested: https://community.futuremark.com/forum/showthread.php?185362-Project-quot-Noob-Chill-On-A-Bender-quot-Build-Log


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> I guess you got your before they decided to add the VRAM pads as well?
> Did you check for gaps there?


Are they advising putting thermal pads between the VRAM and backplate also?

I couldn't wait for EVGA and ordered some 6W/mK pads myself (I doubt the ones EVGA are sending are even close to that) and I'm planning on padding everything on the backplate side: mosfets, VRAM, everything that gets hot. IMO it's much better having the heat dissipated to an aluminum plate through thermal pads than leaving the job to the air (poor thermal conductivity).
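
The difference is easy to see with Fourier's conduction law, Q = k·A·ΔT/d. The pad area and thickness below are made-up illustrative numbers, not measurements of the FTW:

```python
# Rough Fourier-conduction sketch of why a pad into the backplate beats
# bare air. Q = k * A * dT / d.

def heat_flow_w(k_w_per_mk, area_m2, delta_t_c, thickness_m):
    return k_w_per_mk * area_m2 * delta_t_c / thickness_m

pad_area = 10e-3 * 10e-3  # a hypothetical 10 mm x 10 mm VRAM-sized patch
thickness = 1.5e-3        # 1.5 mm pad
dT = 20                   # 20 C between component and backplate

print(heat_flow_w(6.0, pad_area, dT, thickness))    # ~8 W through a 6 W/mK pad
print(heat_flow_w(0.026, pad_area, dT, thickness))  # ~0.035 W through still air
```

Same gap, same ΔT: the pad moves a couple of hundred times more heat than the air it replaces.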


----------



## JoeDirt

New personal best. Just broke the 26k GFX mark and got into the 19400's for total score.


----------



## juniordnz

Quote:


> Originally Posted by *JoeDirt*
> 
> New personal best. Just broke the 26k GFX mark and got into the 19400's for total score.


Missed your last posts here...what clocks got you there? Any hard mod on the card?

That's a very nice score


----------



## Vellinious

Quote:


> Originally Posted by *JoeDirt*
> 
> New personal best. Just broke the 26k GFX mark and got into the 19400's for total score.


Nice run.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> are they advising on putting thermal pads on the VRAM-Backplate also?
> 
> I couldn't wait for EVGA and ordered some 6w/mK myself (I doubt those EVGA are sending are even close to that) and I'm planning on putting on everything on the backplate, mosfets, VRAM, everything that gets hot. IMO, is much better having the heat dissipated to a aluminum plate through thermal pads then leaving the job for the air (poor thermal conductivity)


The VRAM pads that EVGA are now including are 1.5mm to replace the stock ones @ 1.0mm which in some cases do not reach the cooling plate as has been reported by some people. These are not for the backplate area of the VRAM but the area that makes direct contact with the modules.


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> The VRAM pads that EVGA are now including are 1.5mm to replace the stock ones @ 1.0mm which in some cases do not reach the cooling plate as has been reported by some people. These are not for the backplate area of the VRAM but the area that makes direct contact with the modules.


Nice to hear that. So I was spot on when I bought those 1.5mm 11W/mK Fujipoly pads for the heatplate









Mine do make contact, but since I'm dismembering the whole thing why not replace them for something better? I just don't want to worry about temps anymore after all that...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Nice to hear that. So I was spot on when I bought those 1,5mm 11w/mK fujipoly for the heatplate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mine do make contact, but since I'm dismembering the whole thing why not replace them for something better? I just don't want to worry about temps anymore after all that...


Yep I thought you had mentioned you were going 1.5mm and it turns out to be a good move.
Have spent the morning going through the forums over at EVGA and need a break and lie down.
The stupid is strong over there.


----------



## Krzych04650

Quote:


> Originally Posted by *JoeDirt*
> 
> New personal best. Just broke the 26k GFX mark and got into the 19400's for total score.


Fire Strike is crashing for me for some reason. I have tested my GPU overclock in many games and synthetic benchmarks and it works perfectly, but Fire Strike crashes even after decreasing the overclock. And it crashes in exactly the same place every time, during the second graphics test. Anyone else had this issue?


----------



## juniordnz

Quote:


> Originally Posted by *Krzych04650*
> 
> Fire Strike is crashing for me for some reason. I have tested my GPU overclock in many games and synthetic benchmarks and it works perfectly but Fire Strike is crashing even after decreasing overclock. And it crashes in exact the same place all the time, during second graphics test. Anyone else had this issue?


Yeah, right on the scene where you can see the guy in the black/white robe with the big robot thing behind him, right? lol

Happened here too, but I just considered it not stable and kept tweaking the OC until it didn't happen anymore.

BTW, what clocks are you running to get above 26k?


----------



## DStealth

Anyone tried http://www.futuremark.com/benchmarks/vrmark
The Orange Room score looks low for our cards, keeping in mind 980 Tis are pushing stable over 10k...
11305


----------



## JoeDirt

Quote:


> Originally Posted by *juniordnz*
> 
> Missed your last posts here...what clocks got you there? Any hard mod on the card?
> 
> That's a very nice score


Only physical mod is I put an H105 on it. Using the Strix T4 BIOS. Clocks are at 2164MHz and memory at 5544MHz. On air it wouldn't go past 2088.
Quote:


> Originally Posted by *Krzych04650*
> 
> Fire Strike is crashing for me for some reason. I have tested my GPU overclock in many games and synthetic benchmarks and it works perfectly but Fire Strike is crashing even after decreasing overclock. And it crashes in exact the same place all the time, during second graphics test. Anyone else had this issue?


I have had the same issues. But if I back the clocks off just a bit it will get past it. It's why I use FS as my first stability check. If I can't get it to hold for one pass on FS, then I don't even continue.


----------



## juniordnz

Quote:


> Originally Posted by *JoeDirt*
> 
> Only physical mod is I put a H105 on it. Using the Strix T4 BIOS. Clocks are at 2164 MHz and memory at 5544 MHz. On air it wouldn't go past 2088.


Nice! What paste did you use with the H105?

I can get my card to 2177 with the T4, but temps were an issue. Hopefully with the H100i that won't happen anymore.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Nice! What paste did you use with the H105?
> 
> I can get my card to 2177 with t4, but Temps were an issue. Hopefully with the H100i that won't happen anymore.


The extra stability up to 2179 on T4 -- is that the 'extra voltage' or the 'removed power limit' which is making that possible for you?


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> The "extra" up to 2179 on T4. Is that the extra voltage or the 'removed power limit' which is making that possible for you?


I believe it's the "no power limit". I've never gone past 1.093V. But that vBIOS is better clock for clock also, don't know why. Maybe the extra power keeps clocks steadier, idk...


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> I believe it's the "no power limit". I've never gone past 1.093V. But that vBIOS is better clock for clock also, don't know why. Maybe the extra power keeps clocks steadier, idk...


Interesting....

I've never flashed to a bios that wasn't meant for my card... I'm sure there's some kind of override or something, yes?


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Interesting....
> 
> I've never flashed to a bios that wasn't meant for my card... I'm sure there's some kind of override or something, yes?


Well, it's not perfect, since I lost one of the DP connectors. Had to switch to another one to get video.

Also, not being able to monitor the power limit freaks me out a bit. Wouldn't like my card handling more than the 280W it's supposed to...

Is HWiNFO64 reliable on this? I have voltage and power draw readings there...


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Well, it's not perfect, since I lost one of the DP connectors. Had to swich to another one to get video.
> 
> Also, not being able to monitor the power limit freaks me out a bit. Wouldn't like my card handling more than the 280W it's supposed to...
> 
> HWINFO64 is reliable on this? I have voltage and power draw readings there...


Should be...but I'd check GPUz sensors tab to make sure they're matching up.
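
A third opinion besides HWiNFO and GPU-Z is `nvidia-smi`, which reports board power from the driver. A small sketch that parses its CSV output (the query flags are standard nvidia-smi ones; the sample line is just an example of its output format):

```python
import subprocess

def read_power_draw_w():
    """Query board power via nvidia-smi (one entry per GPU)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_power_line(line) for line in out.strip().splitlines()]

def parse_power_line(line: str) -> float:
    # nvidia-smi prints lines like "187.33 W"
    return float(line.strip().split()[0])

print(parse_power_line("187.33 W"))  # 187.33
```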


----------



## Spiriva

Quote:


> Originally Posted by *juniordnz*
> 
> I believe it's the "no power limit". I've never gone past 1.093V. But that vBIOS is better clock for clock also, don't know why. Maybe the extra power keeps clocks steadier, idk...


I think you're right too. On the stock EVGA bios I can hit 2150ish MHz stable, though the cards never go over 1.031v. With the T4 I set it to 1.100v and both cards play at 2240ish MHz now. Although going further with the voltage doesn't do anything but make the cards run a bit hotter; no more MHz gain.


----------



## Vellinious

Does anyone remember which page the T4 bios and flashing instruction was posted on?


----------



## Spiriva

Quote:


> Originally Posted by *Vellinious*
> 
> Does anyone remember which page the T4 bios and flashing instruction was posted on?


nvflash --index=0 --save 1080org.rom
nvflash --index=0 --protectoff
nvflash --index=0 -6 strix1080xoc_t4.rom

--index=0 is because i have two cards, next card would be --index=1 and do the same thing again.
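
If you have to repeat that for several cards, it can help to generate the whole sequence as strings and review it before touching nvflash. A hedged sketch (the ROM filename is the one from the post; backup filenames here are my own convention):

```python
# Sketch: build the per-card flash sequence above as plain strings,
# one --index per GPU, so you can eyeball it before running anything.

def flash_commands(num_cards: int, rom: str) -> list[str]:
    cmds = []
    for i in range(num_cards):
        cmds += [
            f"nvflash --index={i} --save 1080org_{i}.rom",  # back up stock BIOS first
            f"nvflash --index={i} --protectoff",
            f"nvflash --index={i} -6 {rom}",
        ]
    return cmds

for cmd in flash_commands(2, "strix1080xoc_t4.rom"):
    print(cmd)
```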


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Does anyone remember which page the T4 bios and flashing instruction was posted on?


Just make sure you don't use the rightmost DP. That's the one that gets deactivated.


----------



## Derek1

Quote:


> Originally Posted by *Spiriva*
> 
> I think your right too. On stock evga bios i can hit 2150ish mhz stable the cards never go over 1.031v tho. With the t4 i sat it too 1.100v and both cards play at 2240ish mhz now. Altho going futher with the volt doesnt do anything but make the card run abit hotter, no more mhz gain.


And you tried upping the volts slider in Precision X to 100% and +130% TDP on the slave bios?
Cos I get 1.08v all the time in GPU-Z at 2151MHz.


----------



## JoeDirt

Quote:


> Originally Posted by *juniordnz*
> 
> Nice! What paste did you use with the H105?
> 
> I can get my card to 2177 with t4, but Temps were an issue. Hopefully with the H100i that won't happen anymore.


PK3


----------



## Spiriva

Quote:


> Originally Posted by *Derek1*
> 
> And you tried upping the volts slider on Prec X to 100% and on slave bios +130% TDP?
> Cos I get 1.08v all the time on GPU Z at 2151mhz.


Yes, no matter what, it wouldn't go over 1.031v with the EVGA bios.


----------



## nrpeyton

Thinking of performing the power mod on my card.

I've watched the videos for FE power mod. But my Classified's PCB is a bit different (14+3 power phase).

Look at those phases; beautiful lol. *also a headache* haha


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Thinking of performing the power mod on my card.
> 
> I've watched the videos for FE power mod. But my Classified's PCB is a bit different (14+3 power phase).
> 
> Look at those phases; beautiful lol. *also a headache* haha


I'm not hitting the power limit on my FTWs....the Classy has a higher power limit than that even. Are you hitting the power limit perf cap? Check the GPUz sensors tab.....if you're not hitting the power limit perf cap, doing the power limit mod won't do anything for you.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'm not hitting the power limit on my FTWs....the Classy has a higher power limit than that even. Are you hitting the power limit perf cap? Check the GPUz sensors tab.....if you're not hitting the power limit perf cap, doing the power limit mod won't do anything for you.


Right, I thought the same to begin with; however if I explain the next bit it may make more sense...

Okay, I can flash the T4 to my Classified. The card boots; I simply lose the normal DisplayPort (like everyone else), and I can game, browse, so on and so forth...

However, FPS actually "drops" by about 10-20fps in all games/apps and my card refuses to draw more than 100W (HWiNFO64). I can over-volt it and clock it really high, but nothing makes any difference. It will even run at 2200+ (my normal max is 2179) *but* still won't exceed around 80-100 watts. The reduced power consumption doesn't even seem to affect clocks (checked in 3 different programs) but my FPS is lower. Almost like it's running at the same clock but the throughput per clock tick is less.

So my theory is: power mod as a work-around?

I've also verified the reduced power consumption using a watt meter plugged into the wall socket.


----------



## Vellinious

That's really odd....what GPU do you have?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *nrpeyton*
> 
> I see thanks for info. Very useful information to me. 50mhz is about what I anticipated too so its great to hear that this is indeed what someone with a water chiller actually got.
> 
> *The LN2 guys say you get 100mhz of stability for every 50c degrees on PASCAL* -- so it would appear the same is true whether you are *subzero or not.*
> 50 degrees C is 50 degrees C whether you are at -100 or + 50!
> I suppose silicon doesn't care! The idea of a "zero freezing point" is a human thing (as water freezes at this point and we think about winter).
> 
> Very very useful mate thanks for replying; this is very useful information to a lot of people who may be thinking of taking the plunge into more risky cooling ventures (minus extreme).
> 
> Its not a lot I know' but the fact you still got "something" and *its even* a nice "rounded" figure like 50 is great. Encouraging for me because I want one lol.
> What is the rating of your chiller if you don't mind me asking? Is it a "Hailea" one by any chance?


Not to doubt you or the other guys (who surely have more experience than I do), but when I went from the FE cooler to water cooling (max temp 85c -> 45c, so a 40c drop) I didn't gain any stability in regards to core clock. It was the same before and after.

However, it's much, much quieter


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> That's really odd....what GPU do you have?


EVGA 1080 Classified.


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Not to doubt you or the other guys (who surely have more experience than I) but when I went from the FE cooler to watercooling (max temp 85c -> 45c, so 40c drop) I didn't gain any stability in regards to core clock. It was the same before and after.
> 
> However it's much, much quieter


I went from 2153 in SLI on air to 2189 in SLI on water, and running a lot more stable.

Quote:


> Originally Posted by *nrpeyton*
> 
> EVGA 1080 Classified.


Yeah...brain fart. We were just talking about that.

The way it sounds, you're almost better off with the stock bios. I'm not sure the power mod would do anything....the problem sounds like it's in the bios, as the hardware works fine with the stock bios. I dunno...just my two pennies.

Reached the 48k graphics score I had been searching for....now I gotta get the CPU tuned in again for the lowered ambient temps and should be able to push above 33k.

Sure wish I knew what the guy at the top of the charts was doing to go above 49k graphics score....that's insane.

http://www.3dmark.com/fs/10694125


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I went from 2153 in SLI on air to 2189 in SLI on water, and running a lot more stable.
> Yeah...brain fart. We were just talking about that.
> 
> The way it sounds, you're almost better off with the stock bios. I'm not sure the power mod would do anything....the problem sounds like it's in the bios, as the hardware works fine with the stock bios. I dunno...just my two pennies.


A few guys on here have said they got over 2200MHz with the little extra voltage the T4 brings. Because the T4 is the only option for voltage just now, power modding as a work-around is all I can think of... or, as you say, give up and make do with what I have.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> A few guys on here have said they got over 2200mhz with a little extra voltage the T4 brings. Because the T4 is the only option for voltage just now; power moding as a work-around for this is all I can think of.. or as you say.. give up and make do with what i have.


All it took for 2200+ on a single-card run for mine was to create a really steep curve, like someone earlier in the thread suggested, and keep the peak core temp below 35c; they'd run fine there. On Firestrike, that is. In Time Spy they'll do 2202 in SLI. My best score came from those testing runs.

Ran a timespy quick...glad I did. Bumped my high score up a tad more. I might try to lower the ambient another 5c tomorrow and see what happens.

http://www.3dmark.com/3dm/15887715





----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> All it took for 2200+ on a single card run for mine was to create a really steep curve, like someone earlier in the thread had suggested, keep the temps on the core below 35c peak, and they'd run fine there. On Firestrike, that is. In TimeSpy they'll do 2202 in SLI. My best score came from those testing runs.
> 
> Ran a timespy quick...glad I did. Bumped my high score up a tad more. I might try to lower the ambient another 5c tomorrow and see what happens.
> 
> http://www.3dmark.com/3dm/15887715
> 
> 
> 
> http://imgur.com/all


Nice score; I'm still waiting on a block coming out for my Classified. Alphacool are doing one, but it's still 3 weeks away; then I'll be able to see if I can get her to 2200 on water. I think I'll just hold off until then before I do any more research on power mods for my card.

Already tried every other possible thing I can think of; sometimes I can finish a Time Spy lap at 2179, other times she won't make it... lol


----------



## NIK1

I was thinking of getting the MSI GeForce® GTX 1080 SEA HAWK EK X and using my GTX 980 with an EK waterblock for another build. How much better is the MSI 1080 SEA HAWK EK X compared to my GTX 980? Just wondering if blowing the extra coin will be worth it.


----------



## nrpeyton

A single 1080 is about the same as two 980s in SLI at 4K (versus a single 980, a 1080 is about 40% more FPS).


----------



## Derek1

Didn't someone unlock the Gaming Z bios as well? They could flash that Seahawk and ramp it up.


----------



## Koala Bear

Gigabyte GTX 1080 Xtreme Gaming. The GPU clock hit 2,012MHz out of the box


----------



## ucode

Check your video clock. It seems that, at least sometimes, the higher the default base and boost clocks, the larger the gap between the video clock and the GPU clock, which can have a negative effect on performance.


----------



## TK421

Quick question on HDMI signalling.

Suppose a card's HDMI/mDP version is unknown; how do you check which version of HDMI/mDP it has?

I know desktop Pascal has HDMI 2.0 and DP 1.3, but I'm trying to figure this out on a laptop.


----------



## Koniakki

Quote:


> Originally Posted by *TK421*
> 
> Quick question on HDMI signal.
> 
> Suppose a card has unknown HDMI/mDP version, how do you check for version of HDMI/mDP?
> 
> I know pascal desktop has HDMI 2.0 and DP 1.3 but I'm trying to figure this out on a laptop.


Besides checking the manufacturer's specs/support/FAQ page or emailing/calling them, I don't think it's possible to probe this information directly.

But I'm curious and would like to see what others have to say.

And by checking, you can do it the old-fashioned way: plug it in and see what it supports.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Didn't someone unlock the Gaming Z bios as well? They could flash that Seahawk and ramp it up.


I've had a look for that BIOS but can't seem to pin down a link to it anywhere; I even emailed a guy on another forum who brought it up in a thread somewhere last month, but heard nothing back.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> I've had a look for that BIOS but can't seem to pin down a link to it anywhere; even emailed a guy on another forum who brought it up on a thread somewhere last month but heard nothing back.


Yeah, I was redirected to a thread here that had a post with a zip link to it, or a link to a post that had a zip for it.
The guy was saying that only MSI users should use it. Does that sound familiar?
But damned if I can remember what thread I saw that in. It was only 3 days ago, too.

ETA: Check this: http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/570
Not sure if this was the one I was initially talking about. Go to the TechPowerUp link and have a look. There is a disclaimer there.
I'll keep looking to see if I can find the post I was referring to.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Ya I was redirected to a thread here that had a post by someone who had zip link to it, or a link to a post that had a zip for it.
> The guy was saying that only MSi users should use it. Does that sound familiar?
> But damned if I can remember what thread I saw that in. It was only 3 days ago too.
> 
> ETA Check this http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/570
> Not sure if this was the one I was initially talking about. Go to the Tech Powerup link and have a look. There is a disclaimer there.
> I will continue to look and see if I can find the post I was referring to.


Excellent; thanks I'll have a look.

==========================

*FYI EVGA Owners*

Precision X has just been updated to version 6.08. Read through the change-log under version history; apparently they have improved "resuming" after a crash with the OC scanner. Can't wait to try it.. I'm away to download it now 

Would be nice to see some feedback on how people get on with the new 6.08 -- I'll be back shortly with news of my own experience with it


----------



## TK421

Quote:


> Originally Posted by *TK421*
> 
> Quick question on HDMI signal.
> 
> Suppose a card has unknown HDMI/mDP version, how do you check for version of HDMI/mDP?
> 
> I know pascal desktop has HDMI 2.0 and DP 1.3 but I'm trying to figure this out on a laptop.


Quote:


> Originally Posted by *Koniakki*
> 
> Besides checking with the manufacturer specs/support/faq page or emailing/calling them, I don't think it's possible to probe this information directly.
> 
> But I'm curious and would like to see what others has to say.
> 
> And by checking, you can do it the old fashion way if you have plugin it in and see what it supports.


How do you check for DP 1.3 then?

For HDMI it's either compressed-chroma 4K60, or full 4:4:4 only at 4K30/4K24, on TVs, right? That seems to be the dead giveaway of the HDMI version.
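That giveaway really is just TMDS bandwidth. Here's a back-of-envelope check using the standard CTA-861 4K60 timing (the function name is just for illustration):

```python
# Why 4K60 at full 4:4:4 implies HDMI 2.0: the required TMDS clock
# exceeds HDMI 1.4's 340 MHz ceiling but fits HDMI 2.0's 600 MHz.
def tmds_clock_mhz(h_total, v_total, refresh_hz, bits_per_component=8):
    # For 8-bit 4:4:4 the TMDS clock equals the pixel clock.
    return h_total * v_total * refresh_hz * (bits_per_component / 8) / 1e6

# CTA-861 4K60 timing: 4400 x 2250 total pixels per frame, blanking included.
pclk = tmds_clock_mhz(4400, 2250, 60)
print(pclk)         # 594.0
print(pclk <= 340)  # False -> beyond HDMI 1.4
print(pclk <= 600)  # True  -> needs HDMI 2.0
```

Drop to 30Hz and the same math gives 297 MHz, which is why full 4:4:4 still works at 4K30/4K24 over HDMI 1.4.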


----------



## FattysGoneWild

Quote:


> Originally Posted by *nrpeyton*
> 
> Excellent; thanks I'll have a look.
> 
> ==========================
> 
> *FYI EVGA Owners*
> 
> Precision X has just been updated to version 6.08. Read trough the change-log under version history and apparently they have improved the "resuming" after crash with O/C scanner. Can't wait to try it.. i'm away to download it now
> 
> Would be nice to see some feedback on how people get on with new 1.08 -- i'll be back shortly with news of my own experience with it


Don't use it. It's broken. That software is a complete POS. I was getting this in games as well: http://forums.evga.com/Precision-X-OC-Causing-CPU-Spikes-m2576064.aspx Since switching to Afterburner I've had no issues at all. Their software is just not reliable or stable.


----------



## nrpeyton

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Don't use it. Its broke. That software is a complete POS. I was getting this as well in games. http://forums.evga.com/Precision-X-OC-Causing-CPU-Spikes-m2576064.aspx Since switching to Afterburner. No issues at all now. Their software offering is just not reliable or stable at all.


I hear what you're saying, and I would normally use MSI AB too; I'll probably go back to it soon.

I like the automatic OC scanner in Precision X, though (which has apparently been fixed in this new version), so I'll let you guys know how I get on with it, and whether there have been any improvements to the problems FattysGoneWild had. 

It will be interesting just to see what an automatic voltage curve looks like, especially across different cards/different people. Would be good to get some screenshots up too.

We could even copy it across to AB (AB beta 14 has a voltage curve now); useful for those who don't have the time to perfect their curve manually.

/\ that is all dependent on this new version actually working, of course 

I'll be back with more on my actual experience with it tonight


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> I hear what your saying and I would normally use MSI AB too; and will probably go back to it soon.
> 
> I like the automatic O/C scanner on Precision X though (which has apparently been fixed in this new version) so will let you guys know how I get on with it. And if there have been any improvements to the problems like FattysGoneWild had.
> 
> Will be interesting just to see what an automatic voltage curve looks like. Especially across different cards/different people. Would be good to get some screenshots up too.
> 
> We could even copy it across to AB (AB beta 14 has voltage curve now); useful for those who don't have the time to perfect their curve manually.
> 
> /\ that is all dependant of this new verison actually working of course
> 
> I'll brb with more on my actual experience with this tonight


Here, check this out, this is the post I was referring to initially.








http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/4760


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Here, check this out, this is the post I was referring to initially.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/4760


Downloaded it; unfortunately it's still voltage locked.. that BIOS just lets MSI users of said card have it set up in the factory "gaming/overclock" mode by default, to save opening an overclocking programme every time you reboot.

Seems the T4 is still the only option for voltage  Thanks for trying though.


----------



## Vellinious

Adding voltage without first controlling and keeping the temps low is a waste of time. Just sayin'.


----------



## arrow0309

Quote:


> Originally Posted by *Koala Bear*
> 
> 
> Gigabyte GTX 1080 Extreme gaming. GPU clock hit 2,012mhz out of the box


+Rep
Nice rig and vga!








What is the max GPU temp of the 1080 Xtreme? And with extra OC?


----------



## Koala Bear

Peak temp yesterday was 52c when I hit 2012mhz. I have not tried to overclock yet.


----------



## Cozmo85

Looking at both the MSI Aero and the EVGA (not the reference blower model) to watercool. Any preference for either? They both appear to be reference boards. The MSI is ~$10 cheaper.


----------



## KickAssCop

Get the cheaper one. Overclocking is a lottery anyways.


----------



## steeludder

Quote:


> Originally Posted by *Vellinious*
> 
> May want to try lowering the clock a tad. Higher core clocks don't always mean higher graphics scores. Just a thought....as both of mine will run 2227 pretty easy, but, the scores start to suffer once past 2193. /shrug


Tried a few more things over the weekend, but no luck, unfortunately.

- cannot get core clocks to stick at 2228; even with water temps at 8C, Time Spy will crash after 2-3 seconds;
- 2216 gets the highest score; the 2202 MHz score is lower;
- +550 offset on mem speed seems to be the sweet spot too; higher and lower frequencies both give a lower score;
- using a core offset instead of the curve doesn't work at all... I think that just takes the stock curve and raises it across the board, which is pointless, since any given voltage point will then be too low for the raised frequencies - unless I'm doing it wrong; once your temp and power restrictions are lifted, only the 1.093V voltage point is relevant;
- tried the NVIDIA Profile Inspector app... set the 3DMark Time Spy profile, clicked "apply settings"... but I don't think anything actually changed - no difference in score at all.

I think flashing the T4 bios might be the last option remaining...


----------



## nexxusty

Quote:


> Originally Posted by *TWiST2k*
> 
> *IMO, you know what the issue is Kylar? It is the fact that you came to this thread that was well over 750 pages, did zero reading or research and then ask for something that does not even exist. If this was the Nvidia GTX 1000 cards custom bios (upon request ) thread, then I can understand MAYBE coming in a bit blind, but even then, it would be nice to read what the requirements are for assistance. This has nothing to do with friendly helpful people, I personally LOVE to help people that have an interest in learning. But when someone just wants things handed to them and has no interest in learning anything about it, well they can piss off.* And you coming in here asking for something that nobody has the means to accomplish at this time, is really just the pinnacle of that.
> 
> We live in an age of infinite knowledge at our fingertips, literally just a few words and clicks away with the advent of the modern search engine and there are more lazy and uneducated people now then before we even had the internet, it is atrocious. You can drill down with Google and get results in any time period or age you want, even just pick a site you want to search with Google and you will get better results then from there own built in search engine most of the time.
> 
> It really does not matter whatever career, hobbies and interests you will have in life, taking the time to do a bit of homework on something you have decided to become involved with, will only benefit you.
> Dude, how awesome! You are gonna have to give us the low down on how you slap all that together and how it pans out! I was getting mentally ready to take the plunge with a 360 Predator, but held off cause of all the leaking and it has been a let down ever since lol.
> 
> I received my thermal pad set from EVGA today for my FTW, they really got it shipped out quick, I don't understand why everybody is so up in arms about it. I have had zero problems with mine, but just like the peace of mind of the cooler the better. I also have a Corsair Carbide Air 540 case with all Noctua 3k fans as I do not expect seriously powerful electronic devices to be able to run silent and still be at an optimal thermal level, I try to keep my expectations realistic.
> 
> Here is a pic of the package and contents from EVGA;


Very well said.

That would be the exact basis for my response.

Personally... it doesn't bother me if some of you don't like what was typed.

Like Kon said as well, we all have good and bad days. Hehe.


----------



## Krzych04650

Question about SLI, or rather SLI connectors. Shouldn't their placement be unified across all board partners' cards? From what I see in pictures of, for example, the Founders Edition, Palit JetStream and MSI Gaming X, they have different PCB heights and the SLI connectors are always on the edge, so there is no way to connect different cards with a rigid bridge? I thought SLI connector placement had to be the same for all cards.


----------



## JoeDirt

New personal best: GFX Score - 26209 / Total Score - 19628

http://www.3dmark.com/3dm/15921222?


----------



## Koniakki

Quote:


> Originally Posted by *nrpeyton*
> 
> .........
> 
> We could even copy it across to AB *(AB beta 14* has voltage curve now); useful for those who don't have the time to perfect their curve manually.
> 
> ...)


MSI Afterburner 4.3.0 Final/Stable was already released a week and a half ago.

Just stating it in case some users missed it.









Quote:


> Originally Posted by *TK421*
> 
> How do you check for DP 1.3 then?
> 
> For HDMI it's either compressed chroma 4K60 or 4:4:4 4K30/4K24 on TVs right? This seems to be the dead giveaway of the HDMI version.


I don't do DP. Sry.









I had come across this subject before; that's why I said AFAIK there isn't a way to probe the info of the controller used directly.

It would be great to hear other members input on this.


----------



## Derek1

Quote:


> Originally Posted by *JoeDirt*
> 
> New personal best: GFX Score - 26209 / Total Score - 19628
> 
> http://www.3dmark.com/3dm/15921222?


Good one! :thumb:

Just out of curiosity, what is your mem clock? I know it says 1398 there, so does that mean you have +398?


----------



## JoeDirt

Quote:


> Originally Posted by *Derek1*
> 
> Good one: thumb:
> 
> Just out of curiosity, what is your Mem Clock? I know it says 1398 there so does that mean you have +398?


Just broke my GFX score again with a 26382.


Memory is at +585, with a custom curve for the core clock that hits 2214 @ 1.175v.

I went to Afterburner for better voltage-curve control.


----------



## OccamRazor

Quote:


> Originally Posted by *JoeDirt*
> 
> Just broke my GFX score again with a 26382
> 
> 
> Memory is at +585 with a custom curve for core clock that hits 2214 @ 1.175v
> 
> Went to Afterburner to give me better voltage curve control.


What card do you have Joe?

Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *JoeDirt*
> 
> Just broke my GFX score again with a 26382
> 
> 
> Memory is at +585 with a custom curve for core clock that hits 2214 @ 1.175v
> 
> Went to Afterburner to give me better voltage curve control.


What is your process for setting your curve?

I tested mine for stability individually at each voltage point (and it was stable at each point over multiple runs) -- but as soon as I entered them all together and unlocked voltage, it would crash every time!

My curve is below:

800mv : +175 - 1771mhz . . *Ultra FS & Skydiver stable
812mv : +175 - 1797mhz . . *99.3 Ultra & Skydiver stable
825mv : +180 - 1835mhz *97.9 Ultra & Skydiver stable
831mv : +175 - 1847mhz *98.5 Ultra & Skydiver stable
843mv : +180 - 1873mhz . . *98.7 Ultra & Skydiver 98.7
850mv : +175 - 1885mhz . . *98.3 Ultra FS stable
862mv : +180 - 1911mhz . . *98.4 Ultra FS stable
875mv : +165 - 1923mhz . . *99.4 Ultra FS stable
881mv : +180 - 1949mhz . . *98.9 Ultra FS stable
893mv : +160 - 1949mhz . . *98.2 Ultra......
900mv : +160 - 1961mhz . . *99.7 Ultra....
912mv : +165 - 1999mhz . . *99.7 Ultra..
925mv : +155 - 1999mhz . . *99.7 Ultra..
931mv : +150 - 1999mhz . . *99.7 Ultra..
943mv : +150 - 2012mhz . . *98.7 Ultra..
950mv : +155 - 2037mhz . . *98.8 Ultra..
962mv : +150 - 2050mhz . . *98.8 Ultra..
975mv : +155 - 2075mhz . . *98.8 Ultra..
981mv : +160 - 2088mhz . . *98.3 Ultra..
993mv : +160 - 2100mhz . . *99.6 Ultra..
1000mv: +160 - 2100mhz . . *99.0 Ultra..
1012mv: +160 - 2113mhz . . *98.8 Ultra..
1025mv: +160 - 2138mhz . . *98.6 Ultra..
1031mv: +165 - 2151mhz . . *98.3 Ultra..
1043mv: +160 - 2151mhz . . *99.2 Ultra..
1050mv: +160 - 2151mhz . . *98.5 Ultra FS & Skydiver *97.8
1062mv: +160 - 2176mhz . . *97.8 Ultra FS & Skydiver *98.0
1075mv: +160 - 2176mhz . . *97.5 Skydiver& Ultra stable.
1081mv: +160 - 2176mhz . . *97.1 Skydiver & Ultra stable
1093mv: +160 - 2176mhz

The 97/98/99 number is my frame-rate stability score for that run.

As I say, each individual point is 100% stable over multiple runs, but as soon as I enter the entire curve it crashes within seconds of a run. Very strange. I have an EVGA Classified 1080 and was using Afterburner beta 14. Might try the new AB final/stable 4.3.0 and see if that works.

Anyone else had that problem?

Nick
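One thing worth sanity-checking before applying a merged curve, by the way: that the frequency never dips as voltage rises. Boost can land on any point of the curve, so a dip creates voltage/frequency combinations that were never tested in isolation. A throwaway script (the curve data is just the first rows of the table above):

```python
# Validate a merged voltage/frequency curve: flag any point where the
# frequency drops below the previous (lower-voltage) point.
curve = [
    (800, 1771), (812, 1797), (825, 1835), (831, 1847),
    (843, 1873), (850, 1885), (862, 1911), (875, 1923),
    (881, 1949), (893, 1949), (900, 1961), (912, 1999),
]  # (mV, MHz)

def find_dips(points):
    """Return the (mV, MHz) entries that sit below their predecessor."""
    return [b for a, b in zip(points, points[1:]) if b[1] < a[1]]

print(find_dips(curve))  # [] -> this portion is monotonic, so no dips
```

If it prints anything, those are the points to retest together with their neighbours rather than one at a time.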


----------



## Vellinious

Quote:


> Originally Posted by *JoeDirt*
> 
> Just broke my GFX score again with a 26382
> 
> 
> Memory is at +585 with a custom curve for core clock that hits 2214 @ 1.175v
> 
> Went to Afterburner to give me better voltage curve control.


How cool were you running it? I'm guessing pretty chilly, to keep that clock from dropping frames at those voltages/frequencies. Or... you got a golden sample.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> What is your process behind setting your curve?
> 
> I tested mine for stability individually at each voltage point (and was stable at each voltage point for multiple runs) -- but as soon as I entered it all in together, and unlocked voltage it would crash everytime!
> 
> My curve is below:
> 
> 800mv : +175 - 1771mhz . . *Ultra FS & Skydiver stable
> 812mv : +175 - 1797mhz . . *99.3 Ultra & Skydiver stable
> 825mv : +180 - 1835mhz *97.9 Ultra & Skydiver stable
> 831mv : +175 - 1847mhz *98.5 Ultra & Skydiver stable
> 843mv : +180 - 1873mhz . . *98.7 Ultra & Skydiver 98.7
> 850mv : +175 - 1885mhz . . *98.3 Ultra FS stable
> 862mv : +180 - 1911mhz . . *98.4 Ultra FS stable
> 875mv : +165 - 1923mhz . . *99.4 Ultra FS stable
> 881mv : +180 - 1949mhz . . *98.9 Ultra FS stable
> 893mv : +160 - 1949mhz . . *98.2 Ultra......
> 900mv : +160 - 1961mhz . . *99.7 Ultra....
> 912mv : +165 - 1999mhz . . *99.7 Ultra..
> 925mv : +155 - 1999mhz . . *99.7 Ultra..
> 931mv : +150 - 1999mhz . . *99.7 Ultra..
> 943mv : +150 - 2012mhz . . *98.7 Ultra..
> 950mv : +155 - 2037mhz . . *98.8 Ultra..
> 962mv : +150 - 2050mhz . . *98.8 Ultra..
> 975mv : +155 - 2075mhz . . *98.8 Ultra..
> 981mv : +160 - 2088mhz . . *98.3 Ultra..
> 993mv : +160 - 2100mhz . . *99.6 Ultra..
> 1000mv: +160 - 2100mhz . . *99.0 Ultra..
> 1012mv: +160 - 2113mhz . . *98.8 Ultra..
> 1025mv: +160 - 2138mhz . . *98.6 Ultra..
> 1031mv: +165 - 2151mhz . . *98.3 Ultra..
> 1043mv: +160 - 2151mhz . . *99.2 Ultra..
> 1050mv: +160 - 2151mhz . . *98.5 Ultra FS & Skydiver *97.8
> 1062mv: +160 - 2176mhz . . *97.8 Ultra FS & Skydiver *98.0
> 1075mv: +160 - 2176mhz . . *97.5 Skydiver& Ultra stable.
> 1081mv: +160 - 2176mhz . . *97.1 Skydiver & Ultra stable
> 1093mv: +160 - 2176mhz
> 
> the 97/98/99 number is my framerate stability score for that run.
> 
> As I say; at each individual point its 100% stable on multiple runs; but as soon as I enter the entire curve it crashes within seconds of run. Very strange. I have a EVGA CLASSIFIED 1080. Was using Afterburner Beta 14. Might try the new 'AB final stable 4.3.0' and see if that works.
> 
> Anyone else had that problem?
> 
> Nick


Unfortunately, it doesn't work like that. The way your curve is set plays a big role in performance. Otherwise we could just test the 1.093V point for the max clock and leave it there. All the points before you hit your max OC impact performance. How, exactly? No one knows. It's a trial-and-error game...

That's why I've given up on the curve method and just stick with my max offset OC (+89MHz / 2114MHz @ 1.062V)


----------



## steeludder

Quote:


> Originally Posted by *juniordnz*
> 
> Unfortunately, it doesn't work like that. The way your curve is set plays a big role in performance. Otherwise, we could just test voltage point 1.093 for the max clock and leave it there. All the points before you hit your max OC will impact performance. How, exactly? No one knows. It's a trial and error game...
> 
> That's why I've given up the curve method and just stick with my max offset OC (+89mhz / 2114mhz @ 1.062V)


Errr...
The curve is pretty straightforward, actually.
The card will always try to run at the highest voltage point within the given parameters. These are:
1) is the voltage within the limit?
2) is the power within the limit?
3) is the temp within the limit?

If the answer to all three is yes, the frequency set for the 1.093 voltage point will be used. If any of those thresholds is reached, the card drops to the next voltage point below. That's why, once you remove the thermal and power constraints, voltage is your only limit and the 1.093 point is the only one you'll need to tweak.
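The behaviour described above, as a simplified model (a sketch of the idea, not NVIDIA's actual boost algorithm; the quadratic power estimate is purely an illustrative assumption):

```python
# Walk the curve from the top voltage point down until voltage, power,
# and temperature are all within their limits; that point's frequency wins.
def pick_point(curve, v_limit, temp_c, temp_limit_c, power_limit_w,
               base_power_w=120.0):
    """curve: list of (voltage_v, freq_mhz) sorted ascending by voltage."""
    v0, f0 = curve[0]
    for v, f in reversed(curve):  # highest voltage point first
        # crude draw estimate: power scales with frequency and voltage^2
        est_power = base_power_w * (f / f0) * (v / v0) ** 2
        if v <= v_limit and est_power <= power_limit_w and temp_c <= temp_limit_c:
            return v, f
    return v0, f0  # floor: lowest point on the curve

curve = [(0.900, 1961), (1.050, 2151), (1.093, 2202)]
# generous power/temp limits -> only max voltage matters
print(pick_point(curve, 1.093, 40, 92, 300))  # (1.093, 2202)
# tight power limit -> card steps down to the next voltage point
print(pick_point(curve, 1.093, 40, 92, 190))  # (1.05, 2151)
```

The second call is the throttling case: the top point's estimated draw busts the power limit, so the model falls back to the next point down, exactly the "step down" behaviour described.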


----------



## juniordnz

Quote:


> Originally Posted by *steeludder*
> 
> Errr...
> The curve is pretty straightforward actually.
> It will always try to get to the highest voltage point within the given parameters. These are:
> 1) is the voltage within the limit?
> 2) is the power within the limit?
> 3) is the temp within the limit?
> 
> If yes is the answer to all three questions, the frequency given for voltage point 1.093 will be used. If any of the thresholds above is reached, the card will go to the next voltage point below. That's why if you remove thermal and power constraints, voltage will be your only limit and the 1.093 point is the only one you'll need to tweak.


Read my post again.


----------



## Derek1

Quote:


> Originally Posted by *steeludder*
> 
> Errr...
> The curve is pretty straightforward actually.
> It will always try to get to the highest voltage point within the given parameters. These are:
> 1) is the voltage within the limit?
> 2) is the power within the limit?
> 3) is the temp within the limit?
> 
> If yes is the answer to all three questions, the frequency given for voltage point 1.093 will be used. If any of the thresholds above is reached, the card will go to the next voltage point below. That's why if you remove thermal and power constraints, voltage will be your only limit and the 1.093 point is the only one you'll need to tweak.


About 100 pages back, galeonki and synthetic killer were playing with the voltage curves in AB and posting pics. Those are the only two guys I think I've seen break 2300. In one of the pics, the curve in AB looks fairly normal up to about 1.07V @ say 2151, but then whoever it was suddenly jacked the 1.093 point up to 2300.
Is that what you mean?


----------



## JoeDirt

Quote:


> Originally Posted by *OccamRazor*
> 
> What card do you have Joe?
> 
> Cheers
> 
> Occamrazor


Strix Gaming with the F4 BIOS
Quote:


> Originally Posted by *nrpeyton*
> 
> What is your process behind setting your curve?
> 
> I tested mine for stability individually at each voltage point (and was stable at each voltage point for multiple runs) -- but as soon as I entered it all in together, and unlocked voltage it would crash everytime!
> 
> My curve is below:
> 
> 800mv : +175 - 1771mhz . . *Ultra FS & Skydiver stable
> 812mv : +175 - 1797mhz . . *99.3 Ultra & Skydiver stable
> 825mv : +180 - 1835mhz *97.9 Ultra & Skydiver stable
> 831mv : +175 - 1847mhz *98.5 Ultra & Skydiver stable
> 843mv : +180 - 1873mhz . . *98.7 Ultra & Skydiver 98.7
> 850mv : +175 - 1885mhz . . *98.3 Ultra FS stable
> 862mv : +180 - 1911mhz . . *98.4 Ultra FS stable
> 875mv : +165 - 1923mhz . . *99.4 Ultra FS stable
> 881mv : +180 - 1949mhz . . *98.9 Ultra FS stable
> 893mv : +160 - 1949mhz . . *98.2 Ultra......
> 900mv : +160 - 1961mhz . . *99.7 Ultra....
> 912mv : +165 - 1999mhz . . *99.7 Ultra..
> 925mv : +155 - 1999mhz . . *99.7 Ultra..
> 931mv : +150 - 1999mhz . . *99.7 Ultra..
> 943mv : +150 - 2012mhz . . *98.7 Ultra..
> 950mv : +155 - 2037mhz . . *98.8 Ultra..
> 962mv : +150 - 2050mhz . . *98.8 Ultra..
> 975mv : +155 - 2075mhz . . *98.8 Ultra..
> 981mv : +160 - 2088mhz . . *98.3 Ultra..
> 993mv : +160 - 2100mhz . . *99.6 Ultra..
> 1000mv: +160 - 2100mhz . . *99.0 Ultra..
> 1012mv: +160 - 2113mhz . . *98.8 Ultra..
> 1025mv: +160 - 2138mhz . . *98.6 Ultra..
> 1031mv: +165 - 2151mhz . . *98.3 Ultra..
> 1043mv: +160 - 2151mhz . . *99.2 Ultra..
> 1050mv: +160 - 2151mhz . . *98.5 Ultra FS & Skydiver *97.8
> 1062mv: +160 - 2176mhz . . *97.8 Ultra FS & Skydiver *98.0
> 1075mv: +160 - 2176mhz . . *97.5 Skydiver& Ultra stable.
> 1081mv: +160 - 2176mhz . . *97.1 Skydiver & Ultra stable
> 1093mv: +160 - 2176mhz
> 
> the 97/98/99 number is my framerate stability score for that run.
> 
> As I say; at each individual point its 100% stable on multiple runs; but as soon as I enter the entire curve it crashes within seconds of run. Very strange. I have a EVGA CLASSIFIED 1080. Was using Afterburner Beta 14. Might try the new 'AB final stable 4.3.0' and see if that works.
> 
> Anyone else had that problem?
> 
> Nick


One thing I noticed with this card is that it likes to run lean on voltage. Adding more would yield higher clock speeds but lower frame rates; it's like the tuning world of cars. I started with a stock curve, added my known best of +245 to see what it changed the curve to and what voltage it was calling for, then ran a pass in FS to confirm it was stable. If it was, I'd keep that voltage point as-is and keep raising the clock speed until it failed, then go to the next voltage point and see if it ran stable there. I just kept repeating that process.

But then memory also comes into play. Something I noticed is there's something between core clock and memory clock that makes a big difference: not so much a memory cut-off point, but a ratio the memory and core need to keep with one another. If you can figure it out, you can keep raising your memory and core together by that ratio and all will be well; if it falls out of that magic ratio, your frame rates will suffer. The magic point for me was memory at roughly 2.5 times the core clock for stability and speed. Hope this helps in some way; I'm not too good at explaining what I'm thinking.

Here is my curve:


Quote:


> Originally Posted by *Vellinious*
> 
> How cool were you running it? I'm guessing pretty chilly to keep that clock from dropping frames at those voltages / frequencies. Or....you got a golden sample.


Not cold at all. I put my H105 on it and it always runs cool: idle (depending on house temps, 21-23c) around 26-29c, and under load it maxes at 42c. Anything above 45c and the card throttles. These cards should all come water cooled.
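JoeDirt's ~2.5:1 rule as a quick calculator. This is purely his empirical observation, nothing documented, and it assumes Afterburner's GDDR5X readout of 5005 MHz stock on the 1080, which the offset adds to:

```python
BASE_MEM_MHZ = 5005  # assumed stock GDDR5X clock as Afterburner reports it

def mem_core_ratio(core_mhz, mem_offset_mhz):
    """Effective memory:core ratio for a given core clock and memory offset."""
    return (BASE_MEM_MHZ + mem_offset_mhz) / core_mhz

def mem_offset_for_ratio(core_mhz, ratio=2.5):
    """Memory offset needed to hold the target memory:core ratio."""
    return ratio * core_mhz - BASE_MEM_MHZ

# His posted settings: 2214 MHz core with +585 memory
print(round(mem_core_ratio(2214, 585), 2))  # 2.52 -- right on his ~2.5
print(mem_offset_for_ratio(2214))           # 530.0
```

Interestingly, his own +585 / 2214 combination lands almost exactly on the 2.5 ratio he describes.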


----------



## steeludder

Quote:


> Originally Posted by *Derek1*
> 
> About 100 pages back galeonki and synthetic killer were playing with the voltage curves in AB and posting pics. Those are the only 2 guys I think I have seen break 2300. In one of the pics of the curve in AB it looks as though it is a fairly normal curve up until about 1.07v @ say 2151 but then whomever it was suddenly jacked 1.093 up to 2300.
> Is that what you mean?


Yes. Because they lifted every restriction they could that would cause the card to throttle (temp & power), only the max voltage (1.093V) is limiting him.
Quote:


> Originally Posted by *juniordnz*
> Read my post again.


Yeah, we're essentially saying the same thing. I was trying to add a bit more clarity.


----------



## Vellinious

Quote:


> Originally Posted by *JoeDirt*
> 
> Strix Gaming with the F4 BIOS
> One thing I noticed with this card is that it likes to run lean on voltage. Adding more would yield higher clock speeds but lower frame rates. It's like it's entering the tuning world of cars. I started with a stock curve and added my known best of +245 to see what it changed the curve to and what voltage it was calling for. Ran a pass in FS to confirm and see if stable. If it was I would keep that voltage point as is and keep rising the clock speed until it failed. I would then go to the next voltage point to see if it ran stable there. I just kept repeating that process. But then memory also comes into play. Something I noticed is there is something between core clock and memory clock that makes a big difference. Not so much a memory cut off point but a ratio that the memory and core need to be at with one another. If you are able to figure it out you can keep raising you memory and core by that ratio together and all will be well. If it falls out of that magic ratio then your frame rates will suffer. The magic point for me was memory had to be ruffly 2 1/2 times that of the core clock for stability and speed. Hope this helps in some way. I'm not to good and explaining what I'm thinking.
> 
> Here is my curve:
> 
> Not cold at all. I put my h105 on it and it always runs cool. Idle (depending on house temps 21-23c) around 26-29c and under load maxes at 42c. Anything above 45c and the card throttles. These cards should all come water cooled.


Ah, you must be using the T4 bios
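JoeDirt's per-voltage-point tuning loop quoted above can be sketched roughly in code. This is only an illustration of the process, not a tool: `is_stable` is a hypothetical stand-in for actually running a Fire Strike pass at that point, and all the clocks and voltages are made up (the 13 MHz step matches Pascal's usual clock bins).

```python
# Sketch of the per-voltage-point tuning loop described above.
# is_stable(voltage_mv, clock_mhz) is a stand-in for running a
# Fire Strike pass at that point; all numbers are illustrative.

STEP_MHZ = 13  # Pascal adjusts the core clock in ~13 MHz bins

def tune_point(voltage_mv, start_clock, is_stable, step=STEP_MHZ):
    """Raise one voltage point's clock until the next bump fails."""
    clock = start_clock
    while is_stable(voltage_mv, clock + step):
        clock += step
    return clock

def tune_curve(stock_curve, is_stable):
    """Lock in each point's best clock, walking voltages low to high."""
    return {v: tune_point(v, clk, is_stable)
            for v, clk in sorted(stock_curve.items())}

# Toy stability model: each voltage has some hidden clock ceiling.
ceiling = {1000: 2000, 1043: 2100, 1093: 2150}
stable = lambda v, clk: clk <= ceiling[v]
curve = tune_curve({1000: 1900, 1043: 1950, 1093: 2000}, stable)
print(curve)  # best stable clock found at each voltage point
```

In practice each `is_stable` check is a full benchmark pass, which is why this takes days by hand.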


----------



## steeludder

Quote:


> Originally Posted by *JoeDirt*
> 
> Not so much a memory cut-off point, but a ratio that the memory and core need to be at with one another. If you are able to figure it out, you can keep raising your memory and core by that ratio together and all will be well. If it falls out of that magic ratio, then your frame rates will suffer. The magic point for me was memory had to be roughly 2.5 times the core clock for stability and speed. Hope this helps in some way; I'm not too good at explaining what I'm thinking.


That's very interesting.
Just like back in the days of GPU alt-coin mining.
I wonder if I could push my card further if I tried to maintain that ratio...


----------



## Derek1

Quote:


> Originally Posted by *JoeDirt*
> 
> Strix Gaming with the F4 BIOS
> One thing I noticed with this card is that it likes to run lean on voltage. Adding more would yield higher clock speeds but lower frame rates; it's like entering the tuning world of cars. I started with a stock curve and added my known best of +245 to see what it changed the curve to and what voltage it was calling for. I ran a pass in FS to confirm it was stable; if it was, I would keep that voltage point as-is and keep raising the clock speed until it failed. I would then go to the next voltage point to see if it ran stable there, and just kept repeating that process. But then memory also comes into play. Something I noticed is there is something between core clock and memory clock that makes a big difference. *Not so much a memory cut-off point, but a ratio that the memory and core need to be at with one another. If you are able to figure it out, you can keep raising your memory and core by that ratio together and all will be well. If it falls out of that magic ratio, then your frame rates will suffer. The magic point for me was memory had to be roughly 2.5 times the core clock for stability and speed*. Hope this helps in some way; I'm not too good at explaining what I'm thinking.
> 
> Here is my curve:
> 
> Not cold at all. I put my h105 on it and it always runs cool. Idle (depending on house temps 21-23c) around 26-29c and under load maxes at 42c. Anything above 45c and the card throttles. These cards should all come water cooled.


I was thinking along these lines when I remarked to synthetic after he posted this.
http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/6880#post_25566236
(Look for my post quoting him on this page)
Trying to nail down the variability between performance (FPS/graphics score) and clocks. What exactly is the relationship?
So yeah, your 2.5 ratio is about the same as what works best for me, give or take a few points.
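The rule of thumb being discussed (memory roughly 2.5x core) is trivial to apply as a sanity check. Note this is one user's observation from his own card, not anything official; the ratio will vary card to card.

```python
# JoeDirt's reported "magic" ratio: memory clock ~2.5x core clock.
# An observation from one card, not a spec; treat it as a starting point.

RATIO = 2.5

def memory_target_mhz(core_mhz, ratio=RATIO):
    """Estimate a memory clock to pair with a given core clock."""
    return round(core_mhz * ratio)

print(memory_target_mhz(2152))  # a 2152 MHz core suggests ~5380 MHz memory
```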


----------



## juniordnz

I can't wait to put my 1080 under water so I can play with that T4 BIOS...

BTW, I found out that HWINFO64 is precise as hell on power readings from the GPU. So it's no problem that GPU-Z can't read the power limit % when using the T4 BIOS...


----------



## JoeDirt

Quote:


> Originally Posted by *Vellinious*
> 
> Ah, you must be using the T4 bios


Oops, yeah, I meant T4. Sorry about that.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> I can't wait to put my 1080 under water so I can play with that T4 BIOS...
> 
> BTW, I found out that HWINFO64 is precise as hell on power readings from the GPU. So it's no problem that GPU-Z can't read the power limit % when using the T4 BIOS...


Did you flash to Master or Slave Junior?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Did you flash to Master or Slave Junior?


Can't remember. Slave I guess.


----------



## Vellinious

I need to do that on mine. The single card runs aren't up there where I think they should be, but....the SLI runs are killin it.


----------



## nexxusty

Think I might play with the T4 BIOS on my FE....

Limited to 1.050V with the stock BIOS; any voltage set in the curve past 1.050V gets clamped to 1.050V... I've removed my power limit with a hardware mod and the card never goes past 39C... Hoping I can get more out of it.


----------



## Darkboomhoney

I need the waterblock for my Classified to finish my build... Alphacool says they'll get one in 3-4 weeks.
I hope a custom BIOS comes out to allow a max voltage of 1.25V. My clocks now: 2152 / +500.


----------



## steeludder

Quote:


> Originally Posted by *steeludder*
> 
> That's very interesting.
> Just like back in the days of GPU alt-coin mining.
> I wonder if I could push my card further if I tried to maintain that ratio...


Meh, didn't really work.
I managed to get a run through at 2228 with +570mem but the score was lower. Any combination with higher numbers would just crash.








T4 bios time it is, I suppose...


----------



## nrpeyton

Quote:


> Originally Posted by *JoeDirt*
> 
> Strix Gaming with the F4 BIOS
> One thing I noticed with this card is that it likes to run lean on voltage. Adding more would yield higher clock speeds but lower frame rates; it's like entering the tuning world of cars. I started with a stock curve and added my known best of +245 to see what it changed the curve to and what voltage it was calling for. I ran a pass in FS to confirm it was stable; if it was, I would keep that voltage point as-is and keep raising the clock speed until it failed. I would then go to the next voltage point to see if it ran stable there, and just kept repeating that process. But then memory also comes into play. Something I noticed is there is something between core clock and memory clock that makes a big difference. Not so much a memory cut-off point, but a ratio that the memory and core need to be at with one another. If you are able to figure it out, you can keep raising your memory and core by that ratio together and all will be well. If it falls out of that magic ratio, then your frame rates will suffer. The magic point for me was memory had to be roughly 2.5 times the core clock for stability and speed. Hope this helps in some way; I'm not too good at explaining what I'm thinking.


Hmm, thanks for the info. Something about my card (maybe because it's not MSI, and we're talking MSI AB) meant that I had to reset to default every time before locking a new voltage. So it wouldn't have been practical for me to "do the curve as I go" the way I think you were explaining you did it. (Hitting the default button erases everything.) When I press L on a new voltage point and hit apply, the voltage just doesn't change unless I hit the 'default' button first.
Maybe the STRIX cards don't have this issue. This could even explain why, when I put it all together, it just crashed; maybe AB was unable to use the information I was entering due to an incompatibility, or maybe there's something I need to change in the settings around voltage control. Going to have another look tonight and see if I can fix that; then maybe I can actually use the curve I spent 3 days building, lol. Or even start again, building it up gradually point to point, instead of only ever testing each point individually with no other clocks set at other voltages.
Quote:


> Originally Posted by *Darkboomhoney*
> 
> 
> 
> I need the waterblock for my Classified to finish my build... Alphacool says they'll get one in 3-4 weeks.
> I hope a custom BIOS comes out to allow a max voltage of 1.25V. My clocks now: 2152 / +500.


My thoughts exactly mate. (Another 1080 Classified Owner here)

And I am also waiting on the Alphacool block 

My max clock for my Classified is pretty much exactly the same as yours. It doesn't crash at 2152, but anything higher is *hit or miss*: sometimes it will do it, other times it won't.

Can you get a Fire Strike run completed at 2179, even if it takes you a couple of attempts?


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Unfortunately, it doesn't work like that. The way your curve is set plays a big role in performance. Otherwise, we could just test voltage point 1.093 for the max clock and leave it there. All the points before you hit your max OC will impact performance. How, exactly? No one knows. It's a trial and error game...
> 
> That's why I've given up the curve method and just stick with my max offset OC (+89mhz / 2114mhz @ 1.062V)


There are indeed certain mysteries with GPU Boost 3.0; we'd love it if this information were made public. AMD really has some work to do if we're ever to see NVIDIA's attitude change. Imagine if cars were only available from one domestic manufacturer because only one or two existed: that manufacturer would surely be very secretive, and probably strongly opposed to any modifications or improvements to its designs. Look at what they're charging for something the size of a coin, which can be easily replicated and where the natural resources are in no short supply.

Quote:


> Originally Posted by *steeludder*
> 
> Errr...
> The curve is pretty straightforward actually.
> It will always try to get to the highest voltage point within the given parameters. These are:
> 1) is the voltage within the limit?
> 2) is the power within the limit?
> 3) is the temp within the limit?
> 
> If yes is the answer to all three questions, the frequency given for voltage point 1.093 will be used. If any of the thresholds above is reached, the card will go to the next voltage point below. That's why if you remove thermal and power constraints, voltage will be your only limit and the 1.093 point is the only one you'll need to tweak.


Informative 

But how does GPU Boost 3.0 then decide which clock to give said voltage?

How does one card boost to 2000 automatically out of the box, while another card with the same advertised factory boost clock only reaches 1860? How does the card know how good it is?
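For what it's worth, steeludder's three-check description quoted above can be sketched as code. This only illustrates the fall-through logic of the three checks, not how GPU Boost 3.0 is actually implemented (real boost steps down in small bins); the voltage points, clocks, per-point power estimates, and limits are all invented.

```python
# Illustrative sketch of the quoted three-check logic: sit at the highest
# voltage point whose voltage, estimated power draw, and temperature all
# stay within their limits, otherwise fall through to a lower point.
# All numbers are invented; real GPU Boost behaviour is more granular.

CURVE = {  # voltage_mv -> (clock_mhz, estimated_power_w)
    1093: (2100, 220),
    1062: (2050, 200),
    1000: (1950, 170),
    900:  (1800, 130),
}

def pick_point(curve, v_limit_mv, power_limit_w, temp_c, temp_limit_c):
    """Return the clock for the highest voltage point passing all checks."""
    for v in sorted(curve, reverse=True):  # try highest voltage first
        clock, est_power = curve[v]
        if v <= v_limit_mv and est_power <= power_limit_w and temp_c <= temp_limit_c:
            return clock
    return curve[min(curve)][0]  # everything throttled: curve floor

print(pick_point(CURVE, 1093, 230, 50, 83))  # nothing throttles -> 2100
print(pick_point(CURVE, 1093, 180, 50, 83))  # power limited    -> 1950
```

This is also why removing the power and temp limits (as discussed earlier in the thread) leaves the 1.093V point as the only one that matters.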


----------



## Darkboomhoney

Quote:


> Originally Posted by *nrpeyton*
> 
> Hmm, thanks for the info. Something about my card (maybe because it's not MSI, and we're talking MSI AB) meant that I had to reset to default every time before locking a new voltage. So it wouldn't have been practical for me to "do the curve as I go" the way I think you were explaining you did it. (Hitting the default button erases everything.) When I press L on a new voltage point and hit apply, the voltage just doesn't change unless I hit the 'default' button first.
> Maybe the STRIX cards don't have this issue. This could even explain why, when I put it all together, it just crashed; maybe AB was unable to use the information I was entering due to an incompatibility, or maybe there's something I need to change in the settings around voltage control. Going to have another look tonight and see if I can fix that; then maybe I can actually use the curve I spent 3 days building, lol. Or even start again, building it up gradually point to point, instead of only ever testing each point individually with no other clocks set at other voltages.
> My thoughts exactly mate. (Another 1080 Classified Owner here)
> 
> And I am also waiting on the Alphacool block
> 
> My max clock for my Classified is pretty much exactly the same as yours. It doesn't crash at 2152, but anything higher is *hit or miss*: sometimes it will do it, other times it won't.
> 
> Can you get a Fire Strike run completed at 2179, even if it takes you a couple of attempts?


Yes, I can run Fire Strike at a 2176.5 clock, but it's not game-stable... only 2152 is. I need more voltage and the waterblock to find the max settings.








But with a lower CPU clock and a lower GPU clock I get a higher score... http://www.3dmark.com/fs/9813830


----------



## nrpeyton

Quote:


> Originally Posted by *Darkboomhoney*
> 
> Yes, I can run Fire Strike at a 2176.5 clock, but it's not game-stable... only 2152 is. I need more voltage and the waterblock to find the max settings.


I will be ordering the alphacool block the day it becomes available on their website.

Our cards seem to clock the same; it will be interesting to see how you get on after you have your block, as I'm sure you'll be interested to see if mine does anything for me too.

Will keep you posted 

Have you tried the STRIX/T4 BIOS with your Classified?


----------



## Darkboomhoney

Quote:


> Originally Posted by *nrpeyton*
> 
> I will be ordering the alphacool block the day it becomes available on their website.
> 
> Our cards seem to clock the same; will be interesting to see how you get on after you have your block; as I'm sure you'll be interested to see if mine does anything for me too.
> 
> Will keep you posted
> 
> Have you tried the STRIX/T4 BIOS with your Classified?


Can you post the link to the waterblock? I can't find it on Alphacool...








Oh sorry, I understand, you're waiting to order one too... xD
Yes, I tried the T4 BIOS: high clocks but bad performance. I can push the clock to 2300+ with the T4 BIOS.
I'll post an update once I have the block and have finished my build.


----------



## nrpeyton

Quote:


> Originally Posted by *Darkboomhoney*
> 
> Can you post the link to the waterblock? I can't find it on Alphacool...
> 
> 
> 
> 
> 
> 
> 
> 
> Oh sorry, I understand, you're waiting to order one too... xD
> Yes, I tried the T4 BIOS: high clocks but bad performance. I can push the clock to 2300+ with the T4 BIOS.
> I'll post an update once I have the block and have finished my build.


Here is the link to my original post about it including emails between AlphaCool support and myself:

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/7190

/\ All the information I have is at that link; it's post #7193 on this thread.

I was the same; I could get 2300+ clocks with T4 too, but performance was LESS. Did you measure your power draw while you were trying it? In HWINFO64 I noticed my card would never draw more than about 110W. (In scenarios that used less than 100W anyway, say 80W, my card would only draw maybe 30-40W.)

I was thinking of doing a physical, hard power mod as a workaround for this. If it works, then we'd have a solution for voltage on the Classified 1080 (by combining the power mod and T4).


----------



## wardo3640

So after a week of tinkering this is where I am at...




http://www.3dmark.com/spy/696484

It puts me at 112 on the list of all the 1080s out there, even counting those in 4x SLI, with only about 9 people in 2x SLI above me, but.....

I WANT MORE!!! lol









Any tips or pointers from you pros out there on how I can squeeze a little more performance out of my setup?


----------



## dVeLoPe

I'm having a bit of an issue here. I've got a good clocker, but EVGA is willing to cross-ship me an ''updated'' card,

i.e. one with the new BIOS, thermal pads, new paste, etc. But I was told the card would be ''open'' and not ''new''.

Although the 5-year warranty still applies, that kind of bugs me. What would you guys do?


----------



## Vellinious

Quote:


> Originally Posted by *dVeLoPe*
> 
> I'm having a bit of an issue here. I've got a good clocker, but EVGA is willing to cross-ship me an ''updated'' card,
> 
> i.e. one with the new BIOS, thermal pads, new paste, etc. But I was told the card would be ''open'' and not ''new''.
> 
> Although the 5-year warranty still applies, that kind of bugs me. What would you guys do?


Replace them myself...no way I'd let a good card go.


----------



## Derek1

Quote:


> Originally Posted by *dVeLoPe*
> 
> I'm having a bit of an issue here. I've got a good clocker, but EVGA is willing to cross-ship me an ''updated'' card,
> 
> i.e. one with the new BIOS, thermal pads, new paste, etc. But I was told the card would be ''open'' and not ''new''.
> 
> Although the 5-year warranty still applies, that kind of bugs me. What would you guys do?


Keep it.
Buy the Hybrid kit and put it under water.


----------



## nrpeyton

Quote:


> Originally Posted by *dVeLoPe*
> 
> I'm having a bit of an issue here. I've got a good clocker, but EVGA is willing to cross-ship me an ''updated'' card,
> 
> i.e. one with the new BIOS, thermal pads, new paste, etc. But I was told the card would be ''open'' and not ''new''.
> 
> Although the 5-year warranty still applies, that kind of bugs me. What would you guys do?


I read that EVGA said that if a customer messes up while trying to install the pads themselves, EVGA will still honour their warranty agreement.

So if you've got a good clocker, I agree: don't let it go.

What are you getting then, at max O/C?


----------



## Derek1

This is from the EVGA warranty page, so there is a consideration to be made here. But it could mean more waiting if you needed to keep sending cards back until you got an equal or better card.

"Products sent in for RMA will be repaired and returned or replaced with a thoroughly tested recertified product of equal or greater performance."

I would still keep what you've got, though. Do the fix, or, as I said above, get the Hybrid kit and soak the mofo. lol


----------



## Manac0r

Picked up an EVGA Hybrid 1080 for my ITX build, using an ASRock X99E and a Manta case - so not the smallest, but compared to a full tower it's a step in the right direction.









Anyway, the card is game-stable at 2050MHz on the core and 5500 memory with a 120% power target; temps never go above 50C and it's pretty damn quiet too.

All in all, very impressed, and coming from a Titan X (Maxwell) the gains are noticeable.

Now I just need to resist pushing this card like I did the TX, which died on me. R.I.P. 5/11/16; served with distinction.


----------



## nrpeyton

What the ****... how the ****... I don't know what I did, or how I'm doing it,

but I've been messing around with the curve in the new MSI AB build 4.3.0, and may I say: *MASSIVE* improvements over the last edition (beta 14).

Anyway, as we speak the impossible is *still* happening right before my eyes. I'm about 5 minutes in now and *still not crashed* at 2202MHz on my 1080 EVGA Classified.

See my image below:

'how_the_****.jpg'


6 minutes now....?!?!

*Edit:* 10 mins? My detailed curve, the one I posted in a previous message with all the volts and clocks, seems much more stable in this new MSI AB build... I just made a few adjustments, then went back along it fixing the way the offsets jump around when it "auto balances" every time you click apply. A bit repetitive to overcome, but eventually it does what you want...

Edit 2:
Going to load up 'The Witcher 3' and see if it's actually game-stable too. 

Hey, does this mean I can eBay my card as a "silicon lottery special" *2200MHz'er+* now???? lol


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> What the ****... how the ****... I don't know what I did, or how I'm doing it,
> 
> but I've been messing around with the curve in the new MSI AB build 4.3.0, and may I say: *MASSIVE* improvements over the last edition (beta 14).
> 
> Anyway, as we speak the impossible is *still* happening right before my eyes. I'm about 5 minutes in now and *still not crashed* at 2202MHz on my 1080 EVGA Classified.
> 
> See my image below:
> 
> 'how_the_****.jpg'
> 
> 
> 6 minutes now....?!?!


Terrific!

What are voltages like? Can't read the small print on the pic.
Power Draw?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> What the ****... how the ****... I don't know what I did, or how I'm doing it,
> 
> but I've been messing around with the curve in the new MSI AB build 4.3.0, and may I say: *MASSIVE* improvements over the last edition (beta 14).
> 
> Anyway, as we speak the impossible is *still* happening right before my eyes. I'm about 5 minutes in now and *still not crashed* at 2202MHz on my 1080 EVGA Classified.
> 
> See my image below:
> 
> 'how_the_****.jpg'
> 
> 
> 6 minutes now....?!?!
> 
> *Edit:* 10 mins? My detailed curve, the one I posted in a previous message with all the volts and clocks, seems much more stable in this new MSI AB build... I just made a few adjustments, then went back along it fixing the way the offsets jump around when it "auto balances" every time you click apply. A bit repetitive to overcome, but eventually it does what you want...
> 
> Edit 2:
> Going to load up 'The Witcher 3' and see if its actually game-stable too.
> 
> Hey does this mean I can ebay my card as a "silicon lottery special" *2200mhz'er*+ now???? lol


Did it actually help, though. That's the question.


----------



## Krzych04650

Is the only difference between the MSI Gaming and MSI Gaming X the clocks?


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Terrific!
> 
> What are voltages like? Can't read the small print on the pic.
> Power Draw?


Quote:


> Originally Posted by *Vellinious*
> 
> Did it actually help, though. That's the question.


Quote:


> Originally Posted by *Krzych04650*
> 
> The only difference between MSI Gaming and MSI Gaming X are clocks?


Stock: http://www.3dmark.com/3dm/15931805 (better than *75%* of all results)

Curve (with *2202MHZ*): http://www.3dmark.com/3dm/15931931 (better than *83%* of all results)

/\ Curve above with memory O/C too, at +500: http://www.3dmark.com/3dm/15932558 (first time I've beaten the 4K Gaming PC score)

Okay, my AMD FX-8350 brings down my overall score, but the graphics score on the 2nd one is definitely better. I actually have a validation on 3DMark at 2202MHz. I just never thought this was possible... I mean, I've done it before by cheating (turning the fan off so the core drops down REALLY fast), but for this run it stayed at 2202 the entire time (not just for 0.01 seconds lol).

Edit: Right-click my screenshot and select 'view in new tab'; then I think you can see it at full original size, and you'll be able to see the power draw and voltages in HWINFO64. Total system draw peaked at 525W; the card was normal, with an uppermost peak of 265W.

*Edit 2:*
Also perfectly stable in 'The Witcher 3', although the core dropped to 2189 after 5 minutes due to the temp going above 56C. But it never dropped below 2189, and it was stable... no crashes.

*Edit 3:*
Before this, I've never really been truly stable unless running at 2152 or lower (traditional core-clock slider adjustment in AB).

*Edit 4:*
Core O/C = an extra 3FPS (77 to 80 FPS).
This curve also seems most partial to a +500MHz memory O/C (got me from 80 to 86 FPS).
Even going to +499 or +501 = a small FPS drop.

So my total O/C, core and memory combined, takes me from 77FPS to 86FPS: 10%! 

Just a shame the core doesn't do as much on Pascal as it did on Maxwell; something to do with shader count, I think?

10% is what I got with my 980 STRIX, but my max overclock on that was 1409 (or 1389 stable). So now I'm happy! Got my 10%, and got my 2200 for re-selling the card in the future... hey, I'll include the MSI AB file with the curve if need be  lol


----------



## greg1184

I just put my EVGA 1080 ACX 3.0 under water. What options do I have to increase the voltage? I'm at +150/+500 with EVGA Precision.


----------



## dseg

Quote:


> Originally Posted by *greg1184*
> 
> I just put my EVGA 1080 ACX 3.0 under water. What options do I have to increase the voltage? I'm at +150/+500 with EVGA Precision.


What kind of block did you get?
Does only EK make blocks for the ACXs?


----------



## greg1184

Quote:


> Originally Posted by *dseg*
> 
> What kind of block did you get?
> Does only EK make blocks for the ACXs?










My first water block, and I love it. Phanteks gave great directions for installing it, and it was pretty easy.


----------



## 0gata

http://www.3dmark.com/spy/697994
Hmm, not bad... I just pushed the sliders randomly. Gonna try to OC properly tomorrow.
It's a Gaming X with a Z BIOS.


----------



## steeludder

Quote:


> Originally Posted by *nrpeyton*
> Informative
> 
> But how does GPU Boost 3.0 then decide which "clock" to give said voltage.?
> 
> How does one card boost to 2000 automatically out of the box, while another card with the same advertised factory boost clock only reach 1860. How does the card know how good it is?


I'm gonna venture a guess that all cards of the same model will boost similarly, according to a curve set by the manufacturer. At no point does the card know how good it is; it just boosts along the curve as per the original manufacturer setting.
The GPU will always boost to the highest possible voltage point until it reaches one of the throttling thresholds.


----------



## DADDYDC650

Just got me an EVGA GTX 1080 ACX 3.0 from B&H for $599 with free Gears 4 game. Should hold me over until the 1080 Ti is released.









Also, I noticed I have until Feb 1 to return the card, so I've got 3 options to upgrade to the 1080 Ti: step up, return the card, or sell it if all else fails.


----------



## KickAssCop

Prices on these cards are declining. Just saw the Zotac 1080 AMP for about $572 on Amazon. Not bad at all.


----------



## MrTOOSHORT

Nice-looking card, greg1184; the stock backplate looks sweet too!


----------



## Derek1

Quote:


> Originally Posted by *DADDYDC650*
> 
> Just got me an EVGA GTX 1080 ACX 3.0 from B&H for $599 with free Gears 4 game. Should hold me over until the 1080 Ti is released.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, noticed I have until Feb 1 to return the card. Got 3 options to upgrade to the 1080 Ti then. Step-up, return the card or sell it if all else fails.


You might need to run over to the EVGA forums and find out if you need to install the new vbios and order a set of thermal pads.
The ACX 3.0 coolers are having some issues.


----------



## Vellinious

Hit a personal best graphics score this morning... wish I had gone ahead and bumped the CPU up at the same time. Oh well, still pretty good. Used the same curve I had created for SLI. Might try to bump the core up a little more tonight and see what happens.

http://www.3dmark.com/spy/699181


----------



## nrpeyton

Quote:


> Originally Posted by *steeludder*
> 
> I'm gonna venture a guess that all cards of the same model will boost similarly, according to a curve set by the manufacturer. At no point does the card know how good it is; it just boosts along the curve as per the original manufacturer setting.
> The GPU will always boost to the highest possible voltage point until it reaches one of the throttling thresholds.


Interesting, but I wonder how the "boost" is actually calculated. My card boosts to 2000 out of the box (and sits between 1987MHz and 2025MHz depending on temperature). Also, if I simply do a traditional +100 on the core clock slider, my "boost" value changes from the stock 1860 to 1960. All makes sense so far....

However, if I then set a manual curve by plotting a +100 offset on every voltage point from 800mV (lowest) up to 1.093V (not the highest on the graph, but the highest it will go) and hit apply, my boost in GPU-Z only changes to 1898.

I wonder what the significance of this "final" or "advertised" boost speed actually is, because I could add +300 on the last voltage point of the curve (higher than the 1.093V limit) which the card will never, ever get to... but doing so still gives me a higher "advertised" boost in GPU-Z.

It must mean or do *something*, because I can get to a semi-stable 2202MHz using the curve but *never* with the traditional offset method.
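The "same offset at every point between 800 mV and 1.093 V" experiment described above is simple to express as an operation on a curve. The stock clocks below are invented purely to show the shape of it; real curves come out of Afterburner, not a dict.

```python
# Flat-offset curve: add the same MHz offset to every voltage point
# between 800 mV and 1093 mV, leaving points outside that range alone.
# Stock clocks are illustrative, not from any real card.

def apply_flat_offset(stock_curve, offset_mhz, lo_mv=800, hi_mv=1093):
    """Offset every point inside [lo_mv, hi_mv]; leave the rest alone."""
    return {v: (clk + offset_mhz if lo_mv <= v <= hi_mv else clk)
            for v, clk in stock_curve.items()}

stock = {750: 1500, 800: 1600, 900: 1750, 1000: 1900, 1093: 2000}
flat = apply_flat_offset(stock, 100)
print(flat)  # points from 800 mV up shift by +100; 750 mV stays put
```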


----------



## 0gata

Quote:


> Originally Posted by *nrpeyton*
> 
> What the ****... how the ****... I don't know what I did, or how I'm doing it,
> 
> but I've been messing around with the curve in the new MSI AB build 4.3.0, and may I say: *MASSIVE* improvements over the last edition (beta 14).
> 
> Anyway, as we speak the impossible is *still* happening right before my eyes. I'm about 5 minutes in now and *still not crashed* at 2202MHz on my 1080 EVGA Classified.
> 
> See my image below:
> 
> 'how_the_****.jpg'
> 
> 
> 6 minutes now....?!?!
> 
> *Edit:* 10 mins? My detailed curve, the one I posted in a previous message with all the volts and clocks, seems much more stable in this new MSI AB build... I just made a few adjustments, then went back along it fixing the way the offsets jump around when it "auto balances" every time you click apply. A bit repetitive to overcome, but eventually it does what you want...
> 
> Edit 2:
> Going to load up 'The Witcher 3' and see if its actually game-stable too.
> 
> Hey does this mean I can ebay my card as a "silicon lottery special" *2200mhz'er*+ now???? lol


Are you using MSI AB 4.3 final, or the 14 beta? (I didn't understand.)

And BTW, that curve is a big mess; it always changes, and it's changed the clock-speed temperature intervals. Before, it dropped the clock frequency every 10C; now it downclocks every 5C.


----------



## Koniakki

Quote:


> Originally Posted by *0gata*
> 
> Are you using MSI AB 4.3 final, or the 14 beta? (I didn't understand.)
> 
> And BTW, that curve is a big mess; it always changes, and it's changed the clock-speed temperature intervals. Before, it dropped the clock frequency every 10C; now it downclocks every 5C.


He was using the previous version of MSI AB, but the higher clocks in the post you quoted were achieved using the latest version, the 4.3.0 final release.

In short, he's using MSI AB 4.3.0 Final.


----------



## DADDYDC650

Quote:


> Originally Posted by *Derek1*
> 
> You might need to run over to the EVGA forums and find out if you need to install the new vbios and order a set of thermal pads.
> The ACX 3.0 coolers are having some issues.


I'm fully aware. I'm hoping it's a fixed version, or I'll have EVGA send me a new one. I have until Feb 1 to return it regardless.


----------



## Derek1

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm fully aware. Hoping it is a fixed version or I'll have EVGA send me a new one. I have until Feb 1 to return it regardless.


To the place you bought it? That's generous; most places only offer 14 days, 30 if you're lucky, and you still have to pay a restocking fee.
The EVGA warranty is 30 days for a new card; after that you get a refurb.


----------



## DADDYDC650

Quote:


> Originally Posted by *Derek1*
> 
> To the place you bought it? That is generous, most places only offer 14 day and if you are lucky 30. And still have to pay a restock fee.
> EVGA warranty is 30 days for a new card and after that you get a refurb.


Most places give you until then because of their holiday return policy.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Interesting, but I wonder how the "boost" is actually calculated. My card boosts to 2000 out of the box (and sits between 1987MHz and 2025MHz depending on temperature). Also, if I simply do a traditional +100 on the core clock slider, my "boost" value changes from the stock 1860 to 1960. All makes sense so far....
> 
> However, if I then set a manual curve by plotting a +100 offset on every voltage point from 800mV (lowest) up to 1.093V (not the highest on the graph, but the highest it will go) and hit apply, my boost in GPU-Z only changes to 1898.
> 
> I wonder what the significance of this "final" or "advertised" boost speed actually is, because I could add +300 on the last voltage point of the curve (higher than the 1.093V limit) which the card will never, ever get to... but doing so still gives me a higher "advertised" boost in GPU-Z.
> 
> It must mean or do *something*, because I can get to a semi-stable 2202MHz using the curve but *never* with the traditional offset method.


The offset method has a tendency to line up the highest clock with something less than 1.093V. Set an offset and then look at the frequency/voltage curve it creates for that clock; you'll see the difference right away.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> You might need to run over to the EVGA forums and find out if you need to install the new vbios and order a set of thermal pads.
> The ACX 3.0 coolers are having some issues.


New BIOS is just a more aggressive fan curve; only useful if you leave the fan on auto while gaming (which to me would defeat the entire purpose of even "being" on an O/C thread at all) lol; since leaving your fan on auto = automatic performance decrease.

Best thing to do is just set a manual fan curve; more fun that way anyway 

What would have been good to see in the BIOS is the fans actually using their entire 12V capacity (or at least giving you the option).

I have a 1080 Classified; my fans max out at 2800RPM but are capable of around 3600RPM. They must only be running at around 9V at max.
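For what it's worth, the ~9V guess follows from a rough back-of-envelope calculation, assuming fan RPM scales roughly linearly with drive voltage (only an approximation for real DC fans):

```python
def estimated_drive_voltage(observed_rpm, max_rpm, supply_v=12.0):
    """Estimate fan drive voltage from the RPM ratio, assuming a roughly
    linear RPM-to-voltage relationship (a simplification)."""
    return supply_v * observed_rpm / max_rpm

# 2800 RPM observed out of a ~3600 RPM capable fan on a 12V rail:
print(round(estimated_drive_voltage(2800, 3600), 1))  # 9.3
```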


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> New BIOS is just a more aggressive fan curve; only useful if you leave the fan on auto while gaming (which to me would defeat the entire purpose of even "being" on an O/C thread at all) lol; since leaving your fan on auto = automatic performance decrease.
> 
> Best thing to do is just set a manual fan curve; more fun that way anyway
> 
> What would have been good to see in the BIOS is the fans actually using their entire 12V capacity (or at least giving you the option).
> 
> I have a 1080 Classified; my fans max out at 2800RPM but are capable of around 3600RPM. They must only be running at around 9V at max.


Oh ya, I know all that, but from reading that forum over there for the last 2 or 3 weeks you would think that nobody there has ever known how to control a fan. lol
A lot of whiny babies out there who need to have everything spoon-fed to them and are too spoiled and lazy to do any reading/research on their own.
Part of the culture of entitlement and inability to think critically, preferring hyper-emotionality and narcissism to get their demands met.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> New BIOS is just a more aggressive fan curve; only useful if you leave the fan on auto while gaming (which to me would defeat the entire purpose of even "being" on an O/C thread at all) lol; since leaving your fan on auto = automatic performance decrease.
> 
> Best thing to do is just set a manual fan curve; more fun that way anyway
> 
> What would have been good to see in the BIOS is the fans actually using their entire 12V capacity (or at least giving you the option).
> 
> I have a 1080 Classified; my fans max out at 2800RPM but are capable of around 3600RPM. They must only be running at around 9V at max.


Actually, the new BIOS unlocked more RPM on the fan curve. The original BIOS would come up to ~2700RPM; the new one goes up to ~3200RPM. Don't know about the fan curve they set in the vBIOS because I use my own (100% when gaming, because I use headphones and can't hear it anyway). Those extra 500RPM are very welcome IMO.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Oh ya, I know all that, but from reading that forum over there for the last 2 or 3 weeks you would think that nobody there has ever known how to control a fan. lol
> A lot of whiny babies out there who need to have everything spoon-fed to them and are too spoiled and lazy to do any reading/research on their own.
> Part of the culture of entitlement and inability to think critically, preferring hyper-emotionality and narcissism to get their demands met.


lol 

Quote:


> Originally Posted by *juniordnz*
> 
> Actually, the new BIOS unlocked more RPM on the fan curve. The original BIOS would come up to ~2700RPM; the new one goes up to ~3200RPM. Don't know about the fan curve they set in the vBIOS because I use my own (100% when gaming, because I use headphones and can't hear it anyway). Those extra 500RPM are very welcome IMO.


hmm really, okay excellent.

perfect example of a good reason to visit these forums... rep +1 lol

i'm away to download it then; hope the same applies to Classified


----------



## Vellinious

Quote:


> Originally Posted by *JoeDirt*
> 
> Just broke my GFX score again with a 26382
> 
> 
> Memory is at +585 with a custom curve for core clock that hits 2214 @ 1.175v
> 
> Went to Afterburner to give me better voltage curve control.


Man....gave it a go, but, I came up WAY short. You've got a killer GPU, man. Mine won't touch 2200+ at 1.075v.

25178 was the best I could muster for graphics score.

http://www.3dmark.com/fs/10722645


----------



## THEROTHERHAMKID

What's the EVGA Precision auto overclock like? Does it work? Any good?


----------



## juniordnz

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What's the EVGA Precision auto overclock like? Does it work? Any good?


Most likely it doesn't. You can't test isolated points of the curve for stability and then put them together. That would be pretty simple and straightforward but, unfortunately, it doesn't work like that.


----------



## nrpeyton

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> What's the EVGA Precision auto overclock like? Does it work? Any good?


It's rubbish. It doesn't resume where it left off after a crash.

For example, if it is scanning 1.063v at +50mhz, +60mhz, +70mhz, +80mhz, +90mhz and +100mhz then crashes at +110mhz, it won't save the scan as a +100mhz offset and then move on to the next voltage.
Instead, unfortunately, it will start right back at the beginning for 1.063v at +10mhz. It will repeat this a few times until eventually giving up. When it gives up it will set the lowest possible + offset for that voltage before moving on to the next one.

Okay, it's still better than before; before, it wouldn't even properly save its progress at all and you'd often have to start from scratch every time the driver crashed. Meaning you'd never finish a full run and never be able to use it to generate your curve.

Now at least you get to finish; but due to its tendency to "give up and aim for lowest" instead of simply saving the + offset from the "last good scan" and moving on, it's still inadequate.

If anyone has better luck with it than me, please share.

If it actually worked it would be the best thing that's happened to GPU overclocking since forever.

Get your finger out, EVGA. Pathetic. Definitely a missed opportunity. MSI AB puts you to shame.

*Edit*
Maybe it works better if you own a card that artifacts before the driver crashes when overclocking core.
On my card I NEVER see artifacts before crashes in any environment I've tested.
On my GTX 980 it was the opposite.
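The resume behaviour being asked for here, keeping the last good offset per voltage point and moving on after the first failure, could be sketched like this; `test_stability` is a hypothetical stand-in for a real stress test, not anything EVGA ships:

```python
def scan_curve(voltage_points, test_stability, step=10, max_offset=300):
    """Return {voltage: highest passing offset} for each voltage point.

    Unlike the scanner described above, this keeps the last offset that
    passed and moves to the next voltage point on the first failure,
    instead of restarting from the bottom.
    """
    results = {}
    for mv in voltage_points:
        last_good = 0
        offset = step
        while offset <= max_offset and test_stability(mv, offset):
            last_good = offset
            offset += step
        # First failure: record last_good and move on rather than
        # re-scanning this voltage point from scratch.
        results[mv] = last_good
    return results

# Example: pretend every voltage point is stable up to +100 only.
curve = scan_curve([1000, 1063, 1093], lambda mv, off: off <= 100)
print(curve)  # {1000: 100, 1063: 100, 1093: 100}
```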


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> It's rubbish. It doesn't resume where it left off after a crash.
> 
> For example, if it is scanning 1.063v at +50mhz, +60mhz, +70mhz, +80mhz, +90mhz and +100mhz then crashes at +110mhz, it won't save the scan as a +100mhz offset and then move on to the next voltage.
> Instead, unfortunately, it will start right back at the beginning for 1.063v at +10mhz. It will repeat this a few times until eventually giving up. When it gives up it will set the lowest possible + offset for that voltage before moving on to the next one.
> 
> Okay, it's still better than before; before, it wouldn't even properly save its progress at all and you'd often have to start from scratch every time the driver crashed. Meaning you'd never finish a full run and never be able to use it to generate your curve.
> 
> Now at least you get to finish; but due to its tendency to "give up and aim for lowest" instead of simply saving the + offset from the "last good scan" and moving on, it's still inadequate.
> 
> If anyone has better luck with it than me, please share.
> 
> If it actually worked it would be the best thing that's happened to GPU overclocking since forever.
> 
> Get your finger out, EVGA. Pathetic. Definitely a missed opportunity. MSI AB puts you to shame.
> 
> *Edit*
> Maybe it works better if you own a card that artifacts before the driver crashes when overclocking core.
> On my card I NEVER see artifacts before crashes in any environment I've tested.
> On my GTX 980 it was the opposite.


Those clocks / voltages are going to vary greatly in how stable they are depending on the temps your GPU is running at. I would imagine by the time it works its way through all those settings, it's probably getting up there a ways, and the higher clocks aren't going to be stable anyway.

Temps are EVERYTHING with Pascal


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Temps are EVERYTHING with Pascal


Amen!

btw, what's the temp point where your card gets its first clock down? I've noticed a big difference from Armor to FTW on the first clock down. My Armor would throttle as soon as the temp hit 39ºC, while the FTW can hold its max clock until 47-48ºC at 1.062V, a little bit more when at 1.093V.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Those clocks / voltages are going to vary greatly in how stable they are depending on the temps your GPU is running at. I would imagine by the time it works its way through all those settings, it's probably getting up there a ways, and the higher clocks aren't going to be stable anyway.
> 
> Temps are EVERYTHING with Pascal


I'm hearing you man; really I am -- and I *can not wait* for Alphacool to finish this mysterious new full block they are making for my Classified 

I know you've mentioned this to me a few times now and you're absolutely right; I really need to get her under water before I can have any further moaning rights lol ;-)

The other alternative I have is to swap it for an FTW -- but considering I can run stable at 2152mhz and get a non/semi-stable 3dmark test validation at 2202 with a bit of curve work, I would go as far as to say I have a decent clocker. Not exceptional -- but definitely "good". If I made the swap I'd be risking a lot....

I actually spoke to EVGA about possibly doing this swap yesterday; they've asked me to write back with reasons why I'm not satisfied with the card and said that they may be able to talk to their RMA department to make an exception. (As long as I was happy to lose the difference between the higher priced Classified and the FTW.)

I was thinking about writing back and not only mentioning the water-block problem (as officially there is still nothing for the Classified yet and I believe they are unaware of Alphacool's plans) but also mentioning the ridiculous situation we are in where Classified owners are actually more volt-limited than owners of any other card in their line-up due to the STRIX/T4 Classified incompatibility. However, I'm worried that if I mention this they will cancel my warranty or something lol....

I seem to be making small, reserved and careful steps towards a future swap every week (due to my dissatisfaction) but I still find myself unable to just MAKE the damn decision on this one. I'm not usually so indecisive.

My emails to EVGA so far have been carefully written, polite and with an "air of" a guy just looking for a bit of guidance. I'm trying to be careful what I actually ask for and trying to let them suggest it for me. But my dissatisfaction at the situation is definitely real -- don't get me wrong. I have the bigger, more expensive card, but due to the STRIX/T4 incompatibility and EK waterblock support being dropped I feel the overclocking feature-set I'm getting is smaller than that of owners of less expensive cards. This is back-to-front considering the EVGA Classifieds were always meant to have had "overclocking DNA in their blood".

Nick Peyton


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> what's the temp point where your card gets its first clock down?


Depends how you have it overclocked.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Amen!
> 
> btw, what's the temp point where your card gets its first clock down? I've noticed a big difference from Armor to FTW on the first clock down. My Armor would throttle as soon as the temp hit 39ºC, while the FTW can hold its max clock until 47-48ºC at 1.062V, a little bit more when at 1.093V.


35c on mine. 36c or above, and I start seeing worse results. 39c at the voltage / frequency curve I'm pushing, means a driver crash.
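The stepped behaviour described in this exchange can be illustrated with a toy model; the threshold, step size, and ~13MHz bin below are commonly reported approximations for Pascal's GPU Boost, not values measured from any specific card:

```python
def boosted_clock(base_boost_mhz, temp_c, first_step_c=38, step_c=5, bin_mhz=13):
    """Toy model: below a first temperature threshold the card holds its
    max boost; past it, one ~13MHz clock bin is shed every few degrees.
    All thresholds here are illustrative assumptions."""
    if temp_c < first_step_c:
        return base_boost_mhz
    bins_lost = 1 + (temp_c - first_step_c) // step_c
    return base_boost_mhz - bins_lost * bin_mhz

print(boosted_clock(2100, 35))  # 2100: below the first threshold
print(boosted_clock(2100, 48))  # 2061: a few bins lower
```

Real cards differ in where the first step lands (39ºC on the Armor versus 47-48ºC on the FTW, per the post above), which is why keeping temps down matters so much for sustained clocks.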


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> 35c on mine. 36c or above, and I start seeing worse results. 39c at the voltage / frequency curve I'm pushing, means a driver crash.


So, contrary to what some said to me in this and other threads, temperature can change the max overclock a card can achieve.

Nice to hear that...


----------



## ncck

Quote:


> Originally Posted by *greg1184*
> 
> My first water block and I Love it. Phanteks gave great directions to install it and it was pretty easy.


What case?


----------



## Koniakki

Quote:


> Originally Posted by *ncck*
> 
> What case?


He has an *Inwin 902* listed in his specs if he's still using that one.


----------



## greg1184

Quote:


> Originally Posted by *Koniakki*
> 
> He has an *Inwin 902* listed in his specs if he's still using that one.


Oops, a typo. It's actually the InWin 909. Really enjoyed working with the case.


----------



## klepp0906

so modded/custom bios' never became a thing w/ the 1080's eh? nvidia have em locked down too tight?


----------



## nrpeyton

Quote:


> Originally Posted by *klepp0906*
> 
> so modded/custom bios' never became a thing w/ the 1080's eh? nvidia have em locked down too tight?


Yes, you are correct. And unfortunately this has seriously devalued cards such as the Classified 1080, which now suffers from a "lesser" overclocking feature-set than some of the cheaper cards. The reason is that the modified BIOS that leaked is compatible with cards with fewer phases on their VRMs, but not with models with extreme phase counts such as the Classified's 14-phase system. This has left us with a "back-to-front" problem in which owners of cards with smaller feature-sets actually have *more* voltage control than the bigger cards.

I wish they would "accidentally" ;-) leak a BIOS that allowed voltage up to 1.25v for Classified/HOF owners.
It would make sense for sales of these cards. And I can't see it hurting RMA performance since cards with weaker VRMs already have the STRIX/T4. I'm only asking for the same voltage control everyone else has.

Also, I am looking forward to Alphacool finishing their block for my card to get temps down lol


----------



## klepp0906

Thank you for the reply. Figured at least something with modified memory timings would've been made available, but I guess with limited voltage control you'd be really limited on that end as well.


----------



## nrpeyton

*-- I have a theory --*

What if I reached out to EVGA and said I'm participating in an LN2 competition for world-record-breaking? Remind them that I am an 'EVGA CLASSIFIED 1080 Owner' (as they'll be able to check this from my account).

And ask them to release the same BIOS that they supplied to Kingpin for his card for LN2. (Or at least something that has the equivalent voltage control as the T4 up to 1.25v).

And I can't see it hurting RMA performance since cards with weaker VRM's already have the STRIX/T4. I'm only asking for the same voltage control everyone else has.

ASUS did this with their LN2 guys (that is how the T4/STRIX BIOS exists - remember guys the T4 *IS* an ASUS signed BIOS). One of these LN2 guys was just kind enough to release it to the public.

I could even tell EVGA that I do not plan to release this to the public, but if anyone asks me specifically I will simply direct them to EVGA where that person can also ask EVGA if they would be willing to release the BIOS.

I understand I would be surrendering my warranty by receiving the BIOS.

Do you think there's any chance I'd get anywhere with such a request? Or am I just guilty of wishful thinking?

What if they decided to be total ****s; would they cancel my warranty anyway and not supply the BIOS either? That would be the worst possible outcome I guess.

Remember guys before you reply; my Classified does have "overclocking DNA in its blood" -- it only EXISTS *BECAUSE* of overclocking. And the second BIOS on my card is also physically labeled "LN2". 

Nick


----------



## fat4l

It would be great to see..... FE bios allowing up to 1.2v and ......200% TDP


----------



## Fediuld

Had asked previously but received no reply.
Has anyone tried the T4 BIOS on cards other than Asus ones? I saw someone who used it on a Palit with good results. Has anyone had luck with MSI cards?


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> It would be great to see..... FE bios allowing up to 1.2v and ......200% TDP


FE cards are compatible with the T4 BIOS, and can therefore have 1.2v and 200% TDP if they wish. For a Classified owner no such option exists due to incompatibility.
Quote:


> Originally Posted by *Fediuld*
> 
> Had asked previously but received no reply.
> Has anyone tried the T4 BIOS on cards other than Asus ones? I saw someone who used it on a Palit with good results. Has anyone had luck with MSI cards?


How did you word your email?


----------



## davepk

My Zotac 1080 AMP Extreme runs fine on it.

With the stock BIOS, though my card could OC to and stabilize at 2012 or so, it never saw 1.093V.

With T4 my card stabilizes at 2062 @ 1.113V.

It's not much of an improvement really (silicon lottery and all) but it does demonstrate the benefit.

See this thread for further info.


----------



## dayveegravy

Just be honest with EVGA. Tell them everything you've said here. It's a perfectly reasonable complaint. What you have to do is weigh the risk of getting another GPU (in terms of silicon lottery) against the probability that a better BIOS will be released in the future. I would say it's probable that a better BIOS will be released, because if they didn't it would ruin the Classified brand. It's just: how long can you put up with the dissatisfaction?


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> FE cards are compatible with the T4 BIOS. And can therefor have 1.2v and 200% TDP if they wish. For a Classified owner no such option exists due to incompatibility.
> How did you word your email?


When I flashed T4 it decreased my score/performance even with higher MHz, so... I don't think it's that "compatible".


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> When I flashed T4 it decreased my score/performance even with higher MHz, so... I don't think it's that "compatible".


Did it draw less power? (reading in HWINFO64) or measured with a watt meter?



Quote:


> Originally Posted by *dayveegravy*
> 
> Just be honest with EVGA. Tell them everything you've said here. It's a perfectly reasonable complaint. What you have to do is weigh the risk of getting another GPU (in terms of silicon lottery) against the probability that a better BIOS will be released in the future. I would say it's probable that a better BIOS will be released, because if they didn't it would ruin the Classified brand. It's just: how long can you put up with the dissatisfaction?


that is a very good point; definitely something for me to think over 

Quote:


> Originally Posted by *davepk*
> 
> My Zotac 1080 AMP Extreme runs fine on it.
> 
> With the stock BIOS, though my card could OC to and stabilize at 2012 or so, it never saw 1.093V.
> 
> With T4 my card stabilizes at 2062 @ 1.113V.
> 
> It's not much of an improvement really (silicon lottery and all) but it does demonstrate the benefit.
> 
> See this thread for further info.


interesting


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *-- I have a theory --*
> 
> What if I reached out to EVGA and said I'm participating in an LN2 competition for world-record-breaking? Remind them that I am an 'EVGA CLASSIFIED 1080 Owner' (as they'll be able to check this from my account).
> 
> And ask them to release the same BIOS that they supplied to Kingpin for his card for LN2. (Or at least something that has the equivalent voltage control as the T4 up to 1.25v).
> 
> And I can't see it hurting RMA performance since cards with weaker VRM's already have the STRIX/T4. I'm only asking for the same voltage control everyone else has.
> 
> ASUS did this with their LN2 guys (that is how the T4/STRIX BIOS exists - remember guys the T4 *IS* an ASUS signed BIOS). One of these LN2 guys was just kind enough to release it to the public.
> 
> I could even tell EVGA that I do not plan to release this to the public, but if anyone asks me specifically I will simply direct them to EVGA where that person can also ask EVGA if they would be willing to release the BIOS.
> 
> I understand I would be surrendering my warranty by receiving the BIOS.
> 
> Do you think there's any chance I'd get anywhere with such a request? Or am I just guilty of wishful thinking?
> 
> What if they decided to be total ****s; would they cancel my warranty anyway and not supply the BIOS either? That would be the worst possible outcome I guess.
> 
> Remember guys before you reply; my Classified does have "overclocking DNA in its blood" -- it only EXISTS *BECAUSE* of overclocking. And the second BIOS on my card is also physically labeled "LN2".
> 
> Nick


Or just go to the KPE forums and ask Kingpin himself?

Already tried that.....result: no bios

Best of luck with EVGA lol


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Or just go to the KPE forums and ask Kingpin himself?
> 
> Already tried that.....result: no bios
> 
> Best of luck with EVGA lol


what did he say?

I don't like him anymore

lol


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> what did he say?
> 
> I don't like him anymore
> 
> lol


http://forum.kingpincooling.com/showthread.php?t=3908


----------



## hubes

Hey Guys,

Anyone had any issues with their 1080 FE where the display (after being idle for some time) says "no display input" and the ONLY way to get it back is to restart the PC? I've been working with Zotac and they seem to think it's a driver issue. I've done clean installs of all the drivers and I just wanted to see if anyone else has been seeing this. I don't have any power management settings set and the PC never sleeps or hibernates.

Any assistance would be greatly appreciated!


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> http://forum.kingpincooling.com/showthread.php?t=3908


perfect mate; I've just added my own 2 cents to the thread as well lol.

I will copy and paste for everyone here; but before I do: Kingpin basically said "no" and simply stated something like "this has been covered before, see post ###".

The point I am going to make is that *we as overclockers* want to experience what "you" experienced when you found this out. We want to experience the "trial and error" for ourselves. Not have someone *else* tell us what our cards can/can't do.

Here is my reply: (or you could just read the whole thread on the link Vellinious posted).

=======================================================
Hi,

On PASCAL: It is not just about how much of a "gain" in FPS you get.

It's about the level of tweaking you get.

Or another way to put it; the "overclocking feature-set". Regardless of how much 3dmark points or FPS you gain.

The whole point in overclocking is you get to "play around" and see for yourself what works and what doesn't.

When did we ever *need* to overclock? We didn't ---

You experience the trial and error, but more importantly you experience it yourself.

There is something about the feeling of "being in control" of your hardware. Going a little beyond specification and "experimenting" can be thrilling to an overclocker -- whether the end gains are 100FPS or 5FPS.

Many of us just want to find out for ourselves; not just read "oh we've tried it this is covered in post ###". We want to experience the same as you experienced when you "found this out".

ASUS has released the T4/STRIX BIOS to their LN2 sponsors and one of these guys was kind enough to release this to the public. The voltage increase is only to 1.25v and has not harmed RMA targets.

95% of card models out there (from FE right up to EVGA FTW and even zotac cards) are all able to take advantage of this BIOS. However 1080 Classified cards are NOT compatible with T4 BIOS.

This means we now have a back-to-front situation where people with the Classified actually have LESS voltage control than owners of any other card in the EVGA line-up. (100% due to the fact that the ASUS T4 BIOS is compatible with all EVGA 1080 Cards EXCEPT the Classified). That situation is completely back-to-front.

Kingpin, please: is there any way you could release the BIOS that you used for your own LN2 benching on your EVGA Classified?

Thank you 

The release of this BIOS would seriously add value to a seriously devalued Classified 1080 Card. Mostly due to the back-to-front situation which I have described.

Many Classified owners are thinking of exchanging their cards for FTW's. In the 'GTX 1080 Owners Club' forum on overclock.net there are countless posts by FTW owners who have tried the T4 ASUS BIOS on their cards and who were able to break their own 'personal bests' on 3DMARK as a direct consequence. Maybe not by much. But enough to experience the "thrill".
Thanks again 
=======================================================


----------



## dayveegravy

Quote:


> Originally Posted by *nrpeyton*
> 
> *-- I have a theory --*
> 
> What if I reached out to EVGA and said I'm participating in an LN2 competition for world-record-breaking? Remind them that I am an 'EVGA CLASSIFIED 1080 Owner' (as they'll be able to check this from my account).
> 
> And ask them to release the same BIOS that they supplied to Kingpin for his card for LN2. (Or at least something that has the equivalent voltage control as the T4 up to 1.25v).
> 
> And I can't see it hurting RMA performance since cards with weaker VRM's already have the STRIX/T4. I'm only asking for the same voltage control everyone else has.
> 
> ASUS did this with their LN2 guys (that is how the T4/STRIX BIOS exists - remember guys the T4 *IS* an ASUS signed BIOS). One of these LN2 guys was just kind enough to release it to the public.
> 
> I could even tell EVGA that I do not plan to release this to the public, but if anyone asks me specifically I will simply direct them to EVGA where that person can also ask EVGA if they would be willing to release the BIOS.
> 
> I understand I would be surrendering my warranty by receiving the BIOS.
> 
> Do you think there's any chance I'd get anywhere with such a request? Or am I just guilty of wishful thinking?
> 
> What if they decided to be total ****s; would they cancel my warranty anyway and not supply the BIOS either? That would be the worst possible outcome I guess.
> 
> Remember guys before you reply; my Classified does have "overclocking DNA in its blood" -- it only EXISTS *BECAUSE* of overclocking. And the second BIOS on my card is also physically labeled "LN2".
> 
> Nick


I have seen a few instances where EVGA has offered to replace cards because the owners weren't satisfied with the VRAM overclock they were getting. There was nothing wrong with the RAM (I had a similar RAM OC) and their core clocks were amazing, but they weren't happy and so EVGA replaced them. They have a good reputation for a reason. Unless they've changed, I think they will let you swap it out.

But if it was me I'd wait for a BIOS until January. If by then one still hadn't been released, I'd get them to upgrade you to a 1080 Ti if you can afford the difference.


----------



## Dragonsyph

Quote:


> Originally Posted by *dayveegravy*
> 
> I have seen a few instances where EVGA has offered to replace cards because the owners weren't satisfied with the VRAM overclock they were getting. There was nothing wrong with the RAM (I had a similar RAM OC) and their core clocks were amazing, but they weren't happy and so EVGA replaced them. They have a good reputation for a reason. Unless they've changed, I think they will let you swap it out.
> 
> But if it was me I'd wait for a BIOS until January. If by then one still hadn't been released, I'd get them to upgrade you to a 1080 Ti if you can afford the difference.


Do cards with the rare Samsung RAM OC better than most of the cards now with the cheaper RAM?


----------



## dayveegravy

Quote:


> Originally Posted by *Dragonsyph*
> 
> Do cards with the rare Samsung RAM OC better than most of the cards now with the cheaper RAM?


Samsung RAM has been a superior overclocker for quite a while now, but it's hard to find. I haven't seen much data on Pascal. I do see anecdotal evidence of inferior clocks with Micron RAM.


----------



## juniordnz

There's no Samsung GDDR5X


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> (Or at least something that has the equivalent voltage control as the T4 up to 1.25v).


AFAIK T4 was only 1.2V max, the higher voltages that were used with this VBIOS were hardware ones. If you are truly desperate then the hardware mod is a way to reach higher voltage.

The T4 was a treat and perhaps someone has put the kibosh on that happening again, look what happened with the special unlocked GTX 1070 VBIOS.

Another option is to request an enthusiast key from EVGA; you'll need nvflash to generate the request file. Once that is in place you may be able to "tweak" your VBIOS. If you try to flash a modded unsigned VBIOS with nvflash version 5.328 or higher it will even tell you that. A flashed modified VBIOS without signing will result in a disabled card.


----------



## dayveegravy

Yeah, I assume he was asking about Pascal generally.


----------



## nrpeyton

Quote:


> Originally Posted by *ucode*
> 
> AFAIK T4 was only 1.2V max, the higher voltages that were used with this VBIOS were hardware ones. If you are truly desperate then the hardware mod is a way to reach higher voltage.
> 
> The T4 was a treat and perhaps someone has put the kibosh on that happening again, look what happened with the special unlocked GTX 1070 VBIOS.
> 
> Another option is to request an enthusiast key from EVGA; you'll need nvflash to generate the request file. Once that is in place you may be able to "tweak" your VBIOS. If you try to flash a modded unsigned VBIOS with nvflash version 5.328 or higher it will even tell you that. A flashed modified VBIOS without signing will result in a disabled card.


Thank you for detailed reply. rep+1.

I would be happy with voltage up to 1.2v. The Classified 1080 even has an "EVBot" hook-up plug. This plug is for veterans who bought the EVBot (back when it was available) to have control over voltage. Unfortunately the EVBot is no longer available; not even on eBay.

No other card in the EVGA lineup has the pins for this connection. Yet all these other cards have *more* voltage control than the 1080 Classified, via the T4 BIOS, which is incompatible with the Classified (but compatible with the rest of their lineup).

Yes, I understand that to get beyond 1.2v I would indeed need to hard-mod. However, I do not have the expertise or the 'freedom from current commitments' to undertake an engineering course that would teach me how to safely perform a hard volt mod. I could probably use the forums, read some material and fumble my way through a power mod, but the more difficult voltage mods are probably not within my grasp.

I am unfamiliar with 'what happened with the unlocked 1070 BIOS'. I am about to do a Google search now -- this seems like an interesting read. Thank you 

Finally, an "enthusiast key"; I never knew such a thing existed. Would they also supply the tweaking programme? I don't think I'd mind surrendering my warranty, but not if it was only for a *shot in the dark*. It would need to be a real, definitive chance at the said tweakability 

Anyway have a nice night, thanks for information  definitely some items to ponder 

br

Edit: Forgot to mention, there is a version of nvflash (released a few months ago now) that enables a modified BIOS to be flashed. In fact this version *won't* flash official, signed BIOSes, since it only works with modified unsigned ones. The problem is that the actual BIOS-tweaking .exe to modify/tweak for Pascal does not yet exist  ... in other words we have a green light on one side of the equation; whether the other side will happen, well, "the jury is still out" on that one.


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> Edit: Forgot to mention; there is a version of nvflash released (a few months ago now) that enables modified BIOS to be flashed


Doesn't work for Pascal.

To request a key: "nvflash --licreq=LicenseRequest.bin USER_FW_MOD"

This will generate a small file tied to your VBIOS.
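For anyone scripting this step, the request command above could be wrapped like so. This is only a sketch: it assumes `nvflash` is on your PATH, and the flag is exactly as quoted in the post; the helper name is made up.

```python
import shutil
import subprocess

# The exact command from the post above; it writes LicenseRequest.bin,
# a small file tied to this card's VBIOS.
LICREQ_CMD = ["nvflash", "--licreq=LicenseRequest.bin", "USER_FW_MOD"]

def request_enthusiast_key():
    """Run nvflash to generate the license-request file, if the tool exists."""
    if shutil.which("nvflash") is None:
        raise FileNotFoundError("nvflash not found on PATH")
    subprocess.run(LICREQ_CMD, check=True)
```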


----------



## ROKUGAN

Quote:


> Originally Posted by *davepk*
> 
> My Zotac 1080 AMP Extreme runs fine on it.
> 
> With the stock bios Tho my card could oc to and stabilized at 2012 or so it never saw 1.093V
> 
> With T4 my card stabilizes at 2062 @ 1.113V
> 
> Its not much of an improvement really (silicon lottery and all) but does demonstrate the benefit.
> 
> See this thread for further info.


Hi, would you be so kind as to upload your modded BIOS so I can try it? I got a Zotac 1080 AMP Extreme as well; mine goes up to 2126 on the stock BIOS, so I would like to find out if there's any improvement with the one you're mentioning.


----------



## Dragonsyph

Just bought EVGA GeForce GTX 1080 FTW HYBRID GAMING, hope it gets here fast.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> Just bought EVGA GeForce GTX 1080 FTW HYBRID GAMING, hope it gets here fast.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*


I'm very excited; I just have to wait till next week for it to arrive because tomorrow's a holiday.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> Im very excited, just have to wait tell next week for it to arrive because tomorrows a holiday.


Ya, I know what that is like. I am waiting on parts to be delivered before I can start my conversion using the Hybrid kit. Stuff is supposed to be arriving this week, hopefully.
The wait is driving me nuts! lol


----------



## juniordnz

Post your "out of the box" max clock for us when you get it









One of my FTWs got 2012 MHz and the other one 2025 MHz out of the box


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Ya I know what that is like. I am waiting on parts to be delievred before I can start my conversion using the Hybrid kit. Stuff is supposed to be arriving this week, hopefully,
> The wait is driving me nuts! lol


Ya, I even picked 2-day shipping so it would arrive on my day off, but with tomorrow and the weekend that's like 5-day shipping lol.

Quote:


> Originally Posted by *juniordnz*
> 
> Post your "out of the box" max clock for us when you get it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One of my FTWs got 2012mhz and the other one 2025mhz out of the box


For sure


----------



## Krzych04650

So after realizing that SLI connector placement is not unified and depends on PCB width, I returned the Palit Jetstream, because it is too thick for a 3-slot SLI config (so I couldn't get a second one later and connect them with a solid bridge), and got the MSI Gaming non-X, because I'm not going to pay a premium for out-of-the-box clocks, and even if I were, I could probably still use Afterburner without much issue.









The cards share many similarities but are also very different in a few areas.

Similarities:
-fan % speeds vs fan RPM are exactly the same, down to the RPM, almost like they are using the same fans
-both cards can do 2101 MHz max, crashing at 2114 MHz, assuming temps below 50 C. A realistic expectation for normal usage, when you need to care about noise levels and temps sit around 70 C, is 2050 MHz
-FireStrike still crashes while everything else in the world is stable, so I stopped caring about this benchmark; maybe it is bugged for me, or the cards simply are not stable at 100+ FPS but are at 30-60 FPS. It happens
-noise is similar

Differences:
-Palit downclocks as temperature increases and preserves stability this way; MSI does nothing until 70 C and so crashes after hitting around 60 C if the overclock is high
-Palit is terribly power-limited, hitting the 100% power limit at ~1850 MHz, and throttling from 2101 MHz down to even 2038 due to hitting the 120% power limit. On the MSI you don't even need to touch the power target; the max I saw was 103% at 2101 MHz. So that's a huge difference if you are looking for an OC-capable card: even if you get a golden sample with the Palit you cannot do much more than 2050 MHz stable due to the power limit
-thanks to the much better power limit, the MSI scores about 8200 graphics score in Time Spy vs 8050-8100 for the Palit; not a huge difference considering how often the Palit throttles
-temperatures are significantly lower on the Palit: it runs 33 C over ambient vs the MSI's 40 C, assuming the same noise level
-voltages seem to be much lower on the MSI, sitting around 1.012-1.025 vs the Palit's 1.050-1.075, assuming temps below 50 C


----------



## Koniakki

Quote:


> Originally Posted by *hubes*
> 
> Hey Guys,
> 
> Anyone had any issues with their 1080 FE where the display (after being idle for some time) says "no display input" and the ONLY way to get it back is to restart the PC? I've been working with Zotac and they seem to think it's a driver issue. I've done clean installs of all the drivers and I just wanted to see if anyone else has been seeing this. I don't have any power management settings set and the PC never sleeps or hibernates.
> 
> Any assistance would be greatly appreciated!


Could be a motherboard BIOS setting/issue.

I had a similar problem, but it was with a 980 Ti after the PC had been in sleep mode; I couldn't wake it until I reset it.

It was some time ago and I honestly don't remember the exact details.

But it doesn't hurt to check some power-related or power-saving options in the mobo BIOS.


----------



## hubes

Quote:


> Originally Posted by *Koniakki*
> 
> Could be a motherboard bios setting/issue.
> 
> I had a similar problem but it was with a 980Ti and after the pc was in sleep mode which I couldn't wake it until I reset it.
> 
> It was sometime ago and I honestly don't remember the exact details.
> 
> But it don't hurt to do a check on some power related or power saving options in the mobo bios.


Thanks for the reply! I have checked all the power settings in the BIOS and in Windows and nothing is set to any kind of "low power" or "sleep" type mode. I'm currently working with Zotac support. Their engineering department may want me to send the card in.... time to feel the joy of RMA....


----------



## Koniakki

Quote:


> Originally Posted by *hubes*
> 
> Thanks for the reply! I have checked all the power settings in the BIOS and in Windows and nothing is set to any kind of "low power" or "sleep" type mode. I'm currently working with Zotac support. Their engineering department may want me to send the card in.... time to feel the joy of RMA....


Np.







There are many things and variables to take into consideration, so it's kinda hard to pinpoint. E.g. is the card still functioning properly?

Because if I understand this correctly, it's just the display turning off at its set time (usually 10 min) and you cannot get its signal back. If I'm not mistaken, losing signal from the monitor/GPU has happened before.

So if a game/benchmark is running (keep vsync off, since there's no need to stress it anyway, and have external speakers or headphones on), or you have the fan manually set to like 60-70%, does the game stop running or the fan stop spinning? Curious to know this actually, if you wish to check.


----------



## hubes

Quote:


> Originally Posted by *Koniakki*
> 
> Np.
> 
> 
> 
> 
> 
> 
> 
> There're many things and variables to take into consideration so it's kinda hard to pinpoint. E.g does the card still functioning properly?
> 
> Because if understand this correctly it's just the display turning off at its specific time set(usually 10min) and you cannot resume its signal back. It has happened before if I'm not mistaken to lose signal from the monitor/gpu.
> 
> So if a game/benchmark(keep vsync off since no need to stress it anw) is running(and you have external speakers or headphones) or you have the fan manually set to like 60-70%, does the game stop running or the fan stops spinning? Curious to know this actually if you wish to check.


So to the best of my knowledge the GPU and the rest of the PC still continue to function normally. The GPU is in a custom liquid cooling loop so there's no fan to tell if its working or not. However, I usually have a program that is running when this happens that I can check the status of on my phone and the program still appears to be running even when this happens. It's an emulator that's running a bot. To be clear the issue happens whether the bot is running or not. But I can say for certain that the issue has happened and the bot has run without issues. The radiator/case fans all stay running at the same speed, the loop pump is still running, and the program that is running still performs the way it should. The display just doesn't come back.

I have updated the BIOS on my motherboard as well with no luck...


----------



## pfinch

Hey guys,

is there any possibility to maintain a constant core clock?
I want to have 2075 in all 3D programs. If I set the curve for 'normal' 3D applications (Valley, Heaven, WoW, Skyrim) it will start at 2100 and drop to/stay at 2075 on the core.
But in Rise of the Tomb Raider and TimeSpy the core drops after some time to 2062.
The only option is to increase all voltage points by one core step... but at 2110+ I get crashes faster than the core drops to 2088/2075.

help


----------



## hubes

What are your temps when you're running these applications? Certain games or programs are harder on GPUs and cause them to warm up more, which will definitely decrease your clock speeds. Are you running your GPU on air or liquid?


----------



## pfinch

Quote:


> Originally Posted by *hubes*
> 
> What are your temps when you're running these applications? Sometimes certain games or programs are harder on GPU's and cause it to warm up more which will definitely decrease your clock speeds. Are you running your GPU on air or liquid?


Zotac AMP Extreme on air (Air 540 tower); temps are indeed 3 C warmer but max out at 67 C.


----------



## TWiST2k

Quote:


> Originally Posted by *Derek1*
> 
> To the place you bought it? That is generous, most places only offer 14 day and if you are lucky 30. And still have to pay a restock fee.
> EVGA warranty is 30 days for a new card and after that you get a refurb.


Is Amazon not the same in Canada? I return stuff all the time and have 0 issues, they even pay the return shipping and I have never had anyone say anything about a restock fee there. Now outside of Amazon, yes everybody else sucks, but that is why I do most of my shopping with Amazon these days. But even they only have a 30 day return policy, nothing about 3 months lol. I know Christmas time they give you longer, but that is only one time of year.
Quote:


> Originally Posted by *Derek1*
> 
> Oh ya I know all that, but from reading that forum over there for the last 2 or 3 weeks you would think that everyone there has never known how to control a fan. lol
> A lot of whiny babies out there who need to have everything spoon fed to them and too spoiled and lazy to do any reading/research on their own.
> Part of the culture of entitlement and inability to think critically, preferring hyper-emotionality and narcissism to have their demands met.


Bro, I am seeing this more and more everywhere I go, it is the TLDR generation and they make me nauseous just thinking about them.
Quote:


> Originally Posted by *juniordnz*
> 
> Actually, the new BIOS unlocked more RPM on the fan curve. The original BIOS would come up to ~2700RPM, the new one goes up to ~3200RPM. Don't know about the fan curve they set on the vBIOS because I use my own (100% when gaming, because I use headphones and can't it anyway). Those extra 500RPM are very welcome IMO.


That was my favorite part of the T4 BIOS, being able to set the fans higher haha. Since the fan speeds are the only thing that changed, it would be interesting to compare the BIOS files in hex and see which areas differ.
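A byte-level compare of two VBIOS dumps is easy to sketch in Python. The file names below are made-up placeholders; any two ROM dumps of the same card would do.

```python
def diff_roms(path_a, path_b):
    """Return (offset, byte_a, byte_b) for every offset where two dumps differ.

    Files of unequal length are compared up to the shorter one.
    """
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    return [(i, a[i], b[i]) for i in range(min(len(a), len(b))) if a[i] != b[i]]

# Example usage (hypothetical file names):
# for off, x, y in diff_roms("stock.rom", "t4.rom")[:16]:
#     print(f"0x{off:06X}: {x:02X} -> {y:02X}")
```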
Quote:


> Originally Posted by *Derek1*
> 
> I guess you got your before they decided to add the VRAM pads as well?
> Did you check for gaps there?


Guess that is what I get for requesting so soon lol. I honestly have not been following the whole thing as closely as I would like, and I have not checked my card at all yet for gaps. I really just want to get some FujiPoly pads if I am going to do it, and do it right; I will have to dig back thru those threads and gather all of the pad sizes and thicknesses so I can figure out what I would need to replace them all properly. I have not had any issues with my FTW, but I have great airflow in my case and am not afraid to have the fans crank up when needed.

I know I said this before, but I was so ready to jump on board the Predator 360 train. I am still following the thread, and as much as they seem like a really stand-up company, I really don't want to deal with having to RMA or fix it myself if I have an issue; I would prefer it to just work haha. I am hoping in the next few months they will figure their issues out and I can take the plunge. Getting the 360 with the FTW block and some QDC cables would be awesome!


----------



## hubes

From personal experience when my 1080 was on air it was right around those temps that I started to see some core clock fluctuations.

Here are a couple of things you can try in order of most recommended to not recommended.

1. You can increase your fan curve so the fans spin faster upon hitting the 60 degrees Celsius mark. (This would be the first thing I'd try.)
2. If you haven't already, in your overclocking utility increase the power limit and temp limit, and link them together, prioritizing the power limit. You should be fine doing this as you're not getting anywhere near max temps. (I always do this when overclocking.)
3. You can try to force constant voltage to your GPU. A lot of overclocking utilities have options to have your GPU run at max core clocks all the time. I do not recommend this for 24/7 use, but it might be useful for testing.

If I had to guess you will probably start seeing the decrease in core clocks on those two applications once the temps pass 65 Celsius.
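Step 1 above, raising the fan curve, boils down to mapping temperature to a fan %. A minimal sketch of that mapping follows; the curve points are made-up examples, not any vendor's defaults.

```python
# Hypothetical (temp C, fan %) points, kept sorted by temperature.
CURVE = [(30, 30), (50, 45), (60, 70), (70, 90), (80, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate a fan % for the given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # clamp below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # clamp above the last point
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Making the ramp steeper around the 60 C mark is exactly the "fans spin faster upon hitting 60 degrees" idea.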


----------



## Darkboomhoney

Very good news for Classified members, but my EVBot is broken...... -.-
http://forum.kingpincooling.com/showpost.php?p=31714&postcount=1


----------



## Derek1

Quote:


> Originally Posted by *TWiST2k*
> 
> Is Amazon not the same in Canada? I return stuff all the time and have 0 issues, they even pay the return shipping and I have never had anyone say anything about a restock fee there. Now outside of Amazon, yes everybody else sucks, but that is why I do most of my shopping with Amazon these days. But even they only have a 30 day return policy, nothing about 3 months lol. I know Christmas time they give you longer, but that is only one time of year.
> Bro, I am seeing this more and more everywhere I go, it is the TLDR generation and they make me nauseous just thinking about them.
> That was my favorite part of the T4 BIOS, being able to set the fans higher haha. Since the fan speeds are the only thing that has changed, it would be interesting to see about comparing the BIOS files in hex and seeing what areas differed.
> Guess that is what I get for requesting so soon lol. I honestly have not been following the whole thing as closely as I would like and I have not checked my card at all yet for gap. I really just want to get some FujiPoly ones if I am going to do it and do it right, I will have to dig back thru those threads and gather all of the pad sizes and thicknesses so I can try to figure out what all I would need to get to replace them all properly. I have not had any issues with my FTW, but I have great airflow in my case and am not afraid to have them crank up when needed.
> 
> I know I said this before, but I was so ready to jump on board the Predator 360 train, but I am still following the thread and as much as they seem like a really stand up company, I really don't want to deal with having to RMA or fix it myself if I have an issue, I would prefer it to just work haha. I am hoping in the next few months they will figure there issues out again and I can take the plunge. Getting the 360 with the FTW block with some QDC cables would be awesome!


From what I gather you will need 1.5 mm pads for the VRAM, 1.0 mm for the mosfets/VRM (maybe 1.5 mm), and 2.0 mm for the PCB/backplate. It hasn't been mentioned over there from what I can remember, but there are 4 pieces on the back of the PCB surrounding the GPU chip, and if they are undersized (1.5 or 1.0 mm) then they will need to be replaced as well with 2.0 mm so they contact the backplate. Have a look for marks on the backplate on yours when you crack it open.
Just double check though, in case.


----------



## nrpeyton

Quote:


> Originally Posted by *pfinch*
> 
> Hey guys,
> 
> is there any possibility to maintain a constant core clock?
> I want to have 2075 at all 3d programs. If i set the curve for 'normal' 3d applications (valley, heaven, WoW, skyrim) it will start at 2100 and drops /stays at 2075 core.
> But at Rise of the Tomb Raider and TimeSpy the core drops after some time to 2062.
> The only option is to increase all voltage points by one core-step ... but at 2110+ i will get crashes faster then the core drops to 2088/2075
> 
> help


There is no way to "lock" the frequency on Pascal. The best you can do is lock the voltage and +offset. Even with a locked voltage and offset (of say 1075 mV and +75 MHz) the frequency will "start" at say 2050 MHz, then as temps go up it will throttle slightly (in steps of 13 MHz). You *could* compensate by turning the fan to 100% and adding more +offset (+75 MHz to +88 MHz), but you would have to keep compensating all the time as temps go up and down.

With no power restrictions and no offset (temp limited only), my card will jump around between 1987 and 2025.

If you are truly desperate you could put your card under water and replace the thermal compound with liquid metal. Conductonaut (by Thermal Grizzly) transfers heat 7x as fast as traditional thermal pastes. This way you would see only around a 5 C temperature difference between load and idle  Assuming you have adequate radiator space.
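The throttling behaviour described above, roughly one 13 MHz bin shed each time a temperature threshold is crossed, can be sketched numerically. The threshold temperatures below are illustrative guesses, not measured GPU Boost breakpoints.

```python
BIN_MHZ = 13  # Pascal moves the boost clock in ~13 MHz steps

# Hypothetical temperatures (C) at which one bin is dropped.
THRESHOLDS = [38, 46, 54, 63, 72, 80]

def effective_clock(boost_mhz, temp_c, thresholds=THRESHOLDS):
    """Estimate the running clock after temperature-based bin drops."""
    bins_lost = sum(1 for t in thresholds if temp_c >= t)
    return boost_mhz - bins_lost * BIN_MHZ
```

This also shows why adding +offset only compensates temporarily: the bins keep dropping as the card warms, so the target clock has to keep moving too.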

Quote:


> Originally Posted by *TWiST2k*
> 
> Is Amazon not the same in Canada? I return stuff all the time and have 0 issues, they even pay the return shipping and I have never had anyone say anything about a restock fee there. Now outside of Amazon, yes everybody else sucks, but that is why I do most of my shopping with Amazon these days. But even they only have a 30 day return policy, nothing about 3 months lol. I know Christmas time they give you longer, but that is only one time of year.
> Bro, I am seeing this more and more everywhere I go, it is the TLDR generation and they make me nauseous just thinking about them.
> That was my favorite part of the T4 BIOS, being able to set the fans higher haha. Since the fan speeds are the only thing that has changed, it would be interesting to see about comparing the BIOS files in hex and seeing what areas differed.
> Guess that is what I get for requesting so soon lol. I honestly have not been following the whole thing as closely as I would like and I have not checked my card at all yet for gap. I really just want to get some FujiPoly ones if I am going to do it and do it right, I will have to dig back thru those threads and gather all of the pad sizes and thicknesses so I can try to figure out what all I would need to get to replace them all properly. I have not had any issues with my FTW, but I have great airflow in my case and am not afraid to have them crank up when needed.
> 
> I know I said this before, but I was so ready to jump on board the Predator 360 train, but I am still following the thread and as much as they seem like a really stand up company, I really don't want to deal with having to RMA or fix it myself if I have an issue, I would prefer it to just work haha. I am hoping in the next few months they will figure there issues out again and I can take the plunge. Getting the 360 with the FTW block with some QDC cables would be awesome!


*All ACX 3.0 coolers have the thermal pad problem, not just *some* cards.* The problem is that the thermal pads were not included at all; not in any of these cards. You would be better off just using the free pads EVGA are supplying, as they are guaranteed to fit (unless the pads you would be getting yourself are much, much better performers).


----------



## Vellinious

Quote:


> Originally Posted by *Darkboomhoney*
> 
> very Good news for Classified Member but my evbot is broken ...... -.-
> http://forum.kingpincooling.com/showpost.php?p=31714&postcount=1


I had heard at one time, that there was a way to use a Raspberry Pi in place of the EVBot. No idea how well that actually works, though.


----------



## sirleeofroy

Quote:


> Originally Posted by *Vellinious*
> 
> I had heard at one time, that there was a way to use a Raspberry Pi in place of the EVBot. No idea how well that actually works, though.


This might help - http://hwbot.org/newsflash/2933_evbot_no_need_use_raspberry_pi_to_control_epower_instead_(guide_with_pictures)/


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I had heard at one time, that there was a way to use a Raspberry Pi in place of the EVBot. No idea how well that actually works, though.


I read that guide mate. I think the gist of it wasn't to control the EVBot with the Raspberry Pi; it was in fact a 'less-than-elegant' way of overclocking a Raspberry Pi instead of a graphics card (for all those overclocking addicts out there). In other words: if you can't fiddle around with your graphics cards because of NVIDIA's tightening rules around what EVGA is/isn't allowed to do, here's something else you can overclock instead.. lol

no thank you.. haha

BTW mate, I think we've done it together: *Kingpin just replied to me directly* on the thread you started http://forum.kingpincooling.com/showthread.php?t=3908 and he said, and I quote, "I'll have a look okay". He then went on to say that voltage doesn't do much for Pascal over 1.25v; but that's fine, because we are limited to 1.093v and I only want to get to 1.25 anyway 

It's funny how he read that post of mine yesterday, and the day after (today) there is suddenly new EVBot support for the 1080 Classified  lol

My password isn't working on Kingpin Cooling anymore, don't know why; I'm trying to get in there now to reply again but have to wait 15 minutes to recover my password after 3 attempts, argh.

I can hardly contain myself lol....

*Regarding evbot*
I've looked on eBay every day for 2 weeks now; nothing. :-(

*1080 CLASSIFIED OWNERS - WATERBLOCK SUPPORT UPDATE*

Hi guys,

Heard back from Alphacool today.

Here's the communication between us:

**MY EMAIL** << ALPHACOOL REPLY AT BOTTOM \/ \/ \/
Dear Alphacool,

I emailed you nearly three weeks ago regarding the EVGA 1080 Classified full waterblock.

You confirmed you were definitely making this and I have been spreading the word on a few English/American forums including an 800 page thread on overclock.net which is titled "the GTX 1080 owners club".

This particular thread has had nearly half a million views (479,143 to be exact).

I also posted on techpowerup.com (on this site the post has had 500 views).

Also if you do a google search for "GTX 1080 WATERBLOCK" my post on techpowerup.com is 8th on the list. I have also been including some limited parts of email communication in my posts.

Yesterday evening a few people started to ask if there are any updates. I wonder if Alphacool could please supply me with any "teasers" or information, or even any pictures of work in progress or prototypes.

This would be great for advertising too and getting interest up.

Also, I do have one major question: will the block cool the VRMs properly to low temperatures, and will the actual GPU die block itself perform as well as an EK universal block? If you can answer YES to these two questions I believe we are definitely onto a winning combination between the GTX 1080 CLASSIFIED and an Alphacool waterblock 

Thanks again,

Hope everything is going well and I welcome any updates you can provide.

Nick Peyton

**ALPHACOOL REPLY**
Hi Nick,

First of all I would like to thank you for all the work that you are doing.

I am sending you a drawing of the block for the Classified which we are using for the manual.

The VRMs probably cannot be cooled as effectively as with a full-cover block, but as you know we work with hybrid blocks, and the RAM is cooled more passively than actively, which is totally sufficient for cooling purposes (still better than the original cooling).

I don't want to tell too much, but the GPU die block is probably the best-cooling one on the market (to be honest there is nothing which can outperform it), with an integrated pump inside.

This is all I can tell you at the moment.

Have a nice day.

~teaser drawing 1080_classified~


----------



## OccamRazor

Quote:


> Originally Posted by *pfinch*
> 
> Hey guys,
> 
> is there any possibility to maintain a constant core clock?
> I want to have 2075 at all 3d programs. If i set the curve for 'normal' 3d applications (valley, heaven, WoW, skyrim) it will start at 2100 and drops /stays at 2075 core.
> But at Rise of the Tomb Raider and TimeSpy the core drops after some time to 2062.
> The only option is to increase all voltage points by one core-step ... but at 2110+ i will get crashes faster then the core drops to 2088/2075
> 
> help


Have you tried the Elmor T4 Strix BIOS? It's working fine on my MSI Seahawk, [email protected]@TDP 240W; clocks never go down, playing [email protected]@65/70fps. Will keep testing higher volts and clocks!



Cheers

Occamrazor


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I read that guide mate, I think the gist of it wasn't to control the EVBOT with the Raspberry, but in fact was a 'less-than-elegant' way of overclocking a Raspberry PI instead of a graphics card (for all those overclocking addicts out there) In other words if you can't fiddle around with your graphics cards because of nvidias tightening rules around what EVGA is/isn't allowed to do; here's something else you can overclock instead.. lol
> 
> no thank you.. haha
> 
> BTW mate I think we've done it together, *Kingpin just replied to me directly* on the thread you started http://forum.kingpincooling.com/showthread.php?t=3908 and he said and i quote. "I'll have a look okay" -- then he went onto say about how voltage doesn't do much for pascal over 1.25v; but thats fine.. coz we are limited to 1.093v and I only want to get to 1.25 anyway
> 
> Its funny how he's read that post of mine yesterday, and the day after (today) there is suddenly new evbot support for 1080 Classified today  lol
> 
> My password isn't working on kingpin cooling anymore don't know why; i'm trying to get in there now to reply again but got to wait 15 minutes to recover my password after 3x attempts argh.
> 
> I can hardly contain myself lol....
> 
> *Regarding evbot*
> I've looked on ebay everyday for 2weeks now; nothing. :-(
> 
> *1080 CLASSIFIED OWNERS - WATERBLOCK SUPPORT UPDATE*
> 
> Hi guys,
> 
> Heard back from Alphacool today.
> 
> Here's the communication between us:
> 
> **MY EMAIL** << ALPHACOOL REPLY AT BOTTOM \/ \/ \/
> Dear Alphacool,
> 
> I emailed you nearly three weeks ago (just less than 3 weeks) regarding EVGA 1080 Classified full waterblock.
> 
> You confirmed you were definitely making this and I have been spreading the word on a few English/American forums including an 800 page thread on overclock.net which is titled "the GTX 1080 owners club".
> 
> This particular thread has had nearly half a million views ('479, 143' to be exact).
> 
> I also posted on techpowerup.com (on this site the post has had 500 views).
> 
> Also if you do a google search for "GTX 1080 WATERBLOCK" my post on techpowerup.com is 8th on the list. I have also been including some limited parts of email communication in my posts.
> 
> Yesterday evening a few people have started to ask if there are any updates? I wonder if Alpha-cool could please supply me with any "teasers" or information or even any "pictures of work in progress" or pictures of prototypes.
> 
> This would be great for advertising too and getting interest up.
> 
> Also I do have one major question; will the block cool the VRM's properly to low temperatures and will the actual GPU DIE block it's self perform as well as an EK universal block - if you can answer YES to these two questions I believe we are definitely onto a winning combination between GTX 1080 CLASSIFIED and Alpha-cool waterblock 
> 
> Thanks again,
> 
> Hope everything is going well and I welcome any updates you can provide.
> 
> Nick Peyton
> 
> **ALPHACOOL REPLY**
> Hi Nick,
> 
> First of all I would like to thank you for all the work that you are doing.
> 
> I am sending you a draw of the block for the classified which we are using for the manual.
> 
> Probably the VRM´s could not be cooled so effective like a full cover block but as you know we work with hybrid blocks and the rams are cooled more passive than active which is for the cooling purpose totally enough ( still better than the original cooling).
> 
> I don't want to tell too much but the GPU DIE block is probably the best cooling one on the market (to be honest there is nothing which can outperform it) with an integrated pump inside.
> 
> This is all what I can tell you at the moment.
> 
> Have a nice day.
> 
> ~teaser drawing 1080_classified~


Not a fan of the GPX blocks...better than nothing, but....they're no better than an AIO cooling solution, which still only does half the job.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> There is no way to "lock" the frequency on Pascal. The best you can do is lock the voltage and +offset. Even with a locked voltage and offset (of say 1075mv and +75mhz) the frequency will "start" at say 2050mhz then as temps go up it will throttle slightly (in jumps of 13mhz). You *could* compensate by turning fan to 100% and adding more +offset (+75mhz to +88mhz) but you would have to keep compensating all the time as temps go up and down.
> 
> With no power restrictions and no offset (temp limited only) on my card it will jump around from between 1987 to 2025.
> 
> If you are truly desperate you could put your card under water and replace the thermal compound with liquid metal. Conductonaut (by thermal grizzly) transfers heat 7x as fast as traditional thermal pastes. This way you would see only around a 5c temperature difference between load and idle  Assuming you have adequate radiator space.
> *All ACX 3) coolers have the thermal pad problem. Not just *some* cards. The problem is that the thermal pads were not included at all. Not in any of these cards. You would be better to just use the free pads EVGA are supplying as they are guaranteed to fit. (Unless the pads you would be getting yourself are much much better performers).


Sorry, but I need to make a correction here.
Some cards have a gap problem on the VRAM; not all.
Jayz2 opened his FTW wanting to inspect it, and his card did not have a gap there.

Also, just a personal preference here, but I would be too scared to use CLU or Conductonaut on the GPU. Fine for those with more mettle.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Not a fan of the GPX blocks...better than nothing, but....they're no better than an AIO cooling solution, which still only does half the job.


When I get this new block it will be the first time I've ever put a GPU under water.

That picture looks like it covers the whole card; how do you mean exactly mate? Do they mean that the water actually only flows over the GPU Die and the VRM is only cooled by the plate?
Memory I'm not fussed about.

Will that affect my actual GPU reported temperature?


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> When I get this new block it will be the first time I've ever put a GPU under water.
> 
> That picture looks like it covers the whole card; how do you mean exactly mate? Do they mean that the water actually only flows over the GPU Die and the VRM is only cooled by the plate?
> Memory I'm not fussed about.
> 
> Will that affect my actual GPU reported temperature?


It is difficult to tell.
But if you look at the area near the power connector there appears to be a separate piece there (the block) which seems to cover the whole card. The part that has GPN written on the side looks like it may merely be a shroud that covers the other side of the card/block, as it does not seem to be attached to the shroud on the back of the card.
Ask him for a pic of the card turned over. lol


----------



## Derek1

Quote:


> Originally Posted by *TWiST2k*
> 
> Is Amazon not the same in Canada? I return stuff all the time and have 0 issues, they even pay the return shipping and I have never had anyone say anything about a restock fee there. Now outside of Amazon, yes everybody else sucks, but that is why I do most of my shopping with Amazon these days. But even they only have a 30 day return policy, nothing about 3 months lol. I know Christmas time they give you longer, but that is only one time of year.
> Bro, I am seeing this more and more everywhere I go, it is the TLDR generation and they make me nauseous just thinking about them.
> That was my favorite part of the T4 BIOS, being able to set the fans higher haha. Since the fan speeds are the only thing that has changed, it would be interesting to see about comparing the BIOS files in hex and seeing what areas differed.
> Guess that is what I get for requesting so soon lol. I honestly have not been following the whole thing as closely as I would like and I have not checked my card at all yet for gap. I really just want to get some FujiPoly ones if I am going to do it and do it right, I will have to dig back thru those threads and gather all of the pad sizes and thicknesses so I can try to figure out what all I would need to get to replace them all properly. I have not had any issues with my FTW, but I have great airflow in my case and am not afraid to have them crank up when needed.
> 
> I know I said this before, but I was so ready to jump on board the Predator 360 train. I am still following the thread, and as much as they seem like a really stand-up company, I really don't want to deal with having to RMA or fix it myself if I have an issue; I would prefer it to just work haha. I am hoping in the next few months they will figure their issues out and I can take the plunge. Getting the 360 with the FTW block with some QDC cables would be awesome!


Yes, Amazon is the same here for the Christmas delivery/return policy (I went and checked): you have until Jan 30th to return if you buy between Nov 1 and Dec 25.
I never use Amazon though as they don't accept PayPal.

As far as the crap going on over at EVGA, one of the funnier threads was someone complaining that the new vBIOS makes the fans so loud he was distracted by them while playing BF1. Someone asked if everyone games with the sound off, and I was gonna suggest turning the sound up to drown out the fans in case he was confusing them with the sound of a strafing run. lol More likely the person just sucked at it and was looking for an excuse for getting his head shot off. Like it can't be that I suck at it, right? It must be the fans distracting me. smh lol


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Yes, Amazon is the same here for the Christmas delivery/return policy (I went and checked): you have until Jan 30th to return if you buy between Nov 1 and Dec 25.
> I never use Amazon though as they don't accept PayPal.
> 
> As far as the crap going on over at EVGA, one of the funnier threads was someone complaining that the new vBIOS makes the fans so loud he was distracted by them while playing BF1. Someone asked if everyone games with the sound off, and I was gonna suggest turning the sound up to drown out the fans in case he was confusing them with the sound of a strafing run. lol More likely the person just sucked at it and was looking for an excuse for getting his head shot off. Like it can't be that I suck at it, right? It must be the fans distracting me. smh lol


The sun was in my eyes bro, or else i would have rekt you good.


----------



## Derek1

Part 1 of 3 arrived today.



Just the Fuji and Gelid to come.
God it is taking forever.


----------



## pfinch

Quote:


> Originally Posted by *OccamRazor*
> 
> Have you tried the Elmor T4 strix bios? Its working fine on my msi seahawk, [email protected]@TDP 240W, clocks never go down, playing [email protected]@65/70fps, will keep testing higher volts and clocks!
> 
> 
> 
> Cheers
> 
> Occamrazor


Where do i get that BIOS? And is it safe to use with a Zotac AMP Extreme 1080?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> When I get this new block it will be the first time I've ever put a GPU under water.
> 
> That picture looks like it covers the whole card; how do you mean exactly mate? Do they mean that the water actually only flows over the GPU Die and the VRM is only cooled by the plate?
> Memory I'm not fussed about.
> 
> Will that affect my actual GPU reported temperature?


The temp sensor for NVIDIA is only the core temp. So yes, it'll show the core running cooler.

What I mean, is that if you look at a universal block and an AIO, the only thing that's actually water cooled is the core. Same with the Alphacool GPX blocks. They're essentially a universal GPU block with a "full coverage adapter". A true full coverage block actively cools the VRM and memory. Electronics run better when they're not cooking. lol, and memory is no exception.

With as much as you keep talking about adding voltage and the like, I'd be investing in an infrared thermometer to make sure your VRM temps are staying within reason.

Full coverage block:



GPX block:


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> With as much as you keep talking about adding voltage and the like, I'd be investing in an infrared thermometer to make sure your VRM temps are staying within reason.


I have 4 temperature probes on "wires" connected to my fan controller that I use for monitoring VRM/Northbridge motherboard temps and even ambient.

That should work yeah?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I have 4 temperature probes on "wires" connected to my fan controller that I use for monitoring VRM/Northbridge motherboard temps and even ambient.
> 
> That should work yeah?


That would probably work. Place one between the backplate and the pcb near the VRM, and that should get you pretty close, anyway


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> That would probably work. Place one between the backplate and the pcb near the VRM, and that should get you pretty close, anyway


Okay, thanks; I've spent some time reading more reviews. It seems the Alphacool block performs better than an EK universal block. The downside of this in the past *used to be* an overly restrictive block; however, they have now fixed this *without* sacrificing performance by including a small integrated pump inside the new block design  at least they are listening 

I can't wait to get this block!! The VRM/memory cooling will still be better than stock, and it's the only water solution that doesn't require you to cool the VRM yourself 

Also, due to the Classified's 14-phase VRM (spreading the heat over more space), I really think this solution should be sufficient. The only downside I can see is that if you were running Furmark, the heat from the VRM could conduct along the card, giving a temperature penalty on the chip. But considering the VRM solution is still slightly better than stock and the waterblock performs slightly better than EK, I can't see this being too much of a problem, even for voltages up to 1.25v.

Classified owners have had a hard struggle; we've had to ask for everything and fight to get it -- and we've still not won yet... but we are getting there, and I really think every Classified owner should repay this by giving Alphacool a shot. I think they will be pleasantly surprised.

*=========================*

*Anyone selling an EVBOT???*


----------



## OccamRazor

Quote:


> Originally Posted by *pfinch*
> 
> Where do i get that BIOS? And is it safe to use with a Zotac AMP Extreme 1080?


Go here: http://forum.hwbot.org/showthread.php?p=455871#post455871

It's a bit hit or miss I'm afraid; some have excellent results while others don't...

Cheers

Occamrazor


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> I read that guide mate, I think the gist of it wasn't to control the EVBOT with the Raspberry, but in fact was a 'less-than-elegant' way of overclocking a Raspberry PI instead of a graphics card (for all those overclocking addicts out there)


No you didn't get the gist of it, overclocking a Raspberry Pi was just an excuse for an overclocker to already have one.

http://xdevs.com/guide/epower_mod/


----------



## VSG

I guess I can finally join now that I have a card that hasn't died after testing all these GPU blocks:



Got the least expensive EVGA reference PCB card, the non SC w/ACX 3.0. It's going to be watercooled anyway.


----------



## OccamRazor

Quote:


> Originally Posted by *geggeg*
> 
> I guess I can finally join now that I have a card that hasn't died after testing all these GPU blocks:
> 
> 
> 
> Got the least expensive EVGA reference PCB card, the non SC w/ACX 3.0. It's going to be watercooled anyway.


Then you have to try the T4 BIOS, it's working very well on my end!









Cheers

Occamrazor


----------



## pfinch

Quote:


> Originally Posted by *OccamRazor*
> 
> Go here: http://forum.hwbot.org/showthread.php?p=455871#post455871
> 
> It's a bit hit or miss I'm afraid, some have excellent results while others not...
> 
> Cheers
> 
> Occamrazor


Thank you!
I will try it today on my Zotac Extreme AMP 1080... hope it will work on that card


----------



## juniordnz

Anyone got any information from EVGA about the quality of the thermal pads they are shipping? It would be really nice to know the thermal conductivity of those pads, because if they're low-quality ones it's better to buy and apply your own...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Anyone got any information from EVGA about the quality of the thermal pads they are shipping? It would be really nice to know what's the thermal conductivity of those pads, because if it's low quality ones it's better to buy and apply your own...


Sorry Junior, no definite word on that from EVGA yet.
All I know for certain is they are by Shin-Etsu. People have been asking in the EVGA forum for a while now, but there's been no official statement, though a mod or tech support guy did say that if you were to use your own, use 11 W/mK. I didn't find any of those on the Shin-Etsu site when I looked, but I may have missed them. As you know, I ordered from Fuji and am getting the 17s. I am pretty sure EVGA isn't using those ones.


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Sorry Junior, no definite word on that from EVGA yet.
> All I know for certain is they are by Shin Etsu. People have been asking in the forum over at EVGA for a while now but no official statement. Though a mod or Tech Support guy did say that if you were to use your own to use 11w/mk. I didn't find any of those on the Shin Etsu site when I looked but I may have missed them. As you know I ordered from Fuji and am getting the 17's. I am pretty sure that EVGA aren't using those ones.


That sucks...

I got 11s for the heatplate, which I believe is already more than the stock ones EVGA uses, and for the backplate I couldn't find anything higher than 6 W/mK AND 2mm thick. The good ones are all 0.5 to 1.5mm thick.

I really, REALLY doubt EVGA is using high-grade thermal pads like those we mentioned. Even EKWB ships cheap, generic 2 W/mK pads with their waterblocks... and those are considered high-end products.
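Since pad thickness matters as much as the W/mK rating, what actually counts is conductance per unit area, k/t. A quick sketch using the figures from this discussion (the pad choices are just the examples mentioned above):

```python
# Thermal conductance per unit area of a pad is k / t, in W/(m^2*K):
# a thicker pad needs a proportionally higher W/mK rating to match.
def conductance(k_w_per_mk, thickness_m):
    return k_w_per_mk / thickness_m

pads = {
    "generic 2 W/mK @ 1.0 mm": (2.0, 1.0e-3),
    "11 W/mK @ 2.0 mm": (11.0, 2.0e-3),
    "Fujipoly 17 W/mK @ 1.0 mm": (17.0, 1.0e-3),
}
for name, (k, t) in pads.items():
    print(f"{name}: {conductance(k, t):.0f} W/m2K")
```

By this measure, an 11 W/mK pad at 2mm is only about 2.75x the generic 2 W/mK pad at 1mm, not 5.5x, which is why the thickness caveat above matters.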


----------



## Casper123123123

EVGA already released an updated BIOS for the 1070/1080 thermal issues, guys. Please search.


----------



## juniordnz

Quote:


> Originally Posted by *Casper123123123*
> 
> EVGA already released an updated BIOS for the 1070/1080 thermal issues, guys. Please search.


No kidding, sherlock.

I even talked to a guy a few pages back about how the RPM limit increased in the new BIOS. I assure you my "searching" is up to date


----------



## Casper123123123

Quote:


> Originally Posted by *juniordnz*
> 
> No kidding, sherlock.
> 
> I even talked to a guy a few pages back about how the RPM limit increased in the new BIOS. I assure you my "searching" is up to date


No aggression plz. I just pointed it out for you guys in case you didn't have the info. I understand your issue with EVGA, but I do not have time to monitor these forums due to work.


----------



## juniordnz

Quote:


> Originally Posted by *Casper123123123*
> 
> No aggression plz. I just pointed it out for you guys in case you didn't have the info. I understand your issue with EVGA, but I do not have time to monitor these forums due to work.


Didn't mean to offend. Just didn't like the tone of your "please search". The internet is a hard place to read people's intentions.


----------



## Casper123123123

Quote:


> Originally Posted by *juniordnz*
> 
> Didn't mean to offend. Just didn't like the tone of your "please search". The internet is a hard place to read people's intentions.


Understood. That's the negative side of chat: people can't always read the reaction. My intention was to deliver info, not to rub anyone's nose in it.


----------



## ROKUGAN

Quote:


> Originally Posted by *pfinch*
> 
> Thank you!
> I will try it today on my Zotac Extreme AMP 1080... hope it will work on that card


Please let me know your results; I have the same card and I'm interested as well.


----------



## juniordnz

EVGA just offered to replace my card for a brand new one with all pads applied.

Do you guys think it's worth it to play the silicon lottery again?

My current FTW: 1.062V / 2114mhz core / +575mhz mem / 25400 Firestrike Graphics / Stock Boost to 2025mhz


----------



## keikei

Quote:


> Originally Posted by *juniordnz*
> 
> EVGA just offered to replace my card for a brand new one with all pads applied.
> 
> Do you guys think it's worth it to play the silicon lottery again?
> 
> My current FTW: 1.062V / 2114mhz core / +575mhz mem / 25400 Firestrike Graphics / Stock Boost to 2025mhz


I can't comment on the silicon lottery, but a new revised card would be immensely better for resale and card longevity (not sure how long you keep your cards).


----------



## bg8780

Hey guys. Long time lurker of this thread.

I'm wanting to try this T4 BIOS. I have an EVGA Founder's Edition with a Hybrid cooler installed. Looks like the FE 1080's are compatible with the T4 BIOS. Is this correct?

I can't get this card above 2088mhz core. I'm hoping the T4 BIOS will get me past this. Temps never go above 42c. The card only stock boosts to 1886mhz. I'm thinking I just lost the silicon lottery and a BIOS isn't going to fix this.

Anyone have a download link for the T4 BIOS? I'd like to give it a shot.

Thanks!


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> EVGA just offered to replace my card for a brand new one with all pads applied.
> 
> Do you guys think it's worth it to play the silicon lottery again?
> 
> My current FTW: 1.062V / 2114mhz core / +575mhz mem / 25400 Firestrike Graphics / Stock Boost to 2025mhz


Send them the GPU-Z logs of your card's performance and remind them that they are obligated to send you one of "...equal or better performance..." as per their warranty.

Couldn't hurt if they agree. Though what you need the fix for is beyond me, as you are gonna put it under water anyway.
If there is a chance they have binned cards that they are keeping secret, then go for it and get a 2300 beast! lol


----------



## nexxusty

Quote:


> Originally Posted by *Casper123123123*
> 
> No aggression plz. I just pointed it out for you guys in case you didn't have the info. I understand your issue with EVGA, but I do not have time to monitor these forums due to work.


Then maybe don't tell members who have been in this thread since the beginning to "search"? Hehe.

We all work man.... most of us anyway.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> EVGA just offered to replace my card for a brand new one with all pads applied.
> 
> Do you guys think it's worth it to play the silicon lottery again?
> 
> My current FTW: 1.062V / 2114mhz core / +575mhz mem / 25400 Firestrike Graphics / Stock Boost to 2025mhz


What temps at that clock / voltage? Fan speed? Ambient temps? 2114 @ 1062mv doesn't sound all that outstanding, unless you're talking about auto fan speed with 22c to 24c ambient temps or higher. In that case, it's probably "decent" to "average".


----------



## juniordnz

Quote:


> Originally Posted by *keikei*
> 
> I can't comment on the silicon lottery, but a new revised card would be immensely better for resale and card longevity (not sure how long you keep your cards).


Yeah, I thought about that too. I would get a card that is 3 months newer, and that counts when you sell it. Also, saying the card is a "revision 1.1" sounds better than "hey, I applied some pads I bought myself from eBay"
Quote:


> Originally Posted by *Derek1*
> 
> Send them the GPU-Z logs of your card's performance and remind them that they are obligated to send you one of "...equal or better performance..." as per their warranty.
> 
> Couldn't hurt if they agree. Though what you need the fix for is beyond me as you are gonna put it under water anyway.
> If there is a chance they have binned cards though that they are keeping secret, then go for it and get a 2300 beast! lol


Maybe EVGA USA does, but I would be doing the RMA with EVGA Brazil, and they said they will send me a 100% unopened revised card. So I guess there's no chance of a binned card








Quote:


> Originally Posted by *Vellinious*
> 
> What temps at that clock / voltage? Fan speed? Ambient temps? 2114 @ 1062mv doesn't sound all that outstanding, unless you're talking about auto fan speed with 22c to 24c ambient temps or higher. In that case, it's probably "decent" to "average".


I use fans at 100% under load







If I stress it to 100% GPU usage then it will easily go up to 60ºC and throttle to 2088mhz. Room temp must be around 25ºC. I believe most FTWs get above 2100mhz?

I guess this is a pretty average card, right? You have done some binning yourself; would you spin the silicon lottery wheel knowing you have only one chance?


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Yeah, thought about that too. I would get a card that is 3 months newer and that counts when you sell it. Also, saying the card is a "revision 1.1" sounds better than "hey, I applied some pads I bought myself from ebay"
> Maybe EVGA USA does, but I would be doing the RMA with EVGA Brazil, and they said they will send me a 100% unopened revised card. So I guess there's no chance of a binned card
> 
> 
> 
> 
> 
> 
> 
> 
> I use fans at 100% under load
> 
> 
> 
> 
> 
> 
> 
> If I stress it to 100% GPU usage then it will easily go up to 60ºC and throttle to 2088mhz. Room temp must be around 25ºC. I believe most FTWs get above 2100mhz?
> 
> I guess this is a pretty average card, right? You have done some binning yourself; would you spin the silicon lottery wheel knowing you have only one chance?


I'd go ahead and let that one go....it's not anything special.


----------



## Krzych04650

One important thing I didn't mention about the Palit Jetstream before: the card has a very low power limit. It hits ~105% where the MSI Gaming hits 75-80%. The Palit can't keep its out-of-the-box boost (1911) stable without increasing the power limit, and it also downclocks after an OC due to hitting the 120% wall. So even if you get a high OC, the card will downclock if it is stressed enough. It sits at 2101 most of the time and scores only ~2% lower than the MSI in TimeSpy, both at 2101, but if you can get a higher OC you will hit the wall even more. Maybe not in Valley and other less stressful benchmarks and games, but in things like TimeSpy or Witcher 3 it will hit the 120% wall quite often.
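Power-limit percentages only mean something relative to the board's default TDP, so the numbers above translate to watts roughly as follows. The 180 W default TDP here is an assumption for the sake of the example; actual BIOS TDPs vary by card:

```python
# Convert a power-limit percentage into watts, given the default TDP.
# The 180 W default is an illustrative assumption, not a measured value.
def limit_watts(pct, default_tdp_w=180.0):
    return pct / 100.0 * default_tdp_w

print(limit_watts(105))  # the ~105% the Jetstream hits out of the box
print(limit_watts(120))  # the 120% power-limit wall
```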


----------



## x-apoc

Quote:


> Originally Posted by *Vellinious*
> 
> I'd go ahead and let that one go....it's not anything special.


How many FTW's have you seen that OC over 2200mhz on air?


----------



## bloot

Quote:


> Originally Posted by *Krzych04650*
> 
> One important thing I didn't say about Palit Jetstream before: card has very low power limit. It hits ~105% where MSI Gaming hits 75-80%. Palit is not keeping its out of the box boost (1911) stable without increasing power limit. It also downclocks after OC due to hitting 120% wall. So even if you get high OC the card will downclock if it is stressed enough. Sits at 2101 most of the time and scores only ~2% lower than MSI in TimeSpy, both at 2101, but in case you can get higher OC, you will hit the wall even more. Maybe not in Valley and other less stressing benchmarks and games, but in things like TimeSpy or Witcher 3 it will hit 120% wall quite often.


You can always flash the new Super JetStream BIOS, default TDP is 230W now


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Anyone got any information from EVGA about the quality of the thermal pads they are shipping? It would be really nice to know what's the thermal conductivity of those pads, because if it's low quality ones it's better to buy and apply your own...


Got this today from Tech Support, Junior. I asked what the thermal conductivity of the pad is:

> I do not have this information, however the thermal pads that we use are Shin-Etsu and are very high grade.

They are determined to keep this a secret.


----------



## Krzych04650

Quote:


> Originally Posted by *bloot*
> 
> You can always flash the new Super JetStream BIOS, default TDP is 230W now


I don't use this card anymore so I don't care; just adding something to what I wrote about the card, because I said only good things without mentioning a major flaw that I realized later.


----------



## nrpeyton

We now have a *software voltage tool for the 1080 Classified*.
Not had much chance to test it yet, but PM me for the link.
Personally, can't wait to finish work to start playing myself!

Disclaimer: use at your own risk and expense


----------



## Vellinious

Quote:


> Originally Posted by *x-apoc*
> 
> How many FTW's have you seen that OC over 2200mhz on air?


3 of the 5 I owned. Of the other 2, 1 would do 2193 and the other 2164.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Send them the GPU Z logs of your cards performance and remind them that they are obligated to send you one of ",,,equal or better performance..." as per their warranty.
> 
> Couldn't hurt if they agree. Though what you need the fix for is beyond me as you are gonna put it under water anyway.
> If there is a chance they have binned cards though that they are keeping secret, then go for it and get a 2300 beast! lol


I've read that clause in their warranty, and I think it only applies to the actual card specification. (I think you may be reading too far into the wording of the clause.)

So, for example, if you had an extended 5-year warranty on an old GTX 980 (which are no longer made) and they had none left, they would have to give you either a GTX 980 Ti or a GTX 1080.

The *only* exception to this would be the ASIC-rated cards (because people actually paid more for a higher-rated card), but I believe that would be the *only* exception.

I could be wrong; but unless someone can confirm actual experience of this, I think one should be careful about getting their hopes up.

I can imagine the absolutely impossible and very difficult situation EVGA would be in if they agreed and everyone jumped on the bandwagon.

Also, a +150mhz offset is only getting around 5%, so I really think if you were happy with your card before this whole problem started, you should probably keep it. Another 50mhz is what, 1.5%?

EVGA are great; and if we take advantage of them, all that is going to happen is their customer service policy won't be so supportive in the future, which would be a real shame.

Good post though 

=========================

*NEW TOPIC:*

Going to be shopping for a *digital multimeter* tomorrow so I can have some probing fun with my graphics cards lol. Anyone recommend anything, like special specifications or anything I need to watch out for? Never owned one before.

Want it to be at least "slightly" future-proof so I can do a few basic mods in the future as my confidence grows


----------



## VSG

Any basic DMM will be fine; the more expensive ones are generally more precise. So depending on whether you want 3+ significant figures, you will pay more. I have a


----------



## nrpeyton

Not looking to spend much; will I have any problem getting one accurate to 1mv on the cheap?

Really got no idea what I am looking for at this stage lol


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> 3 of the 5 I owned. Of the other 2, 1 would do 2193 and the other 2164.


So I should really rma mine. It's nothing worth keeping at all.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> So I should really rma mine. It's nothing worth keeping at all.


Read ur post with ur clock at 1062mv, but what's ur 1093mv max clock?

edit:
Need more detail, coz my max stable is around 2152, but even at that, performance is less than at 2113. But I can actually hit 2202 for a short space of time with some weird curve adjustments, and even get a 3DMark validation with 2202 on the page... so really need more details, coz what u say is ur stable someone else could misinterpret, and what they understand that means could be something 75mhz different +/-

edit 2:

Based on the above, I could say my max is 2113, or I could be coming on here saying it's 2202.............. 

Also, what about with T4? I thought I'd never see voltage above 1093 *ever*, yet my prayers were just answered; still can't believe it lol.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Read ur post with ur clock at 1062mv, but what's ur 1093mv max clock?
> 
> edit:
> Need more detail, coz my max stable is around 2152, but even at that, performance is less than at 2113. But I can actually hit 2202 for a short space of time with some weird curve adjustments, and even get a 3DMark validation with 2202 on the page... so really need more details, coz what u say is ur stable someone else could misinterpret, and what they understand that means could be something 75mhz different +/-


What he said. I'm talking about the max core speed without dropping frame rates in benchmarks. If you're talking about a game stable overclock, that's different.

As for the voltages....keeping the voltages as low as possible, even at the higher clocks, can help keep temps down just a shade longer and sometimes even run better. I always score higher in benches when I can get a run in at 1.075v, as opposed to 1.081 or 1.093.

I think you're putting way too much emphasis on busting out more voltage, when even under water and in pretty optimal ambient temp environments, I'm seeing lower voltages deliver better results....usually....


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> What he said. I'm talking about the max core speed without dropping frame rates in benchmarks. If you're talking about a game stable overclock, that's different.
> 
> As for the voltages....keeping the voltages as low as possible, even at the higher clocks, can help keep temps down just a shade longer and sometimes even run better. I always score higher in benches when I can get a run in at 1.075v, as opposed to 1.081 or 1.093.
> 
> I think you're putting way too much emphasis on busting out more voltage, when even under water and in pretty optimal ambient temp environments, I'm seeing lower voltages deliver better results....usually....


Right, got you; so in my case that would be 2113-2126... anything faster and I'm dropping frames...

going to do a bit of undervolting tonight and see what happens

so one of yours could actually do 2193 before dropping frames?


----------



## VSG

Quote:


> Originally Posted by *nrpeyton*
> 
> not looking to spend much, will I have any problem getting one accurate to 1mv cheap cheap?
> 
> really got no idea what i am looking for at this stage lol


Accurate to 1 mV means going to 1.xxxx digits, so that will be more than the average RadioShack-level DMM. If you just want 1.xxx, you can get one within $30-40 easily.


----------



## nrpeyton

Quote:


> Originally Posted by *geggeg*
> 
> Accurate to 1 mV means going to 1.xxxx digits so that will be more than the average Radioshack level DMM. If you just want 1.xxx, you can get it within $30-40 easily.


thanks 

Basically it's to measure GPU core voltage at the hardware level... so I want to be able to look at my reading in HWiNFO64/MSI AB/GPU-Z etc. and have it read the same on the meter. The GPU core voltages on the curve go:

950mv, 962mv, 975mv, 981mv, 993mv, 1000mv, 1012mv, 1025mv, 1031mv, 1043mv, 1050mv, 1062mv, 1075mv, 1081mv, 1093mv

(I want to be able to look at the meter and then know which part of the voltage curve I am currently on.)

Once I've done that (and verified it's working okay), I can test this new voltage tool that k|ngp|n posted for us.  It may just be a quick update of a previous version (so it can talk to the 1080 Classified controller), so I'm not sure the "table" would be up to date... it could mean having to set one specific voltage to get one specific desired "actual" voltage, then building my own table so I know what corresponds to what.
On the other hand, it might actually be doing exactly what it says it is; but I'd still like to see that with my own eyes 

*Edit:*
The tool also has 2 other controls (3 in total), for memory and PCI-E too.
I haven't even tried the memory one yet, but considering how well memory scales on Pascal, this could bring some nice gains 

Then again, I'm also new to this, so the gains could also be very small -- but it's not just about the gains... it's about the fun you have tweaking all this stuff and learning something new.
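As for how much meter resolution that calibration actually needs: the spacing between adjacent points on the voltage curve sets the floor. A quick check using the curve values quoted in this post:

```python
# Smallest step between adjacent points on the reported Pascal voltage
# curve; the DMM must resolve at least this to tell neighbors apart.
curve_mv = [950, 962, 975, 981, 993, 1000, 1012, 1025,
            1031, 1043, 1050, 1062, 1075, 1081, 1093]
steps = [b - a for a, b in zip(curve_mv, curve_mv[1:])]
print(min(steps), "mV")  # the tightest spacing on the curve
```

So a meter with 1 mV resolution on the volts range is enough to separate even the tightest 6 mV step.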


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> What he said. I'm talking about the max core speed without dropping frame rates in benchmarks. If you're talking about a game stable overclock, that's different.
> 
> As for the voltages....keeping the voltages as low as possible, even at the higher clocks, can help keep temps down just a shade longer and sometimes even run better. I always score higher in benches when I can get a run in at 1.075v, as opposed to 1.081 or 1.093.
> 
> I think you're putting way too much emphasis on busting out more voltage, when even under water and in pretty optimal ambient temp environments, I'm seeing lower voltages deliver better results....usually....


2114mhz and +575 on mem is my 24/7 1.062V offset OC. 25400 graphics score in Firestrike with those settings, and fail-proof in everything I played with it.

I could validate Firestrike at 2177mhz / 1.093V with the T4 BIOS. 25700 points with that.
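For what it's worth, the scaling implied by those two runs is well under linear; a quick check of the quoted numbers:

```python
# Clock gain vs. Firestrike graphics-score gain between the two runs
# quoted above (2114 MHz / 25400 pts vs. 2177 MHz / 25700 pts).
clk_gain_pct = (2177 - 2114) / 2114 * 100
score_gain_pct = (25700 - 25400) / 25400 * 100
print(round(clk_gain_pct, 1), "% clock ->", round(score_gain_pct, 1), "% score")
```

Roughly a 3% clock bump for a bit over 1% of score, which fits the general point made earlier in the thread that chasing the last few bins of core clock pays off less than it seems.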


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> thanks
> 
> basically it's to measure GPU core voltage at the hardware level... so I want to be able to look at my reading in hwinfo64/MSI AB/GPU-Z etc and have it read the same on the meter. So GPU core voltages on the curve go....
> 
> 950mv,962mv, 975mv, 981mv, 993mv, 1000mv, 1012mv, 1025mv, 1031mv, 1043mv, 1050mv, 1062mv, 1075mv, 1081mv, 1093mv
> 
> (I want to be able to look at the meter and then know which part of the voltage curve I am currently on)....
> 
> Once I've done that (and verified it's working okay), I can test this new voltage tool that k|ngp|n posted for us. It may just be a quick update of a previous version (so it can talk to the 1080 Classified controller), so I'm not sure the "table" would be up to date... it could mean having to set one specific voltage to get one specific desired "actual" voltage, then building my own table so I know what corresponds to what....
> on the other hand, it might actually be doing exactly what it says it is; but I'd still like to see that with my own eyes
> 
> *Edit:*
> Tool also has 2 other controls (3 in total) for memory and PCI-E too.
> I haven't even tried the memory one yet; but considering how well memory scales on Pascal this could bring some nice gains
> 
> then again I'm also new to this, so gains could be very small -- but it's not just about the gains.. it's about the fun you have tweaking all this stuff and learning something new.




GPU (NVVDD) OverVoltage
Memory (FBVDD/Q) OverVoltage
PEXVDD PLL Voltage Measurement

So:

NVDD is voltage for GPU

FBVDD is Memory voltage

PEXVDD is PCIExpress Voltage

Cheers

Occamrazor


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> 2114mhz and +575 on mem is my 24/7 1.062V offset OC. 25400 graphics score on Firestrike with those settings, and fail-proof in everything I've played with it.
> 
> I could validate firestrike with 2177mhz 1.093V with T4 BIOS. 25700 points with that.


That's a better graphics score than I'm getting at those clocks. At 2114 / 1.050V offset OC, I'm getting high 24k graphics scores in FS. Which driver are you using?


----------



## Vellinious

Double post, delete


----------



## Cozmo85

Just installed one under water, a 1080 SC (so basically a reference design). I had been using MSI Afterburner 4.3.0 with my 980 Ti. Is it still the best one to go with, or should I use Precision X? Any benefits?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I think you're putting way too much emphasis on busting out more voltage, when even under water and in pretty optimal ambient temp environments, I'm seeing lower voltages deliver better results....usually....


2177MHZ Benching

Remember I'm on an old AMD CPU so my baseline will be lower.

2177mhz @ 1050mv -> crash
2177mhz @ 1093mv -> *21782* http://www.3dmark.com/3dm/16027936
2177mhz @ 1131mv -> *22070* http://www.3dmark.com/3dm/16028527
2177mhz @ 1150mv -> *22161* http://www.3dmark.com/3dm/16028633

_GPU Boost 3.0 Dynamic Voltage Control = Disabled_

*Clock set in:* MSI AB Curve window (plotted as a +163) against 1.093v
*Voltage set in*: MSI Afterburner value overridden using Classified Voltage Tool.
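The scaling across those three passing runs works out like this (scores copied from the links above; the snippet is just a quick check, not part of any tool):

```python
# Graphics-score gain per extra millivolt at a fixed 2177MHz, from the
# three passing runs listed above.
runs = [(1093, 21782), (1131, 22070), (1150, 22161)]  # (mV, graphics score)

for (mv0, s0), (mv1, s1) in zip(runs, runs[1:]):
    gain = (s1 - s0) / (mv1 - mv0)
    print(f"{mv0}mV -> {mv1}mV: {gain:.1f} points per mV")
# -> roughly 7.6 points/mV for the first step, 4.8 for the second
```

So the extra voltage is still paying off here, but with diminishing returns per millivolt.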

Only early days though 

~Nick


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> 2177MHZ Benching
> 
> Remember I'm on an old AMD CPU so my baseline will be lower.
> 
> 2177mhz @ 1050mv -> crash
> 2177mhz @ 1093mv -> *21782* http://www.3dmark.com/3dm/16027936
> 2177mhz @ 1131mv -> *22070* http://www.3dmark.com/3dm/16028527
> 2177mhz @ 1150mv -> *22161* http://www.3dmark.com/3dm/16028633
> 
> _GPU Boost 3.0 Dynamic Voltage Control = Disabled_
> 
> *Clock set in:* MSI AB Curve window (plotted as a +163) against 1.093v
> *Voltage set in*: MSI Afterburner value overridden using Classified Voltage Tool.
> 
> Only early days though guys, but it's making me happy anyway lol, that's what counts lol
> 
> ~Nick


Looks like it's a little warm on the last run for GS2, the frames dropped off from the previous run. I'm guessing heat. Get it under water, and it'll maintain better.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Looks like it's a little warm on the last run for GS2, the frames dropped off from the previous run. I'm guessing heat. Get it under water, and it'll maintain better.


Aye, the power draw seems to jump up about 7W each time I increase voltage too lol. And I can't wait to get under water -- can hardly contain myself; come on Alphacool, hurry up -- how long does it take to make a block lol...

==
Peaked at 247W draw (GPU only) on a quick FS Graphics 1 & 2 run. Anyone seen it higher than that? (hwinfo64)

Temps weren't too bad because the custom runs only take a few seconds on FS 1 & 2.

But imagine if I had her under chilled water in that environment.. could be seeing 7°C on a run like that if I used liquid metal with a block  _wish I could afford it already :-(_

*Correction*
Just done another run and actually monitored temps properly this time; you're absolutely right mate. Temp peaked at 42°C in those few seconds, surprised I am... (ambient is only 15°C, GPU idle 23°C).


----------



## ssgwright

I'm running the T4 bios and I love it. Seems my card is more stable at 1.1 than 1.2. I game @ 2140 with +400 on the mem


----------



## ssgwright

double post, sry


----------



## nrpeyton

*23,668 on an AMD FX CPU!!!* (1080 CLASSIFIED overclocked with new voltage tool)
http://www.3dmark.com/3dm/16029412

Quote:


> Originally Posted by *ssgwright*
> 
> I'm running the T4 bios and I love it. Seems my card is more stable at 1.1 than 1.2. I game @ 2140 with +400 on the mem


good to know a wee bit of voltage is at least doing *something* on Pascal 

you tried pushing it any harder?

I just hit 120% TDP (of a 130% max) on a 320W max-TDP card at 1.150v @ 2177MHz
+900mhz memory with no artifacts


----------



## nrpeyton

deleted: double post sorry


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> Peaked at 247w draw (GPU only) on a quick FS graphics 1 & 2 run. Anyone seen it higher than that?


Here's an old FS Basic run on my FE. 310W peak. I had also measured power on the 12V rail and that measurement in HWiNFO is pretty much spot on for _this_ card.



It'll be interesting to see how high you push that voltage


----------



## Vellinious

Why don't you just use the 780 Classy / 980ti Classy blocks?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Why don't you just use the 780 Classy / 980ti Classy blocks?


I've read a few posts where people have been able to "adapt" these blocks slightly to *make* them fit; but if I paid nearly £100 for a block and it didn't fit I'd be pretty annoyed. Suppose I was just holding out until something was made for it especially. 

If the old ones did fit properly, why wouldn't EK just re-package their old blocks and do a re-run? I suppose it would have been a last resort, but even if I'd been faced with that, I probably would have ended up chickening out and getting a universal.

None of these multimeters seem to list their decimal places.. they all just list their minimum range. The very cheap ones are 200mV. Not sure that's going to be accurate.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> That's a better graphics score than I'm getting at those clocks. At 2114 / 1.050V offset OC, I'm getting high 24k graphics scores in FS. Which driver are you using?


Just did this one: http://www.3dmark.com/fs/10767286

Latest driver available.
1.062V
Core 2114 MHz
Memory 1395 MHz

2114mhz until 49°C, then it drops to 2100mhz.

EVGA just authorized my RMA. I'm really not sure what to do; I already bought thermal pads myself for everything, even some 11W/mK Fujipoly for the heatplate.

Clocks aren't that outstanding, but I've seen very few 1080s doing 25400 on stock voltage, stock BIOS. It's about performance after all, isn't it?


----------



## Derek1

Does this help?
2050 is the largest out-of-the-box boost I have seen reported.
<2150 is about the 1st tier of OC.
<2250 is about the second tier.
With 2300 being the god cards, of which I have seen what, 3 reported?

As you know it doesn't matter what model of card you have, except for the guaranteed initial boost, or the manufacturer; they all OC along the same lines.

(My graph is just a rough estimate of course, solely dependent on reported performance. Who knows how many grandmothers are out there with super cards surfing the net for wool at 2300. lol)
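Those rough tiers can be written as a tiny classifier (cutoffs taken from the post above; the tier labels and the function itself are mine):

```python
def oc_tier(mhz):
    """Rough GTX 1080 overclock tiers, per the reported results above."""
    if mhz >= 2300:
        return "god card"
    if mhz >= 2150:
        return "second tier"
    if mhz > 2050:
        return "first tier"
    return "out-of-the-box boost range"

print(oc_tier(2114))  # -> first tier
print(oc_tier(2177))  # -> second tier
```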


----------



## Derek1

deleted double post

What is going on lately with this.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> 
> 
> Does this help?
> 2050 is the largest out-of-the-box boost I have seen reported.
> <2150 is about the 1st tier of OC.
> <2250 is about the second tier.
> With 2300 being the god cards, of which I have seen what, 3 reported?
> 
> As you know it doesn't matter what model of card you have, except for the guaranteed initial boost, or the manufacturer; they all OC along the same lines.
> 
> (My graph is just a rough estimate of course, solely dependent on reported performance. Who knows how many grandmothers are out there with super cards surfing the net for wool at 2300. lol)


haha


----------



## nrpeyton

Well I just got home with my cheapo £12.50 multimeter.

Instructions say it's accurate to a resolution of 1mV (±1%)

Took a reading at 1.093v (meter reading was 1108mv)

Set voltage tool to 1.087v (meter reading was 1102mv)

Set voltage tool to 1.093v (meter reading was 1108mv)

*Used voltage tool to increase voltage to 1.150v (meter reading was 1165mv)*

So it's confirmed: the *new Classified 1080 voltage tool is 100% working!*
Not tested the MEMORY and PCI-E sliders yet, back soon
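For what it's worth, the three readings above show a constant software-to-meter gap (values copied from the measurements above; the check itself is just a sketch):

```python
# (software mV, meter mV) pairs from the multimeter test above.
readings = [(1087, 1102), (1093, 1108), (1150, 1165)]

offsets = [meter - soft for soft, meter in readings]
print(offsets)  # -> [15, 15, 15]
assert len(set(offsets)) == 1  # same gap every time: the tool tracks 1:1
```

A constant +15mV gap means every software step lands on the hardware exactly as requested, which is what makes the tool look fully working.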


----------



## chiknnwatrmln

So I'm assuming the Classy voltage tool only works for Classy cards right?


----------



## Dragonsyph

I want my card NOW<<<< FedEx not delivering on weekends through Newegg is some BS. GRRRRRRRRR, I'm having gaming withdrawals.


----------



## Valtava

Quote:


> Originally Posted by *nrpeyton*
> 
> *23,668 on an AMD FX CPU!!!* (1080 CLASSIFIED overclocked with new voltage tool)
> http://www.3dmark.com/3dm/16029412
> 
> I just hit 120%TDP (of max 130%) on a 320W max TDP card at 1.150v @ 2177MHZ
> +900mhz memory with no artifacts


Just signed in to tell you that there is something wrong OR that card is just awful, cos I get 24,742 with an MSI GTX 1080 Gaming Z (Z BIOS) and an FX-8350 at around the same clocks, but with a MUCH worse mobo.. 4-pin ATX and PCI-E 2.0 ..

Btw i lead both Timespy and FS in that combo

http://www.3dmark.com/spy/642321
http://www.3dmark.com/fs/10549590

/edit i cool everything with air


----------



## Krzych04650

Quote:


> Originally Posted by *Valtava*
> 
> Just signed in to tell you that there is something wrong OR that card is just awful, cos I get 24,742 with an MSI GTX 1080 Gaming Z (Z BIOS) and an FX-8350 at around the same clocks, but with a MUCH worse mobo.. 4-pin ATX and PCI-E 2.0 ..
> 
> Btw i lead both Timespy and FS in that combo
> 
> http://www.3dmark.com/spy/642321
> http://www.3dmark.com/fs/10549590
> 
> /edit i cool everything with air


Yea, 23,600 is a very low score for a 2177 MHz card; that's a score for a 2050 MHz card or even lower. FX is crap, but it's not such a potato that it takes 1500 points off the graphics score.


----------



## Yomny

Would you guys say it's necessary to run the ATX4P connector for SLI 1080?

I was trying to submit my validation, but I can't seem to get the last question in the form (validation name) right. I submitted my validation through GPU-Z and received this code at the bottom of the app, then entered my name in the validation, same as my forum name. What am I doing wrong? Thanks in advance.


----------



## ucode

Quote:


> Originally Posted by *Valtava*
> 
> Just signed in to tell you that there is something wrong OR that card is just awful, cos I get 24,742 with an MSI GTX 1080 Gaming Z (Z BIOS) and an FX-8350 at around the same clocks, but with a MUCH worse mobo.. 4-pin ATX and PCI-E 2.0 ..


First welcome to OCN









I run a 4-pin ATX as well with mini-ITX, and managed to score over 26k graphics on FS basic, so it can't be too bad









I think the OP was running without a memory overclock, so that might have hurt his score; early days.

@nrpeyton I did find an old FS basic air run at 1.2V with temps up to 80C for reference if it helps. GPU clock dropped from 2.2 to 2.15 but still managed to score over 25.5k on graphics so not too bad considering.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I've read a few posts where people have been able to "adapt" these blocks slightly to *make* them fit; but if I paid nearly £100 for a block and it didn't fit I'd be pretty annoyed. Suppose I was just holding out until something was made for it especially.
> 
> If the old ones did fit properly why wouldn't EK just re-package their old blocks and do a re-run? I suppose it would have been a last resort but even if i'd been faced with that; I probably would of ended up chickening out and getting a universal.
> 
> None of these multimeters seem to list their decimal places.. They all just list their minimum reading. The very cheap ones are 200mv. Not sure that's going to be accurate.


It looks like the only thing they've had to make any adjustments for is the backplate screws....which aren't that big of a deal anyway. No idea why EK won't even look at the board, and make sure the fitment is right, but there are 3 guys by my count on the EVGA forums that have used the blocks, AND got the backplates to work by buying different screws.....

Seriously....if the full coverage block would work, it's gonna be better than the universal with full coverage adapter. At least....that would be my choice.

I'm just hoping when the 1080ti comes out, that the blocks work for those as well, cause......I have 2 of those blocks that came off my 980ti that I'll be using. = ) Kinda wish I had known before I bought the FTWs......


----------



## LiquidHaus

Got my EVGA PowerLink in the mail yesterday...





It's a little bulky; I probably won't end up using it with my current setup since it doesn't flow as well with my system, but it's a good thing to have for sure.


----------



## Yomny

Anyone experience fan fluctuations when the GPU is under load? Without adjusting anything through software, using the card's default fan setup, it seems the fans can't hold a steady RPM. At times the RPMs jump from 1k to 2k.


----------



## TWiST2k

FYI for everybody: I never received my email from EVGA after signing up for the PowerLink, but I just checked my account and it was approved days ago and was awaiting my payment for shipping. So anybody else that has requested one, don't rely on getting an email; log in and double check.


----------



## keikei

Quote:


> Originally Posted by *TWiST2k*
> 
> FYI for everybody, I never received my email from EVGA after signing up for the powerlink, but I just checked my account and it was approved days ago and was waiting my payment for shipping. So anybody else that has requested one, do not rely on getting an email, login and double check.


Thanks for posting. It explains why I haven't had a response after the initial pad submission. They couldn't cover shipping, seriously?


----------



## juniordnz

Quote:


> Originally Posted by *TWiST2k*
> 
> FYI for everybody, I never received my email from EVGA after signing up for the powerlink, but I just checked my account and it was approved days ago and was waiting my payment for shipping. So anybody else that has requested one, do not rely on getting an email, login and double check.


It should be on the promotion page, right? Down at the bottom, where before it said "Awaiting Approval, please wait 5-7 days".

Nothing yet for Brazil...

Also, they are charging 25 USD to ship to Brazil


----------



## M3Stang

So after having random no-post boots and black screens with the fan spinning at 100% on my GTX 1070 SC ACX 3.0 EVGA, I returned it to Best Buy. I am now deciding to go with a GTX 1080 of sorts. Running on my 6700K iGPU right now and only using one monitor sucks. I'm torn about which 1080 to get. EVGA support has always been awesome with me and they were helping me through this, but since I am a BB Elite member I found out I was still in my return window, so I returned it today and cancelled the RMA. I love EVGA, but with their recent issues and my current experience I am afraid to get the GTX 1080 FTW, which is the one I want (mainly because of RGB). I was looking at the GIGABYTE G1 and the ZOTAC and ASUS STRIX and they all look great, but I know NOTHING about their customer service. Should I get the 1080 at this point from EVGA? Or get something else? Thanks.


----------



## Dragonsyph

Quote:


> Originally Posted by *M3Stang*
> 
> So after having random no post boots and black screen fan spinning 100% with my GTX 1070 SC ACX 3.0 EVGA, I returned it to best buy. I am now deciding to go with a GTX 1080 of sorts. Running on my 6700k iGPU right now and only using one monitor sucks. Im torn about which 1080 to get. EVGA support has always been awesome with me and they were helping me through this but since I am a BB elite member I found out I was still in my return window and returned it today and cancelled the RMA. I love EVGA but with the recent issues with them and my current experience I am afraid to get the GTX 1080 FTW, which is the one I want (mainly because of RGB). I was looking at the GIGABYTE G1 and the ZOTAC and ASUS STRIX and they all look great but I know NOTHING about their customer service. Should I get the 1080 at this point from EVGA? Or get something else? Thanks.


The EVGA 1080 FTW Hybrid is only 730 bucks and comes with a free 60-dollar game. It's the card I ordered, and from the reviews most OC past 2150 with temps around 45°C (40°C if you replace the rad fan).


----------



## nrpeyton

Quote:


> Originally Posted by *M3Stang*
> 
> So after having random no post boots and black screen fan spinning 100% with my GTX 1070 SC ACX 3.0 EVGA, I returned it to best buy. I am now deciding to go with a GTX 1080 of sorts. Running on my 6700k iGPU right now and only using one monitor sucks. Im torn about which 1080 to get. EVGA support has always been awesome with me and they were helping me through this but since I am a BB elite member I found out I was still in my return window and returned it today and cancelled the RMA. I love EVGA but with the recent issues with them and my current experience I am afraid to get the GTX 1080 FTW, which is the one I want (mainly because of RGB). I was looking at the GIGABYTE G1 and the ZOTAC and ASUS STRIX and they all look great but I know NOTHING about their customer service. Should I get the 1080 at this point from EVGA? Or get something else? Thanks.


All new cards shipped from EVGA have the fix applied. Been this way for over a week now. If you grab a card from EVGA you've nothing to worry about.

=========

Anyone on an *AMD-FX CPU with a 1080*?? I need someone I can compare scores with


----------



## Valtava

Quote:


> Originally Posted by *nrpeyton*
> Anyone on an *AMD-FX CPU with a 1080*?? I need someone I can compare scores with


I have..


----------



## DStealth

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone on an *AMD-FX CPU with a 1080*?? I need someone I can compare scores with


Run TimeSpy or FS Ultra in order to reduce the CPU dependency and compare to Intel scores









----------



## Koniakki

Quote:


> Originally Posted by *M3Stang*
> 
> So after having random no post boots and black screen fan spinning 100% with my GTX 1070 SC ACX 3.0 EVGA, I returned it to best buy. I am now deciding to go with a GTX 1080 of sorts. Running on my 6700k iGPU right now and only using one monitor sucks. Im torn about which 1080 to get. EVGA support has always been awesome with me and they were helping me through this but since I am a BB elite member I found out I was still in my return window and returned it today and cancelled the RMA. I love EVGA but with the recent issues with them and my current experience I am afraid to get the GTX 1080 FTW, which is the one I want (mainly because of RGB). I was looking at the GIGABYTE G1 and the ZOTAC and ASUS STRIX and they all look great but I know NOTHING about their customer service. Should I get the 1080 at this point from EVGA? Or get something else? Thanks.


My vote goes to the EVGA FTW. EVGA handled the whole issue professionally and lived up once again to its highly praised support.

And as *nrpeyton* said above, and I quote: "All new cards shipped from EVGA have the fix applied. Been this way for over a week now. If you grab a card from EVGA you've nothing to worry about."

I would personally avoid the G1 Gaming tbh. I have one right here and this thing chokes on itself, even after flashing the Xtreme BIOS. So unless it's a dud card, I can't recommend it.

Also, the STRIX is worth considering, mainly for its T4 vbios if you care about that; but since that works on most models, it's down to personal preference in the end.

And if by Zotac you mean the AMP Extreme, that's a highly praised card too. But I vote for the FTW again from the choices you posted.

Can't go wrong with EVGA. And if something does go wrong, we can rest assured they will fix it.


----------



## Derek1

Well, someone over at the EVGA forums has "discovered" the T4 bios. lol
http://forums.evga.com/GTX-1080-Unlocked-Bios-Voltagefan-limit-increased-NEW-possible-voltage-tool-m2548379.aspx
They seem to need some help though, one Classy user especially.
https://s11.postimg.org/6p7mzj5yr/OC_1080.png


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> It looks like the only thing they've had to make any adjustments for is the backplate screws....which aren't that big of a deal anyway. No idea why EK won't even look at the board, and make sure the fitment is right, but there are 3 guys by my count on the EVGA forums that have used the blocks, AND got the backplates to work by buying different screws.....
> 
> Seriously....if the full coverage block would work, it's gonna be better than the universal with full coverage adapter. At least....that would be my choice.
> 
> I'm just hoping when the 1080ti comes out, that the blocks work for those as well, cause......I have 2 of those blocks that came off my 980ti that I'll be using. = ) Kinda wish I had known before I bought the FTWs......


VRM temp isn't really that important as long as you keep it under specification; apart from the heat that may get conducted along the board, adding to GPU temps?

Suppose you still make a good point though; it would be interesting to know exactly how *much* heat is actually conducted from the VRM to the GPU.
Quote:


> Originally Posted by *Valtava*
> 
> I have..


What kind of scores are you getting on 3DMark? If you do a quick Google search you'll be able to find an unlock code that gets you everything except the ability to run a custom Time Spy. Can't remember exactly where I got it, but that's what I did lol
Quote:


> Originally Posted by *Derek1*
> 
> Well, someone over at the EVGA forums has "discovered" the T4 bios. lol
> http://forums.evga.com/GTX-1080-Unlocked-Bios-Voltagefan-limit-increased-NEW-possible-voltage-tool-m2548379.aspx
> They seem to need some help though, one Classy user especially.
> https://s11.postimg.org/6p7mzj5yr/OC_1080.png


haha that was funny as ****, made ma afternoon reading through it lol


----------



## nrpeyton

deleted:

what is up with these boards recently? I keep getting a [form already submitted] message and then my post appears twice.

I only clicked 'submit' once.

These forums also seem to crash a lot in Microsoft Edge; I'm going to have to start using Chrome again :-(


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> deleted:
> 
> what is up with these boards recently; keep getting a message [form already submitted] then my post appears twice.
> 
> I only clicked 'submit' once.
> 
> these forums do also seem to crash a lot on Microsoft Edge, I'm going to have to start using chrome again :-(


Been happening all over the threads to tons of people, not just you.


----------



## SirCanealot

Hey guys,

I'm thinking of flashing my EVGA FE 1080 with the T4 bios (I have an Accelero Xtreme IV coming in the post very soon anyway, so cooling will not be an issue) and I have a few questions. I'm sorry if these have been answered already, but 800 page thread and GPU worth $$$ so I want to be sure!

So I'm flashing this bios:
https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803

And I'm using the NVFlash for signed drivers. Am I using the same commands I used for my 980? EG:

How to backup your current BIOS with nvflash:
nvflash -b backupbios.rom
How to flash the modified BIOS with nvflash:
nvflash -6 modifiedbios.rom

And a question on this bios:

Am I still able to limit power use? EG, if I set the power limit down to 50%, will the card still be forced to use less power or is the power limit slider completely disabled? I do like to force the clock down when not playing intensive games









If this is the case, would I be better served flashing one of the bioses that simply have a higher board power limit? If so, is there one people recommend?

Thanks so much!


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> VRM temp isn't really that important as long as you keep it under specification; apart from the heat that may get conducted along board, adding to GPU temp performance?
> 
> suppose u still make a good point though; it would be interesting to know exactly how *much* heat is actually conducted from VRM to GPU.
> What kind of scores are you getting on 3dmark? If you do a quick google search you'll be able to find an unlock code that will get you everything except the ability to run a "custom timespy". Can't remember exactly where I got it but that's what I done lol
> haha that was funny as ****, made ma afternoon reading through it lol


And memory. There's memory between the VRM and the core as well. Electronics run better, and more efficiently when they're not hot. Since you're talking about adding voltage, you're already looking at making them even hotter. Just sayin....for serious overclocking, you'd be much better off with a full coverage block.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Well, someone over at the EVGA forums has "discovered" the T4 bios. lol
> http://forums.evga.com/GTX-1080-Unlocked-Bios-Voltagefan-limit-increased-NEW-possible-voltage-tool-m2548379.aspx
> They seem to need some help though, one Classy user especially.
> https://s11.postimg.org/6p7mzj5yr/OC_1080.png


He's an idiot....


----------



## nrpeyton

Quote:


> Originally Posted by *SirCanealot*
> 
> Hey guys,
> 
> I'm thinking of flashing my EVGA FE 1080 with the T4 bios (I have an Accelero Xtreme IV coming in the post very soon anyway, so cooling will not be an issue) and I have a few questions. I'm sorry if these have been answered already, but 800 page thread and GPU worth $$$ so I want to be sure!
> 
> So I'm flashing this bios:
> https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803
> 
> And I'm using the NVFlash for signed drivers. Am I using the same commands I used for my 980? EG:
> 
> How to backup your current BIOS with nvflash:
> nvflash -b backupbios.rom
> How to flash the modified BIOS with nvflash:
> nvflash -6 modifiedbios.rom
> 
> And a question on this bios:
> 
> Am I still able to limit power use? EG, if I set the power limit down to 50%, will the card still be forced to use less power or is the power limit slider completely disabled? I do like to force the clock down when not playing intensive games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If this is the case, would I be better served flashing one of the bioses that simply have a higher board power limit? If so, is there one people recommend?
> 
> Thanks so much!


the command you need to use is as follows:

nvflash --overridesub [filename.rom] <<-- without the [ ]'s

Easiest way to do it is to have nvflash.exe and the BIOS in the same folder, then FILE > OPEN COMMAND PROMPT (that way you'll already be in the correct directory).


You need the --overridesub switch to override the vendor mismatch, because you're cross-flashing a different manufacturer's BIOS; it's the only way to do it.
Make sure you back up your existing BIOS first and save it in two places (use GPU-Z).
Also make sure you disable the graphics driver in Device Manager before you do anything.
And there will be NO way to limit power use (the power controls will be completely disabled); don't worry though, the card will only draw what it needs. Just keep an eye on temps, and note your FE max fan speed will be lower too.

P.S.

You may also need to run nvflash --protectoff first to disable write protection before you begin.
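The order of operations above can be sketched as a small command-plan helper (the flags and sequence come from this post; the `flash_plan` wrapper itself is hypothetical, and nothing here actually runs nvflash):

```python
def flash_plan(backup_rom, new_rom, cross_vendor=True):
    """Assemble the nvflash command sequence described above as argv lists.

    Actually running them (and disabling the display driver first) is
    left to the user.
    """
    plan = [
        ["nvflash", "--protectoff"],    # disable write protection first
        ["nvflash", "-b", backup_rom],  # back up the current BIOS
    ]
    flash = ["nvflash", new_rom]
    if cross_vendor:
        # --overridesub skips the vendor/subsystem mismatch check when
        # cross-flashing another manufacturer's BIOS.
        flash.insert(1, "--overridesub")
    plan.append(flash)
    return plan

for cmd in flash_plan("backupbios.rom", "t4.rom"):
    print(" ".join(cmd))
```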


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> He's an idiot....


I'd still like to know how it's reading a core clock speed of 2500. When I flashed the T4 to my Classified I had similar results; probably just numbers reporting incorrectly. Mind you, mine still crashed trying to run a 3DMark run at 2300.

Going to whip the new multimeter out tonight and see what hardware voltages are detected while on the T4, to see if that was the problem all along; it's very possible the T4 was simply unable to "talk to" the Classified voltage controller.

The FTW and STRIX must be using the same or a similar voltage controller, or at least a variant from the same manufacturer; EVGA and ASUS obviously use some of the same suppliers.


----------



## Dragonsyph

WOOOT



https://www.techpowerup.com/gpuz/details/wwbym


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> WOOOT
> 
> 
> 
> 
> https://www.techpowerup.com/gpuz/details/wwbym


Welcome to the club.









Now remember to break that puppy in nice and slow, don't be too hard on it. lol


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Welcome to the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now remember to break that puppy in nice and slow, don't be too hard on it. lol


About to do a 3dmark run out of box. 8).


----------



## Dragonsyph

Out of the box: 23644 graphics score. Core was at 2011MHz, 44°C the entire time. That any good?


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> Out of the box, 23644 graphics score. Core was at 2011mhz 44c entire time. That any good?


Probably about average out of the box. Mine was 1987; Junior, I believe, said his was 2025. Saw a ROG O8G at 2050.


----------



## Valtava

Quote:


> Originally Posted by *nrpeyton*
> 
> What kind of scores are you getting on 3dmark? If you do a quick google search you'll be able to find an unlock code that will get you everything except the ability to run a "custom timespy". Can't remember exactly where I got it but that's what I done lol


Time Spy 8,116 http://www.3dmark.com/spy/642321
Fire Strike 24,742 http://www.3dmark.com/fs/10549590
FS Ultra 5,859 http://www.3dmark.com/fs/10434052


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I'd still like to know how its reading a core clock speed of 2500. When I flashed the T4 to my Classified I had similar results. Probably just numbers reporting incorrectly. Mind you; mine still crashed trying to run a 3dmark run at 2300.
> 
> Going to whip the new multimeter out tonight and see what hardware voltages are detected while on T4 to see if that was the problem all along; its very possible the T4 was simply unable to "talk to" the Classiied voltage controller.
> 
> FTW and STRIX obviously must be using similar or same voltage controller or at least a variant from the same manufacturer, obviously EVGA/ASUS use some of the same suppliers.


Regardless, it's fairly obvious it's having "issues", but he seems to think he's running 2500 on air.....his buggy scores in Heaven show differently, though.


----------



## nrpeyton

Quote:


> Originally Posted by *Valtava*
> 
> Time Spy 8 116 http://www.3dmark.com/spy/642321
> Fire Strike 24 742 http://www.3dmark.com/fs/10549590
> FS Ultra 5 859 http://www.3dmark.com/fs/10434052


thanks, I'll do a few soon too and come back with my scores... which 1080 do you have?

My CPU is also at 4.8GHz, but I'm lucky if I get 20 minutes of Prime95 at 1.575v (1.525v on the core + a motherboard option to set a +50mv offset). I can tell you're overclocking on the bus (I think); I'm using the multiplier...
Quote:


> Originally Posted by *Vellinious*
> 
> Regardless, it's fairly obvious it's having "issues", but he seems to think he's running 2500 on air.....his buggy scores in Heaven show differently, though.


true lol


----------



## Dragonsyph

This score any good?


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> This score any good?


The score is good. No telling what those clocks are, though....so really, it's kind of hard to tell.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> The score is good. No telling what those clocks are, though....so really, it's kind of hard to tell.


Its boosting to 2177 core.


----------



## nrpeyton

Quote:


> Originally Posted by *Valtava*
> 
> Time Spy 8 116 http://www.3dmark.com/spy/642321
> Fire Strike 24 742 http://www.3dmark.com/fs/10549590
> FS Ultra 5 859 http://www.3dmark.com/fs/10434052


Hi mate, just done a few Time Spy runs for comparison:

*First run:*

CPU: 4.8ghz (O/C using multiplier)
GPU: +113mhz core & +500 memory

Graphics Score: 7716
CPU Score: 3 516
Total Score: 6543

*Second run:*

CPU: 4.8ghz (O/C using bus this time)
GPU: Same as above

Graphics Score: 7820
CPU Score: 3705
Total Score: 6703

Changing to the "bus" overclocking method = better result. I can't get the system to boot with memory past 1600 using this method, though; noticed yours is running in the 1900s. Mine won't boot at that :-( Even tried 14-15-15-40 (and 39) timings, no luck :-(

You still beating me on both CPU and GPU though lol, at least now though I've got a goal to try and hit your score as we're on the same hardware 

So O/C'ing using the multiplier and selecting 2133MHz for memory is scoring WORSE than O/C'ing on the BUS with memory at only 1600MHz
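The multiplier-vs-bus comparison above comes down to how the clocks are derived; here is a quick sketch (illustrative numbers, not anyone's actual BIOS settings) of why raising BCLK drags the memory clock up with it:

```python
# Illustrative sketch: how CPU core and DRAM clocks derive from the base
# clock (BCLK). Numbers are examples, not anyone's actual BIOS settings.

def cpu_clock_mhz(bclk_mhz: float, multiplier: float) -> float:
    """Core frequency is BCLK times the CPU multiplier."""
    return bclk_mhz * multiplier

def dram_clock_mhz(bclk_mhz: float, dram_ratio: float) -> float:
    """DRAM frequency also scales off BCLK, via the memory ratio/strap."""
    return bclk_mhz * dram_ratio

# Multiplier overclock: BCLK stays at 100 MHz, multiplier raised to 48.
print(cpu_clock_mhz(100, 48))      # 4800.0 MHz

# Bus overclock: raise BCLK instead. The core lands in the same place,
# but DRAM (and everything else tied to the bus) speeds up with it --
# which is why a memory strap that booted before may no longer boot.
print(cpu_clock_mhz(104.3, 46))    # ~4797.8 MHz
print(dram_clock_mhz(104.3, 16))   # a "1600" strap now runs ~1668.8 MHz
```

This is why the same 4.8GHz target can behave differently depending on which knob got it there.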


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Its boosting to 2177 core.


And the memory? At +1000 memory, it's possible it bugged the FS run (it may have bugged and ran in black and white, or dropped texture fill).


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> And the memory? At +1000 memory, it's possible it bugged the FS run (it may have bugged and ran in black and white, or dropped texture fill).


What do you mean it's bugged?


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> What do you mean it's bugged?


I didn't say it was. I said it's very possible that, at +1000 offset on the memory, the run may very well have bugged and be showing higher fps than it should. If it ran in black and white, or the walkways / walls were all black without any "fill", it's a bugged run.

Not sure which card you have, but from my experience anything above +650 offset for memory clock is pretty unheard of.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I didn't say it was. I said it's very possible, that at +1000 offset on the memory, the run may very well have bugged, and showing a higher fps than it should. If it ran in black and white, or the walkways / walls were all black without any "fill", it's a bugged run.
> 
> Not sure which card you have, but from my experience anything above +650 offset for memory clock is pretty unheard of.


I bumped the voltage on my memory last night using the new Classified voltage tool and did a few runs at +900 with no artifacts; loaded up 'The Witcher 3' and had no issues there either.

When I stood in an empty area (and looked up at the sky -- to stabilise FPS in game) my FPS kept increasing all the way up to +900... but the weird thing is, +899 or +901 would score less than +900. Same thing happens around 500: 499 or 501 = 3 FPS lower... no idea how!

Haven't had a chance to verify that memory voltage with the multimeter yet, but that's the plan tonight 

I think the Classified has 3 VRM phases just for memory; not sure how much that actually helps though

edit:
be back in a minute actually; i'll do an actual timespy run and see what happens....


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> I didn't say it was. I said it's very possible, that at +1000 offset on the memory, the run may very well have bugged, and showing a higher fps than it should. If it ran in black and white, or the walkways / walls were all black without any "fill", it's a bugged run.
> 
> Not sure which card you have, but from my experience anything above +650 offset for memory clock is pretty unheard of.


Nothing was black, everything looked good, had no artifacts. And im playing FarCry 4 right now with same settings.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Nothing was black, everything looked good, had no artifacts. And im playing FarCry 4 right now with same settings.


At +1000 memory offset.... What's GPUz read for memory clock?


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> At +1000 memory offset.... What's GPUz read for memory clock?


GPU-z says 1501, stock 1251.


----------



## Dragonsyph

In game its saying 6003MHz


----------



## nrpeyton

Just did a run with +925 mem: started off great, then crashed towards the end. Despite the semi-crash the run still managed to *finish* with a fully validated score.

That was with no extra voltage on the mem this time; going to grab the multimeter now and see if the memory slider on the tool is actually doing anything before I bother trying more.


----------



## ucode

Quote:


> Originally Posted by *Dragonsyph*
> 
> This score any good?


That's a great score, and being able to do +1000 on the memory without crashing, artifacts, or overheating is exceptional.

10Gbps GDDR5X is the slowest Micron produces and is what's used in the 1080s. The speed you have would be equivalent to what their 12Gbps chips achieve at default. Hopefully yields have increased and/or you got very lucky with the memory on your board.
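The 1251/1501 GPU-Z readings, the ~6000MHz in-game readout and the "10 Gbps" marketing figure are all the same clock reported at different multipliers. A small sketch of the usual GDDR5X conversions (the x4 and x8 factors are the standard conventions; the helper names are just for illustration):

```python
# GDDR5X clock bookkeeping: the same memory clock is reported at different
# multipliers depending on the tool. The x4 and x8 factors are the usual
# GDDR5X conventions; the function names are just for illustration.

def effective_gbps(base_mhz: float) -> float:
    """Marketing 'Gbps per pin': GDDR5X moves 8 bits per pin per base clock."""
    return base_mhz * 8 / 1000

def osd_mhz(base_mhz: float) -> float:
    """Many in-game OSDs report the double-data-rate command clock (x4)."""
    return base_mhz * 4

stock = 1251        # GPU-Z reading at stock, from the posts above
overclocked = 1501  # GPU-Z reading at the +1000 slider offset

print(effective_gbps(stock))        # ~10.0 -- the 1080's rated 10 Gbps
print(osd_mhz(overclocked))         # 6004 -- matches the ~6003MHz in-game readout
print(effective_gbps(overclocked))  # ~12.0 -- the 12Gbps-chip territory mentioned above
```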


----------



## nrpeyton

Quote:


> Originally Posted by *ucode*
> 
> That's a great score and being able to do +1000 on memory without crashing, artifacts, overheating is exceptional.
> 
> 10Gbps GDDR5X is the slowest Micron produces and is what is used in the 1080's. The speed you have would be equivalent to what their 12Gbps chips would achieve default. Hopefully yields have increased and / or you got very lucky with the memory on your board.


agreed, very lucky indeed mate. Just crashed twice with +925 in Time Spy, although it wasn't an immediate crash... it actually ran at that for a short period.

Right; just tested the memory slider on the Classified voltage tool; it's working 100% 

What's a safe memory voltage for these cards?

rep +1 for any useful information


----------



## Dragonsyph

+151 core +600 memory, score went down a lot.


----------



## Vellinious

Certainly looks good.....I wouldn't have thought +1000 offset on the memory even possible.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> Certainly looks good.....I wouldn't have thought +1000 offset on the memory even possible.


Been trying out DSR x4 but seems to crash my games.


----------



## Dragonsyph

26,154 with +150 core +900 memory.

Any other benchmark i should try?


----------



## Dragonsyph

Ya, the highest core I can do without crashing is 150, which boosts to 2177, and +1000 memory, which is the max the slider will go.

Tried 400, 500, 600, 700, 800, 900, and 1000 memory. Highest score is with +1000.


----------



## juniordnz

+1000 on memory is insane. Most cards won't get a performance increase after +575. Those are some crazy good memory modules you got there, mate.


----------



## Dragonsyph

Quote:


> Originally Posted by *juniordnz*
> 
> +1000 on memory is insane. Most cards won't get a performance increase after +575. That's same crazy good memory modules you got there, mate.


Thanks, at least it's got something good. It seems to have a lot of coil whine; can't hear it over games, but it's pretty loud while benching.


----------



## ucode

Quote:


> Originally Posted by *Dragonsyph*
> 
> Any other benchmark i should try?


You could try this one, which likes memory OC:

https://render.otoy.com/forum/download/file.php?id=55283

However, it uses CUDA, so you might find your memory default clock is lower and you might need to be a little creative in order to push it to the max.

It's very light on the CPU, so you might want to disable C-states for max performance. See if you can make 180.


----------



## xixou

In case you have more than two Pascal GPUs,
there is a trick to make games work in tri/quad SLI:


----------



## DStealth

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ya highest core i can do with out crashing is 150, which boosts to 2177, and +1000 memory which is max the slider will go.
> 
> Tried 400, 500, 600, 700, 800, 900, and 1000 memory. Highest score is with +1000.


Could you provide a validation link for comparison purposes... it seems too high to me, gaining almost 1k points from running 1GHz more on the memory than the regular cards...


----------



## Dragonsyph

Quote:


> Originally Posted by *DStealth*
> 
> Could you provide a validation link for compare purposes ...seems too high to me to gain almost 1k points from running 1ghz more on the memory than the regular cards...


I don't have a valid product key (tor version) so I can't upload scores. What other than the score would you like to see?


----------



## grimboso

Quote:


> Originally Posted by *Dragonsyph*
> 
> I don't have a valid product key(tor version) so i can't upload scores. What other then the score would you like to see?


You can download the program from Steam and use the demo, then upload scores, if you want to validate them!


----------



## DStealth

Quote:


> Originally Posted by *Dragonsyph*
> 
> I don't have a valid product key(tor version) so i can't upload scores. What other then the score would you like to see?


No problem, you can just remove the key and run it in demo mode... I have some results uploaded w/o issues. A valid score gives an idea of driver-modified settings. Once again, your score is way higher than the average. Not trying to insult you, but it looks too good to be true. Maybe FS just scales that much with higher memory, I dunno...
Here's a Time Spy from me @ 2164/+550 just for comparison - http://www.3dmark.com/spy/658129
You can try this one also, Ultra - http://www.3dmark.com/fs/10335528


----------



## Dragonsyph

Quote:


> Originally Posted by *DStealth*
> 
> No problem, you can just remove the key and run it in demo mode...i have some results uploaded w/o issues. Valid score gives idea of driver modified settings. Once again your score is way higher than the average. Not going to insult you but it looks too good to be true. Maybe just FS scales so much with higher memory i dunno...
> Here's is a Timespy from me @2164/+550 just for compare - http://www.3dmark.com/spy/658129
> You can try this one also Ultra - http://www.3dmark.com/fs/10335528


I haven't touched the drivers, except trying out DSR at x4.

It says Valid at the top until I try to upload it, then it says invalid registry key.


----------



## Krzych04650

So to sum up my thoughts about MSI Gaming card...

Got a 2076/5500 OC. 2101 is only possible at very low temps (~40C); it's doable with 100% fan speed, 100% case fan speed and a 10C ambient temp, but for normal usage obviously that's out. 2076 is stable at any temperature.

Power limit is not an issue; it's hard to even reach 100% with the max possible overvoltage and temperature (voltage increases significantly with temp: 0.993V at 35C and 1.093V at 70C) after OC. Only Time Spy manages to hit the peak power limit of 105%; things like Valley or Witcher 3 take around 90%.

Scores:
Time Spy: http://www.3dmark.com/spy/728879
Fire Strike: http://www.3dmark.com/fs/10784049

Temps (Fractal Design R5 with 5V/400 RPM 140mm fans, 2x front, 1x bottom, 1x side and 1x top intake, 1x rear exhaust):
- 40% fan speed (quiet) - 60 C over ambient
- 60% fan speed (somewhat balanced) - 40C over ambient
- 100% fan speed (jet engine) - 30 C over ambient
- 100% fan speed + 12V/800 RPM case fans (dual jet engine) - 27 C over ambient

So cooler efficiency is rather unsatisfying, and I wouldn't buy this card if I had my PC in the same room I play in.

Noise is relative, but I can say that the fans are not making any strange noises, just gentle humming; gentle for someone with normal hearing, not for someone like me. For me, only silence is gentle.

Coil whine is there like with any other GPU in the world, I have no idea what people are talking about while saying that there is no whine on their cards. Either I have overly sensitive hearing or you are deaf, probably the first option.

Overall a good card only for SLI on air, because it is wider instead of thicker and doesn't use 3 slots for cooling, and this is why I got it; it's exactly a dual-slot card, so it will have space to breathe after I add a second card. I don't recommend this card if you're only getting one; there are much more efficient coolers around. Things like the Palit or Zotac Extreme can do like 10C better. For a single GPU the Palit Super JetStream is the way to go in my opinion: cheap, dual BIOS, similar power limit (the regular JetStream has a horribly low power limit, so watch out) and very efficient cooling. But unfortunately it is too thick for a 3-slot SLI config.
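Since the temperature figures above are deltas over ambient, the absolute GPU temperature depends on the room; a trivial helper to translate them (the deltas are quoted from the list above, the ambient value is just an example):

```python
# Convert the delta-over-ambient figures from the post above into absolute
# GPU temps for a given room temperature. Deltas are quoted from the post;
# the ambient value is just an example.

DELTAS_C = {  # fan profile -> reported delta over ambient (deg C)
    "40%": 60,
    "60%": 40,
    "100%": 30,
    "100% + 800rpm case fans": 27,
}

def absolute_temps(ambient_c: float) -> dict:
    return {fans: ambient_c + delta for fans, delta in DELTAS_C.items()}

# A 22C room puts the quiet 40% profile at 82C -- uncomfortably warm.
print(absolute_temps(22))
```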


----------



## Dragonsyph

Here's a Time Spy run at +150 core / +1000 memory, boosting to about 2164-2177. GPU usage went down to like 60% during the tests, so not sure why this benchmark score is that good.


----------



## Valtava

Quote:


> Originally Posted by *nrpeyton*
> 
> Hi mate, just done a few Time Spy runs for comparison:
> 
> *First run:*
> 
> CPU: 4.8ghz (O/C using multiplier)
> GPU: +113mhz core & +500 memory
> 
> Graphics Score: 7716
> CPU Score: 3 516
> Total Score: 6543
> 
> *Second run:*
> 
> CPU: 4.8ghz (O/C using bus this time)
> GPU: Same as above
> 
> Graphics Score: 7820
> CPU Score: 3705
> Total Score: 6703
> 
> Changing to "bus" overclocking method = better result. I can't get system to boot with memory past 1600 though using this method, noticed yours is running in the 1900's. Mine won't boot at that :-( Even tried a 14, 15, 15, 40 (and 39) no luck :-(
> 
> You still beating me on both CPU and GPU though lol, at least now though I've got a goal to try and hit your score as we're on the same hardware
> 
> So O/C'ing using multiplier and selecting 2133mhz for memory is scoring WORSE than O/C'ing on the BUS with memory at only 1600mhz


Sometimes memory OC makes it worse... dunno why. If I go more than 2000MHz, points start to drop... But now I found the optimum: 215x22 = 4,730MHz and power +220 at extreme; VCORE is 1.560V... Temps +/- 60


----------



## ucode

Quote:


> Originally Posted by *DStealth*
> 
> Maybe just FS scales so much with higher memory i dunno...


Why don't you know? Run FS with your normal mem OC, then run at the default mem clock and see the difference.

For most of us there's a point where there's a sharp drop-off in performance with mem clock. If you continue past that point, performance starts increasing again, but most of us will hit artifacts and crashes before it exceeds the previous gain.
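A hedged way to automate the kind of manual sweep being compared here: keep the best clean score and discard any run that artifacted, since (as noted elsewhere in the thread) a bugged run can post an inflated score. The helper and all sample numbers below are invented for illustration:

```python
# A sketch of automating a memory-offset sweep: keep the best clean score
# and discard any run that artifacted, since a bugged run can post an
# inflated score. All sample numbers below are invented for illustration.

def best_offset(results):
    """results: list of (offset_mhz, graphics_score, artifacts_seen)."""
    clean = [(offset, score) for offset, score, bugged in results if not bugged]
    return max(clean, key=lambda r: r[1])[0]

sweep = [
    (0,   7300, False),
    (300, 7520, False),
    (500, 7610, False),  # typical sweet spot before the drop-off
    (650, 7540, False),  # past the knee: error retries cost throughput
    (800, 7900, True),   # higher score, but it artifacted -- discard
]
print(best_offset(sweep))  # 500
```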


----------



## Krzych04650

Quote:


> Originally Posted by *ucode*
> 
> Why don't you know? Run FS with your normal mem OC then run at default mem clock and see the difference.
> 
> For most of us we reach a point where there's a sharp drop of in performance with mem clock. Now if we continue past that point performance starts increasing again but most of us will not get to the point where it exceeds our previous gain before artifacts and crashes will occur.


Exactly. I get the best scores at 5500 MHz memory, which is +500 offset. After that scores are decreasing significantly, basically all benefit from memory overclocking is gone, and even +800 offset scores are worse than +500, in games and in benchmarks.


----------



## Dragonsyph

Quote:


> Originally Posted by *Krzych04650*
> 
> Exactly. I get the best scores at 5500 MHz memory, which is +500 offset. After that scores are decreasing significantly, basically all benefit from memory overclocking is gone, and even +800 offset scores are worse than +500, in games and in benchmarks.


For me scores went up every time i raised it +100.


----------



## ucode

Could be something in that special version 86.04.3B.01.80 VBIOS. Would be interesting to see if others with 3B can do the same, and what chip numbers are on the memory. Maybe AB is holding you back with that slider.


----------



## juniordnz

Quote:


> Originally Posted by *ucode*
> 
> Could be something in that special version 86.04.3B.01.80 VBIOS. Would be interesting to see if others with 3B can do the same and what chip numbers are on the memory. Maybe AB is holding yo back with that slider


I'm on a 3B vBIOS and just now tried +675 on memory. Got some flashes in the second scene, and although it validated, I got 150 points less compared to my current +575 OC.

Guess it has nothing to do with the vBIOS, unfortunately.


----------



## ucode

Thanks @juniordnz. I cross-flashed a 3B.00 (could not find a 3B.01) on an FE card, but the nVidia driver didn't want to have anything to do with it.


----------



## Vellinious

HA

http://www.3dmark.com/3dm/16027686



Kidding....it bugged. It would have validated, though, if I had run more than just graphics tests 1 and 2. +800 memory was where the "magic" happened: the bench ran in black and white, and frame rates were crazy high.


----------



## DStealth

Quote:


> Originally Posted by *ucode*
> 
> Why don't you know? Run FS with your normal mem OC then run at default mem clock and see the difference.
> .


I made a comparison somewhere at the beginning of the thread from 0 to +600 with the numbers. It's strange to me that in Time Spy we have exactly the same score (mine [email protected] vs his [email protected]) despite the +450 mem OC, while in FS he's more than 700-800 points ahead... or, comparing FPS, quite a high % difference... Dunno, maybe his 4790 helps there in a 1080p bench compared to my [email protected]+QC_3100cl14_1t...


----------



## NYU87

Quote:


> Originally Posted by *Vellinious*
> 
> HA
> 
> http://www.3dmark.com/3dm/16027686
> 
> 
> 
> Kidding....it bugged. Would have validated though, if I had run more than just graphics test 1 and 2. +800 memory was where the "magic" happened. Bench ran in black and white. Frame rates were crazy high.


How...?


----------



## Dragonsyph

Quote:


> Originally Posted by *DStealth*
> 
> I made a comparison somewhere at the beginning of the thread from 0 to +600 with the numbers. It's strange to me in Timespy we have exactly the same score(mine [email protected] vs his [email protected]) despite the +450 mem OC while in FS he has more than 700-800 point ahead...or if comparing FSP quite high % difference...Dunno maybe his 4790 helps over there while 1080p bench compared to my [email protected]+QC_3100cl14_1t...


For me in Time Spy I notice the GPU utilization going as low as 64% during the graphics tests. Not sure why. And I just noticed my RAM is at 1866MHz; do you think I would get higher GPU scores if it's at 2133?


----------



## Vellinious

Quote:


> Originally Posted by *NYU87*
> 
> How...?


It happens....I used to get bugged runs all the time with my 980 Ti. Sometimes the textures would drop out and cause frame rates to skyrocket. Other times, it'd just bug out and run in black and white. 3DMark can't tell the difference, so.....if you let the test run, it'll likely validate. I don't ever let them go through. Seems dishonest. I know some that do, though....pretty shady.
Quote:


> Originally Posted by *Dragonsyph*
> 
> For me in time spy i notice the GPU utilization going low as 64% doing graphics tests. Not sure why. And i just noticed my ram is at 1866mhz, you think i would get higher gpu scores if its at 2133?


Probably not. But your CPU score might jump up a bit....as long as you don't have to adjust the timings.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> It happens....I used to get bugged runs all the time with my 980ti. Sometimes the textures would drop out, and cause frame rates to sky rocket. Other times, it'd just bug out and run in black and white. 3D Mark can't tell the difference, so.....if you let the test run, it'll likely validate. I don't ever let them go through. Seems dishonest. I know some that do, though....pretty shady.
> Probably not. But your CPU score might jump up a bit....as long as you don't have to adjust the timings.


Mine just ends the benchmark if it's not stable. Tried +1000 memory in Unigine Heaven and it crashes around 20/28 when benching on max settings. Before it crashes in Unigine I see flashes of, like, pink.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Mine just ends the benchmark if its not stable. Tried +1000 memory in ungine heaven and it crashes around 20/28 when benching on max settings. Before it crashes in ungine i see flashes of like pink.


That's normal. The memory is causing errors, and you're seeing artifacts on the screen.

For whatever reason, 3D Mark and the NVIDIA drivers are acting....strangely....all the way back to version 353.62 that I was using as my "fall back" driver for the Maxwell architecture.


----------



## juniordnz

Could anyone with a *4.7GHz 6700K report the Physics score in the Fire Strike bench* so I can compare it to my 4790K?

I know a 4.9GHz 6700K is about 16% faster than my 5GHz 4790K. But according to Silicon Lottery only the top 3% of 6700Ks can hit 4.9GHz, 19% can hit 4.8GHz and 61% can hit 4.7. So I'd like to compare to 4.7s.


----------



## nrpeyton

Something really weird is happening.

Was benching away last night doing Fire Strike Ultra (custom, Graphics 1 & 2 only). Never touched any other settings except switching off the Physics & Combined tests.

Was hitting 7900-8100 all night, then after one more test suddenly I couldn't get over 5600.

Even tried cancelling all OCs and running everything at default except 100% fan.

Same today; can't get over 5700?!

This ever happened to anyone else?

Nick Peyton


----------



## bloot

Quote:


> Originally Posted by *nrpeyton*
> 
> Something really weird is happening.
> 
> Was benching away last night doing FireStrike Ultra. (Custom Graphics 1 & 2). Never touched any other settings except switching off Physics & Combined tests.
> 
> Was hitting 7900 --> 8100 all night then after another test suddenly I can't get over 5600.
> 
> Even tried cancelling all O/C's running everything at default except 100% fan.
> 
> Same today, can't get over 5700?!
> 
> This ever happened to anyone else?
> 
> Nick Peyton


Are you using 375.86 drivers? They are problematic, revert back in that case.


----------



## nrpeyton

Quote:


> Originally Posted by *bloot*
> 
> Are you using 375.86 drivers? They are problematic, revert back in that case.


Geforce Experience:
Currently Installed: 375.70

Available: 375.86

So not the problem :-(

*How is this possible????*
http://www.3dmark.com/3dm/16096881

http://www.3dmark.com/3dm/16082095

Can anyone spot the difference because I *can't*, both look exactly the same; except score.

.


----------



## Yomny

Anyone care to provide some insight? Is it needed to run the ATX4P connector when running SLI? I can't really plug it in on my board.


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> Anyone care to provide some insight? Is it needed to run the ATX4P connector when running SLI? I can't really plug it in on my board.


Would highly recommend it; the cards will draw up to 150W from the motherboard (75W each).

Why can't you plug it in? Radiator clearance, or cable length?
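For context on the 150W figure above: the PCIe spec allows roughly 75W per x16 slot, so two cards can legitimately pull 150W through the board, which is what the supplemental plug is there to relieve. A back-of-the-envelope sketch (the 75W constant is the spec's per-slot ceiling, not a measurement):

```python
# Back-of-the-envelope slot power budget for SLI. The 75W constant is the
# PCIe spec's per-slot ceiling for an x16 slot, not a measurement.

PCIE_SLOT_LIMIT_W = 75

def slot_draw_w(num_cards: int) -> int:
    """Worst-case power the cards can pull through the motherboard slots."""
    return num_cards * PCIE_SLOT_LIMIT_W

print(slot_draw_w(2))  # 150 -- why boards offer a supplemental PCIe power plug
```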


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Geforce Experience:
> Currently Installed: 375.70
> 
> Available: 375.86
> 
> So not the problem :-(
> 
> *How is this possible????*
> http://www.3dmark.com/3dm/16096881
> 
> http://www.3dmark.com/3dm/16082095
> 
> Can anyone spot the difference because I *can't*, both look exactly the same; except score.
> 
> .


Graphics test 2 flopped....hard. You can see the difference in the frame rates in the comparison link.

http://www.3dmark.com/compare/fs/10788304/fs/10795903


----------



## Koniakki

Quote:


> Originally Posted by *nrpeyton*
> 
> Geforce Experience:
> Currently Installed: 375.70
> 
> Available: 375.86
> 
> So not the problem :-(
> 
> *How is this possible????*
> http://www.3dmark.com/3dm/16096881
> 
> http://www.3dmark.com/3dm/16082095
> 
> Can anyone spot the difference because I *can't*, both look exactly the same; except score.
> 
> .


If you don't manage to resolve it soon: revert any recent OS/software changes, check startup/running programs, remove the drivers with DDU, and revert the motherboard BIOS settings to default/stock (save a BIOS settings profile if possible).

Install a previous working driver version other than 375.70 and try again at stock clocks, and we'll take it from there.

BUT before all that, test extensively, because sometimes we can miss the solution to even the most basic problem.


----------



## nrpeyton

Quote:


> Originally Posted by *Koniakki*
> 
> If you don't manage to resolve it soon, revert back any OS/Software changes, check startup/running programs, remove drivers with DDU, revert Motherboard bios settings to default/stock(save bios settings profile if possible).
> 
> Install another previous working driver version than 375.70 and try again at stock clocks. And we will take it from there.
> 
> BUT before all that, test extensively because sometimes we might miss the solution to even the most basic issue/problem.


Yes thank you mate; the problem is actually getting me depressed.

I am trying a driver change just now with stock motherboard BIOS. Back soon 

Nick

Quote:


> Originally Posted by *Vellinious*
> 
> Graphics test 2 flopped....hard. You can see the difference in the frame rates in the comparison link.
> 
> http://www.3dmark.com/compare/fs/10788304/fs/10795903


Yup I noticed that too mate; I can "feel" it during the bench too. *Lost 5 FPS on Test 1, but losing 13 FPS on test 2*..

*13 FPS I mean ***...!?! ****


----------



## Dragonsyph

That sucks, nrpeyton; I hope it's not hardware and it's an easy fix. GL.

On a side note, just loaded up GTA 5 with max settings and x2 scaling, and the game still looks like garbage. That washed-out look on anything 10+ feet away really gets to me. It's like everything is foggy, or has a grey tint instead of full color.

Even tried using Redux; the game still looks like trash in the distance.


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> Would highly recommend it, the cards will draw up to 150w from the motherboard. (75w each).
> 
> Why can't you plug it in. Radiator clearance or length of cable?


I have a Corsair 400C; it being a mid-size chassis, the motherboard ends right on top of the PSU, and coincidentally that's where the ATX4P connector is. Not even with 90-degree adapters could I plug in the molex.

I think I'll stick to one 1080 for now; just added the hybrid cooler and it's working just right.


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> I have a corsair 400c, this being a midsize chassis, the motherboard ends right on top of the PSU and coincidently this is where the ATX4P connector is, not even using 90 degree adapters I could plug in the molex.
> 
> I think I'll stick to one 1080 for now, just added the hybrid cooler and its working just right.


You tried turning the PSU around?

If that doesn't work I'd leave it "loose" inside the case if you have to for now. Not having that cable in could also affect your CPU overclock too.

================

Right, motherboard is at stock and I've changed the nvidia driver to the latest (it wasn't on the latest when the problem surfaced yesterday).

Also switched PC off completely for 5 minutes to cool down.

Tried GPU at stock settings as well.

Also increased/decreased core/memory to see if I still get gains/losses from those sliders, and I do... but the score is still in the 5000s now instead of the 8000s.

*Is it possible the EVGA Thermal Pad issue has left my VRM/memory overheating and caused damage to my card?*

What are you guys getting on FS ULTRA Graphics 1&2 ?


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> You tried turning the PSU around?
> 
> If that doesn't work I'd leave it "loose" inside the case if you have to for now. Not having that cable in could also affect your CPU overclock too.
> 
> ================
> 
> Right motherboard is at stock, I've changed nvidia driver to latest (it wasn't on latest when problem surfaced yesterday).
> 
> Also switched PC off completely for 5 minutes to cool down.
> 
> Tried GPU at stock settings as well.
> 
> Also increased/decreased core/memory to see if I still get gains/losses from these sliders and I do.. but score is still in the 5000's now instead of in the 8000's.
> 
> *Is it possible the EVGA Thermal Pad issue has left my VRM/memory overheating and caused damage to my card?*
> 
> What are you guys getting on FS ULTRA Graphics 1&2 ?


Just to get software out of the way, install a fresh OS on another hard drive with just drivers and OC software, and run the benchmark again. If it stays the same then it might be hardware, but I think it's software! But stay away from the latest nvidia drivers!









Cheers

Occamrazor


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> You tried turning the PSU around?
> 
> If that doesn't work I'd leave it "loose" inside the case if you have to for now. Not having that cable in could also affect your CPU overclock too.


Haven't tried that, but I guess if it needs to be, I'll get another case. My current case is pretty stuffed: the PSU sits in a placeholder with tabs holding it in place and a vent at the bottom of the case for its fan, and right in front of the PSU I also have the 280mm radiator for the CPU, lol.

Appreciate your help. For sure though, I won't do OC or SLI until I get that plugged in.


----------



## Krzych04650

Still didn't send the Palit JetStream back; I have 2 weeks for that from declaring the return, so I used the opportunity to test SLI, pairing it with my MSI. I was planning to get a second card in early 2017, so this was especially interesting for me, and it was the first time I've ever tried SLI.

In some games like Witcher 3 the results are just okay, boosting from 50 FPS to 80, but in games like Assassin's Creed Black Flag or Rise of the Tomb Raider the scaling approaches perfect, or even seems to exceed it. In Black Flag it went from 37 FPS to a stable 60, and only because the game is limited to 61 FPS; in Rise of the Tomb Raider I actually saw FPS boost from 47 to 99 in one place. Crazy stuff. Definitely getting a second card in 1-2 months. Such power... even if only in AAA games. I don't need SLI for all games, just for the most demanding ones where it's needed; I never play day one, so no worries there. I cannot see anything coming out soon that could be a sensible alternative to 1080 SLI, so it's probably decided that I'm getting a second card in early 2017, or I'll just make myself a present for Christmas
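The FPS pairs above translate into SLI scaling factors like so; a quick sketch using the numbers from this post (the Black Flag result is frame-capped at 61, so its real scaling is understated):

```python
# SLI scaling from the single- vs dual-card FPS pairs quoted above;
# 2.0x would be perfect scaling.

def scaling(single_fps: float, dual_fps: float) -> float:
    return dual_fps / single_fps

pairs = {
    "Witcher 3": (50, 80),
    "AC Black Flag (61 FPS cap)": (37, 60),
    "Rise of the Tomb Raider": (47, 99),
}
for game, (single, dual) in pairs.items():
    print(f"{game}: {scaling(single, dual):.2f}x")
# Witcher 3 ~1.60x; Black Flag ~1.62x (cap-limited); Tomb Raider ~2.11x
```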









An i5 is not an issue for 60 FPS gaming, but the bottleneck is very significant if you want to push high framerates, obviously. Very, very significant. Not sure if there is any CPU that can keep up with these cards at 3440x1440 in CPU-bound areas.

Typical power draw is 450-550 W from the wall, below 500 for the majority of the time.

Temperatures of the top card, the MSI one in my case, are increased by 13 C compared to having a single card, and this is with side and bottom intake, so a rather optimistic scenario. 100% GPU fan speed, 100% case fan speed and a 10 C ambient allowed around 52 C under load, so there is absolutely no way to cool this kind of setup in a normal usage scenario, where ambient temperatures are 20-25 and fan speeds need to be two times slower, if not more, to be acceptable in terms of noise. For normal usage, keeping your PC next to you or even further away but in the same room, don't touch SLI without watercooling. Seriously, things you hear in reviews, like the MSI 1080 SLI review on OC3D saying those two cards were quiet working together, are just complete BS. The noise of the top card would wake the dead if you ran it in a graveyard.
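For anyone following along, the scaling Krzych describes earlier in this post is easy to sanity-check; a minimal sketch using only the FPS figures he quotes (nothing else measured):

```python
# Two-card SLI scaling check: multi-GPU FPS divided by single-GPU FPS.
# 2.00x would be "perfect" scaling for two cards; Black Flag is capped
# near 60 FPS by the game, so its real scaling is understated.
results = {
    "Witcher 3": (50, 80),
    "AC Black Flag": (37, 60),
    "Rise of the Tomb Raider": (47, 99),
}

for game, (single, sli) in results.items():
    print(f"{game}: {sli / single:.2f}x")
# Witcher 3: 1.60x, Black Flag: 1.62x (FPS-capped), Tomb Raider: 2.11x
```

The Tomb Raider spot reading coming out above 2.00x is why "exceeding perfect scaling" gets said: a single-card FPS sample taken in a harsher moment of the scene inflates the apparent ratio.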


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> You tried turning the PSU around?
> 
> If that doesn't work I'd leave it "loose" inside the case if you have to for now. Not having that cable in could also affect your CPU overclock too.
> 
> ================
> 
> Right motherboard is at stock, I've changed nvidia driver to latest (it wasn't on latest when problem surfaced yesterday).
> 
> Also switched PC off completely for 5 minutes to cool down.
> 
> Tried GPU at stock settings as well.
> 
> Also increased/decreased core/memory to see if I still get gains/losses from these sliders and I do.. but score is still in the 5000's now instead of in the 8000's.
> 
> *Is it possible the EVGA Thermal Pad issue has left my VRM/memory overheating and caused damage to my card?*
> 
> What are you guys getting on FS ULTRA Graphics 1&2 ?


I don't believe there were any problems with the Classys. At least, none that were reported.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I don't believe there were any problems with the Classys. At least, none that were reported.


All ACX 3.0 cards were missing the thermal pads, Classifieds included. We also got a new BIOS unlocking additional fan speed. I am waiting on my thermal pads too (they are currently on "awaiting shipment").

My product number is also one of the ones listed.

All the reviews on the issue were simply done on FTWs, as that is the most popular card.


----------



## 66racer

I don't totally buy that it was just missing thermal pads. Everything else looks the same as an ACX 2.0 card, except those backplates had more holes, but the 1000 series had the larger vents on one side too. A buddy had a 1070 die while watching a movie too... I mean, whatever, I love my 1080 FTW and all, but I almost feel like something else was going on. Maybe a bad batch of parts or something.


----------



## juniordnz

Quote:


> Originally Posted by *66racer*
> 
> I don't totally buy that it was just missing thermal pads. Everything else looks the same as an ACX 2.0 card, except those backplates had more holes, but the 1000 series had the larger vents on one side too. A buddy had a 1070 die while watching a movie too... I mean, whatever, I love my 1080 FTW and all, but I almost feel like something else was going on. Maybe a bad batch of parts or something.


EVGA stated that a few cards sold before September had a faulty VRM module that was causing some cards to die randomly (black screen), but that was fixed as of September. If I remember the numbers right, it was 3% of all cards sold before September that had that problem.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> EVGA stated that a few cards sold before September had a faulty VRM module that was causing some cards to die randomly (black screen), but that was fixed as of September. If I remember the numbers right, it was 3% of all cards sold before September that had that problem.


Yep, though it was 4% of the affected component (not chips exactly, but whatever part was faulty, you get the idea), so it wasn't 4% of all cards.
Mine was part of that pre-Sept 1st batch.
It black screened and the fans went to 100% twice, so I RMAed it.

Just beat my previous score with the new card this morning.

+125-2139 / +700-5700
TDP 130% 1.093v

http://www.3dmark.com/3dm/16108329

Core temps went to 62C. That is with the thermal fix applied and new bios.
I am still waiting for my SekiSui tape to arrive on the slow boat from China. Then I can get it underwater.

ETA: Not sure why this new one will do +700 on the mem clock, as my first would artifact at +600. I had this one up to +800 last night. Things were getting hot at that point though, so I called it a night. The score was lower than this morning's run too.


----------



## Valtava

Quote:


> Originally Posted by *66racer*
> 
> I don't totally buy that it was just missing thermal pads. Everything else looks the same as an ACX 2.0 card, except those backplates had more holes, but the 1000 series had the larger vents on one side too. A buddy had a 1070 die while watching a movie too... I mean, whatever, I love my 1080 FTW and all, but I almost feel like something else was going on. Maybe a bad batch of parts or something.


I give it a year and then there will be a flood of broken EVGAs.


----------



## juniordnz

Quote:


> Originally Posted by *Valtava*
> 
> I give it a year and then there will be a flood of broken EVGAs.


Thanks, prophet.

Good thing for us, we bought from one of the brands with the best customer service on the market.


----------



## Koniakki

Quote:


> Originally Posted by *nrpeyton*
> 
> Right motherboard is at stock, I've changed *nvidia driver to latest* (it wasn't on latest when problem surfaced yesterday).
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> ...Also switched PC off completely for 5 minutes to cool down.
> 
> Tried GPU at stock settings as well.
> 
> Also increased/decreased core/memory to see if I still get gains/losses from these sliders and I do.. but score is still in the 5000's now instead of in the 8000's.
> 
> *Is it possible the EVGA Thermal Pad issue has left my VRM/memory overheating and caused damage to my card?*
> 
> What are you guys getting on FS ULTRA Graphics 1&2 ?


It's a long shot and surely you would have noticed it, but you never know..

Warning: NVIDIA GeForce 375.86 WHQL Drivers Have Issues


----------



## nrpeyton

Quote:


> Originally Posted by *Krzych04650*
> 
> Still didn't send the Palit JetStream back; I have 2 weeks for that from declaring the return, so I used the opportunity to test SLI, pairing it with my MSI. I was planning to get a second card early 2017, so it was especially interesting for me, and this was the first SLI try I ever had.
> 
> In some games like Witcher 3 results are just okay, boosting from 50 FPS to 80, but in some games like Assassin's Creed Black Flag or Rise of the Tomb Raider scaling is approaching perfect level, or even exceeding it. In Black Flag it went from 37 FPS to a stable 60, and only because the game is limited to 61 FPS; in Rise of the Tomb Raider I actually saw FPS boosting from 47 to 99 in one place. Crazy stuff. Definitely getting a second card in 1-2 months. Such power... even if only in AAA games. I don't need SLI for all games, just for the most demanding ones where it is needed, and I never play day one, so no worries here. I cannot see anything coming out soon that could be any sensible alternative to 1080 SLI, so it is probably decided that I am getting a second card in early 2017, or I will just make myself a present for Christmas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An i5 is not an issue for 60 FPS gaming, but the bottleneck is very significant if you want to push high framerates, obviously. Very, very significant. Not sure if there is any CPU that can keep up with these cards at 3440x1440 in CPU-bound areas.
> 
> Typical power draw is 450-550 W from the wall, below 500 for the majority of the time.
> 
> Temperatures of the top card, the MSI one in my case, are increased by 13 C compared to having a single card, and this is with side and bottom intake, so a rather optimistic scenario. 100% GPU fan speed, 100% case fan speed and a 10 C ambient allowed around 52 C under load, so there is absolutely no way to cool this kind of setup in a normal usage scenario, where ambient temperatures are 20-25 and fan speeds need to be two times slower, if not more, to be acceptable in terms of noise. For normal usage, keeping your PC next to you or even further away but in the same room, don't touch SLI without watercooling. Seriously, things you hear in reviews, like the MSI 1080 SLI review on OC3D saying those two cards were quiet working together, are just complete BS. The noise of the top card would wake the dead if you ran it in a graveyard.


If you are seriously going to use two 1080s in SLI for gaming (because you obviously feel strongly about having *every possible graphics setting on ULTRA at 4K* lol), you could simply leave the cards at stock. At stock, or downclocked by -100, you could keep noise down. You are on SLI anyway, so you probably wouldn't see much difference in FPS, and you will still have a setup more powerful than 95% of all top-end users.

Another alternative is to simply wait for the 1080 Ti and sell your 1080. That way you don't need to worry about SLI compatibility/scaling. That's my plan anyway. It might not be QUITE as powerful as twin 1080s, but it should still be enough for a fully comprehensive 4K experience on 99.9% of titles 

Thanks for the post anyway; good information regarding SLI for us to read :--)

Edit: _(For everyone)_
The more I think about it, the more that makes sense actually: OVERCLOCK THE HELL out of your memory (whichever card's memory overclocks the *highest*, use that as the *PRIMARY card*) --

remember, *MEMORY is only actually utilised on ONE of the cards in an SLI setup*.

Then downclock the cores for quieter running ;-)
_(You could probably do -150, or even use the voltage curve to lock a lower voltage (under-volt).
My card will actually do 2000MHz at 912mV)._

Nick Peyton


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> If you are seriously going to use two 1080s in SLI for gaming (because you obviously feel strongly about having *every possible graphics setting on ULTRA at 4K* lol), you could simply leave the cards at stock. At stock, or downclocked by -100, you could keep noise down. You are on SLI anyway, so you probably wouldn't see much difference in FPS, and you will still have a setup more powerful than 95% of all top-end users.
> 
> Another alternative is to simply wait for the 1080 Ti and sell your 1080. That way you don't need to worry about SLI compatibility/scaling. That's my plan anyway. It might not be QUITE as powerful as twin 1080s, but it should still be enough for a fully comprehensive 4K experience on 99.9% of titles
> 
> Thanks for the post anyway; good information regarding SLI for us to read :--)
> 
> Nick Peyton


Just had a look over at the Valley thread and saw that there are no results at 4K for a 1080 SLI. (In the top 50 anyway)

If you see this, Krzych, would you mind taking a run or two at that? For comparison's sake.

I did 58FPS last night on a single. (Didn't save it though) My 45 is still on the list though.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Just had a look over at the Valley thread and saw that there are no results at 4K for a 1080 SLI. (In the top 50 anyway)
> 
> If you see this, Krzych, would you mind taking a run or two at that? For comparison's sake.
> 
> I did 58FPS last night on a single. (Didn't save it though) My 45 is still on the list though.


I haven't tried Valley in 4K... might be interesting to see if it's still CPU-bound. I'd almost bet it is, though. I haven't run Valley in ages, for that reason.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Yep, though it was 4% of the affected component (not chips exactly, but whatever part was faulty, you get the idea), so it wasn't 4% of all cards.
> Mine was part of that pre Sept 1st batch.
> It black screened and fans went to 100% twice so I RMAed it.
> 
> Just beat my previous score with the new card this morning.
> 
> +125-2139 / +700-5700
> TDP 130% 1.093v
> 
> http://www.3dmark.com/3dm/16108329
> 
> Core temps went to 62C. That is with the thermal fix applied and new bios.
> I am still waiting for my SekiSui tape to arrive on the slow boat from China. Then I can get it underwater.
> 
> ETA: Not sure why this new one will do +700 on the mem clock, as my first would artifact at +600. I had this one up to +800 last night. Things were getting hot at that point though, so I called it a night. The score was lower than this morning's run too.


Did the thermal fix get you even a marginally higher stable OC? Did you do a "before" and "after" on core temp? My pads are still on "awaiting delivery".

==================

Other issue:

Is it just me... or am I not the only one who gets really ******* annoyed that 3DMark runs don't actually show as "basic", "extreme" or "ultra"? Or even as "1080", "1440" and "2160"?? You've got to *work it out* yourself by seeing roughly how high the score is. But if you're troubleshooting and your scores are wayyyy off like mine, it's a pain in the ******* arse. How hard would it be to just put the title of which run it actually was at the top of the ******* page. ffs


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Yep, though it was 4% of the affected component (not chips exactly, but whatever part was faulty, you get the idea), so it wasn't 4% of all cards.
> Mine was part of that pre Sept 1st batch.
> It black screened and fans went to 100% twice so I RMAed it.
> 
> Just beat my previous score with the new card this morning.
> 
> +125-2139 / +700-5700
> TDP 130% 1.093v
> 
> http://www.3dmark.com/3dm/16108329
> 
> Core temps went to 62C. That is with the thermal fix applied and new bios.
> I am still waiting for my SekiSui tape to arrive on the slow boat from China. Then I can get it underwater.
> 
> ETA: Not sure why this new one will do +700 on the mem clock, as my first would artifact at +600. I had this one up to +800 last night. Things were getting hot at that point though, so I called it a night. The score was lower than this morning's run too.


Nice, 8)


----------



## Dragonsyph

This any good?



Seems pretty low on the CPU side.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Did the thermal fix get you even a marginally higher stable OC? Did you do a "before" and "after" on core temp? My pads are still on "awaiting delivery".


On my original card, which boosted to 1987 out of the box, I could OC the core to +145/2164MHz. I could OC the mem clock to +550; if I went to +600 it would artifact. The best performance I got from that one was at +135/+400, or 2130/5400. I got an FS score of just over 17K and a graphics score of 24400. All my Heaven and Valley scores were done on that card. Core temps never went above 70C.

The new card boosted out of the box at 2012. I have been playing around (not really recording or saving anything yet because I am waiting to convert to Hybrid) and found that:

1. If I OC the Core to 135/2151 with no Mem clock offset, Temps stay around <55C. As soon as I OC Mem to even 400 temps go above 60C.
2. Core won't OC above 135/2151 with a minor Mem clock offset of 300 on the 375.63 driver, however as I showed I can get 125/2139 and a +700 Mem clock to run on the 372.70 drivers from September. Temps again go up to 66C.

I am still tweaking and making mental notes. The temp issue, as I see it, stems from heat bleeding from the VRM section and VRAM down the PCB and heating up the core. I am not sure whether this is occurring because the thermal pad contact with the back plate is transferring heat to the pads surrounding the GPU, however.

I am waiting to see the Gamers Nexus analysis due sometime over the weekend or early next week. I would really like to see a comparative analysis between the OnSemi components and those used by other board makers because it seems to me that the components used by EVGA are running significantly hotter than Asus or MSi etc. But that is merely a hypothesis at this point. Maybe it is just a poorly designed cooling solution, but until that comparison is done no valid statements can be made.

So, to answer your question, the fix provides somewhat better cooling, but no significant gains in OC on the core, and mem OC is dependent on either a new batch of chips or the drivers. I am not sure whether this silicon will respond to being under water, but I will keep you posted.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Did the thermal fix get you even a marginally higher stable OC? Did you do a "before" and "after" on core temp? My pads are still on "awaiting delivery".
> 
> ==================
> 
> Other issue:
> 
> Is it just me... or am I not the only one who gets really ******* annoyed that 3DMark runs don't actually show as "basic", "extreme" or "ultra"? Or even as "1080", "1440" and "2160"?? You've got to *work it out* yourself by seeing roughly how high the score is. But if you're troubleshooting and your scores are wayyyy off like mine, it's a pain in the ******* arse. How hard would it be to just put the title of which run it actually was at the top of the ******* page. ffs


You mean like this?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> You mean like this?


Maybe I need to re-install 3dmark.

Also, when I am doing custom runs (for GFX only), when I switch off the physics and combined tests (with the sliders) they sometimes automatically switch themselves back on. (Has anyone else experienced that?)


----------



## Koniakki

Quote:


> Originally Posted by *Dragonsyph*
> 
> This any good?
> 
> 
> 
> 
> Seems pretty low from the cpu.


Core/Mem OC?

For comparative reasons, below is the highest score I have saved. I don't remember the specific core clocks though, in case Valley isn't showing clocks accurately.

I would assume between 2114-2139MHz.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> On my original card, which boosted to 1987 out of the box, I could OC the core to +145/2164MHz. I could OC the mem clock to +550; if I went to +600 it would artifact. The best performance I got from that one was at +135/+400, or 2130/5400. I got an FS score of just over 17K and a graphics score of 24400. All my Heaven and Valley scores were done on that card. Core temps never went above 70C.
> 
> The new card boosted out of the box at 2012. I have been playing around (not really recording or saving anything yet because I am waiting to convert to Hybrid) and found that:
> 
> 1. If I OC the Core to 135/2151 with no Mem clock offset, Temps stay around <55C. As soon as I OC Mem to even 400 temps go above 60C.
> 2. Core won't OC above 135/2151 with a minor Mem clock offset of 300 on the 375.63 driver, however as I showed I can get 125/2139 and a +700 Mem clock to run on the 372.70 drivers from September. Temps again go up to 66C.
> 
> I am still tweaking and making mental notes. The temp issue, as I see it, stems from heat bleeding from the VRM section and VRAM down the PCB and heating up the core. I am not sure whether this is occurring because the thermal pad contact with the back plate is transferring heat to the pads surrounding the GPU, however.
> 
> I am waiting to see the Gamers Nexus analysis due sometime over the weekend or early next week. I would really like to see a comparative analysis between the OnSemi components and those used by other board makers because it seems to me that the components used by EVGA are running significantly hotter than Asus or MSi etc. But that is merely a hypothesis at this point. Maybe it is just a poorly designed cooling solution, but until that comparison is done no valid statements can be made.
> 
> So, to answer your question, the fix provides somewhat better cooling, but no significant gains in OC on the core, and mem OC is dependent on either a new batch of chips or the drivers. I am not sure whether this silicon will respond to being under water, but I will keep you posted.


That was a really good post mate; quite informative.

I never really tested temps *before* and *after* a memory OC on a GPU before.

/\ One thing I have noticed on my CPU: on a Prime95 run it heats up *a lot more quickly* with a higher RAM overclock. I always just assumed this was due to an unimpeded flow of data (i.e. less wait time before getting more data from RAM), resulting in less "idle time" at load and more strain on the CPU.

Definitely going to test this for myself tonight. One thing I have noticed that may be useful to others (from mental notes): at higher temps my GPU mem OC seems to max out at +600 (before scores decline), but at lower temps scores don't drop until +625.

I've also noticed that with a traditional +150 (core) it never crashes under 50C but will often crash at temps above 60C. If your experiences are similar, I'd definitely be willing to bet we will both be lucky with stability improvements under water. 

Other than that, how different is the memory on our GPUs compared to normal RAM? Any RAM I've ever had has never gotten hot at all as long as it's got a decent heatsink on it. Even with my RAM under *EXTREME* strain I've never seen it get far above 40C, and never *ever* has it been hot to the touch.

=======================================

*Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
Quote:


> Originally Posted by *OccamRazor*
> 
> Just to get software out of the way, install a fresh OS on another hard drive with just drivers and OC software, then run the benchmark again. If the score stays the same it might be the hardware, but I think it's software! But stay away from the latest NVIDIA drivers! Occamrazor


Think I might try a fresh O/S install as OccamRazor suggested. 
argh what a pain in the arse though lol....

=======================================

*Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_

Any chance *one of you could do me a favour* please? Run a **stock** FS ULTRA (GFX 1 & 2 only) please? (Should only take about 60s -- switch DEMO off if you want)?

Thanks so much 

Nick Peyton


----------



## Dragonsyph

Quote:


> Originally Posted by *Koniakki*
> 
> Core/Mem OC?
> 
> For comparative reasons, below is the highest score I have saved. I don't remember the specific core clocks though, in case Valley isn't showing clocks accurately.
> 
> I would assume between 2114-2139MHz.


4.8GHz, 1866 RAM; my min FPS seems way lower.


----------



## Vellinious

Quote:


> Originally Posted by *Koniakki*
> 
> Core/Mem OC?
> 
> For comparative reasons, below is the highest score I have saved. I don't remember the specific core clocks though, in case Valley isn't showing clocks accurately.
> 
> I would assume between 2114-2139MHz.


CPU core clock will play a large role on the scores in Valley, as well.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> That was a really good post mate; quite informative.
> 
> I never really tested temps *before* and *after* a memory OC on a GPU before.
> 
> /\ One thing I have noticed on my CPU: on a Prime95 run it heats up *a lot more quickly* with a higher RAM overclock. I always just assumed this was due to an unimpeded flow of data (i.e. less wait time before getting more data from RAM), resulting in less "idle time" at load and more strain on the CPU.
> 
> Definitely going to test this for myself tonight. One thing I have noticed that may be useful to others (from mental notes): at higher temps my GPU mem OC seems to max out at +600 (before scores decline), but at lower temps scores don't drop until +625.
> 
> I've also noticed that with a traditional +150 (core) it never crashes under 50C but will often crash at temps above 60C. If your experiences are similar, I'd definitely be willing to bet we will both be lucky with stability improvements under water.
> 
> Other than that, how different is the memory on our GPUs compared to normal RAM? Any RAM I've ever had has never gotten hot at all as long as it's got a decent heatsink on it. Even with my RAM under *EXTREME* strain I've never seen it get far above 40C, and never *ever* has it been hot to the touch.
> 
> =======================================
> 
> *Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
> Think I might try a fresh O/S install as OccamRazor suggested.
> argh what a pain in the arse though lol....
> 
> =======================================
> 
> *Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
> 
> Any chance *one of you could do me a favour* please? Run a **stock** FS ULTRA (GFX 1 & 2 only) please? (Should only take about 60s -- switch DEMO off if you want)?
> 
> Thanks so much
> 
> Nick Peyton


Intel processors contain the memory controller internally. Overclocking memory, and putting more stress on the IMC, with a greater system agent voltage to help maintain its stability, will certainly cause higher temps.


----------



## bloot

I think vram bandwidth is the key, this is my [email protected],7 and my [email protected]/12000
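bloot's bandwidth point is easy to put numbers on. A rough sketch, assuming the 1080's 256-bit bus and treating the overlay's memory clock as the effective data rate in MT/s (10000 stock and 12000 overclocked are illustrative figures, not readings from the garbled clocks above):

```python
# Approximate GDDR5X bandwidth: effective rate (MT/s) x bus width (bits) / 8,
# converted to GB/s. GTX 1080 bus width assumed to be 256-bit.
def bandwidth_gb_s(effective_mts: float, bus_bits: int = 256) -> float:
    return effective_mts * bus_bits / 8 / 1000

print(bandwidth_gb_s(10000))  # stock 10 Gbps -> 320.0 GB/s
print(bandwidth_gb_s(12000))  # overclocked   -> 384.0 GB/s
```

That is a ~20% jump in raw bandwidth from the memory OC alone, which fits the thread's observation that Valley responds strongly to memory offsets.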


----------



## Koniakki

Quote:


> Originally Posted by *bloot*
> 
> I think vram bandwidth is the key, this is my [email protected],7 and my [email protected]/12000


It's known among the Valley "benchers" that Valley just plain loves/benefits from memory OC.

That is the reason I replied to *Dragonsyph*, as I assumed he had a very high mem OC with his extremely mem-OC-capable monstrous card!


----------



## bloot

Quote:


> Originally Posted by *Koniakki*
> 
> It's known among the Valley "benchers" that Valley just plain loves/benefits from memory OC.
> 
> That is the reason I replied to *Dragonsyph*, as I assumed he had a very high mem OC with his extremely mem-OC-capable monstrous card!


Agree, he should try pushing that memory higher


----------



## Dragonsyph

For some reason I can't get over a 5200 score in Valley lol, my lowest frames are always in the 30s so idk what's wrong.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> For some reason I can't get over a 5200 score in Valley lol, my lowest frames are always in the 30s so idk what's wrong.


Overclock your CPU higher. If it won't go higher because it's heating up too much, disable hyperthreading and disable all but 2 cores. Now you'll have more headroom with which to overclock those remaining two cores and get a little more clock out of them. That'll help with your Valley score.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> Overclock your CPU higher. If it won't go higher because it's heating up too much, disable hyperthreading and disable all but 2 cores. Now you'll have more headroom with which to overclock those remaining two cores and get a little more clock out of them. That'll help with your Valley scores


Yeah, it's on an H100i with 1+ year old thermal paste, and it was hitting 80C+ at 5GHz, so that's why I'm doing 4.8, but even at 4.8 it's hitting the 70s. And my H100i sloshes all the time, like it's half full of air now from the fluid evaporating.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Yeah, it's on an H100i with 1+ year old thermal paste, and it was hitting 80C+ at 5GHz, so that's why I'm doing 4.8, but even at 4.8 it's hitting the 70s. And my H100i sloshes all the time, like it's half full of air now from the fluid evaporating.


It's a closed system....evaporation shouldn't be an issue.


----------



## nrpeyton

I'll download Valley and do a run on my AMD FX-8350. It should give a good comparison of how the CPU affects performance (since my CPU is at least 40% slower in single-core than any of the good, modern Intel chips you all have).

Seriously, my CPU is starting to make me feel like the "poor guy" lol
Quote:


> Originally Posted by *Dragonsyph*
> 
> Yeah, it's on an H100i with 1+ year old thermal paste, and it was hitting 80C+ at 5GHz, so that's why I'm doing 4.8, but even at 4.8 it's hitting the 70s. And my H100i sloshes all the time, like it's half full of air now from the fluid evaporating.


Maybe a pump issue.

They should fix that free of charge. They would probably also send you a new one before you post yours away.


----------



## Dragonsyph

Yeah, I ran a test at 5GHz with 2133 RAM, 2177 core / 12012 memory, and it was only 5283, with a minimum of 44. The min went up a tad but still not 5500 like you guys lol.

F Valley lol.....

Edit: FS +150 +1000 only got me 25,910, so I lost 400 points lol ***>>>


----------



## Koniakki

Quote:


> Originally Posted by *nrpeyton*
> 
> I'll download Valley and do a run on my AMD FX-8350. It should give a good comparison of how the CPU affects performance (since my CPU is at least 40% slower in single-core than any of the good, modern Intel chips you all have).
> 
> Seriously, my CPU is starting to make me feel like the "*poor guy*" lol
> 
> ....


Don't say that.







That's a perfectly fine and capable CPU, especially at 4.8GHz if I remember your clocks correctly.

And really interested to see the Valley score with the [email protected]'s.









Quote:


> Originally Posted by *Dragonsyph*
> 
> Yeah, I ran a test at 5GHz with 2133 RAM, 2177 core / 12012 memory, and it was only 5283, with a minimum of 44. The min went up a tad but still not 5500 like you guys lol.
> 
> F Valley lol.....
> 
> Edit: FS +150 +1000 only got me 25,910, so I lost 400 points lol ***>>>


I don't wanna turn this into a Valley benching thread







but you can set the NVCP image quality setting to "Performance" if you wish and don't mind reapplying all the settings in case you changed them (warning: it will change almost all NVCP 3D settings), and also run the Windows High Performance power preset.

Also, I had done a few GTX 1080 FE runs at 2126-2138/10900-11100MHz and the score was between 124.2-126.8 (5195-5307 points).

Just for comparison reasons.


----------



## Vellinious

This is the last run I did... it was on air cooling, at 2153 / 11000 IIRC. The CPU dependence is REALLY holding back the new GPUs in Valley.

Word is, Unigine is working on a new benchmark that will alleviate many of these issues, but.....who knows when it'll be released.

EDIT: Confirmed. https://unigine.com/en/products/benchmarks/superposition


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> This is the last run I did...was on air cooling, at 2153 / 11000 iirc. The CPU dependence is REALLY holding back the new GPUs in Valley.
> 
> Word is, Unigine is working on a new benchmark that will alleviate many of these issues, but.....who knows when it'll be released.
> 
> EDIT: Confirmed. https://unigine.com/en/products/benchmarks/superposition
> 
> SNIP


Thanks for posting this for comparison and the heads up on the link.

Just did like 10 tests.

Found out that with the core anywhere from +140 all the way to +160 it still boosts to the exact same clock, so I kept +140 instead of +150.

All these tests were done at +140 with FS graphics test 1.

Memory.

600 122.59

625 124.96

650 122.97

700 126.57

750 124.3

800 126.84

850 125.02

900 127.58

950 124.78

1000 128.59

Just did both graphics tests at +140 +1000. Boosted to 2164, 12012 memory.

My new all time HIGH score 8))))))))))))))))
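For anyone repeating that sweep, the offset/FPS pairs above can be dropped into a couple of lines to pick the winner (data copied straight from the post, nothing else assumed):

```python
# Memory offset -> FS graphics test 1 FPS, from the +140 core sweep above.
sweep = {
    600: 122.59, 625: 124.96, 650: 122.97, 700: 126.57, 750: 124.30,
    800: 126.84, 850: 125.02, 900: 127.58, 950: 124.78, 1000: 128.59,
}

best = max(sweep, key=sweep.get)  # offset with the highest measured FPS
print(best, sweep[best])  # 1000 128.59
```

The zig-zag at the intermediate steps (650, 750, 850, 950 all dip) is consistent with memory error detection/retry quietly costing performance before the card actually becomes unstable, which is why benching each step beats assuming "higher offset = faster".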


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> That was a really good post mate; quite informative.
> 
> I never really tested temps with *before* and *after* with MEMORY O/C's on GPU before.
> 
> /\ one thing I have noticed is on my CPU... on a prime95 run it heats up *a lot more quickly* with a higher RAM overclock. I always just assumed this due to a unimpeded flow of data (I.E. less wait time) before getting more data from RAM resulting in less "idle time" at load and more strain on CPU.
> 
> Definitely going to test this for myself tonight. One thing I have noticed that may be useful to others is (from mental notes).. at higher temps my GPU MEM O/C seems to max out at +600 (before declines) but at lower temps scores don't drop until +625
> 
> I've also noticed that with a traditional +150 (core) it never crashes under 50c but will often crash at temps above 60c. If your experiences are similar I'd definitely be willing to bet we will both be lucky with stability improvements under water.
> 
> Other than that, how different is the memory on our GPUs compared to normal RAM? Because any RAM I've ever had never gets hot at all as long as it's got a decent heatsink on it. Even with my RAM under *EXTREME* strain I've never seen it get far above 40c, and never *ever* has it been hot to the touch.
> 
> =======================================
> 
> *Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
> Think I might try a fresh O/S install as OccamRazor suggested.
> argh what a pain in the arse though lol....
> 
> =======================================
> 
> *Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
> 
> Any chance *one of you could do me a favour* please? Run a **stock** FS ULTRA (GFX 1 & 2 only) please? (Should only take about 60s -- switch DEMO off if you want)?
> 
> Thanks so much
> 
> Nick Peyton


OK, need to clarify things a bit.
I just did a bunch of runs because something was bothering me. I had the voltage slider all the way up to 100% on Px, so I wanted to isolate that as a possible factor in the high temps.
As it turns out, I believe that was the issue.
I went back to stock settings and did a run at 0% on the slider: temps spiked once or twice at 57C, and volts were at 1.05 with one or two spikes to 1.062.
Then I did one with the slider jacked to 100%, and sure enough (doh) temps spiked to 60C about 5 times, and volts of course were spiking throughout at 1.093.
Moved the core up to +125/2126 with the mem clock at 0 offset and the voltage back down to 0%, and got the same as the first run.
Added +500 to the mem clock, still 0% on the volts, and still got the same as the first run.
Moved the mem clock up to +700, still 0% on the volts, and the highest the temps ever got was 59C. Voltage was still 1.05-1.062.

http://www.3dmark.com/3dm/16116656

So it would seem, for this card anyway, that running the volts at 1.093 only gives me higher temps (66C) with no performance gain. (The score above is a little less than the one I got with 100% volts/1.093, http://www.3dmark.com/3dm/16108329, but I'm not convinced the voltage contributed to that score; it was normal variability.)

Not sure what the T4 would do for me if anything.

PS: I run a custom curve where the fans are at 80% at 40C and 100% at 60C on the new BIOS.
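
A two-point curve like that is easy to reason about. A minimal sketch of the ramp, assuming a simple linear interpolation between the two points mentioned (the function and its defaults are illustrative, not any particular tool's API):

```python
def fan_speed(temp_c, lo=(40, 80), hi=(60, 100)):
    """Linear fan curve: 80% at 40C ramping to 100% at 60C, clamped outside."""
    (t_lo, s_lo), (t_hi, s_hi) = lo, hi
    if temp_c <= t_lo:
        return float(s_lo)
    if temp_c >= t_hi:
        return float(s_hi)
    # Interpolate between the two set points.
    frac = (temp_c - t_lo) / (t_hi - t_lo)
    return s_lo + frac * (s_hi - s_lo)

print(fan_speed(50))  # halfway between the two points -> 90.0
```

The clamping matters: below 40C the fans hold 80% rather than spinning down, which is exactly the "keep it cold so the boost bins hold" behaviour being described here.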


----------



## Derek1

Geez, you guys and your 1080p.
I refuse to run that, I just won't!
I can't.
lol


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Geez, you guys and your 1080p.
> I refuse to run that, I just won't!
> I can't.
> lol


LOL, my 4K monitor took a dump and I'm stuck with a VGA 1080p monitor. Gotta save up again to buy another one, HAHAHAHA. 1080p over VGA, like, what is this, 1995?

Want an X34 Predator, but at 1300 bucks it's gonna take me a while to save.

It even says I can't use K-Boost in EVGA Precision because I'm on VGA lol


----------



## Koniakki

Quote:


> Originally Posted by *Dragonsyph*
> 
> Thanks for posting this for comparison and the heads up on the link.
> 
> Just did like 10 tests.
> 
> Found out if i have core at + 140 all the way to +160 core still boosts to exact same clock. So i kept 140 instead of 150.
> 
> All these tests are done at + 140 with FS graphics test 1.
> 
> Memory.
> 600 122.59
> 625 124.96
> 650 122.97
> 700 126.57
> 750 124.3
> 800 126.84
> 850 125.02
> 900 127.58
> 950 124.78
> 1000 128.59
> 
> Just did both graphics test at +140 +1000 Boosted to 2164, 12012 memory.
> 
> My new all time HIGH score 8))))))))))))))))


A-wesome!









Quote:


> Originally Posted by *Derek1*
> 
> OK, need to clarify things a bit.
> I just did a bunch of runs cause something was bothering me. I had the voltage slider all the way up to 100% on Px so wanted to isolate that as a possible factor in the high temps.
> As it turns out I believe that was the issue.
> I went back to stock settings and did a run at 0% on the slider and Temps spiked once or twice at 57C, volts were at 1.05 with one or two spikes to 1.062.
> Then did one with the slider jacked to 100% and sure enough (doh) temps spiked to 60C about 5 times and volts of course were spiking throughout at 1.093
> Moved Core up to +125/2126 and Mem Clock at 0 offset and Voltage back down to 0% and got the same as first run.
> Added +500 to Mem Clock, 0% on the Volts and still got the same as first run.
> Moved Mem Clock up to +700 and still 0% on the Volts and highest Temps ever got was 59C. Voltage was still 1.05-1.062.
> 
> http://www.3dmark.com/3dm/16116656
> 
> So, it would seem for this card anyway, running the volts at 1.093 only gives me higher temps (66C) with no performance gain (The score above is a little less than the one I got with 100% volts/1.093 http://www.3dmark.com/3dm/16108329 but I am not convinced that it contributed to the score, it was normal variability.)
> 
> Not sure what the T4 would do for me if anything.
> 
> PS I run a custom curve where the fans are at 80% at 40C and 100% at 60C on the new bios.


I have noticed this as well with my testing with 4 different GTX 1080's(1 FE, 3x customs) so far. But only referring to gaming(e.g ROTTR/FC4 etc).

I usually get the best fps while at 0% extra volts(1.062v) either by offset or curve OC.

Quote:


> Originally Posted by *Derek1*
> 
> Geez, you guys and your 1080p.
> I refuse to run that, I just won't!
> I can't.
> lol


lol!


----------



## nrpeyton

AMD FX-8350 @ 4.8Ghz <- prime95 stable
1920mhz RAM

1080: @ 2100 (+100)
and MEM + 500

Unigine Valley Benchmark 1.0

FPS: *89.0*
Score: *3723*
Min FPS: *26.7*
Max FPS: *142.1*

System
Platform: *Windows 8 (build 9200) 64bit*
CPU model: *AMD FX(tm)-8350 Eight-Core Processor (4800MHz) x4*
GPU model: *NVIDIA GeForce GTX 1080 21.21.13.7586 (4095MB) x1*

Settings
Render: *Direct3D11*
Mode: *1920x1080 8xAA fullscreen*
Preset: *Extreme HD*


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> AMD FX-8350 @ 4.8Ghz <- prime95 stable
> 1920 RAM
> 
> 1080: @ 2100 (+100)
> and MEM + 500
> 
> Unigine Valley Benchmark 1.0
> 
> FPS: *89.0*
> Score: *3723*
> Min FPS: *26.7*
> Max FPS: *142.1*
> 
> System
> Platform: *Windows 8 (build 9200) 64bit*
> CPU model: *AMD FX(tm)-8350 Eight-Core Processor (4800MHz) x4*
> GPU model: *NVIDIA GeForce GTX 1080 21.21.13.7586 (4095MB) x1*
> 
> Settings
> Render: *Direct3D11*
> Mode: *1920x1080 8xAA fullscreen*
> Preset: *Extreme HD*


HOLY CRAP LOL, so with that CPU you got a 3723 score vs a 4790K/6700K getting 5200-5500?


----------



## Derek1

Was wondering if anyone has had problems with Time Spy? I can't get it to complete a run. It errors out just as it's loading Graphics Test 1 with no explanation, just the "oops, an error occurred" message, then gives me a valid 0, lol.
Do I need to pay for it?


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Was wondering if anyone has had problem with Time Spy? I can't get it to complete a run. It errors out just as it is loading Graphics Test 1 with no explanation, just the 'ooops an error occurred".Then gives me a valid 0, lol.
> Do I need to pay for it?


I don't like that benchmark; GPU usage goes down to 60%, and the demo videos are like 20 years long.

A quick tip for FS though: if you use a keygen, put in a key, turn off demo mode, then unregister 3DMark; you can now run them with no demo and upload scores to the website. 8)


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> OK, need to clarify things a bit.
> I just did a bunch of runs cause something was bothering me. I had the voltage slider all the way up to 100% on Px so wanted to isolate that as a possible factor in the high temps.
> As it turns out I believe that was the issue.
> I went back to stock settings and did a run at 0% on the slider and Temps spiked once or twice at 57C, volts were at 1.05 with one or two spikes to 1.062.
> Then did one with the slider jacked to 100% and sure enough (doh) temps spiked to 60C about 5 times and volts of course were spiking throughout at 1.093
> Moved Core up to +125/2126 and Mem Clock at 0 offset and Voltage back down to 0% and got the same as first run.
> Added +500 to Mem Clock, 0% on the Volts and still got the same as first run.
> Moved Mem Clock up to +700 and still 0% on the Volts and highest Temps ever got was 59C. Voltage was still 1.05-1.062.
> 
> http://www.3dmark.com/3dm/16116656
> 
> So, it would seem for this card anyway, running the volts at 1.093 only gives me higher temps (66C) with no performance gain (The score above is a little less than the one I got with 100% volts/1.093 http://www.3dmark.com/3dm/16108329 but I am not convinced that it contributed to the score, it was normal variability.)
> 
> Not sure what the T4 would do for me if anything.
> 
> PS I run a custom curve where the fans are at 80% at 40C and 100% at 60C on the new bios.


Sometimes when I run a +150 'traditional' offset when the room is really hot (28c - 30c ambient) it crashes 3/5 times. If ambient is 18c (sitting inside with a jacket on) it never crashes.

*HOWEVER:*

If I add a bit of voltage, that stops it crashing, BUT scores DON'T go up.

If I use the curve and/or voltage tool to *force* voltage at 2177-2202mhz, the score DOES go up.

I think there *IS* definitely a place for voltage; but it must be very, very carefully implemented.

Not useful for anything except benching or gaining validations for card re-sale value.

P.S. memory voltage (from the tool) got me NO extra memory stability.
This MAY change when I get my pads and go water.....


----------



## Valtava

Quote:


> Originally Posted by *juniordnz*
> 
> Thanks, prophet.
> 
> Good for us we bought from one of the best CS on the market.







One more EVGA burned down; more to come...

I mean... I don't wanna upset those who have EVGA ACX 3.0 Pascals, but this is really bad. I mean really, really bad.


----------



## nrpeyton

Quote:


> Originally Posted by *Valtava*
> 
> 
> 
> 
> 
> 
> One more EVGA burned down; more to come...

> I mean... I don't wanna upset those who have EVGA ACX 3.0 Pascals, but this is really bad. I mean really, really bad.


You sure that's the right video? All I watched was a guy losing video signal??

I even watched it again to make sure, same again, and never saw any flames lol


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> HOLY CRAP LOL, so with that cpu you got 3723 score vs a 4790k -6700k getting 5200-5500?


yeah mate and core 0 on HWINFO64 is reading around 85% utilisation while all others (cores 1-7) are 10% -15% utilisation


----------



## bloot

Quote:


> Originally Posted by *nrpeyton*
> 
> U sure that's the right video? All I watched was a guy losing video signal??
> 
> I even watched it again to make sure, same again and never seen any flames lol


00:00:13 pay attention to his pc


----------



## nrpeyton

Quote:


> Originally Posted by *bloot*
> 
> 00:00:13 pay attention to his pc


I did, and I saw a slight *flash* as the power went off lol


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> yeah mate and core 0 on HWINFO64 is reading around 85% utilisation while all others (cores 1-7) are 10% -15% utilisation


Geeze. I really wanted a 6800K but can't afford one; I'll probably wait till the new Skylake ones come out, 7800K or some crap. And this 1866 RAM with a 2133 OC is pretty pathetic nowadays. A nice 6-core with some 3000-4000MHz DDR4 would be awesome. It just costs so much money lol..


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> Geeze. I really wanted a 6800k but can't afford one, probably wait tell the new skylake ones come out, 7800k or some crap. And this 1866 ram 2133 OC is pretty pathetic now days. A nice 6 core with some 3000-4000 mhz ddr4 would be awesome. Just cost so much money lol..


If it wasn't for the "AMD crowd", all you guys wouldn't just be paying $700 for a new Nvidia 1080; you'd be getting fleeced by Nvidia for about $2000.

I just wish AMD would catch up a bit more; then maybe the market would actually improve. Coz £700 is still a fleecing.

And if you STILL buy an Intel CPU after Zen comes out, then you deserve what you get.

Really -- after Zen there is no real benefit to paying £1000+ for a CPU when all you do is game and benchmark.

Seriously, if you can't go AMD for your GPU I understand -- but for Christ's sake at least give them a chance when it comes to your CPU.

Give them some money and maybe we will see them catch up on GPU too. Then we will be in a much better place.

Honestly guys, imagine a world where it was actually a difficult decision whether to go Nvidia or AMD for a GPU. Guaranteed you'd be paying at least half what you're paying now.


----------



## Coopiklaani

A month after the power mod with Liquid Metal Pro. NOT A SAFE MOD!
The solder under the shunt resistor got totally dissolved. Is an RMA still possible?


----------



## AllGamer

Quote:


> Originally Posted by *Coopiklaani*
> 
> A month after the power mod with Liquid Metal Pro. NOT A SAFE MOD!
> The solder under the shunt resistor got totally dissolved. Is an RMA still possible?


I seriously doubt it; with that kind of mod, they can see it with the naked eye during the RMA inspection.


----------



## Coopiklaani

Quote:


> Originally Posted by *AllGamer*
> 
> I seriously doubt it, that kind of Mod, they can see it with the naked eye during RMA inspection.


What if I clean off the liquid metal and claim the shunt resistor just fell off by itself?


----------



## Vellinious

Quote:


> Originally Posted by *Coopiklaani*
> 
> What if I clean off the liquid metal and claim the shunt resistor just fell off by itself?


If they inspect the PCB, which I'm sure they will, it's going to be pretty apparent what happened..... I wouldn't count on getting anything back from that. They may be able to repair it for you, but....I'd also guess it would be at your expense.


----------



## nrpeyton

Quote:


> Originally Posted by *AllGamer*
> 
> I seriously doubt it, that kind of Mod, they can see it with the naked eye during RMA inspection.


If they had a bit of ******* sense they'd realise they are SERIOUSLY limiting their own GPUs with a silly 5-phase VRM -- I mean, come on, obviously people are going to try and get around it -- when they are selling GPUs capable of going much faster and the only thing stopping them is a power limit.
Quote:


> Originally Posted by *Coopiklaani*
> 
> What if I clean off the liquid metal and claim the shunt resistor just fell off by itself?


Why don't you just solder it back on?!?!

Alternatively, you could own up to it, tell them the truth, and ask if they would repair it for you for a small charge?

** Was it the extra heat, or did the liquid metal actually dissolve it??


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> HOLY CRAP LOL, so with that cpu you got 3723 score vs a 4790k -6700k getting 5200-5500?


CPU means EVERYTHING in Valley. It's an absolutely horrible GPU benchmark.
Quote:


> Originally Posted by *nrpeyton*
> 
> If they had a bit of ******* sense they'd realise they are SERIOUSLY limiting their own GPUs with a silly 5-phase VRM -- I mean, come on, obviously people are going to try and get around it -- when they are selling GPUs capable of going much faster and the only thing stopping them is a power limit.
> why don't you just solder it back on?!?!


A 5-phase VRM is not limiting these cards in any way, shape or form. The VRM regulates voltage. The power limit is an assigned value in the BIOS, and the shunt mod gets around the BIOS-limited power limit.
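
The reason the shunt mod works can be put in one line of arithmetic: the controller senses current from the voltage drop across a shunt of known resistance, so lowering the effective resistance (e.g. by bridging it with liquid metal) makes it under-read current, and therefore power, by the same ratio. A simplified sketch with made-up example values (real shunt values and the sensing circuit vary by board):

```python
def reported_power(true_power_w, r_stock_ohm, r_effective_ohm):
    """The controller computes I = V_drop / R_stock, but V_drop is really
    I_true * R_effective, so reported power scales by R_effective / R_stock.
    Simplified model: rail voltage assumed constant."""
    return true_power_w * (r_effective_ohm / r_stock_ohm)

# Halving the effective shunt resistance makes a 300 W draw read as 150 W,
# so the card never thinks it has hit its BIOS power limit.
print(reported_power(300, 0.005, 0.0025))  # -> 150.0
```

This is also why the mod is risky: the card's protection logic is now working from a number that is wrong by a fixed factor, and the VRM and connectors see the full, unreported load.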


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> If they had a bit of ******* sense they'd realise they are SERIOUSLY limiting their own GPUs with a silly 5-phase VRM -- I mean, come on, obviously people are going to try and get around it -- when they are selling GPUs capable of going much faster and the only thing stopping them is a power limit.
> why don't you just solder it back on?!?!
> 
> Alternatively you could own up to it and tell them the truth; and ask if they would repair it for you at a small charge?
> 
> ** was it the extra heat or the liquid metal actually dissolved it??


I will solder it back on if I can't get an RMA. I'll give the RMA a try first.
My guess is the current over the liquid metal caused it to react and bind with the solder.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> CPU means EVERYTHING in Valley. It's an absolutely horrible GPU benchmark.
> A 5 phase VRM is not limiting these cards in any way, shape or form. The VRM regulates voltage. The power limit is an assigned value in the bios. The shunt mod gets around the bios limited power limits.


Yeah, but to keep it within safe tolerances (knowing people will run FurMark) -- I mean, look at what's just happened with EVGA... they *have* to set these LOW power limits in the BIOS to protect themselves.

That's why custom cards with more phases on the VRM have higher power limits.

But yeah, I still admit I got a bit carried away there lol


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Yeah but to keep it within safe tolerances (knowing ppl will run furmark) -- I mean look at whats just happened with EVGA... they *have* to set these LOW power limits in BIOS to protect themselves.
> 
> That's why custom cards with more phases on the VRM have higher power limits
> 
> But yeah I still admit I did get a bit carried away there lol



A 5-phase VRM is not the reason for the power limit. 10-phase, 14-phase -- these are just marketing, cos the only things aftermarket cards can add are more phases, a dual BIOS and a beefier cooler.
I use my FE card with the T4 BIOS, and I put a thermocouple on the back of the VRM area, secured with a thermal pad.
This is how hot the VRM gets after 30 mins of Valley. It draws about 260 to 280W when running Valley.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> 5-Phase is not the reason for power limit. 10-Phase, 14-Phase, these are just marketing, cos the only things after market cards can do is to have more phases, dual bios and beefier cooler.
> I use my FE card with T4 BIOS. And I put a thermal couple on the back of the VRM area secured with thermal pad.


Hmm, very informative post indeed. Rep +1

What do you think the lesson from this is, then? For you, and for the rest of us doing the mod? (I was thinking about it at one stage, although I chickened out)...

Use less liquid metal? (i.e. don't use so much that it spreads down to the bottom of the resistor where the solder is)?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Hmm very informative post indeed. Rep+1
> 
> What do you think your 'learning' from this is then? And for the rest of us doing the mod? (As I was thinking about it at one stage although I chickened out)...
> 
> use less liquid metal? (i.e don't use too much so that it spreads over to the bottom of the resistor where the solder is) ?


DON'T USE LIQUID METAL. It eats through metal. It ruins the power reading. It can cause problems.
The T4 BIOS is the best solution so far if you don't mind losing one DP port.
I don't see any reason to use this power mod over the T4 BIOS. I did mine before the T4 BIOS existed and didn't bother to remove it when T4 came out.


----------



## nrpeyton

*BEAT MY SCORE - Firestrike Ultra Graphics 1 & 2 only*

http://www.3dmark.com/3dm/16082095

*All settings default except Physics & Combined tests switched off.*


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> *BEAT MY SCORE - Firestrike Ultra Graphics 1 & 2 only*
> 
> http://www.3dmark.com/3dm/16082095
> 
> *All settings default except Physics & Combined tests switched off.*


WoW, what's your max boost freq?


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> *BEAT MY SCORE - Firestrike Ultra Graphics 1 & 2 only*
> 
> http://www.3dmark.com/3dm/16082095
> 
> *All settings default except Physics & Combined tests switched off.*


I'd try if I could run that benchmark.


----------



## nrpeyton

Max is 2202mhz roughly (semi-stable) with some careful curve-work and applying voltage in the right place (enough to get a validation, anyway)...

I have an EVGA 1080 Classified.

BTW, while I'm here: I was thinking of using Conductonaut liquid metal with this GPU and either an EK or Alphacool full-cover block.

Will the liquid metal eat away at the block, or at the top of my 1080 chip (the die/IHS??)?

Quote:


> Originally Posted by *Dragonsyph*
> 
> Id try if i could run that benchmark


Have you tried re-installing?


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> max is 2202mhz roughly (semi-stable) with some careful curve-work and applying voltage in the right place (enough to get a validation anyway...
> 
> I have an EVGA 1080 Classified.
> 
> BTW while I'm here; I was thinking of using Conductanout Liquid Metal with this GPU and either an EK or Alpha-cool full-cover block.
> 
> Will the liquid metal eat away at the block or eat away the top of my 1080 'chip'? (IHS??!?!)
> you tried re-installing?


I'll try downloading a key gen.

Just did a 3DMark 11 run, lol, free version. Got a lot of coil whine.



Oh, and BTW on the Liquid Pro: I've used it a lot, and it degrades even copper, changing its colour, and those tiny components on the PCB around the GPU core might get screwed if it leaks.


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ill try downloading a key genny.
> 
> Oh and btw on the liquid pro, iv used it alot and it degrades even copper changing it colors, and those tiny things on the PCB around the gpu core might get screwed from it from a leak.


I would be using special electrical tape around the components in case of a leak.

Also, check your private mail.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *BEAT MY SCORE - Firestrike Ultra Graphics 1 & 2 only*
> 
> http://www.3dmark.com/3dm/16082095
> 
> *All settings default except Physics & Combined tests switched off.*


Looks like it bugged on GS2. Frames should be down around 22 or 23.


----------



## Dragonsyph

Test two started at freaking 16 FPS LOLOLOLOLOL.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Looks like it bugged on GS2. Frames should be down around 22 or 23.


http://www.3dmark.com/3dm/16080888

http://www.3dmark.com/3dm/16081066

http://www.3dmark.com/3dm/16081308

http://www.3dmark.com/3dm/16081441

http://www.3dmark.com/3dm/16081537

http://www.3dmark.com/3dm/16081633

http://www.3dmark.com/3dm/16081892


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> 
> 
> 
> Test two started at freaking 16 FPS LOLOLOLOLOL>


Yup, that's about right.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> http://www.3dmark.com/3dm/16080888
> 
> http://www.3dmark.com/3dm/16081066
> 
> http://www.3dmark.com/3dm/16081308
> 
> http://www.3dmark.com/3dm/16081441
> 
> http://www.3dmark.com/3dm/16081537
> 
> http://www.3dmark.com/3dm/16081633
> 
> http://www.3dmark.com/3dm/16081892


All bugged....they'd likely validate online, but....they're definitely bugged.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> max is 2202mhz roughly (semi-stable) with some careful curve-work and applying voltage in the right place (enough to get a validation anyway...
> 
> I have an EVGA 1080 Classified.
> 
> BTW while I'm here; I was thinking of using Conductanout Liquid Metal with this GPU and either an EK or Alpha-cool full-cover block.
> 
> Will the liquid metal eat away at the block or eat away the top of my 1080 'chip'? (IHS??!?!)
> you tried re-installing?


I can run 2.2GHz stable with no problem, but there's no way I get anywhere near an 8000 FSU graphics score!


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> All bugged....they'd likely validate online, but....they're definitely bugged.


Thank god, because the other night I was getting 7900 -> 8000+ all night, benching away using the voltage tool on the memory.

Then suddenly I ran a test and it was only at 5000+, so I just assumed I'd pushed her too hard.

But when I backed the O/C off I still only hit 5000+.

I was worried I'd broken something or shorted her out while I was probing with the multimeter.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Thank god; because the other night I was getting 7900 --> 8000+ all night benching away using voltage tool on memory.
> 
> Then suddenly I ran a test and it was only at 5000+ so I just assumed I'd pushed her too hard.
> 
> But when I backed the O/C off I still only hit 5000+
> 
> I was worried I'd broken something or shorted her out while I was probing with multimeter.


Nah...GS2, at least recently, has had a tendency to just bug out, rather than crash.


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> Thank god; because the other night I was getting 7900 --> 8000+ all night benching away using voltage tool on memory.
> 
> Then suddenly I ran a test and it was only at 5000+ so I just assumed I'd pushed her too hard.
> 
> But when I backed the O/C off I still only hit 5000+
> 
> I was worried I'd broken something or shorted her out while I was probing with multimeter.


Lol, thought mine was broken because I only got 6075 and not 7-8k.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Lol thought mine was broke becasue i only got 6075 and not 7-8k.


lol, that's actually normal.


----------



## nrpeyton

Great, thanks for clearing that up, everyone...

BTW, I did ask nicely a few times if someone would mind running a FS Ultra 1 & 2 for me -- and nobody would help... so I figured if I made it competitive I'd grab everyone's attention -- and it worked. Gee, thanks guys.. lol



----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> great, thanks for clearing that up everyone...
> 
> btw I did ask nicely a few times if someone would mind running a FS Ultra 1 & 2 for me -- and nobody would help.. so I figured if I make it competitive I'd grab everyones attention -- and it worked.. gee thanks guys.. lol


I didn't even run it. It's just easy to spot when GS2 bugs out. The frame rates will be really close to the frames they get in GS1.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I didn't even run it. It's just easy to spot when GS2 bugs out. The frame rates will be really close to the frames they get in GS1.


I am very new to this entire scene; I only ran 3DMark for the first time ever last month.

I used to just game, but since I discovered this thread and read up on Classifieds and Kingpin Editions and overclocking etc. etc., I became fascinated.

I've had this card for 7 weeks now and haven't even spent 30 minutes gaming.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> I am very new to this entire scene, I only ever ran 3dmark for the first time ever, last month
> 
> I used to just game, but since I discovered this thread and read up on Classifieds and Kingpin Editions and overclocking etc. etc., I became fascinated.
> 
> I've had this card for 7 weeks now and not even spent 30 minutes gaming.


So how much have you pushed your RAM with the Classified voltage tool?
3DMark is the best game ever!


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> So how much you pushed your ram with the classified voltage tool?
> 3Dmark is the best game ever!


Glad you asked; it didn't seem to get me any extra (but then, when I was doing it I was getting bugged results on FS, as you've all just seen), so I will have another go tomorrow night and report back.

I pushed the voltage on the memory from 1.37v up to 1.5v (verified using a multimeter).

I think +625 was my highest -- but as I said, I'll have another go tomorrow.

I seemed to get +625 with ambient at 18c and +600 with higher ambients of 28-30c, so it could be temps holding me back more than voltage <<< referring to max scores, not *highest without crash*


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> glad you asked; it didn't seem to get me any extra (but then when I was doing it I was getting bugged results on FS as you've all just seen) so I will have another go tomorrow night and report back.
> 
> I pushed the voltage on memory from 1.37 up to 1.5v (verified using multimeter)
> 
> I think 625 was my highest score -- but as I said i'll have another go tomorrow
> 
> I seemed to get +625 with ambient at 18c and +600 with higher ambients of 28-30c, so it could be temps holding me back more than voltage <<< referring to max scores, not *highest without crash*


I see ppl get +1000 with the voltage tool. +625 doesn't seem much to me with that extra voltage. How much can you push on default voltage?
I can only push my vram to +550 without losing performance.


----------



## Dragonsyph

Quote:


> Originally Posted by *Coopiklaani*
> 
> I see ppl get +1000 with the voltage tool. +625 doesn't seem much to me with that extra voltage. How much can you push on default voltage?
> I can only push my vram to +550 without losing performance.


I get +1000 with no extra voltage to the memory, besides stock.


----------



## Dragonsyph

+140 core +995 memory = 2164 / 6000

New record for me.


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> I get +1000 with no extra voltage to the memory, besides stock.


The +625 was with default voltage. Okay, I'll have a quick run now and see if it does anything lol, then I'm going to bed.

I'll have more time to do it properly tomorrow; it's 2:30am here lol


----------



## Coopiklaani

Quote:


> Originally Posted by *Dragonsyph*
> 
> I get +1000 with no extra voltage to the memory, besides stock.


Yeah, it was you I saw with +1000 mem. Most ppl get about +500 and start to lose performance if pushed higher. You must have golden VRAM!


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I see ppl get +1000 with the voltage tool. +625 doesn't seem much to me with that extra voltage. How much can you push on default voltage?
> I can only push my vram to +550 without losing performance.


Right, here we go... okay, this is a VERY PLEASANT surprise (I was astonished when it finished).

Also had to restart my machine, because the 1st time I clicked *apply* with the voltage tool my system crashed.

The 2nd time the system was 100% GOOD and I actually got a higher score.

SCORES:

*+625 memory (stock 1.37v) 5 639* http://www.3dmark.com/3dm/16122136
*+650 memory (stock 1.37v) 5 747* http://www.3dmark.com/3dm/16122445
*+675 memory (stock 1.37v) 5 632* http://www.3dmark.com/3dm/16122267
*+700 memory (1.5v) 5 812* http://www.3dmark.com/3dm/16122499

Edit: Bed time now because it's 3am here; I'll do a bit more testing tomorrow after work and report findings 
At least now I know how to identify a buggy run (last night I was unaware) so going forward; this post & also tomorrow onwards should be completely legitimate scores 

Edit 2:
Ran another test at +750, couldn't resist. Only got 5 714.
So it's looking like the extra memory voltage is only getting me an extra 50 MHz (650 to 700).

Edit 3:
*+725 memory (1.5v) 5842* http://www.3dmark.com/3dm/16122774
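For anyone wondering what these offsets mean in raw numbers, here's a rough sketch of the bandwidth math. Assumptions (not a spec sheet): the FE's Afterburner-displayed GDDR5X base clock is 5005 MHz on the 1080's 256-bit bus, the offset adds directly to that displayed clock, and the effective transfer rate is double the displayed figure.

```python
# Rough bandwidth math for the memory offsets above (values illustrative).
BASE_MHZ = 5005   # assumed Afterburner-displayed GDDR5X base clock
BUS_BITS = 256    # GTX 1080 memory bus width

def bandwidth_gbs(offset_mhz):
    clock = BASE_MHZ + offset_mhz               # displayed clock after offset, MHz
    data_rate = clock * 2                       # effective transfer rate, MT/s
    return data_rate * (BUS_BITS // 8) / 1000   # 32 bytes per transfer -> GB/s

for off in (0, 625, 700, 725):
    print(f"+{off}: {BASE_MHZ + off} MHz shown, ~{bandwidth_gbs(off):.0f} GB/s")
```

Under those assumptions +725 is roughly a 14% bandwidth bump over stock, which fits the few-percent score gains in the runs above.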


----------



## OccamRazor

Quote:


> Originally Posted by *Coopiklaani*
> 
> A month after the power mod with Liquid Metal Pro. NOT A SAFE MOD!
> The solder under the shunt resistor gets totally dissolved. Is RMA still possible?


No; all solder points and changes you made to the PCB will be visible on their inspection. Really!
Quote:


> Originally Posted by *nrpeyton*
> 
> If they had a bit of ******* sense and realised they were SERIOUSLY limiting their own GPUs with a silly 5-phase VRM -- I mean come on to ****, obviously people are going to try and get around it -- when they are selling GPUs capable of going much faster and the only thing stopping them is a power limit.
> Why don't you just solder it back on?!?!
> 
> Alternatively you could own up to it and tell them the truth; and ask if they would repair it for you at a small charge?
> 
> ** was it the extra heat or the liquid metal actually dissolved it??


Not heat but: Gallium!

Quote:


> Originally Posted by *Coopiklaani*
> 
> DON'T USE LIQUID METAL. It eats through metal. It ruins the power reading. It can cause problems.
> T4 BIOS is the best solution so far if you don't care losing one DP port.
> I don't see any more reason to use this power mod over T4 bios. I did mine before T4 bios didn't bother to remove it when T4 came out.


Gallium is part of most if not all liquid metal compounds, and it's corrosive to all metals except tantalum and tungsten! And it has a special "appetite" for aluminium!

Cheers

Occamrazor
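For anyone wondering why smearing liquid metal over the shunt fools the power limit at all: the controller estimates current from the voltage drop across a known sense resistance, so a parallel conductive path makes it under-read. A sketch of the arithmetic with made-up resistor values (the 1080's actual shunt spec may differ):

```python
# Why a shunt mod lowers the *reported* power (resistances are illustrative).
R_SHUNT = 0.005  # ohms, assumed sense-resistor value
R_MOD = 0.005    # ohms, assumed resistance of the liquid-metal path

def reported_watts(actual_watts, r_parallel=None):
    """Power the controller thinks is drawn, given the real draw."""
    if r_parallel is None:
        return actual_watts  # unmodded: reading is accurate
    # A parallel path lowers the effective sense resistance...
    r_eff = (R_SHUNT * r_parallel) / (R_SHUNT + r_parallel)
    # ...but the controller still divides the (smaller) drop by R_SHUNT.
    return actual_watts * r_eff / R_SHUNT

print(reported_watts(250))         # unmodded: reads the real draw
print(reported_watts(250, R_MOD))  # modded: reads half the real draw
```

With an equal-resistance parallel path the card sees half the real power, so the limiter lets it pull far past its cap -- which is exactly why the mod "works" and why the gallium eating the solder joint is so dangerous.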


----------



## OccamRazor

Quote:


> Originally Posted by *Coopiklaani*
> 
> 5-phase is not the reason for the power limit. 10-phase, 14-phase, these are just marketing, because the only things aftermarket cards can do is have more phases, dual BIOS and a beefier cooler.
> I use my FE card with the T4 BIOS. And I put a thermocouple on the back of the VRM area, secured with a thermal pad.
> This is how hot the VRM is after 30 mins of Valley. It draws about 260 to 280W when running Valley.


I can confirm this: it's not the number of phases but the components' rated capability for converting current. More phases just distribute the load and can be more effective at "smoothing" the current fed to the GPU (overall they are better than the reference VRMs, of course, as some also use higher-rated parts), BUT more important is effective overall VRM cooling, with emphasis on the MOSFETs (rated from 95ºC to 105ºC depending on the brand and make). Golden rule of semiconductors: 10ºC less doubles its lifetime! HEAT KILLS. I had my OG Titans at 1,[email protected] for over 2 years with EK blocks in a custom loop; temps never went over 60ºC while gaming and I never had a problem!
My 1080 Seahawk (reference board) with the T4 BIOS never goes above 43ºC on the GPU (even with 1,17V), and with an infrared thermometer temps never go above 55ºC on the back of the card at all VRM points and memory. Assuming a 10ºC increase to the front of the card, this is in line with the guru3d review: a 66.5ºC reading with a 10K-euro FLIR camera.

Cheers

Occamrazor
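The "10ºC less doubles its lifetime" rule of thumb above is a simplified Arrhenius-style relationship. A sketch of what it implies, using a hypothetical 5,000-hour rating at a 105ºC MOSFET limit (both numbers assumed, purely for illustration):

```python
# Rule-of-thumb MOSFET life vs. temperature (constants are hypothetical).
RATED_HOURS = 5000   # assumed rated life at the temperature limit
RATED_TEMP_C = 105   # assumed MOSFET temperature rating

def estimated_life_hours(temp_c):
    # Each 10C below the rated temperature roughly doubles expected life.
    return RATED_HOURS * 2 ** ((RATED_TEMP_C - temp_c) / 10)

for t in (105, 85, 60):
    print(f"{t}C -> ~{estimated_life_hours(t):,.0f} h")
```

Which is why keeping the VRM under 60ºC in a loop, as described above, matters so much more than the phase count on the box.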


----------



## feznz

Quote:


> Originally Posted by *Coopiklaani*
> 
> A month after the power mod with Liquid Metal Pro. NOT A SAFE MOD!
> The solder under the shunt resistor gets totally dissolved. Is RMA still possible?


If it were me I would solder it back on; you'd probably just need to melt and clean off the old solder on the shunt and PCB. It could be up and running in 10 minutes.

+1 for teaching me that liquid metal dissolves aluminium AND solder.


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> That was a really good post mate; quite informative.
> 
> I never really tested temps with *before* and *after* with MEMORY O/C's on GPU before.
> 
> /\ one thing I have noticed is on my CPU... on a prime95 run it heats up *a lot more quickly* with a higher RAM overclock. I always just assumed this was due to an unimpeded flow of data (i.e. less wait time before getting more data from RAM) resulting in less "idle time" at load and more strain on the CPU.
> 
> Definitely going to test this for myself tonight. One thing I've noticed that may be useful to others (from mental notes): at higher temps my GPU mem O/C seems to max out at +600 (before scores decline), but at lower temps scores don't drop until +625
> 
> I've also noticed that with a traditional +150 (core) it never crashes under 50c but will often crash at temps above 60c. If your experiences are similar I'd definitely be willing to bet we will both be lucky with stability improvements under water.
> 
> Other than that, how different is the memory on our GPUs compared to normal RAM? Because any RAM I've ever had never gets hot at all as long as it's got a decent heatsink on it. Even with my RAM under *EXTREME* strain I've never seen it get far above 40c, and never *ever* has it been hot to touch.
> 
> =======================================
> 
> *Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
> *Think I might try a fresh O/S install as OccamRazor suggested.
> argh what a pain in the arse though lol...*.
> 
> =======================================
> 
> *Next Post* _(to save me taking up too much of your screen-space I'll edit this post instead)_
> 
> Any chance *one of you could do me a favour* please? Run a **stock** FS ULTRA (GFX 1 & 2 only) please? (Should only take about 60s -- switch DEMO off if you want)?
> 
> Thanks so much
> 
> Nick Peyton


Nah, I bought a Toshiba 240GB SSD a couple of days ago (it was dirt cheap) and installed Windows 10 on it; it was under half an hour to have the OS and drivers plus Afterburner working!









Cheers

Occamrazor


----------



## 66racer

Quote:


> Originally Posted by *juniordnz*
> 
> EVGA stated that a few cards sold before september had a faulty VRM module that was causing some cards to die randomly (black screen). But that was fixed since september. If I remember the numbers right it was 3% of all sold cards before september that had that problem.


Ah, that explains it! Makes sense. I've had mine for I think 3 months now and it's been running well. It doesn't overclock past 2050 MHz, but whatever; I'm on 1080p/120Hz anyway, though I'm watching out for Black Friday sales on a 1440p/144Hz. I agree though, I know EVGA will at least take care of me if anything happens to it.

Thanks


----------



## Koniakki

New Hotfix drivers for the previous bugged GeForce 375.86!

GeForce 375.95 Hotfix driver download

Quote:


> Originally Posted by *OccamRazor*
> 
> Nah, I bought a Toshiba 240GB SSD a couple of days ago (it was dirt cheap) and installed Windows 10 on it; it was under half an hour to have the OS and drivers plus Afterburner working!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Occam, does this thread start to remind you of our good ol' 980 Ti thread with Sky, or is it just me?


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> New Hotfix drivers for the previous bugged GeForce 375.86!
> 
> GeForce 375.95 Hotfix driver download
> Occam, does this thread start to remind you of our good ol' 980 Ti thread with Sky, or is it just me?


Yes it does, my Friend, yes it does! It's a pity that my Brother Skyn3t is swamped with work right now. We both got the 1080 Seahawk and fiddled with it for a while after a heads-up earlier in the game from some "LN2 heavyweight" Friends. Quoting my Brother: "No fun in OCing Pascal at all, crap GPU!"








Really no fun like we had with Kepler and Maxwell. Even if we get the BIOS hacked we are not going anywhere with this architecture, or even the next, which I suspect will be just a node shrink to 14nm with again stretched speeds and even more limited OC... Anyway, wish my Brother Sky was here too!









Cheers

Occamrazor


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Right here we go... okay this is a VERY PLEASANT surprise (I was astonished when it finished)
> 
> Also had to restart my machine because the 1st time I clicked *apply* with voltage tool my system crashed.
> 
> 2nd time system was 100% GOOD and I actually got a higher score.
> 
> SCORES:
> 
> *+625 memory (stock 1.37v) 5 639* http://www.3dmark.com/3dm/16122136
> *+650 memory (stock 1.37v) 5 747* http://www.3dmark.com/3dm/16122445
> *+675 memory (stock 1.37v) 5 632* http://www.3dmark.com/3dm/16122267
> *+700 memory (1.5v) 5 812* http://www.3dmark.com/3dm/16122499
> 
> Edit: Bed time now because its 3am here; I'll do a bit more testing tomorrow after work and report findings
> At least now I know how to identify a buggy run (last night I was unaware) so going forward; this post & also tomorrow onwards should be completely legitimate scores
> 
> Edit 2:
> Ran another test at +750, couldn't resist. Only got 5 714.
> So its looking like the extra memory voltage is only getting me an extra 50mhz (650 to 700).
> 
> Edit 3:
> *+725 memory (1.5v) 5842* http://www.3dmark.com/3dm/16122774


Hmm, so the additional voltage only gives like 100 MHz more. I kind of expected more.
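The diminishing returns are easy to see if you take the score gained per MHz of offset between the runs quoted above:

```python
# Score gained per extra memory-offset step, from the quoted 3DMark runs.
runs = [(625, 5639), (650, 5747), (700, 5812), (725, 5842)]  # (offset, score)

for (o1, s1), (o2, s2) in zip(runs, runs[1:]):
    gain = s2 - s1
    print(f"+{o1} -> +{o2}: {gain:+d} points, "
          f"{gain / (o2 - o1):.1f} pts per MHz of offset")
```

Roughly 4.3 pts/MHz for the first step, then 1.3, then 1.2: each extra step of offset (and voltage) is buying less and less.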


----------



## Derek1

Quote:


> Originally Posted by *Koniakki*
> 
> New Hotfix drivers for the previous bugged GeForce 375.86!
> 
> GeForce 375.95 Hotfix driver download
> Occam, does this thread start to remind you of our good ol' 980 Ti thread with Sky, or is it just me?


Why isn't that driver on the Nvidia site?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Why isn't that driver on the Nvidia site?


Hotfixes usually don't show up when we search for new drivers, although they should. If we put our GFX model in there it should display any hotfix available for that model. Nvidia...

Did you notice any problem with the new driver? People talked about a low clock on vram but I haven't noticed here with my FTW. This hotfix targets that vram issue on Pascal cards.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Hotfixes usually don't show when we search for new drivers. Although they should. If we put our gfx model there they should display any hotfix avaiable for that model. Nvidia...
> 
> Did you notice any problem with the new driver? People talked about a low clock on vram but I haven't noticed here with my FTW. This hotfix targets that vram issue on Pascal cards.


Ya, you would think that if an Nvidia driver is being released by Nvidia it would be on the site. Silly me for expecting the rational.

I saw reports from people who had used 375.86, so I didn't d/l it myself.
I am on 375.63 right now, and was using 372.70 before that.
I was having issues with 372.90 so got rid of that one the first day.
I will probably wait a few days before trying the new one just in case. I am getting the mem clock up to +800, and with my luck the new hotfix will limit me to +250. lol


----------



## BURGER4life

Quote:


> Originally Posted by *Derek1*
> 
> Why isn't that driver on the Nvidia site?


Quote:


> Originally Posted by *juniordnz*
> 
> Hotfixes usually don't show when we search for new drivers. Although they should. If we put our gfx model there they should display any hotfix avaiable for that model. Nvidia...
> 
> Did you notice any problem with the new driver? People talked about a low clock on vram but I haven't noticed here with my FTW. This hotfix targets that vram issue on Pascal cards.


http://nvidia.custhelp.com/app/answers/detail/a_id/4260

Technically it's on the nvidia site.


----------



## Derek1

Quote:


> Originally Posted by *BURGER4life*
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4260
> 
> Technically it's on the nvidia site.


Ya, I suspected as much. Again, silly me for only looking on the Driver page for it. lol


----------



## pfinch

Quote:


> Originally Posted by *OccamRazor*
> 
> Go here: http://forum.hwbot.org/showthread.php?p=455871#post455871
> 
> It's a bit hit or miss I'm afraid, some have excellent results while others not...
> 
> Cheers
> 
> Occamrazor


Hey Occamrazor,

my AMP Extreme 1080 is still downclocking during TimeSpy... Tomb Raider is stable so far









Can you show me your Afterburner + Curve settings?

Thank you!!


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> hmm, so additional voltage only gives like 100MHz more. I kind of expect more


I didn't have a lot of time last night mate as it was late; the first time I tried it my scores seemed to max out at +650, but last night they were maxing out at +725 with 1.5v memory.

I'll have another look this weekend anyway and see if I can push past +725.

Have any of you tried the 1080 Classified voltage tool on your FTW's memory? I doubt it will work on the core, but with memory you never know?

Maybe someone more experienced can shed light here --> I'm assuming the core has a different voltage controller, and I'm probably wrong, but is there a chance that the voltage controllers for the *memory* at least are the same on both the EVGA FTW and Classified cards?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I never had a lot of time last night mate as it was late; the first time I tried it my scores seemed to max out at 650 but last night they were maxing out at 725 at 1.5v memory.
> 
> I'll have another look this weekend anyway and see if I can push past 725
> 
> any of you tried the 1080 classified voltage tool on your FTW's memory? I doubt it will work on the core but with memory you never know?
> 
> maybe someone more experienced can shed light here --> (i'm assuming the core has a different voltage controller) and i'm probably wrong but is there a chance that the voltage controllers for the *memory* at least, are the same on both evga FTW and classified cards??


Different voltage controllers. Won't work.

I've been contemplating getting either a pair of 1080 Classys, or waiting for the 1080 Ti Classy, thinking that the 780/980 Ti Classy blocks would fit it as well. Think I've decided to just sell those blocks. I'm pretty happy with the FTWs.


----------



## Janes360

My MSI GTX 1080 Gaming X OC: 2152 MHz core, GDDR5X at 11000 MHz




 http://www.3dmark.com/fs/10704425


----------



## Derek1

New personal best.

http://www.3dmark.com/3dm/16137996
(Damn can't break 25k on Graphics score lol)

+135 Core / +850 Mem / Volts 100% 1.081 Temp-spiked at 50C once.

This is after getting Hybrid Kit installed today. If you remember when I was at 100% volts I was hitting 66C so this is a significant decrease.
The great thing is it only throttled down once and remained constant throughout the run. On air it would throttle down sometimes 4 times.
I was able to get +145 to start up on the Core but about 30 seconds into FS the screen locked. Same at +140. I think if I had the T4 I might get past that. But for now I am rock solid.
Now I need to take a run at 5Ghz on the CPU.


----------



## Janes360

Try reducing memory to +550; from what I remember, at +650 my overall graphics score falls below 25,000.


----------



## Derek1

Quote:


> Originally Posted by *Janes360*
> 
> Try reducing memory to + 550 when I have to remember +650 graphical overall score falls below 25,000


Will give that a try.
Though I started out at 700 and it kept increasing as I went up by 50 each run so I thought it might eventually get there. lol


----------



## Janes360

I started out at +400 and then went up by +50 each run.


----------



## Derek1

Quote:


> Originally Posted by *Janes360*
> 
> Though I started out at +400 next + UP 50


No, didn't work for me. It dropped to 24562. Was worth a shot though.


----------



## OccamRazor

Quote:


> Originally Posted by *pfinch*
> 
> Hey Occamrazor,
> 
> my AMP Extreme 1080 is still downclocking during TimeSpy... Tomb Raider is stable so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you show me your Afterburner + Curve settings?
> 
> Thank you!!


Here you go one of the profiles i run:



DOOM [email protected], all maxed out, on [email protected]; never downclocks. Will do one for ROTTR too!









Cheers

Occamrazor


----------



## OccamRazor

Quote:


> Originally Posted by *pfinch*
> 
> Hey Occamrazor,
> 
> my AMP Extreme 1080 is still downclocking during TimeSpy... Tomb Raider is stable so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you show me your Afterburner + Curve settings?
> 
> Thank you!!


ROTTR:



@1440p, DX12, all maxed out (except Pure Hair, a waste of power IMO)









Cheers

Occamrazor


----------



## Krzych04650

Quote:


> Originally Posted by *OccamRazor*
> 
> all maxed out (except pure hair, a waste of power as i see IMO)


?? There is almost no performance hit and the difference in hair quality is very significant. Unlike HairWorks, which destroyed Geralt's hair for a small fee of 10 FPS.


----------



## OccamRazor

Quote:


> Originally Posted by *Krzych04650*
> 
> ?? There is almost no performance hit and difference in hair quality is very significant. In opposite to Hairworks that destroyed Geralt's hair for a small fee of 10 FPS


I always disable such effects; I look at the game scenery, not at the character itself (if you asked me what outfit Lara was wearing, I'd have to go back to the game and check!), but I understand people want to have the best-looking character that will be in front of their eyes for the entire game!









Cheers

Occamrazor


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> New personal best.
> 
> http://www.3dmark.com/3dm/16137996
> (Damn can't break 25k on Graphics score lol)
> 
> +135 Core / +850 Mem / Volts 100% 1.081 Temp-spiked at 50C once.
> 
> This is after getting Hybrid Kit installed today. If you remember when I was at 100% volts I was hitting 66C so this is a significant decrease.
> The great thing is it only throttled down once and remained constant throughout the run. On air it would throttle down sometimes 4 times.
> I was able to get +145 to start up on the Core but about 30 seconds into FS the screen locked. Same at +140. I think if I had the T4 I might get past that. But for now I am rock solid.
> Now I need to take a run at 5Ghz on the CPU.


I don't get it. My OC is not even near yours on core/mem clock and I get 25300-25400 every time. And that's with stock voltage, an offset OC to 2114 MHz and +575 on memory. I'm on air too, so the clock drops to 2088-2101 during the tests.

And yours is not the only one. I see a lot of other cards with higher core/mem that can't score as high as mine.

And it's nothing to do with processor/RAM, because when I had a stock 4690K + 1600 MHz RAM I scored the same on graphics.

Weird, huh?


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> I don't get it. My OC is not even near yours on core/mem clock and I make 25300-25400 every time. And that's with stock voltage, offset OC to 2114mhz and +575 on memory. I'm on air too. So that clock drops to 2088-2101 during the tests.
> 
> And yours is not the only one. I see a lot of other of cards with higher core/mem that can't score as high as mine.
> 
> An there's nothing to do with processor/ram. Because when I had stock 4690k + 1600mhz RAM I scored the same on graphics.
> 
> Weird, huh?


I am not sure myself. It is certainly puzzling.
I noticed I have G-Sync enabled. Does that matter? 4K?
Also, I am running the program off my WD Black, not my 850 Evo Pro. Does that matter?
I was wondering if there is a bottleneck someplace, but I don't really think there is. Got my 4820K running at 4.7 (quad core -- does that matter?) and Dominator Platinum at 1866, so not sure what else I can do.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> New personal best.
> 
> http://www.3dmark.com/3dm/16137996
> (Damn can't break 25k on Graphics score lol)
> 
> +135 Core / +850 Mem / Volts 100% 1.081 Temp-spiked at 50C once.
> 
> This is after getting Hybrid Kit installed today. If you remember when I was at 100% volts I was hitting 66C so this is a significant decrease.
> The great thing is it only throttled down once and remained constant throughout the run. On air it would throttle down sometimes 4 times.
> I was able to get +145 to start up on the Core but about 30 seconds into FS the screen locked. Same at +140. I think if I had the T4 I might get past that. But for now I am rock solid.
> Now I need to take a run at 5Ghz on the CPU.


Grats bro, how are you liking the kit? And is that 50c with the stock fan? I put an old Corsair fan on mine and it hits about 42-44c max, and that's with it connected to the hybrid unit, so it's low RPM. I'd plug it into my motherboard but, like the stock fan, the cord is too short.


----------



## Dragonsyph

Anyone use K mode to force max clocks the entire time? I can't try it since I'm on a VGA cable.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> Grats bro, how you liking the kit?


It's terrific.
Simple conversion and great results.









I was considering doing a personal mod by leaving the copper plate off the VRAM based on what Steve at GN found when comparing the Hybrid to the Corsair Seahawk. When he removed the copper plate to use the Seahawk cooler on the FTW he saw that the idle temps were 6C cooler. His FTW stock idled at 36C. Mine idles at 28C, but I replaced the fan and am doing push pull with 2 Corsairs (ML120 and HD120). Also used the GC Extreme. So I think I will just leave the plate now.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> It's terrific.
> Simple conversion and great results.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was considering doing a personal mod by leaving the copper plate off the VRAM based on what Steve at GN found when comparing the Hybrid to the Corsair Seahawk. When he removed the copper plate to use the Seahawk cooler on the FTW he saw that the idle temps were 6C cooler. His FTW stock idled at 36C. Mine idles at 28C, but I replaced the fan and am doing push pull with 2 Corsairs (ML120 and HD120). Also used the GC Extreme. So I think I will just leave the plate now.


Ya, I put a pretty old Corsair fan on mine and temps are around 42-44c gaming all day. That's with a room temp of 70-80F. Need to get me a fan wire extension, if they make such a thing, so I can connect the fan to my motherboard.

You think a fan connected to my H100i unit would work? Or do the fans have to be a certain kind to control the RPM through it? I think you can connect 4 fans to the H100i pump head.

Did the VRAM plate come with pre-applied paste?


----------



## nrpeyton

Regarding: Full Water Block Support - EVGA 1080 CLASSIFIED

Right, I've just been paid, the *Alphacool block is* complete, and their email support is asking me for my address as they reserved one especially for me....

BUT here's the dilemma... Grega from *EK* has also been on the emailer asking for measurements/photos of my PCB after I contacted him with pictures from the EVGA website of a guy who managed to get the *EK 780TI Classified block to fit.
*
Grega is claiming there is only one resistor (from visual inspection) that he thinks may be a problem; he seems very capable and experienced (as if he actually works in a hands-on department at EK himself).

Here's a picture of the photo he sent me:


He wants me to take measurements and email back.

*Really not sure which route to go down now; I don't want to waste anyone's time. And I can't really afford to buy both.*

I feel I have an obligation to go with Alpha-cool because they "saved" us after EK officially announced they weren't supporting the 1080 Classified (and I've worked hard to try and spread the word so other Classified owners could benefit - Alphacool is even aware of this too).

Here's a picture of the Alphacool block taken from their manual (emailed to me by their support 2 weeks ago when I asked for a "teaser").



Two months ago Classified owners were worried and disappointed at NO voltage support (when everyone else had T4 and we were incompatible) and NO waterblock... *both have been fixed, largely due to my own perseverance* -- now I feel like I'm spoilt for choice and can't make my mind up which way to go.

I have ZERO experience in this field. But I know many other guys on this thread (some of you only LURK and READ and don't really post) still get involved when something important to you comes up. I know some of you would be delighted to find out you could go out and grab an EK 780 Ti Classified block for your 1080 Classified.

Whatever route I take, it will be documented on here anyway, for everyone's viewing pleasure -- if that means a "how I did it" with an EK 780 Ti, so be it; with Alphacool I'm sure it will be easy to fit and my review would probably focus mostly on performance.

In any case... any of you guys have a steer for me?

Back soon.

Nick Peyton

P.S.

If I'm *overly-enthusiastic* it's because I'm *new* at this


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> Regarding: Full Water Block Support - EVGA 1080 CLASSIFIED
> 
> Right I've just been paid, the *Alpha-cool block is* complete and their email support is asking me for my address as they reserved one especially for me....
> 
> BUT here's the dilemma... Grega from *EK* has also been on the emailer asking for measurements/photos of my PCB after I contacted him with pictures from the EVGA website of a guy who managed to get the *EK 780TI Classified block to fit.*
> 
> Grega is claiming there is only one resistor (from visual inspection) that he thinks may be a problem; he seems very very capable and experienced (as if he actually works in a hands-on department in EK himself).
> 
> Here's a picture of the photo he sent me:
> 
> 
> He wants me to take measurements and email back.
> 
> *Really not sure which route to go down now; I don't want to waste anyone's time. And I can't really afford to buy both.*
> 
> I feel I have an obligation to go with Alpha-cool because they "saved" us after EK officially announced they weren't supporting the 1080 Classified (and I've worked hard to try and spread the word so other Classified owners could benefit - Alphacool is even aware of this too).
> 
> Here's a picture of the Alphacool block taken from their manual (emailed to me by their support 2 weeks ago when I asked for a "teaser").
> 
> 
> 
> Two months ago Classified owners were worried disappointed at NO voltage support (when everyone else had T4 and we were incompatible) and NO waterblock... *both have been fixed; largely due to my own persiverence* -- now I feel like I'm spoilt for choice and can't make my mind up what way to go.
> 
> I have ZERO experience in this field. But I know many other guys on this thread (some of you that only LURK and READ but don't really post) but still get involved when something important to you comes up. I know some of you would be delighted to find out they could go out and grab an EK 780TI Classified block for their 1080 classified.
> 
> Whatever route I take.. it will be documented on here anyway; for everyones viewing pleasure -- if that means a "how I did it" with an EK 780TI so be it.. or with Alphacool I'm sure it will be easy to fit and my review would probably focus mostly on performance.
> 
> In any case... any of you guys have a steer for me?
> 
> Back soon.
> 
> Nick Peyton


Weird they don't have blocks for this card by now; it's been how long since they were released? If the older 780 Ti block fits, would a 980 Ti block fit better, or are they different? And the Alphacool one is just a GPU waterblock with a fan for the VRM/VRAM?


----------



## Vellinious

There are 2 or 3 guys in the EVGA forums that have successfully fitted the 980ti Classy block to the 1080 Classy. The only thing that caused them any issues, was the backplate.


----------



## OccamRazor

Quote:


> Originally Posted by *Derek1*
> 
> I am not sure myself. It is certainly puzzling.
> I noticed I have Gsync enabled. Does that matter? 4K?
> Also, I am running the program off my WD Black not my 850 Evo Pro. Does that matter?
> I was wondering if there is a bottleneck someplace but I don't really think there is. Got my 4820K running at 4.7 (quad core does that matter?) Dominator Platinum at 1866 so not sure what else I can do.


G-Sync on is known to produce lower scores in benchmarks! Try turning it off and re-run the bench!









Cheers

Occamrazor


----------



## Derek1

Quote:


> Originally Posted by *OccamRazor*
> 
> G-sync on is known to produce lower scores on benchmarks! Try turning it off and re-run the bench!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Thanks, will give it a try and report back.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> There are 2 or 3 guys in the EVGA forums that have successfully fitted the 980ti Classy block to the 1080 Classy. The only thing that caused them any issues, was the backplate.


Do you mean the 780 Ti Classy block? Because when I use the configurator on the EK website and pretend I've got a 980 Ti, it's the 780 Ti that comes up as compatible...?

(or you just reading it how they wrote it on EVGA site)?


----------



## pfinch

Quote:


> Originally Posted by *OccamRazor*
> 
> Here you go one of the profiles i run:
> 
> 
> 
> DOOM [email protected] all maxed out on [email protected], never downclocks, will do one at ROTTR too!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Thank you









All of my games are running fine without downclocks (T4 BIOS).
But I'm still hitting the power limit in TimeSpy at the same spot.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Do you mean the 780TI classy block? because when I use the configurator on the EK website and pretend I've got a 980TI its the 780TI that comes up as compatible...?
> 
> (or you just reading it how they wrote it on EVGA site)?


It's the same block. EVGA has been using the same board layout for the Classy forever.

The blocks I purchased for my 980ti were the 780ti Classy blocks. Even said 780 right on the block itself.

That's actually pretty common.


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> ROTTR:
> 
> 
> 
> @1440p, DX12 all maxed out (except pure hair, a waste of power as i see IMO)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Hey buddy, can you please -- or any of you who score 25k+ in 3DMark FS with a good/high-clocking (2.1-2.2GHz/5500MHz+) GTX 1080 -- run the internal ROTTR benchmark @1080P/DX12/Very High preset/SMAA and post your score?

I've been meaning to ask this for some time now and I always forget.









I'm curious to see if all is good on my part and if there's a big difference in FPS versus an average clocked 1080.

Thank you.


----------



## max883

After 40.min call of duty gameplay. All max 4K

Evga gtx 1080 sc acx 3.0


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> Hey buddy, can you please or even those of you who score 25k+ in 3DMark FS with good/high(2.1-2.2GHz/5500Mhz+) clocking GTX 1080 run the internal ROTTR benchmark @1080P/DX12/Very High preset/SMAA and post your score?
> 
> I've been meaning to ask this for some time now and I always forget.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm curious to see if all is good on my part and if there's a big difference in FPS versus an average clocked 1080.
> 
> Thank you.


The DX12 benchmark is bugged, but it's bugged for everyone!











[email protected], [email protected]

Cheers

Occamrazor

Edit: Same clocks and mem +300, +500, +700 (crashed halfway @+900MHz); only a mere 3-4 fps increase, drawing 242W!


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> The DX12 benchmark is bugged but is bugged for everyone!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [email protected], [email protected]
> 
> Cheers
> 
> Occamrazor
> 
> Edit: Same clocks and mem +300, +500, +700 (crashed halfway @+900MHz); only a mere 3-4 fps increase, drawing 242W!


I didn't know. I've been running this little benchmark for a while now and it seems fine. I'm curious, how is it bugged?

Also, here's mine. Again, something is not right; that's an abnormally large FPS difference.

2088/5700MHz. That's my highest, btw. Usually it's ~151FPS. Could it be my CPU/RAM?



If anyone with a 6700/[email protected]+ can post their fps @1080P/DX12/Very High preset/SMAA that would be great.

Edit:

Below are my game/NVCP settings. Is everything ok, or did I mess something up?


----------



## Derek1

Quote:


> Originally Posted by *OccamRazor*
> 
> G-sync on is known to produce lower scores on benchmarks! Try turning it off and re-run the bench!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


A good suggestion, but unfortunately I can't turn that off.
My PB287 doesn't have G-Sync.
I went to the Nvidia CP and there is no tab for G-Sync; the only reference in Global Settings is for Vertical Sync, and it is off.
I don't know why 3DMark says I have it enabled, as even in the demo version I am running I can't turn it on, and it is off by default.
And yet in the results, under Settings, you can see it says I have it enabled.

http://www.3dmark.com/3dm/16137996


----------



## Radox-0

Quote:


> Originally Posted by *Derek1*
> 
> A good suggestion unfortunately I can't turn that off.
> My PB287 doesn't have G-sync.
> I went to the Nvidia cp and there is no tab for G-sync and the only reference in Global settings is for vertical sync and it is off.
> I do not know why 3d mark says I have it enabled as even in the demo version I am running I do not have the ability to turn that on and it is off by default.
> And yet in the results under settings you can see it says I have it enabled.
> 
> http://www.3dmark.com/3dm/16137996


Your scores seem to suggest it's okay, however. They're about in the right ballpark; in fact, they look to be leading the pack of similar systems, so nothing seems off. If V-Sync were actually making an impact, the graphics score would be much lower.


----------



## Derek1

Quote:


> Originally Posted by *Radox-0*
> 
> Your scores seem to suggest its okay however. Seem about in the right ballpark, in fact look to be leading the pack of similar systems mostly so nothing seems off. If V-Sync was actually making an impact, the score would be much lower on the graphics score.


Thanks for the response.

Just ran FS on the new Nvidia driver 375.95 and broke through 25k, finally!
But it's invalid, lol: "driver not supported" and time inconsistencies. I'll try another run, though the driver thing I can't fix.
http://www.3dmark.com/3dm/16149762

ETA: Got rid of the timing problem, but it's still not supporting the new driver. 25k held.
http://www.3dmark.com/3dm/16150061


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> I didn't know. I have been doing this little benchmark for a while now and seems ok. I'm curious, how is it bugged?
> 
> Also here's mine. Again something is not right. That's an abnormally large fps difference?
> 
> 2088/5700Mhz. That's my highest btw. Usually is ~151FPS. Could it be my cpu/ram?
> 
> 
> 
> If anyone with a 6700/[email protected]+ can post their fps @1080P/DX12/Very High preset/SMAA that would be great.
> 
> Edit:
> 
> Below are my Game/NVCP settings. Is everything ok or did I mess something?


Its 1080p, CPU comes into play, i still have the old faithfull [email protected]!


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> Its 1080p, CPU comes into play, i still have the old faithfull [email protected]!


Yeah, that explains it.









I just run the bench at those settings so I can have a reference base when testing the cards.

Ivy! Tbh I had my most cpu OC fun with my 3770k. Pure joy!


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> It's the same block. EVGA has been using the same board layout for the Classy forever.
> 
> The blocks I purchased for my 980ti were the 780ti Classy blocks. Even said 780 right on the block itself.
> 
> That's actually pretty common.


good okay thanks

What is that (this picture)? \/ \/ \/ (seen it on a previous post by max883)


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> Yeah, that explains it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just run the bench at those settings so I can have a reference base when testing the cards.
> 
> Ivy! Tbh I had my most cpu OC fun with my 3770k. Pure joy!


True! Massive OC out of the box, tons of tweaking fun, and still awesome performance today (game-wise, of course!).
I game on tri-monitor 27" @3240x1920 (my old tri-VG278HE, 144Hz) and don't feel the need to upgrade in this department.
Maybe now, after this "Zen" hype, when prices go down (if ever...).

Cheers

Occamrazor


----------



## chiknnwatrmln

Quote:


> Originally Posted by *OccamRazor*
> 
> Its 1080p, CPU comes into play, i still have the old faithfull [email protected]!


Good to see another user rocking an Ivy processor with a 1080!









The only thing I'm really missing out on is a nice NVMe SSD. By the time I upgrade they should be cheap.


----------



## OccamRazor

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Good to see another user rocking an Ivy processor with a 1080!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only thing I'm really missing out on is a nice NVMe SSD. By the time I upgrade they should be cheap.


Yap! Still rockin' and delidded too!

The SSD is the most sensible speed upgrade nowadays! It cuts gaming load times, sometimes by more than half (depending on your space-time awareness; if you use a stopwatch, maybe more than that!)









Cheers

Occamrazor


----------



## chiknnwatrmln

Quote:


> Originally Posted by *OccamRazor*
> 
> Yap! Still rockin´and delidded too!
> 
> 
> 
> 
> 
> 
> 
> 
> The SSD is the most sensible speed upgrade nowadays! It cuts gaming load times, sometimes by more than half (depending on your space-time awareness; if you use a stopwatch, maybe more than that!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


I was super sweaty and nervous when I delidded mine back in 2013! I currently have an 850 Evo and an 840 SSD; they're plenty fast, with a combined 1.25TB, but they don't even compare to NVMe.

Meanwhile I can't even break 25k because of PWR throttling..
http://www.3dmark.com/3dm/16154713?

But this is pretty nice : http://www.userbenchmark.com/UserRun/2103334


----------



## OccamRazor

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I was super sweaty and nervous when I delidded mine back in 2013! I currently have an 850 Evo and a 840 SSD, they're plenty fast with a combined 1.25TB but they don't even compare to NVMe.
> 
> Meanwhile I can't even break 25k because of *PWR throttling*..
> http://www.3dmark.com/3dm/16154713?
> 
> But this is pretty nice : http://www.userbenchmark.com/UserRun/2103334


Have you tried the T4 bios? It removes the power and thermal limits!

(Is it just me, or does the T4 name remind anyone else of the T-virus from Resident Evil? Hmm... maybe it's a world domination plot: all 1080 cards with the T-bios ("birus") will turn into zombie card eaters!)

Cheers

Occamrazor


----------



## chiknnwatrmln

Quote:


> Originally Posted by *OccamRazor*
> 
> Have you tried the T4 bios? It removes the power and thermal limits!
> 
> (Is it just me, or does the T4 name remind anyone else of the T-virus from Resident Evil? Hmm... maybe it's a world domination plot: all 1080 cards with the T-bios ("birus") will turn into zombie card eaters!)
> 
> Cheers
> 
> Occamrazor


Does the T4 bios allow for idle downclocking? I mean it's not super important but it's something I'd like to have. My card maxes at 45c so temperature is not a problem.

Regardless I'll probably give it a shot next week during the holiday.


----------



## nrpeyton

i love this thread - actually my favourite thread


----------



## Coopiklaani

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Does the T4 bios allow for idle downclocking? I mean it's not super important but it's something I'd like to have. My card maxes at 45c so temperature is not a problem.
> 
> Regardless I'll probably give it a shot next week during the holiday.


It does downclock while idle. It's the best bios for the GTX 1080 if you don't mind losing one DP port.
I'm on the T4 bios and running my card @2201MHz / 1.125v, 24/7, without any problems!
http://www.3dmark.com/tsst/18774


----------



## OccamRazor

Quote:


> Originally Posted by *Coopiklaani*
> 
> It does downclock while idle. It's the best bios for GTX1080 if you don't mind losing one DP port.
> I'm on T4 bios and running my card @*2201MHz 1.125v* 24/7 without any problem!
> http://www.3dmark.com/tsst/18774


Lucky you...









Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Does the T4 bios allow for idle downclocking? I mean it's not super important but it's something I'd like to have. My card maxes at 45c so temperature is not a problem.
> 
> Regardless I'll probably give it a shot next week during the holiday.


Like Coopiklaani said, it downclocks just fine. Give it a shot: stable clocks and voltage, no jumping up and down. It's worth it!








Quote:


> Originally Posted by *nrpeyton*
> 
> i love this thread - actually *my favourite thread*


Indeed my Friend, indeed!









Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> It does downclock while idle. It's the best bios for GTX1080 if you don't mind losing one DP port.
> I'm on T4 bios and running my card @2201MHz 1.125v 24/7 without any problem!
> http://www.3dmark.com/tsst/18774


Your 1080 under water mate?

Quote:


> Originally Posted by *OccamRazor*
> 
> Have you tried the T4 bios? It removes the power and thermal limits!
> 
> (Is it just me, or does the T4 name remind anyone else of the T-virus from Resident Evil? Hmm... maybe it's a world domination plot: all 1080 cards with the T-bios ("birus") will turn into zombie card eaters!)
> 
> Cheers
> 
> Occamrazor


lol


----------



## Cozmo85

What does the T4 bios change exactly?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Your 1080 under water mate?
> 
> lol


Yap, an EK-FC block and a custom loop. Mine is just an EVGA 1080 SC, which has an FE PCB. The 5-phase VRM isn't holding anything back. The mem only goes to +550 without losing performance. I wish I could tweak the mem voltage like the Classys do.


----------



## Coopiklaani

Quote:


> Originally Posted by *Cozmo85*
> 
> What does the T4 bios change exactly?


It removes the Power Limit, i.e., no TDP cap.
It also has, by default, an upper voltage cap of 1.2v. All other cards have a voltage cap of 1.062v @0% and 1.093v @100%.


----------



## xartic1

Quote:


> Originally Posted by *Coopiklaani*
> 
> Yap, EKFC and custom loop. Mine is just EVGA 1080 SC, which has a FE PCB. 5 Phases VRM is not holding anything back. The mem only goes to +550 without losing performance. I wish I could tweak the mem voltage like Classy's do,


Is the Classy the only card whose memory voltage people have been able to adjust?

I can bump my HOF's memory voltage, core voltage, and overvoltage percentage. But when I set my core voltage to, say, 1.063, the power it draws from the wall goes up almost 50% and it heats up instantly, with absolutely no performance or voltage difference.

I can get to 1.093 on the core, stable, with no issues.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Yap, EKFC and custom loop. Mine is just EVGA 1080 SC, which has a FE PCB. 5 Phases VRM is not holding anything back. The mem only goes to +550 without losing performance. I wish I could tweak the mem voltage like Classy's do,


Quote:


> Originally Posted by *xartic1*
> 
> Is the classy the only card that people have been able to adjust memory voltages?
> 
> I can bump my HOF memory voltages, core voltage and over voltage percent. When I set my core voltage, to say 1.063 then amount of power it uses from the wall goes up almost 50%, heats up instantly and absolutely no performance or voltage differences.
> 
> I can get to 1.093 on the core stable with no issues.


Tonight I am doing EXTENSIVE *memory voltage* benchmarking on my Classified 1080.

I intend to post a comprehensive list of my results tonight (very soon) for everyone's viewing pleasure 

Coopiklaani, you are still getting temps up to 50c with that EK block?! My max temps are 50-55c on AIR at an ambient of 20c.

I'm seriously starting to think about getting Conductonaut liquid metal when my block arrives.

I read a post further back somewhere; someone was claiming a 5c difference between load and idle with Conductonaut.

The block is full copper with nickel plating...


----------



## Fixxxer696

Quote:


> Originally Posted by *nrpeyton*
> 
> good okay thanks
> 
> What is that (this picture)? \/ \/ \/ (seen it on a previous post by max883)



That would be the Node 202, the case I used to build my PC. Got a Founder's in that bad boy.


----------



## nrpeyton

Quote:


> Originally Posted by *Fixxxer696*
> 
> That would be the Node 202, the case I used to build my PC. Got a Founder's in that bad boy.

It looks so small with gigantic fans on the top, lol... I never even realised it was a PC. I actually thought it was maybe some high-tech, special prototype cooling 'time machine', lol.

Or maybe, at the other end of the spectrum, a console...


----------



## Fixxxer696

One of my co-workers told me about this case; he dubbed it "The Console Killer" because of its size. Once I looked it up, I decided that would be my next rig. I like taking it to my lady friend's house; whilst she's passed out I game like a mofo. Overclocking isn't too great, 200/500 on air on a cool night, but I was willing to sacrifice for the sleekness of the case.


----------



## nrpeyton

Quote:


> Originally Posted by *Fixxxer696*
> 
> One of my co-workers told me about this case, he dubbed it "The Console Killer" because of its size. Once I looked it up, I decided that would be my next rig. I like taking it to my lady friend's house, whilst she's passed out I game like a mofo. Over clocking isn't too great, 200/500 on air on a cool night, but I was willing to sacrifice for the sleekness of the case.


makes sense I suppose, I used to try and make my PC "blend" into the surroundings more when I had a girl up, lol.. unfortunately its not that easy to hide lol

still benching away on my other machine btw.

got to 925 on mem at 1.5v before crashes started happening... full story coming soon.


----------



## Fixxxer696

Quote:


> Originally Posted by *nrpeyton*
> 
> makes sense I suppose, I used to try and make my PC "blend" into the surroundings more when I had a bird up, lol.. unfortunately its not that easy to hide lol


Ha! I'm going to have to start using that term on the homegirl, thanks for that bro! But yeah, it does blend in nicely with an entertainment center. I always get surprised remarks when people come over and I use it as a media center. "That's a computer??? I thought it was a cable box!" Always good for a laugh.


----------



## nrpeyton

lol, i edited the post -- when i re-read it back to myself i wasn't sure

doing 2nd run on stock volts, wish it would hurry up and crash now so i can do the write up. don't really care about stock but i figure it will show as a good comparison


----------



## emperium85

I have flashed the T4 bios to my Hybrid-cooled Founders Edition (max 35c) and it will not go over 1.1v; speed is 2126MHz. Is there a way I can push it a little harder?


----------



## OccamRazor

Quote:


> Originally Posted by *emperium85*
> 
> I have flashed the T4 bios to my Hybrid cooled founders edition (max. 35c) and it will not go over 1,1v , speed is 2126Mhz, is there a way I can push it a little harder ?




Select the voltage node and press CTRL + L to set the voltage!








But that node MUST be at a higher clock than all the others or it won't work!

Cheers

Occamrazor


----------



## emperium85

Quote:


> Originally Posted by *OccamRazor*
> 
> 
> 
> Select the voltage node and press CTRL + L to set the voltage!
> 
> 
> 
> 
> 
> 
> 
> 
> But that node MUST be at a higher clock than all the others or it wont work!
> 
> Cheers
> 
> Occamrazor


Thnx!!
Uhm... I couldn't open the window you get; we are using the same version. How do I get there?
CTRL+L changed something; then I applied 100% on the voltage control, and now I'm still getting 1.1v in GPU-Z, but it will now run Valley at 2151MHz. I don't get it.

EDIT:
Found it, it is CTRL + F.


----------



## Derek1

What version of nvflash do I need, and where can I get it?
As well as the T4, please.

There seem to be a few to choose from for each, and I'm not sure which is best.

Thanks.


----------



## OccamRazor

Double post!


----------



## OccamRazor

Quote:


> Originally Posted by *Derek1*
> 
> What version of and where can I get nvflash?
> As well as the T4 please.
> 
> There seems to be a few to choose from for each and not sure which is best.
> 
> Thanks.


Here you go:

NvflashplusT4.zip 1304k .zip file


I usually put the nvflash folder in c:\ ; it's easier to get to using the cmd prompt!

_nvflash -6 xxx.rom_

Don't forget to open the command prompt with Admin privileges!









Cheers

Occamrazor


----------



## pfinch

TimeSpy: 7838
http://www.3dmark.com/3dm/16173597?
What do you think?

I'm getting lower scores above 2075MHz and/or graphical glitches in TimeSpy.
Heaven (4K), ROTTR, Witcher 3, etc. still work fine with 21xx MHz on the core.
So I think TimeSpy is, at least for me, the only benchmark that proves 100% stability.

- Zotac AMP Extreme 1080 / T4 BIOS 2075 // 5580 Mem
- I7 6700k 4.8Ghz
- 32 GB 3466 CL16 1T
- Asrock OCF Z170


----------



## Derek1

Quote:


> Originally Posted by *OccamRazor*
> 
> Here you go:
> 
> NvflashplusT4.zip 1304k .zip file
> 
> 
> I usually put the nvflash folder in c:\ ; it's easier to get to using the cmd prompt!
> 
> _nvflash -6 xxx.rom_
> 
> Don't forget to open the command prompt with Admin privileges!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Thanks


----------



## emperium85

For my FE the T4 bios was unstable and the scores were the same. I'm back on the original bios, waiting for the real deal.


----------



## nrpeyton

*MEMORY BENCHMARKING - EVGA 1080 CLASSIFIED*

*Settings Used:*

*Firestrike Ultra Graphics 1 & 2 (validations available for all 56 scores below)*

*Power:* 130% (320w max)
*Core V:* 0%
*Core Clk:* +126 (2126-2139) _<--- maximum 3dmark validation I can get is 2202mhz (unstable)._
*Fan:* 100%

*STOCK (1.37v)*

5628 +650
5656 +650
5581 +675
5565 +675
5572 +675
5644 +700
5772 +725
5792 +725
5649 +725
5672 +750
5759 +775
5775 +775
5660 +800
5657 +800
5762 +825
5796 +825
5684 +850
5695 +850
5811 +875
5826 +875
5702 +900
5727 +900
5806 +925
5838 +925
5756 +950
5736 +950 (artifacts)

*VOLTAGE TOOL (1.5v)* _<-- voltage verified with multimeter_

5606 +650
5724 +675
5756 +675
5679 +700
5643 +700
5759 +725
5735 +725
5662 +750
5659 +750
5806 +775
5760 +775
5775 +775
5668 +800
5660 +800
5646 +800
5807 +825
5768 +825
5708 +850
5704 +850
5717 +850
5800 +875
5791 +875
5798 +875
5699 +900
5688 +900
5853 +925
5848 +925
3568 +925
5733 +950
#### +950 (crash)

*VOLTAGE TOOL (1.43v)*

5813 +925
5750 +950 (artifacts)
.........a little more work to do tonight.....
..............
....
..

Anyone see a pattern??!?!?!
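Averaging the runs per offset makes the pattern easier to spot than eyeballing the raw list. A quick Python sketch using the stock-voltage (1.37v) scores above (just a reading aid for the numbers already posted, nothing more):

```python
# Average the stock (1.37v) Firestrike Ultra graphics scores per memory offset.
from collections import defaultdict
from statistics import mean

# (offset, score) pairs copied from the 1.37v list above.
stock_runs = [
    (650, 5628), (650, 5656), (675, 5581), (675, 5565), (675, 5572),
    (700, 5644), (725, 5772), (725, 5792), (725, 5649), (750, 5672),
    (775, 5759), (775, 5775), (800, 5660), (800, 5657), (825, 5762),
    (825, 5796), (850, 5684), (850, 5695), (875, 5811), (875, 5826),
    (900, 5702), (900, 5727), (925, 5806), (925, 5838), (950, 5756),
    (950, 5736),
]

by_offset = defaultdict(list)
for offset, score in stock_runs:
    by_offset[offset].append(score)

# Mean score per offset, sorted by offset.
averages = {off: mean(scores) for off, scores in sorted(by_offset.items())}
for off, avg in averages.items():
    print(f"+{off}: {avg:.1f}")

print("best average offset:", max(averages, key=averages.get))  # +925 on this data
```

On these numbers the +925 offset has the highest average, which matches the "climbs until right before the crash" trend.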


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday 21st - 23rd 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

November Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## nexxusty

Quote:


> Originally Posted by *Coopiklaani*
> 
> A month after the power mod with Liquid Metal Pro. NOT A SAFE MOD!
> The solder under the shunt resistor got totally dissolved. Is RMA still possible?


Of course it's not safe.

It wasn't from the get-go. It's liquid metal.

Solder or nothing. I wouldn't have even considered this. Thought you were all crazy for doing so.


----------



## nrpeyton

Quote:


> Originally Posted by *nrpeyton*
> 
> *MEMORY BENCHMARKING - EVGA 1080 CLASSIFIED*
> 
> *Settings Used:*
> 
> *Firestrike Ultra Graphics 1 & 2 (validations available for all 56 scores below)*
> 
> *Power:* 130% (320w max)
> *Core V:* 0%
> *Core Clk:* +126 (2126-2139) _<--- maximum 3dmark validation I can get is 2202mhz (unstable)._
> *Fan:* 100%
> 
> *STOCK (1.37v)*
> 
> 5628 +650
> 5656 +650
> 5581 +675
> 5565 +675
> 5572 +675
> 5644 +700
> 5772 +725
> 5792 +725
> 5649 +725
> 5672 +750
> 5759 +775
> 5775 +775
> 5660 +800
> 5657 +800
> 5762 +825
> 5796 +825
> 5684 +850
> 5695 +850
> 5811 +875
> 5826 +875
> 5702 +900
> 5727 +900
> 5806 +925
> 5838 +925
> 5756 +950
> 5736 +950 (artifacts)
> 
> *VOLTAGE TOOL (1.5v)* _<-- voltage verified with multimeter_
> 
> 5606 +650
> 5724 +675
> 5756 +675
> 5679 +700
> 5643 +700
> 5759 +725
> 5735 +725
> 5662 +750
> 5659 +750
> 5806 +775
> 5760 +775
> 5775 +775
> 5668 +800
> 5660 +800
> 5646 +800
> 5807 +825
> 5768 +825
> 5708 +850
> 5704 +850
> 5717 +850
> 5800 +875
> 5791 +875
> 5798 +875
> 5699 +900
> 5688 +900
> 5853 +925
> 5848 +925
> 3568 +925
> 5733 +950
> #### +950 (crash)
> 
> *VOLTAGE TOOL (1.43v)*
> 
> 5813 +925
> 5750 +950 (artifacts)
> .........a little more work to do tonight.....
> ..............
> ....
> ..
> 
> Anyone see a pattern??!?!?!


-right click and select "open in new tab" for full size


What's interesting when you put it into visual form: the trend clearly shows scores actually *continue* to climb until *right before* the *crashing offset* (crashing being at +950 and the highest-scoring at +925).

P.S. Don't pay too much attention to the results between +675MHz and +700MHz at 1.5v (I missed a few benches last night and had to re-do them today), so slight differences in room temperature or other *different day* factors might have skewed those numbers slightly -- *only +675 to +700MHz at 1.5v (blue) is affected*.

Edit:
It's a real pity there isn't software that could run automatic benchmarks and change Afterburner offsets as it goes (imagine leaving it on overnight and coming back to all your numbers in the morning). I did 2 benches for each offset (or 3 when I got an unexpected result). Obviously, the more benches you do, the more accurate your averages.
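The overnight-sweep idea is scriptable in principle. Here's a minimal Python sketch of the loop; `set_mem_offset` and `run_benchmark` are hypothetical stand-ins for whatever your tools actually expose (neither Afterburner nor 3DMark ships an official scripting API, so the hooks here are assumptions, demonstrated with stubs):

```python
from statistics import mean
from typing import Callable, Dict, List, Optional

def sweep_offsets(
    offsets: List[int],
    set_mem_offset: Callable[[int], None],       # hypothetical: apply a mem offset
    run_benchmark: Callable[[], Optional[int]],  # hypothetical: one run, None = crash
    runs_per_offset: int = 2,
) -> Dict[int, float]:
    """Average a few benchmark runs at each memory offset, stopping on a crash."""
    averages: Dict[int, float] = {}
    for off in offsets:
        set_mem_offset(off)
        scores = []
        for _ in range(runs_per_offset):
            score = run_benchmark()
            if score is None:   # treat None as a crash and end the sweep there
                return averages
            scores.append(score)
        averages[off] = mean(scores)
    return averages

# Dry run with stubs (a real setup would shell out to the actual tools):
applied = []
def fake_set(off):
    applied.append(off)
def fake_bench():
    off = applied[-1]
    return None if off >= 950 else 5000 + off  # pretend +950 crashes

result = sweep_offsets([875, 900, 925, 950], fake_set, fake_bench)
print(result)
```

The stub "crashes" at +950, so the sweep returns averages for +875/+900/+925 only, mirroring how a real overnight run would end.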


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> -right click and select "open new tab" for full size
> 
> 
> Whats interesting when you put it into visual form -- the trend clearly shows scores actually *continue* to climb until *right before* the *crashing offset*. (Crashing being at +950 and highest scoring being +925).
> 
> P.S. don't pay too much attention between +675mhz and +700mhz at 1.5v (I missed a few benches last night and had to re-do them today) so any slight differences in room temperature or other *different day* factors I forgot about might have corrupted numbers slightly -- *but only for +675 to +700mhz at 1.5v (blue) is affected* ).
> 
> Edit:
> Its a real pitty there isn't some software that could initiate automatic benchmarks and change afterburner offsets as it goes (imagine being able to leave it on over night and come back to all your numbers in the morning). I done 2 benches for each offset (or 3 when I got an unexpected result). Obviously the more benches you do the more accurate your "averages"


Yes, offsets ending in 5 score higher than offsets ending in 0.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Yes, offsets ending in 5 score higher than offsets ending in 0.


yeah, I noticed that too... the *highest* scoring offsets seem to go up in increments of 50MHz (725, 775, 825, 875, 925 all score higher), plus there's still a *slight* trend where the "5-ending" offsets closest to 925 win.

Personally, I think I'll now be running either +925 at 1.5v or +875 at 1.37v (stock).

Moreover, it wasn't until I actually put it into graphical form that I noticed I *do indeed* score higher at +925 on average. Before, I always assumed my max was around 550-650.


----------



## Dragonsyph

Ya i got higher scores with +995 instead of +1000.

26,404 FS score with +995.


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ya i got higher scores with +995 instead of +1000.
> 
> 26,404 FS score with +995.


hmm, very interesting. I would love to know *why*... I mean, the science behind it 

That's an amazing score, btw; my old AMD FX processor is holding me back, I think. I struggle to get over 23k on the FS basic test. I can compensate somewhat by only running the graphics tests, but that only helps so much.


----------



## Derek1

A new high.
Gained 200 on Graphics score and 150 over all by increasing Mem offset from 850 to 855.

http://www.3dmark.com/3dm/16181975


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> A new high.
> Gained 200 on Graphics score and 150 over all by increasing Mem offset from 850 to 855.
> 
> http://www.3dmark.com/3dm/16181975


nice score mate


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> A new high.
> Gained 200 on Graphics score and 150 over all by increasing Mem offset from 850 to 855.
> 
> http://www.3dmark.com/3dm/16181975


Woooot.


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm very interesting I would love to *why*... I mean the science behind that
> 
> that is an amazing score btw; my old AMD FX processor is holding me back I think. I struggle to get over 23k on the FS basic test. I can compensate so far... by only running the graphics tests.. but only helps so much.....


8((( Ya, the FX CPU is holding you back for sure. I'm guessing your GPU usage is not at 99-100% the entire time?

Even with my CPU, in TimeSpy the GPU usage goes as low as 60%, which is why I don't like that test at all.

----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> 8((( Ya fx cpu is holding you back for sure. Im guessing your GPu usage is not 99-100 entire time?
> 
> Even with my CPU in Timespy the gpu usage goes low as 60% which is why i dont like that test at all.


*FIRESTRIKE BASIC (GRAPHICS 1 & 2 ONLY)*


not sure if that matters; I think the *wait times* are so short they're unmeasurable -- but millions of these over the course of the entire run still add up to a much, much lower score :-(

you can see that even with graphics test 2 being less CPU-intensive, the GPU also dips to 95/96%

but neither is *ever* maxed out

does that mean it's poorly optimised drivers or bad code?


----------



## SirCanealot

Talking of memory overclocking: I opened GPU-Z on my new GTX 1080 the other day for the first time, and it says I have Micron memory. On a Founder's Edition. I think I get around +325 on the memory before performance goes down. Therefore I hate you all

Edit: Please ignore, I am really stupid. I was thinking of the GTX 1070s with Micron GDDR5 ^_^


----------



## OccamRazor

Quote:


> Originally Posted by *SirCanealot*
> 
> Talking of memory overclocking, I opened GPUZ on my new GTX 1080 the other day for the first time and it says I have Micron memory. On a Founder's Edition. I think I get around +325 on the memory before performance goes down. Therefore I hate you all


Only Micron makes GDDR5X!
And before you "hate us all", fiddle with it a bit more; it might surprise you...

Have you tried the T4 bios already?
I posted a zip file a few posts back with it and nvflash!









Cheers

Occamrazor


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> A new high.
> Gained 200 on Graphics score and 150 over all by increasing Mem offset from 850 to 855.
> 
> http://www.3dmark.com/3dm/16181975


Now that's what I was talking about. Nice score, mate!









I still don't get how you guys get such high memory OCs. I tried again here, and the max I could get without black-screen flashes (millisecond-long screen flashes) on graphics test 2 is +600MHz on the memory.

That plus a 2114MHz core gets me 25400 graphics points with "only" the stock 1.062V, though. So I'm ok with that.

----------



## juniordnz

doubled, sry


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Now that's what I was talking about. Nice score, mate!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still don't get how you guys can get those high memory OC. I tried again here and the max I could get without some black screen flash (those are like milisecond screen flashes) on graphics test 2 is +600mhz on memory.
> 
> That plus 2114mhz Core gets me 25400 graphics points with "only" stock 1.062V though. So I'm ok with that.


I think it is the November batch of chips/cards. Both dragon and I have cards from about the same time, and we both can get that. My original card couldn't do 600 either, and that was an August card.


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SirCanealot*
> 
> Talking of memory overclocking, I opened GPUZ on my new GTX 1080 the other day for the first time and it says I have Micron memory. On a Founder's Edition. I think I get around +325 on the memory before performance goes down. Therefore I hate you all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only Micron makes GDDR5X!
> And before you "hate us all
> 
> 
> 
> 
> 
> 
> 
> , fiddle a bit more with it, it might surprise you...
> 
> 
> 
> 
> 
> 
> 
> 
> Have you tried the T4 bios already?
> I posted a zip file a few posts back with it and the nvflash!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor

I have an AMP! Edition here that barely does anything above 2025-2037 / +300-320 before performance drops or crashes start happening.

But after much trial and error with curve-tweak OCing, I managed to take it from 138-140FPS (1080p/DX12/SMAA/VH) in ROTTR to 146-148FPS.

Which was close to my "better" cards and satisfying enough. And performance in FC4/GTA 5 wasn't far off either; maybe a couple of fps less.

But still a lot better than before.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> I think it is the November batch of chips/cards. Both me and dragon have cards from about the same time and we both can get that. My original card couldn't do 600 either and that was n August card.


Its that hybrid blood running through our cards that make the memory OC so savage. /wink /wink


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> -right click and select "open new tab" for full size
> 
> 
> Whats interesting when you put it into visual form -- the trend clearly shows scores actually *continue* to climb until *right before* the *crashing offset*. (Crashing being at +950 and highest scoring being +925).
> 
> P.S. don't pay too much attention between +675mhz and +700mhz at 1.5v (I missed a few benches last night and had to re-do them today) so any slight differences in room temperature or other *different day* factors I forgot about might have corrupted numbers slightly -- *but only for +675 to +700mhz at 1.5v (blue) is affected* ).
> 
> Edit:
> Its a real pitty there isn't some software that could initiate automatic benchmarks and change afterburner offsets as it goes (imagine being able to leave it on over night and come back to all your numbers in the morning). I done 2 benches for each offset (or 3 when I got an unexpected result). Obviously the more benches you do the more accurate your "averages"


Hmm, very interesting. My card hits the ECC barrier between +575 and +600, a 300-point drop in Time Spy. Then I need another 200 on the memory to compensate for the ECC loss, so I get roughly the same score at +575 and +775. From +775 onward, all the way up to +900, I can see a small performance increase until ECC fails, at which point I get artifacts/crashes.


----------



## nrpeyton

Quote:


> Originally Posted by *Koniakki*
> 
> I have an AMP! Edition here that barely does anything above 2025-2037/+300-320 before performance drops or crashes start happening.
> 
> But after many trial and errors with curve tweaking OC I manage to take it from 138-140FPS(1080p/DX12/SMAA/VH) in ROTTR to 146-148FPS.
> 
> Which was close to my "better" cards and satisfying enough. And performance with FC4/GTA 5 wasn't far off too. Maybe a couple/few fps less.
> 
> But still a lot better than before.


Very interesting mate; the new "pascal curve" O/C'ing method is still a mystery.

For example:

*Traditional O/C Method*
a +175 offset on an *1860 boost card* = 2189mhz *this will crash EVERY time.*

---however---:

*Pascal "New Curve O/C method":*
a +100 from voltages 850mv --> to 1043mv
_then_
+150 from 1050mv --> 1075mv
_then_ +175 from 1081mv --> 1093mv *DOESN'T CRASH.*

If *under* the temp & power limits, cards in *BOTH* O/C'ing methods will boost to the highest possible voltage & corresponding clock; in most cases _(in this example)_ *2189mhz at 1093mv*.

*HOWEVER, the traditional method crashes where the "curve" method is stable.*

To stress this again: the *actual* active frequencies and *actual* active voltages are EXACTLY the same. Yet the curve method is stable and the traditional method IS NOT.

This contradicts all previous knowledge and experience. We would absolutely LOVE to learn the scientific reason behind this.

It is a mystery.

We would definitely love to hear any theories, and people's experiences: *what works and what doesn't* with curve O/C'ing.

Together, maybe we can unravel this mystery.
Nick


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Hmm, very interesting. My card hits the ECC barrier between +575 to +600, a 300 marks drop in TS. Then I need another 200 on memory to compensate for the ECC lose. So I have the roughly the same score for +575 and +775. From +775 onward all the way up to +900, I can see a small performance increase until ECC fails where I get artifacts/crashes.


ECC?


----------



## Radox-0

Quote:


> Originally Posted by *nrpeyton*
> 
> ECC?


Error-correcting code.

On that note, I really think I lost the lottery lol. I have 5 cards and none of them get past +500 MHz on the memory without taking a hit in performance. Now that's unlucky.


----------



## Vellinious

None of the 1080s I've had in my possession would do more than 575 on the memory without taking a performance hit.


----------



## nrpeyton

Quote:


> Originally Posted by *Radox-0*
> 
> Error Correcting code.
> 
> On that note I really think I lost the lottery lol. Have 5 cards and none of them bust past 500 Mhz on the memory without taking a hit in performance, now that's unlucky


Is there a way I can "tell" if my card is hitting ECC?

It seems like that would be a much better way of attaining one's maximum overclock, for both memory and core, if we simply knew when the correction began kicking in. Instead of running hundreds of benches and having to compare different "averages" all the time.

A simpler way to put it: if I run *enough* benches, I could eventually get a higher score (*on a lower overclock*) purely through run-to-run variance, even when ECC *wasn't* kicking in.


----------



## Radox-0

Quote:


> Originally Posted by *nrpeyton*
> 
> Is there a way I can "tell" if my card is hitting ECC?
> 
> Seems like that would be a much better way of attaining one's maximum overclock for both memory and core if we simply knew when that code begun kicking in.
> 
> Instead of running 100's of benches and having to compete with different "averages" all the time.


Indeed, it varies from card to card, so you just need to find it manually. I still haven't found it for all my cards, and running them in SLI I can't be faffed dialling it in exactly for those last few bits of performance.

If it helps, I normally started at +500 MHz, then went +/- 20 MHz either side to see if there was a drop or increase in performance in a single custom Fire Strike GPU test, then dialled in the setting accordingly. Took about 10 tests, but I had a fair idea of where things drop off.

Someone else may have a more creative / easier method however.
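That probing routine can be sketched as a small script. Everything here is hypothetical: `score()` is a stand-in for a real benchmark run, and the performance cliff at +590 is invented for illustration.

```python
# A sketch of the probing routine described above, with a fake score()
# standing in for a Fire Strike run (the cliff at +590 is made up).

def score(mem_offset):
    """Pretend benchmark: scales with memory clock until retries kick in."""
    base = 20000 + mem_offset * 2
    return base - 700 if mem_offset > 590 else base

def find_cliff(lo, hi, step=20):
    """Walk up in `step` MHz increments until the score drops, then return
    the last offset that still gained performance."""
    best, best_score = lo, score(lo)
    for offset in range(lo + step, hi + 1, step):
        s = score(offset)
        if s < best_score:
            return best          # score fell off a cliff: previous step wins
        best, best_score = offset, s
    return best

print(find_cliff(500, 700))      # -> 580 with the fake cliff at +590
```

In practice each `score()` call is a multi-minute benchmark, which is why narrowing the window by +/- 20 MHz probes, as described above, keeps it to around ten runs.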


----------



## bloot

2101/11520 http://www.3dmark.com/spy/757548

2101/11920 http://www.3dmark.com/spy/757234

Above 11950 MHz it crashes, though not in Unigine.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Very interesting mate; the new "pascal curve" O/C'ing method is still a mystery.
> 
> For example:
> 
> *Traditional O/C Method*
> a +175 offset on an *1860 boost card* = 2189mhz *this will crash EVERY time.*
> 
> ---however---:
> 
> *Pascal "New Curve O/C method":*
> a +100 from voltages 850mv --> to 1043mv
> _then_
> +150 from 1050mv --> 1075mv
> _then_ +175 from 1081mv --> 1093mv *DOESN'T CRASH.*
> 
> If *under* temp & power limits cards In *BOTH* O/C'ing methods will boost to the highest possible voltage & corresponding clock. In most cases _(in this example)_ being *2189mhz at 1093mv*.
> 
> *HOWEVER the Traditional Method crashes when the "Curve" method is stable.*
> 
> To stress the again; *actual* active frequencies and *actual* active voltages are EXACTLY the same. However the Curve method is stable but the traditional method IS NOT.
> 
> This contradicts all previous knowledge and experience. We would absolutely LOVE to learn the scientific reason behind this.
> 
> It is a mystery.
> 
> We would definitely love to hear any theories, and people's experiences: *what works and what doesn't* with curve O/C'ing.
> 
> Together, maybe we can unravel this mystery.
> 
> Nick


It is not a mystery: the offset method allows a certain degree of "play" in voltages. For example, an offset of +200 on the core allows 2088MHz @1.093v, @1.081v or even @1.075v, depending on temperature and power. Your card may run 2088MHz @1.093v with no problem and crash @1.081v.

A steep curve essentially locks the voltage to the high side of the cliff. I use a curve with a large gap, from +200 to +300 in one voltage step; this forces the card to run at a fixed voltage for a given frequency when loaded.
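That explanation can be sketched in a few lines. This is purely illustrative (the voltage/clock numbers and curve shape are invented, not read from any real card): a flat offset leaves one target clock reachable at several voltage steps, while a steep custom curve pins it to a single step.

```python
# Sketch (hypothetical numbers): a flat offset lets one clock run at
# several voltage steps, while a steep custom curve pins it to one step.

# Each entry is (voltage in mV, clock in MHz) in a Pascal-style V/F table.
stock_curve = [(1043, 1988), (1050, 2000), (1062, 2012), (1075, 2025),
               (1081, 2038), (1093, 2050)]

def apply_offset(curve, offset):
    """Traditional method: shift every point up by the same offset."""
    return [(v, clk + offset) for v, clk in curve]

def steep_curve(curve, target_clock, min_voltage):
    """Curve method: clamp every step below min_voltage to a low clock, so
    the target clock only ever exists at the high-voltage steps."""
    return [(v, target_clock if v >= min_voltage else curve[0][1])
            for v, _ in curve]

offset = apply_offset(stock_curve, 150)
# With the offset, ~2175 MHz is hosted by several voltage steps,
# so boost may run it at any of them:
voltages_hosting_2175 = [v for v, clk in offset if clk >= 2175]

pinned = steep_curve(stock_curve, 2175, 1093)
# With the steep curve, 2175 MHz exists only at the 1093 mV step:
voltages_for_2175 = [v for v, clk in pinned if clk == 2175]

print(voltages_hosting_2175)   # several candidate voltages
print(voltages_for_2175)       # only [1093]
```

This matches the observation above: with the offset, the same clock may transiently run at a voltage step that isn't stable for it; the steep curve removes those lower-voltage candidates entirely.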


----------



## Dragonsyph

I found my card will only boost to a certain core clock with anything from +140 to +160; any number in between only boosts to 2164, and on rare occasions hits 2177. So by putting the +core as LOW as I could while it still boosted to the same clock (2164), I was able to get it more stable. Even though +160 would still boost to 2164, I would sometimes see black flickers.

So 2164 at 1.09V

+995 memory at stock memory voltage.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Is there a way I can "tell" if my card is hitting ECC?
> 
> Seems like that would be a much better way of attaining one's maximum overclock for both memory and core if we simply knew when that code begun kicking in.
> 
> Instead of running 100's of benches and having to compare different "averages" all the time.
> 
> A simpler way to put it:
> If I run *enough* benches I would eventually (at some point) still get a higher score (*on a lower overclock*) even if ECC *wasn't* kicking in


ECC is more like a safeguard to ensure the operation of the card in the event of vram bit corruption.
In normal usage, bit corruption happens very rarely. It can be caused by any EM interference, solar flare, power surge and unstable vram.

ECC also has two functions: detect and correct.
ECC can detect whether a bit corruption (vram error) has occurred or not.
ECC will also check whether the error is correctable. If it is, ECC will then stall the operation of the vram to correct the bit, which results in some performance loss.
If the error is not correctable, you simply get artifacts or a card crash.
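For anyone curious what "detect and correct" looks like mechanically, here is a toy single-error-correcting code, a Hamming(7,4) sketch. To be clear, this is not how GDDR5X's link protection actually works internally; it only illustrates the detect/correct split and why the fix-up step costs extra work.

```python
# Illustrative only: a tiny Hamming(7,4) code showing the two roles the post
# describes -- detecting a flipped bit, then correcting it at a cost.
# GDDR5X's real scheme differs; this just demonstrates the principle.

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):
    """c: list of 7 bits -> (data bits, corrected position or 0)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3        # syndrome = 1-based error position
    if pos:
        c = c[:]
        c[pos - 1] ^= 1               # the "stall and fix" step
    return [c[2], c[4], c[5], c[6]], pos

word = encode([1, 0, 1, 1])
word[4] ^= 1                          # simulate one corrupted vram bit
data, err = decode(word)
print(data, err)                      # recovers [1, 0, 1, 1], flags position 5
```

Detection is just the syndrome check; correction is the extra branch that rewrites the bit, which is the part that shows up as the performance loss described above. Two flipped bits would defeat this toy code, analogous to the uncorrectable case that ends in artifacts or a crash.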


----------



## Bal3Wolf

Figured I'd ask in here: I'm looking into getting a 1080 with a full-cover block, and I've found a few that come with one so I don't have to mess with doing it myself. A list of the ones I found is below; do you guys have any of these, or any info on how well they cool and overclock? I'm favoring the EVGA or Gigabyte. I likely won't buy one till Jan when I get my tax return, hoping prices might drop some by then too.

EVGA GeForce GTX 1080 FTW HydroCopper GAMING

GIGABYTE GeForce GTX 1080 8GB GV-N1080XTREMEWB-8GD Xtreme Gaming WATERFORCE WB

MSI GeForce GTX 1080 DirectX 12 GTX 1080 SEA HAWK EK X 8GB

ZOTAC GeForce GTX 1080 ArcticStorm


----------



## SirCanealot

Quote:


> Originally Posted by *OccamRazor*
> 
> Only Micron makes GDDR5X!
> And before you "hate us all", fiddle a bit more with it, it might surprise you...
> 
> Have you tried the T4 bios already?
> I posted a zip file a few posts back with it and the nvflash!
> 
> Cheers
> 
> Occamrazor


Oh my, please ignore me, I'm being an idiot. I was thinking of the GTX 1070s with normal GDDR5 memory.

I have an Accelero Xtreme IV sitting at home, and once I can be bothered to put that on and settle it down I'm going to try the T4 bios. And then start working on 'the curve' like nrpeyton has been ¬_¬

Regarding the reason curve overclocking is more stable, I can only theorise that certain voltage steps are less stable than others, so when the card starts jumping around between voltages it ends up crashing. Whereas with the curve, we know every step is stable.

I had this issue with my GTX 980 before I put an Accelero on it and flashed the BIOS: it was stable at around 1500mhz, but a game like Witcher 3, which pushes the power and really makes the card move around different voltage steps, crashed it moving between them. Whereas if I could have it sit at 1500mhz and not move, it was stable.
Edit: The end result was that I had to pull down to around 1450mhz or lower for Witcher 3, which is what pushed me to unlock the power limits in the bios! :O


----------



## OccamRazor

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Figure id ask in here im looking into getting a 1080 with a full cover block iv found a few that come with them so i dont have to mess with doing it myself a list below of ones i found do you guys have any of these or have any info on how well they cool and overclock. Im favoring the evga or gigabyte i wont buy one likely till jan when i get my tax return hoping prices might drop some by then also.
> 
> EVGA GeForce GTX 1080 FTW HydroCopper GAMING
> 
> GIGABYTE GeForce GTX 1080 8GB GV-N1080XTREMEWB-8GD Xtreme Gaming WATERFORCE WB
> 
> *MSI GeForce GTX 1080 DirectX 12 GTX 1080 SEA HAWK EK X 8GB*
> 
> ZOTAC GeForce GTX 1080 ArcticStorm


(*Sorry, just realised now you wanted info about the EK version, while my card is the AIO version...*)
Temps never go beyond 45C, and this is with 1.18V/1.20V at an ambient temp of 25C; the VRMs never go beyond 60C, measured with an IR gun (set a moderate fan profile in AB and that's enough). (I removed the backplate though, as it was blocking airflow and creating hot-air pockets at higher voltages.) Out of the box it goes up to [email protected]; it's fun to tweak with the T4 bios. The Corsair fan does its job and isn't too loud, but I changed it for a push/pull fan setup and temps dropped by 3C. These cards really need cold to get the best out of them.

My brother Skyn3t and I got our cards at $600 each, used, still with warranty. It was a good deal!

Quote:


> Originally Posted by *SirCanealot*
> 
> Oh my, please ignore me, I'm being an idiot. I was thinking of the GTX 1070s with normal GDDR5 memory
> 
> I have an Accelero Xtreme IV sitting at home and once I can be bothered to put that on and settle it down I'm going to try the T4 bios. And then start working on 'the curve' like nrpeyton has been working on ¬_¬
> 
> Regarding the reason curve overclocking is more stable, I can only theorise that certain steps are less stable than others so when the card starts jumping around to different voltages it ends up crashing. Where as with the curve we know every step is stable.
> 
> I had this issue with my GTX 980 before I put an Accelero on it and flashed the BIOS - it was stable at around 1500mhz, but with a game like Witcher 3 that pushes the power and really makes the card start moving arounds different voltage steps it crashes moving between them. Whereas, if I could have it sit at 1500mhz and not move, it was stable.
> Edit: The end result of this was that I had to pull down to around 1450mhz or lower with Witcher 3, which is what pushed me to unlock the power limits on the bios! :O


Nope! This is *OCN*! We *DO NOT* ignore our fellow members, users, forum dwellers or lurkers! It's our duty to help whoever and whenever help is requested! I said this in the past and I will say it as many times as needed!
This is a forum: "_a meeting or medium where ideas and views on a particular issue can be exchanged_". So, people not accepting this... well, it's not the place for them at all. Heck! Some don't even accept this is an OC place; I thought the name would give it away...

The algorithm that governs the heat/voltage/power/frequency relationship is conservative by nature, so a lower, heat-dependent voltage/power/frequency is always set by the drivers/bios. Like Coopiklaani said: _"offset method allows certain degree of "play" in voltages"_. So controlling the other variables (increasing the TDP limit, lowering heat by increasing cooling effectiveness) will not trigger the limitations and will allow for a higher OC!

But... (there is always a but, right?)

Having a higher clock does not mean you will have higher performance, even if it's benchmark- or game-stable. This happens with both core and memory OC. Unfortunately, there are lots of physical and software limitations on this card that can trigger a low p-state, while the readings software gets from the INA3221 (IC sensor) are not reported as what they really are; but you can see it in lower scores and lower framerates! Always check performance and do not rely on core and memory numbers. TiN (you all know him, right?) wrote: _"Key point is to see if performance goes up, not just the clocks"_

Cheers

Occamrazor


----------



## Derek1

Quote:


> Originally Posted by *OccamRazor*
> 
> Temps never go beyond 45C and this is with 1,18V/1.20V, ambient temp of 25C, VRM´s never go beyond 60C, measured by a IR gun
> (set a moderate fan profile in AB and that´s enough)
> (removed the backplate though as it was blocking air flow and generating hot air pockets with higher voltages)
> out of the box goes up to [email protected], its fun to tweak with the T4 bios,
> the corsair fan does it job and its not too loud but i changed it for a push/pull fan setup and temps dropped by 3C,
> These cards really need cold to get the best out of them,
> 
> My Brother Skyn3t and me got our cards at 600$ each, used, still with warranty, was a good deal!
> 
> Nop! this is *OCN*! We *DO NOT* ignore our fellow members, users, forum dwellers or lurkers! Its our duty to help whoever and whenever help is requested! I said this in the past and i will say it again many times as needed!
> This is a forum: " _meeting or medium where ideas and views on a particular issue can be exchanged_" So, people not accepting this, well, its not the place for them at all... Heck! Some dont even accept this is a OC place, i thought the name would give it away...
> 
> The algorithm that rules the polynome heat/voltage/power/frequency is conservative by nature, so, lower voltage/power/frequency, heat dependant is always set by drivers/bios; like Coopiklaani said: _"offset method allows certain degree of "play" in voltages"_ so, controlling the other variables (increasing the TDP limit, lowering the heat by increasing the cooling effectiveness, will not trigger the limitations and allow for a higher OC!
> 
> But... (there is always a but right?)
> 
> Having a higher clock does not mean you will have a higher performance even though its benchmark or game stable, this happens both on core an memory OC, unfortunately there are lots of physical and software limitations set on this card that can trigger a low p-state while the readings all software can get from INA3221 (IC sensor) are not reported correctly as what really are, but you can see that on lower scores and lower framerates! Always check on performance and do not rely on core and memory numbers, TiN (you all know him right?) wrote: _"Key point to see if performance goes up, not just the clocks"_
> 
> Cheers
> 
> Occamrazor


Are you referring to the Seahawk Corsair AIO?
Or the EK version? Which doesn't come with rad/fan.
Because I think the OP is inquiring about Waterblock version for a custom loop.


----------



## OccamRazor

Quote:


> Originally Posted by *Derek1*
> 
> Are you referring to the Seahawk Corsair AIO?
> Or the EK version? Which doesn't come with rad/fan.
> Because I think the OP is inquiring about Waterblock version for a custom loop.


You are right! My bad! I just realised that now; I re-read the post and saw the EK there!


----------



## Derek1

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Figure id ask in here im looking into getting a 1080 with a full cover block iv found a few that come with them so i dont have to mess with doing it myself a list below of ones i found do you guys have any of these or have any info on how well they cool and overclock. Im favoring the evga or gigabyte i wont buy one likely till jan when i get my tax return hoping prices might drop some by then also.
> 
> EVGA GeForce GTX 1080 FTW HydroCopper GAMING
> 
> GIGABYTE GeForce GTX 1080 8GB GV-N1080XTREMEWB-8GD Xtreme Gaming WATERFORCE WB
> 
> MSI GeForce GTX 1080 DirectX 12 GTX 1080 SEA HAWK EK X 8GB
> 
> ZOTAC GeForce GTX 1080 ArcticStorm


All things being equal, the only two considerations here are, first and foremost, tech support/customer service, in which case EVGA is the way to go; and second, which card you think looks better.
From a performance standpoint, my impression is that all 1080s perform more or less the same regardless of brand.
And as far as cooling goes, you would have to look into the failure rates of each brand's block, which takes you back to point one above.
My FTW Hybrid-conversion AIO idles at 28C. I would expect at least that from a full-cover block. Depending on the bench, it will stay under 50C or max out at 55-57C.


----------



## Vellinious

My FTWs, with a 20C ambient temp, will idle around 23-24C and peak out at 37C. Mileage on the peak temps will vary depending on radiator size, water flow, coolant type, fan type, fan configuration and fan speed.


----------



## ucode

@nrpeyton

1080s do not have ECC. They do have EDC, which does not correct but retransmits bad blocks. However, what most people are experiencing with the sharp drop-off is likely a re-training effect where timings are changed. An easy way to see this is that after the drop-off, gains start increasing again, not decreasing. Also, the 1080 is buggy, and half the memory settings may give lower performance than they should. After making a small change and getting a little less performance, try putting the PC to sleep then waking it to regain that lost performance.


----------



## grimboso

Time for me to beat the 25k mark with a validated score!

http://www.3dmark.com/3dm/16129011


----------



## Darkboomhoney

The new Powerlink !!

Game-stable with the Classified controller: +160 MHz = 2164 MHz core and +750 memory.
http://www.3dmark.com/fs/10835941
Now waiting for the Alphacool waterblock to go higher...


----------



## Vaesauce

Hello folks. Just wanted to say that I decided to join the 1080 craze. I previously had a GTX 1070, but at $450 for a brand new Zotac AMP, I couldn't resist and pulled the trigger. Just sold my 1070 for $430, so boom! $20 difference made lol.

That being said, I had actually heard bad reviews about the AMP Edition's cooling solution. Not seeing the issue here myself. Getting a little coil whine, but I'm not going to lose sleep over it; it's not loud enough to make me want to /wrists.


----------



## Koniakki

Quote:


> Originally Posted by *Vaesauce*
> 
> Helllo folks. Just wanted to say that I decided to join the 1080 craze. I previously had a GTX1070 but for for a brand new Zotac AMP, I couldn't resist and pulled the trigger. Just sold my 1070 for 430$ so boom! difference made lol.
> 
> That being said, I actually heard bad reviews about the AMP Edition with it's cooling solution. Not seeing the issue here for me, Getting a little coil whine but I'm not going to lose sleep over it and not loud enough to make me want to /wrists.


That's a sweet deal! Congrats!

Also nothing wrong with the AMP! Edition. Sweet card; I have one right here too, and I have to admit that although it is running about 10-12'[email protected]% fan speed vs some other cards, it is also the quietest of them all.

And I mean the quietest like comparing a hybrid's noise vs a V8, even at 100% fan(rev) speed.

Edit: Okay, now I re-read it, maybe I went a bit overboard with that V8 reference (let's say a 4-cyl), but y'all get the point.


----------



## Vaesauce

Quote:


> Originally Posted by *Koniakki*
> 
> That's a sweet deal! Congrats!
> 
> Also nothing wrong with the AMP! Edition. Sweet card but I have one too right here and I have to admit that although is running about 10-12'[email protected]% fan speed vs some other cards, is also the quietest of them all.
> 
> And I mean the quietest like, comparing a hybrid's noise vs a V8 even at 100% fan(rev) speed.


Thanks man! I'm hyped!

I do a lot of gaming only on 1080P and I figured the 1080 wasn't necessary but since I do a lot of Video Editing and Photo Editing too, might as well save whatever seconds I can.

The Zotac AMP! Edition is definitely a very quiet card. I've already set a Fan Profile for it to hit 100% speed and can barely hear it. I thought it was going to be louder than my Gigabyte Xtreme Gaming 1070 (Which was SUPER QUIET), but surprisingly, it's just as good. If not, better.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> ECC is more like a safeguard to ensure the operation of the card in the event of vram bit corruption.
> In normal usage, bit corruption happens very rarely. It can be caused by any EM interference, solar flare, power surge and unstable vram.
> 
> ECC also has two functions, detect and correct.
> ECC can detect if a bit corruption (vram error) has occurred or not.
> ECC will also check whether this error is correctable or not. If the error is correctable, ECC will than stall the operation of the vram to correct the bit. This results in some performance loss.
> If the error is not correctable, you simply get artifact or card crash.


Thanks, you are truly a world of information lol. Already rep+1'd you on a previous post.

There's not enough of that here; people very rarely hit the rep button on this forum. TechPowerUp is different, but over there it's OVER-USED, so I think we still need a good balance.

Anyway, to my point lol: is there a way to switch off ECC?

Or to be able to tell when it kicks in? If that were possible, it would save hours upon hours of time wasted on benchmarking.

On the CPU it's easy: you simply run Prime95, which already knows the answer to a calculation but still makes the CPU calculate it; if the answers don't match, you get a warning or error, and you know you're unstable.

With a GPU, you have to waste hours benching and comparing performance scores (which can be affected by so many other factors), meaning you can never be truly sure whether you have REALLY found your max stable, or are playing it a bit safe, or are going a bit overboard.

A great example of this: if you push your memory HARD enough (+925) and get a few lucky runs, the EXTRA PUSH is *enough* to outrun the ECC loss and still give you a *slightly* higher score. Yet you could seriously be doing long-term damage to your card's memory.

Also, does ECC kick in when you overclock the core too hard? Or is there a different term for that "kind" of error correction?

One thing I would love to understand is why on my old GTX 980 I always got artifacts (warning me I was approaching my max O/C), but with my 1080 Classified I NEVER get artifacts with core overclocking. It either runs or crashes.

Nick

Quote:


> Originally Posted by *Dragonsyph*
> 
> I found my card will only boost to a certain core with anything from +140 to +160, any number in between those only boosts to 2164 and on rare occasions hits 2177, so by putting the +core LOW as i could while it still boosted to the same clock (2164) i was able to get it more stable. Even though +160 would still boost to 2164 i would some times see black flickers.
> 
> So 2164 at 1.09V
> +995 memory at stock memory voltage.


You've definitely got that core up since a few weeks ago; I remember you saying your core was weak and your memory great.

My max stable is normally 2151, and I used to be able to scrape a few "lucky" runs at 2176, which sounds very similar to what you're getting.

Also, it was -2C outside yesterday; I opened all my windows and got my ambient down to about 13C indoors (I sat with a jacket and scarf on inside lol), and core temps never exceeded 40-45C, and I was then able to get continuous, uninterrupted runs at 2202mhz with no crashing.

Seems these cards love to be under 40C. I can't wait to get her under water.

If my GPU had been an EVGA HydroCopper / MSI Seahawk, I'm quite sure I'd have thought of myself as "one of the lucky ones", as I'd be getting 2202 out of the box.

Imagine what it's going to do when I get my water chiller in future (although that won't be until after xmas)

Quote:


> Originally Posted by *Darkboomhoney*
> 
> The new Powerlink !!
> Gamestable with classified Controller 160+mhz = 2164 Mhz and 750+ Memory
> http://www.3dmark.com/fs/10835941
> now waiting for the Alphacool Waterblock to go higher.....


How are you getting on with the controller, mate? I assume you saw my post asking Kingpin for it, and he answered our call.


----------



## Derpinheimer

Quote:


> Originally Posted by *nrpeyton*
> 
> Is there a way I can "tell" if my card is hitting ECC?
> 
> Seems like that would be a much better way of attaining one's maximum overclock for both memory and core if we simply knew when that code begun kicking in.
> 
> Instead of running 100's of benches and having to compare different "averages" all the time.
> 
> A simpler way to put it:
> If I run *enough* benches I would eventually (at some point) still get a higher score (*on a lower overclock*) even if ECC *wasn't* kicking in


I just run a memory-intensive game that keeps rendering while paused and watch the FPS. Since nothing in the scene is changing, any FPS change should be from the overclock. I ended up at +575 from this method; beyond it, the framerate dropped, slowly. No need for hours of wasted time.


----------



## nrpeyton

Quote:


> Originally Posted by *Derpinheimer*
> 
> I just run a memory intensive game that renders in pause and watch the FPS. Since nothing is changing, any FPS change should be from the overclock. I ended up with +575 from this method. Beyond this the framerate dropped, slowly. No need for hours of wasted time.


Hmm, I tried that on The Witcher 3 before, but I wasn't really sure it was reliable: I was getting higher FPS at +495, then it would drop 3 FPS at +500, then I'd gain 3 FPS again going to +525. Then it would drop AGAIN going to +550. The same would happen all the way up to +925.

You can actually see that happening visually even in 3DMark Fire Strike, on the graph I made (took me over 50 benches and over 12 hours).


----------



## Derpinheimer

Quote:


> Originally Posted by *nrpeyton*
> 
> How are you getting on with the controller mate? I assume you seen my post asking Kingpin for it and he answered our calling
> hmm I tried that on "the witcher 3" before but I wasn't really sure it was reliable, I was getting higher FPS on +495 then it would drop 3FPS at +500 then I'd gain 3fps again going to 525. Then it would drop AGAIN going to 550.
> 
> You can actually see that happening visually on this graph I made too (took me over 50 benches and 24hours to create this):


Right, just keep toying with it until it stops increasing. In your example, it seems to keep going up and then crash.

So, it's possible I stopped too early. But I have tried a few games (Metro being the easiest to test), and get the same peaks. I never got artifacts, though... so again, I may have stopped too early. Now you've got me wondering


----------



## nrpeyton

Quote:


> Originally Posted by *Derpinheimer*
> 
> false
> Right, just keep toying with it until it stops increasing. In your example, it seems to keep going up and then crash.
> 
> So, it's possible I stopped too early. But I have tried a few games (Metro being the easiest to test), and get the same peaks. I never got artifacts, though... so again, I may have stopped too early. Now you've got me wondering


I used to think my max stable was about 550-650 until a few nights ago when I did that... if you read back a few pages you'll find my original post (can't miss it coz the graph stands out a mile).

However, a +925 memory overclock might be SO FAST that even if it is in fact error-correcting (or re-sending blocks, as someone pointed out a minute ago, EDC or something), it could actually be *overtaking* the performance hit (*while still having to error-correct*).

What we really need is to be able to get our GPUs to perform a calculation and then test that calculation, like we can do on our CPUs.

I'm going to have a look around for some software; there must be something floating about somewhere.
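The "perform a calculation, then test the calculation" idea can at least be modelled in a few lines. This is a plain-Python mock-up, not GPU code: a real VRAM tester (memtestG80-style tools work this way) would write and read the patterns in device memory, and the `flip_chance` parameter here is just a stand-in for an unstable memory overclock.

```python
# Pure-Python model of the "calculate, then check the answer" idea applied
# to memory: write known patterns, read back, count mismatches. A real GPU
# tester performs the same write/verify loop in VRAM instead of a list.

import random

def pattern_test(buffer_size, flip_chance=0.0, seed=0):
    """Write alternating bit patterns into a fake buffer, read them back,
    and count errors. flip_chance simulates unstable memory."""
    rng = random.Random(seed)
    errors = 0
    for pattern in (0x55, 0xAA, 0x00, 0xFF):
        buf = [pattern] * buffer_size                # "write" phase
        for i in range(buffer_size):                 # simulated corruption
            if rng.random() < flip_chance:
                buf[i] ^= 1 << rng.randrange(8)      # flip one random bit
        errors += sum(1 for b in buf if b != pattern)  # "read & verify"
    return errors

print(pattern_test(4096))    # stable memory: 0 errors, like a clean P95 run
```

The appeal over benchmarking is the same as with Prime95: a single detected mismatch is a definitive fail, with no averaging of scores required.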


----------



## Derpinheimer

Quote:


> Originally Posted by *nrpeyton*
> 
> I used to think my max stable was about 550-650 until a few nights ago when I did that... if you read back a few pages you'll find my original post (can't miss it coz the graph stands out a mile).
> 
> However a +925 memory overclock might be SO FAST that even if it is in fact error-correcting (or re-sending frame) as someone pointed out a minute ago ECD or something.... it could actually be *over-taking* the performance hit. (*and actually still having to error-correct*
> 
> What we really need is to be able to get our GPU's to perform a calculation then test that calculation. Like we can do on our CPU's.
> 
> I'm going to have a look around for some software, there must be something floating about somewhere.


Completely rewrote post since I keep editing it:

At 595 I get the highest framerate, and at 596 it spikes down about 5%. By 1000, it recovers almost all the losses (it seems to be a linear increase from ~600 to 1000) but is still slower than 595.

I also do not see that values ending in 5 or 25 perform any better than random numbers.


----------



## Derek1

Quote:


> Originally Posted by *Derpinheimer*
> 
> Completely rewrote post since I keep editing it:
> 
> At 595 I get the highest framerate, and at 596 it spikes down about 5%. By 1000, it recovers almost all the losses (it seems to be a linear increase from ~600 to 1000) but is still slower than 595.
> 
> I also do not see that values ending in 5 or 25 perform any better than random numbers.


That may be true. Anything but a 0.


----------



## nrpeyton

*Regarding 1080 Classified Waterblock Support*

Just been doing some googling and found information saying that apparently the EK 980 Ti waterblock didn't actually actively cool the VRMs of the 980s.

http://forums.evga.com/GTX-980-Ti-Classified-Waterblock-m2375665.aspx

So is there still any real point in grabbing the EK block and *making* it fit, instead of just going for the Alphacool one?

After all the recent communication I've had on the issue (and also reading posts on the EVGA forums and posts from people like Vellinious on here) *I was ready to order the EK block. But now I'm not so sure.*
Can anyone who actually had a 980TI Classy and used the 780TI Classy block comment on this please?

I thought I'd finally made my mind up until I read the thread above (linked).

Here is a quote from EK_Luc on the EVGA website:
_"Since we aren't re-doing a specific block but instead using the 780 Classy block, the VRM section will still be not covered by the block like it was for the 780."_

Read this link for full info: http://forums.evga.com/GTX-980-Ti-Classified-Waterblock-m2375665.aspx (you have to scroll down a bit to get to the relevant part)

Really need some help here now.... 

Nick

Quote:


> Originally Posted by *Vellinious*


Vellinious, I just quoted your name to get your attention (I get emails when people quote me, so I hoped you would too). I believe you have the EK 780 Ti block that you used on your 980 Ti? What did you make of this? Apparently the VRM on the 980s was still only "passively" cooled with a full EK block? Which is exactly the same as Alphacool (as the centre water section makes contact with the metal on the rest of the block)... really need some help here now, lol.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *Regarding 1080 Classified Waterblock Support*
> 
> Just been doing some googling (researching online) and found information saying that apparently the EK 980TI waterblock didn't actually actively cool the VRM's of the 980's.
> 
> http://forums.evga.com/GTX-980-Ti-Classified-Waterblock-m2375665.aspx
> 
> So is there still any real point in grabbing the EK block and *making* it fit instead of just going for the alphacool one?
> 
> After all the recent communication I've had on the issue (and also reading posts on EVGA forums and posts from people like Vellinious on here) *I was ready to order the EK block.
> 
> But now I'm not so sure.
> *
> Can anyone who actually had a 980TI Classy and used the 780TI Classy block comment on this please?
> 
> I thought I'd finally made my mind up until I read the thread above (linked).
> 
> Here is a quote from EK_Luc on the EVGA website:
> _"Since we aren't re-doing a specific block but instead using the 780 Classy block, the VRM section will still be not covered by the block like it was for the 780."_
> 
> Read this link for full info: http://forums.evga.com/GTX-980-Ti-Classified-Waterblock-m2375665.aspx (you have to scroll down a bit to get to the relevant part)
> 
> Really need some help here now....
> 
> Nick
> 
> Vellinious I just quoted your name to get your attention (I get emails when people quote me so I hoped you would too) -- I believe you have the EK 780 TI block that you used on your 980 TI? What did you make of this? Apparently the VRM on the 980's was still only "passively" cooled with a full EK block? Which is exactly the same as alphacool (as the centre water section makes contact with the metal on the rest of the block). .... really need some help here now lol....


The only thing not cooled by the Classy block is the memory VRM....which doesn't get that hot anyway. The primary VRM is actively cooled. lol

I beat the hell out of my Classys and never had any issues. = P

http://forums.evga.com/EK-WaterBlocks-Discussion-Help-Question-Thread-m2029760-p13.aspx#2373459


----------



## maurokim

Hello everyone. I have a GTX 1080 Galax EXOC. I can't find a compatible waterblock for this card... is there one? Thank you, and sorry for my English.


----------



## Dragonsyph

Quote:


> Originally Posted by *maurokim*
> 
> Hello everyone .. I have a gtx 1080 galax exoc. I can not find the waterblock compatible for this vga .. There? Thank you and sorry for my English.


http://www.performance-pcs.com/bitspower-galax-geforce-gtx-1080-hof-clear-acrylic-v1-water-block-for-galax-geforce-gtx-1080-hof.html

Don't know if the HOF one will fit on an EXOC, but it's the best I could find.


----------



## DStealth

Just obtained my highest validated score at 2139/+925. The memory, it seems, scales past +534 MHz, which was the highest point in my previous research... but unfortunately all the benches I did were in the +500-600 range.








25500+ GPU score


----------



## jleslie246

Anyone looking for a low low price on a new 1080 I think I found one.

Newegg: Asus GTX1080 Turbo $559.99 after $20 rebate.


----------



## jleslie246

DARKBOOMHONEY: Just curious: why didn't you use the waterblock on the motherboard? The GPU?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> The only thing not cooled by the Classy block is the memory VRM....which doesn't get that hot anyway. The primary VRM is actively cooled. lol
> 
> I beat the hell out of my Classys and never had any issues. = P
> 
> http://forums.evga.com/EK-WaterBlocks-Discussion-Help-Question-Thread-m2029760-p13.aspx#2373459


Great, thanks for the info then; the EK rep never made it very clear in his first post. So it's only the memory VRMs that water doesn't pass over... okay, great.



BTW it's good to see all my benching work paid off; I've seen plenty of posts on here now where guys have been pushing their memory harder since I published my extensive graph/benchmark write-up.

*Took my EVGA CLASSIFIED 1080 apart* tonight *for the first time ever* (reason was *to supply EK with measurements* for EK_Grega to forward to their R&D department *to check fitting compatibility*).

EVGA never got back to me; so Grega and I agreed we'd *try* using my measurements instead.

For anyone interested in what the PCB of a 1080 Classified looks like (with 17 phase VRM) here's a look at some pictures:

There are virtually *NO* 1080 Classified PCB pictures floating around anywhere (they are very rare/hard to find)

*So here you go:*

_right click 'open in new tab' for full size_


_right click 'open in new tab' for full size_


*Anyone know how I POWER MOD this? \/ *


_right click 'open in new tab' for full size_


_right click 'open in new tab' for full size_


_right click 'open in new tab' for full size_

.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Great thanks for info then, the EK rep never made it very clear on the first post he made. So its only the memory VRM's that water doesn't pass over then... okay great.
> 
> 
> 
> BTW it's good to see all my benching work paid off; I've seen plenty of posts on here now where guys have been pushing their memories harder since I published my extensive graph/benchmark write-up.
> 
> *Took my EVGA CLASSIFIED 1080 apart* tonight *for the first time ever* (reason was *to supply EK with measurements* for EK_Grega to forward to their R&D department *to check fitting compatibility*).
> 
> EVGA never got back to me; so Grega and I agreed we'd *try* using my measurements instead.
> 
> For anyone interested in what the PCB of a 1080 Classified looks like (with 17 phase VRM) here's a look at some pictures:
> 
> There are virtually *NO* 1080 Classified PCB pictures floating around anywhere (they are very rare/hard to find)
> 
> *So here you go:*
> 
> _right click 'open in new tab' for full size_
> 
> 
> _right click 'open in new tab' for full size_
> 
> 
> *Anyone know how I POWER MOD this? \/ *
> 
> 
> _right click 'open in new tab' for full size_
> 
> 
> _right click 'open in new tab' for full size_
> 
> 
> _right click 'open in new tab' for full size_
> 
> .


I took a couple of shots of the block, and a closer shot of the corner in question. As you can see, the nut for the corner of the core is elevated slightly, so the memory block there really shouldn't even make contact, but......I think if you put a little piece of thermal pad on it, it'd be just fine.

The bottom right corner in each of these pics.

PM me, Nick. I have an idea.


----------



## nrpeyton

Hey, it's been quiet on here for the last couple of days.

Is it a holiday in the U.S. or something?

Over here in Scotland it's business as usual, lol.


----------



## Vellinious

Thanksgiving Day today. The run up to the holidays, especially on a short work week like this, makes everyone a little bit crazy.


----------



## SmackHisFace

Newest member of the club! Just ordered my GTX 1080. Got a Zotac AMP for $522 after tax in California. Can't wait for it to get here. I'll be selling my 980 Ti SLI; I'm so sick of SLI I couldn't be happier about this "downgrade".


----------



## nrpeyton

Quote:


> Originally Posted by *SmackHisFace*
> 
> Newest Member of the club! Just ordered my GTX 1080. Got a Zotac AMP for $522 after tax in California. Cant wait for it to get here. Ill be selling my 980ti SLI. Im so sick of SLI I couldn't be happier for this "downgrade".


Welcome to the club


----------



## SirCanealot

So I finally managed to get my Accelero Xtreme IV onto the card. It wasn't too bad this time, after having put an Xtreme III onto my 980 (gluing those heatsinks onto a £500 card was quite an interesting experience), but the 1080 FE has these ridiculous bolts holding the stock cooler on. What's wrong with screws? :'(
Had to get some pliers and carefully twist them all off (there's like 12 of them!).

Haven't had a chance to do much testing yet, but with the fans at 100% (they basically make no noise at all really), I think I'm looking at 50-55 degrees under max load








Don't seem to be able to push the clock offset much further, but hopefully T4+curve overclocking will help a little more. And no more hair dryer sound!!!

Think I'll flash the T4 bios in the next few days and start messing around









nrpeyton, I see you mentioned using the Futuremark stress tests before. How many runs did you do before you declared a clock stable?

And did you let the card get hot when testing?
One thing I'm worried about is that if I'm testing the low end of the curve with the fans at 100%, the card is probably going to be running at around 30-40 degrees. So a clock state that's stable at 30-40 degrees under full load may not be stable at 50-60 degrees. I'm wondering if I should test with all my fans on low to let the card heat up a bit, for more of a 'worst case scenario'. And of course, once I'm gaming normally I will have no idea which clock state it has crashed on!!!!
















I'm extremely loath to do this testing really, but considering the results you and a few others have had I think it needs to be done!


----------



## Koniakki

Quote:


> Originally Posted by *SirCanealot*
> 
> So finally managed to get my Acellero Xtreme IV onto the card. Wasn't too bad this time, after having put on an Xtreme III onto my 980 (gluing on those heatsinks onto a card was quite an interesting experience), but the 1080 FE has these ridiculous bolts to take the stock cooler off. What's wrong with screws? :'(
> Had to get some pliers and carefully twist them all off. (there's like 12 of them!)
> 
> Haven't had a chance to do much testing yet, but with the fans at 100% (they basically make no noise at all really), I think I'm looking at 50-55 degrees under max load
> 
> 
> 
> 
> 
> 
> 
> 
> Don't seem to be able to push the clock offset much further, but hopefully T4+curve overclocking will help a little more. And no more hair dryer sound!!!
> 
> Think I'll flash the T4 bios in the next few days and start messing around
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nrpeyton, I see you mentioned before using the Futuremark stress tests. How many runs did you do before you declare a clock stable?
> 
> And did you let the card get hot when testing?
> One thing I'm worried about is that if I'm testing the low end of the curve with the fans on 100% that the card is probably going to be running around 30-40 degrees, so if it were to drop to one of those clocks when under full load that is stable at 30-40 degrees, it may not be stable at 50-60 degrees. So I'm wondering if I should test with all my fans on low to allow the card to heat up a bit more for more of a 'worst case scenario'. *And of course, once I'm gaming normally I will have no idea which clock state it has crashed on*!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm extremely loath to do this testing really, but considering the results you and a few others have had I think it needs to be done!


You could have MSI Afterburner (or another utility, if possible) log your session, so you can check when/where it failed.
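Once you have a monitoring log, the last sample recorded before logging stopped tells you roughly which clock state the crash happened in. Here's a minimal sketch assuming a plain CSV log with made-up column names (MSI Afterburner's real .hml format differs; adapt the parsing to whatever your utility actually writes):

```python
import csv
import io

# Sketch: find the last logged clock/temp sample before a crash, assuming the
# monitoring log is a simple CSV. The column names and sample data below are
# assumptions for illustration, not Afterburner's actual log format.
SAMPLE_LOG = """\
time,core_clock_mhz,mem_clock_mhz,gpu_temp_c
12:00:01,1936,5505,52
12:00:02,1949,5505,53
12:00:03,1961,5505,55
"""

def last_sample_before_crash(log_text):
    rows = list(csv.DictReader(io.StringIO(log_text)))
    return rows[-1] if rows else None  # the final row is where logging stopped

last = last_sample_before_crash(SAMPLE_LOG)
print(last["core_clock_mhz"], last["gpu_temp_c"])
```

The final row is only an approximation of the crash state (logging usually stops a sample or two before the driver actually resets), but it narrows down which point on the curve to back off.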


----------



## xartic1

Quote:


> Originally Posted by *Koniakki*
> 
> You could have MSI Afterburner (or if possible with a different utility) to log your session so you can check when/where it failed.


Or if possible pick up a cheap 4:3 monitor and use it specifically for monitoring your temps and clocks.


----------



## nrpeyton

Quote:


> Originally Posted by *SirCanealot*
> 
> So finally managed to get my Acellero Xtreme IV onto the card. Wasn't too bad this time, after having put on an Xtreme III onto my 980 (gluing on those heatsinks onto a £500 card was quite an interesting experience), but the 1080 FE has these ridiculous bolts to take the stock cooler off. What's wrong with screws? :'(
> Had to get some pliers and carefully twist them all off. (there's like 12 of them!)
> 
> Haven't had a chance to do much testing yet, but with the fans at 100% (they basically make no noise at all really), I think I'm looking at 50-55 degrees under max load
> 
> 
> 
> 
> 
> 
> 
> 
> Don't seem to be able to push the clock offset much further, but hopefully T4+curve overclocking will help a little more. And no more hair dryer sound!!!
> 
> Think I'll flash the T4 bios in the next few days and start messing around
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nrpeyton, I see you mentioned before using the Futuremark stress tests. How many runs did you do before you declare a clock stable?
> 
> And did you let the card get hot when testing?
> One thing I'm worried about is that if I'm testing the low end of the curve with the fans on 100% that the card is probably going to be running around 30-40 degrees, so if it were to drop to one of those clocks when under full load that is stable at 30-40 degrees, it may not be stable at 50-60 degrees. So I'm wondering if I should test with all my fans on low to allow the card to heat up a bit more for more of a 'worst case scenario'. And of course, once I'm gaming normally I will have no idea which clock state it has crashed on!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm extremely loath to do this testing really, but considering the results you and a few others have had I think it needs to be done!


MSI Afterburner can be set up to show almost everything on-screen during any game or benchmark you are running (core clock, memory clock, voltage, power % of maximum, FPS, temperature, and if you're on Intel even your CPU temperature, plus many more).

~right click and select "open new tab" for full size~


You want to download 3DMark (Google search) and compare scores while increasing core and memory clocks as you go (one at a time). When scores begin to drop, run a few more tests to confirm, then reduce the overclock slightly until scores begin going up again.

Start with your out-of-the-box default and try +100; if it doesn't crash, try +150. Once it crashes, reduce it by about -100, then start increasing again in increments of 13 MHz per step (monitoring scores as you go). Once you think you've found your max-scoring O/C, run a few extra tests a few steps over and under that value until you're sure.

It's probably best to start with the memory first, then do the core. With memory, try increments of about 50 MHz per step. Don't use FurMark; go for 3DMark or proper benching software.

I've also just found a new program called OCCT which seems to be quite good at 'error detecting' on your GPU. Could give that a try (that's what I've just started using), although I've not had a lot of time with it yet.
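The coarse-then-fine search described above can be sketched as code. Everything here is a stand-in: `bench()` is a toy score model rather than a real 3DMark run, and `crash_at` substitutes for actually watching the driver crash.

```python
# Sketch of the coarse-then-fine offset search. bench() stands in for a real
# 3DMark run and crash_at for an observed driver crash; both are assumptions.
def find_best_offset(bench, crash_at, coarse=100, fine=13, limit=1500):
    """Coarse climb to find the crash point, then fine-scan below it for the
    best-scoring offset (scores can start dropping before the card crashes)."""
    off = 0
    while off + coarse < crash_at and off + coarse <= limit:
        off += coarse                      # coarse steps until the next would crash
    floor = max(0, off - 2 * coarse)       # back off and fine-scan a safe window
    best_off, best_score = floor, bench(floor)
    off = floor
    while off + fine < crash_at:
        off += fine                        # fine 13 MHz steps, tracking best score
        score = bench(off)
        if score > best_score:
            best_off, best_score = off, score
    return best_off, best_score

# Toy model: score rises with offset until error correction erodes it past +600.
toy_bench = lambda off: off if off <= 600 else 600 - (off - 600) * 0.5
print(find_best_offset(toy_bench, crash_at=800))
```

Note how the toy model finds its best score well below the crash point, which is the whole reason for benching each step instead of just hunting for the highest stable clock.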


----------



## emperium85

I tried many different BIOSes on my GTX 1080 FE; the best score comes with the original Dell BIOS. Don't know why, but it performs better than the T4 BIOS.

backup.zip 149k .zip file


----------



## nrpeyton

Hi,

Okay, EK doesn't officially do a full waterblock for the 1080 Classified, but I've got my hands on the EK 780 Classified waterblock *(as a few guys on the EVGA forums managed to get it to fit).*

This is the 1st time I've ever installed a block on a GPU (_so it doesn't help that the block isn't even specifically designed for my card_).

In the instructions it says:
_"EKWB recommends using small drops of thermal grease on each phase regulator (that is being covered with thermal pad) in order to even further improve the thermal performance of the water block."_

So I have three questions:

*Question 1)* Which parts are the 'phase regulators'? (I've put numbers beside each part in the picture.) Can someone please say which ones they are?

*Question 2)* Which of these parts (by number) actually gets hot (i.e. generates the heat and is actively cooled)?

*Question 3) Which ones (1, 2, 3, 4, 5 or 6) do the thermal pads go on?*

[Full size: right click >> 'open new tab']


Thanks so much

Nick Peyton


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Hi,
> 
> Okay EK doesn't officially do a full waterblock for the 1080 Classified; but I've got my hands on the EK 780 Classified waterblock *(as a few guys on EVGA forums managed to get it to fit).
> 
> *This is 1st time I've ever installed a block on a GPU (_so it doesn't help that the block isn't even specifically designed for my card_).
> 
> In the instructions it says:
> _"EKWB recommends using small drops of thermal grease on each phase regulator (that is being covered with thermal pad) in order to even further improve the thermal performance of the water block."
> 
> _So I have 2 questions:
> 
> *Question 1)* what part is the 'phase regulators'? (I've put numbers beside each part in picture) Can someone please say which one is the phase regulators?
> 
> *Question 2)* Which of these parts (with corresponding numbers) actually gets hot? (Generates the heat & is actively cooled)?
> 
> *Question 3) Which ones (1, 2, 3, 4, 5 or 6) do the thermal pads go on?*
> 
> [Full size: right click >> 'open new tab']
> 
> 
> Thanks so much
> 
> Nick Peyton


Who at EK recommended that? I think when you get the block in hand, if you follow the directions given, you'll find that completely unnecessary and be able to complete the task with relative ease.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Who at EK recommended that? I think when you get the block in hand, if you follow the directions given, you'll find that completely unnecessary and be able to complete the task with relative ease.


It says so in the manual (but it doesn't say which parts are the phase regulators).

Then JayzTwoCents, and the manual for a different model, even recommend using a very small drop of paste between the memory chips and the pads too.

Then in another video I actually watched a guy put them on what is number '3' in my photo (but if I'm doing 3, wouldn't I need to do 2? Don't the capacitors heat up too?).

However, if you follow the "diagram" in the "original 1-sheet black & white" manual that comes with the block, it looks like they are only placed on the memory and on 4 and 5 in my photo. (4 and 5 are TINY compared to 2 and 3, so I find it hard to believe they are the only components that generate the heat.)

I don't want to muck this up, because I am going to be lucky if all the components even make contact (and I will have no way of checking once the block is on). There's also no way to check VRM temperature, so I would never know until the card died in 2-3 weeks' time.

Once the block is on, I obviously can't see how it's touching.

(I thought I could maybe put tape or a bit of paste on first, then remove the block to *test* contact by seeing if the paste smears, or if the tape sticks and lifts off with the block. *But unless the block went down perfectly parallel on every corner*, it could touch while I'm moving it into position and then completely miss once the block is in its final position.)

I'd also kind of like to know (just for knowing) which parts actually do the heating up anyway. I think if I'm going to take apart a card and watercool it I should probably know that, lol.


----------



## charro0412

http://www.3dmark.com/3dm/16265597

MSI GeForce GTX 1080 Gaming X 8G, Grafische kaart

i7-3970x up to 4.4ghz

kingston ram 1600 up to 2.133

Rampage iv extreme

Bottleneck or not ??

Thanks so much

Charro Luft


----------



## nrpeyton

Quote:


> Originally Posted by *charro0412*
> 
> http://www.3dmark.com/3dm/16265597
> 
> MSI GeForce GTX 1080 Gaming X 8G, Grafische kaart
> 
> i7-3970x up to 4.4ghz
> 
> kingston ram 1600 up to 2.133
> 
> Rampage iv extreme
> 
> Bottleneck or not ??
> 
> Thanks so much
> 
> Charro Luft


Definitely not bottlenecked with that CPU on a 1080.


----------



## charro0412

Thank you. Then I won't upgrade, since I don't have the money for it.

Thanks so much

Charro Luft


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> it says so in the manual (but doesn't say which ones are the phase regulators)
> 
> then on jayz2cents and on the manual for a different model they even recommend using a very small drop of paste between the memory chips and the pads too.
> 
> then on another video I actually watched a guy put them on what is number '3' in my photo (then if I'm doing 3 wouldn't i need to do 2? don't the capacitors heat up too)??
> 
> however if you follow the "diagram" in the "original 1 sheet black & white" manual that comes with the block it looks like they are only placed on memory and on 4 and 5 in my photo. (4 and 5 are TINY compared to 2 and 3 so I find it hard to believe that they are the only components that generate the heat)
> 
> I don't want to muck this up, because I am going to be lucky if all the components even make contact (and will have no way of checking) after the block is on. (also no way to check VRM temperature so i would never know until card exploded in 2/3 weeks time)
> 
> Once on, can't obviously see how its touching.
> 
> (I thought I could maybe put tape or a bit of paste on first then remove it to try and *test* it by seeing if it smears (or if the tape sticks and lifts off with the block when removed), *but then unless block went down "perfectly, equally, paraellel" on every corner* it could touch while I'm moving it into position but then completely be missing it once the block is in its final position....
> 
> I'd also kind of like to know (just for knowing) which parts actually do the heating up anyway. I think if I'm going take apart a card and watercool it I should probably know that lol.


I've never used thermal paste on the memory or VRM unless the block was an aquacomputer kryographics block....they're machined to use thermal paste in place of the pads.


----------



## Dragonsyph

Quote:


> Originally Posted by *charro0412*
> 
> http://www.3dmark.com/3dm/16265597
> 
> MSI GeForce GTX 1080 Gaming X 8G, Grafische kaart
> 
> i7-3970x up to 4.4ghz
> 
> kingston ram 1600 up to 2.133
> 
> Rampage iv extreme
> 
> Bottleneck or not ??
> 
> Thanks so much
> 
> Charro Luft


Graphics score seems low for 1936 MHz.


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> Hi,
> 
> Okay EK doesn't officially do a full waterblock for the 1080 Classified; but I've got my hands on the EK 780 Classified waterblock *(as a few guys on EVGA forums managed to get it to fit).
> 
> *This is 1st time I've ever installed a block on a GPU (_so it doesn't help that the block isn't even specifically designed for my card_).
> 
> In the instructions it says:
> _"EKWB recommends using small drops of thermal grease on each phase regulator (that is being covered with thermal pad) in order to even further improve the thermal performance of the water block."
> 
> _So I have 2 questions:
> 
> *Question 1)* what part is the 'phase regulators'? (I've put numbers beside each part in picture) Can someone please say which one is the phase regulators?
> 
> *Question 2)* Which of these parts (with corresponding numbers) actually gets hot? (Generates the heat & is actively cooled)?
> 
> *Question 3) Which ones (1, 2, 3, 4, 5 or 6) do the thermal pads go on?*
> 
> [Full size: right click >> 'open new tab']
> 
> 
> Thanks so much
> 
> Nick Peyton


Here we go:

1 and 2: Capacitors

3: Chokes (inductors)

4: Synchronous MOSFETs

5: Control MOSFETs

6: Integrated drivers

By far the most heat comes from the MOSFETs and drivers!









The Classy has 14 phases; each phase has 1 integrated driver, 2 control MOSFETs, 2 synchronous MOSFETs and 1 choke/inductor (plus capacitors to smooth the current). The phases' function is to convert, regulate and provide "clean" current (DC-DC) to the core or memory (there are 3 additional phases for the memory).

Cheers

Occamrazor
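A rough back-of-envelope calculation shows why the phase count matters for heat. All the numbers here (board power, core voltage, efficiency, the reference phase count) are illustrative assumptions, not measurements of any specific card:

```python
# Back-of-envelope per-phase load, illustrating why a 14-phase core VRM runs
# each MOSFET pair much cooler than a reference-style design. All inputs are
# rough assumptions for illustration only.
def per_phase_amps(board_power_w, core_voltage_v, core_phases, vrm_efficiency=0.9):
    """Approximate current each core phase must carry."""
    core_current = (board_power_w * vrm_efficiency) / core_voltage_v
    return core_current / core_phases

ref = per_phase_amps(215, 1.05, 5)        # assumed reference-style 5-phase VRM
classy = per_phase_amps(215, 1.05, 14)    # Classified's 14-phase core VRM
print(round(ref, 1), "A/phase vs", round(classy, 1), "A/phase")
```

Since MOSFET conduction losses scale roughly with the square of the current, cutting the per-phase amps by nearly two thirds reduces the heat dissipated in each device by far more than two thirds, which is why the mostly-passively-cooled extra phases can get away with it.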


----------



## Menthol

Quote:


> Originally Posted by *nrpeyton*
> 
> it says so in the manual (but doesn't say which ones are the phase regulators)
> 
> then on jayz2cents and on the manual for a different model they even recommend using a very small drop of paste between the memory chips and the pads too.
> 
> then on another video I actually watched a guy put them on what is number '3' in my photo (then if I'm doing 3 wouldn't i need to do 2? don't the capacitors heat up too)??
> 
> however if you follow the "diagram" in the "original 1 sheet black & white" manual that comes with the block it looks like they are only placed on memory and on 4 and 5 in my photo. (4 and 5 are TINY compared to 2 and 3 so I find it hard to believe that they are the only components that generate the heat)
> 
> I don't want to muck this up, because I am going to be lucky if all the components even make contact (and will have no way of checking) after the block is on. (also no way to check VRM temperature so i would never know until card exploded in 2/3 weeks time)
> 
> Once on, can't obviously see how its touching.
> 
> (I thought I could maybe put tape or a bit of paste on first then remove it to try and *test* it by seeing if it smears (or if the tape sticks and lifts off with the block when removed), *but then unless block went down "perfectly, equally, paraellel" on every corner* it could touch while I'm moving it into position but then completely be missing it once the block is in its final position....
> 
> I'd also kind of like to know (just for knowing) which parts actually do the heating up anyway. I think if I'm going take apart a card and watercool it I should probably know that lol.


Yes, a very tiny amount of TIM on the components before the pads are installed (and even on the pads, between pad and block) can help a small amount; make sure to use non-conductive TIM. I also used this method on the RIVE VRMs when I had a block on them. I used Prolimatech PK-1, very easy to use and a very good TIM. Don't expect to see any real gain in temps; it's very small, maybe enough to keep from killing something during benching with an EVBot setting voltages to unsafe levels. But Pascal doesn't let you set voltages to that level, so I wouldn't consider the extra step of adding the TIM necessary.


----------



## nrpeyton

*Okay just received email back from EK_Grega this minute.

Going to continue discussing a work-around with Ek_Grega so watch this space 

Anyway here's the email:*



*EKWB Support (Grega)* (EKWB Support)

Nov 27, 00:27 CET

Hello Nick,

did my best with the information I had (thank you again for detailed pictures).

Here is a sketch of how the contact area goes. The memory ICs are not completely covered and there is still a potential issue with the capacitor (marked red).
To rule out this issue we would need to physically check or have an actual model of the graphics card.
As such, I can't officially confirm that the water block will fit.



Best regards,
Grega

I'm actually really upset; can anyone think of a worthwhile workaround that would justify keeping this block instead of buying the official Alphacool one?

I am off on holiday for a week (starting today) so I am so upset. Was really looking forward to the build :-(

Ta, Nick :-(


----------



## Skrillex

Yo all, just upgraded from a 680 4GB to a MSI 1080 Armour 8GB

Aaaaand wow, what a difference even just at 1080p. A new 1440p monitor is coming soon to properly stretch its legs.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *Okay just received email back from EK_Grega this minute.
> 
> I'm so disappointed :-( spent over £150 including shpping :-(
> 
> Here is the email:*
> 
> 
> 
> *EKWB Support (Grega)* (EKWB Support)
> 
> Nov 27, 00:27 CET
> 
> Hello Nick,
> 
> did my best with the information I had (thank you again for detailed pictures).
> 
> Here is a sketch of how the contact area goes. The memory ICs are not completely covered and there is still a potential issue with the capacitor (marked red).
> To rule out this issue we would need to physically check or have an actual model of the graphics card.
> As such, I can't officialy confirm that the water block will fit.
> 
> 
> 
> Best regards,
> Grega
> 
> I'm actually really upset like; anyone think of a worthwhile workaround that would justify keeping this block instead of buying the alphacool official one?
> 
> I am off on holiday for a week (starting today) so I am so upset. Was really looking forward to the build :-(
> 
> Ta, NIck :-(


I'd wait and see if it fits. The guys on the EVGA forums haven't had any issues using them, and nobody said anything about taking a dremel or anything to the block. Cover that thing up with the thermal pads to insulate it from making contact, and it should be just fine.


----------



## MonarchX

For some reason the form to register for GTX 1080 ownership doesn't accept my validation, even though it has my OCN member ID attached to it...


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'd wait and see if it fits. The guys on the EVGA forums haven't had any issues using them, and nobody said anything about taking a dremel or anything to the block. Cover that thing up with the thermal pads to insulate it from making contact, and it should be just fine.


I think the issue is that the resistor is "higher" than the memory chips.

Meaning the "step" for the memory chips on the block will not make proper contact with the memory, because it sits higher due to this resistor.

And I can't add more pads to the memory chips because the block wouldn't make contact with the GPU die.

However if you look at *the bottom* picture very closely you'll see that the "step" *might* actually miss the resistor (but only *JUST)*.

In fact if it does miss it (the side of the step will probably actually be touching the side of the resistor) which is fine -- because I'll be able to insulate it with a bit of thin electrical tape.

I wont' know for sure until I try it (other guys said its worked but how can we be sure that the memory chip beside the resistor is really making contact)? We won't until I try. Even then it will be very hard to check; because once the block is on visibility doesn't exist.

Here's the latest email I've had from EK_Grega on the matter and also a diagram I've made to demonstrate the issue:

*Anyway here is the latest which I think I am definitely going to attempt.*

_right click & 'open new tab' for full size_


*HOWEVER: The grinding may actually NOT BE REQUIRED*: If you zoom in on the 2nd (bottom) picture very closely you will see that the 'step' on the block *does indeed appear to "miss" the "side" of the resistor (marked red). But ONLY by a tiny tiny amount* (I'd be surprised if they weren't actually touching). But I could easily use thin electrical tape to insulate.

If it DOES overlap, I doubt the 'step' on the memory block would need to be ground down very much (probably less than 0.5mm would do).

However there *are indeed guys at EVGA Forums who never had to grind the step* who also spent hours checking contact with step & memory.
One has actually just PM'd me personally just to confirm this:

*AHowes:*
_I bought the 780 Ti Classified block.. it fit no problem. I did the GPU core check where you mount it and remove it to make sure it's getting solid contact, and it was. Yes, the memory part is a little off, but enough of the memory is covered, so no worries there. Did not notice that little cap in the way, as I did check everything from every angle to make sure the block was flat on the memory and making perfect contact with the provided pads.
I did this to 2 1080 classifieds and so have others without issue.. no worries!_
Allen

And also from EK
* EK_Grega:*
_Given that there are reports that users were able to install it, I do think it should go without any modifications.
It would be best to contact the user that already installed the 780 Classy water block on the 1080 Classified and let him explain if he did any modifications to the water block.

As for the installation.
I would *use 0.5mm pads on RAM* and *1mm pads on VRM* part. *Do check if there is proper contact on VRM (marked green). If not, then add 0.5mm thermal pad.*_

*P.S. EK_Grega had another look in his own spare time and much of the information I've received has been from him

This guy is amazing.*

Anyway, it's time to go and try the installation (I'll keep you all updated on my progress, specifically regarding the resistor marked in red)

Have a good day,

Nick


----------



## khemist

https://imageshack.com/i/pm3tPAo1j

Now under water.


----------



## nrpeyton

Quote:


> Originally Posted by *khemist*
> 
> Now under water.


Wow, looks good mate 
What kind of temperature differences are you getting, and what 1080 and block are you using?

*BTW - anyone think the lack of memory contact on the left memory chips (picture in my last post on previous page) will cause much of a problem for overclocking??*


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *BTW - anyone think the lack of memory contact on the left memory chips (picture in my last post on previous page) will cause much of a problem for overclocking??*


No...obviously they won't run as cool as they would if they had 100% coverage, but....really, unless you're going to be cranking the voltage up on the memory to just silly levels, I think you'll be just fine.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> No...obviously they won't run as cool as they would if they had 100% coverage, but....really, unless you're going to be cranking the voltage up on the memory to just silly levels, I think you'll be just fine.


What if I added extra pads here to help where the "step area" misses the memory? (At least it would help make contact with the rest of the block, so that part of the memory chip isn't running completely bare.)

(So instead of adding pads to all of the memory chips, just add a few to the actual *block* *here*, in the "blue marked spots" on the diagram? Or would that affect the GPU contact?)

I'm finding it difficult to visualise this (having never done this before)


----------



## khemist

Quote:


> Originally Posted by *nrpeyton*
> 
> wow looks good mate
> what kind of temperature differences are you getting and what 1080 you using and what block?


1080 FE and Heatkiller block. According to the instructions I have the outlet connected as the inlet, so I'm going to have to redo it.

Not tested yet.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> What if I added extra pads here to help where the "step area" misses the memory (at least it would help make contact with the rest of the block so that part of the memory chip isn't running completely bare?
> 
> (so instead of adding pads to all of the memory chips just add a few to the actual *block* *here* in the "blue marked spots" on diagram? Or would that affect the GPU contact?
> 
> I'm finding it difficult to visualise this (having never done this before)


I wouldn't worry about trying to do anything extra for the memory. Can't hurt trying it, but....I don't think it'll make that much of a difference.


----------



## Menthol

Plenty of people overclock cards hard with only a universal block and a fan blowing on the card to remove heat from the VRM. Of course it's better to cool with a full block for daily use, but I think you'll be just fine without further worry. Then again, I don't think you're going to let it be just fine; you're a little anal about it, which is a good thing









I still have 4 780 Ti blocks around the house, 2 EK and 2 EVGA. It's easy to sell video cards, harder to sell full cover blocks


----------



## nrpeyton

Quote:


> Originally Posted by *Menthol*
> 
> Plenty of people overclock cards hard with only a universal block and a fan blowing on the card to remove heat from the VRM, of it's better to cool with a full block for daily use but I think you'll be just fine without further worry, but I don't think your going to let it be just fine, your a little anal about it which is a good thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still have 4 780ti blocks around the house , 2 EK and 2 EVGA, easy to sell video cards, harder to sell full cover blocks


lol, you're the 2nd person who's said they had spare blocks around the house... after all the posting I've done I wish one of you had mentioned it, and I'd have bought a block from you instead of paying £150+ for a brand new one + shipping lol

*INSTALLING EK 780 TI CLASSIFIED WATERBLOCK ON 1080 CLASSIFIED*

*Anyway update:*

Block is fitted. I've also checked contact on everything by putting thermal grease on top of the pads on the memory, core and VRM, then placing (and removing) the block and seeing if there was residue on the block.

Not only was there residue lol... the block actually pulled the pads back off with it (as the paste is so sticky), so I think that's proof enough that everything is making perfect contact.

I also did a little mod by cutting 2 little 1mm-thick squares to get full memory contact (so out of the 8 chips, 6 chips have near 100% contact with 0.5mm pads, and 2 chips have 60% contact at 0.5mm and 40% contact at 1.5mm -- so arguably I have at least 95% memory contact).
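For anyone repeating this, here's a quick sanity check of that coverage arithmetic (the chip counts and percentages are just the figures quoted above, nothing measured):

```python
# Rough weighted estimate of total VRAM contact coverage,
# using the chip counts and per-chip fractions from this post.

def total_coverage(chips):
    """chips: list of (chip_count, fraction_of_surface_covered) pairs."""
    total = sum(count for count, _ in chips)
    covered = sum(count * frac for count, frac in chips)
    return covered / total

# Before the mod: 6 chips at ~100%, 2 chips at only 60% (the step misses them)
before = total_coverage([(6, 1.0), (2, 0.6)])
# After the 1mm pad squares: those 2 chips get 60% at 0.5mm + 40% at 1.5mm
after = total_coverage([(6, 1.0), (2, 0.6 + 0.4)])
print(f"{before:.0%} -> {after:.0%}")  # 90% -> 100%
```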

I've also added a little extra pad to the "drivers" on the VRM section (not even normally covered in EVGA spec) to further increase cooling on VRM area.

Once I get this connected up I am quite confident it is going to be just as good as a normal, full cover block. If not even better, lol 

*The exciting bit:*

*Fixing memory contact problem (for left two chips) - See first 3 pictures*


*2 square 1.0mm pads (included with block package) allow the step to sit "flush", with the extra pad giving 100% coverage to the left 2 RAM chips (60% coverage at 0.5mm and 40% coverage at 1.5mm for these two chips)*


*The "problem resistor", due to sitting "higher" than the memory chips, was a potential compatibility problem*


*After 0.5mm pads (included with the EK block package) were added to all memory chips, the resistor was no longer higher than the memory (removing the compatibility problem)*


_*Insulating "problem resistor" with electrical tape anyway just in case *_




*Cutting pre-perforated 0.5mm pads for memory (included in EK block package)*


*Adding extra thermal pads for "drivers" on VRM* (these heat up a lot but aren't even included in stock EVGA cooling, NOR do many water-coolers bother) - 1mm pads used (included with EK block package)
*(See the really thin strip i cut that I'm holding up before being stuck down)*\/ \/ \/


*Finished article*



*Oops :-( :-(*


/\ /\ /\ /\ Not tested in system yet (time for bed) -- will let you know how I get on with PC/motherboard installation of new EK watercooled Classified 1080 tomorrow


----------



## Vellinious

Quote:


> Originally Posted by *Menthol*
> 
> Plenty of people overclock cards hard with only a universal block and a fan blowing on the card to remove heat from the VRM, of it's better to cool with a full block for daily use but I think you'll be just fine without further worry, but I don't think your going to let it be just fine, your a little anal about it which is a good thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still have 4 780ti blocks around the house , 2 EK and 2 EVGA, easy to sell video cards, harder to sell full cover blocks


Yup. I've got 2 of them here as well. My cards didn't get sold, though. They went back to EVGA in the step up program for a couple of 1080SCs.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Yup. I've got 2 of them here as well. My cards didn't get sold, though. They went back to EVGA in the step up program for a couple of 1080SCs.


Thought you had FTW's?

Also, what do you think of my last post? lol


----------



## Vellinious

I do. I tested the SCs and then sold them to buy the FTWs.

I think it looks good. I'd run with it.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I do. I tested the SCs and then sold them to buy the FTWs.
> 
> I think it looks good. I'd run with it.


ahh, I guessed that's what you probably did lol 

How did they compare? Would be interesting to see just how an SC compares to an FTW in benchmarking and overclocking potential.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> ahh I guessed thats what you probably did lol
> 
> how did they compare? would be interesting to see just how a SC compares to an FTW in benchmarking and overclocking potential?


Someone just posted over in the Time Spy benchmark thread that they got their 1060 up to 2152MHz.
The score was lower, of course, because of the smaller memory.
But I have been noticing that Pascal clocks about the same regardless of model or brand.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Someone just posted over in the Time Spy benchmark thread that they got their 1060 up to 2152.
> Score was lower of course because of the smaller Memory.
> But I have been noticing that Pascal clocks with equivalency regardless of model or brand.


Lower cards (1070s, 1060s) will usually clock higher than 1080s due to having fewer cores (so it's easier to get stable) -- I think?

*Is one 360mm radiator enough for CPU + 1080?*

My radiator:
Dimensions: Triple 120mm *(360)*
Thickness: 38mm (*medium)*
FPI: *19* (split fin)
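By the common community rule of thumb of roughly 100W of heat per 120mm of radiator at moderate fan speeds (a heuristic, not a spec, and it varies with thickness, FPI and fan speed), you can rough out the answer like this; the wattage figures are illustrative guesses, not measurements:

```python
# Back-of-envelope radiator sizing using the ~100 W per 120 mm
# community heuristic. Component wattages below are illustrative.

WATTS_PER_120MM = 100  # heuristic; varies with rad thickness/FPI/fan speed

def radiator_headroom(rad_mm, cpu_watts, gpu_watts):
    """Positive result = spare capacity, negative = undersized."""
    capacity = (rad_mm / 120) * WATTS_PER_120MM
    load = cpu_watts + gpu_watts
    return capacity - load

# One 360 mm rad vs. a heavily overclocked CPU (~220 W?) + GTX 1080 (~240 W)
print(radiator_headroom(360, 220, 240))  # -160.0 => undersized on this heuristic
```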


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Lower cards (1070's, 1060's) will usually clock higher than 1080's due to having less cores (so easier to get it stable) --i think?


Ya, I am not sure about that to be honest.
CPUs don't necessarily follow that rule, do they?


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*


You're maybe correct, but think of it like this:

(I know this is a completely different scenario, but it might help explain my "reasoning" on it.)

RAM: the more chips you have, the less chance you have of overclocking it further, since ALL chips would have to be equally lucky in the silicon lottery. Now, the bigger a GPU chip is, the more chance that "part" of that chip isn't as "lucky" as the rest of it. I.e. the smaller it is, the more chance you've got of being lucky.

Also -- the more cores, the more it heats up, and the more it heats up, the less overclocking headroom you get.

My AMD FX-8350 can get to 5.2GHz (from a stock of 4.0GHz on a medium-spec mobo) -- that's a 1.2GHz increase.

An AMD FX-9590, on the other hand, with a higher TDP (and simply being a higher-binned chip) is usually lucky to get to 5.3GHz/5.4GHz unless on LN2 (and the 9590 is a 5GHz stock chip, or 4.7GHz with a 5.0GHz boost).

What's the stock boost on a 1060 anyway?


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> your maybe correct, but if you think of it like this:
> 
> (i know this is a completely different scenario but it might help to explain my "reasoning" on it)
> 
> RAM (the more chips you have the less chance you have of overclocking it further since ALL chips would have to be equally lucky with silicon lottery)... now the bigger a GPU chip the more chance that "part" of that chip isn't as "lucky" as the rest of it. I.E. the smaller it is the more chance you've got of being lucky.
> 
> Also - the more cores the more it heats up, the more it heats up the less overclocking headroom you get.
> 
> My AMD FX-8350 can get to 5.2GHZ (from a stock of 4.0GHZ on a medium spec mobo) --- thats a 1.2GHZ increase.
> 
> An AMD FX-9590 on the other hand with a higher TDP (and simply being a higher binned chip) is usually lucky to get to 5.3ghz / 5.4GHZ unless on LN2. (and the 9590 is a 5ghz stock chip. or 4.7ghz with 5.0ghz boost).
> 
> whats the stock boost on a 1060 anyway?


The 1060s seem to boost to between 1700 and 1860MHz, depending on brand and model.
Much like the 1080's.


----------



## nrpeyton

*Quick question for 1st-time GPU cooler guys:

Can I use the bottom port on the block for IN and the top port for OUT?

Or do both need to be on the same side:
bottom for both IN and OUT,
or
top for both IN and OUT?*


----------



## VSG

Use a port on each side, so you can do bottom left and top right, for example. Don't have both ports on the same side, as that just means the coolant will go into the inlet port and straight out the outlet port, bypassing the GPU.


----------



## nrpeyton

Quote:


> Originally Posted by *geggeg*
> 
> Use a port on each side, so you can do bottom left and top right for example. Don't have both ports on the same side as that just mean the coolant will go into the inlet port and out the outlet port bypassing the GPU.


okay thanks


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> ahh I guessed thats what you probably did lol
> 
> how did they compare? would be interesting to see just how a SC compares to an FTW in benchmarking and overclocking potential?


About the same, except that the SCs had a lower power limit, so they were bouncing the clocks around up top quite a bit more. The FTWs don't do that.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Lower cards (1070's, 1060's) will usually clock higher than 1080's due to having less cores (so easier to get it stable) --i think?
> 
> *Is one 360mm radiator enough for CPU + 1080?*
> 
> My radiator:
> Dimensions: Triple 120mm *(360)*
> Thickness: 38mm (*medium)*
> FPI: *19* (split fin)


Fewer cores = less heat, and less heat gives you more headroom. Since Pascal is so twitchy when it comes to temps, it's possible that could affect clock rates in processors with similar leakage.


----------



## nrpeyton

Argh, really wish I could use both bottom ports for this block installation; it would look so much better :-(

...and be much safer when I need to drain.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *Quick question for 1st time GPU cooler guys:
> 
> Can I use the bottom port on the block for IN and the top port for OUT?
> 
> Or does it need to be:
> bottom for IN and OUT
> and
> top for IN and OUT?*


Bottom / top doesn't matter.
Quote:


> Originally Posted by *nrpeyton*
> 
> argh really wish i could use both bottom ports for this block installation; would look so much better :-(


You can. One for in, and one for out.

To cut down on tubing, though...I've always run the "out" from the CPU to the GPU block, then out the bottom. You can even run them in parallel if you wish, though I'd advise against it.

CPU to GPU....this is how I usually do my tubing runs. Keep them short.


----------



## MonarchX

How far can the Gigabyte G1 Gaming GTX 1080 be pushed on air? Are there voltage BIOS mods that increase OC headroom?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Bottom / top doesn't matter.
> You can. One for in, and one for out.
> 
> To cut down on tubing, though...I've always run the "out" from the CPU, to the GPU block, then out the bottom. Can even run them in parallel if you wish, though, I'd advise against it.


ahh right, I'm doing it the other way around:

pump/res --> radiator --> 1080 --> CPU --> pump/res

(figured the 1080 is more temp "sensitive", so I wanted it to get the best benefit)

Will be ordering an extra new 240mm radiator on my next pay day though


----------



## nrpeyton

Quote:


> Originally Posted by *MonarchX*
> 
> How far can Gigabyte Gaming G1 GTX 1080 be pushed on air? Are voltage BIOS mods that increase OC?


There are only two options for extra voltage beyond the 1.093v NVIDIA limit on Pascal:

1. T4/STRIX BIOS (allows up to 1.2v), *IF* your card is compatible (although most are)

2. Have an EVGA 1080 Classified (and use the updated voltage tool)

Nick

Quote:


> Originally Posted by *Vellinious*
> 
> Bottom / top doesn't matter.
> You can. One for in, and one for out.
> 
> To cut down on tubing, though...I've always run the "out" from the CPU, to the GPU block, then out the bottom. Can even run them in parallel if you wish, though, I'd advise against it.
> 
> CPU to GPU....this is how I usually do my tubing runs. Keep them short.


wow is that your build? looks beautiful


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> ahh right I'm doing it the other way around.
> 
> pump / res -->> radiator --> 1080 --> CPU --> pump / res
> 
> (figured the 1080 is more temp "sensitive" so wanted it to get the best benefit.
> 
> will be ordering an extra new 240mm radiator on my next pay day though


For the first few minutes the machine is on, it'll make a difference, but.....once it's been running for a while, the loop will reach its equilibrium point, and the placement of the radiator in the loop order won't matter any more. The difference in coolant temp from the CPU to GPU is nominal....
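In rough numbers, here's why: at a typical flow rate, the coolant's temperature rise across any single block is well under 1c, so every component in the loop sees nearly the same water temperature regardless of order. (A quick sketch with illustrative wattage and flow figures, not measurements.)

```python
# Steady-state sketch: temperature rise of the coolant across one
# block is small next to the loop's overall rise above ambient,
# which is why loop order barely matters once it reaches equilibrium.

def coolant_rise(watts, flow_lpm):
    """Water temp rise (C) across one block.
    c_p of water ~ 4186 J/(kg.K); 1 L/min ~ 1/60 kg/s."""
    mass_flow = flow_lpm / 60.0  # kg/s
    return watts / (mass_flow * 4186.0)

# e.g. a ~240 W GPU at 4 L/min heats the water by less than 1 C,
# so the CPU downstream sees essentially the same coolant temp.
print(round(coolant_rise(240, 4.0), 2))  # 0.86
```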


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> For the first few minutes the machine is on, it'll make a difference, but.....once it's been running for a while, the loop will reach it's equilibrium point, and the placement of the radiator in the loop order won't matter any more. The difference in coolant temp from the CPU to GPU is nominal....


interesting 

I've got my coolant in the fridge at the moment lol; its freezing point is -3c.

So before I fill it up I'm going to stick it in the freezer until it hits about -1, then fill the machine.

Might get some AWESOME overclocking for the first 5 minutes lol... we'll see...

That's if I've not broken something on the card or shorted anything out.

I'm sweating here lol

Right -- the moment of truth is upon me.. I'm absolutely terrified here.

Wish me luck.... here goes.........


----------



## Koniakki

Quote:


> Originally Posted by *nrpeyton*
> 
> interesting
> 
> I've got my coolant in the fridge at the moment lol, its freezing point is -3c.
> 
> So before I fill it up i'm giong to stick it in the freezer until it hits about -1 then fil machine.
> 
> Might get some AWESOME overclocking for first 5 minutes lol... we'll see...
> 
> thats if i've not broken something on the card or shorted anything out.
> 
> i'm sweating here lol
> 
> right - moment of truth is upon me.. i'm absolutely terrified here.
> 
> wish me luck.... here goes.........


https://postimg.org/image/70ruk4ved/






Just kidding. Best of luck mate. It's been quite a journey, if I may say!


----------



## nrpeyton

*Everything is working perfectly!*

*780TI Classy block fitted to 1080 Classy*.

*Just hit my BEST ever score: 5912* http://www.3dmark.com/3dm/16339819 _(core didn't even drop below 2202 the entire run)_

*Core:* 2202 mhz
*Memory:* +925

*Max Temp*: 35c
*Max Power:* 266w

_right click 'open new tab' for full size_


*Also - I have been liaising with EKWB about supporting the 1080 Classified *officially*.*

EK_Grega and I have been in constant communication the whole way (my aim this entire journey has *always* been not just to get this working for myself, but for *everyone to benefit*, and for official support).

Here is a screenshot of the last email from EK_Grega last night:

_right click 'open new tab' for full size_


*In case the image is too distorted, the email reads:*

_EKWB Support (Grega) (EKWB Support)
Nov 28, 10:17 CET
Hello Nick!
You found a nice workaround for the RAM ICs so they get more contact with the water block.

I'll check it out and wait for the temperatures you get and then I'll update our cooling configurator.

Nice day to you too!
Best regards,
Grega
_

P.S. remember guys; "shy boys don't get sweets".

When I started out with the 1080 Classified we had no voltage control until I asked 'k|ngp|n' for it.
and
NO waterblock support.

We are now almost there (99%) to having *official* support for *both*. 

*GO CLASSY! *

*Also a big thanks to everyone who has helped with advice along the way.* When I started out here 2 months ago I never even knew PC water cooling existed.

Vellinious especially -- without his steer/advice this probably wouldn't have been possible  http://www.overclock.net/u/428392/vellinious
Quote:


> Originally Posted by *Vellinious*


Nick Peyton


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> interesting
> 
> I've got my coolant in the fridge at the moment lol, its freezing point is -3c.
> 
> So before I fill it up i'm giong to stick it in the freezer until it hits about -1 then fil machine.
> 
> Might get some AWESOME overclocking for first 5 minutes lol... we'll see...
> 
> thats if i've not broken something on the card or shorted anything out.
> 
> i'm sweating here lol
> 
> right - moment of truth is upon me.. i'm absolutely terrified here.
> 
> wish me luck.... here goes.........


Be very careful with that.....that can cause condensation to form, with coolant temps that low.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Be very careful with that.....that can cause condensation to form, with coolant temps that low.


system has been up and running for about 30 mins now, coolant is now at normal temp.

I was aware that might happen so I made sure everything was done really fast 

it wasn't at that "low" temp long enough for condensation to form 

definitely a good post though; wouldn't want any newbs getting smart ideas lol 

Can't wait to get my water chiller now, at least 2 or 3 pay days away I think 

All I've got left to do now is *probe the back areas of the card for temperatures* around the memory (which can get up to 85c under stock cooling) and the VRM (which, with the EVGA thermal pad issue, was hitting nearly 100c in Furmark).

Won't be completely accurate, but it will give *some* indication of how good the VRM and memory contact with the block is


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> system has been up and running for about 30 mins now, coolant is now at normal temp.
> 
> I was aware that might happen so I made sure everything was done really fast
> 
> it wasn't at that "low" temp long enough for condensation to form
> 
> definitely a good post though; wouldn't want any newbs getting smart ideas lol
> 
> Can't wait to get my water chiller now, at least 2 or 3 pay days away I think
> 
> All I've got left to do now is **probe* the back areas of the card for temperatures* around memory (which can get up to 85 under stock) and VRM which with the EVGA thermal pad issue temps on furmark were nearly up to 100c.
> 
> Won't be completely accurate but it will give *some* indication of success with VRM and memory contact success with block


Still, it is much easier to grab an FTW or even an FE card and water-cool it. All GTX 1080s perform very close in terms of overclocking regardless of how many power phases they have, as long as you can keep those phases cool.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Still, it is much easier to grab a FTW or even a FE card and watercool it. All gtx 1080 perform very close in term of overclocking regardless how many power phases it has, as long as you can keep those phases cool.


True - but when I started out with the Classified I had *no idea* I'd run into these problems.

The same goes for so many other Classified owners, who paid more -- assuming they were going to *get more* -- but didn't (well, not to start with; after a lot of hard work we finally did).

Or I did, and by documenting my journey, others now have the *option* instead of being left high and dry, disappointed, and wishing they'd got an FTW.

I remember thinking that myself at one point.


----------



## Vellinious

The ability of the Classy to control voltage will give them a leg up. Especially under water.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> The ability of the Classy to control voltage will give them a leg up. Especially under water.


When I started out with my water out of the freezer I was getting prolonged loops with max temps of 35c on the GPU (a 20c-25c drop).

Now I'm getting temps maxing at 44c on the GPU (only a 10c-15c drop), however that is with a water temp of 38c (so only a 6c difference between water and GPU), and that's *without* liquid metal (only using Thermal Grizzly).

What's holding me back now is my massive CPU overclock; combined with the average 225w-245w GPU it's just too much for my single, medium-thickness 360 radiator.

Before I added the GPU I could overclock the CPU at 1.525v and never go above 48c (CPU) playing 'The Witcher 3'. Now (at that *same* CPU voltage) the CPU temp is hitting 62c and throttling.

62c is the throttling temp for AMD CPUs and is equivalent to an Intel temp of about 82c+.

-I'm currently working on downgrading my CPU overclock to 4.7GHz (equivalent to an AMD FX-9590 at stock without "xboost" on) -- _at least until I can buy an extra radiator_

-Also doing some memory and VRM temperature probing on the 1080 (more on that later)


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> when i started out with my water out the freezer i was getting prolonged loops with max temps of 35c on GPU. (20c - 25c drop)
> 
> now i'm getting temps maxing at 44c GPU (only 10-15c drop) however that is with a water temp of 38c (so only 6c difference between water and GPU) and thats *without* lquid metal (only using thermal grizzly)
> 
> Whats holding me back now is my massive CPU overclock combined with the average 225w-245w GPU is just toooo much for my single, medium thickness 360 radiator.
> 
> Before I added the GPU I could overclock the CPU at 1.525v and never go above 48c (CPU) playing 'The Witcher 3'. Now (at that *same* CPU voltage the CPU temp is hitting 62c and throttling)
> 
> 62c is the throttling temp for AMD CPU's and is equivalent to an intel temp of about 82c +
> 
> -I'm currently working on downgrading my CPU overclock to 4.7GHZ (equivalent to an AMD FX-9590 at strock without "xboost" on) -- _at least until I can buy an extra radiator_
> 
> -Also doing some memory and VRM temperature probing on the 1080. (more on that later)


Thermal Grizzly Kryonaut is all you need.

38c coolant temps are pretty high. I typically shoot for no more than 10c above ambient for coolant temps.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Thermal Grizzly Kryonaut is all you need.
> 
> 38c coolant temps are pretty high.


Yeah, I realise that now, lol... I'm amazed at how efficient the 1080 is at transferring heat to the block (or the block is at pulling heat from the 1080) -- on the CPU it's MUCH less efficient.

That's why I bought the liquid metal (I assumed the GPU would be the *same* and I wanted the best possible temps).

Although I decided I'd make sure it worked first before taking the added risk with the liquid metal. Now I don't think I'm going to bother with it. Like you say... no need 

Must be something to do with the GPU not having a "lid" or "IHS" like CPUs do?

And yes, I can't wait to get a new radiator. EK_Grega has given me a little discount code, so I might have a look and order one now (but that would mean going into my bank overdraft), or ---- I could do the sensible thing and just wait until I get paid lol (or even go 2nd hand on eBay?).

A bit scared to put someone else's 2nd-hand radiator into my brand new EK equipment though (none of it is more than 2 months old).

OMG, when I think about it -- my EK equipment is actually almost as expensive as a 1080 (give or take 100 bux), and definitely more expensive than my mobo+CPU combined lol


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> yeah I realise that now, lol... i'm amazed at how efficient the 1080 is at transfering heat to the block (or block is at pulling heat from 1080) -- on CPU its MUCH less efficient.
> 
> thats why i bought the liquid metal (as I assumed GPU would be the *same* and i wanted best possible temps.
> 
> although I decided I'd make sure it worked first before taking the added risk with the liquid metal. now I don't think I'm going to bother with the liquid metal. like you say.. no need
> 
> must be something to do with the GPU not having a "lid" or "IHS"? like CPU's do?
> 
> And yes, I can't wait to get a new radiator.. EK_Grega has given me a little discount code so I might have a look and order one now (but that would mean going into my bank overdraft) or ---- i could do the sensible thing and just wait until I get paid lol. (or even go 2nd hand on Ebay)?
> 
> A bit scared to put someone elses 2nd hand radiator into my brand new EK equipment though (none of it is more than 2 months old)
> 
> OMG when I think about it -- my EK equipment is actually almost as expensive as a 1080 (give or take 100 bux) and definitely more expensive than my mob+CPU combined lol


Water cooling is a crazy expensive hobby.....more of a habit really. Welcome to being broke for the rest of your life. Now that you've put your rig under water, you'll NEVER want to go back to air cooling.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Water cooling is a crazy expensive hobby.....more of a habit really. Welcome to being broke for the rest of your life. Now that you've put your rig under water, you'll NEVER want to go back to air cooling.


lol i've been sucked in haha.

*Final GamersNexus "take" on the EVGA thermal pad issue*

All EVGA owners, you have to read this \/
http://forums.evga.com/Gamers-Nexus-Report-m2587196.aspx

*GDDR5X 1080 Memory Temperatures*

If I probed the back of the card with the back plate off (so bare circuit board), how accurate a temp reading would I get?

For example:
I get a reading on the *back of a GDDR5X chip of 55C* -- what would that equate to as a *"real temperature"?* -- *using the method in my picture below \/*

*...notice the "dotted metal imprints"
of the GDDR5X chips like this:*
*<<temperature probe*

Reason I'd want to do this is to *test for adequate contact from:
GDDR5X --> Pad --> WaterBlock* after installing a 780ti Classy block on a 1080 Classy card 

Would a *55C* reading *taken this way* be a *good VRAM temperature for a water-cooled card? (35C water temp)*


----------



## SmackHisFace

Hey guys, just got my 1080 in the mail today. I have a Zotac AMP. I have a few overclocking questions. When I hit Ctrl+F in Afterburner it brings up the voltage curve going up to 1200mV, however in-game I can't ever get it to surpass 1090mV. I have the voltage slider at max, aka +100 in AB, and the power limit at 120%. The power limit is staying under 100% and temps are in the 60s, so why isn't the voltage going higher? Is this normal? Thanks in advance guys, this card is awesome.

EDIT: PerfCap reason is VRel. The voltage slider is working; it takes me from 1.04 to about 1.08, but I just feel like since the curve in AB shows up to 1.20 there should be more to play with. I'm at about 2050MHz give or take, which is nice, but I'd love to squeeze every last MHz I can out of it while staying with the stock BIOS on air.

EDIT2: Also getting a Pwr PerfCap reason in some games with the power not exceeding 100% -- is that normal?


----------



## nrpeyton

Quote:


> Originally Posted by *SmackHisFace*
> 
> Hey guys, just got my 1080 in the mail today. I have a Zotac AMP. I have a few overclocking questions. When I hit Ctrl+F in Afterburner it brings up the voltage curve going up to 1200mV, however in-game I can't ever get it to surpass 1090mV. I have the voltage slider at max, aka +100 in AB, and the power limit at 120%. The power limit is staying under 100% and temps are in the 60s, so why isn't the voltage going higher? Is this normal? Thanks in advance guys, this card is awesome.
> 
> EDIT: PerfCap reason is VRel. The voltage slider is working; it takes me from 1.04 to about 1.08, but I just feel like since the curve in AB shows up to 1.20 there should be more to play with. I'm at about 2050MHz give or take, which is nice, but I'd love to squeeze every last MHz I can out of it while staying with the stock BIOS on air.


*Everyone* is BIOS-limited to 1.093v on Pascal.

This includes *all manufacturers* and *all cards*, from 1050s all the way up to 1080s.

There is *no* BIOS editor out for Pascal yet.

There are only 2 known workarounds (options for voltage on Pascal). These are:

-EVGA 1080 Classified Voltage Tool (Classified 1080 cards only) -- this allows core, memory and PCI-E voltage adjustment

-Flash the "T4/STRIX" BIOS to your card (this BIOS allows up to 1.2v on the core) -- compatible with most cards but please DO your research first 

Also -- overclocking on Pascal is very limited above 40C (with a little bit more headroom under 25C).
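To picture why the curve editor shows dots all the way up to 1.20v but the card never runs them: whatever the curve requests, the BIOS cap wins. A toy sketch of that clamp (purely illustrative, not NVIDIA's actual driver logic; the numbers are just the limits mentioned above):

```python
# Toy illustration of why the AB curve shows points up to 1.20v but a stock
# Pascal card never exceeds its BIOS cap (~1.093v; the T4 bios raises it).
# NOT real driver code -- just a model of the clamp behaviour.

BIOS_CAP_MV = 1093  # stock Pascal limit in millivolts

def applied_voltage(requested_mv, cap_mv=BIOS_CAP_MV):
    """The card applies the lower of the requested curve point and the cap."""
    return min(requested_mv, cap_mv)

# Curve points you can drag in Afterburner vs. what the card actually runs:
for point in (1040, 1093, 1150, 1200):
    print(point, "->", applied_voltage(point))
```

With the stock cap the last two points both collapse to 1093; raising `cap_mv` to 1200 (what the T4 bios effectively does) lets the full curve apply.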


----------



## Fediuld

Quote:


> Originally Posted by *Vellinious*
> 
> Water cooling is a crazy expensive hobby.....more of a habit really. Welcome to being broke for the rest of your life. Now that you've put your rig under water, you'll NEVER want to go back to air cooling.


Water cooling IS NOT a crazy expensive hobby, if you want the benefits and now the showing off colour.
You can buy a Predator 360 with QDC, and when you replace graphic cards, just buy prefilled blocks from EK instead of empty ones.

And takes matter of seconds to replace a graphic card on the loop.


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> lol i've been sucked in haha.
> 
> *
> Final GamersNexus "take" on EVGA thermal pad issue*
> 
> All EVGA owners, you have to read this \/
> http://forums.evga.com/Gamers-Nexus-Report-m2587196.aspx
> *GDDR5X 1080 Memory Temperatures*
> If I probed the back of the card with the back plate off (so bare circuit board), how accurate a temp reading would I get?
> For example:
> I get a reading on the *back of a GDDR5X chip of 55C* -- what would that equate to as a *"real temperature"?* -- *using the method in my picture below \/*
> *...notice the "dotted metal imprints"
> of the GDDR5X chips like this:*
> *<<temperature probe*
> Reason I'd want to do this is to *test for adequate contact from:
> GDDR5X --> Pad --> WaterBlock* after installing a 780ti Classy block on a 1080 Classy card
> Would a *55C* reading *taken this way* be a *good VRAM temperature for a water-cooled card? (35C water temp)*


That is the same temperature I get on the same spots on my Seahawk; take a ~10C delta across the PCB and you get 60/65C on the other side!
Optimal temps are "as low as you can get", but the power phases stay within spec until reaching 70/80C*, which is when thermal derating occurs (roughly a 5A drop per 10C increase after hitting the threshold).
(This assumes cooling is adequate, at a minimum of 400LFM (2m/s) airflow over a heatsink; in your case, with a waterblock and watercooling, temps never get to that threshold.)

*Depending on the max amperage per phase -- still not sure what the case is with the Classy (50A or 60A per phase), will enquire with TIN and get back to you on that!
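That derating rule of thumb (full rated current up to the 70/80C threshold, then roughly 5A lost per 10C beyond it) is easy to sketch as a quick estimate. The 50A rating and 75C threshold below are assumed placeholder values, since the Classy's real per-phase spec is still unconfirmed:

```python
# Rough per-phase current estimate from the rule of thumb above:
# full rated current up to a threshold temp, then ~5A lost per 10C beyond it.
# The 50A rating and 75C threshold are ASSUMED values for illustration only.

def derated_amps(temp_c, rated_amps=50.0, threshold_c=75.0, drop_per_10c=5.0):
    """Estimated usable current per power phase at a given VRM temperature."""
    if temp_c <= threshold_c:
        return rated_amps
    return rated_amps - (temp_c - threshold_c) / 10.0 * drop_per_10c

# A watercooled card sitting at 55C keeps the full rating,
# while an air-cooled card hitting 95C would lose ~10A per phase:
print(derated_amps(55))   # -> 50.0
print(derated_amps(95))   # -> 40.0
```

The point of the sketch: under water the VRMs never reach the threshold, so derating simply never kicks in.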









Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *OccamRazor*
> 
> That is the same temperature I get on the same spots on my Seahawk; take a ~10C delta across the PCB and you get 60/65C on the other side!
> Optimal temps are "as low as you can get", but the power phases stay within spec until reaching 70/80C*, which is when thermal derating occurs (roughly a 5A drop per 10C increase after hitting the threshold).
> (This assumes cooling is adequate, at a minimum of 400LFM (2m/s) airflow over a heatsink; in your case, with a waterblock and watercooling, temps never get to that threshold.)
> 
> *Depending on the max amperage per phase -- still not sure what the case is with the Classy (50A or 60A per phase), will enquire with TIN and get back to you on that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


Okay, thanks.

Using the method above: 55C is what I am getting on the VRAM nearest the VRM side of the card, then about 48C on the "middle" VRAM chips, then only about 38C on the VRAM chips nearest the DisplayPort (left) side.

I am trying to work out if that is good for a water-cooled card, or if that is too high for water cooling (in which case I may need to take everything apart again and re-check contact) :-(

water temp was in the mid-30s, I think due to a lack of radiators (which I'll fix on pay day)

temp was also taken after prolonged gaming/stress testing


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay, thanks.
> 
> Using the method above: 55C is what I am getting on the VRAM nearest the VRM side of the card, then about 48C on the "middle" VRAM chips, then only about 38C on the VRAM chips nearest the DisplayPort (left) side.
> 
> I am trying to work out if that is good for a water-cooled card, or if that is too high for water cooling (in which case I may need to take everything apart again and re-check contact) :-(
> 
> water temp was in the mid-30s, I think due to a lack of radiators (which I'll fix on pay day)
> 
> temp was also taken after prolonged gaming/stress testing


That is about the same as I have with my card on load: 55C on the memory chips close to the VRMs and 38/39C on the opposite side!


----------



## SmackHisFace

Quote:


> Originally Posted by *nrpeyton*
> 
> *Everyone* is BIOS-limited to 1.093v on Pascal.
> 
> This includes *all manufacturers* and *all cards*, from 1050s all the way up to 1080s.
> 
> There is *no* BIOS editor out for Pascal yet.
> 
> There are only 2 known workarounds (options for voltage on Pascal). These are:
> 
> -EVGA 1080 Classified Voltage Tool (Classified 1080 cards only) -- this allows core, memory and PCI-E voltage adjustment
> 
> -Flash the "T4/STRIX" BIOS to your card (this BIOS allows up to 1.2v on the core) -- compatible with most cards but please DO your research first
> 
> Also -- overclocking on Pascal is very limited above 40C (with a little bit more headroom under 25C).


Wow, thanks for the info. What do you mean overclocking is limited past 40C? Do these cards auto-lower voltage starting that soon? So it looks like my card OCs from 2025-2060 depending on the game. Pretty average result? Thanks again for the info.


----------



## ROKUGAN

Quote:


> Originally Posted by *Fediuld*
> 
> Water cooling IS NOT a crazy expensive hobby if you want the benefits and not the showing-off colour.
> You can buy a Predator 360 with QDC, and when you replace graphics cards, just buy prefilled blocks from EK instead of empty ones.
> 
> And it takes a matter of seconds to swap a graphics card on the loop.


You can, but I wouldn't. See below. I was willing to swap my Corsair H110i for a Predator 360, but after several revisions they're still leaking; I've lost my confidence in EKWB:

http://www.overclock.net/t/1566468/ek-predator-club-discussion-thread/6040#post_25677310

http://www.overclock.net/t/1566468/ek-predator-club-discussion-thread/6030#post_25670678

Also, these results putting a MSI GTX 1080 Z on a Predator 280 loop were not really impressive:


----------



## steeludder

*5960x + GTX 1080 record SMASHED!*

*8713 3dMarks*, fellas! Get in there! And didn't even need to flash the vbios!

http://www.3dmark.com/spy/794372

Waterchiller + cold weather = 4C water temp.

And yes. On a bloody reference, founders edition card.

PS: Graphics score was 8444. That's with the curve's 1.093 dot on 2240MHz. Forgot to look at Afterburner's chart but I suspect I voltage throttled at some point since 3dMark is showing Core clock at 2215 (Card is power modded).


----------



## Dragonsyph

Quote:


> Originally Posted by *steeludder*
> 
> *5960x + GTX 1080 record SMASHED!*
> 
> *8713 3dMarks*, fellas! Get in there! And didn't even need to flash the vbios!
> 
> http://www.3dmark.com/spy/794372
> 
> Waterchiller + cold weather = 4C water temp.
> 
> And yes. On a bloody reference, founders edition card.
> 
> PS: Graphics score was 8444. That's with the curve's 1.093 dot on 2240MHz. Forgot to look at Afterburner's chart but I suspect I voltage throttled at some point since 3dMark is showing Core clock at 2215 (Card is power modded).


Not bad on a graphics score of 8444; I think my highest has been 8591. Think the last one I did was 8509 or something.


----------



## diablodbl

I bought a GTX 1080 Amp Extreme, out of the box it runs at 2040/10800.

I overclocked the RAM to 11K just fine, but I'm having problems overclocking the GPU. I used AB, Zotac and EVGA Precision, and I can't get a stable +25MHz even adjusting the voltage to +100% and the power limit to 120% with 1.09v (game freezes).

Sometimes AB shows a power limit "1".

Do you guys have any idea?

Temps around 55-58C.


----------



## OccamRazor

Quote:


> Originally Posted by *diablodbl*
> 
> I bought a GTX 1080 Amp Extreme, out of the box it runs at 2040/10800.
> 
> I overclocked the RAM to 11K just fine, but I'm having problems overclocking the GPU. I used AB, Zotac and EVGA Precision, and I can't get a stable +25MHz even adjusting the voltage to +100% and the power limit to 120% with 1.09v.
> 
> Sometimes AB shows a power limit "1".
> 
> Do you guys have any idea?
> 
> Temps around 55-58C.


Power limited for sure!
Try the T4 bios; it works on your card, as some users report:

NvflashplusT4.zip 1304k .zip file

This bios has the power limit and temp limit removed, and voltage can be adjusted via the curve to 1.2V! Beware that the fan is disabled until the card's temp reaches 40C!

Cheers

Occamrazor


----------



## Napoleon85

Quote:


> Originally Posted by *OccamRazor*
> 
> Power limited for sure!
> Try the T4 bios; it works on your card, as some users report:
> 
> NvflashplusT4.zip 1304k .zip file
> 
> This bios has the power limit and temp limit removed, and voltage can be adjusted via the curve to 1.2V! Beware that the fan is disabled until the card's temp reaches 40C!
> 
> Cheers
> 
> Occamrazor


Where can I find more info on custom GTX 1080 BIOSes, specifically for the EVGA FTW model? I've been checking for an update to the OP here, but it hasn't been updated in 5 months. Is there another thread being kept up to date? I feel like I can't do anything with the stock BIOS and the EVGA Precision XOC software, since it locks me in at 50MHz increments and automatically down-clocks as soon as the card starts to warm up (throttling at 60-65C lol).


----------



## emsj86

So I just got my GTX 1080 in. Kind of disappointed. From benchmarks and forums I thought I would see a decent performance gain over my 780s in SLI. In the Valley benchmark I have 2 fps less, and in Fire Strike my score is the same even with the 1080 OC'd (150 on the core, 500 on memory). Games even have lower fps. I thought the 1080 was close to 2 x 980s, yet my 780s outperformed it. (Valley avg fps 104.5 (1080 max), and in Fire Strike I got 17000.)


----------



## Napoleon85

Quote:


> Originally Posted by *emsj86*
> 
> So I just got my GTX 1080 in. Kind of disappointed. From benchmarks and forums I thought I would see a decent performance gain over my 780s in SLI. In the Valley benchmark I have 2 fps less, and in Fire Strike my score is the same even with the 1080 OC'd (150 on the core, 500 on memory). Games even have lower fps. I thought the 1080 was close to 2 x 980s, yet my 780s outperformed it. (Valley avg fps 104.5 (1080 max), and in Fire Strike I got 17000.)


I found the same: performance was nearly identical from my heavily OC'd 780s to my "stock" EVGA GTX 1080 FTW. The way I look at it is that I lost the headache of SLI (specifically the fact that Nvidia seems to never have their **** together on launch day, and I was running new games on one GPU for a week or two), and gained a lot of headroom for when I decide to go back down that route. I also found that a few games performed better due to the extra VRAM, since 3GB was choking on a lot of titles at 1440P with any kind of AA.


----------



## Dragonsyph

Quote:


> Originally Posted by *emsj86*
> 
> So I just got my GTX 1080 in. Kind of disappointed. From benchmarks and forums I thought I would see a decent performance gain over my 780s in SLI. In the Valley benchmark I have 2 fps less, and in Fire Strike my score is the same even with the 1080 OC'd (150 on the core, 500 on memory). Games even have lower fps. I thought the 1080 was close to 2 x 980s, yet my 780s outperformed it. (Valley avg fps 104.5 (1080 max), and in Fire Strike I got 17000.)


17,000 in Fire Strike? I'm getting 26,408 graphics. And around 125-135 avg fps in Valley. Seems something's wrong.


----------



## Koniakki

Quote:


> Originally Posted by *emsj86*
> 
> So I just got my GTX 1080 in. Kind of disappointed. From benchmarks and forums I thought I would see a decent performance gain over my 780s in SLI. In the Valley benchmark I have 2 fps less, and in Fire Strike my score is the same even with the 1080 OC'd (150 on the core, 500 on memory). Games even have lower fps. I thought the 1080 was close to 2 x 980s, yet my 780s outperformed it. (Valley avg fps 104.5 (1080 max), and in Fire Strike I got 17000.)


As Dragonsyph mentioned below, can you tell us the *gpu score* of that Firestrike score?

Also, if that 104.5 FPS score in Valley is with the Extreme HD preset then something is really going wrong.

You should get 120-125 FPS easily with a GTX 1080 using the Extreme HD Valley preset.

Beautiful rig btw! Looks great!









Quote:


> Originally Posted by *Dragonsyph*
> 
> 17,000 in Fire Strike? I'm getting over 26,000. And around 125-135 avg fps in Valley. Seems something's wrong.


Probably 17k for total score.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> 17,000 in Fire Strike? I'm getting over 26,000. And around 125-135 avg fps in Valley. Seems something's wrong.


Ya, but you are OC'd to 2200+ and +1000 on the mem.
Offsets do not reflect what the clocks are actually running at.
For me, core is at 2152 and memory is at 11700, and I am barely breaking 25K on FS graphics score, 17900 overall.
FPS in Valley is around 127 at 1080p, 79 at 1440p, 50 at 2160p.

Don't think he is under water either like you and I are. So he's probably throttling down as opposed to staying at a constant OC.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Ya, but you are OC'd to 2200+ and +1000 on the mem.
> Offsets do not reflect what the clocks are actually running at.
> For me, core is at 2152 and memory is at 11700, and I am barely breaking 25K on FS graphics score, 17900 overall.
> FPS in Valley is around 127 at 1080p, 79 at 1440p, 50 at 2160p.
> 
> Don't think he is under water either like you and I are. So he's probably throttling down as opposed to staying at a constant OC.


Ya, that makes sense haha, guess I jumped the gun. But 102 Valley fps does seem kinda low for a 4.8GHz 4790K with even a stock 1080.


----------



## Dragonsyph

Quote:


> Originally Posted by *Koniakki*
> 
> As Dragonsyph mentioned below, can you tell us the *gpu score* of that Firestrike score?
> 
> Also, if that 104.5 FPS score in Valley is with the Extreme HD preset then something is really going wrong.
> 
> You should get 120-125 FPS easily with a GTX 1080 using the Extreme HD Valley preset.
> 
> Beautiful rig btw! Looks great!
> 
> 
> 
> 
> 
> 
> 
> 
> Probably 17k for total score.


Ya, I had to edit my post and put in the graphics score haha. Him letting us know the graphics score and actual clock speeds will help us see if there's something wrong.


----------



## Napoleon85

Quote:


> Originally Posted by *Derek1*
> 
> Ya but you are OCed to 2200+ and +1000 on the Mem.
> Offsets do not reflect what the clocks are running at.
> For me Core is at 2152 and Memory is at 11700 and I am barely breaking 25K on FS Graphics score, 17900 overall.
> FPS is at 1080p in Valley is around 127, 1440p 79, 2160p 50.
> 
> Don't think he is underwater either like you and I are. So he probably throttling down as opposed to staying at a constant OC.


Seems right to me; I got ~17K in Firestrike with ~23.5K graphics when my card ran at 2050. I've gone back to stock since finding out that it's unstable at any OC, even +50.


----------



## OccamRazor

Quote:


> Originally Posted by *Napoleon85*
> 
> Where can I find more info on custom GTX 1080 BIOSes, specifically for the EVGA FTW model? I've been checking for an update to the OP here, but it hasn't been updated in 5 months. Is there another thread being kept up to date? I feel like I can't do anything with the stock BIOS and the EVGA Precision XOC software, since it locks me in at 50MHz increments and automatically down-clocks as soon as the card starts to warm up (throttling at 60-65C lol).


There are no custom BIOSes floating around except T4 and an XOC bios, both ASUS. There will also be one pretty soon from K|NGP|N, but for the EVGA Classy. So, until a bios editor comes up or someone hex-edits the bios properly, we are stuck with T4! It's doing fine on my end!









Cheers

Occamrazor


----------



## Derek1

Quote:


> Originally Posted by *Napoleon85*
> 
> Seems right to me; I got ~17K in Firestrike with ~23.5K graphics when my card ran at 2050. I've gone back to stock since finding out that it's unstable at any OC, even +50.


Did you run the FS Stress tests?


----------



## Napoleon85

Quote:


> Originally Posted by *OccamRazor*
> 
> There are no custom BIOSes floating around except T4 and an XOC bios, both ASUS. There will also be one pretty soon from K|NGP|N, but for the EVGA Classy. So, until a bios editor comes up or someone hex-edits the bios properly, we are stuck with T4! It's doing fine on my end!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


T4 will work on the EVGA FTW card? If so I'll give it a flash later and test. Thanks!
Quote:


> Originally Posted by *Derek1*
> 
> Did you run the FS Stress tests?


Negative; I tested in real-world gaming after an overnight Valley loop and had a crash while playing The Division, which seems to be excellent at exposing any instability in an OC since it hammers GPU, CPU and RAM pretty hard. Playing at 1440P with settings maxed I see 100% util on all 8 CPU cores, 100% GPU usage, 4-7GB VRAM usage and 20+ GB system RAM usage.


----------



## Vellinious

Quote:


> Originally Posted by *Napoleon85*
> 
> T4 will work on the EVGA FTW card? If so I'll give it a flash later and test. Thanks!
> Negative; I tested in real-world gaming after an overnight Valley loop and had a crash while playing The Division, which seems to be excellent at exposing any instability in an OC since it hammers GPU, CPU and RAM pretty hard. Playing at 1440P with settings maxed I see 100% util on all 8 CPU cores, 100% GPU usage, 4-7GB VRAM usage and 20+ GB system RAM usage.


Valley is a horrible GPU test, especially with higher end hardware. If you're looking for a stability test, loop FS Ultra graphics test 1 and 2 for a few hours. If there's any instability in your overclock, that's likely to find it.


----------



## Napoleon85

Good to know - I was using Valley since it seemed to generate more heat and found instability quicker on my 780s than looping the FS Ultra tests. At the end of the day, I consider it stable when it doesn't crash under real-world workloads, but the synthetics are a nice quick check for an obvious problem.


----------



## emsj86

I'll go and run the benchmarks again tonight. But Fire Strike was 17k total score. And I am under water, so no throttling; the core stays at 2177 the entire time. I know the CPU plays a part, but a 4790K at 4.7 should be close. The part that bums me out the most is that Battlefield 1 wouldn't work on my 780s due to a DirectX error saying not enough virtual memory, make sure the card has 2GB (widespread among 780/770 owners). Thought the 1080 would fix it, but that's a whole other story.


----------



## OccamRazor

Quote:


> Originally Posted by *emsj86*
> 
> I'll go and run the benchmarks again tonight. But Fire Strike was 17k total score. And I am under water, so no throttling; the core stays at 2177 the entire time. I know the CPU plays a part, but a 4790K at 4.7 should be close


Just did a quick run on FS, [email protected] and 1080 on water also (Seahawk X) @ 2126 core and +565 mem, no added voltage on the T4 bios; got 17002 and a 25238 graphics score.



Older CPUs have really low scores on FS!

Cheers

Occamrazor


----------



## Dragonsyph

Here's my epic computer, ROFL >>>>


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> Here's my epic computer, ROFL >>>>


Are you using a coffee pot on that thing?
Talk about ghetto.


----------



## diablodbl

Quote:


> Originally Posted by *OccamRazor*
> 
> Power limited for sure!
> Try the T4 bios; it works on your card, as some users report:
> 
> NvflashplusT4.zip 1304k .zip file
> 
> This bios has the power limit and temp limit removed, and voltage can be adjusted via the curve to 1.2V! Beware that the fan is disabled until the card's temp reaches 40C!
> 
> Cheers
> 
> Occamrazor


This "power limit for sure" is 100%? So the chance to reach a higher clock is great...

The ideia is to flash to T4 bios and start de overclock test again? This is the most agressive overclock bios for 1080, right?

This bios flash process is safe? Never done it before.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> Are you using a coffee pot on that thing?
> Talk about ghetto.


Lol, what do you mean coffee pot?

Ya, it's pretty ghetto hahaha; when I tore apart the water-cooled 290 CF rig I just tossed it all back together on a table and left it.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> Lol, what do you mean coffee pot?
> 
> Ya, it's pretty ghetto hahaha; when I tore apart the water-cooled 290 CF rig I just tossed it all back together on a table and left it.


LOL

When I first saw the reservoir it looked like a coffee pot sitting on a hotplate.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> LOL
> 
> When I first saw the reservoir it looked like a coffee pot sitting on a hotplate.


HAHAHAH


----------



## OccamRazor

Quote:


> Originally Posted by *diablodbl*
> 
> This "power limit for sure" is 100%? So the chance to reach a higher clock is great...
> 
> The ideia is to flash to T4 bios and start de overclock test again? This is the most agressive overclock bios for 1080, right?
> 
> This bios flash process is safe? Never done it before.


Yap! This time on FS I got 17114 and a 25547 graphics score, but look at the power draw in HWiNFO











Flashing is easy:

1. Extract the nvflash files and the T4 bios to C:\
2. Open an elevated cmd window (hit the Start button, type "cmd", right-click the cmd icon, Run as administrator)
3. Change directory to C:\ (type " cd C:\ ")
4. OPTIONAL: type " nvflash --protectoff " and hit Enter (turns off any error msg that may result from software write protection)
5. BACK UP THE ORIGINAL BIOS: type " nvflash -b backup.rom " and a copy of your bios will be saved to the nvflash directory
6. Type " nvflash -4 -5 -6 (bios name).rom "
7. Wait for it to finish
8. Reboot

Done!









Cheers

Occamrazor


----------



## emsj86

Quote:


> Originally Posted by *OccamRazor*
> 
> Just did a quick run on FS, [email protected] and 1080 on water also (Seahawk X) @ 2126 core and +565 mem, no added voltage on the T4 bios; got 17002 and a 25238 graphics score.
> 
> 
> 
> Older CPUs have really low scores on FS!
> 
> Cheers
> 
> Occamrazor


You basically got what I scored. Hmm, I guess I thought I would see at least 10-20 fps more on a 1080 over 2 x 780s. Almost makes me want to go back to my two 780s, as those cards were beasts for me.


----------



## emsj86

What exactly is the T4 bios, and can I use it on my 1080 FTW? I used nvflash and the MSI Afterburner 1.3-volt mod for my 780s; is it similar, where you can adjust to a higher power draw and voltage, or is it more like flashing a stock 780 with a Galaxy 780 bios?


----------



## Vellinious

There is no bios editor for Pascal. The T4 bios was a bios created for the LN2 guys running the ASUS STRIX. It does work on the FTW, though...I haven't tried it myself.


----------



## Fediuld

Quote:


> Originally Posted by *ROKUGAN*
> 
> You can, but I wouldn't. See below. I was willing to swap my Corsair H110i for a Predator 360, but after several revisions they're still leaking; I've lost my confidence in EKWB:
> 
> http://www.overclock.net/t/1566468/ek-predator-club-discussion-thread/6040#post_25677310
> 
> http://www.overclock.net/t/1566468/ek-predator-club-discussion-thread/6030#post_25670678
> 
> Also, these results putting a MSI GTX 1080 Z on a Predator 280 loop were not really impressive:


I'm using a Predator 360 1.1 and prefilled blocks for the Nano, Fury X and 1080. I don't have any leaks, nor does anyone else who hasn't tampered with the kit.


----------



## LOLcake22

What would you recommend: the A8G variant of the Strix (not the O8G), or the MSI Gaming X? The Gaming X has slightly higher clocks.


----------



## nrpeyton

Quote:


> Originally Posted by *OccamRazor*
> 
> That is about the same as I have with my card on load: 55C on the memory chips close to the VRMs and 38/39C on the opposite side!


Do you have temp probes attached to your fan controller (as I do), then?

Did you take the back plate off your MSI Seahawk especially to probe the temps on it (VRM & GDDR5X)? Just out of curiosity? Or did you have an issue at some point?

I decided I'd better check, since I custom-fitted a 780 Ti block to my 1080 Classy lol.

In any case I was pleased to see someone else has also checked this (I had *no* way to compare otherwise) 

Sure, I gave you rep for that post lol 

Quote:


> Originally Posted by *ROKUGAN*
> 
> You can, but I wouldn't. See below. I was willing to swap my Corsair H110i for a Predator 360, but after several revisions they're still leaking; I've lost my confidence in EKWB:
> 
> http://www.overclock.net/t/1566468/ek-predator-club-discussion-thread/6040#post_25677310
> 
> http://www.overclock.net/t/1566468/ek-predator-club-discussion-thread/6030#post_25670678


That's serious, man; EK cover you for 2 years on "manufacturing flaws".

I watched your video, and if you haven't pushed the tube far enough up onto the fitting before screwing it down you *will* get leaks (drain and check this first).

If that wasn't the problem, I'd definitely be getting in contact with them.

Cooling is an expensive business, and a company like EK needs people who will come back.

It's also quite unforgiving (water-cooling) if you think about it... the slightest leak and you're ****ed.

I'm sure they'd want to avoid the embarrassment.

Contact them and I'm sure they'll help 

They've always been very competent and extremely fast whenever I've emailed them 

P.S. their "terms and conditions" page is a "harsh" read (rather disheartening), but in the business they are in, they can't have newbies who don't know what they are doing claiming for everything left, right and centre.

You know what it's like these days, all these "no win no fee" companies (everyone trying to sue everyone); it's ridiculous.

Email EK *privately* and see what they actually say in private 

Quote:


> Originally Posted by *ROKUGAN*
> 
> Also, these results putting a MSI GTX 1080 Z on a Predator 280 loop were not really impressive:


Air cooling designs are getting bigger and TDPs are getting lower. GPUs also don't even have "lids" (IHSes) on them anymore like CPUs do.

So you really need to be pushing at least 480mm of radiator to see big temp drops on the GPU when you've also got an overclocked CPU in the same loop.


----------



## diablodbl

Quote:


> Originally Posted by *Vellinious*
> 
> There is no bios editor for Pascal. The T4 bios was a bios created for the LN2 guys running the ASUS STRIX. It does work on the FTW, though...I haven't tried it myself.


Do you recommend applying T4 on an AMP Extreme?

Mine is running 2050/11200 at 1.05v and 56C; can push only 1080MHz with 1.095v. Nothing more than that with the stock bios.


----------



## nrpeyton

Quote:


> Originally Posted by *steeludder*
> 
> *5960x + GTX 1080 record SMASHED!*
> 
> *8713 3dMarks*, fellas! Get in there! And didn't even need to flash the vbios!
> 
> http://www.3dmark.com/spy/794372
> 
> Waterchiller + cold weather = 4C water temp.
> 
> And yes. On a bloody reference, founders edition card.
> 
> PS: Graphics score was 8444. That's with the curve's 1.093 dot on 2240MHz. Forgot to look at Afterburner's chart but I suspect I voltage throttled at some point since 3dMark is showing Core clock at 2215 (Card is power modded).


*wow* nice overclock, mate (especially on an FE lol)  ... what were your max OCs before? On air? And on normal water?

I'd also love a link to the shop for your water chiller, as I'm currently considering one myself 

Quote:


> Originally Posted by *diablodbl*
> 
> Do you recommend applying T4 on an AMP Extreme?
> 
> Mine is running 2050/11200 at 1.05v and 56C; can push only 1080MHz with 1.095v. Nothing more than that with the stock bios.


*Have a look at these posts (they all talk about the T4 being flashed to the Amp Extreme and some of the results people got):

Link:
http://www.overclock.net/newsearch?advanced=1&byuser=&containingthread%5B0%5D=1601288&newer=1&output=posts&resultSortingPreference=relevance&sdate=0&search=amp+extreme+t4+bios&type=all
*

Quote:


> Originally Posted by *emsj86*
> 
> So I just got my GTX 1080 in. Kind of disappointed. From benchmarks and forums I thought I would see a decent performance gain over my 780s in SLI. In the Valley benchmark I have 2 fps less, and in Fire Strike my score is the same even with the 1080 OC'd (150 on the core, 500 on memory). Games even have lower fps. I thought the 1080 was close to 2 x 980s, yet my 780s outperformed it. (Valley avg fps 104.5 (1080 max), and in Fire Strike I got 17000.)


If you're still only gaming at 1080p resolution (and running 1080p benchmarks), you're not getting the benefit of the 8GB of GDDR5X on your new GTX 1080.

Game at 4k and you'll definitely see the difference.

When I'm gaming at 1440p my GTX 1080 gives me roughly the same performance as my 980 SLI (and in non-SLI supported games my framerate is *doubled*).

Gameplay at 4K (Ultra) is also much smoother, and a bit faster, on my GTX 1080 compared to my two GTX 980s in SLI.

Also, I noticed in your system specs that your RAM is only clocked at 1600MHz (you definitely want to bump that up a bit).

*Maybe* try O/C'ing your GPU less (overclocking past stability forces the card to error-correct by re-sending frames, and you lose points).

P.S. beautiful build by the way 

Quote:


> Originally Posted by *Dragonsyph*
> 
> Heres my epic computer ROFL>>>>


lol are you planning on leaving it like that? haha 
that post made my day lol 

*GPU Error Stability Testing*

*I've just found a program that lets you "error check" a GPU overclock (much the same way prime95 error checks a CPU overclock). Software is called 'OCCT'. (I got up to +675 mem before errors)*

*Full Size - right click picture & 'open new tab'*


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> Everyone is BIOS limited to 1.093v on pascal.
> 
> This includes all manufacturers and all cards from 1050's all the way up to 1080's.


Not true.


----------



## Vellinious

Quote:


> Originally Posted by *ucode*
> 
> Not true.


The Classys, with the voltage tool are the only ones I'm aware of that aren't.....without running the T4 bios, which doesn't come stock on anything.


----------



## ROKUGAN

That's serious man; EK cover you for 2 years on "manufacturing flaws".

I watched your video and if you haven't pushed the tube far enough up onto the fitting before screwing you *will* get leaks (drain and check this first).

If that wasn't the problem, I'd definitely be getting in contact with them.

Cooling is an expensive business. And a company like EK needs people who will come back.

It's also quite a risky business (water-cooling) if you think about it... the slightest leak and you're ****ed.

I'm sure they'd want to avoid the embarrassment.

Contact them and I'm sure they'll help 

They've always been very competent and extremely fast whenever I've emailed them 

P.S. their "terms and conditions" page is a harsh read (rather disheartening), but in the business they're in, they can't have newbies who don't know what they're doing claiming for everything left, right and centre.

You know what it's like these days, all these "no win no fee" companies (everyone trying to sue everyone); it's ridiculous.

Email EK *privately* and see what they actually say in private 

Ok, I think you didn't read my post properly:

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/8290#post_25678295

1) I have a Corsair AIO, not an EKWB, but I've been following the EKWB Predator thread because I was interested in buying the Predator 360 and was worried about the leaks users were reporting.
Actually, the Predator 1.0 was officially recalled by EKWB for having flaws:

http://www.eteknix.com/ekwb-issues-full-recall-predator-240-360-aio-coolers/

Now, the problem is that some users have reported leaks in the "fixed" version (rev 1.1c) as well. I don't think it's about noobs not following mounting instructions. This is about a product that was flawed from the beginning, which is worrisome. Of course it's not happening with every single unit, but it happens, so the risk is there.

2) The video I posted was about the performance of a 1080 in a Predator 280 loop (not very exciting, actually), not about leaking.


----------



## steeludder

Quote:


> Originally Posted by *nrpeyton*
> 
> *wow* nice overclock mate (especially on a FE lol)  ... what was your max O/C's before? On air? And on normal water?
> 
> I'd also love a link to shop for your water chiller as I'm currently considering one myself


Hey
Thanks, FE's are no worse overclockers than any of the 3rd party cards. People found that out quite early in the game when FE's consistently seemed to clock higher than most custom cards (I personally still think nVidia binned them, even though I'll never be able to prove it).

Never tried it on air. Slapped an EK waterblock on it from day one. On water, power modded, I was able to eke out a Time Spy score at 2164MHz (and 4.5GHz on the 5960x). Here it is, fyi: http://www.3dmark.com/spy/312907

So the chiller basically added 50-75MHz worth of overclocking to the 1080 compared to a strong WC setup... as well as 100MHz worth on the CPU. Note that my CPU is a rather poor clocker. I needed to feed it 1.4V to hold 4.4GHz under water. Now I run it at 4.5GHz at the same voltage (it's gaming stable, but tends to fail stress testing like RealBench). It'll hold [email protected] for short benching sessions now.

The chiller is a Hailea HC-500A. It'll dissipate the heat of 1080, 5960x and VRM and cool water all the way down to 3C... if ambient temp is low. I reckon it'll struggle in the summer though. I would make an educated guesstimate that it can chill ~12-15C below ambient with the heat I'm throwing at it.

PS: you've got to watch the dew point though. Get a hygrometer/thermometer with a dew point indicator and put it inside your case. You do NOT want condensation to build up. I've actually insulated my case with Armaflex (well, to some extent) and put a small desiccant unit inside to keep it dry and lower the dew point.
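If anyone wants to sanity-check their own margin before chilling, the dew-point maths is easy to script. Here's a rough sketch using the Magnus approximation (the 22°C / 50% RH inputs are just example room conditions, not readings from my rig):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in degrees C via the Magnus formula.

    The a/b coefficients are the commonly used Sonntag values; good to
    roughly a third of a degree over normal room temperatures.
    """
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# Example: 22 degC room air at 50% relative humidity
print(f"Dew point: {dew_point_c(22.0, 50.0):.1f} degC")  # ~11.1 degC
```

In other words, with room air like that, water chilled to 4°C is well below the dew point, so the insulation and desiccant aren't optional.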


----------



## Casper123123123

Hello Guys,

I decided to go with the Alphacool GPX Pro solution for our card. I'll provide an update with temps in 2-3 weeks. Meanwhile, if there are other such owners, please share your results here, guys...

Best Regards,
DJ_Cas


----------



## kasper96

Hi Casper123....

I installed an Alphacool GPX Pro on my EVGA 1080 FTW just recently. (In fact, I sent my card in to Alphacool in July for the pre-production scanning process and got my cooler for free.) Installation was no problem, and I never got a GPU temperature above 36°C while testing with Furmark and Prime95 ([email protected]@1.31V) in parallel.
Custom water cooling loop with an external MoRa 360.
However I cannot compare to other cards/setups since this is my first water cooling build.
Benchmarking at 2179MHz and +510MHz on VRAM (Time Spy, Fire Strike Ultra/Extreme, Valley) works fine. Witcher 3 crashes with this setup; I have to use 2151MHz and +400MHz VRAM. I did not fiddle with the curves in AB so far, just moved the sliders to the right ;-)

Best wishes,
kasper96


----------



## Casper123123123

Quote:


> Originally Posted by *kasper96*
> 
> Hi Casper123....
> 
> I installed an Alphacool GPX Pro on my EVGA 1080 FTW just recently. (In fact I sent my card in to Alphacool in July for the pre-production scanning process and got my cooler for free
> 
> 
> 
> 
> 
> 
> 
> ). Installation was no problem, never got a temperature above 36°C of the GPU. Testing with Furmark and Prime95 ([email protected]@1.31V) in parallel.
> Custom water cooling loop with an external MoRa 360.
> However I cannot compare to other cards/setups since this is my first water cooling build.
> Benchmarking with 2179MHz and +510 MHz on VRAM works (Time Spy, Firemark Ultra/Extreme, Valley) is fine. Witcher 3 crashes with this setup, have to use 2151Mhz and +400MHz VRAM. Did not fiddle with the curves in AB so far , just moving sliders to the right ;-)
> 
> Best wishes,
> kasper96


Nice. Have you used any radiators in the loop besides the card's, and which one do you use? 240mm or 360mm?
http://www.alphacool.com/shop/radiatoren/radiatoren-aktiv/21361/alphacool-nexxxos-eiswolf-/-eisbaer-ready-st30-full-copper-240mm-radiator?c=20543
http://www.alphacool.com/shop/radiatoren/radiatoren-aktiv/21363/alphacool-nexxxos-eiswolf-/-eisbaer-ready-st30-full-copper-360mm-radiator?c=20543


----------



## nrpeyton

Quote:


> Originally Posted by *ucode*
> 
> Not true.


*You never read my full post. (you must have stopped after the first sentence)? lol

The rest of it said this:*
_"However there are only 2 known workarounds (options for voltage on Pascal). These are:

-EVGA 1080 Classified Voltage Tool (Classified 1080 cards only); this allows core, memory & PCI-E voltage adjustment

-Flash the "T4/STRIX" BIOS to your card (this BIOS allows up to 1.2v on the core) -- compatible with most cards but please DO your research first 

Also -- overclocking on Pascal is very limited above 40c. (With a little bit further headroom under 25c)"_

And moreover, it was actually Vellinious who started the thread re: EVGA voltage control at KingpinCooling.com, and then it was me who asked Kingpin (or kindly persuaded him) to release an updated 1080 voltage tool  *which we got 2 days later *
http://forum.kingpincooling.com/showthread.php?t=3908 <-- see link

Anything else requires physical hardware mods (cutting off resistors and soldering new ones onto your board -- completely annihilating any warranty lol).

Quote:


> Originally Posted by *ROKUGAN*
> 
> That's serious man; EK cover you for 2 years on "manufacturing flaws".
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I watched your video and if you haven't pushed the tube far enough up onto the fitting before screwing you *will* get leaks (drain and check this first).
> 
> If wasn't the problem; I'd definitely be getting in contact with them.
> 
> Cooling is an expensive business. And a company like EK needs people who will come back.
> 
> Its also quite controversial (water-cooling) if you think about it... the slightest leak and your ****ed.
> 
> I'm sure they'd want to avoid the embarrassment.
> 
> Contact them and I'm sure they'll help
> 
> They've always been very competent and extremely fast whenever I've emailed them
> 
> P.S. their "terms and conditions" page is a "harsh" read (rather dis-heartening) but in the business that they are in; they can't have newbies who don't know what they are doing claiming for everything left, right and centre.
> 
> You know what its like these days, all these "no win no fees" companies (everyone trying to sue everyone) its ridiculous.
> 
> Email EK *privately* and see what they actually say in private
> 
> 
> 
> Ok, I think you didn´t read my post properly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/8290#post_25678295
> 
> 1) I have a Corsair AIO not an EKWB, but I´ve been following the EKWB Predator thread because I was interested on buying the Predator 360 and was worried about users reported leaking.
> Actually the Predator 1.0 was officially recalled by EKWB for having flaws:
> 
> http://www.eteknix.com/ekwb-issues-full-recall-predator-240-360-aio-coolers/
> 
> Now, the problem is that some users have reported leaks also in the ¨fixed¨ version (rev 1.1c). I don´t think it´s about noobs not following mounting instructions. This is about a flawed product from the beginning, which is worriesome. Of course it´s not happening with every single unit, but it happens, so the risk is there.
> 
> 2) The video I posted was about performance of a 1080 in a Predator 280 loop (not very exciting, acutally), not about leaking


Okay sorry, my apologies -- it was very late when I wrote it lol, and I was trying to catch up on 2 days' worth as I was offline installing/testing my new waterblock


----------



## nrpeyton

deleted

Edit:
guys; how do I delete my own post???? can't find the button? am i being completely stupid lol


----------



## kasper96

Prior to my current setup I used a Corsair H110 AIO with 2x 140mm Noiseblocker A14 PWM fans, but for the CPU only. Temperatures were slightly higher than now.
Other than that I have not used any other radiator. Jumped on the external radiator bandwagon right from the start.
I am using this one: http://www.aquatuning.de/wasserkuehlung/radiatoren/radiatoren-aktiv/15381/watercool-mo-ra3-360-lt-black
Best wishes,
kasper96


----------



## nrpeyton

Quote:


> Originally Posted by *steeludder*
> 
> Hey
> Thanks, FE's are no worse overclockers than any of the 3rd party cards. People found that out quite early in the game when FE's consistently seemed to clock higher than most custom cards (I personally still think nVidia binned them - even though I'll never be able to prove it
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> Never tried it on air. Slapped an EK waterblock on it from day one. On water, power modded, I was able to eek out a Time Spy score at 2164MHz (and 4.5GHz on the 5960x). Here it is, fyi: http://www.3dmark.com/spy/312907
> 
> So the chiller basically added 50-75MHz worth of overclocking to the 1080 compared to a strong WC setup... as well as 100MHz worth on the CPU. Note that my CPU is a rather poor clocker. I needed to feed it 1.4V to hold 4.4GHz under water. Now I run it at 4.5GHz at the same voltage (it's gaming stable, but tends to fail stress testing like RealBench). It'll hold [email protected] for short benching sessions now.
> 
> The chiller is a Hailea HC-500A. It'll dissipate the heat of 1080, 5960x and VRM and cool water all the way down to 3C... if ambient temp is low. I reckon it'll struggle in the summer though. I would make an educated guesstimate that it can chill ~12-15C below ambient with the heat I'm throwing at it.
> 
> PS: you've got to watch dew point though. Get a hygro/thermometer with dew point indicator and put it inside your case. You do NOT want condensation to build up. I've actually insulated my case with armaflex (well, to some extent) and put a small dessicant unit inside to keep it dry and lower dew point.


Rep+1

Loved reading this post; that's exactly the information I was looking for. It was also that exact chiller I was looking at too 

Also, that's a screaming overclock for a 3.5GHz CPU (a 1.0-1.1GHz overclock)  what motherboard do you have on that?

Also do you think it was all worth it then? Or did you do it for the same reason I am thinking of: _because I'm addicted to tinkering around and seeing how far I can push everything lol _

Whats the noise like from it (would it bother someone in another room trying to get to sleep before work in the morning)?

And lastly, if you had the option of buying a chiller again, would you have spent a little extra and got a more powerful model, or are you completely happy you got the correct power/requirement ratio on the HC-500A?

I also see a few similar models with same names (vastly differently priced on different sites):

*Hailea water chiller* Ultra Titan 1500 (*HC500* = 790Watt cooling capacity) £497.89 http://www.watercoolinguk.co.uk/p/Hailea-water-chiller-Ultra-Titan-1500-HC500-=-790Watt-cooling-capacity-UK-Plug_53532.html

*Hailea Waterchiller* Ultra Titan *500* (HC300-395Watt cooling capacity) £376.45 http://www.watercoolinguk.co.uk/p/Hailea-Waterchiller-Ultra-Titan-500-HC300-395Watt-cooling-capacity_20553.html?gclid=COasgsbL0NACFU0aGwode2wAaA

*Hailea Aquarium Chiller HC 500A* £373.99 http://www.completeaquatics.co.uk/hailea-aquarium-chiller-hc500a#tab-description <-- this one is from aquatics website so doesn't list cooling power in watts

/\ a lot of "500"'s used in the product titles too; a bit confusing.
How would your HC-500A compare to the two on the computing websites in terms of power? I can only find that one on the aquarium sites (so I'd have to work out the tubing adapter requirements etc. for linking it into my loop on my own)...
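For comparing those listings, a back-of-envelope heat-load estimate helps. A quick sketch; the component wattages are my own ballpark assumptions (rough board/CPU power limits, not measured draw), while the 790W and 395W capacities are the figures from the two WatercoolingUK listings above:

```python
# Rough chiller sizing: total loop heat load vs. rated cooling capacity.
# Component wattages below are assumed ballpark figures, not measurements.
components_w = {
    "GTX 1080 (power modded)": 300,
    "5960X overclocked": 250,
    "pump + VRM + misc": 30,
}
load_w = sum(components_w.values())
print(f"Estimated loop heat load: {load_w} W")

# Rated capacities as listed on the shop pages quoted above
chillers_w = {"Ultra Titan 1500 (HC500)": 790, "Ultra Titan 500 (HC300)": 395}
for name, capacity in chillers_w.items():
    print(f"{name}: {capacity} W rated -> {capacity - load_w:+} W headroom")
```

On numbers like these the smaller unit has essentially no headroom at all, which would explain why an undersized chiller struggles as soon as ambient temperatures rise.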


----------



## nrpeyton

Quote:


> Originally Posted by *kasper96*
> 
> Hi Casper123....
> 
> I installed an Alphacool GPX Pro on my EVGA 1080 FTW just recently. (In fact I sent my card in to Alphacool in July for the pre-production scanning process and got my cooler for free
> 
> 
> 
> 
> 
> 
> 
> ). Installation was no problem, never got a temperature above 36°C of the GPU. Testing with Furmark and Prime95 ([email protected]@1.31V) in parallel.
> Custom water cooling loop with an external MoRa 360.
> However I cannot compare to other cards/setups since this is my first water cooling build.
> Benchmarking with 2179MHz and +510 MHz on VRAM works (Time Spy, Firemark Ultra/Extreme, Valley) is fine. Witcher 3 crashes with this setup, have to use 2151Mhz and +400MHz VRAM. Did not fiddle with the curves in AB so far , just moving sliders to the right ;-)
> 
> Best wishes,
> kasper96


*What VRM and Memory temperatures are you getting with the Alphacool block?*

Are you able to check?

I am very interested because I nearly got the Alphacool block myself (but decided against it because the VRM and GDDR5X memory are only "passively" cooled; only the GPU core gets water flow).

Here's an example of how I took temps with an *EK 780 Ti Classy block modified to fit my EVGA 1080 Classified*, using 4 temperature probes connected to my £25 fan controller (orange arrows pointing to the back of the GDDR5X RAM chips):

*Full Size - right click picture & 'open new tab'*


*Up Close:*


Quote:


> Originally Posted by *Casper123123123*
> 
> Hello Guys,
> 
> I decided to go with Alphacool GPX Pro solution for our card. Will provide you an update with temps in 2-3 weeks. Meanwhile if there are such owners please share your results here guys ...
> 
> Best Regards,
> DJ_Cas


Which 1080 card do you have?


----------



## kasper96

Hi nrpeyton,

unfortunately I will not be able to measure VRM and memory temperatures. There were tons of thermal pads included, and I also installed the backplate. Too much of a hassle for me to disassemble the rig, drain the loop, etc. I tend to build the system, overclock to see what is possible, turn down the settings a bit and leave it as it is.
I touched the backplate while benchmarking and I guess it is around 35-40°C. I can measure with a thermal probe, but not earlier than early next week (will be on a trip for a couple of days), and will provide the data.
I have an inverted build with the backplate facing towards the bottom of the PC (be quiet! Dark Base Pro), so airflow is not an issue.

Best wishes,
kasper96


----------



## kdgamer

I picked up a Gigabyte GTX 1080 Turbo OC for £499.98 on Black Friday (plus sold the Watch Dogs 2 code for a further £25 off!). There's not much info on this card out there; however, it's a blower-style cooler with a beefier-looking heatsink than the Founders Edition. So far so good: it's running relatively cool and quiet in my Fractal Node 304 ITX rig. Before this I had a massive Asus GTX 980 Ti Strix triple-fan solution dumping tons of hot air into my case; while it never overheated or anything of the sort, my CPU is noticeably cooler now that my CPU's AIO (my only exhaust in the case) isn't having to exhaust the GPU's hot air as well.


GV-N1080TTOC-8GD


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> Do you have temp probes attached to your fan controller (as I do then)?
> 
> Did you take the back plate off your MSI Seahawk especially to probe the temps on it (VRM & GDDRX5)? Just out of curiosity? Or did you have an issue at some point?
> 
> I decided I better check since I custom fitted a 780 TI block to my 1080 classy lol.
> 
> In any case I was pleased to see someone else has also checked this (I had *no* way to compare otherwise)
> 
> Sure I gave you rep for that post lol


I use an infrared thermometer, and I removed the backplate because it was not cooling anything and could generate hot air pockets; this way the case fans blow all over the card's back!

Thanks for the rep! You also deserve lots of rep for all your endeavour and effort to give something back to the community!

Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *kasper96*
> 
> Hi nrpeyton,
> 
> unfortunately I will not be able to measure VRM and memory temperature. There were tons of thermal pads included and i installed also the backplate. Too much of a hassle for me to disassemble the rig, drain the loop etc. I tend to build the system, overclock to see what is possible, turn down the settings a bit and leave it as it is.
> I touched the backplate while benchmarking and I guess it is around 35-40°C. I can measure with a thermal probe, but not earlier than early next week (will be on a trip for a couple of days) and provide the data.
> I have an inverted build with the backplate facing towards the bottom of the PC, be quiet1 Dark Base Pro, so airflow is not an issue.
> 
> Best wishes,
> kasper96


Okay, thank you; it would be a fantastic comparison though (I'm sure you'd be interested in the results between these Alphacool hybrids and full-cover EKs too). 

Can you take the backplate off without removing anything else? (I don't even need to remove my card from the slot.)

Anyway, I look forward to hearing from you on this one if you get time


----------



## nrpeyton

Quote:


> Originally Posted by *OccamRazor*
> 
> I use a infrared thermometer and i removed the backplate because it was not cooling anything and could generate hot air pockets, this way the case fans blow all over the cards back!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the rep! You also deserve lots of rep for all your endevour and effort to give something back to the comunity!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occamrazor


*Thanks mate, appreciated -- means a lot to hear someone say that 

Speaking of that: here's the latest email from EK_Grega:
*

==========================================


EKWB Support (Grega) (EKWB Support)
Nov 29, 17:50 CET
_Hello Nick,

GREAT!

GPU temperatures are good (means expected with your water temps).

55°C max *memory* temp with overclocked GDDR5X is also good.

Another good measurement would be to place a probe behind the GPU VRM part and run a few heavy benchmarks.

I would also ask you for a few detailed pictures of the installed water block.
-from the side (FC Terminal part)
-from the side (contact with GPU VRM)
Best regards,
Grega_
==========================================

*We may see EK Cooling Configurator updated soon 

I just sent him all the data yesterday, so I'll hear shortly whether I need to look into improving the temps of the 3 GDDR5X chips between the Core & VRM.

My first thought was that maybe they weren't making as good contact. But when you think about it, sitting between the Core & VRM, it's not surprising they run hotter than the others.

Alternatively, I might need to consider doing the same "mod" to compensate for the way the "step" on the block misses the chips on the right-hand side (like I did on the left). After all, they are the hottest chips, so it makes sense to have 100% surface contact instead of only 85%.

See below:
*
Full size - right click *picture* & 'open new tab'


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> *Thanks mate, appreciated -- means a lot to hear someone say that
> 
> Speaking of that: here's the latest email from EK_Grega:
> *
> 
> ==========================================
> 
> 
> EKWB Support (Grega) (EKWB Support)
> Nov 29, 17:50 CET
> _Hello Nick,
> 
> GREAT!
> 
> GPU temperatures are good (means expected with your water temps).
> 
> 55°C max *memory* temp with overclocked GDDR5X is also good.
> 
> Another good measurement would be to place a probe behind the GPU VRM part and run a few heavy benchmarks.
> 
> I would also ask you for a few detailed pictures of the installed water block.
> -from the side (FC Terminal part)
> -from the side (contact with GPU VRM)
> Best regards,
> Grega_
> ==========================================
> 
> *We may see EK Cooling Configurator updated soon
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've just sent him all the data off yesterday on this so I'll hear shortly if I need to look into trying something to improve temps of the 3 GDDR5X chips between the Core & VRM.
> 
> 'First thought' maybe they weren't making as good 'contact'. But when think about it, due to being in the middle of Core & VRM it's not difficult to expect them run hotter than others.
> 
> Alternatively I might need to consider doing same "mod" to compensate the way the "step" on block misses the chips on the right hand side, (like I done on the left). After all, they are the hottest chips so it make sense to have 100% 'surface contact' instead of only 85%.
> 
> See below:
> *
> Full size - right click *picture* & 'open new tab'






Did you put the thermal pad fix on the underside of the PCB along the mosfet area?
No wait...damn Classy.
lol
The area adjacent to those three memory chips, your hotspot.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> 
> Did you put thermal pad fix un the underside of the PCB along the mosfet area?
> No wait...damn Classy.
> lol
> The area adjaent to those three memory chips, your hotspot.



*see the row of 3 memory chips here (hottest part of card) /\*
*above where the arrow points* _Edit: the arrow shows in different places depending on resolution. It's the row of 3 memory chips, roughly between the VRM and the core._

No I never received my pads from EVGA, I just used 0.5mm on memory and 1.0mm on VRM (EK pads)

no backplate though, just running bare circuit board (that's how I'm able to probe temps so easily) 

*Anyone else fancy taking a few screws off to remove the back plate and getting me some temperatures for comparison? (OccamRazor did one too, so it would be nice to get some more people involved.) The temperature of the memory vastly affects how hard you can overclock it *

Quote:


> Originally Posted by *kasper96*
> 
> Prio to my current setup I used an AIO from Corsair H110 with 2 x140mm Noiseblocker A14 PWM, but for CPU only. Temperature slightly higher than now.
> Other than that I have not used any other radiator. Jumped on the external radiator bandwagon right from the start.
> I am using this one: http://www.aquatuning.de/wasserkuehlung/radiatoren/radiatoren-aktiv/15381/watercool-mo-ra3-360-lt-black
> Best wishes,
> kasper96


I looked at that picture; it doesn't look long enough to be a 360mm radiator?


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> no backplate though, just running bare circuit board (that's how I'm able to probe temps so easily)
> 
> and no I never received my pads from EVGA, I just used 0.5mm on memory and 1.0mm on VRM (EK pads)






Well, those 3 memory chips are the ones that are obviously in the most danger from heat bleeding down the PCB.
Anything you can do to draw the heat away from them would be of benefit, and if that means putting the backplate on with some good 11W/mK pads making contact with the backplate to dissipate the heat, then it is worth a try.
But intuitively you would think that with the backplate off it would be cooler in that area. But then you don't have fans blowing over it, do you, which would of course confound your probing.

ETA: Either that or sell the pig and get a FTW like you should have at the start, lol


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> 
> Well those 3 Memory chips are the ones that are obviously in the most danger from heat bleeding down the pcb.
> Anything you can do to draw the heat away from them would be of benefit and if that means putting the backplate on with some good pads 11w/mk making contact with the backplate to dissipate the heat then it is worth a try.
> But Intuitively you would think with backplate off it would be cooler in that area. But then you don't have fans blowing over it do you which would of course confound your probing.
> 
> ETA Either that or sell the pig and get a FTW like you should have at the start, lol


haha lol - this Classy has been a roller coaster ride  but at one point I was physically and mentally depressed wishing I had an FTW lol; not anymore though haha 

okay good advice, *just ordered some new pads* (i'll try them on those 3 chips):



*Fujipoly Thermal Pad 0.5mm 14W/mK (£10.79)* _I hope they're genuine Fujipoly and not a fake rip off!!_
http://vod.ebay.co.uk/vod/FetchOrderDetails?transid=1587235013008&itemid=181831682624&qu=1&ul_noapp=true

Does anyone with experience know if they will really make a difference, or is it just a marketing ploy? (EK pads are rated at only 3.5W/mK.)

OR, I've just had another idea?!

What about three '14mm x 10mm x 0.5mm' copper shims?
_/\ 14mm x 10mm is the exact surface area of the GDDR5X chips_
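For what it's worth, the pad-rating question can be put into numbers. A quick sketch of the conductance k*A/t through a 0.5mm layer on one 14mm x 10mm GDDR5X chip; note this ignores the contact resistance at both faces (which in practice dominates once the pad itself is good), and the copper figure is only there for scale:

```python
def conductance_w_per_k(k_w_mk: float, area_mm2: float, thickness_mm: float) -> float:
    """Thermal conductance G = k * A / t of a flat layer, in W/K."""
    return k_w_mk * (area_mm2 * 1e-6) / (thickness_mm * 1e-3)

chip_area_mm2 = 14 * 10  # GDDR5X footprint quoted above
for name, k in [("EK pad, 3.5 W/mK", 3.5),
                ("Fujipoly pad, 14 W/mK", 14.0),
                ("copper shim, ~390 W/mK", 390.0)]:
    g = conductance_w_per_k(k, chip_area_mm2, 0.5)
    print(f"{name}: {g:.2f} W/K")
```

So the 14W/mK pad is a real 4x improvement through the pad itself; whether that shows up in chip temps depends on how big a share of the total thermal resistance the pad actually is.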


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> haha lol - this Classy has been a roller coaster ride  but at one point I was physically and mentally depressed wishing I had an FTW lol; not anymore though haha
> 
> okay good advice, *just ordered some new pads* (i'll try them on those 3 chips):
> 
> 
> 
> *Fujipoly Thermal Pad 0.5mm 14W/mK (£10.79)* _I hope they're genuine Fujipoly and not a fake rip off!!_
> http://vod.ebay.co.uk/vod/FetchOrderDetails?transid=1587235013008&itemid=181831682624&qu=1&ul_noapp=true
> 
> Anyone have experience know if they will really actually make a difference or is it just marketing ploy? (EK pads are rated only 3.5W/mK)
> 
> OR just had another idea??!?!
> 
> What about three '14mm x 10mm x 0.5mm' Copper Shims?
> _/\ 14mm*10mm is exact surface area of GDDR5X chips_






The copper extrusion that covers the chips on the FTW Hybrid has pads; it is not bare coverage, so I would not recommend that.

Also, I am not aware of Fuji having 14W pads. I know they have 11 and 17, but maybe they do have 14; I just have not seen them before.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> 
> The copper extrusion that covers the chips on the FTW Hybrid have pads. It is not bare coverage so I would not recommend that.
> 
> Also I amnot aware of Fuji having 14w pads. I know they have 11 and 17 but maybe they do have 14 I just have not seen them before.


I couldn't find the 17's in 0.5 :-(

I might email Fujipoly now and check (and report this guy if he is defrauding everyone).

Can anyone read/write in Japanese?

*NVIDIA Pascal open GPU documentation released:*

*Right click link and select "save target as" for the PDF's*

ftp://download.nvidia.com/open-gpu-doc/pascal/1/BIT_DISPLAY_PTRS-U-BIT_DP_PTRS-d.pdf
ftp://download.nvidia.com/open-gpu-doc/pascal/1/gp100-fbpa.txt
ftp://download.nvidia.com/open-gpu-doc/pascal/1/gp100-mmu-format.pdf
ftp://download.nvidia.com/open-gpu-doc/pascal/1/gp100-msi-intr.txt

Could be useful for the programming of GPU tools

/\ originally posted at techpowerup.com by StefanM


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> I couldn't find the 17's in 0.5 :-(
> 
> I might email Fujipoly now and check (and report this guy if he is defrauding everyone).
> 
> Can anyone read/write in Japanese?
> 
> *NVIDIA Pascal open GPU documentation released:*
> 
> *Right click link and select "save target as" for the PDF's*
> 
> ftp://download.nvidia.com/open-gpu-doc/pascal/1/BIT_DISPLAY_PTRS-U-BIT_DP_PTRS-d.pdf
> ftp://download.nvidia.com/open-gpu-doc/pascal/1/gp100-fbpa.txt
> ftp://download.nvidia.com/open-gpu-doc/pascal/1/gp100-mmu-format.pdf
> ftp://download.nvidia.com/open-gpu-doc/pascal/1/gp100-msi-intr.txt
> 
> Could be useful for the programming of GPU tools
> 
> /\ originally posted at echpowerup.com by StefanM


http://www.frozencpu.com/cat/l3/g8/c487/s1797/list/p1/Thermal_Interface-Thermal_Pads_Tape-Ultra_Extreme_Thermal_Pads-Page1.html

http://www.frozencpu.com/cat/l3/g8/c487/s1730/list/p1/Thermal_Interface-Thermal_Pads_Tape-Extreme_Thermal_Pads-Page1.html

Won't do you any good over there but here is a NA supplier.


----------



## Tdbeisn554

So I returned my replacement 1080 Classified; it had LED bleed... The RMA was really slow (took like a month...), and when I finally got a new card it was not a new one... I'm really angry, as I made the ticket within 30 days of purchase. I RMA'ed that one too (at least I got a pre-paid label and am promised a brand new one). My card will arrive at EVGA tomorrow, so I really hope I do not need to wait long for my new card, since I bought it like 2 months ago and only got to use it for like 2 weeks... really disappointed in EVGA's customer service.

If I have my card back I will try overclocking and tinkering with it and playing lots of games at max settings instead of low/medium that I do now on my laptop


----------



## ucode

Quote:


> You never read my full post. (you must have stopped after the first sentence)?


I can read just fine. Hardware voltage mods will still show the default voltage in software. I've seen a HOF card (HOF VBIOS) showing over 1.093V, and I've personally run a 1050 Ti over 1.1V on the stock VBIOS. Granted it wasn't a huge amount, but nevertheless...


----------



## OccamRazor

Quote:


> Originally Posted by *Archang3l*
> 
> So I returned my replacement 1080 classified, had led bleed... RMA was really slow (took like a month...) And when I finally got a new card it was not a new one... I'm really angry as I made ticket within the 30 days of purchase. I RMA'ed the one too (at least got a pre paid label and I am promised a brand new one) my card will arrive at EVGA tomorrow so I really hope I do not need to wait really long to get my new card, Since I bought it like 2 months ago and only got to use it for like 2 weeks... really disappointing in EVGA's customer service.
> 
> If I have my card back I will try overclocking and tinkering with it and playing lots of games at max settings instead of low/medium that I do now on my laptop


EVGA is moving to new warehouses, which could explain the delays and mix-ups in customer service, but for a company as big as they are, they should have planned for it in the first place; the customer always comes first! It's a shame really...

Cheers

Occamrazor


----------



## OccamRazor

Anyone having DisplayPort problems???? I can´t seem to get any port working... already tried lots of things like disabling fast boot and UEFI PCIe boot and everything else I could think of (and yes, I know that with the T4 BIOS one DisplayPort is disabled...)


----------



## Crazy9000

Quote:


> Originally Posted by *OccamRazor*
> 
> Anyone having Displayport problems???? I can´t seem to get any port working... already tried lots of things like disabling fast boot and uefi pcie boot and all i could think of (and yes i know with T4 bios 1 displayport is disabled...)


I run 2x DisplayPort and one DVI monitor, and I haven't had a single problem since launch on my MSI Aero 1080.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> http://www.frozencpu.com/cat/l3/g8/c487/s1797/list/p1/Thermal_Interface-Thermal_Pads_Tape-Ultra_Extreme_Thermal_Pads-Page1.html
> 
> http://www.frozencpu.com/cat/l3/g8/c487/s1730/list/p1/Thermal_Interface-Thermal_Pads_Tape-Extreme_Thermal_Pads-Page1.html
> 
> Won't do you any good over there but here is a NA supplier.


Best one I could find (in stock) was 11 W/mK. Must be the EVGA thermal issue; everyone is buying them lol :-(

Shipping is also 2X the price of the pad lol. And it takes "15 to 30 days". How the **** can something take 15-30 days to ship?

I ordered an EK block last week; on Thursday it was shipped from Slovenia to Scotland (U.K.) and arrived the *day after*, on Friday.

The U.S. postal service must be the worst in the world lol .. sorry, but it gets me frustrated :-(

15 to 30 days at $15, and it is only the same size as a letter.

Anyway, thanks for the link, I've added it to my favourites; I'll see how I get on with this eBay one first, and if it's no good I'll email frozencpu.com and see if they can ship it at a decent price, a lot sooner.

Quote:


> Originally Posted by *Archang3l*
> 
> So I returned my replacement 1080 classified, had led bleed... RMA was really slow (took like a month...) And when I finally got a new card it was not a new one... I'm really angry as I made ticket within the 30 days of purchase. I RMA'ed the one too (at least got a pre paid label and I am promised a brand new one) my card will arrive at EVGA tomorrow so I really hope I do not need to wait really long to get my new card, Since I bought it like 2 months ago and only got to use it for like 2 weeks... really disappointing in EVGA's customer service.
> 
> If I have my card back I will try overclocking and tinkering with it and playing lots of games at max settings instead of low/medium that I do now on my laptop


Why didn't you make use of the "cross-shipping" or "advanced RMA" service? You wouldn't have been without a card for more than a day or two.

Didn't you RMA it the first time due to it being a "bad clocker", or did you say it was 'coil whine'? Can't quite remember.


----------



## OccamRazor

Quote:


> Originally Posted by *Crazy9000*
> 
> 2x display port and one DVI monitor, and haven't had a single problem since launch on my MSi aero 1080.


I have 3 ASUS VG278HE, 1 is working through DVI but the other 2 do not work with DP to DVI adapter, already tried 2 cables also and nothing!


----------



## Crazy9000

Quote:


> Originally Posted by *OccamRazor*
> 
> I have 3 ASUS VG278HE, 1 is working through DVI but the other 2 do not work with DP to DVI adapter, already tried 2 cables also and nothing!


It's likely the DP to DVI adapters. Those have always been troublesome.


----------



## OccamRazor

Quote:


> Originally Posted by *Crazy9000*
> 
> It's likely the DP to DVI adapters. Those have always been troublesome.


I have no choice, other than buy another monitor that is...


----------



## Crazy9000

Quote:


> Originally Posted by *OccamRazor*
> 
> I have no choice, other than buy another monitor that is...


Apparently for multiple monitors you often need active display port adapters instead of the much cheaper passive ones. I don't really know why.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> The copper extrusion that covers the chips on the FTW Hybrid has pads. It is not bare coverage, so I would not recommend that.


I don't understand what you mean by that above?

What I was meaning to say is: instead of having
GDDR5X chip --> 0.5mm pad --> waterblock
have
GDDR5X chip --> 0.5mm copper shim --> waterblock (+ thermal paste)

And *only* on the 3 hottest chips between core and VRM would I use the little "shims"; all other 5 GDDR5X chips would just keep their 0.5mm pads.

(I was actually getting a higher mem O/C until temps on those 3 GDDR5X chips began exceeding about 42C.)

Also -- I've NEVER seen a game use over 5 GB of VRAM. It's a shame there isn't some software to force the card to use the other, cooler chips first lol...
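To put rough numbers on the shim idea above, here is a one-dimensional thermal-resistance comparison of the two stacks. The pad conductivity matches the 11 W/mK pads mentioned in this thread; the copper and paste conductivities and the chip footprint are assumed ballpark values, not measurements from any card:

```python
# Rough 1-D thermal resistance comparison: 0.5 mm pad vs 0.5 mm copper
# shim + thin paste layer. All material values and the chip footprint
# are assumptions for illustration only.

def resistance(thickness_m, conductivity_w_mk, area_m2):
    """Conductive thermal resistance R = t / (k * A), in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

AREA = 0.014 * 0.012                         # assumed GDDR5X footprint, m^2

r_pad  = resistance(0.0005, 11.0, AREA)      # 0.5 mm pad at 11 W/mK
r_shim = (resistance(0.0005, 385.0, AREA)    # 0.5 mm copper shim
          + resistance(0.00005, 8.5, AREA))  # ~0.05 mm paste layer

print(f"pad:  {r_pad:.3f} K/W")
print(f"shim: {r_shim:.3f} K/W")             # roughly 6x lower for the same gap
```

The gap-filling requirement is the same either way; the shim wins because copper conducts a couple of orders of magnitude better than any pad, so the thin paste layer dominates its total resistance.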


----------



## SmackHisFace

Quote:


> Originally Posted by *OccamRazor*
> 
> I have no choice, other than buy another monitor that is...


I use two DP to DVI cables on my Zotac 1080 with the stock BIOS. No issues.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Don't understand what you mean by that above?
> 
> What I was meanig to say is; instead of having:
> GDDR5X chip --> 0.5mm pad --> waterblock
> have:
> GDDR5X chip --> 0.5mm copper shim --> waterblock (+ thermal paste)
> 
> And *only* on the 3 hottest chips between core and VRM would I use the little "shims", all other 5 GDDR5X chips would just keep their 0.5mm pads.
> 
> (I am actually getting higher mem O/C until temps on those 3 GDDR5X chips begun exceeding about 42c).
> 
> Also -- I've NEVER seen a game use over 5gb VRAM. It's a shame there isn't some software to force the card to use the other cooler chips first lol...


http://www.gamersnexus.net/hwreviews/2582-evga-gtx-1080-ftw-hybrid-review-vs-sea-hawk-x?showall=1

On the Hybrid that copper plate goes over the pump cold plate and cools the VRAM. There are thermal pads between the copper and the chips.


----------



## nexxusty

Quote:


> Originally Posted by *OccamRazor*
> 
> (*Sorry, just realised now you wanted info about the EK version, while my card is the AIO version...*)
> Temps never go beyond 45C and this is with 1.18V/1.20V, ambient temp of 25C; VRMs never go beyond 60C, measured with an IR gun
> (set a moderate fan profile in AB and that´s enough)
> (removed the backplate though, as it was blocking airflow and generating hot air pockets at higher voltages)
> out of the box it goes up to [email protected]; it's fun to tweak with the T4 bios,
> the Corsair fan does its job and it's not too loud, but I changed it for a push/pull fan setup and temps dropped by 3C.
> These cards really need cold to get the best out of them.
> 
> My brother Skyn3t and me got our cards at 600$ each, used, still with warranty; it was a good deal!
> 
> Nop! This is *OCN*! We *DO NOT* ignore our fellow members, users, forum dwellers or lurkers! It's our duty to help whoever and whenever help is requested! I said this in the past and I will say it again as many times as needed!
> This is a forum: "_a meeting or medium where ideas and views on a particular issue can be exchanged_". So, people not accepting this, well, it's not the place for them at all... Heck! Some don't even accept this is an OC place; I thought the name would give it away...
> 
> The algorithm that rules the polynomial heat/voltage/power/frequency is conservative by nature, so lower voltage/power/frequency, heat dependent, is always set by drivers/bios; like Coopiklaani said: _"offset method allows certain degree of "play" in voltages"_, so controlling the other variables (increasing the TDP limit, lowering the heat by increasing the cooling effectiveness) will not trigger the limitations and will allow for a higher OC!
> 
> But... (there is always a but, right?)
> 
> Having a higher clock does not mean you will have higher performance even though it's benchmark or game stable; this happens both on core and memory OC. Unfortunately there are lots of physical and software limitations set on this card that can trigger a low p-state, while the readings all software can get from the INA3221 (IC sensor) are not reported correctly as what they really are, but you can see it in lower scores and lower framerates! Always check on performance and do not rely on core and memory numbers; TiN (you all know him, right?) wrote: _"Key point is to see if performance goes up, not just the clocks"_
> 
> Cheers
> 
> Occamrazor


I help when people deserve it.

I've seen a lot of people looking for handouts in this forum lately and I'm not having it.


----------



## Radmanhs

What kinda OCs are you guys able to get on your cards? I can only manage to get +250 out of the mem, which isn't great. However, I am able to get +225 out of the GPU, which as far as I can tell is pretty good.


----------



## kasper96

nrpeyton:
I looked at that picture; it doesn't look long enough to be a 360mm radiator?

This thing is huge (though it is the small version; there is also a MoRa 420). It can be equipped with either 9 x 120 mm fans or 4 x 180 mm fans, which is what I did. The advantage is it always gets fresh, i.e. cool, air, since it is mounted underneath my table, below the drawer.
It is more or less three 360 mm radiators, 60 mm thick.
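The "more or less three 360mm radiators" claim checks out on face area alone. A quick sketch of the arithmetic (this ignores core thickness and fin density, which also matter for real dissipation):

```python
# Face-area check: a 9 x 120 mm fan grid (MoRa-style) vs three
# standard 120 x 360 mm radiators. Thickness and FPI are ignored.

mora_face  = (3 * 120) * (3 * 120)   # 360 x 360 mm fan grid
three_360s = 3 * (120 * 360)         # three 120 mm x 360 mm rads

print(mora_face, three_360s)         # identical face area in mm^2
```

Both come out to 129,600 mm², so as a first approximation the single external rad really does stand in for a triple-rad loop.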


----------



## nrpeyton

Quote:


> Originally Posted by *nexxusty*
> 
> I help when people deserve it.
> 
> I've seen a lot of people looking for handouts in this forum lately and I'm not having it.


Who are you referring to?

I've seen you post that exact same post before, a while ago?

Quote:


> Originally Posted by *Derek1*
> 
> http://www.gamersnexus.net/hwreviews/2582-evga-gtx-1080-ftw-hybrid-review-vs-sea-hawk-x?showall=1
> 
> On the Hybrid that copper plate goes over the pump cold plate and cools the VRAM. There are thermal pads between the copper and the chips.


I skimmed over that link a bit (need to get some sleep) but the bits I read made it make perfect sense now 

There's also a part where they said that GDDR5X doesn't generate a lot of heat; so that really makes me wonder now if I'm actually getting proper contact on those 3 VRAM chips I keep going on about at all....

Going to have to do some really good in-depth trial and error testing to work out what's causing it. 65C on a VRAM chip can't be right (especially when only measured from the back of the PCB).

Anyway, very useful mate, cheers; rep +1 for that link lol 

good night for now


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> who u referring to?
> 
> ive seen u post that exact same post before a while ago?
> 
> I skimmed over that link a bit (need to get some sleep) but the bits I read made it make perfect sense now
> 
> there's also a part where they said that GDDR5X doesn't generate a lot of heat; so that really makes me wonder now if I'm actually getting proper contact on those 3 VRAM chips I keep going on about at all....
> 
> going to have to do some really good in-depth trial and error testing to try and work out whats causing it. 65c on a VRAM chip can't be right (especially when only measured from back of PCB).
> 
> anyway very useful mate cheers, rep +1 for that link lol
> 
> good night for now


Well, in the GamersNexus benchmark of my card, which uses a plate for the VRAM that feeds into the GPU block, it ran 8-10C hotter with the plate on vs off. So if the VRAM is boosting the GPU block temps by 10C, I think it's safe to say it generates a decent amount of heat.

Quote:


> Originally Posted by *Radmanhs*
> 
> what kinda OC's are you guy able to get on your cards? I only manage to get 250 out of the mem which isn't great. However I am able to get 225 out of the gpu which as far as I can tell is pretty good.


With +225, what is your card boosting to, and is it staying there the entire time?

I was able to get 2164-2177 on the core, with +1000 to memory. I use +995; for some reason 995 gets higher benchmarks.


----------



## Radmanhs

After a little tweaking I had to drop the core to 220, which is why I thought it was my RAM that was capped for some reason. The RAM can go a lot higher now, but my core clock is only at 2126.


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> Well the nexus benchmark of my card, which uses a plate for the vram that feeds into the GPU block ran 8-10C hotter with the plate on vs it off. So if vram is boosting the gpu block temps by 10C i think its safe to say it generates decent amount of heat.


It's not the block that is heating up by an extra 10C (that's not what I meant)... I mean the actual VRAM (memory chips) on my card are heating up to 65 degrees C+.

But on GamersNexus I just read a passage saying that GDDR5X doesn't generate a lot of heat. So I'm worried my block isn't actually making proper contact with 3 of the memory chips on my board (3 out of a total of 8).

I'm taking the temperatures using temp probes connected to my fan controller (sticking them to the back of the PCB), obviously with *all* back-plates off. 

The core & VRM temps are great.


----------



## Dragonsyph

If my core OC is at 2164 with 1.09v, and when I OC it more it never artifacts but just crashes the benchmark, does that mean it's not enough voltage?

So I could probably get a higher core clock with higher voltages?


----------



## GreedyMuffin

I run 2100 at 1.000V. The card is cool and is using less voltage and power compared to stock!


----------



## Dragonsyph

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I run 2100 at 1.000V. Card is cool and is using less voltage and power compared to stock !


Whats the fun in that? Crank that sob to the max son.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Its not the block that is heating up by an extra 10c (that's not what I meant).. I mean the actual VRAM (memory chips) on my card are heating up to 65 degrees C +.
> 
> But on gamersnexus I just read a text saying that GDDR5X doesn't generate a lot of heat. So I'm worried my block isn't actually making proper contact with 3 of the memory chips on my board. (3 out of total of 8)
> 
> I'm taking the temperatures using temp probes connected to my fan controller. (Sticking them to the back of the PCB) obviously with *all* back-plates off.
> 
> The core & VRM temps are great.


As you and Dragon both saw in the GN link, the Hybrid ran hotter than the Sea Hawk, and it was Steve's hypothesis that the reason was the copper plate inflating the core temps: there was now more to cool, making the unit less effective on the core. Especially from those three memory chips next to the VRM; proximity to the VRM was making those three chips warmer than the others.

I was going to test that hypothesis with my conversion kit by leaving the copper plate off and instead using heat sinks on the VRAM, because, as you noted Steve saying, the GDDR5X runs cooler, therefore making it possible to get away with the heatsinks, and then I could keep the water unit's cold plate dedicated to cooling the core only. I was going to attach the heat sinks using Seki Sui thermal tape (still waiting for that to arrive from Japan btw), as the heat sinks come with 3M 8815 adhesive pads attached, which are not as good. *ETA: Before conversion I could only OC the memory to 600 before I would begin to see artifacts; now, however, with the copper plate cooling the VRAM, I am able to go to 850 stable.
*
But I went ahead and made my conversion leaving the copper plate on, and the only mod I made was swapping out the stock fan and going push/pull with a couple of Corsairs. My resulting temps are now equivalent to the Sea Hawk's, idling around 28C and never going above 50C under benching/game stress. So if the tape ever arrives I am not sure whether I will still make that mod, as it seems unnecessary. I don't think I can get the core temps any lower and/or not be affected by the increase in VRAM temps I would see. I am not monitoring the VRAM at all at this point, so I am unsure of what is going on there. Maybe I will pick up the necessary tools over Christmas to check.

But my main purpose in providing that link was to show you that they do not go bare copper to chip; the copper plate has thermal pads on it.


----------



## Dragonsyph

Quote:


> Originally Posted by *Derek1*
> 
> As you and Dragon both saw in the GN link, the Hybrid ran hotter than the Sea Hawk, and it was Steve's hypothesis that the reason was the copper plate inflating the core temps because there was more now to cool, making it less effective on the core. Especially from those three memory chips next to the VRM. Proximity to the VRM was making those three chips warmer than the others.
> 
> I was going to test that hypothesis with my conversion kit by leaving the copper plate off and instead use heat sinks on the VRAM, because as you noted Steve saying the GDDR5X run cooler therefore making it possible to get away with the heatsinks, and then I could keep the water unit cold plate dedicated to cooling the Core only. I was going to attach the heat sinks using Seki Sui thermal tape, still waiting for that to arrive from Japan btw, as the heat sinks come with 3M 8815 adhesive pads attached which are not as good.
> 
> But I went ahead and made my conversion leaving the copper plate and the only mod I made was by swapping out the stock fan and going push pull with a couple Corsairs. My resulting temps are equivalent now to the SeaHawk, idling around 28C and never going above 50C under benching/game stress. So if the tape ever arrives I am not sure now whether I will make that mod as it seems unnecessary. I don't think I can get the Core temps any lower and/or not be affected by the increase in VRAM temps I would see. I am not monitoring the VRAM at all at this point so I am unsure of what is going on there. Maybe I will pick up the necessary tools over Christmas to check.
> 
> But my main purpose in providing that link was to show you that they do not go bare copper to chip, the copper plate has thermal pads on them.


Ya, it would be nice if NVIDIA put onboard sensors for these things that we could see in GPU-Z...


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> Why didn't you make use of "cross-shipping" service or "advanced RMA service" you wouldn't of been without a card for more than a day or two?
> 
> Didn't you RMA it the first time due to it being a "bad clocker" or did you say it was 'coil whine'? can't quite remember?


I did my first card with the cross-ship option, but you need to actually pay the full 800€ up front instead of just a hold on your card. And since I do not have a credit card to pay with, I had to ask my dad; he was not really happy to pay it... so this 2nd time I just opted for the standard RMA.

And yes, I returned it because of coil whine and because it was a bad clocker too.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> If my core OC is at 2164 with 1.09v and OC it more never artifacts just crashes the benchmark does this mean its not enough voltage right?
> 
> So i could probably get higher core clock with higher voltages?


That's what I thought as well, but when I used the T4 I could only get the voltage to go to 1.1v and it would still crash at 2177. So either my chip has maxed out or I didn't use the AB voltage curve right. Still tinkering with that.


----------



## ROKUGAN

Hi,

Can anyone with a ZOTAC 1080 AMP! Extreme and the T4 BIOS post some feedback on their results with that ASUS BIOS?
I´ve seen several people writing about it, but none posted any results back (at least I haven´t found them).

Many thanks


----------



## Coopiklaani

Rock solid 2202MHz @ 1.125v on my EVGA SC with T4! But my vram is holding me back. +550 is the most I can push without losing performance.
http://www.3dmark.com/tsst/18774
http://www.3dmark.com/fsst/225916


----------



## Vellinious

Quote:


> Originally Posted by *Coopiklaani*
> 
> Rock solid 2202MHz @ 1.125v on my EVGA SC with T4! But my vram is holding me back. +550 is the most I can push without losing performance.
> http://www.3dmark.com/tsst/18774
> http://www.3dmark.com/fsst/225916


What's it score?


----------



## xxspookyxx

Hello Guys
My Galax GTX 1080 in its standard configuration hits a max temp of 92ºC within 2 minutes in Assassin's Creed Unity.

I replaced the thermal paste with MX-4 and fitted a Kraken G10 + H75 watercooler, fan at 100%. My temps now are 52ºC in Assassin's Creed Unity, overclocked: 2113MHz/52ºC/1.075v.

Today I put everything on ultra at 4K, and in a specific place in World of Warcraft with a high density of textures my FPS drops from 120 to 35 fps (that's not the problem).
But I see my GTX 1080 in GPU-Z hit 102-110% power use, and temps go to 62ºC.

Is that normal for a watercooled GPU?


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Rock solid 2202MHz @ 1.125v on my EVGA SC with T4! But my vram is holding me back. +550 is the most I can push without losing performance.
> http://www.3dmark.com/tsst/18774
> http://www.3dmark.com/fsst/225916


*What temps are you getting on your VRAM (specifically the 3 chips between the core and the VRM)?*

Just like the core, the colder you get it, the more it will overclock.

Quote:


> Originally Posted by *xxspookyxx*
> 
> Hello Guys
> My GTX 1080 Galax in standard configuration, in 2 minutes go to max temp 92ºc Assassins Creeds Unity.
> 
> I replace thermal paste with MX-4, and put kraken g10+ watercooler H75 fan 100%. My temps now its 52ºc in Assassins Creed Unity overclocked 2113mhz/52ºc/1.075v
> 
> Today i put all in ultra 4k, and in a specific place in World of Warcraft with high density of textures, my FPS drops 120 to 35 fps, (Thats not the problem).
> But i see my Gtx 1080 in gpu-z and hit 102-110% power use, and temps go to 62ºc
> 
> Thats is normal for a watercooler gpu?


*62C is a little high for a watercooled GTX 1080, but it also depends a lot on how big your radiator is and on the ambient temperature (and also on whether the water is also cooling the memory/VRM).*


----------



## AllGamer

Quote:


> Originally Posted by *xxspookyxx*
> 
> Hello Guys
> My GTX 1080 Galax in standard configuration, in 2 minutes go to max temp 92ºc Assassins Creeds Unity.
> 
> I replace thermal paste with MX-4, and put kraken g10+ watercooler H75 fan 100%. My temps now its 52ºc in Assassins Creed Unity overclocked 2113mhz/52ºc/1.075v
> 
> Today i put all in ultra 4k, and in a specific place in World of Warcraft with high density of textures, my FPS drops 120 to 35 fps, (Thats not the problem).
> But i see my Gtx 1080 in gpu-z and hit 102-110% power use, and temps go to 62ºc
> 
> Thats is normal for a watercooler gpu?


How many radiators, and what sizes, do you have in that loop?

Sounds like either your heat is not being vented fast enough, or your loop is too small to handle the OC of the video card + CPU.

Does the temp still shoot up to 92C if you run the video card at stock speed?


----------



## Vellinious

Quote:


> Originally Posted by *AllGamer*
> 
> How many radiators and what sizes do you have in that loop?
> 
> sounds like either your heat is not being vented fast enough, or your loop is too small to handle the OC of the Video card + CPU
> 
> Does the Temp still shoots up to 92C if you run the video card in stock speed?


His temps are fine. It looks like he's hitting the power limit.
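For context on that power-limit reading: GPU-Z reports board power as a percentage of the BIOS power target, so converting to watts is a one-liner. The 180 W figure below assumes the GTX 1080 Founders Edition reference TDP; the Galax card's actual BIOS target may differ:

```python
# Convert a GPU-Z "Board Power Draw %" reading to watts.
# 180 W is the GTX 1080 Founders Edition reference TDP; a partner
# card's BIOS power target may be higher, so this is an estimate.

REFERENCE_TDP_W = 180.0

def power_watts(percent, tdp_w=REFERENCE_TDP_W):
    """Board power in watts from a GPU-Z percentage reading."""
    return tdp_w * percent / 100.0

for pct in (102, 110):
    print(f"{pct}% -> {power_watts(pct):.0f} W")
```

So 102-110% would be roughly 184-198 W against a 180 W target, i.e. the card is riding its power limit rather than its thermal limit, which matches the 62C temps.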


----------



## AllGamer

Quote:


> Originally Posted by *Vellinious*
> 
> His temps are fine. It looks like he's hitting the power limit.


I'm guessing that too;
that's why I asked whether it also hits that high running at stock speed.

Because if it still goes up that high at stock speed, then he has some water loop setup issue, or the block is probably not making proper contact with the GPU, or it needs a thermal pad in some areas of the GPU to properly transfer the heat.


----------



## diablodbl

Quote:


> Originally Posted by *ROKUGAN*
> 
> Hi,
> 
> Anyone with a ZOTAC 1080 AMP! Extreme and T4 Bios can post some feedback about the card results with that Asus Bios?
> I´ve seen several people writing about it but none posted any results back (at least I haven´t found them).
> 
> Many thanks


I'm looking for someone with this config.

I read all of this topic:

http://www.overclock.net/t/1612207/zotac-gtx-1080-amp-bios-flash-to-amp-extreme-edition/60

And I'm not feeling safe crossflashing a BIOS from another brand; so far I've just flashed the AMP! and AMP Extreme ones.

My max is 2050/5600, which is a shame for a card with that much potential.


----------



## ROKUGAN

Quote:


> Originally Posted by *diablodbl*
> 
> Im looking for someone with this config.
> 
> I read all this topic:
> 
> http://www.overclock.net/t/1612207/zotac-gtx-1080-amp-bios-flash-to-amp-extreme-edition/60
> 
> And im not feeling safe to crossflash a Bios from another brand, so far i just flashed AMP! and AMP Extreme.
> 
> My Max is 2050/5600, witch is a shame for a card with that much potencial.


Same here. I´ve read several members talking about flashing the T4 BIOS on the AMP Extreme, but I asked them for results and none reported back.
Maybe they bricked their cards and aren´t able to read my feedback requests

Now, I´m pretty sure it will work, but I would like to know about the difference, as the AMP Extreme bios is already pretty fine.


----------






## xxspookyxx

Quote:


> Originally Posted by *nrpeyton*
> 
> *What temps are you getting on you VRAM (specifically the 3 chips between core and VRM)?*
> 
> 
> 
> Just like the core; the colder you get it the more it will overclock.
> 
> *62c is little high for watercooled GTX 1080, but it also depends a lot on how big your radiator is and the ambient temperature (and also if the water is also cooling memory/vrm)*


My max CPU temp is 58ºC at standard voltage, overclocked to 4.7GHz.

I replaced the fan on the Kraken G10 with another fan with more CFM, 2100rpm instead of 1500rpm.

On the CPU I use an H105, with push/pull in the top of the case (intake).
I have 1 rear exhaust fan and 2 front intake fans,
and the H75 is an exhaust at the bottom.

My ambient temps are 27-29ºC


----------



## Radmanhs

I get a nice 45 degrees under load after overclocking. Then again, I'm running 2 triple rads for 2 components, all the fans are controlled by the CPU, and none of them run above 800 rpm.


----------



## steeludder

Quote:


> Originally Posted by *nrpeyton*
> 
> Rep+1
> 
> Loved reading this post; thats exactly the information I was looking for. It was also that exact chiller I was looking at too
> 
> Also thats a screaming overclock for a 3.5ghz CPU (1.0 --> 1.1GHZ overclock)  what motherboard do you have on that?


It's not THAT good of an overclock considering the temps and the voltage I have to feed it. Good chips would do 4.5GHz at 1.3V... I've got an ASUS X99-Deluxe mobo.

Quote:


> Originally Posted by *nrpeyton*
> 
> Also do you think it was all worth it then? Or did you do it for the same reason I am thinking of: _because I'm addicted to tinkering around and seeing how far I can push everything lol _


All of the above, lol
Quote:


> Originally Posted by *nrpeyton*
> 
> Whats the noise like from it (would it bother someone in another room trying to get to sleep before work in the morning)?


It's like a standalone fridge type of noise. So quite noisy when it switches on. Which it does for a couple of minutes every now and then to get the temp of the water back down. Then it completely stops until water temp is 2C above target. Would it disturb somebody in _another_ room? No.
Quote:


> Originally Posted by *nrpeyton*
> 
> And lastly if you had the option of buying a chiller again; would you of spent a little extra and got a more powerful model or are you completely happy you got the correct power/requirement ratio on the HC-500A?


No, it's perfect. With room temps around or under 25C you get really good results and can easily get the water around or below 10C. As said earlier, for that sort of heat load (1x overclocked 1080, 1x overclocked 5960X, 1x mobo VRM) it will cool the water ~15C under ambient. That's quite stunning imo. Bear in mind the next model up is a lot bigger (and noisier). Also, your main problem is not absolute cooling, it's dew point. The HC-500A will cool water below the dew point if you're not careful, so there's no point in getting a bigger, more powerful unit unless you want to add a few more things to your loop.
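The dew-point warning is worth quantifying before chilling a loop below ambient. A sketch using the Magnus approximation (constants valid roughly 0-60C); the 25C / 50% RH example conditions are assumptions, not measurements from this build:

```python
import math

# Magnus approximation for dew point from air temperature and relative
# humidity. Constants 17.62 / 243.12 are the common Magnus parameters.

def dew_point_c(temp_c, rel_humidity_pct):
    """Dew point in Celsius; condensation forms on surfaces below it."""
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

ambient, rh = 25.0, 50.0            # assumed room conditions
dp = dew_point_c(ambient, rh)
print(f"dew point: {dp:.1f} C")     # chill the water below this and
                                    # blocks/tubing start to condense
```

In a 25C / 50% RH room the dew point sits around 14C, so a chiller capable of sub-10C water is already into condensation territory unless the room is dry.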
Quote:


> Originally Posted by *nrpeyton*
> 
> I also see a few similar models with same names (vastly differently priced on different sites):
> 
> *Hailea water chiller* Ultra Titan 1500 (*HC500* = 790Watt cooling capacity) £497.89 http://www.watercoolinguk.co.uk/p/Hailea-water-chiller-Ultra-Titan-1500-HC500-=-790Watt-cooling-capacity-UK-Plug_53532.html
> 
> *Hailea Waterchiller* Ultra Titan *500* (HC300-395Watt cooling capacity) £376.45 http://www.watercoolinguk.co.uk/p/Hailea-Waterchiller-Ultra-Titan-500-HC300-395Watt-cooling-capacity_20553.html?gclid=COasgsbL0NACFU0aGwode2wAaA
> 
> *Hailea Aquarium Chiller HC 500A* £373.99 http://www.completeaquatics.co.uk/hailea-aquarium-chiller-hc500a#tab-description <-- this one is from aquatics website so doesn't list cooling power in watts
> 
> /\ a lot of "500"'s used in the product titles too; a bit confusing.
> How would your HC-500A compare to the two on the computing websites in terms of power? as I can only find that one on the aquarium sites (so I'd have to work out the tubing adapter requirements etc for linking it into my loop on my own......


I have the HC-500A (aka "Ultra Titan 1500"), dissipating 790W. Got it for £400 at Sterner Aquatech: http://www.sterner.co.uk/product/hailea-hc-500a-water-chiller/

The only thing you have to factor in is a sufficient water flow rate. I'm running under the chiller's recommended requirements: I've got two D5 pumps and they only barely manage to push 500 liters per hour. It's probably nothing to worry about, since the chiller is only powering up every now and then, so it won't overheat, but it's something to keep in mind. Aquarium chillers need a high flow rate. Adding a third D5 wouldn't hurt if the chiller were to run harder for longer periods, but also don't forget that pressure builds up in the loop as a consequence, and the blocks and connections will only take so much before they pop. It's a whole art in itself!
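One way to sanity-check the loop side of that 500 L/h figure: the water's temperature rise across the loop follows dT = P / (m_dot * c_p). The 600 W heat load below is an assumed round number for this kind of setup (OC'd 1080 + 5960X + mobo VRM), not a measurement:

```python
# Water temperature rise across a loop: dT = P / (m_dot * c_p).
# The 600 W heat load is an assumed figure for illustration.

C_P = 4186.0                       # specific heat of water, J/(kg*K)

def delta_t(heat_w, flow_l_per_h):
    """Loop water temperature rise in C; assumes ~1 kg per litre."""
    m_dot = flow_l_per_h / 3600.0  # kg/s
    return heat_w / (m_dot * C_P)

print(f"{delta_t(600.0, 500.0):.1f} C rise at 500 L/h")
```

At 500 L/h the loop itself only warms the water by about 1C, so the high-flow requirement is really about the chiller's own heat exchanger, not about the blocks.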


----------



## diablodbl

Quote:


> Originally Posted by *ROKUGAN*
> 
> Same here. I´ve read several members talking about flashing the T4 Bios in the AMP Extreme but I asked them on results and none reported back.
> Maybe they bricked their cards and aren´t able to read my feedback requests
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, I´m pretty sure it will work, but would like to know about the difference as the AMP Extreme bios is already pretty fine.


I've sometimes seen my card show "Power Limit 1"...

I'm trying to test between the 2 AMP Extreme BIOSes, because they have different power limits. Nothing so far...


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> *What temps are you getting on you VRAM (specifically the 3 chips between core and VRM)?*
> 
> 
> 
> Just like the core; the colder you get it the more it will overclock.
> 
> *62c is little high for watercooled GTX 1080, but it also depends a lot on how big your radiator is and the ambient temperature (and also if the water is also cooling memory/vrm)*


The EVGA SC is based on the FE PCB, so I can use a normal EK 1080 waterblock on it. I don't have a temp probe on the VRAM, but I do have one on the VRM. It reads ~65C under heavy load (>280W) with my liquid temp at 45C. Since the VRAM chips are also covered by the block and have good contact with it, my educated guess is VRAM temp ≈ liquid temp + 5C, which translates to about 50C, or 55C worst case.


----------



## Coopiklaani

Quote:


> Originally Posted by *Vellinious*
> 
> What's it score?


http://www.3dmark.com/spy/743311
http://www.3dmark.com/fs/10828458
http://www.3dmark.com/fs/10811363

Those are my scores.


----------



## nrpeyton

Quote:


> Originally Posted by *steeludder*
> 
> It's not THAT good of an overclock considering the temps and the voltage I have to feed it. Good chips would do 4.5GHz at 1.3V... I've got an ASUS X99-Deluxe mobo.
> All of the above, lol
> 
> 
> 
> 
> 
> 
> 
> 
> It's like a standalone fridge type of noise. So quite noisy when it switches on. Which it does for a couple of minutes every now and then to get the temp of the water back down. Then it completely stops until water temp is 2C above target. Would it disturb somebody in _another_ room? No.
> No it's perfect. With room temps around or under 25C you get really good results and can easily get water around or below 10C. As said earlier, for that sort of heat deployment (1x overclocked 1080, 1x overclocked 5960x, 1x MoBo VRM) it will cool water ~15C under ambient. That's quite stunning imo. Bear in mind the next model up is a lot bigger (and noisier). Also your main problem is not absolute cooling, it's dew point. The HC-500A will cool water below dew point if you're not careful, so no point in getting a bigger more powerful unit unless you want to add a few more things to your loop.
> I have the HC-500A (aka "Ultra Titan 1500"), dissipating 790W. Got it for £400 at Sterner Aquatech: http://www.sterner.co.uk/product/hailea-hc-500a-water-chiller/
> 
> The only thing you have to factor in is sufficient water flow rate. I'm running under the recommended requirements of the chiller. I've got two D5 pumps and they only barely manage to push 500 liters per hour. Now it's probably nothing to worry about since the chiller is only powering up every now and then so it won't overheat. But something to keep in mind. Aquarium chillers need high flow rate. Adding a third D5 wouldn't hurt if the chiller was to run harder for longer amounts of time, but also don't forget that pressure builds up in the loop as a consequence and the blocks and connections will only manage that much before they pop. It's a whole art in itself!


That is the most useful post I've read on this forum to date. Excellent reading, mate. Fantastic. Honestly, rep +1

I've actually emailed some of the retailers on these sites and they either can't be bothered to email back at all or simply say things like "we can't comment on this", "we can't guarantee that" or "that is subjective" lol; I wasn't even looking for guarantees anyway, just rough estimates from customer experience 

*Just reading your post has actually got me excited lol.* I can't wait until after Xmas when I can maybe scrape all the money together.

Looks like I'll be eating beans on toast for a few weeks haha.

Suppose I'll need to grab an extra pump now too lol.  Never realised that. So all in -- including extra pump, shipping and more tubing/fittings I am probably talking at least £550.00. 

Thanks again for making an effort with that post mate, very appreciated ;-)

Wish all posts were as informative lol 

Have a good day mate and best regards 

Nick Peyton


----------



## steeludder

Quote:


> Originally Posted by *nrpeyton*
> 
> That is the most useful post I've read on this forum to-date. Excellent reading mate. Fantastic. Honestly rep +1
> 
> I've actually emailed some of the retailers on these sites and they either can't be bothered emailing back at all or simply say things like "we can't comment on this or " we can't guarantee that" or "that is open to subjection" lol; I wasn't even looking for guatantees anyway, just rough estimates on customer experience
> 
> *Just reading your post has actually got me excited lol.* I can't wait until after Xmas when I can maybe scrape all the money together.
> 
> Looks like I'll be eating beans on toast for a few weeks haha.
> 
> Suppose I'll need to grab an extra pump now too lol.  Never realised that. So all in -- including extra pump, shipping and more tubing/fittings I am probably talking at least £550.00.
> 
> Thanks again for making an effort with that post mate, very appreciated ;-)
> 
> Wish all posts were as informative lol
> 
> Have a good day mate and best regards
> 
> Nick Peyton


LoL, my pleasure dude, you can read about my entire journey here if you want: https://community.futuremark.com/forum/showthread.php?185362-Project-quot-Noob-Chill-On-A-Bender-quot-Build-Log
Good luck!


----------



## Vellinious

Quote:


> Originally Posted by *Coopiklaani*
> 
> http://www.3dmark.com/spy/743311
> http://www.3dmark.com/fs/10828458
> http://www.3dmark.com/fs/10811363
> 
> Those are my sores.


May want to back off the core a little bit (or maybe lower the voltage at that clock), and see if it improves your scores. With 2200+, you really should be seeing 8.4k+ graphics scores in Timespy.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> That is the most useful post I've read on this forum to-date. Excellent reading mate. Fantastic. Honestly rep +1
> 
> I've actually emailed some of the retailers on these sites and they either can't be bothered emailing back at all or simply say things like "we can't comment on this or " we can't guarantee that" or "that is open to subjection" lol; I wasn't even looking for guatantees anyway, just rough estimates on customer experience
> 
> *Just reading your post has actually got me excited lol.* I can't wait until after Xmas when I can maybe scrape all the money together.
> 
> Looks like I'll be eating beans on toast for a few weeks haha.
> 
> Suppose I'll need to grab an extra pump now too lol.  Never realised that. So all in -- including extra pump, shipping and more tubing/fittings I am probably talking at least £550.00.
> 
> Thanks again for making an effort with that post mate, very appreciated ;-)
> 
> Wish all posts were as informative lol
> 
> Have a good day mate and best regards
> 
> Nick Peyton


I guess there is no point in any of us ever posting here again now that all our submissions have been reduced to chaff. lol


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> EVGA SC is based on the FE PCB. So I can use a normal EK 1080 WB on it. I don't have a T.probe on the vram, but I do have one on the vrm. It reads ~65C under heavy load (>280w) with my liquid temp being 45C. SInce vrams are also covered by the block, and have good contact with the block. My educational guess is vram temp ~= liquid temp +5, which translate to about 50C or 55C worst case.


Okay great, thanks for replying to that, mate. I've noticed my "overclockability" (if that's even a word lol) for memory drops *dramatically* as soon as memory temps begin exceeding 40-44C.

Looking at the example PCB picture above (in my original post):
-When heavily stressed (Furmark, 300W draw) the two memory chips far left (nearest the monitor display outputs) reach only 38C.
-The three chips in the middle, directly above the core, hit about 42-44C.
-But the 3 chips between VRM and core (right hand side) are getting to 65C.

*Reaching out to you all *

I have almost finished my official guide: "How to fit the EK 780 Ti Classified Block to a 1080 Classified". _Core never exceeds 44C (VRM about the same) under *Furmark*._

I'm only 5% away from EK officially agreeing to support the 1080 Classified with their cooling configurator.
EKWB's EK_Grega's emails can be found a few pages back on this thread.

I need to make sure there isn't going to be a potential overheat issue on those 3 chips before I finalise everything. _Do I need to revisit thermal pad thickness etc.?_

I'm trying to give something back to the community here 

It would be great (much appreciated), not just by me but of benefit to us *all*, if *you* _(yes, you reading this)_ ;-) could provide me some temps for those 3 memory chips. 

I risked my card. It took 3 months to find the confidence! (With an unofficial install.)

But success!







I followed the *stock instructions in the manual* about 85% of the way 

Guys, the 1080 Classified has "overclocking in its DNA"; it is officially sponsored by Kingpin, and Kingpin works very closely with EVGA and has some influence in the industry.

Having card models like the 1080 Classified helps us *all*; it reminds the industry how important overclocking is to us. It's owners of these models that run LN2, write BIOS editors and voltage & power mods, and provide us with a world of information on our cards.

There is even an "LN2" labelled bios switch on my 1080 Classified lol. (don't believe me? I have picture right here) 

So I'm reaching out to you guys: if any of you have a spare hour (and don't mind removing your backplate), could you give me your GDDR5X memory temps? The *three* chips in the picture above.  _5 posts back_

Remember, don't use "bare metal probes" (not unless you cover the area with a little thin electrical tape first) -- just take it slowly and enjoy, and nothing dangerous will occur 

Thank you. 

Nick Peyton


----------



## xxspookyxx

Quote:


> Originally Posted by *AllGamer*
> 
> How many radiators and what sizes do you have in that loop?
> 
> sounds like either your heat is not being vented fast enough, or your loop is too small to handle the OC of the Video card + CPU
> 
> Does the Temp still shoots up to 92C if you run the video card in stock speed?


Before, temps hit 92C in Assassin's Creed. After replacing the thermal paste, fitting the Kraken G10 + H75 watercooler and overclocking to 2126MHz at 1.075V, my temps are 52C in gaming, but at specific moments with a lot of dense, high-res textures they hit 62C in World of Warcraft with 110% power target in use.

My case runs cool; max CPU temp is 58C at standard voltage, overclocked to 4.7GHz.
I replaced the Kraken G10's fan with another with more CFM, 2100RPM instead of 1500RPM, to cool the VRMs better.

Inside the case I use an H105 on the CPU, push/pull at the top (intake).
I have 1 normal fan at the rear (exhaust) and 2 normal fans in front (intake),
and the H75 exhausts at the bottom.

My ambient temps are 27-29C



Quote:


> Originally Posted by *Vellinious*
> 
> His temps are fine. It looks like he's hitting the power limit.


Hmm, are these temps normal for a watercooler?

My ambient temps are 27-29C


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay great, thanks for replying to that mate, I've noticed my "overclockability" (if that's even a word lol) for memory drops *dramatically* as soon as memory temps begin exceeding 40c-44c.
> 
> Looking at the example PCB picture above (in my original post):
> -When heavily stressed (Furmark 300w draw) the two memory chips, far left (nearest monitor display outputs) only 38 degrees C.
> -Three chips in the middle (directly above) the core about 42-44c.
> - But the 3 chips between VRM and core (right hand side) are getting to 65 degrees C.
> 
> I have almost finished my official guide: "How to fit the EK 780 TI Classified Block to a 1080 Classified".
> 
> And I'm only 5% away from EK agree officially support the 1080 Classified with their cooling configurator.
> EKWB's EK_Grega's emails can be found a few pages back on this thread.
> 
> I need to make sure there isn't going to be a potential over-heat issue on those 3 chips before I finalise everything.
> 
> I'm trying to give something back to the community here.
> 
> Would be great (much appreciated) not just by me but probably all 1080 Classified customers (and even potential new ones) if more people could provide me some temps for those 3 chips. These guys still have no "official" option for waterblock. It took me 3 months to find the confidence to risk my card. But it worked out and the installation was 85% original instructions in manual.
> 
> Guys the 1080 Classified has "overclocking in its DNA"; it is officially sponsored by Kingpin. And Kingpin works very closely with EVGA.
> 
> Having card models like the 1080 Classified helps us *all*; it reminds the industry how important overclocking (and the tools) to do it are.
> To everyone here in this community. Whether you own a 1080 Classified or not we need models like this because it the owners of these models that perform LN2, write BIOS editors and voltage & power mods and provide us with a world of information on our cards.
> 
> There is even an "LN2" labelled bios switch on my 1080 Classified lol. (don't believe me? I have picture right here)
> 
> So I'm reaching out to you guys; if any of you have a spare hour (and don't mind removing your back-plate could you give me temps for the *three* GDDR5X memory chips in the picture above.
> 
> Remember don't use "bare metal probes". (Not unless you cover the area with a little thin electrical tape first -- at least).
> 
> Thanks you.
> 
> Nick Peyton


Indeed, the Classy has overclocking in its DNA. The LN2 switch turns on an internal heater that prevents water condensing on the PCB; some motherboards have this function as well. What EVGA should really do is release a barebones Classy card and give people the option to purchase an LN2 pot or a full-cover waterblock. Really, the Classy on air is not the way it should be.


----------



## Derek1

Have you flipped your LN2 switch?
Is that how EVGA got 1.6v to their card to clock it to 2800?


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Have you flipped your LN2 switch?
> Is that how EVGA got 1.6v to their card to clock it to 2800?


I would imagine that was done with an E-Power add-on board....


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Classy on Air is not the way is should be.


True, and at least with water we're only 5% away from official support.

For LN2, kingpincooling.com is a world of information  I'm overwhelmed by it. Starting-out overclockers like myself can master water first, do a few mods, grow confidence and learn 

Then later, LN2


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Have you flipped your LN2 switch?
> Is that how EVGA got 1.6v to their card to clock it to 2800?




You'll probably need to "right click - open in new tab" for full size to make it out; the switch is so small lol


----------



## Derek1

Ummm, no, that pic was huge, lol.
Same place as my master/slave switch.

ETA: Just looked in at the EVGA forum and Jacob finally gave an 'official' response to the thermal conductivity question on the pads: 1 W/mK


----------



## diablodbl

Time Spy = 7037
*Graphics score 7 923*
CPU test 4 308

Firestrike = 16681
*Graphics score 24 337*
Physics score 11 240
Physics test 6 336

CPU [email protected]
Msi MPower Z77
16Gb 2400Mhz C10
GTX 1080 Zotac Amp Extreme 2030~2062/5600

Is it normal?









Anyone know the meaning of "Voltage Limit 1"?


----------



## AllGamer

Quote:


> Originally Posted by *xxspookyxx*
> 
> Before temps hits 92ºc in Assassins creed, after replace thermal paste and put the kraken g10+waterccoler h75 and overclock to 2126mhz in 1.075v, my temps are 52ºc in Gaming, but in specific moment, with a lot density high texture, hit 62ºc in World of Warcraft with 110% power target in use.
> 
> My case its cool, max temp in CPU its 58ºc in standard voltage, overclocked to 4.7ghz.
> I replace the fan of kraken g10, by the another cooler with more cfm and 2100rpm instead of 1500rpm. to more cool vrms.
> 
> Inside case, im use in CPU one H105, with push pull in top (INTAKE)
> I have one 1 fan normal in rear (exhaust), and 2 fans normal in front (intake)
> and the H75 are exhaust in bottom.
> 
> My ambient temps are 27-29ºc
> 
> 
> Hmm the temps are normal for a watercooler?
> 
> My ambiente temps are 27-29ºc


Let me get this straight...

you are using one of these for the GPU


and one of these for the CPU


I'd suggest using the H105 on the GPU and the H75 on the CPU instead -- basically swap them -- or get another H105 if your case has enough space.

The GPU is generating way too much heat for the H75 to handle when OCed.

The H75 is passable at stock speed, but not while OCing.

If you are planning to OC, you'll need more radiator: at least a 240mm, which is basically the H105


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> ummm no that pic was huge, lol.
> Same place as my Master Slave.
> 
> ETA Just looked in at EVGA forum and Jacob finally gave an 'official' response to the thermal conductivity question of the pads. 1w/mK


Jesus Christ...can you link me to that?

I'm definitely waiting for my 6w/mK then...


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> Jesus Christ...can you link me to that?
> 
> I'm definitely waiting for my 6w/mK then...


Here ya go Junior.

http://forums.evga.com/The-EVGA-thermal-pad-conductivity-m2588456.aspx


----------



## Radox-0

Quote:


> Originally Posted by *diablodbl*
> 
> Time Spy = 7037
> *Graphics score 7 923*
> CPU test 4 308
> 
> Firestrike = 16681
> *Graphics score 24 337*
> Physics score 11 240
> Physics test 6 336
> 
> CPU [email protected]
> Msi MPower Z77
> 16Gb 2400Mhz C10
> GTX 1080 Zotac Amp Extreme 2030~2062/5600
> 
> It is normal?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know the meaning of "Voltage Limit 1"?


It just means that what's stopping you from going any higher is voltage, and in this case not temperature. Your numbers look right for those sorts of clocks.


----------



## diablodbl

Quote:


> Originally Posted by *Radox-0*
> 
> Just means that the limit to going any higher is voltage and in this case not temperature. Your numbers look right for those sorts of clocks.


In other words, is it right to say that voltage is limiting my overclock, or maybe even my boost clock/performance?


----------



## x-apoc

So my FTW card seems to hit a wall at +101MHz core and the mem at 5800MHz, and this is under 62C on air. Would getting a hybrid cooler be worth it at this point? Other than aesthetics and less fan noise, I think not.


----------



## Radox-0

Quote:


> Originally Posted by *diablodbl*
> 
> In other words, is right to say that voltage is limiting my overclock, or even maybe my graphic boost/performance?


Yes, right now you're limited by voltage. I imagine you should be able to squeeze a few more MHz out of it. Try upping the clocks and voltage to see if there is any additional gain. You may not see a massive jump before hitting a limit; I'm finding most cards on air top out around the 2100MHz region (+/- a few MHz consistent boost), and a bit more on water.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> ummm no that pic was huge, lol.
> Same place as my Master Slave.
> 
> ETA Just looked in at EVGA forum and Jacob finally gave an 'official' response to the thermal conductivity question of the pads. 1w/mK


haha yeah I replied on the thread lol, kept it polite though 

Quote:


> Originally Posted by *Derek1*
> 
> Have you flipped your LN2 switch?
> Is that how EVGA got 1.6v to their card to clock it to 2800?


Aye Vellinious got it spot on.

EVGA are also releasing a new EPOWER board with built-in EVBOT support soon too 

I've been trying to buy an "EVGA Probe It Adapter with Cables" -- it plugs straight into your graphics card with the other end going into your multimeter -- but it only seems to be available to U.S. customers.


----------



## nrpeyton

Quote:


> Originally Posted by *AllGamer*
> 
> Let me get this straight...
> 
> you are using one of this for the GPU
> 
> 
> and one of this for the CPU
> 
> 
> I'd suggest you to use the H105 in the GPU, and H75 on the CPU instead, basically swap it, or get another H105 if your case has enough space.
> 
> the GPU is generating way too much heat for the H75 to handle when OCed.
> 
> H75 is passable for stock speed, but not while OCing.
> 
> if you are planning to OC, you'll need more radiator, at least a 240mm or basically the H105


You really want to be pushing 480mm of radiator in total for both CPU & GPU (especially if both are highly overclocked).

I have an AMD FX-8350 and an EVGA 1080 Classified, and the water temp in my loop gradually rises as the radiator can't dissipate enough heat.


----------



## Vellinious

You're never going to have enough rad to dissipate all the heat... I've tried. You can keep the delta T really low, but you'll never get it back to ambient.
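That matches the simple lumped model of a radiator: the water settles at whatever temperature above ambient pushes the whole heat load through the rad, so the delta shrinks as you add radiator but never reaches zero. A rough sketch (the 450W load and 60 W/K radiator conductance are assumed illustrative numbers, not measurements):

```python
def water_over_ambient_c(heat_load_w, rad_w_per_k):
    """Steady-state water-minus-ambient delta for a radiator modelled as a
    lumped conductance: heat out = conductance * (water - ambient)."""
    return heat_load_w / rad_w_per_k

# ~450 W of CPU + GPU heat into radiators worth ~60 W/K in total:
print(round(water_over_ambient_c(450, 60), 1))  # 7.5 C over ambient

# Doubling the radiator halves the delta, but never eliminates it:
print(round(water_over_ambient_c(450, 120), 2))  # 3.75 C over ambient
```

Which is exactly why a chiller (active cooling) is the only way to get water at or below ambient.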


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Jesus Christ...can you link me to that?
> 
> I'm definitely waiting for my 6w/mK then...


Quote:


> Originally Posted by *Derek1*
> 
> Here ya go Junior.
> 
> http://forums.evga.com/The-EVGA-thermal-pad-conductivity-m2588456.aspx


It won't make a 'notable' difference to your temperatures.

I just *upgraded* my thermal paste *from Kryonaut at 12 W/mK* (a traditional, non-electrically-conductive paste)
*to:*
Conductonaut at *73 W/mK (liquid metal)*, and my temperatures are *NOT better.*

I was very disappointed. Still am :-(

TBH my expectations for the liquid metal must have been drastically unrealistic. I never learned that until last night.

Unless I did something wrong with the liquid metal, I can assure you this *"W/mK"* thing is *hugely overrated!*

Nick


----------



## diablodbl

"Voltage limit 1 simply means you have hit the max power draw for the card settings."

Techpowerup shows two different BIOSes for the extreme.

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=Zotac&model=&interface=&memType=&memSize=&did=10de-1b80--&since=

The BIOS you linked to shows the following power limit. https://www.techpowerup.com/vgabios/185008/zotac-gtx1080-8192-160616
Quote: 86.04.17.00.BA (my BIOS)
Board power limit
Target: 270.0 W
Limit: 326.0 W
Adj. Range: -50%, +21%

The other BIOS shows the following. https://www.techpowerup.com/vgabios/183937/zotac-gtx1080-8192-160603
Quote:86.04.17.00.04
Board power limit
Target: 320.0 W
Limit: 386.0 W
Adj. Range: -50%, +21%

Does it make sense to try swapping the BIOS?

As both are from the Zotac AMP, there is no risk, right?
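The two limits line up with the targets and the +21% upper slider range. As a quick sanity check (a sketch; there are small rounding differences against what the BIOS dumps report):

```python
def max_board_power_w(target_w, adj_upper_pct):
    """Board power ceiling once the power slider is pushed to its upper bound."""
    return target_w * (1 + adj_upper_pct / 100.0)

# 86.04.17.00.BA: 270 W target, +21% range:
print(round(max_board_power_w(270, 21)))  # 327 (BIOS reports 326 W)

# 86.04.17.00.04: 320 W target, +21% range:
print(round(max_board_power_w(320, 21)))  # 387 (BIOS reports 386 W)
```

So the second BIOS doesn't just raise the ceiling; it raises the default target by 50W as well.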


----------



## nrpeyton

Quote:


> Originally Posted by *diablodbl*
> 
> "Voltage limit 1 simply means you have hit the max power draw for the card settings."
> 
> Techpowerup shows two different BIOSes for the extreme.
> 
> https://www.techpowerup.com/vgabios/?architecture=&manufacturer=Zotac&model=&interface=&memType=&memSize=&did=10de-1b80--&since=
> 
> The BIOS you linked to shows the following power limit. https://www.techpowerup.com/vgabios/185008/zotac-gtx1080-8192-160616
> Quote: 86.04.17.00.BA (my bios)
> Board power limit
> Target: 270.0 W
> Limit: 326.0 W Adj.
> Range: -50%, +21%
> 
> The other BIOS shows the following. https://www.techpowerup.com/vgabios/183937/zotac-gtx1080-8192-160603
> Quote:86.04.17.00.04
> Board power limit
> Target: 320.0 W
> Limit: 386.0 W
> Adj. Range: -50%, +21%
> 
> Make sense try to swap Bios?
> 
> As both are from Zotac AMP, there is no risk right?


You can gauge which one you're already on with HWiNFO64, which tells you how much power your card is actually drawing in watts (not just % of TDP like most other apps).

Anyway, what game uses more than 326 watts of power? lol

Even the most stressful of benchmarks (Firestrike Ultra 4K) maxes out at about 285W (and it only hits 285 for a split second); the average is probably around 245W
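If all you have is the % reading, the conversion to watts is trivial once you know the board power target from a BIOS dump -- a sketch (the 270 W target is just the figure from the Zotac BIOS quoted earlier, used as an example):

```python
def power_draw_w(tdp_pct, board_power_target_w):
    """Convert a 'TDP %' reading into actual watts, given the BIOS board power target."""
    return tdp_pct * board_power_target_w / 100.0

# A card with a 270 W target showing 110% TDP in-game:
print(power_draw_w(110, 270))  # 297.0 W
```

Of course, that only works if you know which BIOS (and thus which target) the card is running, which is the point of reading the watts directly.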


----------



## diablodbl

Quote:


> Originally Posted by *nrpeyton*
> 
> You can guage which one you're already on with HWINFO64 which tells you how much power your card is actually drawing in watts (not just % of TDP like most other apps)
> 
> Anyway what game uses more than 326 watts of power? lol
> 
> Even the most stressful of benchmarks (Firestrike Ultra 4K) maxes out at about 285w (and it only hits 285 for a split second) average is probably around 245w


I'm trying to find this info in HWiNFO64...


----------



## nrpeyton

Quote:


> Originally Posted by *diablodbl*
> 
> Im trying to find this info on HWINFO64...


Right click the picture & select 'open in new tab' for *FULL SIZE*



Make sure you're using HWiNFO64 (not HWiNFO32), as the '32' version may not show GPU power.

Also, you'll need to run a Furmark stress test with the power slider in MSI AB turned up to max and your TDP % reading at 120%; then you'll know roughly the max power draw of your card. Mine is 320W (when I'm maxed out, HWiNFO64 will show it jumping around between 316 watts and 326 watts).

But I know from my BIOS file on TechPowerUp that its limit is 320W (nothing in electronics/power is going to be exact); occasionally it'll manage to draw a few watts more than it's allowed under maximum stress.

Personally, I love watching the power row in HWiNFO64 when I'm benching or gaming; it's the best indicator of how much strain your GPU is under. Easier than just seeing a %.

Because that % could be a % of anything; I mean, until you actually KNOW what your max power draw in watts IS, the % means f**k all lol

Then even if you did know, you're usually still too lazy to do the maths; that takes the fun out of it anyway, coz you never actually know what it's currently at in *real time*

*UPDATE: How to show actual GPU power in Watts *in-game**
Or any other system information you want










Quote:


> Remember the good old days with FRAPS?
> _"~this one's a lot more than just an FPS counter"_


----------



## Agoniizing

My game-stable run: MSI 1080 Gaming X, 2076MHz core clock / 11GHz memory


----------



## SirCanealot

So I flashed the T4 BIOS last night on my GE GTX 1080 that has an Accelero Xtreme IV fitted. Dear god, it gets toasty now!








It's now getting up to around 68 degrees under a heavy load, which is massive for an Accelero!

Before I flashed I hooked up my watt meter and the PC seemed to draw around 300 watts under load... After flashing, we're up to 340-360 :O
I hope my single 8 pin doesn't melt!!









My 3DMark score has barely gone up though - may have to flash back to the original BIOS and take a look at what that is doing. But playing Dragon Age: Inquisition (4K DSR, mix of high/ultra settings), it's happily strolling along at 2100-2150 and performance seems a little better









My voltage sometimes hits 1.1V, but usually hangs just under it. It was suggested the T4 BIOS goes up to 1.2V. I've increased the voltage slider in Afterburner but it does nothing. Just to double-check: is there nothing further that can be done to increase the voltage? Just wondering if I'm missing something silly.

PS: Thanks for all the help everyone, as I have had a few questions on here recently.


----------



## diablodbl

Quote:


> Originally Posted by *SirCanealot*
> 
> So I flashed the T4 BIOS last night on my GE GTX 1080 that has an Accelero Xtreme IV fitted. Dear god, it gets toasty now!
> 
> 
> 
> 
> 
> 
> 
> 
> It's now getting up to around 68 degrees under a heavy load, which is massive for an Accelero!
> 
> Before I flashed I hooked up my watt meter and the PC seemed to draw around 300 watts under load... After flashing, we're up to 340-360 :O
> I hope my single 8 pin doesn't melt!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 3D mark score has barely gone up though - may have to flash back to the original BIOS and take a look at what that is doing. But playing Dragon Age Inquisition (4K DSR, mix of high/ultra settings), it's happily strolling along at 2100-2150 and performance seems a little better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My voltage sometimes hits 1.1v, usually hangs just under it. It was suggested the T4 bios goes up to 1.2v. I've increased the voltage metre in Afterburner but it does nothing. Just to double check, is there nothing further that can be done to increase the voltage more? Just wondering if I'm missing something silly.
> 
> PS: Thanks for all the help everyone, as I have had a few questions on here recently.


Are you using the Accelero backplate?

Have you measured any other temperatures besides the GPU?


----------



## SirCanealot

Quote:


> Originally Posted by *diablodbl*
> 
> You are using the accelero backplate?
> 
> Have you measure any other temperature besides GPU?


If you mean the back heatsink, yes, I am using it. I don't have any other temperature readings apart from the GPU, since NVIDIA doesn't give us those









If you're wondering about VRM temps: given the fans are always at 100% if I'm gaming, and I have a Raven 3 case with an AP182 pushing quite a lot of air at the card, I'm not too worried


----------



## steeludder

Quote:


> Originally Posted by *SirCanealot*
> 
> So I flashed the T4 BIOS last night on my GE GTX 1080 that has an Accelero Xtreme IV fitted. Dear god, it gets toasty now!
> 
> 
> 
> 
> 
> 
> 
> 
> It's now getting up to around 68 degrees under a heavy load, which is massive for an Accelero!
> 
> Before I flashed I hooked up my watt meter and the PC seemed to draw around 300 watts under load... After flashing, we're up to 340-360 :O
> I hope my single 8 pin doesn't melt!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 3D mark score has barely gone up though - may have to flash back to the original BIOS and take a look at what that is doing. But playing Dragon Age Inquisition (4K DSR, mix of high/ultra settings), it's happily strolling along at 2100-2150 and performance seems a little better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My voltage sometimes hits 1.1v, usually hangs just under it. It was suggested the T4 bios goes up to 1.2v. I've increased the voltage metre in Afterburner but it does nothing. Just to double check, is there nothing further that can be done to increase the voltage more? Just wondering if I'm missing something silly.
> 
> PS: Thanks for all the help everyone, as I have had a few questions on here recently.


Have you tried setting higher voltage points using the curve?


----------



## SirCanealot

Quote:


> Originally Posted by *steeludder*
> 
> Have you try setting higher voltage points using the curve?


Tried forcing it past 1.1V via the curvy thingy in Afterburner, but no go, this does not work. Does this work for everyone else? :O


----------



## Derek1

Quote:


> Originally Posted by *SirCanealot*
> 
> Tried forcing it past 1.1v via the curvy thingy in Afterburner, but no go this does not work. Does this work for everyone else? :O


You locked it in using Shift L?

Go back a few pages and look at Occam's curve he posted to see what I mean.

Or this one...

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/8190


----------



## nrpeyton

*New *Official* EK waterblock support for 1080 CLASSIFIED*

Great news!

Took a risk a few weeks ago and purchased a non-compatible full-cover block from EK and got it to fit my EVGA 1080 Classified. It turned out to be a *great fit* with only minor adjustments from the manual.

Worked with EK_Grega very closely the entire way, exchanging comprehensive information on:
-measurements,
-any "slight" mods required
-detailed information on my temperature results with core, memory & VRM

My journey has been documented in detail (the whole way over a week or so) at the 1080 Owners Club on overclock.net.
http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club

My goal throughout this journey has always been to give something back to the community by helping EKWB realise official support for the Classified 1080 with their 780TI Classy block.

Just received an email from EK_Grega a minute ago. (screenshots at bottom)

Here's an updated link to configurator and screenshot below:
https://www.ekwb.com/configurator/step1_complist?gpu_gpus=2005



*Updated Configurator*



If anyone needs any help / advice fitting this block to their Classified you are more than welcome to PM me and I will supply any help you require.

I intend to bring all my posts together, making it into one, single comprehensive guide/FAQ soon.
So watch this space 

Have a great day,

Nick Peyton

P.S.

*Screenshot of Emails:*


----------



## Spykerv

Quote:


> Originally Posted by *x-apoc*
> 
> So my FTW card seems to hit a wall at +101mhz core and +5800mhz MEM, and this is under 62c on air. Would getting a hybrid cooler be worth it at this point? Other than aesthetics and less fan noise I think not.


I can't say for certain. I have the FTW Hybrid. Currently the base clock is at 1821, about a 6% increase according to a review site. I haven't OCCT'd higher than that though, so no clue.

I did it for the aesthetics and temps closer to ambient. But YMMV; price-wise it honestly isn't worth it. Even with the extra voltage regulation, on air I might as well have gone for the $550 cards from Asus instead of dropping the extra for this.


----------



## x-apoc

Quote:


> Originally Posted by *Spykerv*
> 
> I can't say for certain. I have the FTW Hybrid. Currently the base clock is at 1821, about a 6% increase according to a review site. I haven't OCCT'd higher than that though, so no clue.
> 
> I did it for the aesthetics and temps closer to ambient. But YMMV; price-wise it honestly isn't worth it. Even with the extra voltage regulation, on air I might as well have gone for the $550 cards from Asus instead of dropping the extra for this.


Yeah, might as well take the 120 bucks and put it toward another 1080.


----------



## apw63

Hello,

I've got the Strix 1080 base card. I've been thinking about flashing the BIOS to the OC version. What advantages could I potentially see by doing this? I understand that the base clocks will be higher. Will I be able to apply more voltage to the card? The card is under water with an EK Strix block. My offsets now are +240/+740.



When I use EVGA Precision, the monitor shows that the card is hitting the power limit; it bounces between 0 and 1. I have never seen the card go over 44C.


----------



## SirCanealot

Quote:


> Originally Posted by *Derek1*
> 
> You locked it in using Shift L?
> 
> Go back a few pages and look at Occam's curve he posted to see what I mean.


Cheers for that! I have tried locking above 1.1v, but it didn't seem to do anything on my card. I'll have to double check when I'm back home (away at GF's for the weekend with only a GTX 970 there to keep me amused :'().

By the way, I'm just pressing the L key to lock (which is what the readme seems to suggest: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html). Is there any difference using Shift+L?

Thanks!


----------



## Derek1

Quote:


> Originally Posted by *SirCanealot*
> 
> Cheers for that! I have tried locking above 1.1v, but it didn't seem to do anything on my card. I'll have to double check when I'm back home (away at GF's for the weekend with only a GTX 970 there to keep me amused :'().
> 
> By the way I'm just pressing the L key to lock (which is what the readme seems to suggest: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html). Is there any difference using shift+L?
> 
> Thanks!


Not too sure, lol.

When I see 'L' in instructions I assume they mean upper-case L as opposed to just hitting lower-case l.

But I haven't tried just hitting lower-case, so who the hell knows, lol.

When I do it my way it locks the voltage at a specific point and clock speed. Like K-Boost in Precision, the card will run at that all the time.

I am still in the process of tinkering with both the T4 and AB, as I am new to both and learning along the way. But I did run into the same problem as you during a few of my trials: not being able to get the card to go past 1.1v.


----------



## charro0412

http://www.3dmark.com/3dm/16265597 old

http://www.3dmark.com/3dm/16425301 new

time for new cpu to old


----------



## juniordnz

Quote:


> Originally Posted by *charro0412*
> 
> http://www.3dmark.com/3dm/16265597 old
> 
> http://www.3dmark.com/3dm/16425301 new
> 
> time for new cpu to old


It's nice what a good +500mhz on CPU can do, huh?


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> Hello,
> 
> I've got the Strix 1080 base card. I've been thinking about flashing the BIOS to the OC version. What advantages could I potentially see by doing this? I understand that the base clocks will be higher. Will I be able to apply more voltage to the card? The card is under water with an EK Strix block. My offsets now are +240/+740.
> 
> When I use EVGA Precision, the monitor shows that the card is hitting the power limit; it bounces between 0 and 1. I have never seen the card go over 44C.


Between the "base" STRIX BIOS and the highest "OC" version BIOS there is only a 21-watt difference (in maximum power draw).

That's the only "real" difference between the cards, as you can always O/C the frequency up to the OC version yourself using overclocking apps such as 'Asus GPU Tweak' or MSI Afterburner (a much better app).

21 watts isn't a lot. If you're getting a lot of power throttling now, upgrading to that BIOS won't help an awful lot (it will a little, but only by 21 watts).

You can measure power draw in watts using HWINFO64 app.

If you want to stop power throttling you could flash the T4/STRIX BIOS (your card is perfect for it).

It has no power limit (completely removed) and no temp limit and even has increased voltage limits too.

Just be careful though especially if you're on air; because using that BIOS you'll get a lot more performance but also *a lot more heat*.

Nick

Quote:


> Originally Posted by *SirCanealot*
> 
> Cheers for that! I have tried locking above 1.1v, but it didn't seem to do anything on my card. I'll have to double check when I'm back home (away at GF's for the weekend with only a GTX 970 there to keep me amused :'().
> 
> By the way I'm just pressing the L key to lock (which is what the readme seems to suggest: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html). Is there any difference using shift+L?
> 
> Thanks!


Quote:


> Originally Posted by *Derek1*
> 
> Not too sure, lol.
> 
> When I see 'L' in instructions I assume they mean upper-case L as opposed to just hitting lower-case l.
> 
> But I haven't tried just hitting lower-case, so who the hell knows, lol.
> 
> When I do it my way it locks the voltage at a specific point and clock speed. Like K-Boost in Precision, the card will run at that all the time.
> 
> I am still in the process of tinkering with both the T4 and AB, as I am new to both and learning along the way. But I did run into the same problem as you during a few of my trials: not being able to get the card to go past 1.1v.


Even when you're using "L" to lock the voltage in AB, you still need to set the voltage slider to 100% *and* apply a decent offset value (+frequency) in the curve window to force it to boost to those voltages, up to 1.2v.
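For anyone confused about why locking a point alone does nothing, here's a toy Python model of the behaviour described above. Every number on the curve is made up for illustration, not read from real hardware:

```python
# Toy model of the Afterburner voltage/frequency curve: locking a point
# ("L") makes the card run that point's clock, but the point only becomes
# worth running once a core offset has raised the whole curve.
# ILLUSTRATIVE numbers only -- a Pascal-ish shape, not a real card.

# voltage (V) -> boost clock (MHz)
curve = {1.000: 1911, 1.050: 1974, 1.093: 2025, 1.150: 2050, 1.200: 2076}

def apply_offset(curve: dict, offset_mhz: int) -> dict:
    """Shift every point on the curve up by the core offset."""
    return {v: mhz + offset_mhz for v, mhz in curve.items()}

def locked_clock(curve: dict, lock_voltage: float) -> int:
    """With a point locked, the card runs that point's clock constantly."""
    return curve[lock_voltage]

offset_curve = apply_offset(curve, 100)     # +100 MHz core offset
print(locked_clock(offset_curve, 1.200))    # -> 2176
```

With no offset the locked 1.2v point is barely above the stock boost, which matches the advice that the lock only pays off together with the slider at 100% and a real +frequency offset.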


----------



## juniordnz

Guys, what's the T4 vBIOS stock voltage? I thought it would be the stock 1.093V, but without even touching the voltage offset slider I've seen it go up to 1.112V.


----------



## apw63

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> Between the "base" STRIX BIOS and the highest "OC" version BIOS there is only a 21-watt difference (in maximum power draw).
> 
> That's the only "real" difference between the cards, as you can always O/C the frequency up to the OC version yourself using overclocking apps such as 'Asus GPU Tweak' or MSI Afterburner (a much better app).
> 
> 21 watts isn't a lot. If you're getting a lot of power throttling now, upgrading to that BIOS won't help an awful lot (it will a little, but only by 21 watts).
> 
> You can measure power draw in watts using HWINFO64 app.
> 
> If you want to stop power throttling you could flash the T4/STRIX BIOS (your card is perfect for it).
> 
> It has no power limit (completely removed) and no temp limit and even has increased voltage limits too.
> 
> Just be careful though especially if you're on air; because using that BIOS you'll get a lot more performance but also *a lot more heat*.
> 
> Nick
> 
> Even when you're using "L" to lock the voltage in AB, you still need to set the voltage slider to 100% *and* apply a decent offset value (+frequency) in the curve window to force it to boost to those voltages, up to 1.2v.






Thank you for the info. My card is under water now. I'm not familiar with the T4/STRIX BIOS you are referring to. Where can I find this BIOS and information about it?


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> 
> Thank you for the info. My card is under water now. I'm not familiar with the T4/STRIX BIOS you are referring to. Where can I find this BIOS and information about it?


 strix1080xoc_t4version2withhigherfirestrikescore.zip 148k .zip file


-Removed Power Limit
-Voltage up to 1.2v (over stock 1.093v)
*-No Temperature limit (^warning^)*
-May cause a 'reduced fan speed' on FE cards. FE cards should only use this when under water.
-Won't work with monitors plugged into the 3rd display port

*Be careful* (monitor your temps when using this BIOS)

-Remember: ALWAYS *disable the display driver in "device manager" BEFORE flashing*.
-*Always back up your original BIOS* first using GPU-Z (and keep a copy of the original in a safe place)


----------



## apw63

Thank you nrpeyton. I will check it out and see if I want to flash. I can always flash back; I already backed up the original BIOS.


----------



## swingarm

Just got my 1080 the other day, but it will be a little while before I use it; the computer it goes in currently has its CPU in the process of an RMA. I do have a mini-ITX computer, but the 1080 is about 0.5" too long to fit. BTW I'm not a gamer/overclocker, I just like good-quality products.


----------



## nexxusty

Quote:


> Originally Posted by *nrpeyton*
> 
> Who are you referring to?
> 
> I've seen you post that exact same post before, a while ago?
> 
> I skimmed over that link a bit (need to get some sleep), but the bits I read made it make perfect sense now.
> 
> There's also a part where they said that GDDR5X doesn't generate a lot of heat; so that really makes me wonder now if I'm actually getting proper contact on those 3 VRAM chips I keep going on about at all....
> 
> Going to have to do some really good in-depth trial and error testing to try and work out what's causing it. 65c on a VRAM chip can't be right (especially when only measured from the back of the PCB).
> 
> Anyway, very useful mate, cheers, rep +1 for that link lol
> 
> Good night for now


Nobody in particular.
Quote:


> Originally Posted by *nrpeyton*
> 
> Okay great, thanks for replying to that mate, I've noticed my "overclockability" (if that's even a word lol) for memory drops *dramatically* as soon as memory temps begin exceeding 40c-44c.
> 
> Looking at the example PCB picture above (in my original post):
> -When heavily stressed (Furmark, 300w draw) the two memory chips far left (nearest the monitor display outputs) are only 38 degrees C.
> -The three chips in the middle (directly above the core) are about 42-44c.
> -But the 3 chips between the VRM and core (right-hand side) are getting to 65 degrees C.
> 
> *Reaching out to you all *
> 
> I have almost finished my official guide: "How to fit the EK 780 TI Classified Block to a 1080 Classified". _Core never exceeds 44c (VRM about the same) *Furmark*._
> 
> I'm only 5% away from EK agreeing officially to support the 1080 Classified with their cooling configurator.
> EKWB's EK_Grega's emails can be found a few pages back on this thread.
> 
> I need to make sure there isn't going to be a potential over-heat issue on those 3 chips before I finalise everything. _Do I need to revisit thermal pad thickness etc.?_
> 
> Giving something back to the community; I am
> 
> Would be great (much appreciated), not just by me; it would benefit us *all* if *you* _(yes, you reading this)_ ;-) could provide me some temps for those 3 memory chips.
> 
> Risked my card. 3 months to find confidence it took! (With unofficial install).
> 
> But success!
> 
> 85% followed *stock instructions in manual* I did
> 
> Guys the 1080 Classified has "overclocking in its DNA"; it is officially sponsored by Kingpin. And Kingpin works very closely with EVGA and has some influence in the industry.
> 
> Having card models like the 1080 Classified helps us *all*; it reminds the industry how important overclocking is to us
> .
> It's owners of these models that perform LN2, write BIOS editors and voltage & power mods and provide us with a world of information on our cards.
> 
> There is even an "LN2" labelled bios switch on my 1080 Classified lol. (don't believe me? I have picture right here)
> 
> So I'm reaching out to you guys; if any of you have a spare hour (and don't mind removing your back-plate), could you give me your GDDR5X memory temps? The *three* chips in the picture above.  _5 posts back_
> 
> Remember, don't use "bare metal probes" (not unless you cover the area with a little thin electrical tape first) -- just take it slowly and nothing dangerous will occur
> 
> Thank you.
> 
> Nick Peyton


Wandering bits.


----------



## SmackHisFace

So I clicked save BIOS in GPU-Z and now GPU-Z reads all my clocks as 0mhz and I can't alter any settings in Afterburner. Any idea what's going on or what I should do? I have the stock BIOS and have not flashed it. Zotac AMP! 1080. Reinstalled the driver and it's fixed now, but it didn't save my BIOS.


----------



## OccamRazor

Quote:


> Originally Posted by *SmackHisFace*
> 
> So I clicked save BIOS in GPU-Z and now GPU-Z reads all my clocks as 0mhz and I can't alter any settings in Afterburner. Any idea what's going on or what I should do? I have the stock BIOS and have not flashed it. Zotac AMP! 1080. Reinstalled the driver and it's fixed now, but it didn't save my BIOS.


It should have saved it someplace on the desktop; if it didn't, you can still save it with nvflash:

Open an elevated cmd window (hit start button, type "cmd", right click the cmd icon, run as administrator)
change directory to C:\ (type " cd C:\ " )
type " nvflash --protectoff "* to remove write protection on the bios eeprom (no need to do it when flashing again)
*BACKUP ORIGINAL BIOS: Type " nvflash -b backup.rom " and a copy of your bios will be saved to the nvflash directory*

*Do this if you haven't yet; you only need to do it once in your card's lifetime.

Full flash instructions below:


Spoiler: Warning: Spoiler!



extract the nvflash files to C:\
save the BIOS file to C:\
open elevated cmd window (hit start button, type "cmd", right click cmd icon, run as administrator)
change directory to C:\ (type " cd C:\ " )
type " nvflash --protectoff " then hit enter (disables write protection on the bios EEPROM; it's one-time only, no need to do it again every time a bios is flashed)
BACKUP ORIGINAL BIOS: Type " nvflash -b backup.rom " and a copy of your bios will be saved to the nvflash directory
type " nvflash -6 (bios name).rom "
wait for it to finish
reboot
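The steps in the spoiler above can be sketched as a small helper that builds the command lines in order. It only prints the commands (running them blind from a script would be a bad idea), and the file names are placeholders:

```python
# Builds the nvflash command lines in the order the guide above gives them.
# It does NOT execute anything; the .rom names are example placeholders.

def flash_sequence(new_bios: str, backup: str = "backup.rom") -> list:
    """Return the nvflash invocations, in order."""
    return [
        "nvflash --protectoff",      # one-time: unlock the BIOS EEPROM
        f"nvflash -b {backup}",      # ALWAYS back up the original first
        f"nvflash -6 {new_bios}",    # -6 overrides the PCI subsystem ID check
    ]

for cmd in flash_sequence("strix1080xoc_t4.rom"):
    print(cmd)
```

The ordering is the important part: write protection off once, backup before anything else, then the flash itself.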



Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *nexxusty*
> 
> Nobody in particular.
> Wandering bits.


?


----------



## nrpeyton

*replaced 0.5mm thermal pads with 0.5 mm copper shims (insulated around them first obviously)*



*Knocked 15c off the memory chips with this mod. 

Find out tomorrow if it actually gets me any extra O/C headroom lol*

*Somehow VRM temps went up though (by about 20c), which I don't understand, because temps on the core and everything else stayed the same and the shims 'are' the same thickness as the pads.

*scratching my head**

*Edit: I did re-use the old pads on the VRM (can that affect temps)?*
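A back-of-the-envelope comparison of a 0.5mm pad vs a 0.5mm copper shim, using the standard conduction formula R = t / (k * A). The conductivities are typical catalogue values (pad ~3 W/mK, copper ~400 W/mK) and the 14x12mm chip footprint is a guess, so treat the numbers as rough:

```python
# Why a 0.5 mm copper shim can beat a 0.5 mm thermal pad of the same
# thickness: conduction resistance R = t / (k * A).
# ASSUMED values: pad k ~3 W/mK, copper k ~400 W/mK, 14x12 mm chip area.

PAD_K = 3.0        # W/(m*K), typical mid-range thermal pad
COPPER_K = 400.0   # W/(m*K), copper

def resistance(thickness_mm: float, k: float, area_mm2: float) -> float:
    """Conduction resistance in K/W for a slab of the given material."""
    t = thickness_mm / 1000           # mm -> m
    a = area_mm2 / 1e6                # mm^2 -> m^2
    return t / (k * a)

area = 14 * 12                        # GDDR5X package footprint, roughly
r_pad = resistance(0.5, PAD_K, area)
r_shim = resistance(0.5, COPPER_K, area)
print(f"pad: {r_pad:.2f} K/W  shim: {r_shim:.4f} K/W")
```

At roughly 1 K/W, a pad carrying a few watts per chip accounts for a few degrees C on its own, while the shim's contribution is negligible (the paste interfaces around it then dominate), which is consistent with the 15c drop reported above.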


----------



## nexxusty

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> *replaced 0.5mm thermal pads with 0.5 mm copper shims (insulated around them first obviously)*
> 
> 
> 
> *Knocked 15c off the memory chips with this mod.
> 
> Find out tomorrow if it actually gets me any extra O/C headroom lol*
> 
> *Somehow VRM temps went up though (by about 20c), which I don't understand, because temps on the core and everything else stayed the same and the shims 'are' the same thickness as the pads.
> 
> *scratching my head**
> 
> *Edit: I did re-use the old pads on the VRM (can that affect temps)?*


Copper is stiff enough that it raises the backplate off the VRMs.

That would be my guess.


----------



## max883

I used Thermal Grizzly Kryonaut on my EVGA GTX 1080 ACX 3.0 SC with the thermal pad mod, and I removed the backplate. Now my fan speed never goes above 50%. Fan on a custom curve in MSI Afterburner, 120% power and +100 GPU.

Before, I had an MSI 980 Ti Gaming. Temps were 75c and fan speed 66% with the PC standing!

Now with the EVGA 1080 ACX SC, temps are 59c and fan speed 50% with the PC standing! Tested with Call of Duty: Infinite Warfare, 4K, all settings maxed out!

Temps are 71c with the PC lying down. Tested with Call of Duty: Infinite Warfare, 4K, all settings maxed out!


----------



## nexxusty

Quote:


> Originally Posted by *max883*
> 
> I used Thermal Grizzly Kryonaut on my EVGA GTX 1080 ACX 3.0 SC with the thermal pad mod, and I removed the backplate. Now my fan speed never goes above 50%. Fan on a custom curve in MSI Afterburner, 120% power and +100 GPU.
> 
> Before, I had an MSI 980 Ti Gaming. Temps were 75c and fan speed 66% with the PC standing!
> 
> Now with the EVGA 1080 ACX SC, temps are 59c and fan speed 50% with the PC standing! Tested with Call of Duty: Infinite Warfare, 4K, all settings maxed out!
> 
> Temps are 71c with the PC lying down. Tested with Call of Duty: Infinite Warfare, 4K, all settings maxed out!


Yoy!!!


----------



## Dragonsyph

FS ultra



My card has a max core of 2164 @ 1.07V. If it hits 2177 it crashes, so I'm guessing even more voltage won't help, since 2177 crashes at 1.09V but 2164 @ 1.07V doesn't.

What's everyone's else's graphics scores?

Im at

6188 FS Ultra

26,501 FS Regular

8679 Timespy

+140 core +1000 Memory @ 2164/12,000

Temps 32c-40c


----------



## nrpeyton

Quote:


> Originally Posted by *nexxusty*
> 
> Copper is thick enough that it raises the backplate off the VRM's.
> 
> That would be my guess.


Hmm yeah, I thought that could be the issue; but then I only added the copper shims to the 3 memory chips in the middle of the board (the hottest ones, with the core on one side and the VRM on the other).

For example, the GDDR5X chips on the far left of the card (nearest the display port connectors) are the coolest.

This mod has just brought the temperature of those 3 memory chips *more in line* with the others (temps of the *others* haven't changed, and the core hasn't changed).

They were running about 15c-20c hotter than the rest of the card (53 for normal gaming & 62 in Furmark).

Maybe you're right though; the copper shims aren't thicker per se, but they are "harder" and won't "compress down" as much. Maybe this caused a 0.1mm difference (enough to affect the VRM).

I wish it was possible to buy pads in increments of 0.1mm lol (for example 0.6 instead of 0.5, and 1.1 instead of 1.0). Would have made my job much easier.

During testing last night I remember trying to add more cooling to some extra components: adding 1.0mm pads would stop the memory making contact, but with 0.5mm pads the memory would make contact and the extra component wouldn't. I couldn't win lol.

Anyway, I will take it apart again today and see if I can find a solution. I really want to get *all* components on the card running at 45c or cooler. 

Anyway good afternoon


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> FS ultra
> 
> 
> 
> 
> My card has a max core of 2164 @ 1.07V. If it hits 2177 it crashes, so I'm guessing even more voltage won't help, since 2177 crashes at 1.09V but 2164 @ 1.07V doesn't.
> 
> What's everyone's else's graphics scores?
> 
> Im at
> 
> 6188 FS Ultra
> 26,501 FS Regular
> 8679 Timespy
> 
> +140 core +1000 Memory @ 2164/12,000
> Temps 32c-40c


Have you looked at the rankings in the respective Benchmarking threads?
All scores can be submitted in each and ranked.


----------



## new boy

I should have a 1080 FE coming my way soon, and I'll get it under water a bit after that

I went EK last time, but this time I fancy a change.

I'm thinking of going Aquacomputer kryographics block with the passive backplate.

Good choice? Or is there a better option?


----------



## Vellinious

Quote:


> Originally Posted by *new boy*
> 
> I should have a 1080 FE coming my way soon, and I'll get it under water a bit after that
> 
> I went EK last time, but this time I fancy a change.
> 
> I'm thinking of going Aquacomputer kryographics block with the passive backplate.
> 
> Good choice? Or is there a better option?


The kryographics blocks are outstanding. If they're available for the GPU I choose, they're absolutely my first choice.


----------



## AveragePC

Never took a GPU apart before; it wasn't too bad. Everything appears to be running smooth. Did some temp testing running VR games and it looks like the thermal paste is doing its job. I was a little worried about the application.


----------



## nrpeyton

Quote:


> Originally Posted by *AveragePC*
> 
> Never took a GPU apart before; it wasn't too bad. Everything appears to be running smooth. Did some temp testing running VR games and it looks like the thermal paste is doing its job. I was a little worried about the application.


On my Classified I noticed the 'base plate' on the front of the card (underneath the ACX 3.0 cooler) is made of hard metal (it kind of chimes when you tap it).

But the backplate is plastic and flimsy (it bends).

Are the FTWs the same? Because I struggle to understand how adding a big thermal pad on the back is going to help cool the VRM (when the backplate is plastic).

I'm only asking because I want to understand *EVGA's* 'thinking' on this. Not because I disagree with anything about your post. I see you followed the thermal pad mod instructions perfectly 

On the set of pads I got with my Classified I never got the big pad for the backplate. So I just wondered if the FTW one was metal?

Ta,
Nick


----------



## AveragePC

Quote:


> Originally Posted by *nrpeyton*
> 
> On my Classified I noticed the 'base plate' on the front of the card (underneath the ACX 3.0 cooler) is made of hard metal (it kind of chimes when you tap it).
> 
> But the backplate is plastic and flimsy (it bends).
> 
> Are the FTWs the same? Because I struggle to understand how adding a big thermal pad on the back is going to help cool the VRM (when the backplate is plastic).
> 
> I'm only asking because I want to understand *EVGA's* 'thinking' on this. Not because I disagree with anything about your post. I see you followed the thermal pad mod instructions perfectly
> 
> On the set of pads I got with my Classified I never got the big pad for the backplate. So I just wondered if the FTW one was metal?
> 
> Ta,
> Nick


My card is the 1080 Super Clocked. The backplate appeared to be metal, but I could be mistaken. The card is back in the HTPC lol.


----------



## apw63

I have tried to flash the BIOS of my Strix 1080 base version. I'm trying to flash it to strix1080xoc_t4.rom. I downloaded the latest version of nvflash. I disabled the card in device manager. I keep getting this error:

BCTR Error: Certificate 2.0 verification failed. I did start the CMD prompt as administrator. What am I doing wrong?

I'm using this line to flash: nvflash -6 strix1080xoc_t4.rom

I tried flashing back to the original backup bios. I get the same error.

Any help would be greatly appreciated.


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> I have tried to flash the BIOS of my Strix 1080 base version. I'm trying to flash it to strix1080xoc_t4.rom. I downloaded the latest version of nvflash. I disabled the card in device manager. I keep getting this error:
> 
> BCTR Error: Certificate 2.0 verification failed. I did start the CMD prompt as administrator. What am I doing wrong?
> 
> I'm using this line to flash: nvflash -6 strix1080xoc_t4.rom
> 
> I tried flashing back to the original backup bios. I get the same error.
> 
> Any help would be greatly appreciated.


what card do you have?


----------



## apw63

I have the asus strix 1080gtx-8g-gaming. Just the basic version.


----------



## nrpeyton

If you have an ASUS card then you want to type this command:

*nvflash strix1080xoc_t4.rom* <-- simple as that

if you have a non ASUS card you want to type this:

*nvflash --overridesub strix1080xoc_t4.rom*
(say yes when it asks if you're sure)

Make sure the .rom file and nvflash.exe are in the same folder. Go to File (from within that window) and select 'open command prompt'; the command prompt will open in the correct directory, saving you having to type it all in, so you can just go right ahead and enter the command(s) above.

Also, do *not* use the certificates-bypassed version. That won't work.

The certificates-bypassed version is for BIOSes edited using a BIOS editor, which don't have the official encrypted "stamp" of a major card manufacturer.

The T4 BIOS was given especially to an LN2 guy (sponsored by ASUS), and the LN2 guy was kind enough to release it to the public. Now that the file is out, there's nothing ASUS can do about it. But it's still an *official* signed BIOS, so you don't need to use the 'certificates bypassed' version -- which it sounds like you are, from the error message you are getting.

The certificates-bypassed version won't work with *official* signed BIOSes. That version is no use until a BIOS editor for Pascal is released.

You might be wondering why there is a certificates bypassed version if there is no BIOS editor.

Simple answer: the guy who does the modified (hacked) nvflash version isn't the same guy who releases the BIOS editors. He obviously wrote the updated nvflash ready for a BIOS editor coming out (but one never came, and probably never will).

Nvidia has never been one of those companies that likes anyone doing anything differently to their original specifications. (If you O/C your card, they worry you won't need a new one every 2 years.)

Best regards,

Nick Peyton

*Edit:* You might need to use the command *nvflash --protectoff* to disable "write protection" before you flash.


----------



## apw63

Nick,

I'm use the bios you linked to me in post 8450.

Thank you for the help. I'll let you know how it goes.


----------



## nexxusty

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm yeah I thought that could be the issue; but then I only added the copper shims to the 3 memory chips in the middle of the board (the hottest ones with, core on one side, and VRM on the other side).
> 
> For example the GDDR5X chips on the far left of the card (nearest display port connectors) are the coolest.
> 
> This mod has just brought the temperature of those 3 memory chips *more in line* with the others. (temps of the *others* haven't changed.) and core hasn't changed.
> 
> They were running about 15c-20c hotter than the rest of the card. (53 for normal gaming & 62 in Furmark).
> 
> Maybe your right though, the copper shims are aren't thicker "per say" but they are "harder" and won't "compress down" as much maybe this caused a 0.1mm difference (enough to affect VRM).
> 
> I wish it was possible to buy pads in incriments of 0.1mm lol (for example 0.6 (instead of 0.5) and 1.1 (instead of 1.0) lol. Would of made my job much easier.
> 
> During testing last night I remember trying to add more cooling to some extra components (and adding 1.0mm pads would stop the memory making contact), but then adding 0.5mm the memory would make contact but then the extra component wouldn't. I couldn't win lol.
> 
> Anyway I will take apart again today and see if I can find a solution. I really want to get *all* components on card all running 45c or cooler.
> 
> Anyway good afternoon


That would be a much better way of saying it.

The copper definitely would retain its shape while a thermal pad will compress to less than 0.1mm if tightened enough.

There should be 0.1mm copper shims.... I don't know why there isn't. They'd be useful. Clearly.


----------



## Azazil1190

Silly question: is the T4 bios the same as the XOC bios, or are they different bioses?


----------



## apw63

Nick, still no joy; now I get this error


----------



## nexxusty

Quote:


> Originally Posted by *apw63*
> 
> Nick, still no joy; now I get this error


nvflash -6 strix1080xoc_t4.rom

"-6" overrides PCI Subsystem ID.

Type "nvflash" by itself and press enter for it to show you a list of possible commands/flags you can set.


----------



## nrpeyton

Quote:


> Originally Posted by *nexxusty*
> 
> nvflash -6 strix1080xoc_t4.rom
> 
> "-6" overrides PCI Subsystem ID.
> 
> Type "nvflash" by itself and press enter for it to show you a list of possible commands/flags you can set.


Quote:


> Originally Posted by *apw63*
> 
> Nick, still no joy; now I get this error


Aye, if you're typing the "number code" for the command it's just one " - " (a single dash);

if you're typing the command as a "word" it's two, so " -- ".

So yeah, either the command above that nexxusty posted or:

*nvflash --overridesub [filename.rom]* *<-- without the [ ]'s*

You might still be getting the subvendor mismatch error because it's a different card (I thought you only got that error if it was a completely different manufacturer).

Edit: I just noticed you're still using the nvflash version modified by joedirt; I can't remember if that is the certificates-bypassed version or not, but if it's not, make sure you are using the latest version of the "normal" one.


----------



## nexxusty

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, if you're typing the "number code" for the command it's just one " - " (a single dash);
> 
> if you're typing the command as a "word" it's two, so " -- ".
> 
> So yeah, either the command above that nexxusty posted or:
> 
> *nvflash --overridesub [filename.rom]* *<-- without the [ ]'s*
> 
> You might still be getting the subvendor mismatch error because it's a different card (I thought you only got that error if it was a completely different manufacturer).
> 
> Edit: I just noticed you're still using the nvflash version modified by joedirt; I can't remember if that is the certificates-bypassed version or not, but if it's not, make sure you are using the latest version of the "normal" one.


Noticed those new commands. -6 has been around since nvflash was made. Hehe.

The command to flash any ROM to any card used to be "nvflash -4 -5 -6 *.rom".

They should have left it, IMO. The last command erases the NVRAM too, or something like that. You can't do that anymore... doesn't seem right.


----------



## apw63

This is what it took to work on my system

nvflash64 -6 strix1080xoc_t4.rom

Testing now, watching temps


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> This is what it took to work on my system
> 
> nvflash64 -6 strix1080xoc_t4.rom
> 
> Testing now, watching temps


Good, glad you got there in the end.

Once you've had a few runs at it you'll know what to watch out for, and you can relax and just enjoy a few extra FPS in your game lol 

Never realised they'd updated to a 64-bit version, "nvflash64" 

Can't wait to get my water chiller. Once I've got that I can stop mucking around with the damn thing and maybe actually play a damn game lol

Don't know why I'm so intent on tinkering with it, opening it up and trying to force better temps all the time lol.

It's ridiculous, check this picture out lol (I must be mad):

8 fans that you can see, plus a 9th on the back (on the outside of the case) and a 10th on the other side of the case aimed at the back of the CPU socket (that one actually gets me -10°C off my CPU temp lol).

Plus the GPU and CPU are both under water (the GPU block is full-cover but cools everything except the 3-phase memory VRM right at the back of the card), and the fans are all spinning at 3000 RPM.



----------



## apw63

Nick'
check the link in my sig "my system". You can see my system (build log). I'm not much crazier than you.


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> Nick'
> check the link in my sig "my system". You can see my system (build log). I'm not much crazier than you.


omg, it's like a 'time machine' lol.

Monster.

That must have cost you a fortune.

What kind of temps do you get with all those radiators?

I had to save up half a month's pay just to afford my EK kit, then sell two old graphics cards plus another half month's wages to get the 1080.. and the case was only recent too, because I was bored and wanted to do a new build while I was on holiday.

I thought my case was big. Yours is unbelievable lol

Mine isn't usually so untidy, but I was doing some stress testing tonight. Going to tidy it up tomorrow and fit a fan onto the bottom underneath the reservoir (the reservoir connects to the fan mounts) so I've got something pointing at the memory VRM at the back of the Classified. Just noticed I was missing an opportunity there lol.

Oh, I noticed we've got the same block lol. Only mine's on a 1080 Classified.

Anyway, time for bed, night


----------



## DStealth

Quote:


> Originally Posted by *Dragonsyph*
> 
> FS ultra
> 
> 
> 
> 
> My card has max core of 2164 @ 1.07V If it hits 2177 it crashes, so im guessing even more voltage wont help lol since 1.09 2177 crashes but 2164 1.07 dont.
> 
> What's everyone's else's graphics scores?
> 
> Im at
> 
> 6188 FS Ultra
> 26,501 FS Regular
> 8679 Timespy
> 
> +140 core +1000 Memory @ 2164/12,000
> Temps 32c-40c


----------



## charro0412

Quote:


> Originally Posted by *Dragonsyph*
> 
> Graphics score seems low for 1936mhz.


http://www.3dmark.com/fs/10975113

That's all I can get out of it.



Cheers

Charro Luft


----------



## Spiriva

Quote:


> Originally Posted by *apw63*
> 
> Nick still no joy now i get this error


nvflash --index=0 --save 1080org.rom
nvflash --index=0 --protectoff
nvflash --index=0 -6 strix1080xoc_t4.rom

--index=0 is because I have two cards; for the next card you'd use --index=1 and do the same thing again.

Also, remember that to raise the voltage you need to use MSI Afterburner and the voltage/frequency curve editor (Ctrl+F brings it up).
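For anyone with two cards, the per-card sequence above can be sketched as a dry run. This is a sketch only: the ROM filename is the one from the post (replace it with your own), and the script just prints the commands rather than executing them, since nvflash has to run with admin rights and flashing the wrong ROM can brick a card.

```shell
# Dry run of the two-card flash sequence: build and print each command
# instead of executing it. ROM name is the example from the post.
ROM="strix1080xoc_t4.rom"
CMDS=""
for IDX in 0 1; do
  CMDS="$CMDS
nvflash --index=$IDX --save card${IDX}_backup.rom
nvflash --index=$IDX --protectoff
nvflash --index=$IDX -6 $ROM"
done
echo "$CMDS"
```

Once you've verified the ROM matches your card, run the printed lines one at a time from an elevated prompt instead of via the loop.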


----------



## nrpeyton

Quote:


> Originally Posted by *Spiriva*
> 
> nvflash --index=0 --save 1080org.rom
> nvflash --index=0 --protectoff
> nvflash --index=0 -6 strix1080xoc_t4.rom
> 
> --index=0 is because I have two cards; for the next card you'd use --index=1 and do the same thing again.
> 
> Also, remember that to raise the voltage you need to use MSI Afterburner and the voltage/frequency curve editor (Ctrl+F brings it up).


*Try nvflash --index=0 --overridesub strix1080xoc_t4.rom*

Also, I think you're using the wrong version of nvflash.

nvflash_5.319.0-win.zip 2819k .zip file


----------



## Dragonsyph

Quote:


> Originally Posted by *DStealth*


Thanks for posting your Ultra scores, wasn't sure if mine was decent or not. I can't get my core OC any higher; I even opened the office window when it was 19°F outside and had GPU temps of 24°C, and the core still wouldn't go higher lol. Voltage did drop from 1.07 to 1.04 though.

Quote:


> Originally Posted by *charro0412*
> 
> http://www.3dmark.com/fs/10975113
> 
> more I do not get out
> 
> 
> 
> Cheers
> 
> Charro Luft


Nice, you got that Oc up.


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> FS ultra
> 
> 
> 
> 
> My card has max core of 2164 @ 1.07V If it hits 2177 it crashes, so im guessing even more voltage wont help lol since 1.09 2177 crashes but 2164 1.07 dont.
> 
> What's everyone's else's graphics scores?
> 
> Im at
> 
> 6188 FS Ultra
> 26,501 FS Regular
> 8679 Timespy
> 
> +140 core +1000 Memory @ 2164/12,000
> Temps 32c-40c


My best:
*5912*

http://www.3dmark.com/3dm/16339819
Not bad with an AMD FX CPU?? lol


----------



## apw63

Been playing around with the new BIOS. So far my best clocks are +256 (1990) core / +1000 (11,010) memory at 1.131 V max. I could not push the core past +256 (1990). Tweak II does not, as far as I know, let me push the volts any higher. I might try Afterburner later today.


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> Been playing around with the new bios. So far my best clocks are 256(1990) / 1000(11010) at 1.131 max volts. I could not push the clock past 256(1990). Tweak II does not as far as I know let me push the volts any higher. I might try Afterburner later today.


*Afterburner is much better; you can also use it to switch off the GPU's dynamic voltage control and "lock" in any voltage you want, then plot a frequency for each voltage point.*

So you could decide that at:
1.081 V you want 1986 MHz
1.093 V you want 2000 MHz
1.100 V you want 2013 MHz, etc.

The cooler you get your card, though, the more stable it will be at higher overclocks.

I just upgraded to water and now I can get 2202 MHz at 1.093 V (rock solid stable). Before I went under water I was lucky to get 2151-2164 MHz stable, and sometimes it would still crash.

The Classified voltage tool isn't doing much at the moment (my card doesn't seem to like being above 1.093 V when it's above 40 °C).

When I get my water chiller I'll be able to start pushing volts a bit higher.

Every card is different though.

I'm currently working on improving how efficiently my memory is cooled to try and push it harder too.
I can get up to +925 without artifacts (and occasionally a lucky quick run will give me a higher score), but when I do a GPU stability test with the 'OCCT' app it records errors at anything past +725 memory. Before going under water I could only get to about +625-650 without errors.

My waterblock isn't a perfect fit for my card so I'm still working away lol.

*<--- replacing 0.5mm thermal pads with 0.5mm copper shims*
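The voltage/frequency pairs listed above can be thought of as a simple lookup table. A minimal sketch, using the example points from this post (not values read from any real card): pressing L on a point in Afterburner's curve pins the card to that one voltage/frequency pair.

```python
# Hypothetical curve points from the post: locked voltage (V) -> plotted frequency (MHz).
curve = {1.081: 1986, 1.093: 2000, 1.100: 2013}

def locked_frequency(curve, vlock):
    """Frequency the card would hold once that voltage point is locked (L key)."""
    return curve[vlock]

print(locked_frequency(curve, 1.093))  # -> 2000
```

The point of locking is exactly this behaviour: one fixed (voltage, frequency) pair instead of the boost algorithm hopping between points.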


----------



## Yomny

Can't you also specify a specific offset for each voltage point in Precision X OC? Using those features on the second screen: basic, linear and manual. Sorry, my terminology is a bit rough, but I know you could specify an offset only at the higher voltages.

I'm just having issues sustaining the voltage, in my case 1.094 V. Are you guys doing anything to lock voltages, or is it just down to the card's power delivery?

Another thing: I've read that these Pascal cards are good for up to a +500 MHz memory offset. Is this normal, and are there any side effects? I've been able to get +600 without issues; I've stressed it with Fire Strike and Sky Diver and it's just happy. Can't get more than a +150 core offset though without locking up, which puts me at 2152 MHz.


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> Cant you specify a specific offset for each voltage setting also in Precision X OC? Using those features in the second screens, basic, linear and the manual. Sorry my terminology is a bit dumb but i know you could specify to have an offset only at higher voltages.
> 
> I'm just having issues sustaining the voltage, in my case 1.094v. Are you guys doing anything to lock voltages or its just up to the cards power delivery?
> 
> Another thing, i've read that these pascal cards are good till up to +500mhz offset for memory, is this normal, are there any side effects? I've been able to get 600 without issues, i've stressed with firestrike, skydiver and its just happy. Cant get more than +150 clock offset though without locking up, this sets me at 2152mhz.


You can, but it's not as precise. Precision X only lets you go up in multiples of 50 MHz, and you've got to plot it by highlighting "blocks" of squares on a grid. It also doesn't let you "lock" onto one specific voltage and *stay* at that voltage.

MSI AB does 

In the curve window in MSI AB, select your voltage point and press the L key, then click "apply" in the main window.
That will stop the voltage jumping around; it will stay at what you specify. If you hit the temp or power limit the voltage still won't change, it'll just reduce your frequency instead. But if you're not hitting the power limit anyway, that's not usually an issue.

P.S. you still need to set the voltage slider to 100% in the main window before using the curve. Also, the voltage won't "take" unless you also plot a frequency overclock for that voltage, e.g. +100 MHz.


----------



## SmackHisFace

Looks like my Zotac AMP! 1080 is kind of a dud. It crashes in OW at 2025 with max voltage and power limit; I had to bump it down to 2000-2012 MHz to keep it stable. Kind of bummed, but I got the card new for $480, and from all the benchmarks I've seen, overclocking these cards doesn't add much FPS. I also can't get the curve to apply extra voltage in Afterburner: when I hit Ctrl+F it brings up the curve, but then the voltage goes back to default despite saying +100. Looking at the graph, the boost block is set for 2012-2025, but in game it's at 2000; why is that? Oh well, I'll probably end up selling this card for the 1180 since I've been upgrading every cycle anyway.


----------



## nrpeyton

Quote:


> Originally Posted by *SmackHisFace*
> 
> Looks like my Zotac AMP! 1080 is kind of a dud. Crashes in OW at 2025 with max voltage and power limit. Had to bump it down to 2000-2012mhz to keep it stable. Kind of bummed but I got the card new for $480 and from all the benchmarks Ive seen overclocking on these cards doesn't add much FPS. Probably end up selling this card for the 1180 since Ive been upgrading every cycle anyways.


You'll get more off the memory than the core.

+160 core might get you 2-3 FPS, but +575 on the memory (which most cards will hit) could give you 5 FPS.

Combine the two and you're getting close to 10 FPS (which in some games could be 9% or 10%).

Bear in mind you got your card for $480 (in my money that's £380). I paid £710 for my Classified; for nearly twice the money I've maybe got an extra 200 MHz or 5 FPS, so I wouldn't get too disheartened about it. It should be me complaining, not you. lol
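The rough arithmetic in this post works out as follows. Note the baseline of 100 FPS is an assumption chosen to match the "9% or 10%" claim, and the gains are the post's estimates, not measurements:

```python
# Back-of-the-envelope OC gains using the estimates quoted above.
baseline_fps = 100   # assumed baseline, consistent with the ~10% claim
core_gain = 3        # ~2-3 FPS from +160 on the core
mem_gain = 5         # ~5 FPS from +575 on the memory
total = core_gain + mem_gain
print(f"+{total} FPS = {total / baseline_fps:.0%} uplift")  # -> +8 FPS = 8% uplift
```

So the combined figure lands a little under 10%, which matches the "almost 10 FPS" hedge in the post.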


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> You'll get more off the memory than the core.
> 
> +160 core might get you 2-3 FPS, but +575 on the memory (which most cards will hit) could give you 5 FPS.
> 
> Combine the two and you're getting close to 10 FPS (which in some games could be 9% or 10%).
> 
> Bear in mind you got your card for $480 (in my money that's £380). I paid £710 for my Classified; for nearly twice the money I've maybe got an extra 200 MHz or 5 FPS, so I wouldn't get too disheartened about it. It should be me complaining, not you. lol


No....a solid core overclock will affect frame rates much more than the memory. Especially in real world situations, and not synthetics that like high memory clock, and, well...to put a finer point on it, bug out easily when it's too high.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> No....a solid core overclock will affect frame rates much more than the memory. Especially in real world situations, and not synthetics that like high memory clock, and, well...to put a finer point on it, bug out easily when it's too high.


On Maxwell, O/C'ing my core got me about 7% and memory about 3%.

On Pascal it seems to be the other way around.

In Witcher 3 (which is well optimised for my CPU, meaning it's one game that *doesn't* actually bottleneck), I'm lucky to get 2 FPS from +160 on the core.

As soon as I do +725 mem I'm getting +5 FPS.

I'll do a video capture if I have to lol....

I agree you lose FPS when the memory bugs out, but going too high on the core seems to give a lower score too.

It's been said many times that memory scales very well on Pascal, which has been the saviour of a lot of people who own cards that barely go past 2000.

What I've also noticed (don't know how relevant this is): on Maxwell I'd get artifacts when going too high on the core.
On Pascal I only get artifacts with high mem overclocks.
The core either crashes or runs.


----------



## SmackHisFace

Quote:


> Originally Posted by *nrpeyton*
> 
> You'll get more off the memory than the core.
> 
> +160 core might get you 2-3 FPS, but +575 on the memory (which most cards will hit) could give you 5 FPS.
> 
> Combine the two and you're getting close to 10 FPS (which in some games could be 9% or 10%).
> 
> Bear in mind you got your card for $480 (in my money that's £380). I paid £710 for my Classified; for nearly twice the money I've maybe got an extra 200 MHz or 5 FPS, so I wouldn't get too disheartened about it. It should be me complaining, not you. lol


Yeah, I'm not too upset about it. Right now I've got it at a 2000 MHz in-game boost and +500 on the memory. The card performs great; it's a lot faster than my 1400 MHz 980 Ti. The Gears of War benchmark with async compute on gives me 104 FPS vs the 82 of the 980 Ti @ 1440p. Perfect for my 96 Hz monitor.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> What I'm talking about is the idea that if you add 750 or 760 to the memory, it yields a significantly lower score, than if you add 755 to the memory. I recall several people saying this. That's clearly a bug....whether you see it in the benchmark or not, it's bugging. Why? No idea...... There should be absolutely no recognizable difference. The fact that there is a difference is very telling.
> 
> You have an FX series processor, and it's bottlenecking the GPU horribly....your higher increases with memory clock are very likely due to that, and not some magical memory.




GDDR5(X) spec sheet /\

*Edit (everything below completely re-edited; I messed up trying to explain it properly lol):*

A +1000 on the memory would be the same as having the 110 speed grade above /\

Nvidia was the only company that bought GDDR5X memory from Micron,

but they bought the lower-grade 100 to save $$$.

It's the same as RAM: a 2133 MHz and a 2400 MHz rated chip could come off the same wafer but get binned differently.

So what did Micron do with all the higher-rated memory?

Probably had to sell it as lower-spec 100.

That probably explains why people like Dragonsyph are getting +1000 on the memory rock solid stable (12 Gb/s instead of 10 Gb/s).
*That's 20%!!*

+1000 takes your memory from 10 Gb/s to 12 Gb/s (that's why it scales so well), which is the same as the 110-rated memory above. Which is probably what Dragonsyph got lol

It's been said time and time again that memory scales really, really well on Pascal; I think this explains it lol

-I do agree though that a +495 or +505 gets better results. Remember that memory benching I did with the graph a while ago lol?
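The 20% figure is easy to sanity-check. The GTX 1080's GDDR5X sits on a 256-bit bus, so total bandwidth scales linearly with the per-pin data rate; going from the stock 10 Gb/s to an effective 12 Gb/s (what a +1000 offset would give, if it held) is a 20% uplift:

```python
# Effective memory bandwidth for a 256-bit GDDR5X bus.
def bandwidth_gbs(gbps_per_pin, bus_bits=256):
    """Total bandwidth in GB/s = per-pin data rate * bus width / 8 bits per byte."""
    return gbps_per_pin * bus_bits / 8

stock = bandwidth_gbs(10)  # GTX 1080 stock: 320 GB/s
oc = bandwidth_gbs(12)     # hypothetical +1000 offset: 384 GB/s
print(stock, oc, f"{oc / stock - 1:.0%}")  # -> 320.0 384.0 20%
```

320 GB/s stock matches NVIDIA's published spec for the card, which is a good check that the per-pin/bus-width arithmetic is right.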


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> on Maxwell O/C'ing my core got me about 7% and memory about 3%.
> 
> On Pascal it seems the other way around.
> 
> Witcher 3 (which is well optimised for my CPU meaning it's one game that *doesn't* actually bottleneck). I'm lucky to get 2FPS with a +160 on the core.
> 
> As soon as I do a +725 mem I'm getting +5fps.
> 
> I'll do a video capture if I have to lol....
> 
> I agree you lose FPS when the memory bugs out, but going too high on the core seems to give a lower score too.
> 
> its been said many times that memory scales very well on Pascal, which has been the saviour to a lot of people who own cards that barely go past 2000.
> 
> what I've also noticed; don't know how relevant this is; but on Maxwell I'd get artifacts when going too high on the core.
> On Pascal I only get artifacts with high mem overclocks.
> core either crashes or runs


What I'm talking about is the idea that if you add 750 or 760 to the memory, it yields a significantly lower score, than if you add 755 to the memory. I recall several people saying this. That's clearly a bug....whether you see it in the benchmark or not, it's bugging. Why? No idea...... There should be absolutely no recognizable difference. The fact that there is a difference is very telling.

You have an FX series processor, and it's bottlenecking the GPU horribly....your higher increases with memory clock are very likely due to that, and not some magical memory.


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> You can, but its not as accurate. Precision X only allows you to go up in multiples of 50mhz and you've got to plot it by highlighting "blocks" of squares on a grid. It also doesn't allow you to "lock" onto one specific voltage and *stay* at that voltage.
> 
> MSI AB does
> 
> In the curve window in MSI AB select ur voltage and press the L key, then click "apply" in the main window.
> That will stop voltage jumping around. It will stay at what you specifiy. If you hit temps or power limit the voltage still won't change, it'll just reduce ur frequency instead. But if you're not hitting power limit anyway thats not usually an issue.
> 
> P.S. you still need to apply a 100% on the voltage slider in main window before using the curve. Also the voltage won't "take" unless you also plot a frequency overclock for that voltage. e.g. +100 mhz


Looks like I'll be checking out MSI AB. I'm currently stable at +150 core and +600 memory, but since the voltage is jumping around, my clock speed doesn't stay at the +150.
Thanks for your help. I'll check it out and report back.


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> Looks like I'll be checking out MSI AB. I'm currently stable at +150 core and +600 memory, but since the voltage is jumping around, my clock speed doesn't stay at the +150.
> Thanks for your help. I'll check it out and report back.


Once you can "lock" a specific voltage in, you can find your max frequency and just leave it there while you game.

Just watch temps though.

Most of us have been able to O/C a bit higher using the curve method in MSI AB than by simply setting a traditional +offset.

To begin with, the reason for that was a bit of a mystery, but I've only just figured out why.

----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> What I'm talking about is the idea that if you add 750 or 760 to the memory, it yields a significantly lower score, than if you add 755 to the memory. I recall several people saying this. That's clearly a bug....whether you see it in the benchmark or not, it's bugging. Why? No idea...... There should be absolutely no recognizable difference. The fact that there is a difference is very telling.
> 
> You have an FX series processor, and it's bottlenecking the GPU horribly....your higher increases with memory clock are very likely due to that, and not some magical memory.


Had to completely change this post because I messed up explaining it; it's fixed now. (It came out in the wrong order; I quoted your post that was posted 'after' mine lol.)

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/8490#post_25690664


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> Once you can "lock" a specific voltage in you can find out what ur max frequency is and just leave it at that while you game.
> 
> just watch temps though
> 
> most of us have been able to O/C a bit higher using the curve method on MSI AB than we have by simply setting a traditional +offset
> 
> to begin with; the reason to that was a bit of a mystery; but i've only just figured out why.


I'm a little lost here; I can't find the voltage control section. Should I enforce constant voltage in settings?


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> Im a little lost here, can't find the voltage control section. Should i enforce constant voltage in settings?


Press Ctrl+F, or there's a *tiny* little button beside the core clock slider (looks like a mini bar graph).

Voltages run along the bottom; select one and press L, then plot a point above that voltage to set your frequency.

Then 'apply' in the main window.


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> press cntrl + F or there's a *tiny* little button beside the core clock slider (looks like a mini bar graph)
> 
> voltages are along the bottom, select one and press L.. then plot a point above that voltage to set your frequency
> 
> then 'apply' in main window


So I added 150 in the clock slider, which is what I'm almost certain I can do without crashing. Now by clicking the voltage point in that graph and pressing L, I can force or lock that voltage?


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> So i added 150 in the clock slider, which is what im almost certain i could do without crashing. Now by clicking the voltage point in that graph and pressing L, i could force or lock that voltage?


No, don't use the core clock slider.

Open the graph.

Voltages run along the bottom, so click somewhere on the graph that corresponds to the voltage below; the frequencies are on the left-hand side. You set the frequency and the voltage at the same time by "plotting" the point on the graph.

In other words, the same plot point controls both simultaneously. Open the curve window and play about with it and you'll understand what I mean.

The higher the voltage, the more you can overclock.

All the microscopic "gates" inside the chip have to switch faster when you increase the frequency; that requires more voltage so they can keep up. When they don't keep up, that's when you crash.

Temps also affect how much voltage you can apply, though, due to this:
look at the chip as billions of conductive pathways and non-conductive walls.
The colder it is, the more conductive the conductive pathways, so fewer electrons escape; when electrons leak through the non-conductive walls, that's when you get a crash or instability.

If you're not getting as big an O/C as you want, you need to find a way to reduce temps.
The scaling on Pascal is roughly 100 MHz per 50 °C.


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> no don't use the core clock slider.
> 
> open the graph.
> 
> voltages are along the bottom... so click somewhere on the graph that corresponds to the voltages below... then on the left hand side are the frequencies.. you set the frequency and the voltage at the same time by "plotting" it on the graph.
> 
> in other words; the same "plot point" controls both simultaneously. open the curve window and play about with it and you'll understand what I mean
> 
> the higher voltage the more you can overclock.
> 
> all the microscopic "gates" inside the chip have got to switch faster when you increase the frequency speed; that requires more voltage so they can keep up. when they don't keep up thats when you crash.
> 
> temps also affect how much voltage u can apply though due to this:
> look at it as the chip is made up of billions of conductive pathways and non-conductive walls.
> the colder it is the more conductive the conductive pathways so the less electrons escape, when electrons escape through non-conductive walls thats when you get a crash or instability.
> 
> if you'e not getting as big an O/C as you want you need to find a way to reduce temps.
> scale on Pascal is about 100mhz per 50 degrees c.


I added the +150 clock speed in the slider, then went to the manual curve and added a bit more frequency from 1.060 V all the way to 1.093 V, which is where I think these cards max out. I was able to get +160 clock using this.

I also locked the voltage as you mentioned at 1.093 V, which basically forces the card to run that voltage and frequency all the time; it holds most of the time but still drops.

All this is great, but in the end I think my card and its power phases just aren't up to par. The voltage doesn't hold steady and the clock drops, not all the time, but a few times during a Valley benchmark. My card uses only one 8-pin power connector and it's got the lowest power-phase count of all the EVGAs lol.

All in all I'm happy, as I'm getting a 2152 core clock and +600 on memory, stable. All this of course when the voltage holds steady lol; not even locking it in MSI AB gets it to stay.

BTW my card is water cooled, so the most I see is 48 °C.


----------



## Randomocity

Quote:


> Originally Posted by *Yomny*
> 
> I added the +150 clock speed in the slider then went to the manual curve and added a bit more frequency from 1.060v all the way to 1.093v which is where I think these cards max out. I was able to get +160 clock using this.
> 
> I also locked the voltage as you mentioned at 1.093v which basically forces the card all the time to run this voltage and frequency at that level, it holds most of the time but still drops.
> 
> All this is great but at the end I think my card and the power phases just aren't up to par. The voltage doesn't hold steady and the clock drops, not all the time but a few times during a valley benchmark. My card uses only 1x8 pin power and it's got the lowest power phase count of all the evga's lol.
> 
> All in all I'm happy as I'm getting 2152 core clock and +600 on memory, stable. All this of course when voltage holds steady lol, not even locking it in MSI AB gets it to stay.
> 
> BTW my card is water cooled so the most I see is 48C.


EVGA 1080FTW here:

So far so good for me tonight as well. My current curve is holding stable at 2202 Mhz Core Clock and 5605 Mhz Memory clock at 1.1V. I flashed the T4 bios, which seemed to do the trick in bringing my clock speeds up. Previously, even on the FTW Slave BIOS, I was only ever able to get around 2151Mhz with +600 on the clock speed. I tried to push my core clock up to 2214, but it seems 2202 is the limit for my card. I have to play around with the memory speed a bit, I've had a single hiccup in 10 minutes with furmark running at 5805, so I backed the memory clock down to a stable 5605. My only issue is that my GPU's temps are up to 49C now, which is a little high for my tastes with it being under water.

Further playing around will ensue I suppose. Now to getting my GPU validated to be added to the owners club.


----------



## ucode

Quote:


> Originally Posted by *Vellinious*
> 
> What I'm talking about is the idea that if you add 750 or 760 to the memory, it yields a significantly lower score, than if you add 755 to the memory. I recall several people saying this. That's clearly a bug....whether you see it in the benchmark or not, it's bugging. Why? No idea...... There should be absolutely no recognizable difference. The fact that there is a difference is very telling.


Yep, I've also mentioned this, and that when experiencing that drop in performance, putting the PC into a sleep state and waking it magically returns it to the expected performance (at least on my card anyway). Not to be confused with the drop in performance that can seem to happen at around a 2800 MHz clock (1400 MHz in GPU-Z, 5600 MHz in AB :/). Here's a memory clock vs bandwidth check; one can see "the bug" as well as the big drop.

GDDR5 on the Pascal GTX 1050 Ti doesn't suffer "the bug", although it does have the big drop-off. For me that happens at about a 3800 MHz memory clock (1900 MHz in GPU-Z).


----------



## VersusPC

This may be a bit off topic, or have been answered in here before; however, I didn't see anyone asking about it in the couple of pages I read back.

Is anyone getting extremely loud coil whine on the ASUS GTX 1080s? Just got done with a water-cooled system and the VRMs are louder than the case fans?!?! Curious if this is just my particular card, or if all of the ASUS 1080s are stricken with this issue.

I've noticed a small amount on a few other ASUS GTX 1080s; however, this was the first one put under water, so I'm not sure if the stock cooler/fans had just muffled the noise on the other cards, or again if this one is just a lemon.


----------



## Yomny

Quote:


> Originally Posted by *Randomocity*
> 
> EVGA 1080FTW here:
> 
> So far so good for me tonight as well. My current curve is holding stable at 2202 Mhz Core Clock and 5605 Mhz Memory clock at 1.1V. I flashed the T4 bios, which seemed to do the trick in bringing my clock speeds up. Previously, even on the FTW Slave BIOS, I was only ever able to get around 2151Mhz with +600 on the clock speed. I tried to push my core clock up to 2214, but it seems 2202 is the limit for my card. I have to play around with the memory speed a bit, I've had a single hiccup in 10 minutes with furmark running at 5805, so I backed the memory clock down to a stable 5605. My only issue is that my GPU's temps are up to 49C now, which is a little high for my tastes with it being under water.
> 
> Further playing around will ensue I suppose. Now to getting my GPU validated to be added to the owners club.


That's awesome man; seems that's about as much as the memory handles for long periods. Going to check out that BIOS update you mentioned. My issue is holding the voltage steady. The FTW is the way to go; from what I've read on EVGA, the power phases are a lot better.

If you have some place where I could gather more info on the BIOS flash, I'd sure appreciate it. Thanks.


----------



## nrpeyton

Quote:


> Originally Posted by *Yomny*
> 
> That's awesome man, seems that memory is about as much as they handle for long periods. Going to see if I check out that bios update you mentioned. My issue is holding the voltage steady. The FTW is the way to go, from what I've read on evga the power phases are a lot better.
> 
> If you have some place I could gather more info on the bios flash I'll sure appreciate it. Thanks.


Just click 'search this thread' at the top and type in "t4 bios", you'll get a list of all relevant posts where its been mentioned.

you'll be able to get a good feel for it that way

other than that check out the hwbot forums and even a google search and a look around techpowerup.com wouldn't do any harm either


----------



## Darkboomhoney

Gtx 1080 Classified Waterblock is now available!!!!!

here the german link : http://www.aquatuning.de/wasserkuehlung/sets-und-systeme/interne-sets/alphacool/eiswolf/21660/alphacool-eiswolf-gpx-pro-nvidia-geforce-gtx-1080-m09-mit-backplate?c=21344


----------



## Yomny

Quote:


> Originally Posted by *nrpeyton*
> 
> Just click 'search this thread' at the top and type in "t4 bios", you'll get a list of all relevant posts where its been mentioned.
> 
> you'll be able to get a good feel for it that way
> 
> other than that check out the hwbot forums and even a google search and a look around techpowerup.com wouldn't do any harm either


Appreciate it, I did search and you came up in the techpowerup forum lol, back when I was trying to decide between the Gigabyte and the Asus.

Appreciate all the help you've been. Thanks a bunch.


----------



## apw63

Quote:


> Originally Posted by *VersusPC*
> 
> This may be a bit off topic, or have been answered in here before, however I didn't see anyone asking this in the couple pages I read back;
> 
> Is anyone getting extremely loud coil whine on the ASUS GTX 1080's? Just got done with a water cooled system and the VRM's are louder than the case fans?!?! Curious if this is just my particular card, or if all of the Asus 1080's are stricken with this issue.
> 
> I've noticed a small amount on a few other Asus GTX 1080s; however, this was the first one put under water, so I'm not sure if the stock cooler/fans had just muffled the noise on the other cards, or again if this one is just a lemon.


I have not heard any whine out of mine. I have my card under water; my CPU, GPU, RAM and motherboard are all under water. In my system, the loudest items are my two MCP35X pumps.


----------



## nrpeyton

Quote:


> Originally Posted by *Darkboomhoney*
> 
> GTX 1080 Classified water block is now available!
> 
> Here's the German link: http://www.aquatuning.de/wasserkuehlung/sets-und-systeme/interne-sets/alphacool/eiswolf/21660/alphacool-eiswolf-gpx-pro-nvidia-geforce-gtx-1080-m09-mit-backplate?c=21344


It would be very interesting to see how this one does up against the EK 780 Ti Classy model.

I managed to get that one to fit my Classified.

I gave up waiting on the Alphacool one because it was taking so long, lol.

Also, it would be helpful if they'd actually use the name "Classified" in the product description. Anyone actually looking to buy one of these would have to email support to find out it even existed, lol, when they name it "Alphacool Eiswolf GPX Pro - Nvidia Geforce GTX 1080 M09".

Why not name it "Alphacool Eiswolf GPX Pro EVGA 1080 Classified"?


----------



## Darkboomhoney

Quote:


> Originally Posted by *nrpeyton*
> 
> It would be very interesting to see how this one does up against the EK 780 Ti Classy model.
> 
> I managed to get that one to fit my Classified.
> 
> I gave up waiting on the Alphacool one because it was taking so long, lol.
> 
> Also, it would be helpful if they'd actually use the name "Classified" in the product description. Anyone actually looking to buy one of these would have to email support to find out it even existed, lol, when they name it "Alphacool Eiswolf GPX Pro - Nvidia Geforce GTX 1080 M09".
> 
> Why not name it "Alphacool Eiswolf GPX Pro EVGA 1080 Classified"?


I don't know, the name is just stupid for this block.








But I called support and asked which block is for the Classified, and he said the M09.
I've now ordered two of them, for me and my friend; in 2-3 days I can say more.
I'll post pictures and the results in a few days.


----------



## nrpeyton

Quote:


> Originally Posted by *Darkboomhoney*
> 
> I don't know, the name is just stupid for this block.
> 
> 
> 
> 
> 
> 
> 
> 
> But I called support and asked which block is for the Classified, and he said the M09.
> I've now ordered two of them, for me and my friend; in 2-3 days I can say more.
> I'll post pictures and the results in a few days.


It would be good to know your VRM and memory temperatures (that is going to be the most interesting bit), as the Alphacool blocks only "passively" cool the VRM and memory.

Here's how I do it (with the backplate left off): four temp probes attached to a £22 (about $26) fan controller:







Notice that temp 3 (the three GDDR5X memory chips nearest the VRM) is the hottest. I'd really love to know what they get to with the Alphacool block.


----------



## Darkboomhoney

Quote:


> Originally Posted by *nrpeyton*
> 
> It would be good to know your VRM and memory temperatures (that is going to be the most interesting bit), as the Alphacool blocks only "passively" cool the VRM and memory.
> 
> Here's how I do it (with the backplate left off): four temp probes attached to a £22 (about $26) fan controller:
> 
> 
> 
> 
> 
> 
> 
> Notice that temp 3 (the three GDDR5X memory chips nearest the VRM) is the hottest. I'd really love to know what they get to with the Alphacool block.


OK, I'll post my results. I'll test the temperatures with an Aquaero 6 XT and external temperature sensors, installed as described.
Isn't the reading worse when you stick the sensors directly on the VRAM?


----------



## nrpeyton

Quote:


> Originally Posted by *Darkboomhoney*
> 
> OK, I'll post my results. I'll test the temperatures with an Aquaero 6 XT and external temperature sensors, installed as described.


Excellent, mate, that would be good to know. On a Furmark run with a continuous draw of 320 W (the maximum on our Classifieds on the LN2 BIOS switch, or "Slave" as it's called on earlier models), my temperature on the memory closest to the VRM was getting to 62°C.

Anyway, I can't wait to see your results, mate.

I will PM you the rest of my photos. If you can do a comparable test by matching my power draw on the card, we should get a pretty accurate comparison; I'll PM my other results.

Don't know if you know this already, but you can always see what your card is drawing with the HWiNFO64 app. It gives your card's power draw in watts, and also tells you what that is as a percentage of your card's maximum.

So, for example, the default max TDP is a 245 W power draw (with the slider in MSI AB/Precision X at 100%).

If you were drawing 320 W you'd be at about 130%.

GDDR5X chips are rated at 95°C, so I'm still deep within safe limits; the VRM is rated at over 100°C.
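
The watts-to-percent arithmetic above is easy to script. A minimal Python sketch, using the 245 W default TDP quoted above (`power_percent` is just an illustrative name, not part of HWiNFO or Afterburner):

```python
# Convert an absolute power draw (as reported by e.g. HWiNFO64) into the
# equivalent power-target percentage, relative to the card's default TDP.
DEFAULT_TDP_W = 245.0  # stock 100% power target mentioned above

def power_percent(draw_watts: float, tdp_watts: float = DEFAULT_TDP_W) -> float:
    """Return draw_watts expressed as a percent of the default TDP."""
    return draw_watts / tdp_watts * 100.0

# A 320 W draw on a 245 W TDP works out to roughly 130%.
print(f"{power_percent(320):.1f}%")  # prints 130.6%
```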

There's a sort of 'step' on the block that lines up with the GDDR5X VRAM chips, but because I'm using the 780 Ti Classy block it doesn't line up properly, so I had to do a little mod with extra thermal pads at the side of the step so the memory made full contact with the block.

Pictures and everything are all about 10 pages back, or if you're interested I've also posted an easy-to-read, straight-to-the-point version at kingpincooling.com here: http://forum.kingpincooling.com/showthread.php?t=3938

It was actually me who persuaded EK to start officially supporting the 1080 in their configurator with the older 780 Ti Classy block, after I got it working and sent them all the detailed information.


----------



## charro0412

http://www.3dmark.com/fs/10989955 MSI X beast on air, max temp 65°C.


----------



## Randomocity

Quote:


> Originally Posted by *VersusPC*
> 
> This may be a bit off topic, or have been answered in here before, however I didn't see anyone asking this in the couple pages I read back;
> 
> Is anyone getting extremely loud coil whine on the ASUS GTX 1080's? Just got done with a water cooled system and the VRM's are louder than the case fans?!?! Curious if this is just my particular card, or if all of the Asus 1080's are stricken with this issue.
> 
> I've noticed a small amount on a few other Asus GTX 1080s; however, this was the first one put under water, so I'm not sure if the stock cooler/fans had just muffled the noise on the other cards, or again if this one is just a lemon.


I get horrific coil whine on my FTW. My rig is generally silent until my VRMs spin up or down.


----------



## juniordnz

Quote:


> Originally Posted by *Randomocity*
> 
> I get horrific coil whine on my FTW. My rig is generally silent until my VRMs spin up or down.


In normal gameplay with a normal FPS range?


----------



## Randomocity

Quote:


> Originally Posted by *juniordnz*
> 
> In normal gameplay with a normal FPS range?


Yep. It's readily apparent when I'm playing something like Doom and I'm putting the card under full load. It tends to also be worse for a couple of seconds once I close a game down.

Edit: interestingly enough it was actually worse after I put on the water block. Temps are still fantastic if I don't jack the voltage up to 1.1V...


----------



## SirCanealot

nrpeyton, thanks for your previous suggestion of setting the voltage to +100 in Afterburner, then adjusting the voltage past 1.1 V via the curve editor. This has not worked at all! My 1080 is stubbornly sticking to around 1.088 V, sometimes adjusting to 1.1 V or around 1.075 V, so I'm not sure why it's being stubborn. I ended up locking to 1.050 V and messing around with that, but the 1080 still underclocks based on temperature (I think), so I'm still missing 2-3 boost bins sometimes. This is while playing Dragon Age: Inquisition at 4K DSR, though, so the performance is still pretty amazing coming from a 980.









I flashed the T4 v2 BIOS on this as well, and the behaviour seems to be exactly the same. I need to flash back to the original BIOS and benchmark more to see whether the T4 BIOS actually improves performance much (it barely affects the Fire Strike score, tbh).


----------



## nrpeyton

Quote:


> Originally Posted by *Randomocity*
> 
> Yep. It's readily apparent when I'm playing something like Doom and I'm putting the card under full load. It tends to also be worse for a couple of seconds once I close a game down.
> 
> Edit: interestingly enough it was actually worse after I put on the water block. Temps are still fantastic if I don't jack the voltage up to 1.1V...


Isn't 1.1 V only the next step up? Like 1100 mV instead of 1093 mV (so 7 millivolts extra)?

What are your temps at the normal 1093 mV maximum? (Everyone can get to 1093 mV without the T4 BIOS or a voltage tool.)


----------



## nrpeyton

Quote:


> Originally Posted by *SirCanealot*
> 
> nrpeyton, thanks for your previous suggestion of setting the voltage to +100 in Afterburner, then adjusting the voltage past 1.1 V via the curve editor. This has not worked at all! My 1080 is stubbornly sticking to around 1.088 V, sometimes adjusting to 1.1 V or around 1.075 V, so I'm not sure why it's being stubborn. I ended up locking to 1.050 V and messing around with that, but the 1080 still underclocks based on temperature (I think), so I'm still missing 2-3 boost bins sometimes. This is while playing Dragon Age: Inquisition at 4K DSR, though, so the performance is still pretty amazing coming from a 980.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I flashed the T4 v2 BIOS on this as well, and the behaviour seems to be exactly the same. I need to flash back to the original BIOS and benchmark more to see whether the T4 BIOS actually improves performance much (it barely affects the Fire Strike score, tbh).


There are two things that need to happen for the voltage to "take" after you use the L key in Afterburner:

1- The voltage slider in the main window must be at 100%.
2- You have to apply an overclock (if GPU Boost 3.0 thinks it can hold the frequency you've set at a lower voltage, it won't change to that voltage).

I've never heard of the voltage jumping around, though; usually it sticks and the frequency goes down.

I'll do some experimenting with it tonight, see what happens, and post back.

What temps were you getting? (Pascal is very temperature sensitive.)

And what card did you have again? You might have to look at the voltage settings in MSI Afterburner's settings: you can set it to "MSI", "MSI Extended" or "3rd party", and there are other options in there too. Are you sure you haven't ticked something? Try resetting it all back to default (if you can't remember what default was and you think you *have* changed something, reinstall it).

I had "force constant voltage" ticked once by mistake, and that messed things up.
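
The two preconditions described above can be captured in a tiny truth-table sketch. This is only an illustrative Python model of the stated logic (names like `slider_percent` are made up; GPU Boost's real decision happens inside the driver), not a real API:

```python
# Toy model of when a curve voltage locked with the L key actually "takes"
# under GPU Boost 3.0, per the two conditions described above.
def locked_voltage_applies(slider_percent: int,
                           oc_applied: bool,
                           boost_holds_freq_at_lower_v: bool) -> bool:
    # 1) the voltage slider in the main window must be at 100%
    if slider_percent < 100:
        return False
    # 2) an overclock must be applied; if GPU Boost 3.0 can hold the
    #    requested frequency at a lower voltage, it stays at the lower one
    if not oc_applied or boost_holds_freq_at_lower_v:
        return False
    return True

print(locked_voltage_applies(100, True, False))  # the lock takes
print(locked_voltage_applies(0, True, False))    # slider not maxed: no lock
```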


----------



## nexxusty

Quote:


> Originally Posted by *nrpeyton*
> 
> There are two things that need to happen for the voltage to "take" after you use the L key in Afterburner:
> 
> 1- The voltage slider in the main window must be at 100%.
> 2- You have to apply an overclock (if GPU Boost 3.0 thinks it can hold the frequency you've set at a lower voltage, it won't change to that voltage).
> 
> I've never heard of the voltage jumping around, though; usually it sticks and the frequency goes down.
> 
> I'll do some experimenting with it tonight, see what happens, and post back.
> 
> What temps were you getting? (Pascal is very temperature sensitive.)
> 
> And what card did you have again? You might have to look at the voltage settings in MSI Afterburner's settings: you can set it to "MSI", "MSI Extended" or "3rd party", and there are other options in there too. Are you sure you haven't ticked something? Try resetting it all back to default (if you can't remember what default was and you think you *have* changed something, reinstall it).
> 
> I had "force constant voltage" ticked once by mistake, and that messed things up.


Jesus... I did not know about the standard voltage slider needing to be maxed. I now have my 1.093 V; 2150 MHz crashes, but 2125 MHz seems stable.

FPS are better in every test.


----------



## juniordnz

Anyone on T4's stock voltage (without touching the voltage slider)?


----------



## Fidelity21

Sorry if this has already been discussed, but there are over 853 pages so far and I can't find the answer.









I have an Asus reference card with an EK water block and backplate. It's working great, and before I learned about voltage control I was able to hit 2050 MHz using MSI AB. Now that I know the voltage can be increased, I installed the beta version, unlocked voltage control and even tried forcing the voltage, but it won't go much above 0.950 V. I see people here pushing 1.100 V, but the card just doesn't want to go near 1.000 V, much less over.

Do I have to flash the BIOS to get these higher voltages? Forcing the voltage seems to work south of 0.900 V, but as soon as I go north of 0.950 V it just ignores my request.


----------



## Dragonsyph

Quote:


> Originally Posted by *Fidelity21*
> 
> Sorry if this has already been discussed, but there are over 853 pages so far and I can't find the answer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have an Asus reference card with an EK water block and backplate. It's working great, and before I learned about voltage control I was able to hit 2050 MHz using MSI AB. Now that I know the voltage can be increased, I installed the beta version, unlocked voltage control and even tried forcing the voltage, but it won't go much above 0.950 V. I see people here pushing 1.100 V, but the card just doesn't want to go near 1.000 V, much less over.
> 
> Do I have to flash the BIOS to get these higher voltages? Forcing the voltage seems to work south of 0.900 V, but as soon as I go north of 0.950 V it just ignores my request.


So your card only hits 0.9 V and not 1.09 V?


----------



## Fidelity21

Correct, it hits 0.900 V. I've seen it go to 0.921 V before, but never higher. People are talking about voltages north of 1.000 V, which makes me wonder if I need a BIOS flash or if there is another fix.


----------



## OccamRazor

Quote:


> Originally Posted by *SirCanealot*
> 
> nrpeyton, thanks for your previous suggestion of setting the voltage to +100 in Afterburner, then adjusting the voltage past 1.1 V via the curve editor. This has not worked at all! My 1080 is stubbornly sticking to around 1.088 V, sometimes adjusting to 1.1 V or around 1.075 V, so I'm not sure why it's being stubborn. I ended up locking to 1.050 V and messing around with that, but the 1080 still underclocks based on temperature (I think), so I'm still missing 2-3 boost bins sometimes. This is while playing Dragon Age: Inquisition at 4K DSR, though, so the performance is still pretty amazing coming from a 980.
> 
> 
> 
> 
> 
> 
> 
> 
> I flashed the T4 v2 BIOS on this as well, and the behaviour seems to be exactly the same. I need to flash back to the original BIOS and benchmark more to see whether the T4 BIOS actually improves performance much (it barely affects the Fire Strike score, tbh).


After flashing and rebooting, if the behavior is the same, you need to fully shut down the computer, as the reboot was not enough for the drivers to recognize the newly flashed BIOS.
Quote:


> Originally Posted by *nrpeyton*
> 
> isn't 1.1 only the next step up? like 1100mv instead of 1093mv (so 7 millivolts extra)?
> what are your temps at the normal 1093mv maximum? (*everyone can get to 1093 without T4 or voltage tool*)


Exactly! One of the differences between the stock BIOS and T4 is that the stock BIOS is locked to 1.093 V and T4 isn't!

Quote:


> Originally Posted by *juniordnz*
> 
> Anyone on T4's stock voltage (without touching the voltage slider)?


The voltage slider only works if you set values in the core and memory fields; using the curve method overrides the setting.

Quote:


> Originally Posted by *Fidelity21*
> 
> Sorry if this has already been discussed, but there are over 853 pages so far and I can't find the answer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have an Asus reference card with an EK water block and backplate. It's working great, and before I learned about voltage control I was able to hit 2050 MHz using MSI AB. Now that I know the voltage can be increased, I installed the beta version, unlocked voltage control and even tried forcing the voltage, but it won't go much above 0.950 V. I see people here pushing 1.100 V, but the card just doesn't want to go near 1.000 V, much less over.
> Do I have to flash the BIOS to get these higher voltages? Forcing the voltage seems to work south of 0.900 V, but as soon as I go north of 0.950 V it just ignores my request.


Yes, you have to flash T4 if you want a higher voltage than 1.093 V.



One thing to remember: the voltage node (in the red circle) that we want to lock with Ctrl+L MUST be higher than the previous, lower-voltage nodes (or you have to rearrange the others below your chosen voltage and clocks). After that, press Apply in Afterburner and the red line will turn yellow on the node you selected for the locked voltage.
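
That monotonicity rule can be sanity-checked offline. A hypothetical Python sketch, modelling the curve as (millivolt, MHz) node pairs (the function name and data layout are illustrative; Afterburner exposes nothing like this):

```python
# Check the rule above: the frequency at the node locked with Ctrl+L must
# not be lower than the frequency of any lower-voltage node; otherwise the
# lower nodes need to be re-arranged first.
def lock_is_valid(curve, lock_mv):
    lock_mhz = dict(curve)[lock_mv]
    return all(mhz <= lock_mhz for mv, mhz in curve if mv < lock_mv)

curve = [(900, 1900), (1000, 2000), (1093, 2100)]
print(lock_is_valid(curve, 1093))                        # valid: 2100 tops the rest
print(lock_is_valid([(900, 2150), (1093, 2100)], 1093))  # invalid: re-arrange first
```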

Cheers

Occamrazor


----------



## Dragonsyph

Quote:


> Originally Posted by *Fidelity21*
> 
> Correct, it hits 0.900 V. I've seen it go to 0.921 V before, but never higher. People are talking about voltages north of 1.000 V, which makes me wonder if I need a BIOS flash or if there is another fix.


It should be able to hit 1.09 V without needing a new BIOS. Did you put the voltage adjustment bar at 100%? Power target at 130%?


----------



## OccamRazor

Quote:


> Originally Posted by *Fidelity21*
> 
> Correct, it hits 0.900 V. I've seen it go to 0.921 V before, but never higher. People are talking about voltages north of 1.000 V, which makes me wonder if I need a BIOS flash or if there is another fix.


Can you show us a pic of the Afterburner monitor's voltage window with the card under load?


----------



## Fidelity21




----------



## OccamRazor

Quote:


> Originally Posted by *Fidelity21*


Furmark won't do; when the drivers detect that the program is running, they will lower voltage and clocks automatically!
What card do you have, exactly?


----------



## Fidelity21

ASUS GeForce® GTX 1080 Founders Edition with a water block professionally installed by Performance-PCs.com. I went with the full-cover Razor GTX as well as the backplate. Temperatures never exceed 50°C thanks to the water-cooling rig with a 360 mm EK XE360 and another SE240 radiator, with eight case fans.

What programs will let me push the voltage higher? Furmark and Kombustor are both not pushing the card as much as I would like. How about the Asus RealBench OpenCL test? Or 3DMark on a loop?


----------



## Dragonsyph

Quote:


> Originally Posted by *Fidelity21*
> 
> ASUS GeForce® GTX 1080 Founders Edition with a water block professionally installed by Performance-PCs.com. I went with the full-cover Razor GTX as well as the backplate. Temperatures never exceed 50°C thanks to the water-cooling rig with a 360 mm EK XE360 and another SE240 radiator, with eight case fans.
> 
> What programs will let me push the voltage higher? Furmark and Kombustor are both not pushing the card as much as I would like. How about the Asus RealBench OpenCL test? Or 3DMark on a loop?


Run MSI Afterburner with the voltage showing, then run a Fire Strike or play some games with no V-sync on and see what voltage it goes to.

My card only needs around 1.04 to 1.07 V for 2164 MHz on the core. Maybe your clock is low enough that your card only needs 0.9 V.


----------



## IronAge

Quote:


> Originally Posted by *ROKUGAN*
> 
> Same here. I've read several members talking about flashing the T4 BIOS on the AMP Extreme, but I asked them for results and none reported back.
> Maybe they bricked their cards and aren't able to read my feedback requests
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, I'm pretty sure it will work, but I'd like to know about the difference, as the AMP Extreme BIOS is already pretty good.


I have tested my AMP! non-Extreme with the T4 BIOS, with the stock cooler and without increasing VDDC.

http://www.3dmark.com/3dm/14560942

One of the highest graphics scores I've got without adding too much core-clock offset.

But you have to make sure your GPU stays under 62 degrees Celsius; I got lockups when temps went over that, which is why I just checked the graphics score.

With a full FSE run, temps would have been too high under the stock cooler @ 100%, even with an additional 140x140x38 mm fan @ 2500 RPM.

There is no thermal or PL-related throttling; with the T4 BIOS there is no PL/TDP usage displayed in the AB HUD.

As of now I own a Gainward Phoenix, which has the strongest stock cooler so far.

With a curve I get 2152 MHz at 1.0 V VDDC; I've got to try the T4 BIOS on that card as well.


----------



## TK421

Can anyone comment on the heatsink quality of the laptop below for a GTX 1080? Disregard the factory thermal paste, since I will be using liquid metal.


----------



## ucode

Quote:


> Originally Posted by *TK421*
> 
> Can anyone comment on the heatsink quality of the laptop down below for a GTX1080? Disregard factory thermal paste since I will be using liquid metal.


Might be worthwhile to ask on NBR.

Is this the GT73VR 6RF? If so, there isn't any firmware posted for it, so I can't tell you the power limit. You're not going to hit high-end desktop power limits, but given the relatively small performance gain from increased voltage and power, you're probably not missing out on much.

Overall it looks okay. The VRM shouldn't dissipate too much power, so two heat pipes should be more than enough, and possibly a little wasteful if they're not also helping the GPU/memory. 2x3 for the GPU and memory looks good for an SFF. I can't tell the quality of the machined interface to the components, though, or how the EC will control cooling.


----------



## OccamRazor

Quote:


> Originally Posted by *IronAge*
> 
> I have tested my AMP! non-Extreme with the T4 BIOS, with the stock cooler and without increasing VDDC.
> 
> http://www.3dmark.com/3dm/14560942
> 
> One of the highest graphics scores I've got without adding too much core-clock offset.
> 
> But you have to make sure your GPU stays under 62 degrees Celsius; I got lockups when temps went over that, which is why I just checked the graphics score.
> 
> With a full FSE run, temps would have been too high under the stock cooler @ 100%, even with an additional 140x140x38 mm fan @ 2500 RPM.
> 
> There is no thermal or PL-related throttling; *with the T4 BIOS there is no PL/TDP usage displayed in the AB HUD*.
> 
> As of now I own a Gainward Phoenix, which has the strongest stock cooler so far.
> 
> With a curve I get 2152 MHz at 1.0 V VDDC; I've got to try the T4 BIOS on that card as well.


But you can get power readings on T4 in the AB OSD with HWiNFO: just go to the sensor settings, OSD (RTSS) tab, select GPU power and tick "Show value in OSD".









Cheers

Occamrazor


----------



## Randomocity

Quote:


> Originally Posted by *nrpeyton*
> 
> isn't 1.1 only the next step up? like 1100mv instead of 1093mv (so 7 millivolts extra)?
> 
> what are your temps at the normal 1093mv maximum? (everyone can get to 1093 without T4 or voltage tool)


So my FTW topped out at 1.063 V using the slave BIOS and a 130% power limit in AB. The overclock maxed out at 2151 MHz, and temps never really broke 45°C under full load in Furmark. At 1.1 V on T4 v2, I'm able to run the card at around 2202 MHz core with a max temp of around 49°C. I've been a little nervous about pushing past 1.1 V due to temps, but I might give it a shot this weekend.


----------



## TK421

Quote:


> Originally Posted by *ucode*
> 
> Might be worthwhile to ask on NBR.
> 
> Is this the GT73VR 6RF? If so, there isn't any firmware posted for it, so I can't tell you the power limit. You're not going to hit high-end desktop power limits, but given the relatively small performance gain from increased voltage and power, you're probably not missing out on much.
> 
> Overall it looks okay. The VRM shouldn't dissipate too much power, so two heat pipes should be more than enough, and possibly a little wasteful if they're not also helping the GPU/memory. 2x3 for the GPU and memory looks good for an SFF. I can't tell the quality of the machined interface to the components, though, or how the EC will control cooling.


Thinking the VRM cooling might be overkill for the card, though?

I wonder if there's a drawback to using aluminum for the heatsink fins?


----------



## Radmanhs

Is there any real way to bump the voltage past the "locked" max? I can only get 1.063 V and 2100 MHz on my core.


----------



## OccamRazor

Quote:


> Originally Posted by *Radmanhs*
> 
> Is there any real way to bump the voltage past the "locked" max? I can only get 1.063 V and 2100 MHz on my core.


Read my earlier post today about the curve nodes and Ctrl+L to lock the voltage. On the stock BIOS you are limited to 1.093 V, and the voltages and clocks are always bouncing around depending on load, temperature and power limit; with the T4 BIOS it goes all the way to 1.200 V with no or only minor fluctuations!

Cheers

Occamrazor


----------



## pfinch

Hey guys,

I flashed T4 (v2) on my AMP Extreme. It's working fine, but in nearly every application it throttles down by one core-MHz step (2013 -> 2001; 2101 -> 2088, etc.).
I tested every voltage (1.093 to 1.2 V); all the same.

Could someone explain this behavior?


----------



## ROKUGAN

Quote:


> Originally Posted by *IronAge*
> 
> I have tested my AMP! non-Extreme with the T4 BIOS, with the stock cooler and without increasing VDDC.
> 
> http://www.3dmark.com/3dm/14560942
> 
> One of the highest graphics scores I've got without adding too much core-clock offset.
> 
> But you have to make sure your GPU stays under 62 degrees Celsius; I got lockups when temps went over that, which is why I just checked the graphics score.
> 
> With a full FSE run, temps would have been too high under the stock cooler @ 100%, even with an additional 140x140x38 mm fan @ 2500 RPM.
> 
> There is no thermal or PL-related throttling; with the T4 BIOS there is no PL/TDP usage displayed in the AB HUD.
> 
> As of now I own a Gainward Phoenix, which has the strongest stock cooler so far.
> 
> With a curve I get 2152 MHz at 1.0 V VDDC; I've got to try the T4 BIOS on that card as well.


Thanks for the feedback








I've been busy messing with my mobo's beta BIOS and overclocking the CPU, and now it's time to mess with something else.








I will definitely try the T4 BIOS on the AMP! Extreme then. It usually settles around 2100 MHz with temps in the 60s °C, but I feel it's capable of much more without that damn throttling.


----------



## ROKUGAN

Quote:


> Originally Posted by *Randomocity*
> 
> I get horrific coil whine on my FTW. My rig is generally silent until my VRMs spin up or down.


I've seen a couple of complaints about loud coil whine from people on the madVR forum with EVGA cards. Considering the thermal-pad fiasco, it's not their best iteration.


https://www.reddit.com/r/4oqy44/gtx_1080_coil_whine/

http://forums.evga.com/Is-coil-whine-a-serious-issue-with-the-ftw-and-sc-cards-m2527936.aspx

But some people are blaming Corsair RM-series power supplies. I have a Corsair AX and literally zero noise from the graphics card.


----------



## nexxusty

Quote:


> Originally Posted by *Radmanhs*
> 
> Is there any real way to bump the voltage past the "locked" max? I can only get 1.063 V and 2100 MHz on my core.


You are where I was yesterday, hehe.

The voltage slider has to be at 100%, then select 1093 mV in the curve.

Works.


----------



## Randomocity

Quote:


> Originally Posted by *ROKUGAN*
> 
> I've seen a couple of complaints about loud coil whine from people on the madVR forum with EVGA cards. Considering the thermal-pad fiasco, it's not their best iteration.
> 
> https://www.reddit.com/r/4oqy44/gtx_1080_coil_whine/
> 
> http://forums.evga.com/Is-coil-whine-a-serious-issue-with-the-ftw-and-sc-cards-m2527936.aspx
> 
> But some people are blaming Corsair RM-series power supplies. I have a Corsair AX and literally zero noise from the graphics card.


Sadly, I don't have a Corsair power supply, so it's not that. I've got the EVGA 750 G2, which should be more than enough, with power stable enough that coil whine shouldn't be an issue.


----------



## SirCanealot

OccamRazor/nrpeyton: Thanks for the info as usual! I'll give this all a go when I have a chance again. Pretty random flowchart of things to check, my favourite! Thanks again, you guys are awesome!


----------



## Krzych04650

Question about PCB width. I am planning to SLI in January, and since the bridges are rigid now, I need a card of the same or very similar width. Is there any other card as wide as the MSI ones (140 mm)? Do you think something with a very similar width, like 137 mm, would have any issues fitting a rigid SLI bridge? Has anyone tried something similar?


----------



## Sowah

*Hello, dear friends!*

Now that some time has passed: are you completely happy with the GTX 1080? Does everything work fine with drivers and so on?

I ask because I might like to swap my 2x Maxwell Titan X SLI for 2x GTX 1080 in SLI for Christmas...

These two:
https://www.alternate.de/MSI/GeForce-GTX-1080-SEA-HAWK-X-Grafikkarte/html/product/1279454?event=search

for this:
http://i.imgur.com/qt9Jjlo.jpg

So, are two 1080s fine for Nvidia Surround? Is there enough VRAM for three-monitor gaming?

Thank you for helping me find out!

*Greetings from Germany & Spain,
Christian & my two sons*


----------



## DADDYDC650

Quote:


> Originally Posted by *Sowah*
> 
> *Hello, dear friends!*
> 
> Now that some time has passed: are you completely happy with the GTX 1080? Does everything work fine with drivers and so on?
> 
> I ask because I might like to swap my 2x Maxwell Titan X SLI for 2x GTX 1080 in SLI for Christmas...
> 
> These two:
> https://www.alternate.de/MSI/GeForce-GTX-1080-SEA-HAWK-X-Grafikkarte/html/product/1279454?event=search
> 
> for this:
> http://i.imgur.com/qt9Jjlo.jpg
> 
> So, are two 1080s fine for Nvidia Surround? Is there enough VRAM for three-monitor gaming?
> 
> Thank you for helping me find out!
> 
> *Greetings from Germany & Spain,
> Christian & my two sons*


Depends on the resolution and what games you play at what settings.


----------



## IronAge

If you have Titan X SLI under water cooling and overclock them @ 1400-1450 MHz, I would not buy GTX 1080 SLI.

A GTX 980 Ti @ 1550 MHz is only about 15% slower than a GTX 1080 @ 2100 MHz.

For a more noticeable performance upgrade, you'd be better off buying the Titan X Pascal.









So it actually depends: do you run your Titan X Maxwell SLI at stock clock rates, or are they overclocked?


----------



## ucode

Quote:


> Originally Posted by *TK421*
> 
> Thinking the VRM cooling might be overkill for the card, though?


IMO, better to have too much than too little.

I'd like to try one of those myself and see if software-disabling the power limit works any better than on the desktop card. There's still a lot of bugginess with Pascal.


----------



## Sowah

Quote:


> Originally Posted by *DADDYDC650*
> 
> Depends on the resolution and what games you play at what settings.


Thank you!

The resolution we use is 3x 1920x1080 = 5760x1080; with bezel correction, *6020x1080*.

We play, for example, at the moment _Battlefield 1, Titanfall 2, Everspace, Darksiders Warmastered Edition,
Osiris: New Dawn and The Witcher 3: Wild Hunt_ at the highest in-game settings, all fine without problems.

Quote:


> Originally Posted by *IronAge*
> 
> If you have Titan X SLI under water cooling and overclock them @ 1400-1450 MHz, I would not buy GTX 1080 SLI.
> 
> A GTX 980 Ti @ 1550 MHz is only about 15% slower than a GTX 1080 @ 2100 MHz.
> 
> For a more noticeable performance upgrade, you'd be better off buying the Titan X Pascal.
> 
> So it actually depends: do you run your Titan X Maxwell SLI at stock clock rates, or are they overclocked?


Thank you!

Yeah, maybe 2x "overpriced" Titan X Pascal, hehe...









We play in SLI at stock clock rates, with a custom fan profile:


We made time ago 3 stable overclock settings with the EVGA Precision X, but we never use / needed yet...

Last week I wrote with Brother HaYDen from Flawlesswidescreen http://www.flawlesswidescreen.org/ and Widescreen Gaming Forum http://www.wsgf.org ... and we think maybe a 1080 Ti version would bei nice.









Chris




----------



## nrpeyton

Quote:


> Originally Posted by *Radmanhs*
> 
> Is there any real way to bump the voltage past the "locked" max? I can only get 1.063 volts and 2100mhz on my core


which card do you have?
Quote:


> Originally Posted by *Sowah*
> 
> Thank you!
> 
> Resolution we use 3x 1920 x 1080 = 5760 x 1080 in Bezel Correction = *6020 x 1080*
> 
> We play for example at the moment _Battlefielt 1, Titanfall 2, Everspace, Darksiders Warmastered Edition,
> Osiris New Dawn and The Witcher 3 Wild Hunt_ with highest settings in Games, all fine without problems.
> 
> Thank you!
> 
> Yeah, maybe 2x "overpriced" Titan X Pascal, hehe...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We play in SLi with stock clock rates, with custom fan profile :
> 
> 
> We made time ago 3 stable overclock settings with the EVGA Precision X, but we never use / needed yet...
> 
> Last week I wrote with Brother HaYDen from Flawlesswidescreen http://www.flawlesswidescreen.org/ and Widescreen Gaming Forum http://www.wsgf.org ... and we think maybe a 1080 Ti version would bei nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Chris


The 1080 Ti is to be introduced on the 17th of January at CES, then will be retailing shortly after (so I hear), at a much lower price than the Titan X for the same performance.


----------



## nrpeyton

Quote:


> Originally Posted by *Randomocity*
> 
> So my FTW topped out at 1.063V using the slave BIOS and 130% power limit in AB. Overclock maxed out at 2151Mhz and temps never really broke 45c under full load with furmark. At 1.1V on t4 v2, I'm able to run the card around 2202Mhz core with my max temp being around 49c. I've been a little nervous to push it past 1.1v due to temps, but I might give it a shot this weekend.


You don't need to be "nervous" about temps above 50°C.

The GPUs are rated up to 94°C, the GDDR5X memory to 96°C, and the VRM to over 100°C.

The only thing higher temps may do is make your overclock "less stable", but you won't know that until you try.

Quote:


> Originally Posted by *pfinch*
> 
> Hey guys,
> 
> flashed T4 (v2) on my AMP Extreme. Working fine, but on nearly every Application it throttles down by 1 Core-Mhz Step (2013 -> 2001; 2001 -> 2088 etc.)
> Tested every Voltage (1093 to 1.2) ... all the same
> 
> could someone explain this behavior?


This is normal behaviour.

For every voltage step there are a number of "linked" or "possible" frequencies for that voltage.

Depending on temperature and power draw (relative to max TDP), GPU Boost 3.0 will "down-step" the frequency in an attempt to bring your temps and power draw back into line.

This is the same for all voltages, and completely normal behaviour. There's no way to switch it off.

Once it reaches the lowest frequency step for that voltage, if the temp is *still* too high or power draw is still nearing max TDP, it will then drop the voltage.

When it drops to that lower voltage it chooses from an array of lower frequency steps for that voltage again, and the cycle continues until things even out.

What you *can* do is "lock" the voltage using MSI Afterburner. That's about the best you can do.

You don't need to worry about the frequency jumping; it's completely normal. But if for some bizarre reason it did annoy someone and you were desperate, locking the voltage would help a little.

The best way to improve this behaviour is to put your card under water. That way, instead of stepping down by maybe 4 or 5 frequency steps, at most it will drop only 1 step (provided you're not hitting the power limit).

*Edit (just remembered):*
Also, if you're an EVGA customer you could use Precision X and enable "K-Boost". This forces the card to run at its maximum. But K-Boost only works in Precision X with EVGA cards (I think), and I can't remember exactly how it works, so you'd have to check it out.
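The down-stepping described above can be sketched as a toy model. All the numbers here (the V/F table, step sizes, temperature and power limits) are made-up illustrative values, not NVIDIA's actual tables:

```python
# Toy model of GPU Boost 3.0 down-stepping as described above.
# All numbers (V/F table, limits) are illustrative, not NVIDIA's.

VF_TABLE = {  # voltage (V) -> frequency steps (MHz), highest first
    1.062: [2088, 2075, 2063, 2050],
    1.050: [2050, 2037, 2025, 2012],
    1.037: [2012, 2000, 1987, 1974],
}

def boost_step(voltage, freq, temp_c, power_pct,
               temp_limit=83, power_limit=100):
    """Return the (voltage, frequency) the card settles on next tick."""
    if temp_c <= temp_limit and power_pct <= power_limit:
        return voltage, freq                  # within limits: hold
    steps = VF_TABLE[voltage]
    i = steps.index(freq)
    if i + 1 < len(steps):                    # room left at this voltage:
        return voltage, steps[i + 1]          # drop one frequency step
    volts = sorted(VF_TABLE, reverse=True)
    j = volts.index(voltage)
    if j + 1 < len(volts):                    # lowest step reached:
        nv = volts[j + 1]                     # drop voltage and restart
        return nv, VF_TABLE[nv][0]            # from its top frequency
    return voltage, freq                      # already at the floor

# A hot, power-limited card walks down the table:
v, f = 1.062, 2088
for _ in range(5):
    v, f = boost_step(v, f, temp_c=90, power_pct=105)
print(v, f)  # drops step by step until temps/power come back in line
```

Cooling the card (water, or a chiller) keeps `temp_c` under the limit, which is why under water the card holds its top step far more often.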


----------



## Randomocity

Quote:


> Originally Posted by *nrpeyton*
> 
> You don't need to be "nervous" about temps above 50c.
> 
> The GPU's are rated up to 94c.
> 
> The GDDR5X memory at 96c
> 
> The VRM over 100C.
> 
> The only thing temps may do is make your overclock "less stable" but you won't know that until you try


Stability has definitely been an issue past 2202MHz. I'll bump it up to 1.2v and see how much further I can overclock, although it's hard to complain about breaking 2200MHz at any voltage.


----------



## nrpeyton

Quote:


> Originally Posted by *Randomocity*
> 
> Stability has definitely been an issue past 2202mhz, I'll bump it up to 1.2v and see how much farther I can overclock,although it's hard to complain about breaking 2200mhz at any voltage.


True, lol.

My card is exactly the same; it hates going above 2202 due to temps.

I'm strongly considering a water chiller, but that's a huge (and probably pointless) indulgence, lol.


----------



## emperium85

If you have an FE card, get yourself an EVGA Hybrid kit; the kit from the 980 Ti also works.
Max temps 40°C in a closed case @ 2126MHz (push/pull).

Where can I find the T4 v2 BIOS? I can't find it anywhere. And what's the difference between v1 and v2?


----------



## nrpeyton

Quote:


> Originally Posted by *emperium85*
> 
> If you have an FE card get yourself an EVGA Hybrid kit, also the kit from the 980Ti works.
> Max temps 40c closed case @2126Mhz (push/pull)
> 
> Where can I find the T4V2 bios ? I cant find it anyware, and whats the difference between v1 and v2 ?


I posted the T4 v2 a few pages back.


----------



## Darkboomhoney

@nrpeyton, Aquatuning sent me the wrong blocks... M05, but I ordered M09. Now I've ordered the M09 directly from occool; I think I'll have results next week.


----------



## emperium85

Quote:


> Originally Posted by *nrpeyton*
> 
> I posted the T4, V2 a few pages back


Maybe something for the opening post; I can't find T4 v2 anywhere.


----------



## nrpeyton

Quote:


> Originally Posted by *Darkboomhoney*
> 
> @nrpeyton, aquatuning send me the wrong blocks ... M05 but i ordered M09...... now i ordered directly from occool the m09 i think next week i have results....


**** sake mate, lol. I feel your pain and disappointment :-(

Argh, oh well, I look forward to it.

By the way, that's kind of ironic, especially after my post about the lack of proper "naming" of these blocks, lol.

Anyway, I hope you have more luck having it shipped from Germany.

*What did Aquatuning have to say about it?*


----------



## OccamRazor

Quote:


> Originally Posted by *emperium85*
> 
> Maybe something for the opening post, I can't find T4v2 anyware


Here you go, nvflash plus T4 v2:

NvflashplusT4.zip 1304k .zip file


Cheers

Occamrazor


----------



## Fidelity21

So just to confirm, this BIOS flash will work with all FE cards? I'm maxing out my Asus 1080 FE card at 2050MHz core and I'd like to push higher voltage, since I'm running a very nice EK water setup. Raising the voltage slider from stock to 100% in MSI AB did nothing to change the 2050MHz limit. Temps never exceed 50°C, but I have no idea what the memory or voltage-controller temps are.


----------



## nrpeyton

*Memory vs. core overclocking on Pascal: which gives more gains?*


----------



## nrpeyton

Quote:


> Originally Posted by *Fidelity21*
> 
> So just to confirm, this bios flash will work with all FE cards? I'm maxing out my Asus 1080 FE card at 2050mhz core and I'd like to push higher voltage since I'm running a very nice EK water setup. Changing voltage from stock to 100% using MSI AB did nothing to change 2050mhz as my limit. Temps never exceed 50C but I have no idea what memory temps are or the voltage controller


The T4 BIOS won't brick an FE 1080.

Unfortunately voltage isn't doing an awful lot on Pascal, not unless you can get under 25°C (water chiller).

Some less "leaky" cards might take a few extra bumps under 40°C.

I'm talking about bumps over 1093mV here, by the way.


----------



## Sowah

Quote:


> Originally Posted by *nrpeyton*
> 
> The 1080 TI is to be introduced on the 17th of January at the CES conference then will be retailing shortly after (so I hear). At a much lower price than Titan X and for the same performance.


Thank you Nicholas, that's good news about the 1080 Ti!

By the way, I like your videos on YouTube!

Greetings from Spain!
Christian


----------



## Vellinious

Quote:


> Originally Posted by *OccamRazor*
> 
> Here you go, nvflash plus T4 v2:
> 
> NvflashplusT4.zip 1304k .zip file
> 
> 
> Cheers
> 
> Occamrazor


Has anyone tested this with the FTW?

UPDATE: Haha, yeah... that didn't go over well. It wouldn't read either card as a 1080. Swapped to the #1 BIOS, rebooted, hot-switched to the #2 BIOS and reflashed to the stock BIOS. Up and going again, easy enough, but... yeah, my FTWs didn't like that STRIX BIOS at all. Ish


----------



## DStealth

Finally exceeded an 8500 GPU score with my Palit JS on the stock cooler... cold-air cooled, actually.








[email protected] +540 memory best for TimeSpy on my video...


http://www.3dmark.com/spy/848861


----------



## OccamRazor

Quote:


> Originally Posted by *Vellinious*
> 
> Has anyone tested this with the FTW?
> 
> UPDATE: Haha, yeah....that didn't go over well. It wouldn't read either card as a 1080. Swapped to the #1 bios, rebooted, hot switched to #2 bios and reflashed to the stock bios. Up and going again, easy enough, but.....yeah, my FTWs didn't like that STRIX bios any at all. Ish


That's weird; it should have at least let the card be recognized. It's the first time I've heard of a T4 not working in an FTW!
I've found a couple of times that a reboot is not enough for the flash to take effect; you have to shut down and restart to see the newly flashed BIOS working. It happened to me once when I flashed the T4: I rebooted, and while testing realized AB still showed the power and temperature sliders (grayed out on T4)!
A good indicator of a successful flash is low resolution just before Windows kicks in, when the drivers recognize a new card is installed, since a new BIOS is like a new card to the drivers.

Cheers

Occamrazor


----------



## Azazil1190

Is there any difference between T4 and T4 v2?
Or are they the same?


----------



## OccamRazor

Quote:


> Originally Posted by *Azazil1190*
> 
> Is any different between t4 and t4 v2?
> Are the same?


Yes, basically; the v2 only has lowered OC clocks and some speed improvements for 1080 Strix cards, nothing that we reference-PCB users would see as a major bump!

Cheers

Occamrazor


----------



## Azazil1190

Quote:


> Originally Posted by *OccamRazor*
> 
> Yes, basically, its only lowered OC clocks and some speed improvements for 1080 strix cards but nothing we reference pcb users might see any major bump!
> 
> Cheers
> 
> Occamrazor


Thanks mate for the response.
I own a Strix OC flashed to the T4 XOC BIOS, I think it's the 1st T4. That's why I asked if there's a difference between T4 and T4 v2.
What do you think? Better to flash the T4 v2?
Thanks in advance.


----------



## emperium85

I flashed T4 v2 and I don't want to run @ 1.2v, but when I do CTRL+F in Afterburner the table goes up to 1.2v. How do I cap this at 1.15v?


----------



## MonarchX

Why can't GPU-Z display ASIC quality for my Gigabyte GTX 1080 Gaming G1? My highest OC at maxed-out allowable voltage, power, etc. is only 2100MHz. Is that normal?


----------



## Azazil1190

Quote:


> Originally Posted by *MonarchX*
> 
> Why can't GPU-Z display ASIC for my Gigabyte GTX 1080 Gaming G1? My highest OC at maxed out allowable voltage, power, etc. is only 2100Mhz. Is that normal?


GPU-Z can't read the ASIC of Pascal cards.
It's normal, don't worry.
And your OC is fair enough.


----------



## Randomocity

Quote:


> Originally Posted by *Vellinious*
> 
> Has anyone tested this with the FTW?
> 
> UPDATE: Haha, yeah....that didn't go over well. It wouldn't read either card as a 1080. Swapped to the #1 bios, rebooted, hot switched to #2 bios and reflashed to the stock bios. Up and going again, easy enough, but.....yeah, my FTWs didn't like that STRIX bios any at all. Ish


Huh. I haven't had any issues with the T4 BIOS that was posted earlier on my FTW. Final count for my overclock is [email protected]/+725. Can't really complain.


----------



## Vellinious

Quote:


> Originally Posted by *Randomocity*
> 
> Huh. I haven't have any issues with the t4 bios that was posted earlier on my FTW. Final count for my overclock is [email protected]/+725. Can't really complain.


I run mine for TimeSpy at 2202 and +575. I can run 2214, but there's no improvement in frame rates.
Quote:


> Originally Posted by *OccamRazor*
> 
> Thats weird, it should had at least let the card to be recognized, its the first time i hear a T4 was not working in a FTW!
> I found a couple of times that a reboot is not enough for the flash to be effective, have to shutdown and restart to see the newly flashed bios working, it happen when i flashed once the T4, rebooted and when i was testing realized in AB there was still the power and temperature slider (grayed out in T4)!
> A good indicator of successful flash is low resolution just before windows kicks in, when the drivers recognize a new card is installed as a new bios is like a new card for the drivers.
> 
> Cheers
> 
> Occamrazor


I did a reboot twice...I didn't leave it off though. On the Maxwells, sometimes it'd take a few boots to take effect, so...I kinda tried that, but each time, I'd get into Windows and I could hear the cards enabling and disabling themselves, it wouldn't bring up the drivers and device manager wasn't recognizing them as 1080s.

I might try again tonight, see if I flash and then turn the machine off for a few minutes, see if that doesn't work a little better. I'd like to see if I can get a run done at 2227+ in SLI.


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> *Memory vs. core overclocking on Pascal: which gives more gains?*


Yeah, a memory OC alone gave me like an extra 1800 points in FS.


----------



## ucode

Quote:


> Originally Posted by *emperium85*
> 
> I flashed T4v2 and I dont want to run @ 1.2v but when I do CTRL+F on afterburner the table goes to 1.2v how do I set this to max 1.15v ?


Set all points above 1.15V to the same frequency as at 1.15V. IOW flatline frequencies from 1.15V and above.

Never heard of T4v2. There's the strix xoc VBIOS with increased default clocks followed by the strix T4, both allowed 1.2V and disabled soft power limits.
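The "flatline" operation can be sketched in a few lines; the curve points below are hypothetical examples, not real Afterburner data:

```python
# Sketch of the "flatline" fix: every curve point above the cap voltage is
# set to the frequency at the cap. Curve points are hypothetical examples,
# not real Afterburner data.

def flatline(curve, cap_v):
    """curve: list of (voltage_V, freq_MHz) pairs sorted by voltage."""
    cap_f = max(f for v, f in curve if v <= cap_v)
    return [(v, f if v <= cap_v else cap_f) for v, f in curve]

curve = [(1.050, 2050), (1.093, 2088), (1.150, 2126), (1.200, 2164)]
print(flatline(curve, cap_v=1.150))
# -> [(1.05, 2050), (1.093, 2088), (1.15, 2126), (1.2, 2126)]
```

Since GPU Boost never exceeds the frequency set for a given voltage, a flat curve above 1.15V means the card gains nothing by boosting past 1.15V, so it stays there.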


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> I run mine for TimeSpy at 2202 and +575. I can run 2214, but there's no improvement in frame rates.
> I did a reboot twice...I didn't leave it off though. On the Maxwells, sometimes it'd take a few boots to take effect, so...I kinda tried that, but each time, I'd get into Windows and I could hear the cards enabling and disabling themselves, it wouldn't bring up the drivers and device manager wasn't recognizing them as 1080s.
> 
> I might try again tonight, see if I flash and then turn the machine off for a few minutes, see if that doesn't work a little better. I'd like to see if I can get a run done at 2227+ in SLI.


Did you try unplugging the PSU?
Sometimes when switching from Master to Slave with the FTW's you need to do that for it to take effect.


----------



## Outcasst

I have been testing out this T4 BIOS, and I'm getting an average of 122FPS in a Heaven run by overclocking to 2100MHz using just the offset and no voltage increase. GPU-Z indicates it's running at 1.06v.

However, whenever I try to force a voltage, say 1.1v using the curve, at the exact same clock speed, the average goes down to 115FPS. Neither method is throttling as far as I can tell; it runs at a constant 2100 throughout the entire run. Am I missing something here?

GPU isn't exceeding 52°C. Founders Edition.


----------



## Calibos

I'm about to delid and use CLU on my 6700K. Just wondering, is there anything to be gained from replacing the TIM on my Palit Super Jetstream GTX 1080 with CLU? Currently about 70°C while boosting to about 2000MHz. Haven't tried OC'ing yet.


----------



## Dragonsyph

Don't put liquid metal on a GPU, is my opinion.


----------



## OccamRazor

Quote:


> Originally Posted by *Azazil1190*
> 
> Thanks mate for the response.
> I own a strix oc flashed to t4 xoc bios I think is thr 1st t4.Thats why i asked if exist difference between t4 and t4 v2.
> What do you think? Better to flash the t4 v2?
> Thanks in advance


AFAIK, the V2 is the latest Strix BIOS available for OC; give the V2 a try. The XOC didn't go well with my Seahawk X, the drivers didn't recognize the card!
Quote:


> Originally Posted by *emperium85*
> 
> I flashed T4v2 and I dont want to run @ 1.2v but when I do CTRL+F on afterburner the table goes to 1.2v how do I set this to max 1.15v ?


Choose the node in the graph you want and then press CTRL+L; the red line will turn yellow, then you know it's set! Then hit Apply in Afterburner! (The node has to be in the higher clock/voltage position.)

Quote:


> Originally Posted by *MonarchX*
> 
> Why can't GPU-Z display ASIC for my Gigabyte GTX 1080 Gaming G1? My highest OC at maxed out allowable voltage, power, etc. is only 2100Mhz. Is that normal?


ASIC quality reading is not supported on GTX 1080, W1zzard himself said so!









Quote:


> Originally Posted by *Randomocity*
> 
> Huh. I haven't have any issues with the t4 bios that was posted earlier on my FTW. Final count for my overclock is [email protected]/+725. Can't really complain.


Lucky you...









Quote:


> Originally Posted by *Vellinious*
> 
> I run mine for TimeSpy at 2202 and +575. I can run 2214, but there's no improvement in frame rates.
> I did a reboot twice...I didn't leave it off though. On the Maxwells, sometimes it'd take a few boots to take effect, so...I kinda tried that, but each time, I'd get into Windows and I could hear the cards enabling and disabling themselves, it wouldn't bring up the drivers and device manager wasn't recognizing them as 1080s.
> I might try again tonight, see if I flash and then turn the machine off for a few minutes, see if that doesn't work a little better. I'd like to see if I can get a run done at 2227+ in SLI.


Good luck man!









Quote:


> Originally Posted by *ucode*
> 
> Set all points above 1.15V to the same frequency as at 1.15V. IOW flatline frequencies from 1.15V and above.
> Never heard of T4v2. There's the strix xoc VBIOS with increased default clocks followed by the strix T4, both allowed 1.2V and disabled soft power limits.


Its the latest bios from ASUS for the Strix, i got it from Dancop or Elmor (cant recall anymore); but its floating around since August i think!

Quote:


> Originally Posted by *Outcasst*
> 
> I have been testing out this T4 BIOS, and I'm getting an average of 122FPS Heaven Run by overclocking to 2100MHz just using the offset and no voltage increase. GPU-Z indicates it's running at 1.06v.
> However whenever I try to force a voltage, say 1.1v using the curve at the exact same clock speed, the average goes down to 115FPS. Both methods aren't throttling as far as I can tell, running at a constant 2100 throughout the entire run. Am I missing something here?
> GPU isn't exceeding 52c, Founder's Edition


The only throttling I can think of in your case is temperature-related; for me, any temp above 45°C means a lower OC.

Quote:


> Originally Posted by *Dragonsyph*
> 
> Dont put liquid metal on a gpu is my opinion.


I second that! Liquid metal paste contains gallium, and gallium loves to "eat" aluminium and other metals, but its preferred prey really is aluminium!









Cheers all

Occamrazor


----------



## zlpw0ker

Actually, ASIC quality reading is supported. I have a 1080 Seahawk X, and when I right-click in GPU-Z I can click and see it says 100%.


----------



## Azazil1190

Quote:


> Originally Posted by *zlpw0ker*
> 
> actually ASIC quality reading is supported. I have a 1080 seahawk x and when I rightclick on gpu-z I can click and see it says 100%.


Which version of GPU-Z do you have?
Probably an old version. Update to the latest and you'll see:
GPU-Z doesn't read ASIC on Pascal.


----------



## Azazil1190

double post
Delete!


----------



## zlpw0ker

Quote:


> Originally Posted by *Azazil1190*
> 
> Which version of gpuz do you have?
> Probably old version.Update to the latest and youll see.
> Gpuz dosenr read asic on pascal


I have GPU-Z version 0.8.7 and I have a Pascal GPU, a 1080 Seahawk X.
But you are probably right; if I update it I won't get the ASIC score.


----------



## OccamRazor

Quote:


> Originally Posted by *zlpw0ker*
> 
> actually ASIC quality reading is supported. I have a 1080 seahawk x and when I rightclick on gpu-z I can click and see it says 100%.


The GPU-Z creator said it wasn't supported; although it reads, it just displays broken values! Older versions display values, but they're not real!

_"Ok so it could be that the ASIC quality data is read the same way as on previous cards. Older versions of GPU-Z will assume it to be a Maxwell GPU (the latest at the time).
I do postprocess the raw values to map them onto a percentage range, which, given these above 100% numbers, could mean that register location has not changed but the data format has."_

Cheers

Occamrazor


----------



## Azazil1190

Is this the T4 v2?


----------



## nrpeyton

*Just purchased a water-chiller:

Hailea water chiller Ultra Titan 1500 (HC500 = 790Watt cooling capacity)*

http://www.watercoolinguk.co.uk/p/Hailea-water-chiller-Ultra-Titan-1500-HC500-=-790Watt-cooling-capacity-UK-Plug_53532.html

*Temperatures down to 4°C are possible.* (I know I need to take condensation measures; don't worry, I plan to buy a dew-point meter.)

Normally retails for £500 ($628).

Grabbed it 2nd hand on eBay for £200. _Still can't believe it myself_ (but the seller assures me it's working, he has a 250+ rating at 100%, and I used PayPal).

Anyone got one of these can offer any advice?

My *water pump* is a high-end EK pump with following stats:
*Maximum Flow =* 1500L/hr
*Max Head Press. =* 3.9m

Will *only* be cooling an *EK full cover block* and an *EK Supremacy Evo CPU block*. (Haven't worked out yet if I'll need to disconnect the radiator due to restriction.)

Blocks are apparently both low-restriction.

*Chiller recommends a flow rate of at least 1200L/hr. EK don't sell anything that does more than 1500L/hr (which is what I have). A D5 I think*.

Anyway there is no information online about these; the only page I could find that even had a 'review' wasn't even in English.

Kind of scared about what I've just gotten myself into lol.

The chiller has been sitting unused for 3 years. But I still don't know whether the last user used it to cool a cold-water fish aquarium or for overclocking; the seller gave info suggesting both. He was an overclocker himself, but never used it; it sat in an unopened box in his house for 3 years.

If it was fish (3+ years ago), I'm seriously going to have to work out how to flush it clean and think about anti-growth measures lol.

Anyway any advice on this would be absolutely fantastic. Or even if anyone can put me into contact with someone they know they've heard has had any experience with these at all. Would be amazing 

It won't be here for a week or two. *Normally this would be the "research online and wait" time, but there's nothing available. :-(* Kinda makes me sad, because I normally enjoy looking at everything and watching all the reviews etc., lol. :-(

Anyway can't wait to see what gains I get with this lol.
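For the condensation planning above, the standard Magnus approximation gives a rough dew point: keep the coolant above it and nothing sweats. A sketch with textbook coefficients; a rough guide only, not a substitute for an actual dew-point meter:

```python
# Rough dew-point estimate via the Magnus approximation, for the
# condensation planning above. Textbook coefficients (b=17.62,
# c=243.12 degC); a rough guide, not a substitute for a dew-point meter.
import math

def dew_point_c(temp_c, rel_humidity_pct):
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + b * temp_c / (c + temp_c)
    return c * gamma / (b - gamma)

# e.g. a 21 degC room at 50% RH: coolant colder than this will sweat
print(round(dew_point_c(21.0, 50.0), 1))
```

So chilling down to 4°C is well below the dew point of a typical room; insulation or a very dry room would be needed at that coolant temperature.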

Nick Peyton


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *Just purchased a water-chiller:
> 
> Hailea water chiller Ultra Titan 1500 (HC500 = 790Watt cooling capacity)*
> 
> http://www.watercoolinguk.co.uk/p/Hailea-water-chiller-Ultra-Titan-1500-HC500-=-790Watt-cooling-capacity-UK-Plug_53532.html
> 
> *Temperatures down to 4c are possible* (I know I need to take condensation measures don't worry I plan to buy a dew point meter)
> 
> Normally retails for £500 ($628).
> 
> Grabbed it 2nd hand on Ebay for £200. _still can't believe it myself_ -- (but seller assures me its working and he has a 250+ rating at 100% and I used paypal)
> 
> Anyone got one of these can offer any advice?
> 
> My *water pump* is a high-end EK pump with following stats:
> *Maximum Flow =* 1500L/hr
> *Max Head Press. =* 3.9m
> 
> Will *only* be cooling an *EK full cover block* and an *EK Supremacy Evo CPU block*. (Haven't worked out yet if I'll need to disconnect the radiator) due to restriction.
> 
> Blocks are apparently both low-restriction.
> 
> *Chiller recommends a flow rate of at least 1200L/hr. EK don't sell anything that does more than 1500L/hr (which is what I have). A D5 I think*.
> 
> Anyway there is no information online about these; the only page I could find that even had a 'review' wasn't even in English.
> 
> Kind of scared about what I've just gotten myself into lol.
> 
> Chiller has been sitting unused for 3 years. But I still don't know yet if it was used to cool a cold water aquarium for fish by last user (or overclocking). Seller gave info on both. But was an overclocker himself. (But never used it, it sat in an unopened box in his house for 3 years).
> 
> If it was fish (3+ years ago), I'm seriously going to have to work out how to flush it clean and think about anti-growth measures lol.
> 
> Anyway any advice on this would be absolutely fantastic. Or even if anyone can put me into contact with someone they know they've heard has had any experience with these at all. Would be amazing
> 
> Won't be a week or two until its here. *Normally this would be the "research online time and wait" but there's nothing available. :-(* kinda makes me sad because I normally enjoy looking at everything and watching all the reviews etc etc lol :-(
> 
> Anyway can't wait to see what gains I get with this lol.
> 
> Nick Peyton


It'll probably yield you 40 or 50MHz without dropping frames.


----------



## Menthol

I have the Hailea 500A chiller, must be the same thing. I only use it for benching; no need for daily use.

You may find one D5 is not enough; I have always used 2 D5s in series. You can set it up how you want. Some use no radiators, as their extra surface area loses the cool temps. I use quick disconnects, leave the rads in the loop, run the rad fans on a fan controller, and turn the fans off when using the chiller; if I see dew start to form, I turn on the fans and/or change the temp on the unit.

With only one pump I would place the chiller on the desk or above the PC and keep tubing to a minimum. With dual D5s my chiller is on a furniture dolly and I can roll it around as needed. For benching I use a second power supply and turn the chiller and pumps on before benching for minimum temps, or run the PC at default idle load until temps drop. A couple of generations ago GPUs generated more heat, and CPUs needed more volts so they also generated more heat, so cooling the water first was more important; not so much now. I was recently benching [email protected] and 4 [email protected] without issue. Once you get it and have questions, just let me know; there are several others around here who use chillers too.

Some people go to great lengths to insulate the CPU socket, GPU and tubing to keep temps low and prevent moisture. I don't. Chill, bench, done is all I do.


----------



## Vellinious

It's winter....just roll the rig outside. lol


----------



## ucode

Quote:


> Originally Posted by *OccamRazor*
> 
> Its the latest bios from ASUS for the Strix, i got it from Dancop or Elmor (cant recall anymore); but its floating around since August i think!


AFAIK Dancop posted the first 1.2V strix VBIOS with higher default GPU and Mem clocks which was strix1080xoc.rom, it was not called T4. Then later Elmor posted strix1080xoc_t4.rom which was much the same except it had standard default GPU and Mem clocks which meant the video clock ran higher (closer to GPU clock) and in FS scored higher for same GPU and Mem clocks. Elmor's VBIOS is the only "T4" VBIOS I am aware of.


----------



## DStealth

Correct.
There's only one T4 BIOS


----------



## OccamRazor

Quote:


> Originally Posted by *ucode*
> 
> AFAIK Dancop posted the first 1.2V strix VBIOS with higher default GPU and Mem clocks which was strix1080xoc.rom, it was not called T4. Then later Elmor posted strix1080xoc_t4.rom which was much the same except it had standard default GPU and Mem clocks which meant the video clock ran higher (closer to GPU clock) and in FS scored higher for same GPU and Mem clocks. Elmor's VBIOS is the only "T4" VBIOS I am aware of.


Quote:


> Originally Posted by *DStealth*
> 
> Correct.
> There's only one T4 BIOS


Yes, there are only the stock, the LN2 and the T4 BIOSes! Maybe people assumed the T4 was a "v2", counting it as the second Strix BIOS after the LN2 BIOS!

Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *Menthol*
> 
> I have the Hailea chiller 500A, must be same thing
> I only use it for benching, no need for daily use, you may find one D5 is not enough, I have always used 2 D5 in series, you can set it up how you want, some use no radiators as more surface area to loose cool temps., I use quick disconnects, leave rads in loop, run rad fans on fan controller and turn fans off when using chiller, if I see dew starts to form, I turn on fans and or change temp on unit.
> With only one pump I would place the chiller on desk or above PC and keep tubing to minimum, with dual D5 my chiller is on a furniture dolly and I can role it around as needed, for benching I use a second power supply and turn the chiller and pumps on before benching for minimum temps, or run PC at default idle load until temps drop, a couple generations ago GPU's generated more heat and CPU's needed more volts so also generated more heat so cooling water first was more important, not as much now, I was recently benching [email protected] and 4 [email protected] without issue, but once you get it and have questions just let me know, there are several others around here that use chillers also
> Some people go to great lengths to insulate the CPU socket, Gpu and tubing to keep temps low and prevent moisture, I don't, Chi;;. bench, done is all I do


Thanks for the info mate; much appreciated.

Only installed my 1st loop 3 months ago, so...

Yesterday I started to read up about my 1st drain + coolant change, then it occurred to me: "I wonder what the chiller was used for?"

The seller can't remember (he bought it from eBay 3 years ago after "light use", but can't remember what the 'light use' was; he says he thinks it was already flushed "ready for use"). But that's not good? :-(

I've got some expensive EK cooling equipment here. It could have had all sorts of dirty water through it if it's been used for an aquarium.

Any idea how I could safely clean/flush it? Would I even need to, as long as I use proper EK coolant (which has biocide & corrosion-inhibitor additives) anyway?

Now that I'm thinking this over properly, it's beginning to sound like I've gotten myself into something I won't be able to maintain.

*Edit:*
Just finally found a 'review' of the HC500a for water-cooling loops here:
http://www.bit-tech.net/hardware/cooling/2010/07/20/hailea-hc-500a-water-chiller-review/4

They have roughly the same setup as me (1 CPU & 1 GPU block).

They said you need a powerful pump (but that the "Swiftech MCP655" is powerful enough).

Googled the *Swiftech MCP655* and it's weaker than my own pump:
Max Flow: 1200 L/hr
Max Head Press: 3.2M

*My pump: (D5)*
Max Flow: 1500 L/hr
Max Head Press. 3.9m

So actually looking good so far 

Just need to think about how I'm going to clean this thing out....

Does it have an internal tank of its own? I'm not going to need to fork out ridiculous amounts on EK coolant and distilled water, am I? lol

Nick


----------



## Menthol

Yes, it has an internal tank.
Use distilled white vinegar and distilled water to flush it out, then flush it with distilled water only. I use only distilled water, and now a water wetter; I do not use dyes or other additives/coolants.
One pump will work as long as the tubing is not long; keep the chiller as close to the PC as you can. Fill the chiller tank with distilled water (or your coolant of choice) before connecting the tubing, as the tank is large compared to radiators. Watch for moisture on fittings, blocks, etc.; even if it's not on the blocks, it can form on rads and tubing and drip onto components.
I have to run an extension cord to another room, because if it is on the same circuit as the rest of my system it draws too much current and the breaker pops. This may be a US thing, but most household circuits here are 15 amp, which is fine unless you overload it.

P.S.
When I purchased mine new, the tank had a leak. I opened it up and there was a crack, I assume from shipping; I used epoxy to seal the leak and it's been fine for several years now.
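The breaker-popping experience above is easy to sanity-check. A quick sketch of the arithmetic (my own, not anything posted in the thread; the 552W figure is the HC-500A "current drain" spec quoted later on, and the PC wattage is an assumed, illustrative number):

```python
# Sanity-check of running a chiller plus a bench rig on one US 15 A circuit.
# chiller_w matches the HC-500A spec sheet quoted later in the thread;
# pc_w is an assumed figure for an overclocked CPU + GPU bench rig.

def amps(watts: float, volts: float = 110.0) -> float:
    """Current drawn by a load at the given mains voltage."""
    return watts / volts

chiller_w = 552.0   # chiller compressor draw
pc_w = 900.0        # assumed bench rig draw

total_a = amps(chiller_w) + amps(pc_w)   # ~13.2 A
breaker_a = 15.0

# Breakers are generally only good for ~80% of their rating on a continuous
# load, so ~13 A on a 15 A circuit is exactly the territory where one pops.
print(f"total draw: {total_a:.1f} A on a {breaker_a:.0f} A breaker")
```

Which is why moving the chiller to a different circuit, as described above, fixes it.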


----------



## nrpeyton

Quote:


> Originally Posted by *Menthol*
> 
> Yes, it has an internal tank.
> Use distilled white vinegar and distilled water to flush it out, then flush it with distilled water only. I use only distilled water, and now a water wetter; I do not use dyes or other additives/coolants.
> One pump will work as long as the tubing is not long; keep the chiller as close to the PC as you can. Fill the chiller tank with distilled water (or your coolant of choice) before connecting the tubing, as the tank is large compared to radiators. Watch for moisture on fittings, blocks, etc.; even if it's not on the blocks, it can form on rads and tubing and drip onto components.
> I have to run an extension cord to another room, because if it is on the same circuit as the rest of my system it draws too much current and the breaker pops. This may be a US thing, but most household circuits here are 15 amp, which is fine unless you overload it.
> 
> P.S.
> When I purchased mine new the tank had a leak, I opened it up and there was a crack, I assume from shipping, I used epoxy to seal the leak and it's been fine for several years now


Okay, thanks. I'll make sure I ask the seller to tell the courier to be very careful and mark it as 'fragile'.

Have you ever performed a flush on yours?

I was thinking of doing something like this: (but could be wayyy off here):

*By blocking the "return" or "in" valve at the bottom of the pump (below), then unscrewing the top and placing the return tube into the top of it with a "water filter" wrapped around it --

I could use the vinegar/distilled water as you suggested and leave it running for 30 minutes, then swap to distilled only...?*



Edit:
Just googled 'water wetter'; what came up was a treatment for car engine coolant. (But it mentions it contains anti-corrosives etc., so it still makes sense...)


----------



## Menthol

Redline water wetter https://www.amazon.com/Red-Line-80204-Water-Wetter/dp/B000CPI5ZK/ref=sr_1_1?ie=UTF8&qid=1481433078&sr=8-1&keywords=water+wetter+redline

Doesn't take much


----------



## nrpeyton

Quote:


> Originally Posted by *Menthol*
> 
> Redline water wetter https://www.amazon.com/Red-Line-80204-Water-Wetter/dp/B000CPI5ZK/ref=sr_1_1?ie=UTF8&qid=1481433078&sr=8-1&keywords=water+wetter+redline
> 
> Doesn't take much


Never had an issue with growth?

I'm going to have to be very careful (at least in the immediate short term) as I'm unaware of its last use (even after flushing, micro-organisms are still going to exist).

I may end up doing a full drain + change every week for the first few weeks (after the initial double flush of the chiller), just to be safe.

Anyway, what I wanted to ask you is why you use the Redline water wetter instead of a modern PC loop coolant (which contains corrosion inhibitors & biocide)...? I just had a look on some forums and I can see the water wetter was very popular around 2001, when there wasn't as much around for PCs (before PC coolants contained corrosion inhibitors).
So I just wanted to know if you are using it out of habit, or because you've tried both and seen better results with the wetter? (And if you didn't have the chiller, would you still be using the wetter?)

Thank you 

Anyway, on a funnier note: I just read a reply on techpowerup.com about a guy who manipulates his chiller's thermostat to get sub-zero (with anti-freeze replacing the coolant).
I'm thinking: calm down, one step at a time lol


----------



## Menthol

I only started using the wetter recently, as suggested by Jpmboy in these forums.
Some guy on the midlifegamers forums posted a detailed article a while back on modifying chillers for sub-zero; you would then need anti-freeze in the coolant.

http://www.midlifegamers.co.uk/

You will need to search for his post, but it's a very detailed guide and worth the read.


----------



## Vellinious

I'd personally never use water wetter in a cooling loop these days. Everything's acrylic. Alcohol and acrylic don't play nice.....ever.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'd personally never use water wetter in a cooling loop these days. Everything's acrylic. Alcohol and acrylic don't play nice.....ever.


Top of my GPU block is acrylic. (So Nickel-Plated & Acrylic).

EK actually say in their FAQ you can use:
-Vinegar on their bare copper blocks (then flush with distilled)
-Vinegar on their Acetal blocks, though warm soapy water only is recommended (then flush with distilled)
-Warm water only with Nickel-Plated (then flush with distilled)
-Non-abrasive warm soapy water only on Acrylic (then flush with distilled)

They obviously wrote that FAQ knowing people may have more than one different type of block in their loop.

I think as long as you flush with distilled afterwards then you are okay.

Or "deionised" water, which is what we have in the U.K. and is apparently meant to be "even purer" than distilled.

Think I may stick to just using the EK coolant though, but still flush the chiller first with 5% vinegar and then with deionised/distilled, as Menthol suggested.

If I ever decided to try to go subzero on it I'd have to do a lot of research first (and that's only if I even have access to the thermostat, which I won't know until it's here).....

Most antifreeze is made from ethylene glycol, but it can contain other chemicals or ingredients, such as propylene glycol, ethanol, methanol and isopropyl alcohol.


----------



## Dissolution187

I just got an EVGA GTX 1080 SC. Can someone help me flash the bios so I can achieve a more aggressive OC?

Thanks so much!


----------



## nrpeyton

Quote:


> Originally Posted by *Dissolution187*
> 
> I just got an EVGA GTX 1080 SC. Can someone help me flash the bios so I can achieve a more aggressive OC?
> 
> Thanks so much!


What BIOS?

I mean which BIOS do you want to replace it with / what are you trying to flash?


----------



## Dissolution187

Quote:


> Originally Posted by *nrpeyton*
> 
> What BIOS?
> 
> I mean which BIOS do you want to replace it with / what are you trying to flash?


The bios version is 86.04.17.00.80 if that helps?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Top of my GPU block is acrylic. (So Nickel-Plated & Acrylic).
> 
> EK actually say in their FAQ you can use:
> -Vinegar on their bare copper blocks (then flush with distilled)
> -Vinegar on their Acetal blocks - but warm soapy only recommended (then flush with distilled)
> -Warm water only with Nickel-Plated (then flush with distilled)
> -Non-abrasive warm soapy water only on Acrylic (then flush with distilled)
> 
> They obviously wrote that FAQ knowing people may have more than one different type of block in their loop.
> 
> I think as long as you flush with distilled afterwards then you are okay.
> 
> Or "deionised" water is what we have in the U.K. which is apparently meant to be "even purer" than distilled.
> 
> Think I may stick to just using the EK coolant though; but still flush the Chiller 1st with 5% Vinegar then 2nd with deionised/distiled as Menthol added.
> 
> If I ever decided to try and go subzero on it I'd have to do a lot of research first (and that's only if I even had access to the thermostat -- which until its here I have no idea).....
> 
> Most antifreeze is made from ethylene glycol, but it can contain other chemicals or ingredients, such as propylene glycol, ethanol, methanol and isopropyl alcohol.


That's from the EK FAQ you were looking at. When I said they don't play nice, it's because I've had it happen: cracks start to form on the acetal where the screws go through it. Enjoy. (Water wetter contains alcohol.)


----------



## Menthol

Vellinious, you are correct. I don't care for acrylic blocks or colored dyes myself; black acetal or nickel-plated copper and no dye for me, so I should be careful with recommendations.


----------



## Vellinious

Quote:


> Originally Posted by *Menthol*
> 
> Vellinious, you are correct. I don't care for acrylic blocks or colored dyes myself; black acetal or nickel-plated copper and no dye for me, so I should be careful with recommendations


It's getting hard to find GPU blocks that don't have acetal exposed to the coolant any more.


----------



## nrpeyton

Just heard back from EK support too, and deionised isn't okay. Apparently it still has metal deposits that haven't been removed.

This complicates things, because in the U.K. distilled isn't easy to get unless you order online (and then you pay more for delivery due to the weight).


----------



## new boy

Meh, I use deionised + a biocide and it works fine for me


----------



## nrpeyton

Quote:


> Originally Posted by *new boy*
> 
> Meh, I use deionised + a biocide and it works fine for me


my equipment is all only 3 months old don't know if I want to take the risk.


----------



## new boy

That's fair enough (though I think using deionised is pretty standard stuff). Maybe just get a premix of Mayhems or something then, if you're going to pay silly money to get fluid delivered either way.

It might be worth asking at a chemist if they can get you distilled water (though they will probably have to order it).


----------



## 6u4rdi4n

Just distil it yourself. Not that hard ^^


----------



## Vellinious

You can get distilled and deionized water at the grocery store. I use distilled to flush / rinse / clean my loops, before I put in the Mayhem's coolant I'm using. I used to just use distilled + biocide, but.....decided to go with Mayhem's pastels. The difference in temps between the two is nominal and the Mayhem's looks better.


----------



## new boy

It's surprisingly hard to get hold of distilled in the UK. All we really have is deionised, at least in "real" shops.


----------



## Vellinious

Quote:


> Originally Posted by *new boy*
> 
> It's surprisingly hard to get hold of distilled in the UK. All we really have is deionised, at least in "real" shops


That's odd. I would never have guessed that something as simple as that would be hard to get ahold of. Interesting.


----------



## Derek1

Is this expensive for distilled water or do you need a special permit to buy it from these guys?

http://www.thedistilledwatercompany.com/buy-distilled-water

What about these guys

http://www.buydistilledwater.co.uk/


----------



## new boy

Well, it's like $20 delivered for a gallon from the 2nd link, more from the 1st.

I'd rather just use deionised; that's what most of us on overclockers.co.uk run, and we don't seem to have any issues *shrug*.

But I'm not saying anyone else who wants to pay extra for distilled is doing it wrong; the deionised just works for me at like 5% of the price.


----------



## nrpeyton

The problem I've got is that the chiller is so big and contains its own tank... even if I run it on a loop with a pump and a few litres of distilled for a couple of hours, I'm still just circulating the same fluid.

I'd have to do this several times to get everything out (so what are we talking, £100 on distilled by the time I've finished? lol).

All because the bloody seller can't remember what it was last used for 3 years ago.

Menthol, do you have any idea how big the tank is?
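For what it's worth, the "circulating the same fluid" worry can be put into numbers. A rough drain-and-refill dilution model (my own sketch; the tank volume and undrainable volume are assumed figures, not Hailea specs):

```python
# Rough model of repeated flushes on a chiller tank that can't be fully
# drained. Each cycle drains what it can and tops back up with clean
# distilled water, so the old fluid dilutes geometrically.
# Both volumes below are assumptions for illustration only.

tank_l = 3.8            # assumed ~1 gallon internal tank
undrainable_l = 0.5     # assumed fluid you can never siphon out

def residual_fraction(flushes: int) -> float:
    """Fraction of the original fluid remaining after n drain-and-refill cycles."""
    return (undrainable_l / tank_l) ** flushes

for n in range(1, 5):
    print(f"after {n} flushes: {residual_fraction(n):.2%} of the old fluid remains")
```

Under those assumptions, three or four flushes already leave well under 1% of whatever was in there originally, so a handful of drain/refill cycles is likely plenty.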


----------



## Menthol

Distilled water is inexpensive in grocery stores in the US. The tank is around a gallon. I purchased my unit new, but on the last flush I drained the chiller by letting it siphon, blowing into one hose and repeating.
You have an unknown, purchasing used from a seller that doesn't know what it was used for; the tank is not clear, so you cannot see the water or the water level inside.
You need 2 adapters for connecting fittings http://www.performance-pcs.com/new-hailea-connection-adapter-for-hailea-hc-250a-hc-300a-hc-500a-g1-4.html

Input voltage: 110 V
Power: 1/2 PS
Power consumption: 375 W
Cooling capacity: 790 W (suitable for extreme OC and cooling)
Recommended coolant flow: 1200-3000 L/h
Weight: 22 kg
Cooling agent: R134a
Dimensions: 475x360x490 mm (LxWxH)
Heat exchanger material: titanium
Max. compressive load: 0.8 bar
Temperature adjustment: digital read-out (precision 0.1°C), temperature setting in 1°C steps
Current demand: 2.4 A (50 Hz)
Current drain: 552 W


----------



## nrpeyton

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Just distil it yourself. Not that hard ^^


That's actually a good point.

Look, you can buy a 1 L/hr system from as little as £40 (probably even cheaper if I looked harder).

Quote:


> Originally Posted by *Menthol*
> 
> Distilled water is inexpensive in grocery stores in US, the tank is around gallon, I purchased my unit new, but last flush I drained chiller by letting it siphon and blowing in one hose and repeating


Okay thanks for the info Menthol much appreciated.

I've opened a support ticket with "Mayhems" and informed them about the larger 1-gallon tank size today, after reading your post.

They have emailed back asking if the tank is accessible, and recommended a cleaning solution. The problem is that if it's impossible to completely remove *all* liquid from the tank, the cleaning solution may not be suitable, as the "distilled only" flush afterwards may not rid the tank of *everything*. I suppose it would depend where the "inlets" and "outlets" are on the tank (bottom-middle or bottom-bottom).

I'll know more, of course, when it gets here.

But I would assume, since you had to siphon it, that the tank is not accessible?

To me that seems like a fundamental design flaw which would make "deep cleaning" impossible, i.e. it would be impossible to remove *all* the old water from the tank?

Lol, this seems to be turning into a "project"..

I'm probably going to have to go out and buy a home distillery kit, some sort of "submersible pump" I can use with the distilled, and then the most expensive bit (some sort of filter)... but finding a filter system that fits the industry-standard G1/4 fittings will probably be impossible (I doubt it exists), so a DIY filter solution is probably going to be on the agenda....

Once I'm up and running I could even offer deep cleans for clients lol....

Argh, anyway, still a headache just now. I'm probably looking too far into it (before it's even here), like I do with everything...

Anyone watch the AMD New Horizon event?

I was at work, so I'm just about to watch the YouTube playback now myself....

It's been quiet on here for a few days.


----------



## mypickaxe

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Just distil it yourself. Not that hard ^^


I wouldn't bother in the US, but in the UK: http://www.wikihow.com/Make-Distilled-Water


----------



## binormalkilla

So which GTX 1080 is the best way to go if I'm putting an EK full-coverage block on it immediately? I was planning on going with an MSI Founders Edition... any ideas? I'm getting rid of these 290Xs because I'm sick of fiddling with Crossfire. It works well in many games, but it seems like every driver update is a roll of the dice.


----------



## turtletrax

Quote:


> Originally Posted by *binormalkilla*
> 
> So which GTX 1080 is the best way to go if I'm putting an EK full coverage block on it immediately? I was planning on going with an MSI Founder's Edition...any ideas? Getting rid of these 290Xs because I'm sick of fiddling with Crossfire. It works well on many games, but it seems like every driver update is a roll of the dice.


I have two Asus Founders on Heatkiller blocks with the T4 bios, and I'm very happy. I can fold at 2100MHz for days on end with zero throttling, never going above 40c, and I can game with everything maxed and never see any bouncing clocks or throttling, at around 36c. That wasn't the case before the blocks, or the T4 bios. The two in combination are pretty potent with a good loop.

There may be people who disagree, but the power wielded in a single 8-pin and a small footprint under water is pretty amazing. Also, reference is always a snap to block up properly.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *mypickaxe*
> 
> I wouldn't bother in the US, but in the UK: http://www.wikihow.com/Make-Distilled-Water


I agree. Seems like you can get it everywhere around there.

Many other places though, like the UK or even Norway where I am, it would seem like distilled water isn't something you can get everywhere. Deionized water is easy to get, but distilled you often have to order online, which jacks up the pricing with expensive shipping. Been running deionized myself because it's cheap and easy to get, and even without any biocides or anything, the loop is nice and clean.


----------



## smicha

1080 - sorry- 14x 1080 - have me in.


----------



## emperium85

Hi guys, still nothing on customizing the GTX 1070/1080 bios? Does anyone know if somebody is developing it?

I want to use my card to its full potential like I did with my GTX 980 Ti G1: 1550/1600 core and NO GPU Boost, just steady FPS.


----------



## nrpeyton

Quote:


> Originally Posted by *6u4rdi4n*
> 
> I agree. Seems like you can get it everywhere around there.
> 
> Many other places though, like the UK or even Norway where I am, it would seem like distilled water isn't something you can get everywhere. Deionized water is easy to get, but distilled you often have to order online, which jacks up the pricing with expensive shipping. Been running deionized myself because it's cheap and easy to get, and even without any biocides or anything, the loop is nice and clean.


Deionised is fine if you want to keep your loop clean, but "clean" isn't the issue.

Deionised water still has heavy metal deposits which can corrode your expensive cooling hardware.

I checked with EK support and they confirmed 'deionised' voids the warranty.

I've done a little research on this over the last few weeks, and it seems the two "filtering techniques" used are fundamentally different.

Deionised water is treated by forcing it through a series of expensive filtering systems. _(Imagine a cloth/paper/foam water filter.)_

Distilled water is actually boiled to steam, evaporated and then collected. With this method you are literally pulling the water away from the deposits (metal deposits obviously can't turn to steam).

I've just found a great online supplier of U.K. distilled water, 100x cheaper than having to buy "Pure H2O" or other brands from computing websites. My order was for 25L for only £17.95 including delivery.

_And yes, I did say 25L lol (I have a water chiller to flush)._


----------



## nrpeyton

Quote:


> Originally Posted by *emperium85*
> 
> Hi guys, still nothing on customizing the GTX1070/1080 bios ? Does anyone know if somebody is developping it ?
> 
> I want to use my card with its full potential like I did with my GTX980Ti G1, 1550/1600 core en NO GPU boost, just steady FPS.


No, I'm afraid not mate, but have you tried the "curve" overclocking method? You can disable "GPU Boost 3.0 Dynamic Voltage Control" and lock specific voltages. Some people (including me) are getting higher overclocks. _And if you like tinkering around with it, as you did with your old card, it's got that appeal. Also, memory is scaling quite nicely on Pascal too._

Also, depending what card you have, there is a solution that may help you: you could jump onto the T4 bios bandwagon and flash it to your card. It has no power limit and unlocks voltage up to 1.2v.
Not something you would want to do without doing all your research first, though.

Quote:


> Originally Posted by *smicha*
> 
> 1080 - sorry- 14x 1080 - have me in.


Wow, is that your build?

Just received my new replacement heatsink from EVGA, _after I broke the LED cable_.

*\/ \/ old*

 *<< new*
*<< new*

*Thinking of doing a comparison article on memory & VRM temps for air vs. my EK block. Would that interest anyone except me? lol*

*I would be measuring using temperature probes in real time, since Nvidia doesn't provide for this.*


----------



## binormalkilla

Quote:


> Originally Posted by *turtletrax*
> 
> I have two Asus founders on heatkiller blocks w/t4 bios and very happy. I can fold at 2100mhz for days on end with zero throttling never going above 40c, and I can game with everything maxed and never any bouncing clocks or throttling around 36c. Wasn't the case before the blocks, or the t4 bios. Those two in combination is pretty potent with a good loop
>
> There may be people who disagree, but the power wielded in a single 8 pin and small footprint under water is pretty amazing. Also, reference is always a snap to block up properly.


Thanks for the tip. I went ahead and ordered the MSI FE with a heatkiller block and backplate. I typically prefer reference boards.


----------



## ucode

Quote:


> Originally Posted by *smicha*
> 
> 1080 - sorry- 14x 1080 - have me in.


Nice project.

It seemed a lot of the guys on OTOY are having trouble optimizing for the Octane bench; typically you should see over 170 pts for a single 1080 card when done properly. Of course, it could be the bench itself that is creating the problem in the first place.


----------



## zlpw0ker

Quote:


> Originally Posted by *smicha*
> 
> 1080 - sorry- 14x 1080 - have me in.


I'm in awe of your new rigs; that is just an insane WS board. Please document everything on YouTube.
I subbed to you.
What kind of mobos did you pick out that support 7 GPUs with dual Xeons?


----------



## owikhan

Looking for the best fan curve settings for my ZOTAC GTX 1080 AMP Edition..


----------



## DamiNQN

Just finished my rig with 1080 SLI:


----------



## turtletrax

Quote:


> Originally Posted by *binormalkilla*
> 
> Thanks for the tip. I went ahead and ordered the MSI FE with a heatkiller block and backplate. I typically prefer reference boards.


I bought two 1080 Strix and was seriously underwhelmed, and the size difference was not worth the hassle in my Murderbox. Returned them and went Founders because I had decided to watercool again (not sure why I figured I could get away without it, TBH).

Ended up getting terrible samples from Watercool and had to spend about 20 hours sanding and polishing the plexi, and UPS beat the tar out of the backplates in shipping, but it was all worth it. They look amazing, perform like crazy and are cool as a cucumber. Hope you are as satisfied as I am. Just remember to flash the T4 bios, unless you can't live with the loss of one DisplayPort.


----------



## binormalkilla

Quote:


> Originally Posted by *turtletrax*
> 
> I bought two 1080 strix and was seriously underwhelmed, and the size difference was not worth the hassle in my Murderbox. Returned them and went founders because I had decided to watercool again (not sure why I figured I could get away without TBH).
> 
> Ended up getting terrible samples from Watercool and had to spend about 20 hours sanding and polishing the plexi, and ups beat the tar out of the back plates in shipping, but it was all worth it. They look amazing, perform like crazy and are cool as a cucumber. Hope you are as satisfied as I am. Just remember to flash to t4 bios unless you can't live with the loss of one Displayport.


I installed everything last night, and so far I'm pleased. I'm getting a 7F temperature delta while loaded. I removed 3 290Xs from the same loop when I installed the 1080, so I have an absurdly over-provisioned cooling setup. I went with the acetal block and I'm pleased with the results, despite normally going with EK for all of my water cooling.

So now it's time to flash the T4 bios and begin overclocking!


----------



## turtletrax

Quote:


> Originally Posted by *binormalkilla*
> 
> I installed everything last night, and so far I'm pleased. I'm getting a 7F temperature delta while loaded. I removed 3 290Xs with the same loop when I installed the 1080, so I have an absurdly over provisioned cooling setup
> 
> 
> 
> 
> 
> 
> 
> . I went with the acetal block and I'm pleased with the results, despite normally going with EK for all of my water cooling.
> 
> 
> 
> So now it's time to flash the t4 bios and begin overclocking!


Wow, yeah, I think you should be pretty set with that loop.

Run Afterburner, expand the graph, and run Time Spy on the stock bios to see what your power limiter and voltage limiter are doing. One of my cards would always voltage-limit no matter what was going on, and I would start losing clock steps like crazy, even with 36c load temps; clocks would be bouncing all over the place. Flashed the T4 and it was completely gone: everything on the graph was straight lines after temps normalized. I was so happy. Interested to see if you have anything similar.

Now my Acer X34 sings at 100FPS in just about everything I throw at it. It's bliss.


----------



## Vellinious

I need more volts....thinking about giving the T4 bios a run again. Got the temps figured out. Winter is good. lol


----------



## Martin778

I'm more amazed by the lack of perfcap


----------



## Widde

I've got an EVGA 1080 FTW on the way ^_^ It's going to feel great to throw out the leaf blower I have right now (R9 290 reference).

My first team green card since my GTS 250 ^^ Hoping for a better and more silent experience.


----------



## nrpeyton

Quote:


> Originally Posted by *DamiNQN*
> 
> Just finish my rig whith 1080 Sli:


Looks nice, I like it. Jealous mine doesn't look so aesthetically pleasing; it still performs OK though. lol

Quote:


> Originally Posted by *turtletrax*
> 
> I bought two 1080 strix and was seriously underwhelmed, and the size difference was not worth the hassle in my Murderbox. Returned them and went founders because I had decided to watercool again (not sure why I figured I could get away without TBH).
> 
> Ended up getting terrible samples from Watercool and had to spend about 20 hours sanding and polishing the plexi, and ups beat the tar out of the back plates in shipping, but it was all worth it. They look amazing, perform like crazy and are cool as a cucumber. Hope you are as satisfied as I am. Just remember to flash to t4 bios unless you can't live with the loss of one Displayport.


why would you sand and polish the plexi? ~confused~

Quote:


> Originally Posted by *Vellinious*
> 
> I need more volts....thinking about giving the T4 bios a run again. Got the temps figured out. Winter is good. lol


Quote:


> Originally Posted by *Martin778*
> 
> I'm more amazed by the lack of perfcap


Aye, it would have been better to have had GPU-Z set to "maximum" to give a better idea of what you were peaking at in the run (especially when you're talking about voltage) lol

Quote:


> Originally Posted by *Widde*
> 
> I've got a Evga 1080 ftw on the way ^_^ Going to feel great to throw out my leaf blower I have right now (R9 290 Reference)
> 
> My first team green card since my gts250 ^^ Hoping for a better and more silent experience


welcome to the club


----------



## nrpeyton

Quote:


> Originally Posted by *owikhan*
> 
> Looking for best Curve Fans settings for my ZOTAC GTX 1080 AMP EDiTION..


Best fan curve for me (before I went water) was simply:
something ridiculously low at idle (15% maybe)...

then 100% as soon as temps hit 45c.

So:
0c to 39c - 15%
40c - 50%
41c - 60%
42c - 70%
43c - 80%
44c - 90%
45c - 100%
46c to 94c - 100%

That curve is strictly a *performance only* curve (with no care towards noise at all).

You can adjust it to start 5c higher or lower, as some fans take up to 6-7 seconds to reach full speed (and in that time you could pick up 10c which you won't recover until you restart your 3D app).

I maxed out at 55c-58c in prolonged gaming on a cold day.

Now with water I get to 45c (but only because I don't have enough radiator space). If I had enough radiator coverage I'd be maxing out at 35-38c.

The water chiller will be here soon though; can't wait ;-)
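The stepped curve above can be written down as a simple lookup. This is just a toy sketch of the numbers in the post (a real card would be driven through Afterburner's curve editor, not this function):

```python
# Stepped "performance only" fan curve from the post: flat 15% while cool,
# then a hard ramp to 100% between 40c and 45c, pinned at 100% thereafter.
# Breakpoints are (temperature threshold in C, fan duty in percent).

CURVE = [(39, 15), (40, 50), (41, 60), (42, 70), (43, 80), (44, 90), (45, 100)]

def fan_duty(temp_c: float) -> int:
    """Return the fan duty for a GPU temperature using the stepped curve."""
    duty = CURVE[0][1]          # idle duty below the first threshold
    for threshold, pct in CURVE:
        if temp_c >= threshold:
            duty = pct          # take the highest threshold we've crossed
    return duty

for t in (30, 42.5, 60):
    print(f"{t}c -> {fan_duty(t)}%")
```

Temperatures between the listed steps fall back to the step below them, matching how Afterburner interpolates a staircase-shaped curve when the points sit this close together.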


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> looks nice, I like it. jelous mine doesn't look so aesthetically pleasing. still performs ok though. lol
> why would you sand and polish the plexi? ~confused~
> 
> aye, would of been better to of had GPU-Z set to "maximum" to give a better idea of what you were peaking at in the run (especially when ur talking about voltage) lol
> welcome to the club


Stock voltage....won't go higher than 1.093v.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Stock voltage....won't go higher than 1.093v.


Aye lol, I thought so. The courier picks up my chiller from London tomorrow to deliver it to me (in Scotland).

Just hope aquatuning.co.uk don't take forever to deliver all my new fittings, tubing and flushing equipment (it's a U.K. website with a .co.uk suffix, but everything is shipped from Germany).

The U.K. is **** for computer geeks; most of my friends here are only into football and Xbox (or PlayStation 4), and you have to order everything online in the U.K.

Our biggest PC retailer is a company called "PC World", and they are so pathetic you can't even go in and buy thermal paste. lol

It's all just laptops and OEM systems, maybe a wireless router or a few HDMI cables... but definitely no speciality/enthusiast stuff whatsoever.


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday the 19th to the 21st, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726

later
lanofsong


----------



## owikhan

Quote:


> Originally Posted by *nrpeyton*
> 
> Best fan curve for me (before I went water) was simply:
> something ridiculously low at idle (15%) maybe...
> 
> then 100% as soon as temps hit 45c.
> 
> so
> 0 C to 39C - 15%
> 40c -50%
> 41c - 60%
> 42c - 70%
> 43c - 80%
> 44c - 90%
> 45c - 100%
> 46c - 94c - 100%
> 
> That curve is strictly a *performance only* curve (with no care towards noise at all).
> 
> You can adjust it to start 5c higher or 5c lower as some fans take up to 6/7 seconds to reach full speed (and in that time you could lose 10c which you won't recover until you restart your 3d app).
> 
> I maxed out at 55c-58c on prolonged gaming on a cold day.
> 
> Now with water I get to 45c (but only because I don't have enough radiator space). If I had enough radiator coverage I'd be maxing out at 35-38c.
> 
> Water Chiller will be here soon though; can't wait ;-)


Can you share a screenshot of the fan curve settings please?


----------



## tfam26

Successfully flashed the T4 bios to my MSI GTX 1080 Gaming X, and my voltage is still locked at stock voltages. Tried all the options in Afterburner to unlock voltage, moved the voltage slider to +100%, set power target/temp to max unlinked, etc. etc.... is there something I'm missing here? This is using a manual overclock based on offsetting the core clock, and I have not touched memory... do I need to use the curve in order to up the voltage, perhaps?

I also notice most people posting a hard cap on the voltage @ 1.09, but mine has never gone above 2126mhz @ 1.071, even in the most extreme gaming/bench situations, with the T4 or stock bios. If I try to bump it up over 2126mhz, the voltage does not increase and I get artifacts, then eventual crashes. Any help?


----------



## owikhan

I tried this myself on my ZOTAC 1080 AMP Edition.
Is it alright?


----------



## SirCanealot

Quote:


> Originally Posted by *owikhan*
> 
> I tried this myself on my ZOTAC 1080 AMP Edition.
> Is it alright?


It's really up to you what noise level/temperatures you want to put up with. Technically the best setting for the fan is either 100% or 0%, depending on your perspective. In my opinion, fan speeds (depending on the card) start to get abrasive around 55-75%, so I would advise a fan curve that hits around there at the temps you want to hit. (E.g. the Accelero IV fans are only 2000 RPM, so they make basically no noise at 100%.) If you're interested, I'm generally happier with a 'fixed' amount of noise, so I just set the fan % to whatever I can put up with at the time, depending on how noisy the game is, etc.


----------



## Tdbeisn554

So after lots of issues, problems and RMA's I finally have my GTX 1080 back, good LED's and minimal coil whine







I have exams soon but after that I will probably be a bit more active with overclocking my rig and my 1080







Will post results


----------



## Widde

What seems to be the average overclock that people achieve on these cards? I'm keeping my hopes low, since I've lost the lottery every single time I've bought silicon


----------



## nrpeyton

Quote:


> Originally Posted by *owikhan*
> 
> Can you share a screenshot of your fan curve settings please?


[*full size* - right click & open new tab]
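If the screenshot doesn't load for anyone, the stepped curve from my earlier post can be sketched in a few lines of Python (purely illustrative; only the temperature breakpoints come from that post):

```python
def fan_speed(temp_c: float) -> int:
    """Return fan duty (%) for a GPU temperature, following the
    aggressive performance-only curve described earlier: 15% at idle,
    ramping hard to 100% by 45 C."""
    curve = [
        (39, 15),  # 0-39 C: quiet idle
        (40, 50),
        (41, 60),
        (42, 70),
        (43, 80),
        (44, 90),
    ]
    for threshold, duty in curve:
        if temp_c <= threshold:
            return duty
    return 100  # 45 C and above: flat out

print(fan_speed(42))  # -> 70
```

Shift every breakpoint up or down 5 C to taste, as mentioned, since some fans take 6-7 seconds to reach full speed.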


----------



## Randomocity

Quote:


> Originally Posted by *Widde*
> 
> What seems to be the average overclock that people achieve on these cards? I'm keeping my hopes low, since I've lost the lottery every single time I've bought silicon


The average overclock is somewhere between 2000-2100 MHz. Some cards can hit somewhere between 2100-2214 MHz once they're under water. What card do you have, and what does your overclock look like?


----------



## Widde

Quote:


> Originally Posted by *Randomocity*
> 
> The average overclock is somewhere between 2000-2100 MHz. Some cards can hit somewhere between 2100-2214 MHz once they're under water. What card do you have, and what does your overclock look like?


I don't have it in my hands yet ^^ Got an EVGA FTW on the way ^_^ Getting it tomorrow


----------



## Randomocity

Quote:


> Originally Posted by *Widde*
> 
> I don't have it in my hands yet ^^ Got an EVGA FTW on the way ^_^ Getting it tomorrow


Enjoy, it's a great card and I've been super happy with mine. I passed the 2200 MHz barrier with the T4 BIOS that's floating around the thread, but my card seems to be happiest at 2189 MHz. The biggest FPS increase was honestly the memory overclock; I got nearly 8 extra FPS from boosting it +725 MHz.


----------



## Widde




----------



## nrpeyton

Quote:


> Originally Posted by *Randomocity*
> 
> Enjoy, it's a great card and I've been super happy with mine. I passed the 2200 MHz barrier with the T4 BIOS that's floating around the thread, but my card seems to be happiest at 2189 MHz. The biggest FPS increase was honestly the memory overclock; I got nearly 8 extra FPS from boosting it +725 MHz.


Quote:


> Originally Posted by *Vellinious*


;-)


----------



## nrpeyton

Water Chiller finally came!

Just *waiting* on my new fittings/tubing and flushing equipment arriving from Germany.

Hailea HC-500A (Ultra Titan 1500)
-Cooling capacity 790W
-Temperatures down to 4 degrees C





*25 litres of distilled lol:*


----------



## ciccoman

I'm looking for the BIOS of the Galax HOF 1080 *Limited Edition*... could you kindly check your sources?

I would try flashing it on my HOF to see if the boost stages are different.

Thanks


----------



## nrpeyton

Quote:


> Originally Posted by *ciccoman*
> 
> I'm looking for the BIOS of the Galax HOF 1080 *Limited Edition*... could you kindly check your sources?
> 
> I would try flashing it on my HOF to see if the boost stages are different.
> 
> Thanks


I was looking for that BIOS to flash to my Classified.

Couldn't find it on the techpowerup GPU database and I also asked at the GALAXY HOF Owners Club (but none of them had it, most of those guys are still running 980 TI's).

You could try reaching out to GALAX for it (just say you're a GALAX customer lol).

If you get hold of it, could you please PM me the file? A million thanks in advance 

I'll have another go at sourcing it too 

Thanks,

Nick Peyton

P.S.

My Classified isn't compatible with many other cards' BIOSes due to its massive 14-phase overkill VRM lol, but it *is* compatible with the HOF (as that's also an overkill card).


----------



## Widde

Got the card now







My god what a difference









Games are actually fluid now ^^


----------



## ciccoman

Quote:


> Originally Posted by *nrpeyton*
> 
> I was looking for that BIOS to flash to my Classified.
> 
> Couldn't find it on the techpowerup GPU database and I also asked at the GALAXY HOF Owners Club (but none of them had it, most of those guys are still running 980 TI's).
> 
> You could try reaching out to GALAX for it (just say your a GALAX customer lol).
> 
> If you get ahold of it, could you please PM me the file?
> 
> A million thanks if you find it, and you can PM me
> 
> I'll have another go at sourcing it too
> 
> Thanks,
> 
> Nick Peyton
> 
> ~P.S~.
> 
> My Classified isn't compatible with many other cards BIOS's due to massive 14-phase overkill VRM lol but *is* compatible with the HOF (as it is also an overkill card).


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> Water Chiller finally came!


Dude... I really hope you are not considering using this with your Classified...


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Dude... I really hope you are not considering using this with your Classified...


why not?

CPU and GPU are in the same loop, so they'll both be cooled by the chiller instead of the radiator 

...still waiting on all my new fittings and tubing etc. coming from Germany.

Should be here tomorrow by the time I finish work. Can't wait ;-)


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> why not?
> 
> CPU and GPU are in the same loop, so they'll both be cooled by the chiller instead of the radiator
> 
> ...still waiting on all my new fittings and tubing etc. coming from Germany.
> 
> Should be here tomorrow by the time I finish work. Can't wait ;-)


But you will unclassify the card...


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> But you will unclassify the card...


How will I lol?

You talking about condensation?


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> How will I lol?
> 
> You talking about condensation?


You really don't get it?







Take a good look at the bottle of distilled water haha!


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> You really don't get it?
> 
> 
> 
> 
> 
> 
> 
> Take a good look at the bottle of distilled water haha!


hahah oh aye,

I get it now lol, **** sake...

thought you were talking about the chiller lol

P.S.

I'm chomping at the bloody bit here wish this bloody Aquatuning order with fitments would hurry up.

Can't believe I'm so excited about what is effectively a ******* fridge lol.

I need to get out more lol

Anyway, how is your 1080 doing now lol, did you get everything sorted out?


----------



## DarknightOCR

After mounting the EK block, my 1080 FTW peaks at 34°C; in benchmarks and in games like BF1 it stays at 31-32°C for hours.

I was able to set a curve that gives me 2152 MHz at stock voltage (1.062 V).
Original BIOS; I haven't tested the T4 since assembling the block.
That's good for me.


----------



## nrpeyton

Quote:


> Originally Posted by *DarknightOCR*
> 
> After mounting the EK block, my 1080 FTW peaks at 34°C; in benchmarks and in games like BF1 it stays at 31-32°C for hours.
> 
> I was able to set a curve that gives me 2152 MHz at stock voltage (1.062 V).
> Original BIOS; I haven't tested the T4 since assembling the block.
> That's good for me.


That's good.

Are you able to check memory and VRM temperatures?

And don't forget the memory overclock 

some people are getting nearly 5-8% extra FPS from memory alone


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> hahah oh aye,
> 
> I get it now lol, **** sake...
> 
> thought u were talking about the chiller lol
> 
> P.S.
> 
> I'm chomping at the bloody bit here wish this bloody Aquatuning order with fitments would hurry up.
> 
> Can't believe I'm so excited about what is effectively a ******* fridge lol.
> 
> I need to get out more lol
> 
> Anyway how is your 1080 going now lol u got everything sorted out?


Yep, after a combined time of almost 2 months without a card, lots of emails and 4(!) RMAs...
My new card is pretty good now: minimal coil whine, good LEDs, and better OC capability than my first GTX lemon lol. Gonna do some testing and benchmarks to see what clocks I can get, but the one time I ran Heaven it boosted to 2025 MHz at stock without my touching anything, so that's good. Pretty sure I can get at least 2100.








But I'm gonna do some more research and reading into GPU and specifically Pascal overclocking, since it's the first card I'll overclock seriously. I never really overclocked my GTX 770, but it would be a shame to run a Classy at stock speeds.


----------



## DarknightOCR

I'd have to attach a sensor there, or use an IR thermometer, to see the temperature of the VRMs.

From what I tested, the memory gains come between +505 MHz and +575 MHz.
I can push it without artifacts up to roughly +875 MHz, but it doesn't give me any increase in performance beyond +575 MHz.


----------



## Mayor Winters

Quote:


> Originally Posted by *DarknightOCR*
> 
> I'd have to attach a sensor there, or use an IR thermometer, to see the temperature of the VRMs.
> 
> From what I tested, the memory gains come between +505 MHz and +575 MHz.
> I can push it without artifacts up to roughly +875 MHz, but it doesn't give me any increase in performance beyond +575 MHz.


Yeah, please do; I'm very interested in knowing that, because I might get a 1080 FTW (after RMA'ing a discontinued 980 Ti Hydro Copper plus EK waterblock), and I want to know whether the VRM issue is fixed with the block.


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Yep, after a combined time of almost 2 months without card, lots of emails and 4! RMA's...
> My new card is pretty good now, minimal coil whine, good leds, and better OC capabilities then my first GTX lemon lol. Gonna do some testing and benchmarks to see what clocks I can get, but time I ran heaven it boosted to 2025 on stock without doing anything so that is good. Pretty sure I can get at least 2100
> 
> 
> 
> 
> 
> 
> 
> 
> But gonna do some more research and reading into gpu and specifically Pascal overclocking, since its the first card I will overclock seriously. Never really overclocked my GTX 770. But it would be a shame to run a classy on stock speeds


Glad to hear EVGA squared you up properly in the end .









and aye, it would be a shame, especially on a Classy lol.

Once you've broken her in (and got a good feel for her), I'd consider putting her under water lol. They always perform better "wet". lol

Nah, seriously, they do haha


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> Glad to hear EVGA squared you up properly in the end .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and aye; it would be a shame especially on a classy lol.
> 
> Once you've broken her in (and got a good feel for her), I'd consider putting her under water lol. They always perform better "wet". lol
> 
> Nah seriously they do haha


Hahahaha








I was thinking of going with a custom loop for my new build, but I didn't, because if I'm going to do it I want to do it right, with good high-end parts, and my budget went into a GTX 1080 so... Maybe the next upgrade will be a custom loop


----------



## nrpeyton

Quote:


> Originally Posted by *Archang3l*
> 
> Hahahaha
> 
> 
> 
> 
> 
> 
> 
> 
> I was thinking on going with a custom loop with my new build, but I did not do it because if I want to do it I want to do it good, with good high end parts and my budget went into a GTX 1080 so... Maybe next update will be custom loop


The EK kits are quite good; I got EK to start supporting the 1080 Classified with their older 780 Ti block (don't know if you followed the posts).

Here it is: http://forum.kingpincooling.com/showthread.php?t=3938

Also, if you're going EK, don't use the configurator, coz it will cost you twice as much; go for one of the kits, for example the EK Kit P360, then buy the GPU block separately (you'll save a lot of money, and the parts are pretty much exactly the same as their highest-end ones).

Guys,

Anyone know whether this *thermal pad is electrically insulating or electrically conductive?* (Can't seem to get a straight answer from the seller.)
*http://www.ebay.co.uk/itm/262746215254*

I want to replace my GPU pads.

----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> The EK kits are quite good, I got EK to start supporting the 1080 Classified with their older 780TI block (don't know if you followed the posts).
> 
> Here's its: http://forum.kingpincooling.com/showthread.php?t=3938
> 
> Also if you're going EK dont use the configurator coz it will cost you twice as much; go for one of the kits. For example the EK Kit P360. Then buy the GPU block seperately (you'll save a lot of money and all the parts are pretty much exactly the same as their highest end ones).
> 
> I want to replace my GPU pads.


Yeah I followed it, and read your posts. I was looking at EK stuff too, but I kinda like to pick my own parts and stuff







, but still thanks for the advice.

You good luck too with your GTX 1080 Unclassified Xp


----------



## Randomocity

Quote:


> Originally Posted by *nrpeyton*
> 
> The EK kits are quite good, I got EK to start supporting the 1080 Classified with their older 780TI block (don't know if you followed the posts).
> 
> Here's its: http://forum.kingpincooling.com/showthread.php?t=3938
> 
> Also if you're going EK dont use the configurator coz it will cost you twice as much; go for one of the kits. For example the EK Kit P360. Then buy the GPU block seperately (you'll save a lot of money and all the parts are pretty much exactly the same as their highest end ones).
> 
> Guys,
> 
> Anyone know if this *Thermal Pad is electrically insulating or electrically conductive?:* (can't seem to get a straight answer from the seller).
> *http://www.ebay.co.uk/itm/262746215254*
> 
> I want to replace my GPU pads.


Thermal pads are never electrically conductive; otherwise they'd short whatever they touch.

That being said, I like the idea of the kit, but I really love having all the parts exactly as I want 'em. Besides, I'll take my GTs any day over Vardars.


----------



## ucode

Quote:


> Originally Posted by *Randomocity*
> 
> Thermal pads are never electrically conductive; otherwise they'd short whatever they touch.


That's not true.

Example: http://tennmaxusa.com/wp-content/uploads/2015/06/MaxTherm-GP-CP5000-TDS.pdf

While it's not the norm, it is certainly a good question for an unknown pad. IMHO it's probably best to check the manufacturer's website if in doubt; if the manufacturer is unknown, an ohmmeter may be the next best thing.


----------



## Randomocity

Quote:


> Originally Posted by *ucode*
> 
> That's not true
> 
> Example http://tennmaxusa.com/wp-content/uploads/2015/06/MaxTherm-GP-CP5000-TDS.pdf
> 
> While it's not the norm it is certainly a good question for an unknown. IMHO probably best to check on the manufacturers website if in doubt or if manufacturer unknown then a Ohm meter may be the next best thing.


Some clarification: thermal pads that you stick into a computer SHOULD NOT be electrically conductive; a conductive pad is how you bridge connections on your GPU and short it. There are applications in the electrical-engineering world where a pad can be both thermally and electrically conductive; however, the pads on your GPU aren't one of them.


----------



## nrpeyton

Quote:


> Originally Posted by *Randomocity*
> 
> Some clarification, thermal pads that you stick into a computer SHOULD NOT be electrically conductive. That's how you bridge connections on your gpu and short it. There are applications in the electrical engineering world that can be both thermally and electrically conductive, however the pads on your gpu aren't one of them.


Could a multimeter be used to tell?

The ones I bought weren't listed for use on GPUs.

They were on eBay, but the 'use scenario' wasn't listed.

And they weren't listed as electrically insulating, and the seller wouldn't give me a straight answer.

Even a little conductivity might not cause an immediate short, but could it cause small long-term damage if they are conductive at all...?


----------



## pantsoftime

Quote:


> Originally Posted by *nrpeyton*
> 
> would a multimeter be able to be used to tell ?
> 
> The ones I bought weren't listed for use on GPU's.
> 
> They were on Ebay but the 'use scenario' wasn't listed.
> 
> And they werne't listed as electrically insulating, and the seller wouldn't give me a straight answer.
> 
> even a little conductivity might not cause an immediate short but could cause small longterm damage if they are conductive at all....?


The materials typically used for thermal pads aren't conductive unless they're designed specifically for that purpose, such as EMI shielding; in those cases they add metal particles to the material. A DMM can measure conductivity if you put it in resistance mode and measure across the pad. I'm betting you're going to be fine, since the conductive ones usually cost a lot as well.
Edit: spelling


----------



## nrpeyton

Quote:


> Originally Posted by *pantsoftime*
> 
> The materials typically used for thermal pads aren't conductive unless they're designed specifically for that purpose such as EMI shielding. In these cases they add metal shavings to the material. A DMM can measure conductivity if you put it in resistance mode and measure across the pad. I'm betting you're going to be fine since the conductive ones usually cost a lot as well.
> Edit: spelling


Thanks, I'm having a look at that just now (with my multimeter).

So glad I bought it -- never thought I'd need it again after I bought it to probe about on my GPU, but here I am lol 

Edit: (result)

The pads are fine, so I'm lucky then. Only paid about £2.50 (~$3) for 10 cm x 10 cm, 3 W/mK pads 

GPU manufacturers use 1 W/mK, and even EK only use 3.2 W/mK.

I really don't see the point in more expensive pads (14 W/mK etc); there's no measurable difference.
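To put rough numbers on the pad debate, here's a one-dimensional conduction sketch in Python. The wattage, contact area and thickness are made-up illustrative figures, not measurements from my card:

```python
def pad_delta_t(power_w, thickness_m, area_m2, k_w_per_mk):
    """Steady-state temperature drop across a thermal pad,
    treating it as a 1-D slab: delta-T = Q * t / (k * A)."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Assumed numbers: 20 W of VRM heat through a 10 cm^2 contact, 1 mm pad
for k in (1.0, 3.0, 7.0, 12.0, 17.0):
    dt = pad_delta_t(20, 1e-3, 10e-4, k)
    print(f"{k:4.1f} W/mK -> {dt:5.1f} C drop across the pad")
```

Under those assumed numbers the jump from 3 to 17 W/mK is worth a few degrees at the pad itself, which may or may not show up at the core sensor; change the assumptions and the answer changes.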


----------



## GraphicsWhore

Was gifted an EVGA Hybrid for Christmas by my fiancée. Going to try and convince her to let me use it tomorrow night.

Coming from an MSI Gaming 6G 980 Ti. Great card, but it gets noisy, so I'm looking forward to something quieter. Still using my Antec Skeleton open-air case, and I think the fan will fit perfectly in the would-be media-drive section of the case.


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Did you try unplugging the PSU?
> Sometimes when switching from Master to Slave with the FTW's you need to do that for it to take effect.


I was about to post that...Here I need to do that whenever I switch BIOS, otherwise it would still read
Quote:


> Originally Posted by *GraphicsWhore*
> 
> 
> 
> Was gifted an EVGA Hybrid for Christmas by fiancee. Going to try and convince her to let me use it tomorrow night.
> 
> Coming from a MSI Gaming 6G 980Ti. Great card but gets noisy so I'm looking forward to something quieter. Still using my Antec Skeleton open-air case and I think the fan will fit perfectly in the would-be media drives section of the case.


Why not try swapping the rad fans for some quiet ones in push/pull? Some Noctua NF9s or even the cheap Arctic F12s in a push/pull setup should be pretty quiet under load and inaudible at idle.


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> Pads are fine, so I'm lucky then.


More a case of you would have been unlucky if they were conductive or low quality, I think. IMO no need to leave it to chance though, if you can check









FYI, the link you gave does say they are electrically insulating, going by the breakdown voltage of 1.5 kV and the resistivity of >1017. It's eBay though, so personally I'd steer clear myself. The 1017 is probably supposed to be 10 to the power of 17, as 1017 would be pretty poor, but 10^17 is about 1,000 to 1,000,000 times better than most others :s
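To sanity-check that, through-thickness resistance is just R = ρ·t/A. A quick Python sketch (assuming the listing quotes volume resistivity in Ω·cm, as pad datasheets usually do, and a hypothetical 10 cm × 10 cm pad 1 mm thick):

```python
def pad_resistance_ohms(resistivity_ohm_cm, thickness_cm, area_cm2):
    """Through-thickness resistance of a pad: R = rho * t / A."""
    return resistivity_ohm_cm * thickness_cm / area_cm2

# 10 cm x 10 cm pad, 1 mm (0.1 cm) thick
literal = pad_resistance_ohms(1017, 0.1, 100)   # "1017" read literally
intended = pad_resistance_ohms(1e17, 0.1, 100)  # read as 10^17
print(f"1017 ohm-cm  -> {literal:.2f} ohm")
print(f"10^17 ohm-cm -> {intended:.1e} ohm")
```

So read literally the pad would be roughly a 1 Ω resistor, while 10^17 Ω·cm puts it around 10^14 Ω, a thorough insulator; that supports the "typo for a power of ten" reading.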


----------



## Dragonsyph

Quote:


> Originally Posted by *juniordnz*
> 
> I was about to post that...Here I need to do that whenever I switch BIOS, otherwise it would still read
> Why not try changing the rad fans for some push/pull with quiet fans? Some Noctua's NF9 or even the cheap Arctic's F12 on a push pull setup should be pretty quiet under load and inaudible on idle.


I don't know if it's the same stock fan as my card, but I don't think I've ever heard the fan, and it's just to the left of me on a table, no case.


----------



## arrow0309

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pantsoftime*
> 
> The materials typically used for thermal pads aren't conductive unless they're designed specifically for that purpose such as EMI shielding. In these cases they add metal shavings to the material. A DMM can measure conductivity if you put it in resistance mode and measure across the pad. I'm betting you're going to be fine since the conductive ones usually cost a lot as well.
> Edit: spelling
> 
> 
> 
> Thanks I'm having a look at that just now.(With my multimeter)
> 
> So glad I bought it -- never thought I'd need it again after I bought it to probe about on my GPU but here I am lol
> 
> Edit: (result)
> 
> Pads are fine, so I'm lucky then. Only paid about £2.50 (3 dollars)? for 10cm by 10cm 3 w/mk pads
> 
> GPU manufacturers use 1 w/mk and even EK only use 3.2 w/mk.
> 
> *Really don't see the point in more expensive pads (14 w/mk etc) there no measurable difference.*

The point (really) is how much you value proper cooling of your VRM (MOSFETs), since they usually get hotter than the GPU itself.
Of course, better cooling involves not only better pads (like those of "at least" 12 W/mK, or even Fujipoly's famous 17 W/mK) but also better (liquid) cooling, since the extra heat to dissipate will add a few degrees to your liquid temp.
Still, it's recommended; the VRM (and not only that, the whole card) will thank you, and if you have a nice WC setup there won't be a problem at all.
Just think how much care we take when choosing the "world's best" thermal grease for our GPUs (I used to even use CLP and CLU, but ended up with Thermal Grizzly's Kryonaut).
So why do we use only rubbish for cooling our VRMs? Only because they never really show their temp (not any more, since cards aren't using expensive, I2C-capable PWMs these days), or because MOSFETs tend to be certified for even higher temps than GPUs?
That won't work for me, thanks.

So I wouldn't recommend anything lower than those Phobya (7 W/mK) pads; these Gelid 12 W/mK ones are IMO even better:

http://www.ebay.co.uk/sch/i.html?_from=R40&_trksid=p2050601.m570.l1311.R1.TR1.TRC0.A0.H0.Xgelid+thermal+.TRS0&_nkw=gelid+thermal+pad&_sacat=0


----------



## xTesla1856

What options do I have for BIOS reflashes on my EVGA Founder's with the stock air cooler? The highest clock I've seen so far is 2152 MHz while the card is cool, until it starts thermal throttling down to about 2088 MHz. During gameplay it hovers between 2050 and 2101 MHz. Memory is at +500 at all times. I'm very happy with the card, as it seems I was very lucky in the silicon lottery. The voltage and power limit are pegged during gameplay, though. I wonder if a water block would help


----------



## Jedi Mind Trick

Thoughts on passive cooling for the VRAM/VRM? I got a decent deal on the Zotac AMP Extreme, but the cooler is slightly too big for my case, so I threw a universal block on the card. Core temps seem good so far (50°C peak), but I can't see the VRM temps.

Just realized I said "aio" block; I meant universal block.


----------



## xartic1

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> Thoughts on passive cooling on the VRAM/vrm? I got a decent deal on the zotac amp extreme, but the cooler it slightly to big for my case so I threw an AIO block on the card. Core temps seem good so far (50*C peak), but I can't see the vrm temps.


What block did you put on exactly?

If anything, I would point a fan on the card to have the comfort of knowing there is air passing over them.


----------



## Jedi Mind Trick

Quote:


> Originally Posted by *xartic1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jedi Mind Trick*
> 
> Thoughts on passive cooling on the VRAM/vrm? I got a decent deal on the zotac amp extreme, but the cooler it slightly to big for my case so I threw an AIO block on the card. Core temps seem good so far (50*C peak), but I can't see the vrm temps.
> 
> 
> 
> What block did you put on exactly?
> 
> If anything, I would point a fan on the card to have the comfort of knowing there is air passing over them.

Koolance GPU 210. And yea, I have a spare 92mm fan I can aim at it. Figured I should be fine, but wanted to ask.

Thanks!


----------



## xartic1

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> Koolance GPU 210. And yea, I have a spare 92mm fan I can aim at it. Figured I should be fine, but wanted to ask.
> 
> Thanks!


NP. FYI, I have a 1080 HOF with an Antec SpotCool blowing on the backplate. It's not nearly as warm to the touch during benchmarks now.

Would you mind posting a picture of your card setup? I'm curious about the visuals.


----------



## Jedi Mind Trick

Quote:


> Originally Posted by *xartic1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jedi Mind Trick*
> 
> Koolance GPU 210. And yea, I have a spare 92mm fan I can aim at it. Figured I should be fine, but wanted to ask.
> 
> Thanks!
> 
> 
> 
> NP, FYI I have a 1080 HOF with a Antec spotcool blowing on the backplate. Not nearly as warm to the touch on benchmarks now.
> 
> Would you mind posting a picture of your card setup? I'm curious on the visuals.

Can do as soon as I get home from work.


----------



## nrpeyton

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> Can do as soon as I get home from work.


I'm interested to see this too.

Then it will be easier to comment on your question too 

Quote:


> Originally Posted by *arrow0309*
> 
> The point (really) is about how much you consider a proper / better cooling of your vrm (mosfets) since they usually gets hotter than the gpu itself.
> Of course, a better cooling involves not only better pads (like those of "at least" 12W/mK or even Fujipoly's famous 17W/mK) but also a better (liquid) coolng (since the extra amount of heat to dissipate will add some degrees onto your liquid temp).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> However it's recommended, the vrm (and not only, the whole vga itself) will thank you and if you have a nice wc setup there won't be a problem at all
> Think only at how care do we take when choosing the "word's best" thermal grease for our gpu's (I used to even put CLP and CLU but ended up using the Thermal Grizzly's Kryonaut).
> So why do we have to use only rubbish for cooling our vrm's? Only because they never really show their temp (not anymore since they're not using expensive, I2C capable pwms no more) or because the mosfets tend / are certified to work at even higher temps than the gpus?
> That won't work for me thanks.
> 
> So, I wouldn't recommend anything lower than those Phobya (7W/mK) pads, these Gelid of 12W/mK are IMO even better:
> 
> http://www.ebay.co.uk/sch/i.html?_from=R40&_trksid=p2050601.m570.l1311.R1.TR1.TRC0.A0.H0.Xgelid+thermal+.TRS0&_nkw=gelid+thermal+pad&_sacat=0


I hear what you're saying, but have you ever taken a probe to the VRM or memory chips and seen whether there was any real, measurable difference with pads of a higher W/mK?
Quote:


> Originally Posted by *xTesla1856*
> 
> What options do I have for BIOS reflashes on my EVGA Founder's with the stock air cooler? The highest clock I've seen so far is 2152MHZ while the card is cool, until it starts thermal throttling down to about 2088. During gameplay, it hovers between 2050 and 2101. Memory is at +500 at all times. I'm very happy with the card, as it seems I was very lucky in the silicon lottery. The Voltage and power limit are pegged during gameplay though. I wonder if a water block would help


You could flash the T4/STRIX BIOS (it doesn't brick FE cards), which has no power limit.

You need to be careful though; it also has no temperature limit (so you need to watch temps carefully).

The other "watch out" is that your fans will spin slower (the FE is a 'blower style' card, and its fan normally spins up faster than a STRIX's).

To be quite honest, I would not recommend flashing this BIOS until your card is under a water block. Your temps will likely be out of control (and with no temp throttling you could do damage).

The BIOS also has a higher voltage limit (1.2 V).

Just be careful & do your research first.

There are plenty of people in this thread with FE cards who have flashed the T4 successfully. Click [search] at the top and enter "T4 BIOS Founder's" or similar and you'll get a list of all the related posts.

The file was posted about 5-10 pages back.


----------



## Fidelity21

Quote:


> Originally Posted by *DarknightOCR*
> 
> After mounting the EK block, my 1080 FTW peaks at 34°C; in benchmarks and in games like BF1 it stays at 31-32°C for hours.
> 
> I was able to set a curve that gives me 2152 MHz at stock voltage (1.062 V).
> Original BIOS; I haven't tested the T4 since assembling the block.
> That's good for me.


What is your water temperature? My ASUS FE 1080 sits at 35°C at idle and pushes into the 50°C range when fully loaded for a while.

Also, I still haven't tried flashing a new BIOS, but the ASUS FE 1080 is able to push into 2100 MHz territory now that I can push the core voltage to 100% using MSI Afterburner 4.3 beta. Running The Division in 4K gets me about 52 FPS with the NVIDIA-recommended settings before the OC, and it stays at 60 FPS with my overclock enabled.

Overclocking the core +225 and the memory clock +600


----------



## Jedi Mind Trick

Spoiler: Warning: Picture of setup















This is how I have it set up; the card is perfectly sized without that massive cooler. It sits on my pump (which is plastic), which hopefully helps with the eventual sag, since I can't use the backplate.

On a side note, the flash makes my lazy setup look even worse


----------



## DarknightOCR

Quote:


> Originally Posted by *Fidelity21*
> 
> What is your water temperature? My ASUS FE 1080 sits at 35C at idle and pushes into the 50C range when fully loaded for a bit.
> 
> Also, I still haven't tried flashing a new BIOS, but the ASUS FE 1080 is able to push into 2100mhz territory now that I'm able to push the core voltage 100% using MSI Afterburner 4.3 beta. Running The Division in 4k gets me about 52fps with Nvidia recommended settings before the OC and it stays at 60fps when my overclock is enabled.
> 
> Overclocking core to +225 and memory clock +600


I don't have a water thermometer, but I think it's low.
At idle the graphics card is at 19-20°C, and the CPU [email protected] at 20-22°C or so.

Loop: MCP-355 pump + 360 rad + 120 rad, CPU and graphics card with EK blocks.
As I said, running benchmarks the card sits at 34°C; I haven't seen any higher value while running benchmarks or playing games.
The CPU in the same benchmark and game situations rises to around 50°C.

This was after some time playing


----------



## Dragonsyph

Quote:


> Originally Posted by *DarknightOCR*
> 
> I do not have a water thermometer, but I think it's low.
> In idle I have the graph with 19 / 20ºC, and the CPU [email protected] with 20 / 22ºC + -
> 
> Circuit with pump MCP-355 + Rad 360 + rad 120, CPU and graphic with EK blocks
> As I said, running benchmarks I have the graph at 34ºC, I have not seen any value higher than the time I run benchmark or that I am playing some game
> CPU in the same benchmark and game situations rises there for 50ºC + -
> 
> This was after some time playing


What's the room temp?


----------



## nrpeyton

Loving my new setup lol 



And no condensation 

Got a cheap dew point/relative humidity meter, and I can go as low as a 9°C water temp without issue. (How far down I can go will vary day to day.) As long as I don't open all the windows and keep the thermostat in my house at 21°C, I can maintain an indoor relative humidity of about 40-45%, which puts the dew point around 9°C, without a single iota of condensation.

As long as I keep an eye on the dew point meter I'm fine.

Also, even with the EK D5 pump running at its LOWEST setting, the water still gets around the system flawlessly.

And I don't even have a dehumidifier yet.

Under *Furmark full load* with the power target at 130% on my Classified (that's a 320 W draw, compared to NVIDIA's 180 W reference maximum), my temp still never exceeds 19-20°C.

The *SECOND* I turn Furmark off, the temp drops back to 10°C before I can blink.

Not sure what's up with my CPU though (temps still reach 48°C in Prime95), and when I stop the stress test it takes 1 min 20 s for the socket temp to drop back to 20°C.

'Package Temp' on the CPU, on the other hand, _never exceeds_ 22°C and drops down to 5°C within *two refreshes* of stopping the test (but 'Package Temp' on AMD is never reliable).
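For anyone wanting to reproduce the dew point check, here's a minimal sketch using the Magnus approximation (the 17.62/243.12 coefficients are one standard parameterisation of that formula; nothing here comes from the meter itself):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (°C) via the Magnus formula."""
    b, c = 17.62, 243.12  # Magnus coefficients, valid roughly -45..60 °C
    gamma = math.log(rel_humidity_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return (c * gamma) / (b - gamma)

# Room held at 21 °C with 40-45% relative humidity, as described above:
print(round(dew_point_c(21.0, 45.0), 1))  # → 8.6
```

That lands right at the ~9°C safe water temperature quoted above; any loop surface colder than the dew point will start collecting condensation.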


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> Loving my new setup lol
> 
> 
> 
> And no condensation
> 
> Got a cheap dew point/ relative humidity meter and I can go as low as 9 degrees C water temp without issue. (how far down I can go will vary day-to-day) but as long as I don't open all the Windows and keep the thermostat in my house at 21c I can maintain a 'relative humidity' indoors of about 40-45% which = 9c without a single iota of condensation
> 
> As long as I keep an eye on the dew point meter I'm fine
> 
> Also even with the EK D5 pump running at its LOWEST setting the water still gets around the system flawlessly.
> 
> And I don't even have a dehumidifier yet
> 
> Under *Furmark full load* with power target at 130% on my Classified (that's a 320W draw compared to an Nvidea maximum of 180 watts) and my temp still never exceeds 19-20c.
> 
> The *SECOND* I turn Furmark off the temp drops back to 10c before I can blink


Are you able to hit a higher OC now, or does your boost clock go higher with the same OC as before? Those temps are real nice, 8))). Mine are about double, at around 42°C max.


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> Are you able to hit higher Oc now or does your boost clock go higher with same Oc as before? Those temps are real nice, 8))). Mine are about double at around 42 max.


Rock-solid stable at 2215 MHz core and +775 memory (16°C core GPU temp).

Haven't tried to push her faster yet lol.

-Right click & open new tab for *full size*-


-Right click & open new tab for *full size*-


----------



## PCBeast

Hello guys. I have an Asus Strix OC and flashed it with the T4 BIOS, but unfortunately my results are disappointing. I reached 2.2 GHz without artifacts at 1.13 V, but my Heaven score is lower than with the original BIOS at its stock 1.093 V / 2126 MHz. Higher voltage doesn't help, nor does higher frequency or any combination of the two; I always score lower than on the original BIOS. I tried one more BIOS, from this guy:
http://forum.hwbot.org/showthread.php?t=159025
but the results are almost the same. Has anyone had an experience like mine?
Any advice?


----------



## nrpeyton

Quote:


> Originally Posted by *PCBeast*
> 
> Hello guys. I have Asus Strix OC and I flashed my card with T4 bios but unfortunately my results are disappointing. I reached 2.2 Ghz without artifacts with 1.13v but score in Heaven is lower than my original bios with default setup 1.093v with 2126 Mhz. Higher voltage also doesn't help nor higher frequency or similar combination both of them. I always have lower score than original bios. I tried one more bios from this guy :
> http://forum.hwbot.org/showthread.php?t=159025
> but results are almost the same. Is there anyone with experience like mine?
> Any advice?


If you overclock too hard, your card becomes unstable (not *so* unstable that it crashes, but unstable enough to force the card to error-correct). I.e. it tries to prevent itself from crashing by dropping frames instead of rendering frames that would either artifact or cause a driver crash.

These dropped frames won't count in your score; thus, you lose points.

The only way to combat this is to get temperatures down: the lower your temperature, the better electrons are conducted along their pathways and the fewer leak through non-conductive walls (it's this leakage that causes instability).

Pascal is *extremely* temperature sensitive.
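To illustrate the effect (a higher clock that scores *lower* because unstable frames get dropped), here's a toy model; the numbers and the drop-rate curve are made up purely for illustration, not measured:

```python
def effective_fps(clock_mhz: float,
                  base_clock: float = 2000.0,
                  base_fps: float = 100.0,
                  stable_limit: float = 2150.0,
                  drop_per_mhz: float = 0.002) -> float:
    """FPS that actually counts: raw clock scaling minus error-corrected (dropped) frames."""
    raw = base_fps * clock_mhz / base_clock      # ideal scaling with clock
    over = max(0.0, clock_mhz - stable_limit)    # MHz past the stability point
    dropped = min(1.0, drop_per_mhz * over)      # fraction of frames dropped
    return raw * (1.0 - dropped)

for clk in (2050, 2150, 2250):
    print(clk, round(effective_fps(clk), 1))
```

In this sketch 2150 MHz outscores 2250 MHz even though the card "runs" faster — the same pattern described above.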


----------



## PCBeast

@nrpeyton

Yes, I'm familiar with Pascal, but I already have low temperatures: 40-45°C at full load on air. I tried unlocked BIOSes because I want to buy a water block and get better clocks. In my case, do you think temperature is the main reason for the throttling? If so, what temperature do I need for normal operation? Below 40°C? I'm not so sure; the chip is probably just limited.


----------



## nrpeyton

Quote:


> Originally Posted by *PCBeast*
> 
> @nrpeyton
> 
> Yes I am familiar in Pascal but I already have low temperatures. On air, full load 40-45c. I tried unlocked bioses because I want to buy water block and get better clocks. In my case, you think that temperature is the main reason for throttling? So if this true, which temperature I need for normal function? Beneath 40c? I am not so sure, probably chip is limited.


The scaling on Pascal is roughly 100 MHz for every 50°C.

Have a look at your memory too; some people are getting 675+ memory overclocks, which yields about the same as +140 on the core.

I hadn't realised you were throttling; you shouldn't be if you're using the STRIX BIOS, as it has no temp or power limits (and a higher voltage limit).

You won't be throttling as such, but your card is likely dropping frames due to leakage, because you're pushing more current through it than it can cope with.

The only way to make it handle that better is to make it more conductive.

All materials are more conductive the colder they are.
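To put that rule of thumb in code (this is just my ~100 MHz per 50°C forum estimate, not an NVIDIA spec):

```python
def expected_extra_headroom_mhz(temp_drop_c: float) -> float:
    """Estimated extra stable core clock from cooling, per the ~100 MHz / 50 °C rule of thumb."""
    return temp_drop_c * (100.0 / 50.0)  # 2 MHz per °C dropped

# Going from 45 °C on air to ~15 °C under a chiller:
print(expected_extra_headroom_mhz(30.0))  # → 60.0
```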


----------



## PCBeast

Quote:


> Originally Posted by *nrpeyton*
> 
> The scale on Pascal is about 100mhz for every 50 degrees C
> 
> Have a look at your memory too; some people are getting over 675+ memory overclocks which is yielding about the same as a + 140 on the core.
> 
> Never realised you were throttling too; you shouldn't be throttling if you're using the STRIX BIOS as it has no temp or power limits (and higher voltage limit).
> 
> You won't be throttling as such; but your card is likely dropping frames due to leaked electrons as you're flooding the card with more electrons than it can cope with.
> 
> Only way to make it handle those electrons a bit better is to make it more conductive.
> 
> All materials are more conductive the colder they are


OK, I know; that's clear to me. I also tried without the memory OC and I get the same issue.

If you're interested in my results, have a look at my review:

http://forum.benchmark.rs/showthread.php?388997-Asus-GTX-1080-Strix-OC-Single-SLI-User-Test-Review


----------



## nrpeyton

Quote:


> Originally Posted by *PCBeast*
> 
> Ok. I know, It's clear to me. I tried also without oc memory and again I have the same issue.
> 
> If you are interesting in my results look at my review :
> 
> http://forum.benchmark.rs/showthread.php?388997-Asus-GTX-1080-Strix-OC-Single-SLI-User-Test-Review


It's hard to compare benchmarks, because everyone uses different software and has a different CPU and setup.

Haven't been on 3DMark in a while, actually; think I'm going to have a wee blast at it again tomorrow lol.

But 2-way SLI 1080 is a powerhouse; I'm sure what you have is *more* than adequate. I doubt overclocking is going to do much more for your gaming experience.

If you're like me, and just enjoy pushing harder for the fun of it (to see how far you can go), then the next step would be moving up a tier in cooling.

It doesn't necessarily have to be a jump straight into the deep end; you could go halfway by starting with a water chiller, like me.

There's plenty of playtime in it, and bragging rights; there's a guy over at techpowerup.com with the same model of chiller as me who has a validation at 5.5 GHz.

Anyway, if you're not power limited, you're not temperature *throttling (not limiting)*, and more volts just gives you fewer frames, then you've done well to get that far (as I can see from the benchmarks I looked at on your link). But on the back of that, you _are_ on the bleeding edge of what your card can do. I'd pull back a little to find your highest-scoring combination of core/memory and voltage, then leave it at that, at least until you're ready to jump into the realm of more expensive cooling.
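If you want to be systematic about finding that combination, a sweep is easy to script. A rough sketch, where `run_benchmark` is a hypothetical stand-in you'd replace with actually applying the offsets (e.g. via Afterburner) and reading back a real score:

```python
from itertools import product

def run_benchmark(core_offset: int, mem_offset: int) -> float:
    """Hypothetical stand-in: in reality, apply the offsets, run Heaven/3DMark,
    and return the reported score. Here: a fake score that peaks below the
    maximum offsets, mimicking error-correction losses."""
    score = 5000 + 2.0 * core_offset + 0.5 * mem_offset
    score -= 0.05 * max(0, core_offset - 150) ** 2  # instability penalty past +150
    return score

def best_combo(core_range, mem_range):
    """Return the (core, mem) offset pair with the highest benchmark score."""
    return max(product(core_range, mem_range),
               key=lambda cm: run_benchmark(*cm))

core, mem = best_combo(range(0, 251, 25), range(0, 701, 100))
print(core, mem)  # → 175 700
```

With the fake scorer, the sweep correctly settles below the maximum core offset — exactly the "pull back a little" outcome.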


----------



## PasK1234Xw

Quote:


> Originally Posted by *GraphicsWhore*
> 
> 
> 
> Was gifted an EVGA Hybrid for Christmas by fiancee. Going to try and convince her to let me use it tomorrow night.
> 
> Coming from a MSI Gaming 6G 980Ti. Great card but gets noisy so I'm looking forward to something quieter. Still using my Antec Skeleton open-air case and I think the fan will fit perfectly in the would-be media drives section of the case.


Can you share the BIOS for that? TIA.


----------



## nrpeyton

deleted, misread previous post


----------



## immortalkings

My MSI Gaming X no longer hits the OC boost clock advertised on MSI's site. I'm not getting the 2000 MHz boost anymore; it now sits at 82°C with 80% fan speed. A few months ago it ran at 80-81°C and boosted from 1950 MHz up to ~2.1 GHz. What is happening? Is there a fix for this?


----------



## juniordnz

Quote:


> Originally Posted by *immortalkings*
> 
> my MSI Gaming X is now getting the OC boost on their advertise site.. and im not getting the 2000mhz boost anymore with 82c temperature with 80% fan speed on 80c temp... a few months ago its only getting a 80-81c to 1950mhz to 2.1mhz boost.. what is happening? is there a fix on this?


Geez, what voltages are you running to get those temps?

I've always read that the Gaming X has a good cooler; it shouldn't be going past 80°C, right?


----------



## BigBeard86

Quote:


> Originally Posted by *nrpeyton*
> 
> Rock-solid stable at 2215MHZ and + 775 memory (16 degrees C core GPU temp).
> 
> Not tried to push her faster yet lol.
> 
> -Right click & open new tab for *full size*-
> 
> 
> -Right click & open new tab for *full size*-


Where did you buy that chiller? How much did it cost you? What pumps and radiators?


----------



## immortalkings

Quote:


> Originally Posted by *juniordnz*
> 
> Geez, what voltages are you running to get those temps?
> 
> I've always read that the Gaming X has a good cooler, it shouldn't be reaching +80ºC, right?


Just default. Do I need to adjust the voltage? Lower it?


----------



## juniordnz

Quote:


> Originally Posted by *immortalkings*
> 
> just default.. do i need to adjust the voltage? lower it?


No, it just shouldn't be reaching those temps, I think.

When does it go past 80°C? Normal gaming? FPS unlocked with no VSync?


----------



## immortalkings

Quote:


> Originally Posted by *juniordnz*
> 
> No, it just shouldn't be reaching those temps I think.
> 
> When does it go past 80ºC? Normal gaming? FPS unlocked with no VSYNC?


Yes, normal gaming like Battlefield 1. No VSync, FPS unlocked, though I can't max out my X34 at 100 Hz, haha.


----------



## juniordnz

Quote:


> Originally Posted by *immortalkings*
> 
> yes normal gaming like Battlefield 1.. no vsync fps unlocked but cant max my x34 on 100hz.. haha


That's not OK, bro. BF1 is fairly light on the GPU. I bet you're not even hitting 99% GPU usage, right?

I can only think of a poor paste job on the GPU. I'd buy some quality TIM like Kryonaut and reapply it on the core using the rice-grain method. I don't believe 80°C on that card in BF1 is OK.


----------



## immortalkings

Quote:


> Originally Posted by *juniordnz*
> 
> That's not ok, bro. BF1 is super light on the GPU. I believe you're not even getting 99% usage of GPU, right?
> 
> I can only think of poor pasting job on the GPU. I'd buy some quality TIM like Kryuonaut and reapply on the core using the rice grain method. I don't believe 80ºC on that card on BF1 is ok.


I am getting 99% GPU usage in Battlefield 1; the only problem is the temperature. I'll try reapplying the thermal paste, though I don't know if anything good is available in my country.


----------



## nrpeyton

Got my 1080 to *2252* MHz now in Heaven lol  _haven't tried to go faster yet, nor tried the Classified voltage tool._

Core temp is only 11-13°C at load (Heaven benchmark).

Memory temps are only 24-27°C (haven't even tried to OC the memory past +775 yet lol -- _can't wait to try this_); at idle the memory sits at 16°C lol. _(GDDR5X memory is rated at 94°C max; EVGA Hybrid memory reaches 60°C, and the air-cooled ACX 3.0 can hit 85°C.)_

VRM is only about 38°C at load and 23°C at idle. _(Stock EVGA VRM load temps for all ACX 3.0 cards are 85°C+.)_

Water temp of 6-8°C. (*Condensation* started setting in at 5-6°C, so I had to do an _*emergency shutdown lol*_.)

Got a fright though: I was continuously checking the tubing for *condensation* (and nothing), then had a look at a metal fitting and it was wet. Straight away I noticed the "power" LED for the core was flashing (indicating an irregular voltage supply to the core), so I did an emergency shutdown. Pulled the block off the card and there was condensation over the metal parts (the acrylic was fine). I must have *just saved it in time.*




As soon as that LED began flashing, my card also seemed to go into 'safe mode' and downclocked itself to 253 MHz, which probably prevented a powerful short circuit that at full load (300 watts) would likely have fried something.

I'm lucky lol -- and I learned my lesson. I was pushing past the dew point (my relative humidity meter was showing a 9°C dew point and I was pushing for 5°C water).

Used the chance to apply extra thermal pads (covering more than what EK state in the manual):

right click & 'open new tab' for *full size*


Quote:


> Originally Posted by *BigBeard86*
> 
> Where did you buy that chiller? How much did it cost you? What pumps and radiators?


The chiller was £200 sterling (UK), or about $245 US. It's a 1/2 horsepower unit.
-790 W cooling capacity.
-Temps down to 3°C are possible (lower still if you manipulate the thermostat and run anti-freeze).
-Bought on eBay.
-Just one standard EK D5 pump (came with the EK Kit P360); _even on its lowest setting, the water is still pumped around the whole system, chiller included, flawlessly._
-EK CoolStream PE 360 (disconnected since I got the chiller).

The chiller has a 4 L tank, so you have a total of 5-6 L in your loop with this setup. If you set the chiller to ambient and your machine is idling, it only switches on for a few seconds every 5-10 minutes. If you set it to 8°C and you're under full load, it runs about 35% of the time.

P.S.
This has completely re-opened the whole overclocking ball game for me!

I feel like a n00b again, getting to do everything OC-wise from the beginning on the same PC! 

Does anyone know if using *liquid metal on the GPU core* would cause corrosion/discolouration of the top of the die (the shiny surface)?

I desperately want to do it, but I'm afraid it would make my card unsellable in future. (I'm not worried about causing a short circuit, because I plan to insulate.) But *I am worried about corrosion/discolouration.*

Also, my water block is nickel-plated copper.
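A quick sanity check on those runtime figures: the compressor's duty cycle is roughly heat load divided by cooling capacity. Sketch below, using the 790 W capacity and 320 W GPU draw from this post; the ~80 W CPU/pump figure is an assumption, not measured:

```python
def duty_cycle(heat_load_w: float, cooling_capacity_w: float) -> float:
    """Fraction of time the chiller compressor must run to hold temperature,
    ignoring tank thermal mass and insulation losses."""
    return min(1.0, heat_load_w / cooling_capacity_w)

# 320 W GPU (130% power target) + ~80 W CPU/pump (assumed), 790 W chiller:
print(round(duty_cycle(320 + 80, 790), 2))  # → 0.51
```

About half-time at steady state under full load — the same ballpark as the observed 35-40%, with the 4 L tank's thermal mass explaining why it runs a bit less than the estimate.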


----------



## Dugom

I've read some posts but not all 292 pages, so sorry if this has already been asked.

My question: has anyone tried the 1080 Gaming Z BIOS on the 1080 Gaming X card? If so, is there any improvement?

Even though I don't hit 70°C on the GPU, I get voltage drops, with some MHz lost. I suspect the VRM sensor at this point.


http://www.tomshardware.com/reviews/best-nvidia-geforce-gtx-1080-graphics-cards,4725.html

Thanks guys.

.


----------



## Vellinious

Quote:


> Originally Posted by *Dugom*
> 
> I've read some posts but not the 292 pages, so sorry if it as already been asked.
> 
> Here my question, have someone tried the 1080 Gaming Z BIOS on the 1080 Gaming X card. If so, is there any improvement?
> 
> Even if I don't get 70° on the GPU, I get voltage drop, with some Mhz lost. I suspect the VRM sensor at this point.
> 
> 
> http://www.tomshardware.com/reviews/best-nvidia-geforce-gtx-1080-graphics-cards,4725.html
> 
> Thanks guys.
> 
> .


Are you sure it's not just Boost 3.0 doing what it's supposed to? As temps increase, it decreases clock and voltage to keep temperatures in check. It does this every 10°C or so. 70°C is too warm, btw.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Got my 1080 at *2252* mhz now in Heaven lol  _not tried to go faster yet nor tried classy voltage tool._
> 
> Core temp is only 11-13 degrees C at load (Heaven benchmark).
> 
> Memory temps are only 24-27c. (not even tried to O/C memory faster than 775 yet lol -- _can't wait to try this_) at idle memory is sitting at 16c lol _(GDDR5X memory is rated at 94c max & EVGA Hybrid memory reaches 60c. ACX 3.0 air cooled can hit 85c)_
> 
> VRM is only about 38c and is 23c at idle. _(stock EVGA load temps for all ACX 3.0 cards is 85+ degrees C for VRM)
> _
> Water temp of 6c-8c. (*condensation* started setting in at 5-6c so had to do an _*emergency shutdown lol*_).
> 
> Got a fright though; was continuously checking tubing for *condensation* (and nothing) then had a look at a metal fitting and it was wet. Then straight away I noticed the "power" LED for core was flashing (indicating irregular voltage supply to core) so I done an emergency shut down. Pulled block off card and there was condensation over the metal parts (acrylic was fine). I must have *just saved it in time.
> *
> 
> 
> 
> As soon as that LED begun flashing my card also seemed to go into 'safe mode' and down-clocked its self to 253mhz which probably stopped a powerful short-circuit which at full load (300 watts) would probably have fried something.
> 
> I'm lucky lol -- and learnt my lesson. I was pushing past the dew point (my "relative humidity meter" was showing 9c and I was pushing for 5c.
> 
> Used the chance to apply extra thermal pads (covering more than what EK state in manual):
> 
> right click & 'open new tab' for *full size*
> 
> Chiller was £200 pounds sterling (U.K) or in U.S dollars $245. It's a 1/2 horse power unit.
> -790w cooling capacity.
> -Temps down to 3c are possible (further if you manipulate thermostat and run anti-freeze)
> -Bought on Ebay
> -Just 1 standard EK D5 (came with EK Kit P360) _even with the pump on its lowest setting the water is still pumped around the whole system flawlessly including chiller_
> -EK Coolstream PE 360 (disconnected since I got chiller)
> 
> Chiller has a 4L tank (so you have a total of 5L - 6L in your loop with this setup. If you set the chiller to ambient and you're machine is idling the chiller will only switch on for a few seconds every 5-10 minutes. If you set chiller to 8 degrees C and you're under full load it will run about 35% of time.
> 
> P.S.
> This has completely re-opened the whole overclocking ball-game for me (and moved it)!
> 
> I feel like a N00b again getting to do everything O/C'ing wise from the beginning on the same PC!
> 
> Anyone know if using *Liquid Metal on GPU* core would cause corrosion / discolouration of the top of the DIE (the shiny IHS)?
> 
> I desperately want to do it; but I'm afraid it will make my card non-sellable in future. (I'm not worried about causing a short circuit because I plan to insulate) but *I am worried about corrosion / discolouration*
> 
> Also; my water block is nickel-plated copper.


I'd never use it on a GPU. Completely pointless.


----------



## Dugom

Quote:


> Originally Posted by *Vellinious*
> 
> Are you sure it's not just boost 3.0 doing what it's supposed to? As temps increase, it will decrease clock and voltage to keep the temps cool. It will do this every 10c or so. 70, is too warm, btw.


NVIDIA says 94°C max; the BIOS is set to target 92°C. Afterburner doesn't show the temperature-limit flag activating, so GPU temp is not the problem here.

83°C is the GPU throttling start point set by NVIDIA; 92°C triggers 100% fan speed; 94°C shuts the card down.

.


----------



## Dugom

bug


----------



## juniordnz

Quote:


> Originally Posted by *Dugom*
> 
> Nvidia says 94°C maxi, BIOS is set to target 92°C max. Afterburner don't show the temparature limit sensor activation, the GPU temp is not the problem here.
> 
> 83° is the GPU temperature trottleling start set by Nvidia. 92°C activate the 100% fan speed. 94° shut down the card.
> 
> .


I'm afraid you're wrong. With Pascal, thermal throttling starts as soon as the card reaches about 39°C, depending on the card. By 83°C you've already lost at least some 5 points on the core clock curve.
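To picture that curve, here's a toy step model of Boost 3.0; the 37°C first step, 5°C spacing, and ~13 MHz bin size are illustrative values matching what people report in this thread, not documented NVIDIA numbers:

```python
def boost_clock_mhz(max_clock: float, temp_c: float,
                    first_step_c: float = 37.0,
                    step_spacing_c: float = 5.0,
                    bin_mhz: float = 13.0) -> float:
    """Toy model of Pascal GPU Boost: lose one ~13 MHz clock bin per temperature step."""
    if temp_c < first_step_c:
        return max_clock
    steps = 1 + int((temp_c - first_step_c) // step_spacing_c)
    return max_clock - steps * bin_mhz

print(boost_clock_mhz(2100, 35))  # full boost below the first step
print(boost_clock_mhz(2100, 83))  # → 1970.0 (10 bins lost by 83 °C)
```

So a card that "only" runs at 83°C has quietly shed a meaningful chunk of clock long before any hard throttle point.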


----------



## Dugom

Quote:


> Originally Posted by *juniordnz*
> 
> I'm afraid you're wrong. With Pascal, thermal throttling starts as soon as the card reaches 39ºC depending on the card. At 83ºC you've already lost at least some 5 points on the core clock curve.


Why doesn't Afterburner say so? Why do the other limit flags flip from 0 to 1, but not the temp one?

Maybe it's not the GPU but the VRM temp that triggers the throttling.

.


----------



## Dugom

Bug (How can I delete post?)


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'd never use it on a GPU. Completely pointless.


I paid the price of two radiators for this setup.

*It's quieter.*

It uses *less power* than two radiators with 10 fans in push-pull drawing 50 watts around the clock.

If I'm just browsing/video/office, I can set the chiller thermostat to 32°C (its highest setting) and it won't switch on at all. *The 6 L of water in my loop seems to dissipate enough heat on its own, even with the chiller switched off!*

If I set the thermostat *at* ambient, it runs for 15-30 seconds every 10 minutes. It's blissfully quiet compared to 10 fans in push-pull on two radiators.

Why would I disconnect my GPU from the loop? That would just be an inconvenience. And I am indeed getting clocks I could never have dreamed of before.

Even my GPU memory is currently idling at 14°C.

My GPU VRM is even running below ambient lol.

Even with the thermostat at 8°C and Prime95 on the CPU *and* Furmark on the GPU (*simultaneously*), the chiller still only runs 40% of the time.

I can go as low as about 7-8°C water temp *without a single iota of condensation* (I monitor with my dew-point meter). My GPU idles at 0.5-1.0°C over water temp.

Everything conducts heat better the colder it is (especially water): the colder the water, the bigger the temperature difference, so it absorbs heat like there's no tomorrow, lol.


----------



## BigBeard86

How noisy is the chiller? Did you have to custom-mod it to use it as a PC chiller?
What brand is it, or what other brand do you recommend?

Is it running your CPU and GPU?


----------



## juniordnz

Quote:


> Originally Posted by *Dugom*
> 
> Why, AfterBurner doesn't say so? Why other limit are activated from 0 to 1, but not the Temp one?
> 
> Maybe it is not the GPU, but VRM temp, who activate throttleling.
> 
> .


I don't think Afterburner and GPU-Z fully understand Pascal yet. What you said about VRM temps makes sense, but I believe that's not it, because the thermal throttle shows up on all cards, with good and bad VRM cooling alike. Even the modified EVGA ACX coolers (which keep the MOSFETs under 80°C) show the same thermal throttle. It's something about the Pascal architecture; it's by design. That's why, IMO, overvolting these cards should only be done by those who can handle the heat (watercooling).

I won't go over 1.063 V on mine until I get it under water. It's not worth bragging about a clock you can't hold for more than a minute or so.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I paid the price of 2 radiators for this setup.
> 
> *Its quieter.*
> 
> Uses *less power* than 2 radiators and 10 fans in push-pull drawing 50 watts every hour, every day.
> 
> If I'm just browsing / video / office, I can set the chiller thermostat at 32c (highest setting) it won't even switch on at all. *The 6 L of water in my loop seems to dissipate enough heat on its own even with chiller switched off!
> *
> If I set the thermostat *at* ambient it will run for 15-30 seconds every 10 minutes. It's blissfully quiet compared to 10 fans in push pull on 2 radiators.
> 
> Why would I disconnect my GPU from the loop (that would just be an inconvenience)? And I am indeed getting clocks that I could never have even dreamed about before.
> 
> Even my GPU memory is currently idling at 14 degrees C.
> 
> My GPU VRM is even running below ambient lol.
> 
> Even setting the thermostat to 8 degrees C with a prime95 on the CPU *and* Furmark on GPU (*simultaneously*) the chiller will still only run 40% of the time.
> 
> I can go as low as about 7c-8c water temp *without a single iota of condensation*. (I monitor with my dew point meter). My GPU idles at 0.5 - 1.0c over water temp.
> 
> All things conduct heat better the colder they are. (especially water). Because the water has no heat energy in its its crying out for it, so it absorbs it like there's no tomorrow, lol.


I wasn't talking about the chiller. I was talking about the CLU.


----------



## Vellinious

Quote:


> Originally Posted by *Dugom*
> 
> Nvidia says 94°C maxi, BIOS is set to target 92°C max. Afterburner don't show the temparature limit sensor activation, the GPU temp is not the problem here.
> 
> 83° is the GPU temperature trottleling start set by Nvidia. 92°C activate the 100% fan speed. 94° shut down the card.
> 
> .


You asked, I told you. If you don't want to believe it.....don't. I care not. Enjoy your hot, throttling GPU. /wink


----------



## nrpeyton

**MUST-HAVE* CHANGES TO THE EK MANUAL RE: PAD SIZES*

WHEN INSTALLING:
*EK 780 Ti CLASSY BLOCK*
ONTO
*1080 CLASSY*

Trial and error with different pad sizes:

The most important thing I noticed is that the memory and VRM pad installation instructions in the EK manual are outdated for fitting this block onto a 1080 Classy:
-Component heights have changed.

-EK are shipping this block with the old manual and old instructions while listing it as 1080-compatible. (This is still *GOOD* because:

*A)* They were not supporting it *officially* before I forwarded them my initial research last month.

*B)* It *is* indeed compatible, and even WITHOUT the modified instructions below you *WILL* still be deep within safe limits.)

However, if you want the best temps, then accept that component heights have changed:

If you want the three GDDR5X memory chips closest to the VRM to run 20°C cooler, you must at least change the pad sizes for memory and VRM to 1.0 mm and 1.5 mm respectively. (These areas are marked blue in the picture.)

_Note: the EK manual lists these pads as 0.5 mm for memory and 1.0 mm for VRM._

-The areas marked red are for *extra* pads only (definitely *not* essential, but if you want to scrape out every last degree C you can add them). I worked out the best thicknesses for these extra pads by trial and error.

-My testing method for checking contact had three tiers:
*(1)* Brushing the tops of all pads with a little thermal grease, attaching and then removing the block, and checking the block for residue (residue means contact is fine).

*(2)* To test how good that contact is, I installed temperature probes on the memory and VRM areas (on the back of the PCB), then set my water chiller below ambient.

If a test area dropped well below ambient (say 14°C for memory), I know the contact there is good. _(Obviously, the more a pad is compressed between component and block the better, but you don't want it too thick, as that can hurt the contact of neighbouring components.)_

*(3)* Finally, two Furmark stress tests: (a) GPU core burn, then (b) GPU memory burn (temps monitored across core, VRM and memory).

right click & 'open new tab' for *full size*
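To summarise the pad changes at a glance (these are my trial-and-error values for this block/card combo, not EK's official spec):

```python
# Thermal pad thickness in mm: EK manual value -> value found to give best contact
# on the 780 Ti Classy block fitted to a 1080 Classified.
PAD_THICKNESS_MM = {
    "memory": {"ek_manual": 0.5, "revised": 1.0},
    "vrm":    {"ek_manual": 1.0, "revised": 1.5},
}

for area, t in PAD_THICKNESS_MM.items():
    print(f"{area}: {t['ek_manual']} mm -> {t['revised']} mm")
```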


----------



## nrpeyton

Quote:


> Originally Posted by *BigBeard86*
> 
> How noisy is the chiller? Did you have to custom modit to use as a pc chiller?
> What is the brand, or what other brand do you reccomend.
> 
> Is it running your cpu and gpu?


Hailea HC-500A (Ultra Titan 1500):
http://www.aquatuning.co.uk/water-cooling/radiators/chiller/2693/durchlaufkuehler-hailea-ultra-titan-1500-hc500790watt-kaelteleistung
or
http://www.aquatuning.co.uk/used-stuff/17331/b-ware-durchlaufkuehler-hailea-ultra-titan-1500-hc500790watt-kaelteleistung

Yes, it is running both CPU and GPU.

It is only noisy when you want to cool to 8°C; then it runs maybe 35-40% of the time under load (gaming etc.).

At idle, holding 8°C, it only switches on for 20 seconds every 5-10 minutes.

At idle (or with light browsing/video/office work) it doesn't run at all if you set the thermostat to 28-32°C (the 4 L tank is so big that the water dissipates most of the heat with no help from the chiller).

*No custom mod necessary.* You can buy two cheap adapters to make it compatible with industry-standard G1/4 PC liquid-cooling fittings:
http://www.aquatuning.co.uk/water-cooling/radiators/chiller-accessories/2876/anschlussadapter-fuer-ultra-durchlaufkuehler-300-500-1500-auf-g1/4 <--- adapter


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I wasn't talking about the chiller. I was talking about the CLU.


Why do Americans all call Liquid Metal "CLU"?

The one I have is called 'Thermal Grizzly - Conductonaut' (73 W/mk) but I never really noticed any difference on my CPU.

GPU is much easier to cool (and stays closer to water temp) which is why I desperately wanted to try it; just curiosity more than anything! Will be much easier to test if it actually *does* anything since the margins of difference are much smaller.

By the margins I mean; on CPU your water temp could be 10c but your CPU temp could still be 50c.

My GPU on the other hand is never usually more than 10c above water temp. *(even with extreme load)*
So would be interesting to see if the Liquid Metal could get that down to maybe only a 5c difference?

When you're approaching "near-subzero" conditions every 1 degree C could make the world of difference


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Why do Americans all call Liquid Metal "CLU"?
> 
> The one I have is called 'Thermal Grizzly - Conductonaut' (73 W/mk) but I never really noticed any difference on my CPU.
> 
> GPU is much easier to cool (and stays closer to water temp) which is why I desperately wanted to try it; just curiosity more than anything! Will be much easier to test if it actually *does* anything since the margins of difference are much smaller.
> 
> By the margins I mean; on CPU your water temp could be 10c but your CPU temp could still be 50c.
> 
> My GPU on the other hand is never usually more than 10c above water temp. *(even with extreme load)*
> So would be interesting to see if the Liquid Metal could get that down to maybe only a 5c difference?
> 
> When you're approaching "near-subzero" conditions every 1 degree C could make the world of difference


CLU - Coollaboratory Liquid Ultra.

I'd still never use it on a GPU. It's caused me issues in the past with nickel-plated blocks. The only time I'd even think about using it is during a delid where I was replacing the IHS on the die, and not going direct-die cooling.

10c to 5c? Won't happen. You may gain 1 or 2c. But.....some people have to find out for themselves, so...enjoy.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> CLU - Coollaboratory Liquid Ultra.
> 
> I'd still never use it on a GPU. It's caused me issues in the past with nickel plated blocks. The only time I'd even think about using it, is during a delid where I was replacing the IHS on the die, and not going direct die cooling.
> 
> 10c to 5c? Won't happen. You may gain 1 or 2c. But.....some people have to find out for themselves, so...enjoy.


What did it do to your nickel plating?

Was the IHS affected?

And how long are we talking for damage to take place? I mean could I get away with a 2hr session?


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> CLU - Coollaboratory Liquid Ultra.
> 
> I'd still never use it on a GPU. It's caused me issues in the past with nickel plated blocks. The only time I'd even think about using it, is during a delid where I was replacing the IHS on the die, and not going direct die cooling.
> 
> 10c to 5c? Won't happen. You may gain 1 or 2c. But.....some people have to find out for themselves, so...enjoy.


No damage to the die itself? I ask because I'm planning on using Conductonaut between the GPU die and a copper shim I'll have to use so my waterblock makes contact with the die. I don't care about the copper, but damaging the GPU die would be a pain...


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> No damage to the die itself? I ask because I'm planning on using Conductonaut between the GPU die and a copper shim I'll have to use so my waterblock makes contact with the die. I don't care about the copper, but damaging the GPU die would be a pain...


You using extra thick pads or something?

I was using copper shims for the memory but never really saw much of a temp drop (I replaced 0.5mm pads with 0.5mm shims).

I've since gone back to pads as there was no measurable temp difference (I pasted up on both sides too).


----------



## ucode

Quote:


> Originally Posted by *Dugom*
> 
> Nvidia says 94°C max; the BIOS is set to target 92°C max. Afterburner doesn't show the temperature-limit sensor activating, so the GPU temp is not the problem here.
> 
> 83°C is where GPU temperature throttling starts, as set by Nvidia. 92°C activates 100% fan speed. 94°C shuts down the card.


Where are you seeing this?

FWIW my 1080 FE has a slowdown temp of 96C and a shutdown temp of 99C. This can be checked with the built-in NVIDIA SMI utility (nvidia-smi.exe -q -d TEMPERATURE). In the early days with air cooling and the T4 VBIOS I had logged 96C on the GPU, but I suspect the fan had stopped at some point.

As already mentioned boost 3 can mess with boost levels wrt temperatures, starting with very low temps. It's possible to reverse it and actually have clocks increase with temperature but this would usually be a bad idea.
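For anyone who wants to script that check, here is a rough sketch that parses the thresholds out of the output of nvidia-smi -q -d TEMPERATURE. The sample text below is illustrative; on a real system you would capture the command's actual stdout instead:

```python
import re

# Illustrative fragment of `nvidia-smi -q -d TEMPERATURE` output.
# On a real machine, capture the command's stdout (e.g. via subprocess)
# instead of using this hard-coded sample.
sample = """
        GPU Slowdown Temp             : 96 C
        GPU Shutdown Temp             : 99 C
"""

def read_limits(text):
    """Pull the slowdown/shutdown thresholds out of nvidia-smi query output."""
    limits = {}
    for label in ("Slowdown", "Shutdown"):
        m = re.search(rf"GPU {label} Temp\s*:\s*(\d+) C", text)
        if m:
            limits[label.lower()] = int(m.group(1))
    return limits

print(read_limits(sample))  # {'slowdown': 96, 'shutdown': 99}
```

The exact labels in the output can vary a little between driver versions, so treat the regex as a starting point rather than a guaranteed format.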


----------



## Dragonsyph

Why don't you guys just use K-Boost, which forces the card to operate at its max boost speed?

And I won't ever use CLU again. I used it on an H100i with a delidded 3770K, and it softened and discolored the copper block on the H100i.


----------



## juniordnz

Quote:


> Originally Posted by *ucode*
> 
> It's possible to reverse it and actually have clocks increase with temperature but this would usually be a bad idea.


How? Never heard about that.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> How? Never heard about that.


lol, yeah....me either. Not sure how they're pulling that off, but I'd have to see it to believe it.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> You using extra thick pads or something?
> 
> I was using copper shims for the memory but never really seen much of a temp drop (I replaced 0.5mm pads with 0.5mm shims).
> 
> I've since went back to pads as there was no measurable temp difference (I pasted up on both sides too).


No, I just need the copper shim because the heatplate has 4 "fingers" that protrude near the core, and those don't let the Corsair block reach the GPU die. So I need a copper shim about 2mm thick to make proper contact.

That's why I asked about the clu damaging the die. Because I plan on using it to minimize the losses of using a copper shim and not direct block-to-die mount.


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> How? Never heard about that.


How to demonstrate? I did post a graph here.

Not really sure it's useful at all, other than showing that boost isn't perfect and that how much it changes depends on when the clocks are set, i.e. the core temperature at the time they are set.

BTW this was on an air-cooled GP107 with core voltage increased to a little over 1.3V. How much difference does a ~0.2V increase make? In this chip's case, about 50MHz.







Tried some VBIOS mods without proper signing too; the result was the card boots but the driver reports problems.







Looks pretty much tied down so far. Anyone requested hulk certs yet?


----------



## hertz9753

The Corsair HG10 was a pain in my crack and it is back in a box. The four fingers that @juniordnz is talking about are small studs that go above the mid plate that EVGA uses to connect to the ACX cooler. You need a shim to get proper cooling to the GPU core.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> No, I just need the copper shim because the heatplate has 4 "fingers" that protrude near the core, and those don't let the Corsair block reach the GPU die. So I need a copper shim about 2mm thick to make proper contact.
> 
> That's why I asked about the clu damaging the die. Because I plan on using it to minimize the losses of using a copper shim and not direct block-to-die mount.


I am very interested to see whether the die (or that mini shiny IHS) gets damaged too.

I've started a POLL about this on the techpowerup.com forums (cooling sub-section) to try and see how it's affected others (with IHS, copper & nickel).
I've also asked your question on the thread too.. _It looks like it's turned into quite a popular thread (250 views in 12 hours already)._

https://www.techpowerup.com/forums/threads/p-o-l-l-liquid-metal-damage-integrated-heat-spreader-copper-nickel.229039/

I'd be interested to see what temps you're getting with the copper shim (like how many degrees you lose; or if its the same)?

I tried 0.5mm copper shims to replace 0.5mm pads for memory and never seen much of a difference.

_*Edit*:
Actually I'm beginning to faintly remember discussing this with you a few months ago.. lol have you still not done it yet?_

Or are you just begrudging paying for a full-cover block when you've got the HG10 sitting there? You could always sell it, then buy a used block off eBay? Wouldn't cost too much and performance would probably be much better.

Also, how are you cooling the memory and VRM using the G10? (Will the *fan* section on the G10 and its heatsink match up correctly with the VRM on your card?) Then you also still have to consider memory cooling too. _(I've not looked at the G10 in detail but I did have a quick squint)..._


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I am very interested to see if the DIE will be damaged too (or mini shiny IHS)?
> 
> I've started a POLL about this on techopowerup.com forums (cooling sub-section) to try and see how its affected others: (with IHS, copper & nickel)
> I've also asked your question on the thread too.. _It looks like its turned into quite a popular thread (had 250 views in 12 hours already)._
> 
> https://www.techpowerup.com/forums/threads/p-o-l-l-liquid-metal-damage-integrated-heat-spreader-copper-nickel.229039/
> 
> I'd be interested to see what temps you're getting with the copper shim (like how many degrees you lose; or if its the same)?
> 
> I tried 0.5mm copper shims to replace 0.5mm pads for memory and never seen much of a difference.
> 
> _*Edit*:
> Actually I'm beggning to faintly remember discussing this with you a few months ago.. lol have you still not done it yet?_
> 
> Or you just grudging paying for a full cover block when you're got the HG10 sittig there? You always could sell it? Then buy a used block off Ebay? Wouldn't cost too much and performance would probably be much better?
> 
> Also how are you cooling the memory and VRM using G10? (will the *fan* section on the G10 and its heatsink match up correctly with the VRM on your card)? Then you also still have to consider memory cooling too? _(i've not looked at the G10 in detail but I did have a quick squint)..._


I have two classy blocks I'd let go real cheap. They're just sitting there. One brand new in the box. Never even been opened.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> I am very interested to see if the DIE will be damaged too (or mini shiny IHS)?
> 
> I've started a POLL about this on techopowerup.com forums (cooling sub-section) to try and see how its affected others: (with IHS, copper & nickel)
> I've also asked your question on the thread too.. _It looks like its turned into quite a popular thread (had 250 views in 12 hours already)._
> 
> https://www.techpowerup.com/forums/threads/p-o-l-l-liquid-metal-damage-integrated-heat-spreader-copper-nickel.229039/
> 
> I'd be interested to see what temps you're getting with the copper shim (like how many degrees you lose; or if its the same)?
> 
> I tried 0.5mm copper shims to replace 0.5mm pads for memory and never seen much of a difference.
> 
> _*Edit*:
> Actually I'm beggning to faintly remember discussing this with you a few months ago.. lol have you still not done it yet?_
> 
> Or you just grudging paying for a full cover block when you're got the HG10 sittig there? You always could sell it? Then buy a used block off Ebay? Wouldn't cost too much and performance would probably be much better?
> 
> Also how are you cooling the memory and VRM using G10? (will the *fan* section on the G10 and its heatsink match up correctly with the VRM on your card)? Then you also still have to consider memory cooling too? _(i've not looked at the G10 in detail but I did have a quick squint)..._


I didn't do it yet, can you believe it? I'm depending on getting the copper shim from China (couldn't find 20mm x 20mm x 2.5mm anywhere here in Brazil). And at this time of the year, international orders take a long, long time. I'm hoping to get everything in hand by the end of the week.

I bought a Noctua NF9 2000rpm to use on the Kraken; that should blow enough air on the heatplate right above the VRM area. I'm also replacing all the heatplate pads with 11 W/mK Fujipoly. All that should be able to keep both VRM and VRAM cool. And I'm also doing the mod on the backplate, with 6 W/mK thermal pads on the MOSFET and VRAM module areas.

I can't wait to get everything done. I just hope it works out with the copper shim (I guess it will; the IHS on a CPU is like a copper shim and it works). Summer has been crazy here: yesterday we got 42ºC and I have no AC in my room


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> I dind't do it yet, can you believe it? I'm depending on getting the copper shim from china (coudn't find 20mm X 20 mm X 2,5mm anywhere here in Brazil). And in this time of the year, international orders take a long long time. I'm hoping to get everything in hand until the end of the week.
> 
> I bought a Noctua NF9 2000rpm to use on the Kraken, that should blow enough air on the heatplate right above the VRM area. I'm also replacing all pads from the heatplate for 11w/mK fujipoly. All that should be able to keep both VRM and VRAM cool. And I'm also doing the mod on the backplate with 6w/mK thermal pads on the mosfets and VRAM modules area.
> 
> I can't wait to get everything done. I just hope it works out with the copper shim (I guess it will, the IHS on the CPU is like a copper shim and it works). Summer has been crazy here, yesterday we got 42ºC and I have no AC in my room


Those ambient temps can't help a lot lol.

I also got my shims from China (couldn't get the right size here). And they never offer fast delivery (not even an option for express if you want it).
It's always free or very cheap delivery from China, but it takes 2 millennia to arrive lol.

When I was fiddling with pads on my GPU trying to get better mem and VRM temps, one thing to note is that a more "compressed" pad (pushed harder, with more pressure between component and block) delivers threefold better temperatures than going for a higher W/mK.

So I grabbed cheaper 10cm x 10cm slabs so I had enough to tinker until I found the best performance.

Never paid more than 3.95 for pads (that's about 4.50 in US dollars) for a 10cm x 10cm slab, 1.0mm or 1.5mm thick.

What I did notice with the higher W/mK pads, though, is that they did seem to be a bit "softer", meaning they'd compress down more (delivering good temps on *that* component without compromising on others around it).

And by "others" I mean.. sometimes if you go too thick on a pad it affects the contact pressure on others around it (then temps on those components go higher).

I've got my GPU all probed out with temperature probes attached to a fan controller, and I measure everything from the back of the PCB (positioned correctly behind the relevant component, a probe is great for an accurate reading). I was getting differences of 20c - 30c at the back of the same component with different pad sizes, depending on whether they made "light" contact vs "pressurised" contact, with pads of the same W/mK rating.

(My block is at different heights all over itself, and components are all different heights too -- so finding the best-performing pad height across the entire PCB was painstaking lol.)

Anyway I got there in the end, and now I've got my memory overclocked at +915 with no errors; so it definitely made a difference 

Anyway, I'm beginning to waffle lol, and it's bed time; so hope your stuff arrives soon & it all goes well for you 

Did you decide if you're going with the liquid metal yet? I'm still desperate to try it on mine (I'm thinking I might do a few short runs to compare with Kryonaut, then remove it before it dries out and becomes too difficult to remove). I'll let you know how I get on lol.. (not been back to check that thread at techpowerup yet either -- that's my next port of call tomorrow night) 

P.S. I have 73 W/mK liquid metal sitting here lol (just not opened the pack yet ha)
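To put rough numbers on the pad-compression vs W/mK trade-off above: for pure bulk conduction, resistance is R = t / (k·A), so thinning (compressing) a pad cuts resistance in direct proportion to thickness. A quick sketch with made-up values follows; note it ignores contact resistance at the pad surfaces, which is where compression gives the extra win seen in practice:

```python
# 1-D bulk conduction estimate: R = t / (k * A), result in K/W.
# All numbers are illustrative, not measurements from any real card.
def pad_resistance(thickness_mm, k_w_per_mk, area_mm2):
    t_m = thickness_mm / 1000.0   # mm  -> m
    a_m2 = area_mm2 / 1e6         # mm^2 -> m^2
    return t_m / (k_w_per_mk * a_m2)

area = 10 * 10  # a 10mm x 10mm contact patch (VRM phase or memory chip)
print(pad_resistance(1.5, 6, area))   # uncompressed mid-grade pad
print(pad_resistance(1.0, 6, area))   # same pad squeezed to 1.0mm: ~33% lower R
print(pad_resistance(1.5, 11, area))  # thicker pad with a higher k rating
```

In this bulk-only model the higher-k pad also wins, so the "compression beats rating" observation is really about the interface: a firmly squeezed pad mates better with both surfaces, a term this formula leaves out.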


----------



## binormalkilla

So I finally got around to flashing the T4 BIOS and doing some more overclocking. So far I'm at 2126/5569 Firestrike Ultra stable. I noticed that even with the voltage at +100 I'm maxing out at 1.081V. I tried manually tuning the V/F curve to gradually ramp up the voltage with each +1 MHz increment, but I still topped out at 1.081V, as you can see with the crosshairs in the second screenshot.

Is there something I'm missing to allow a higher voltage with the founder's edition flashed with the t4 bios? I tried changing the voltage control setting to 'MSI extended' in afterburner, but that didn't seem to matter.

EDIT:

This is the BIOS I flashed:
https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803




EDIT2:

Ok I got the voltage to scale higher by tweaking the curve a bit while running Kombustor. Now to test for stability...


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> Those ambient temps can't help a lot lol.
> 
> I also got my shims from China (couldn't get the right size here). And they never offer fast delivery. (*not* even an *option* for *express* if u *want it)*
> It's always "*free* or very cheap *delivery*" *from* *China* but takes *2 millennia to arrive* lol.
> 
> When I was fiddling with pads on my GPU trying to get better mem and VRM temps; one thing to note is that a more "compressed" pad (like pushed harder with more pressure between component and block) delivers 3 fold better temperatures than going for a higher w/MK.
> 
> So I grabbed cheaper 10cm x 10cm slabs so I had enough to tinker until I found the best performance.
> 
> Never paid more than 3.95 for pads (thats about 4.50 in U.S dollars). For a 10cm x 10cm 1.0mm or 1.5mm
> 
> What I did notice with the higher w/MK though; is the pads did seem to be a bit "softer" Meaning they'd compress down more (delivering you good temps on *that* component without compromising on others around it.
> 
> And by "others" I mean.. sometimes if you go too high on a pad it affects the contact pressure on others around it (then temps on those components go higher).
> 
> I've got my GPU all probed out with thermostats attached to a fan controller and I measure everything from back of PCB. (If I position the probe correctly behind the relevant component) is great for an accurate reading). I was getting differences of 20c - 30c at the back of the same component with different pad sizes even when they made "light" contact vs "pressurised" contact with pads of the same w/MK rating.
> 
> (My block is at different heights all over it's self and components are all different heights too -- so finding the best performance pad height across the entire PCB was painstaking lol.
> 
> Anyway I got there in the end, and now I've got my memory overclocked at +915 with no errors; so it definitely made a difference
> 
> Anyway I'm beginning to waffle lol. and its bed time; so hope ur stuff arrives soon & it all goes well for you
> 
> Did u decide if ur going with the liquid metal yet? I'm still desperate to try it on mine (I'm thinking I might do a few short runs to compare with Kryonaut) then remove it before it dries out and becomes too difficult to remove. I'll let you know how I get on lol.. (not been back and checked that thread at tech-powerup yet either -- that's my next port of call tomorrow night)
> 
> P.S. I have 72 w/Mk liquid metal sitting here lol (just not opened the pack yet ha)


That's nice to hear, mate. I couldn't take that 1 W/mK thing EVGA gave us for free. And as they look exactly the same as the original pads on the VRAM and MOSFETs, I decided to replace those as well. I'm glad I bought pads thicker than necessary, so now I can squeeze them a little and get some better performance out of them. The pads for the backplate are 1.8mm thick, and I have bought some 2mm 6 W/mK to do the infamous EVGA thermal mod. I'll be able to cover the whole back of the MOSFETs plus the back of the VRAM and let it all make contact with the backplate for some extra dissipation. On the heatplate I spent more and got some 11 W/mK 1.5mm thick pads. Those should be enough to cover the whole VRM area and all VRAM modules, also with a nice squeeze on them, since I believe a little over 1mm thick would suffice to make contact.

I'm sitting on some Conductonaut too. And if I don't find anyone saying it damages the die itself, I'll use it to minimize the loss of using a shim between block and die. I also have Kryonaut to use on the block, to keep the gallium in the liquid metal from mixing with the copper and staining it.

Did you find anything about Conductonaut drying out and needing replacement very often, like happens with CLU? It would be a pain having to replace it every now and then...

Now, the heat... geez, I can't stand my card running at 3600rpm fan speed, sounding like a jet turbine and still breaking 70ºC. I miss winter so much lol
Quote:


> Originally Posted by *binormalkilla*
> 
> So I finally go around to flashing the t4 BIOS and doing some more overclocking. So far I'm at 2126/5569 Firestrike Ultra stable. I noticed that even with the voltage at +100 I'm maxing out at 1.081V. I tried manually tuning the VF curve to gradually ramp up the voltage with each +1 MHz increment, but I still topped out at 1.081V, as you can see with the crosshairs in the second screenshot.
> 
> Is there something I'm missing to allow a higher voltage with the founder's edition flashed with the t4 bios?


That's probably because you're flatlining the curve before reaching your max voltage point. Try setting a higher clock for your max voltage point and you'll see the voltage rise.
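The flatline effect can be sketched with a toy model: Boost settles on the lowest voltage point on the curve that already delivers the requested clock, so points above a plateau are simply never selected. The curve numbers below are hypothetical, not real VBIOS data:

```python
# Hypothetical Pascal V/F curve: voltage (V) -> clock (MHz).
curve = {
    1.050: 2100,
    1.063: 2126,
    1.081: 2126,  # plateau: same clock as the point below...
    1.093: 2126,  # ...so Boost never needs these voltages
}

def voltage_used(curve, target_mhz):
    """Lowest voltage whose curve point reaches the target clock."""
    for v in sorted(curve):
        if curve[v] >= target_mhz:
            return v
    return max(curve)  # target beyond the curve: run the max-voltage point

print(voltage_used(curve, 2126))  # stops at 1.063 despite the higher points
curve[1.093] = 2177               # raise the clock at the max-voltage point...
print(voltage_used(curve, 2177))  # ...and 1.093V actually gets selected
```

This is why dragging the top point higher in Afterburner's curve editor makes the reported voltage rise: the plateau disappears, so the upper points become the only ones that satisfy the requested clock.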


----------



## Buzzard1

Where can I get one of these cool T2 or T4 BIOSes for my EVGA FTW 1080? I get 2075 core @ 61c after 30 minutes of benching. Thanks


----------



## juniordnz

Quote:


> Originally Posted by *Buzzard1*
> 
> Where can I get one of these cool t2 or t4 bios for my EVGA FTW 1080? I get 2075 core @ 61c after 30 minutes of benching? Thanks


Just search for "T4" on this thread and you'll find some nice guys posting it a few pages back.

It's a nice BIOS, but intended for those who are running on water and can handle the heat. Otherwise you'll lose your OC to the thermal throttle and put unnecessary stress on the GPU.


----------



## arrow0309

Quote:


> Originally Posted by *binormalkilla*
> 
> So I finally go around to flashing the t4 BIOS and doing some more overclocking. So far I'm at 2126/5569 Firestrike Ultra stable. I noticed that even with the voltage at +100 I'm maxing out at 1.081V. I tried manually tuning the VF curve to gradually ramp up the voltage with each +1 MHz increment, but I still topped out at 1.081V, as you can see with the crosshairs in the second screenshot.
> 
> Is there something I'm missing to allow a higher voltage with the founder's edition flashed with the t4 bios? I tried changing the voltage control setting to 'MSI extended' in afterburner, but that didn't seem to matter.
> 
> EDIT:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> This is the BIOS I flashed:
> https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803
> 
> 
> 
> 
> 
> 
> EDIT2:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ok I got the voltage to scale higher by tweaking the curve a bit while running Kombustor. Now to test for stability...


Hi, nice block you got there, I mean in your sig (Heatkiller IV Acetal)








I was looking to choose a wb for a new card (Titan X Pascal) and couldn't decide yet between your Watercool and the EK nickel acetal.
Can you tell me anything about it?
Did you have any way to compare it or know some review (even with the 1080)?
Is it restrictive?
Nice wb however, cheers!


----------



## owikhan

Where can I buy a water block for the Zotac 1080 AMP Edition? Please, I need guidance.


----------



## binormalkilla

Quote:


> Originally Posted by *arrow0309*
> 
> Hi, nice block you got there, I mean in your sig (Heatkiller IV Acetal)
> 
> 
> 
> 
> 
> 
> 
> 
> I was looking to choose a wb for a new vga (Titan X Ps) and couldn't decide yet between your Watercool and the EK nickel acetal.
> Can you tell me anything about it?
> Did you have any way to compare it or know some review (even with the 1080)?
> Is it restrictive?
> Nice wb however, cheers!


I was pleasantly surprised by the quality of the block to be honest. I've typically gone with EK for all of my GPU blocks in the past, so this was a departure from the norm. My only complaint is their choice to use Torx screws on the block. Other than that I really like the look, and the performance is great.


----------



## arrow0309

Quote:


> Originally Posted by *binormalkilla*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> Hi, nice block you got there, I mean in your sig (Heatkiller IV Acetal)
> 
> 
> 
> 
> 
> 
> 
> 
> I was looking to choose a wb for a new vga (Titan X Ps) and couldn't decide yet between your Watercool and the EK nickel acetal.
> Can you tell me anything about it?
> Did you have any way to compare it or know some review (even with the 1080)?
> Is it restrictive?
> Nice wb however, cheers!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was pleasantly surprised by the quality of the block to be honest. I've typically gone with EK for all of my GPU blocks in the past, so this was a departure from the norm. My only complaint is their choice to use Torx screws on the block. Other than that I really like the look, and the performance is great.

Thanks!
I'll probably go for a Watercool wb & bp as well








I've just discovered that I also like the silver / plexy nickel block


----------



## IronAge

Quote:


> Originally Posted by *owikhan*
> 
> Where from i buy Zotac 1080 Amp edition Water block please need guidance


The only waterblock for this card comes from Bitspower:

https://www.caseking.de/bitspower-nvidia-gtx-1080-reference-acrylic-clear-wach-462.html


----------



## HeLeX63

Just watercooled my Gainward GTX 1080 Phoenix with a full-cover EK waterblock, and I am only able to reach 2,126 with occasional drops to 2,114MHz. This is up from 2025 - 2063MHz on the air cooler, which ran at 67-72C @ 100% fan speed. Now the GPU runs at a max of 52C with an ambient of about 33C.

I was able to push memory from +400 to at least +500 but haven't tried further.

How does this match up with others? Do I have a bad chip, or is this at least not a total overclocking failure?

Cheers


----------



## juniordnz

Quote:


> Originally Posted by *HeLeX63*
> 
> Just watercooled my Gainward GTX 1080 Pheonix with a full cover EK waterblock, and I am only able to reach 2,126 with occasional drops to 2,114MHz. This is up from my 2025 - 2063MHz on the air cooler which ran from 67-72C @100% fan speed. Now GPU runs at a max of 52C with an ambient of about 33C.
> 
> I was able to push memory from +400 to at least +500 but haven't tried further.
> 
> How does this match with others? Do I have a bad chip or is this by no means a total overclocking failure?
> 
> Cheers


I can get 2126mhz and +575 on memory with stock 1.063V on air.

The drop on core clock will depend on temperature. Mine usually drops to 2100mhz (card below 60ºC) and 2088mhz on very hot days.
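Those drops land in discrete steps because GPU Boost 3.0 sheds clock in ~13 MHz bins as the core warms through temperature thresholds. A rough model follows; the bin size and thresholds are community-observed assumptions, not an official NVIDIA spec, and vary per card and VBIOS:

```python
# Rough sketch of Boost 3.0 temperature downclocking: lose one ~13 MHz bin
# roughly every 9 C once the core passes ~37 C. All parameters here are
# assumptions from community observation, purely for illustration.
def boosted_clock(cool_clock_mhz, temp_c, bin_mhz=13, start_c=37, step_c=9):
    if temp_c <= start_c:
        return cool_clock_mhz
    bins_lost = (temp_c - start_c + step_c - 1) // step_c  # ceiling division
    return cool_clock_mhz - bins_lost * bin_mhz

for t in (30, 45, 55, 70):
    print(t, boosted_clock(2126, t))
```

With these assumed parameters a card boosting to 2126 when cold sits around two bins lower in the 50s C, which is in the same ballpark as the behavior described above.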


----------



## nrpeyton

Quote:


> Originally Posted by *HeLeX63*
> 
> Just watercooled my Gainward GTX 1080 Pheonix with a full cover EK waterblock, and I am only able to reach 2,126 with occasional drops to 2,114MHz. This is up from my 2025 - 2063MHz on the air cooler which ran from 67-72C @100% fan speed. Now GPU runs at a max of 52C with an ambient of about 33C.
> 
> I was able to push memory from +400 to at least +500 but haven't tried further.
> 
> How does this match with others? Do I have a bad chip or is this by no means a total overclocking failure?
> 
> Cheers


At an ambient of 22c my card maxed out at about 35-40, so that is probably about right. If my ambient was 10c hotter I'd probably have been getting 45-50, which is roughly what you're getting.

You've still knocked 20c off by going to water.

Also FE cards are running at 85c+ so you're not doing too bad.

Only way you'll get that down further is to lower ambient or grab a water chiller or extreme cooling.

2126 is also probably just above average, but definitely not 'great' either.


----------



## HeLeX63

Quote:


> Originally Posted by *juniordnz*
> 
> I can get 2126mhz and +575 on memory with stock 1.063V on air.
> 
> The drop on core clock will depend on temperature. Mine usually drops to 2100mhz (carded below 60ºC) and 2088mhz on very hot days.


Is that under water though ? I can only get 2,126 max and as low as 2,114 @1.093 to 1.075V


----------



## Derek1

Quote:


> Originally Posted by *HeLeX63*
> 
> Is that under water though ? I can only get 2,126 max and as low as 2,114 @1.093 to 1.075V


Mine does 2152 Core and 11600 Mem never going above 50C @ 1.081v with ambient @ 25C on a FTW Hybrid AIO conversion.


----------



## juniordnz

Quote:


> Originally Posted by *HeLeX63*
> 
> Is that under water though ? I can only get 2,126 max and as low as 2,114 @1.093 to 1.075V


No, that's on air with the stock FTW cooler.

Just above average, as Nick said, but nothing special. 2126mhz seems to be the max most of the cards get. Then there are those who can get higher clocks, and the unlucky ones who can't even break 2050.

With 1.093V I can get 2177mhz. But it's not worth it until I can get the card on water.

BTW, is there a way to undervolt the card and keep the adaptive voltage? (Don't wanna lock the clock/voltage curve).
Quote:


> Originally Posted by *Derek1*
> 
> Mine does 2152 Core and 11600 Mem never going above 50C @ 1.081v with ambient @ 25C on a FTW Hybrid AIO conversion.


Did you test whether the performance keeps increasing after +575? On your card is it a constant curve upwards until the memory fails?


----------



## HeLeX63

Quote:


> Originally Posted by *Derek1*
> 
> Mine does 2152 Core and 11600 Mem never going above 50C @ 1.081v with ambient @ 25C on a FTW Hybrid AIO conversion.


That's pretty darn good.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Did you find anything about conductonaut drying out and needing replacement very often like happens with CLU? It would be a pain havin to replace it every now and then...
> .


Not had a chance as was adding heatsinks to my memory VRM tonight but my new pack of paste comes tomorrow so I'll rip the block off my CPU and see if there's any damage or drying out.

I didn't mind taking the risk on my old CPU as I'll be going ZEN soon, but I'll let you know tomorrow what the result is.

I've had the Conductonaut on it for about 2 months. Just never touched GPU (the CPU was the test subject lol).


----------



## IronAge

Quote:


> Originally Posted by *HeLeX63*
> 
> That's pretty darn good.


My Phoenix does 2152 with 1.031V - 1.043V ... with the stock cooler at 80%-100% fan speed.

I even got it through FSE with the GPU @ 2164 with less VDDC / undervolting. So probably your GPU does not like overvolting that much either.

With the Phoenix GLH / Gamerock Premium review BIOS I get 5640 VRAM.

With any other BIOS (EVGA FTW Master for instance) I get much lower VRAM clock rates, but higher scores when I use the same clock rates as with the Gainward VBIOS.

So I came to the conclusion that the Palit/Gainward VBIOS must have very tame memory timings.

You probably want to try the Phoenix GLH / Gamerock Premium review BIOS too.

The highest scores/performance I got were with the Asus T4 BIOS without overvoltage: almost 12200 graphics score in 3DMark Fire Strike Extreme.

OS is Windows 10 ... I tried the same settings/benchmarks with Windows 7 and got about a 200-250 points lower graphics score.


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> No, that's on air with the stock FTW cooler.
> 
> Just above average as nick said, but nothing special. 2.126mhz seems to be the max most of the cards get. Then there are those who can getc higher clocks and the unlucky ones who can't even break 2050.
> 
> With 1.093V I can get 2177mhz. But it's not worth until I can get the card on water.
> 
> BTW, is there a way to undervolt the card and keep the adaptive voltage? (Don't wanna lock the clock/voltage curve).
> 
> Did you test if the performance keeps increasing after +575? On your card it's like a constant curve upwards until the memory fails?


Yes, it still increases; anything over +800, though, and I start to artifact. FPS only goes up marginally, 1-2 fps per 50-100MHz jump, but the graphics score increases more significantly. That's finally how I managed to go over 25k in FS.
ETA: I can get to 2177 but FS shuts down. I typically start off at 2164 and then throttle down to 2152 at 40-43C. I've been meaning to do some serious testing with the T4 to see what the extra volts will do, but haven't got around to it.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Yes, still increases, anything over +800 though and I start to artifact. FPS only marginally, 1-2 fps per jump @50-100mhz, but graphics score more significantly. It was finally how I managed to go over 25k in FS.
> ETA I can get to 2177 but FS shuts down. I typically start off at 2164 and then throttle down to 2152 at 40-43C. I have been meaning to do some serious testing with the T4 to see what the extra volts will do but haven't got around to it.


Before I got my chiller the Classified voltage tool wasn't doing anything for memory; now I'm getting to +875 without issue (but only when I up the memory voltage a little).

I've not tried it on the core yet.

That's with a water temp of about 10-13C. Memory idles between 17-19C (about 5-6C below ambient), then maxes out at about 35C on the hottest chips closest to the VRM.

The core seems to track the water temperature better than any other component on the card. Naturally, I suppose, as there's no pad between core and block, whereas everything else is affected by thermal pads.

Quote:


> Originally Posted by *IronAge*
> 
> My Phoenix does 2152 with 1.031V - 1.043V ... with stock cooler with 80%-100% rpm.
> 
> I even got it through FSE with GPU @ 2164 with less VDDC / undervolting. So probably your GPU does not like OV that much too.
> 
> With the Phoenix GLH/Gamrock Premium Review Bios i get 5640 VRAM .
> 
> With any other Bios (EVGA FTW Master for instance) i get much lower VRAM clock rates but higher scores when i use the same clock rates as with the Gainward VBios.
> 
> So i came to the conclusion that the Palit/Gainward VBIOS must have very tame memory timings.
> 
> Probably you want to try the Phoenix GLH /Gamerock Premium Review Bios too.
> 
> Highest Scores/performance i got with the Asus T4 Bios without overvoltage, almost 12200 Graphics Score with 3D Mark Firestrike Extreme.
> 
> OS is Windows 10 ... i tried the same Settings/Benchmarks with Windows 7 and got about 200-250 Points lower graphics Score.


Hmm, interesting; so part of the reason the T4 BIOS gets everyone higher scores is down to memory timings? That explains it; I always wondered about that.


----------



## binormalkilla

Quote:


> Originally Posted by *HeLeX63*
> 
> Is that under water though ? I can only get 2,126 max and as low as 2,114 @1.093 to 1.075V


Mine is stable (3DMark stress test) at 2126 at 1.062V on an FE card flashed with the T4 BIOS. I set the VF curve to 1.2V, tweaking the frequency to find the max stable frequency at that voltage. Interestingly, this card doesn't benefit from 1.2V: 3DMark Firestrike Extreme would abort at even +230 MHz, just one step up from my stable +225 MHz @ 1.062V. I still need to test by slightly increasing voltage and frequency. I get around a 13C delta-T from idle to load in 3DMark Firestrike Ultra. Since flashing T4, I don't ever hit any limits other than the load limit.


----------



## HeLeX63

Quote:


> Originally Posted by *binormalkilla*
> 
> Mine is stable (3dmark stress test) at 2126 at 1.062V on FE card flashed with the T4 BIOS. I set the VF curve to 1.2V, tweaking the frequency to find the max stable frequency at this voltage. Interestingly, this card doesn't benefit from 1.2V. 3dmark Firestrike Extreme would abort at even +230 MHz, which is an increase from +225 Mhz @ 1.062V. I still need to test by slightly increasing voltage and frequency. I get around a 13C delta T from idle to load in 3dmark Firestrike Ultra. Since flashing T4, I don't ever hit any limits other than load limit.


How do I flash my non-FE card to a better BIOS? Will that even do anything for me when overclocking?


----------



## binormalkilla

Quote:


> Originally Posted by *HeLeX63*
> 
> How do I flash my non FE card to a better bios ? Will that even do anything for me when overclocking ?


You need to download NVFlash (I got it from Guru3D's site), then flash the BIOS after disabling the display adapter in Device Manager. There's a decent guide on this site somewhere; just search for nvflash and you should see it. Flashing your BIOS allows you to increase the voltage further.


----------



## HeLeX63

Quote:


> Originally Posted by *binormalkilla*
> 
> You need to download NVFLash (I got it from Guru3d's site), then flash it after disabling the adapter in device manager. There is a decent guide on this site somewhere, just search for nvflash and you should see it. Flashing your BIOS allows you to increase the voltage further.


Got it already. But how do I go about getting the right BIOS? Surely I can't just use any BIOS; I need a Gainward one for my card.


----------



## binormalkilla

Quote:


> Originally Posted by *HeLeX63*
> 
> Got it already. But how do I go about getting the right BIOS ? Surely I cant just use any BIOS, I need a Gainward one for my card.


The only caveat I know of when flashing a different BIOS on GTX 1080s is fan control. If you flash the T4 BIOS to an FE card, the fan-control portion of the BIOS doesn't let the high-RPM blower-style fan reach its full speed. For you, that shouldn't matter. Presumably the BIOS doesn't have any logic tied to specific power components on the PCB, so differing board designs from various vendors shouldn't matter when you flash it. Essentially, you should be able to flash any BIOS without issues. This isn't the case with all graphics cards, FYI.

So to clarify, you should flash the BIOS I linked. If you have integrated graphics on your CPU, you're totally safe in case you get a bad flash. I think it's a bit silly that NVFlash runs in Windows instead of a bootable DOS environment, but I guess I'm old school.


----------



## nrpeyton

Hi,

Does anyone know if Liquid Electrical Tape is easy to remove? Does it just peel off like a bit of tape or a bit of thermal pad? (Or will I be spending 2 days scratching and poking and wanting to kill myself with frustration?) lol

I'm thinking of using it to insulate my GPU against condensation.

But I don't want any residue, as I'm quite "anal" about things being clean and tidy. Also, the idea of never being able to return the card to its factory condition would kind of sadden me.

Thanks,

Nick


----------



## binormalkilla

Quote:


> Originally Posted by *nrpeyton*
> 
> Hi,
> 
> Anyone know if Liquid Electrical Tape is easy to remove? Does it just peel off like a bit of tape or a bit of thermal pad? (or will I be spending 2 days scratching and poking and wanting to kill myself with frustration)? lol
> 
> Thinking of using this to insulate my GPU against condensation.
> 
> But don't want any residue as I'm quite "anal" about things being clean and tidy. Also the idea of never being able to return the card to its factory condition would kind of sadden me.
> 
> Thanks,
> 
> Nick


If I were you I would look into using conformal coating to protect your PCB. If you properly protect your PCB components you should be fine. The GPU is a sealed component, so you won't stand to gain anything from protecting it from condensation.


----------



## nrpeyton

Quote:


> Originally Posted by *binormalkilla*
> 
> If I were you I would look into using conformal coating to protect your PCB. If you properly protect your PCB components you should be fine. The GPU is a sealed component, so you won't stand to gain anything from protecting it from condensation.


Sorry, I meant to say PCB 

It is a full-cover block, so yes.

Does "conformal coating" peel off? I'll have to google that one.

Anyhow, thanks for the reply


----------



## HeLeX63

Quote:


> Originally Posted by *binormalkilla*
> 
> The only caveat to flashing a different BIOS for GTX 1080's that I know of is the fan control. If you're flashing a T4 BIOS to a FE card, the fan control portion of the BIOS doesn't allow the high RPM blower style fan to reach a higher RPM. For you, it shouldn't matter. Presumably the BIOS doesn't have any logic that controls any of the power components on the PCB, so any different board design from various vendor's won't matter when you flash the BIOS. Essentially you should be able to flash any BIOS without any issues. This isn't the case will all graphics cards, FYI.
> 
> So to clarify, you should flash the BIOS I linked. If you have integrated graphics on your CPU you're totally safe in case you get a bad flash. I think it's a bit silly that NVFlash runs in Windows instead of a bootable DOS environment, but I guess I'm old school.


Ok cool, I'll give it a try. My GPU has a dual BIOS too.

Where is the link? I can't see it.


----------



## GreedyMuffin

Tested with [email protected], same WU/task. Some results, with wattage draw for the whole system (6700K @1.200V fixed):

2138mhz - 1.050V - 282.37 Watt

2100mhz - 1.000V - 264.34 Watt

2050mhz - 0.950V - 243.32 Watt

2000mhz - 0.900V - 228.39 Watt

1950mhz - 0.850V - 213.28 Watt

PSU: AX1500i. Wattage was measured at the wall. Efficiency is at 91% or so.
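Since those readings are taken at the wall, the components themselves draw less. A quick sketch of that conversion, assuming the quoted ~91% efficiency holds across the whole load range (in reality PSU efficiency varies with load; the variable names here are mine, not from the post):

```python
# Convert the wall readings above to estimated DC-side (component) draw,
# assuming a flat ~91% PSU efficiency. Figures are from the post above.
PSU_EFF = 0.91

wall_readings = {2138: 282.37, 2100: 264.34, 2050: 243.32, 2000: 228.39, 1950: 213.28}

for mhz, wall_w in wall_readings.items():
    dc_w = wall_w * PSU_EFF  # power actually delivered to the system
    print(f"{mhz} MHz: {wall_w:.2f} W at the wall ~ {dc_w:.1f} W DC")
```

So the 282.37 W wall figure works out to roughly 257 W actually delivered to the system.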


----------



## spboss

Does anyone have a Gigabyte GTX 1080 Waterforce?

Mine's running stock clock speeds of 2088MHz boost and 5000MHz memory.

How much further could I push it?


----------



## HeLeX63

Quote:


> Originally Posted by *spboss*
> 
> Does anyone have a Gigabyte GTX 1080 Waterforce?
> 
> Mine's running stock clock speeds as 2088mhz on boost and 5000mhz on memory.
> 
> How much further could I push it?


No need to ask. Just go ahead and increase the core clock in 20MHz increments; you'll soon find its limit when it causes artifacts on screen or crashes. Once that happens, you can dial back 5MHz at a time to get the max stable OC.
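That coarse-up, fine-down search can be sketched as a small routine. This is only an illustration of the procedure described above; `is_stable` stands in for an actual stress test (a benchmark or game loop), and the function name and cap are mine:

```python
# Sketch of the OC search above: climb the core offset in +20 MHz steps
# until the stress test fails, then dial back in -5 MHz steps.
def find_max_offset(is_stable, coarse=20, fine=5, limit=400):
    offset = 0
    # coarse climb while the next step up still passes the stress test
    while offset + coarse <= limit and is_stable(offset + coarse):
        offset += coarse
    # step once into unstable territory, then back off in fine steps
    probe = offset + coarse
    while probe > offset and not is_stable(probe):
        probe -= fine
    return probe if is_stable(probe) else offset

# Toy example: pretend the card is stable up to a +143 MHz offset.
print(find_max_offset(lambda off: off <= 143))  # -> 140
```

With a card stable up to +143, the coarse climb stops at +140, +160 fails, and backing off in 5 MHz steps lands on +140 as the reported maximum.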


----------



## HeLeX63

After doing some more unfortunate testing, I've realized I probably have one of the worst GTX 1080s.

Looks like I'm settling at a max boost of 2,114MHz, which quickly drops to 2,101MHz, which is where it stays MOSTLY. Occasionally it gets as low as 2,088MHz.









This is with 100% max voltage locked at 1.093V. Temps stabilize and max out at about 54C with a single 480mm x 60mm-thick radiator, with an OC'd i7 4790K @4.7GHz and a Gainward GTX 1080 Phoenix in the loop. It's summer here in Australia, so the max load temps are to be expected.

Is there anything else available to me? I was really hoping for a solid 2,139 to 2,164MHz range...


----------



## nrpeyton

Quote:


> Originally Posted by *HeLeX63*
> 
> After doing some more unfortunate testing, I have realized I probably have one of the worst GTX 1080's.
> 
> Looks like I'm settling at a max boost of 2,114MHz, which soon quickly drops to 2,101MHz which is what it stays on MOSTLY. Occasionally it gets as low as 2,088MHz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is with 100% max voltage locked at 1.093V. Temps stabilize and max at about 54C with a single 480mm x 60mm thick radiator, with an OC'd i7 4790k @4.7GHz and Gainward GTX 1080 Pheonix in the loop. It is summer here in Australia so the max load temps can be expected.
> 
> Is there anything else available to me ? I was really hoping for a solid 2,139 to 2,164MHz range...


Is your GPU under water? You could grab a water chiller. I went from 2151 to 2252 with a 10C water temp, and I haven't even taken it down to its lowest setting yet or tried upping the voltage past 1.093.

I also got extra headroom on the memory, up to +875 (as the block is full-cover). At a 10C water temp my memory idles at 14-15C (measured with probes), and that's only with 2.8 W/mK 1mm pads.

More details a few pages back.

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Tested with [email protected] Same WU/Task. Some results with wattage draw from the whole system (6700K @1.200V fixed)
> 
> 2138mhz - 1.050V - 282.37 Watt
> 
> 2100mhz - 1.000V - 264.34 Watt
> 
> 2050mhz - 0.950V - 243.32 Watt
> 
> 2000mhz - 0.900V - 228.39 Watt
> 
> 1950mhz - 0.850V - 213.28 Watt
> 
> PSU: AX1500I. Wattage was drawn from the wall. Efficiency at 91% or so.


That seems quite low; my GPU under full stress at max power (130% on the slider) draws up to 320W on its own.

The full system is about 525W - 550W+.

Is your CPU overclocked?

Does anyone know if applying conformal coating to my PCB will invalidate the warranty? I'm actually surprised manufacturers don't just apply it at the factory.


----------



## HeLeX63

Quote:


> Originally Posted by *nrpeyton*
> 
> Your GPU under water? You could grab a water chiller. I got from 2151 to 2252 with a 10c water temp. Not even had it down to its lowest setting yet or tried upping the voltage past 1.093.


Yeah, it's under water. Max temp is about 50ish...

What's a water chiller?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I have two classy blocks I'd let go real cheap. They're just sitting there. One brand new in the box. Never even been opened.


Hmm, I can't believe I only just saw this post (the only reason I saw it is that someone asked me something and I had to go a few pages back to find the link).

Normally I get notified by email of any quotes of my messages. I must have missed this one for some reason.

Anyway, that is very interesting; I'll have a think about it 

Quote:


> Originally Posted by *HeLeX63*
> 
> Yeah its under water. Max temp about 50ish...
> 
> Whats a water chiller ?


*I can go as low as a 3C water temp.*

*full size* - right click & 'open new tab'


----------



## HeLeX63

Quote:


> Originally Posted by *nrpeyton*
> 
> Hmm I can't believe I only just seen this post (only reason I seen it is because someone asked me something that I had to go a few pages back to find the link)


That's insane haha.

But yeah, I'm not willing to go that far at all. 50C in summer is still pretty good to me, and better than my 73C. When winter hits, I can expect max load temps in the 40s.

As for the T4 BIOS, is it even worth it?


----------



## nrpeyton

Quote:


> Originally Posted by *HeLeX63*
> 
> That's insane haha.
> 
> But yeah I'm not willing to go that far at all. 50C in summer to me is still pretty good and better than my 73C. When winter hits, I can expect max load to be in the 40's.
> 
> As for the T4 bios, is it even worth it ?


*I *think*, but don't quote me on this; I've heard rumours:*

It has tighter memory timings, which results in slightly better scores.

*I'm sure about this, though:*

It also has an increased power limit, which is great for cards that are heavily power limited.

If you have the cooling hardware, it even has an increased max voltage, up to 1.2v.

It also has no temp limit, so you can fry your card if you're not careful: if your fans or pump failed and you never noticed, the card would melt down before it throttled.


----------



## ucode

Quote:


> Originally Posted by *HeLeX63*
> 
> After doing some more unfortunate testing, I have realized I probably have one of the worst GTX 1080's.


IMHO that's nonsense; there's nothing wrong with those clocks, probably a little better than mine on a Galax FE. Sure, you can raise the voltage with the T4 VBIOS and get 2.2GHz and 26k in FS, but I'm not sure it's worth the extra power. You have a good card; enjoy it and stop worrying over a few MHz.









Edit: Forgot to mention that while the T4 VBIOS doesn't have a software limit for temperature, it still shows a 96C HW throttling limit and a 99C shutdown temp. FWIW, my fan seems to have failed at some point while on T4 and registered 96C in HWiNFO, so the hardware limit likely works.


----------



## HeLeX63

Quote:


> Originally Posted by *ucode*
> 
> IMHO that's nonsense, nothing wrong with those clocks, probably a little better than mine with Galax FE. Sure, can raise voltage with T4 VBIOS and get 2.2GHz and FS at 26k but for the extra power not sure it's worth it. You have a good card, enjoy it and stop worrying over a few MHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Forgot to mention that while the T4 VBIOS doesn't have a software limit for temperature it still shows a 96C HW throttling limit and a 99C shutdown temp and FWIW seems my fan may have failed at some time with T4 and registered 96C in HWiNFO so likely appears to work.


Yeah, you're right. I'll stick with my 2,114 and 2,101 clocks and be content


----------



## HeLeX63

Quote:


> Originally Posted by *ucode*
> 
> IMHO that's nonsense, nothing wrong with those clocks, probably a little better than mine with Galax FE. Sure, can raise voltage with T4 VBIOS and get 2.2GHz and FS at 26k but for the extra power not sure it's worth it. You have a good card, enjoy it and stop worrying over a few MHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Forgot to mention that while the T4 VBIOS doesn't have a software limit for temperature it still shows a 96C HW throttling limit and a 99C shutdown temp and FWIW seems my fan may have failed at some time with T4 and registered 96C in HWiNFO so likely appears to work.


Is your card on air or water?


----------



## ucode

I started on air but it was too noisy for me, so now I'm on water. The 26k was with water, the 96C with air. On the stock FE VBIOS with water, my clocks were probably near or a little worse than yours.


----------



## ROKUGAN

Quote:


> Originally Posted by *juniordnz*
> 
> No, that's on air with the stock FTW cooler.
> 
> Just above average as nick said, but nothing special. 2.126mhz seems to be the max most of the cards get. Then there are those who can getc higher clocks and the unlucky ones who can't even break 2050.


It's actually funny, because I've been reading several posts from last week with lots of people reporting a max OC of 2,126MHz.
During the holidays I flashed and tested the Asus T4 BIOS on my Zotac 1080 AMP! Extreme, and the max OC was... guess what? 2,126MHz

Tbh I was hoping to reach higher frequencies with the T4 BIOS, as temps stay in the mid 60s despite the higher voltage, thanks to the Zotac Extreme's behemoth cooler.
Unfortunately it's the same max frequency as with the original BIOS, which was already quite good for this model.
I've only been able to achieve marginal gains (+100 points in Firestrike Ultra).

Looking at the pics and temps of the extreme WC setup nrpeyton needed to get +100MHz more, it's pretty clear this card is maxed out and we're hardly going to squeeze much more out of it. I've also run out of hope for Pascal BIOS tweaking.

Actually, I'm already looking forward to the aftermarket 1080 Ti models to continue the OC fun


----------



## binormalkilla

Quote:


> Originally Posted by *nrpeyton*
> 
> Sorry, I meant to say PCB
> 
> It is a full cover block so yes,
> 
> Does "conformal coating" peel off? I'll have to google that one.
> 
> Anyhow thanks for reply


I've never used it myself, but it's standard for protecting PCBs in wet environments, mostly in industrial applications, though you could try it out. Conformal coating is a spray-on dielectric compound and will be permanent. There should be some older threads in the phase-change cooling section of this forum. Here's an example of conformal coating in action:
http://www.overclock.net/t/1393232/hwbot-a-style-waterproof-by-conformal-coating-live-demo/0_50

Also, to clarify, this isn't something you place between a heatsink and a component; it probably has fairly low thermal conductivity.


----------



## binormalkilla

Quote:


> Originally Posted by *ROKUGAN*
> 
> It´s actually funny because I´ve been reading several posts from last week and lots of people reporting max OC @ 2.126 Mhz.
> During the holidays I flashed and tested the Asus T4 bios in my Zotac 1080 Amp! Extreme and the max OC was...guess what? 2.126 Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> Tbh I was hoping to achieve higher frequencies with the T4 bios, as temps are still in the mid 60´s despite higher voltage thanks to the Zotac Extreme behemoth cooler.
> Unfortunately its the same max frequency as with the original Bios, which was already quite good for this model.
> I´ve been able to achieve only marginal gains (+100 points in Firestrike Ultra).
> 
> Looking at the pics and temps of the extreme WC setup done by nrpeyton (
> 
> 
> 
> 
> 
> 
> 
> ) to ahieve +100Mhz higher, it´s pretty clear that this card is maxed out and we are hardly going to squeeze much more out of it. Also run out of hope about Pascal´s bios tweaking.
> 
> Actually already looking forward to the aftermarket models of the 1080ti to continue the OC fun


This is really strange... I maxed out at this frequency as well. I haven't tried bumping the voltage a little at a time, but I suspect it won't matter. I tweaked the curve to hit 1.2V at a mere +230 MHz (instead of +225 MHz) and it didn't affect stability one bit. I'm having too much fun gaming, so I haven't bothered OCing much more, though. I wonder if there's something in the driver keeping us at 2126 MHz. With Firestrike Ultra, I went from 5268 to 6024 after flashing T4 and OCing. I'll need to re-run with the default frequency and my stable OC to confirm these values, however.

What is everyone's experience with memory OC? I've read that eventually error correction kicks in and loosens the timings, so a higher frequency can be stable yet yield equal or lower performance.


----------



## Derek1

Quote:


> Originally Posted by *ROKUGAN*
> 
> It´s actually funny because I´ve been reading several posts from last week and lots of people reporting max OC @ 2.126 Mhz.
> During the holidays I flashed and tested the Asus T4 bios in my Zotac 1080 Amp! Extreme and the max OC was...guess what? 2.126 Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> Tbh I was hoping to achieve higher frequencies with the T4 bios, as temps are still in the mid 60´s despite higher voltage thanks to the Zotac Extreme behemoth cooler.
> Unfortunately its the same max frequency as with the original Bios, which was already quite good for this model.
> I´ve been able to achieve only marginal gains (+100 points in Firestrike Ultra).
> 
> Looking at the pics and temps of the extreme WC setup done by nrpeyton (
> 
> 
> 
> 
> 
> 
> 
> ) to ahieve +100Mhz higher, *it´s pretty clear that this card is maxed out and we are hardly going to squeeze much more out of it.* Also run out of hope about Pascal´s bios tweaking.
> 
> Actually already looking forward to the aftermarket models of the 1080ti to continue the OC fun


Never!

https://rog.asus.com/articles/news/fastest-gtx-1080-gpu-boost-clock--2645mhz/

http://videocardz.com/60923/galax-overclocks-gtx-1080-to-2-2-ghz-on-air-2-5-ghz-with-ln2

And also reports of an EVGA FTW going to 2800.

And 2 guys in this thread did 2300.

There must be a way to punch through.


----------



## ucode

Quote:


> Originally Posted by *binormalkilla*
> 
> What are everyone's experience with memory OC? I've read that eventually error correction kicks in and loosens the timings, so higher frequency can be stable, yet yield equal or less performance.


There isn't any memory error correction on these boards; the EDC line can, however, be used for retraining purposes. Memory OC on the 1080 is buggy too.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Never!
> 
> https://rog.asus.com/articles/news/fastest-gtx-1080-gpu-boost-clock--2645mhz/
> 
> http://videocardz.com/60923/galax-overclocks-gtx-1080-to-2-2-ghz-on-air-2-5-ghz-with-ln2
> 
> And also reports of an EVGA FTW going to 2800.
> 
> And 2 guys in this thread did 2300.
> 
> There must be a way to punch through.




My best so far.

The only way to punch through is more cooling 

Even at these low temps, I'm still not seeing voltage do much on the core. That 2265MHz was achieved at 1.093V; 1.15V or even 1.2V wasn't making the run any more stable or getting me any higher.

I am, however, getting decent memory gains with extra VRAM voltage. (The GDDR5X memory seems to respond nicely to lower temps plus a little voltage boost.)

I'll post something else tomorrow showing the memory gains.


----------



## binormalkilla

Quote:


> Originally Posted by *ucode*
> 
> There isn't any memory error correction on these boards, the EDC line can however be used for retraining purposes. Memory OC on 1080 is buggy too.


What do you mean by 'EDC line'?


----------



## ucode

@binormalkilla These should help

https://www.micron.com/~/media/documents/products/technical-note/dram/tned01_gddr5_sgram_introduction.pdf

https://www.micron.com/~/media/documents/products/technical-note/dram/tned02_gddr5x.pdf


----------



## ROKUGAN

Quote:


> Originally Posted by *Derek1*
> 
> Never!
> 
> https://rog.asus.com/articles/news/fastest-gtx-1080-gpu-boost-clock--2645mhz/
> 
> http://videocardz.com/60923/galax-overclocks-gtx-1080-to-2-2-ghz-on-air-2-5-ghz-with-ln2
> 
> And also reports of an EVGA FTW going to 2800.
> 
> And 2 guys in this thread did 2300.
> 
> There must be a way to punch through.


LOL, well I'm talking about "normal" OC for 24/7 usage. If you want to pour liquid nitrogen while you play Witcher 3, be my guest

Also, silicon lottery. The fact that someone hit 2300MHz doesn't mean your card can do it; it's already pretty rare to get past 2200MHz without really serious cooling or a rare lottery-winner card.
Having said that, I do regret that Nvidia locked down the BIOS, as tweaking is a big part of the fun for enthusiasts, and the Pascal range is really boring in that respect. I had much more fun squeezing my 980 Ti.


----------



## ROKUGAN

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> My best so far.
> 
> Only way to punch through is more cooling
> 
> Even at these low temps, I'm still not seeing voltage doing much on the core. That 2265mhz was achieved at 1.093. 1.15v or even 1.2v wasn't making the run anymore stable or getting me any higher.
> 
> I am however getting decent memory gains with extra VRAM voltage. (GDDR5X memory seems to be responding nicely to lower temps + a little voltage boost)
> 
> I'll post something else tomorrow showing memory gains.


GPU Temp @ 9C is just HILARIOUS


----------



## nrpeyton

Quote:


> Originally Posted by *ROKUGAN*
> 
> GPU Temp @ 9C is just HILARIOUS


6c at idle, 9-11c at load. Trying to think of a way to get the 'at load' to 5c.

Not figured out how yet.


----------



## Tdbeisn554

Quote:


> Originally Posted by *nrpeyton*
> 
> 6c at idle, 9-11c at load. Trying to think of a way to get the 'at load' to 5c.
> 
> Not figured out how yet.


Wow dude! Seems like your chiller works great.

11c on load is just crazy Xp


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> 6c at idle, 9-11c at load. Trying to think of a way to get the 'at load' to 5c.
> 
> Not figured out how yet.


Liquid metal on da core. I have it. Works perfectly!


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> Liquid metal on da core. I have it. Works perfectly!


With Kryonaut (normal paste) I only get a 6-8C difference between water temp and load temp.

With Furmark, maybe a 9-10C difference between water and GPU load temp.

What are you getting with the liquid metal, between water and load?

And is your block nickel-plated? How does that do with the LM?

More importantly, what about the top of the die? Can it all be removed without discolouration or permanent scratching?

I *couldn't* get all of it off my copper CPU block (despite scrubbing with a non-abrasive cloth, first with an alcohol-based TIM cleaner, then a citrus-based cleaner, and then even boiling water!):

This is as good as I could get it, and that's only after 1-2 months installed... _(not tried any harsher chemicals yet)_


If you can give me any info at all on any of this it would be much appreciated  and definitely worth a rep point 

So far I've been too afraid to try it on an £800 GPU lol... but I desperately want to.  I'm not afraid of shorting (I know I'll be careful), but I *am* worried about scratching the top of the die trying to remove it in the future.

I know someone else here is also considering using it (jioniorz)


----------



## jiccman1965

Can anyone help me find a modded BIOS for the EVGA GTX 1080 FTW?


----------



## GreedyMuffin

Quote:


> Originally Posted by *nrpeyton*
> 
> Your GPU under water? You could grab a water chiller. I got from 2151 to 2252 with a 10c water temp. Not even had it down to its lowest setting yet or tried upping the voltage past 1.093.
> 
> Also got extra on the memory to +875 (as block is full-cover).. at 10c water temp my memory idles at 14-15c (taken using probes) and thats only with 2.8 w/Mk 1mm pads
> 
> More details a few pages back.
> Seems quite low, my GPU under full stress at max power (130% on slider) draws up to 320w on its own.
> 
> Full system is about 525w - 550+
> 
> Your CPU overclocked?
> 
> Anyone know if applying 'conformal coating' to my PCB will invalidate warranty? I'm actually surprised manufacturers don't just apply this at the factory.


Yeah. But my CPU is only running 4400MHz @1.200V fixed.

Right now I'm considering using 1950MHz at 0.850V as my 24/7 profile. That's only a 7 percent decrease from 2100MHz.
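That trade-off can be checked against the wall-power figures posted earlier in the thread for the same system; this is nothing new, just arithmetic on the numbers already quoted:

```python
# Clock vs. wall-power trade-off for the 24/7 profile, using the
# measurements from the earlier post (2100 MHz @ 1.000V, 1950 MHz @ 0.850V).
clock_full, watts_full = 2100, 264.34
clock_eco,  watts_eco  = 1950, 213.28

clock_drop = (clock_full - clock_eco) / clock_full * 100
power_drop = (watts_full - watts_eco) / watts_full * 100

print(f"Clock: -{clock_drop:.1f} %")  # about 7.1 % slower
print(f"Power: -{power_drop:.1f} %")  # about 19.3 % less at the wall
```

So giving up about 7% of the clock saves roughly 19% of the system's wall draw under that load.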


----------



## jiccman1965

Yes, the CPU is overclocked; I was able to get +115 GPU and +550 memory on air. Just looking to get a bit more voltage.

7742.png 1511k .png file


----------



## nrpeyton

Quote:


> Originally Posted by *jiccman1965*
> 
> Yes overclocked CPU, was able to get +115 GPU and +550 Memory on Air. Just looking to get a bit more voltage.
> 
> 7742.png 1511k .png file


Voltage isn't doing much on Pascal.

But I understand you still want to find that out for yourself and experience the "effort in trying"...

I did.. I was disappointed.. but it was still good to try. So I hear what your saying.

Anyway; more importantly.

There is no BIOS editor out for PASCAL... at all.. period.

The only option you have is to jump on the T4 BIOS bandwagon and flash it to your card as many others have.

: If you do decide to jump onto the "T4 BIOS bandwagon" and flash it to your card; *be careful:*

- it has no temp or power limit.
-if your fans or pump failed the card would melt down before it throttled
-FE cards on air using this BIOS will have slower fan speed
-the third display port on your card won't work anymore (nothing to worry about -- but don't let that mistake you into thinking you've bricked your card)
-don't use the 'certificates bypassed' version of nvflash to flash it (that seems back to front, I know - but don't worry about that there's a good reason)

strix1080xoc_t4version2.zip 148k .zip file


-Allows voltage increase up to 1.2v _(stock locked limit is 1.093v so a 107 millivolt, max increase)_
-Power limit removed
-Temp limit removed *(warning)!*
-Could be dangerous on cards with only one power connector with lower quality cables. Max safe power draw for 1 power cable cards is 225 watts.

P.S.
_-Many are getting higher benchmark scores using this BIOS *without* even using the increased voltage it offers. Two reasons for this. no.1) removed power limit no.2) it is roomered to have 'tighter' memory timings._

_-Not compatible with EVGA 1080 Classified; it won't brick the card; you'll just get a lower score + numbers won't report correctly._

_-People on the EVGA 1080 FTW have been getting good results, however._

Let us know how you get on


----------



## BigBeard86

So using this BIOS will allow me to OC my watercooled EVGA 1080 FE even more? I am always hitting power limits. What benefit does CLU have over this? Can you link me to how to flash?

How do i backup my original bios in case i need to replace it?


----------



## nrpeyton

Quote:


> Originally Posted by *BigBeard86*
> 
> so using this bios will allow me to OC my watercooled evga 1080 FE even more? I am always hitting power limits. What enefit does CLU have over this? Can you link me to how to flash?
> 
> How do i backup my original bios in case i need to replace it?


Yes, it will prevent the power throttling (up to a point -- at a certain point I think the hardware itself has an upper limit, set at the hardware level by the resistors and such on your PCB).

Use GPU-Z to back up your original BIOS (make sure you disable the graphics driver in Device Manager first; do this when backing up AND when flashing the new BIOS). _A simple Google search can show you this._

Do a bit of research first and see if you can figure most of it out yourself; it will make a lot more sense to you that way, and it will be easier to diagnose if anything goes wrong. If you have any questions, come back and I'll try my best to help.

You need an app called nvflash (*not* the certificates bypassed version)

I had less information than that when I figured it out through google searches/research


----------



## Kerdamor

Anyone know how I can test memory (the full 8GB pool)?


----------



## amvnz

Don't flash that T4 bios. I flashed it a while ago and I didn't overclock at all, didn't run benchmarks regularly or do anything extra stressful. Just played games. I started blackscreening 5 minutes into playing any game recently and I finally found the cause, the PCI-E cable melted.


----------



## nrpeyton

Quote:


> Originally Posted by *amvnz*
> 
> Don't flash that T4 bios. I flashed it a while ago and I didn't overclock at all, didn't run benchmarks regularly or do anything extra stressful. Just played games. I started blackscreening 5 minutes into playing any game recently and I finally found the cause, the PCI-E cable melted.


How many power cables does your card take normally?

I wouldn't recommend flashing it on a card that only takes one cable if you are overclocking very hard for long periods of time. Each cable is rated for only 150W, and the PCI-E slot for 75W, so that's 225W. If you regularly go over 275W you will be putting a lot of stress on that cable and slot.

If your card has two power cables; you should be fine.

You can see how much your GPU is drawing using the HWINFO64 app, it gives real time, per second wattage draw for GPU.
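The spec math above can be sketched in a few lines of Python. The figures are the nominal ones quoted in the post (75W from the slot, 150W per 8-pin cable; 75W for a 6-pin is added as an assumption from the same PCI-E spec), not measured limits of any particular cable:

```python
# Nominal PCI-E power budget: slot contribution plus each auxiliary cable.
SLOT_W = 75
CABLE_W = {"8-pin": 150, "6-pin": 75}

def max_safe_draw(cables: list[str]) -> int:
    """Nominal spec ceiling (watts) for a card with the given power connectors."""
    return SLOT_W + sum(CABLE_W[c] for c in cables)

# Founders Edition: one 8-pin connector.
print(max_safe_draw(["8-pin"]))            # 225
# FTW-style card: two 8-pin connectors.
print(max_safe_draw(["8-pin", "8-pin"]))   # 375
```

So a sustained 275W+ draw on a one-cable card is roughly 50W past the nominal ceiling, which is why the cable and slot take the strain.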


----------



## HeLeX63

Quote:


> Originally Posted by *amvnz*
> 
> Don't flash that T4 bios. I flashed it a while ago and I didn't overclock at all, didn't run benchmarks regularly or do anything extra stressful. Just played games. I started blackscreening 5 minutes into playing any game recently and I finally found the cause, the PCI-E cable melted.


Holy **** that's scary. Were you on air ? What were your GPU temps ??


----------



## nrpeyton

Quote:


> Originally Posted by *Kerdamor*
> 
> anyone know how i can test memory ( full pool 8gb) ?


You can test 1GB and it will still be spread evenly across all eight GDDR5X VRAM chips (I don't believe the data fills up one chip at a time).

If you just want to test your memory overclock, the OCCT app is best for this. It works by comparing one frame to the last -- quite simple, quite elegant. I think it fills about 5.6GB _(with the max setting, which you have to change)_, which is about the best you're going to get from anything synthetic.

I got to +875 before errors started showing on chilled water.

+725 - 775 on normal water.

A bit lower on air.

I also found the memory responds quite nicely to a little extra voltage and lower temps (I'm not talking about core voltage).
Only Gigabyte Xtreme and EVGA Classified 1080 owners can change memory voltage (so I've heard, but there could be more).
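The compare-to-reference idea OCCT uses can be illustrated in plain Python on ordinary system memory. This is only a sketch of the principle, not an actual VRAM test -- a real GPU test would round-trip the buffer through video memory, where an unstable memory overclock introduces bit errors:

```python
import random

def pattern_test(n_bytes: int, seed: int = 42) -> int:
    """Fill a buffer with a known pattern, re-read it, and count mismatched bytes.

    On healthy memory the copy is identical, so the error count is zero.
    """
    rng = random.Random(seed)
    reference = bytes(rng.randrange(256) for _ in range(n_bytes))
    copy = bytes(reference)  # in a real VRAM test this would round-trip through the GPU
    return sum(a != b for a, b in zip(reference, copy))

print(pattern_test(1_000_000))  # 0 on healthy memory
```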


----------



## amvnz

Yes, the card only has one PCI-E power connector; it's an ASUS GTX 1080 Founders Edition. The GPU temps were always close to room temperature when idle and maxed out at maybe 55C. It is watercooled.


----------



## nrpeyton

Quote:


> Originally Posted by *amvnz*
> 
> Yes the card only has 1 PCI-E slot, its an ASUS GTX 1080 Founders Edition. The GPU temps were always close to room temperature when idle and maxed out at maybe 55 C. It is watercooled.


That is why it melted then.

You should have been monitoring your power draw through HWINFO64. (If cross-flashing this BIOS doesn't give you an accurate HWINFO64 reading there are other ways to make a rough estimate with wall-socket watt meters etc).

The normal max power limit of an FE is 180W (or 216W with the power limit slider maxed).

And for a one-power-cable card, the max draw shouldn't exceed 225W (150W from the cable, 75W from the slot).

As I said, each power cable is rated for 150W and the slot for 75W; that's 225W. If you're going over that, then expect the wire to heat up a little...

...if you're _frequently_ going over that *for extended periods of time*
then you either need to find a way to cool that cable (and plug), or simply don't use the T4 with a FE..
or...
just don't overclock quite as hard... I mean.. draw 250/260w but anything higher I'd be worried....

That being said, I'm sure many of these cables are made a lot "more equal" than others.

If you're overclocking your FE to 2150 - 2200 at 1.093v you're going to be drawing 275w +

It's possible your cable already had an issue, obviously there is some safety margin built into the 225w max of a one cable card.

Nvidia only put one cable on it because they never intended more than 216W (9W inside the 225W spec of a one-cable card).

On an FTW (which has two connectors) the max safe power draw would be 375W (you'd be hard pushed to draw that on ANY overclock -- maybe Furmark/LN2). 

Anyway I am sorry to hear what happened. I hope your card is okay?


----------



## nrpeyton

Quote:


> Originally Posted by *Kerdamor*
> 
> anyone know how i can test memory ( full pool 8gb) ?


The app claims 2GB is the highest, but the *actual* usage is much higher...

I just checked... 4.6GB actually _(my idle before starting OCCT was only 300MB or so)_

See below:

*full size* - right click & open new tab


----------



## Kerdamor

Damn, need to find another way to test memory -- OCCT can only do 2GB.







But thanks, good program for testing the video card anyway.
yeah, same situation


----------



## nrpeyton

Quote:


> Originally Posted by *Kerdamor*
> 
> damn, need find another way for test memory, OCCT can only 2gb
> 
> 
> 
> 
> 
> 
> 
> but thx, good prog for test video card anyway


http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/8830#post_25750868

read my last post (i just covered that lol)


----------



## Buzzard1

Quote:


> Originally Posted by *amvnz*
> 
> Yes the card only has 1 PCI-E slot, its an ASUS GTX 1080 Founders Edition. The GPU temps were always close to room temperature when idle and maxed out at maybe 55 C. It is watercooled.


I think nrpeyton meant he wouldn't put this BIOS on a card with only one PCI-E power cord. I have two 8-pin connectors on my EVGA FTW.


----------



## Menthol

Quote:


> Originally Posted by *nrpeyton*
> 
> With Kryonaut (normal paste) I only get 6-8c difference between water temp and load temp.
> 
> Furmark maybe 9-10c difference between water and GPU load temp.
> 
> What are you getting with the Liquid Metal? Between water and load?
> 
> And is your block nickel plated? how does that do with the LM?
> 
> More importantly; what about the top of the DIE? (can it all be removed without discolouration or permanent scratching)?
> 
> I *couldn't* get all of it off my copper CPU block (despite scrubbing with non-abrasive cloth with 1st an alchohol based tim cleaner, then 2nd tried a citrus based cleaner and then even boiling water!) :
> 
> This is as good as I could get it; and thats only with 1-2 months installation.... _(not tried any harsher chemicals yet)
> 
> If you can give me any info at all on any of this it would be so much appreciated  and defintiely worth rep point
> 
> So far I've been too afraid to try it on £800 GPU lol...but I desperately want to..  not afraid due to shorting (as know I'll be careful) but I *am* worried about scratching the top of the DIE trying to remove it in future.
> 
> I know someone else here also considering using it too (jioniorz) _


The very first CPU I delidded, I had to redo the CLU application after benching with an SS unit at -50 degrees; the lid fell off after removing it from the CPU socket, as I didn't use any glue.
CLU comes with an alcohol pad and a little scrub pad similar to what you'd use to clean pots and pans. It left very small scratches on the surface of the die and lid; it didn't harm the die, but it did leave those very tiny scratches. My guess is it would do the same to a GPU die, where it would be noticeable and could possibly void any RMA attempt.
Personally I would not use it except when delidding a CPU, where adhesion would be desired; I would never use it between CPU and heatsink or block, nor on a GPU die.
That said, I always like to see others do something I wouldn't do, as there aren't many things I won't do


----------



## amvnz

Yeah don't flash to a Founders Edition. I flashed it ages ago when the GTX 1080s were released. If anyone has the T4 bios on their Founders Edition card, flash your card back to the original BIOS. I only tested overclocks on the day I flashed, it had been running stock without Afterburner installed for months. I don't benchmark, just play games. All my games are FPS capped too because of G-SYNC so the card wasn't exactly running flat out all the time. Even that melted the PCI-E connector.

It's working fine now, I just scraped the mess out of the card's connector with a needle and plugged in a different PCI-E cable. Runs stable like it always did and the new cable has no signs of damage.


----------



## BigBeard86

amvnz, thank you for the warning and helping me avoid the poor advice of the previous poster, who speaks like a know-it-all.


----------



## ucode

Quote:


> Originally Posted by *Kerdamor*
> 
> anyone know how i can test memory ( full pool 8gb) ?


What kind of test? You could try Nai's 64-bit benchmark which will use all available VRAM, best run headless.


----------



## Buzzard1

I am gonna flash my 1080 FTW tonight. It runs so cool already, not to mention I have dual BIOS with a physical switch in case things get too bad. If anyone else has been running the T4 BIOS on the FTW with luck, please post your good numbers so I can use them as a basic reference.

Thank you.


----------



## nrpeyton

Deleted.

I confused bigbeard86 with buzzard1 regarding FTW/FE when talking about the T4 BIOS.

My apologies.

I've also updated my warnings with the file (T4) a few pages back to include a warning for owners of cards with only one power cable when using the T4.


----------



## nrpeyton

Quote:


> Originally Posted by *Kerdamor*
> 
> damn, need find another way for test memory, OCCT can only 2gb
> 
> 
> 
> 
> 
> 
> 
> but thx, good prog for test video card anyway
> yeah, same situation


Another programme you could try is the 'memory burner', part of the 'Precision OC Scanner X' suite.

Why do you need to test the full 8GB? The memory isn't filled up one chip at a time.

If it's an OC you're wanting to test, OCCT works despite the 2GB max (in actual fact it doubles up to 4GB once the test begins, as you can see in my HWINFO64 screenshot).

Other than that, I can't think of any modern software for testing video memory.
Anything else is older and puts an even smaller load on it.
If you do find anything though, please let me know -- I've been looking for this myself this year, and those were my two best finds.


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> With Kryonaut (normal paste) I only get 6-8c difference between water temp and load temp.
> 
> Furmark maybe 9-10c difference between water and GPU load temp.
> 
> What are you getting with the Liquid Metal? Between water and load?
> 
> And is your block nickel plated? how does that do with the LM?
> 
> More importantly; what about the top of the DIE? (can it all be removed without discolouration or permanent scratching)?
> 
> I *couldn't* get all of it off my copper CPU block (despite scrubbing with non-abrasive cloth with 1st an alchohol based tim cleaner, then 2nd tried a citrus based cleaner and then even boiling water!) :
> 
> This is as good as I could get it; and thats only with 1-2 months installation.... _(not tried any harsher chemicals yet)_
> 
> 
> If you can give me any info at all on any of this it would be so much appreciated  and defintiely worth rep point
> 
> So far I've been too afraid to try it on £800 GPU lol...but I desperately want to..  not afraid due to shorting (as know I'll be careful) but I *am* worried about scratching the top of the DIE trying to remove it in future.
> 
> I know someone else here also considering using it too (jioniorz)


OOOkkkk.
So...

What I used is Thermal Grizzly Conductonaut, but in the past I used to have CL pro and CL ultra.
I used to have it on copper coolers and nickel plated ones as well.
At the moment, I ONLY use nickel plated heatsinks. Liquid metal is fine with that, no issues at all.
Regarding cleaning, I use Coollaboratory cleaning set- https://www.amazon.co.uk/Coollaboratory-Liquid-Cleaning-Set/dp/B0058688KA
You can clean nickel plated heatsinks much more easily than copper ones. It feels like the copper gets "soaked" with the liquid metal; that's why you see all the residue on copper.
Cleaning of the die is very easy. You can use the same set I linked above. No residue left on cores....
Regarding scratching: the scratches on the die are not from the paste itself, but from cleaning the paste off and dragging its metal particles across the surface of the die. Just don't press on the liquid metal/die when cleaning and you should avoid the scratches.
No worries though -- no damage will be done to the core.
My delta T idle-load is about 3-5C max, depending on application/game.
I run w-cooling, high end one, and getting about 33-35C in load, while heavy gaming in 1440p, with 2190MHz @1.094v + hard mod TDP.


----------



## fat4l

For anyone interested in Hailea Waterchiller review, you can check one here, from 2008 lol!

https://translate.google.co.uk/translate?sl=cs&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fpctuning.tyden.cz%2Fhardware%2Fskrine-zdroje-chladice%2F11828-hailea_hc-500a-vodnik_s_aktivne_chlazenou_vodou%3Fstart%3D1&edit-text=&act=url


----------



## Vellinious

So close to 16k I can smell it. Just need a little more. Time to roll the PC outside again and see if I can push over. I've been messing with getting it tuned back in after a motherboard replacement and a clean wipe / reinstall of everything. I'm finally getting there.

http://www.3dmark.com/compare/spy/1001826/spy/705491/spy/705550#


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> So close to 16k I can smell it. Just need a little more. Time to roll the PC outside again and see if I can push over. I've been messing with getting it tuned back in after a motherboard replacement and a clean wipe / reinstall of everything. I'm finally getting there.
> 
> http://www.3dmark.com/compare/spy/1001826/spy/705491/spy/705550#


Good score.

Lets see some screenshots with temps?

Very interested to see how rolling the PC outside does temp wise  especially pitted against a chiller would make a good comparison at this time of year  lol

We can compare actual *scores* maybe after ZEN launches, but for now it's temps only, because you're at a distinct advantage with that massively overpriced CPU lol.

Quote:


> Originally Posted by *fat4l*
> 
> For anyone interested in Hailea Waterchiller review, you can check one here, from 2008 lol!
> 
> https://translate.google.co.uk/translate?sl=cs&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fpctuning.tyden.cz%2Fhardware%2Fskrine-zdroje-chladice%2F11828-hailea_hc-500a-vodnik_s_aktivne_chlazenou_vodou%3Fstart%3D1&edit-text=&act=url


Great thanks for the info mate; and for the water-chiller review. Will read it tonight after work when I've got time to look at it more carefully and understand the translation errors. But yeah definitely good, didn't think any other reviews even existed apart from the one done by bitspower.

Thanks again 

Quote:


> Originally Posted by *Menthol*
> 
> The very first CPU I delidded, i Had to redo the CLU application after benchin using a SS unit, -50 degrees the lid feel off after removing from the CPU socket as I didn't use any glue
> CLU comes with an alcohol pad and a little scrub pad similar to what would be used to clean pots and pans, it left very small scratches on the surface of the die and lid, it didn't harm the die but it did leave those very tiny scratches, my guess is it would do the same to the GPU die and it would be noticeable and possibly cancel any RMA attempt.
> Personally I would not use it except when delidding a CPU where adhesion would be desired, I would never use it between CPU and heatsink or block, nor on a GPU die.
> That said I always like to see others do something I wouldn't do as there's not many things I won't do


Thanks for info.

*I just read that* apparently the main ingredient in *liquid metal* (gallium) *turns solid at 8 degrees C*. Now *if it expands* as it *turns solid*, that could potentially *crack the DIE*...? I might be wrong, but I'm awfully afraid to try it after reading that lol... as the whole point of doing it was to try and get below 8C at load using the chiller :-(

Argh, back to the drawing board lol :-(


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Good score.
> 
> Lets see some screenshots with temps?
> 
> Very interested to see how rolling the PC outside does temp wise  especially pitted against a chiller would make a good comparison at this time of year  lol
> 
> We can compare actual *scores* maybe after ZEN launches but for now its temps only because you're at a destinct advantage with that mammothly overpriced CPU lol.
> Great thanks for the info mate; and for the water-chiller review. Will read it tonight after work when I've got time to look at it more carefully and understand the translation errors. But yeah definitely good, didn't think any other reviews even existed apart from the one done by bitspower.
> 
> Thanks again
> Thanks for info.
> 
> *I just read that* apparently the main ingredient in *Liquid Metal* (gallium) *turns to a solid at 8 degrees C*. Now *if it expands* as it *turns to a solid* that could potentially *crack the DIE*...? Might be wrong; but I am awfully afraid to try it after reading that lol.. as the whole point of doing it was to try and get below 8c at load using the chiller :-(
> 
> Argh back to the drawing table lol :-(


I did it once a while back. Ambient was at 2c, coolant temps about 6c. GPUs topped out at 20c. It's -13c today....hopefully, those temps will hold until the weekend.


----------



## fat4l

I also forgot to say, you need to use liquid tape or normal paste around the die to cover the small resistors/capacitors if using liquid metal.


----------



## Vellinious

It was 14c in the man cave when I got home from work today. A 7c drop in temps brought the score up a tad. It's supposed to be cold all the way through the weekend. Hopefully, I can get 2214 to run and hit that 16k mark I'm shooting for. Wish I could get a little more out of the CPU.....that's about next, I guess.

http://www.3dmark.com/spy/1004478


----------



## mypickaxe

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> I have two classy blocks I'd let go real cheap. They're just sitting there. One brand new in the box. Never even been opened.
> 
> 
> 
> Hmm I can't believe I only just seen this post (only reason I seen it is because someone asked me something that I had to go a few pages back to find the link)
> 
> Normally I get notified by email any quotes to my messages. Must have missed this one for some reason.
> 
> Hmm anyway that is very interesting I'll have a think about this
> 
> Quote:
> 
> 
> 
> Originally Posted by *HeLeX63*
> 
> Yeah its under water. Max temp about 50ish...
> 
> Whats a water chiller ?
> 
> 
> *I can go as low as 3c water temp.
> *

Perhaps this has been asked, but why go through the PCIe brackets when it appears that case has cutouts for tubing?


----------



## nrpeyton

Quote:


> Originally Posted by *mypickaxe*
> 
> Perhaps this has been asked, but why go through the PCIe brackets when it appears that case has cutouts for tubing?


They're not rubber; I'd have to push through the metal. The safest way to do that would be to remove the mobo and everything else first and re-install the full system, which would have slowed me down on the day I received (and was excited to set up) the chiller. 

I just double checked: it does look that way, with the hole in the middle of the X in a circle. But the metal in that circle doesn't seem any thinner than the rest of the case :-( Otherwise you're right... it would have been ideal.


----------



## feznz

Quote:


> Originally Posted by *Vellinious*
> 
> It was 14c in the man cave when I got home from work today. A 7c drop in temps brought the score up a tad. It's supposed to be cold all the way through the weekend. Hopefully, I can get 2214 to run and hit that 16k mark I'm shooting for. Wish I could get a little more out of the CPU.....that's about next, I guess.
> 
> http://www.3dmark.com/spy/1004478
> 
> 


Have you tried tightening the timings on the RAM to something like 13-13-13-32 with a slight overvolt?

I did a few benching experiments and found that tighter RAM timings yielded slightly more than higher RAM frequencies, and I am sure you will find help with tightening the secondary timings here:
http://www.overclock.net/t/1268061/ocn-ram-addict-club-gallery/0_20


----------



## Pu239

Hey!

I just bought a Gigabyte GTX 1080 (GV-N1080WF3OC-8GD) and I have serious performance issues. I have an i7-4770K @ 4.3GHz and 16GB RAM.
I bought and set up the card yesterday. I ran the Unigine Heaven and Fire Strike benchmarks. The performance varies between bad and worse: sometimes I get performance slightly better than from my previous GTX 780 Ti, sometimes it's far worse.
Results for the Heaven (the bad):

And the worse:


I also got a very poor result in Fire strike:


Solutions I've already tried:

Installing and re-installing the latest drivers (376.33)
Trying different DP outputs of the VGA
Relocating the VGA to other PCIe slots
Plugging other 8-pin connectors of the PSU into the VGA

Solutions I have not yet tried:
Resetting the mobo BIOS
Uninstalling the driver with DDU and then reinstalling
Maybe a complete OS reinstall, but I'd like to avoid that

Can anyone help me figure out what could've gone wrong? How can I fix it? Is it software related, or is my hardware faulty?


----------



## Pu239

My first image is quite small, so here it is again:
Heaven GTX 1080 bad (WQHD)


----------



## TWiST2k

So no 1080 Ti announcement, what a surprise hahaha. I could go back and call out everyone who "KNEW" it was coming, but you know who you are, and now you know how wrong you were lol.


----------



## hertz9753

Quote:


> Originally Posted by *Pu239*
> 
> My first image is quite small, so here it is again:
> Heaven GTX 1080 bad (WQHD)


You have to right click on the image and left click on open link.


----------



## ROKUGAN

Quote:


> Originally Posted by *TWiST2k*
> 
> So no 1080 Ti announcement, what a surprise hahaha. I could go back and call you all out that "KNEW" it was coming, but you know who you are and you now know, how wrong you were lol.


I wouldn't be surprised if Nvidia decided to wait for AMD to release Vega being slightly faster than the 1080, only to launch the 1080 Ti afterwards being 20% faster than Vega, similar to what they did with the Fury and the 980 Ti.


----------



## Vellinious

Quote:


> Originally Posted by *feznz*
> 
> have you tried tightening the timings on the ram to something like 13-13-13-32 with a slight over voltage
> 
> I done a bit of benching experiments I found that tighter ram timing yielded slightly more that higher ram frequencies and I am sure you will find help here with tightening the secondary timings here.
> http://www.overclock.net/t/1268061/ocn-ram-addict-club-gallery/0_20


All I had done up to this point was bump it from 3200 to 3400 with a little extra voltage. I'm completely ignorant when it comes to memory overclocking....I need to sit down and read up on it, but, time is at a premium.....


----------



## nrpeyton

Quote:


> Originally Posted by *Pu239*
> 
> Hey!
> 
> I just bought a Giga GTX 1080 (GV-N1080WF3OC-8GD) and I have serious performance issues. I have an i7-4770K @ 4,3GHz and 16GB RAM.
> I bought and set up the card yesterday. I ran Unigine Heaven and Fire Strike benchmarks. The performance varies between bad and worse. Sometimes I got a performance slightly better than my previous GTX 780Ti sometimes its far worse.
> Results for the Heaven (the bad):
> 
> And the worse:
> 
> 
> I also got a very poor result in Fire strike:
> 
> 
> Solutions I've already tried:
> 
> Installing and re-installing the latest drivers (376.33)
> Trying different DP outputs of the VGA
> Relocating the VGA to other PCIe slots
> Plugging other 8-pin connectors of the PSU into the VGA
> 
> Solutions which I am not yet tried:
> Reseting the mobo BIOS
> Uninstall driver with DDU and then reinstalling.
> Maybe a complete OS reinstall, but I'd like to avoid that
> 
> Can anyone help me what could've gone wrong? How can I fix it? Is it software originated or is my hardware faulty?


What temps are you getting?

Are you getting slowdowns in anything else?

Your FPS in the second link is much better than in the first.

Quote:


> Originally Posted by *TWiST2k*
> 
> So no 1080 Ti announcement, what a surprise hahaha. I could go back and call you all out that "KNEW" it was coming, but you know who you are and you now know, how wrong you were lol.


I thought it was getting announced on the 17th of January?

The longer the better; I can't really be bothered with the hassle of having to go out and sell my old card and try to buy a new one, etc.

It feels as though I only did that recently this year, with the 1080.


----------



## juniordnz

*I just got a 20ºC reduction on GPU temp just by replacing stock paste with Kryonaut and stock pads with Fujipoly's 11w/mK (+ a drop of TIM on the chips).*

How I tested: I ran a full Firestrike Ultra Stress Test and took the highest temperature the GPU registered during the test. The test takes something like 12min if I'm not mistaken and that's plenty of time for temperatures to max out and stabilize.

GTX 1080 FTW - 1.062V / 2.100mhz Core / +575mhz Mem / 130% TDP / Fans 3250RPM

*Test 1 - Stock Paste / Stock Pads*
• Room: 33ºC ***
• GPU IDLE: 34ºC
• GPU LOAD: 87ºC

*Test 2 - Kryonaut / Fujipoly's 11w/mK on VRAM and MOSFETS + a drop of tim on the surface of the chips*
• Room: 38ºC ***
• GPU IDLE: 33ºC
• GPU LOAD: 69ºC

*** I can't measure the temperature in my room, so I took the local (outdoor) temperature registered at the moment of the test. Although it's not an accurate measure of the room temp, it gives us an estimate. My room was a lot hotter during the second test than the first. Let's take this 5ºC difference in local temp and assume a 2ºC rise in ambient temp for the sake of comparison.

*That's an 18ºC difference considering only the GPU temp reading and a 20ºC difference if we consider an increase of 2ºC in the room temp (as mentioned above, it was clearly hotter on the second test)*
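The ambient adjustment above is simple arithmetic; as a sketch (the 2ºC ambient rise is the post's own assumption, not a measurement):

```python
# Compare two GPU load temperatures, crediting the second run for a hotter room.
def adjusted_delta(load_before: int, load_after: int, ambient_rise: int = 2) -> int:
    """Raw temperature improvement plus the assumed ambient increase (degrees C)."""
    raw = load_before - load_after   # 87 - 69 = 18 C raw improvement
    return raw + ambient_rise        # credit the hotter room on the second run

print(adjusted_delta(87, 69))  # 20
```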


----------



## Derek1

Anyone have any idea what this is? FTW 2?

http://www.evga.com/icx/


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> *I just got a 20ºC reduction on GPU temp just by replacing stock paste with Kryonaut and stock pads with Fujipoly's 11w/mK (+ a drop of TIM on the chips).*
> 
> How I tested: I ran a full Firestrike Ultra Stress Test and took the highest temperature the GPU registered during the test. The test takes something like 12min if I'm not mistaken and that's plenty of time for temperatures to max out and stabilize.
> 
> GTX 1080 FTW - 1.062V / 2.100mhz Core / +575mhz Mem / 130% TDP / Fans 3250RPM
> 
> *Test 1 - Stock Paste / Stock Pads*
> • Room: 33ºC ***
> • GPU IDLE: 34ºC
> • GPU LOAD: 87ºC
> 
> *Test 2 - Kryonaut / Fujipoly's 11w/mK on VRAM and MOSFETS + a drop of tim on the surface of the chips*
> • Room: 38ºC ***
> • GPU IDLE: 33ºC
> • GPU LOAD: 69ºC
> 
> *** I can't measure the temperature in my room, so I took the local temperature registered at the moment of the test. Although it's not accurate to measure the room temp, it gives us an estimate. My room was a lot hotter on the second test than in the first. Let's take this 5ºC in local temp and consider a 2ºC variation in ambient temp for the sake of comparing.
> 
> *That's an 18ºC difference considering only the GPU temp reading and a 20ºC difference if we consider an increase of 2ºC in the room temp (as mentioned above, it was clearly hotter on the second test)*


That's amazing just from upgrading what you did (paste and pads) -- more than most will ever get. Fantastic, mate. How thick are the pads on the VRM?


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Anyone have any idea what this is? FTW 2?
> 
> http://www.evga.com/icx/


No clue. ICX would suggest a new style cooler, though. Kinda like they did with the ACX 2.0 / ACX 2.0+ 970s, with 2 different FTW models.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> That's amazing just for upgrading what you did (paste and pads). More than most will ever get. Fantastic mate. How high are the pads on the VRM?


The pads are 1.5mm thick for both the VRAM and MOSFETs. On the VRAM the thickness allows a good squeeze of the pads. On the MOSFETs it's just about right: it does make contact, but it doesn't squeeze like the ones on the VRAM.

I remember reading here about adding a drop of TIM on the chips to help conduct the heat away, am I right?


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> Pads are 1,5mm thick for both VRAM and Mosfets. On the VRAM the thickniss allows a good squeeze of the pads. On the Mosfets it's just about right, it does make contact, but it doesn't squeeze like the ones on the VRAM.
> 
> I remember reading it here about adding a drop of tim on the chips to help conducing the heat away from it, am I right?


I do the same on my memory chips as well (just a tiny tiny little bit) and same on the mosfets (although that can get a bit messy lol if you're not patient in removing it afterwards)

did you get the drivers too? (next to the mosfets on the VRM)?


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> Anyone have any idea what this is? FTW 2?
> 
> http://www.evga.com/icx/


Quote:


> Originally Posted by *Vellinious*
> 
> No clue. ICX would suggest a new style cooler, though. Kinda like they did with the ACX 2.0 / ACX 2.0+ 970s, with 2 different FTW models.


It's definitely a new cooling system. And the words "safety" and "peace of mind" may suggest direct contact with the MOSFETs that were overheating, and maybe temp monitoring of the power delivery system?


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> It's definitely a new cooling system. And the words "safety" and "peace of mind" may suggest direct contact with the mosfets that were overheating, and maybe temp monitoring of the power delivery system?


Also; I'm using 1.5mm on the VRM too.... but 1mm on the memory. I'd love to only have to use half that on both.. but I wasn't getting equal contact so I gave up (still not sure it would have even been possible _even_ if I had gotten everything perfect. Think it still wouldn't have worked, due to the card not being flat out of the factory anyway, or imperfections on the block...


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> I do the same on my memory chips as well (just a tiny tiny little bit) and same on the mosfets (although that can get a bit messy lol if you're not patient in removing it afterwards)
> 
> did you get the drivers too? (next to the mosfets on the VRM)?


As long as it works and keeps the mosfets cooler... I don't have any intention of cleaning it up







it's more of a "set and forget" kind of mod


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> As long as it works and keeps the mosfets cooler... I don't have any intention of cleaning it up
> 
> 
> 
> 
> 
> 
> 
> it's more of a "set and forget" kind of mod


Aye, that's amazing though with the Kryonaut. And I wouldn't have been surprised if your VRM wasn't making proper contact before either (a lot of EVGA's weren't, hence the whole drama and thermal mod issue)... a cooler VRM may also be contributing to your cooler core temps, as less heat is getting conducted into the core (especially if it was running over 100c).

1.5mm on the VRM got me the best results too.

Anyway, it's always good to see on these forums when someone gets a good result by making a few adjustments in a cost effective way, so wtg mate. 









Would be interested to see if you manage any extra mhz on your memory overclock now....
(mine responded quite nicely)


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> It's definitely a new cooling system. And the words "safety" and "peace of mind" may suggest direct contact with the mosfets that were overheating, and maybe temp monitoring of the power delivery system?


Doubtful on the temp monitoring. I'd love it, though. That's one of the things I loved about the 290X I had, was the ability to monitor the VRM temps right in GPUz.
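For anyone who wants that kind of monitoring without staring at a sensors tab: GPU-Z can log its sensors to a CSV file, and a few lines of Python can pull the peaks out of the log. A minimal sketch; the column names used here ("GPU Temperature [C]", "VRM Temperature [C]") are assumptions for illustration, so check the header row of your own log, since GPU-Z's actual column names vary by card and version:

```python
import csv
import io

# Sketch: find the highest reading in a GPU-Z style sensor log.
# Column names below are assumed -- match them to your own log's header.

def max_temp(log_text: str, column: str) -> float:
    """Return the highest reading seen in the given sensor column."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    return max(float(row[column]) for row in reader if row.get(column))

# Synthetic example data in the assumed format
sample = """GPU Temperature [C], VRM Temperature [C]
45.0, 78.0
52.0, 91.0
49.0, 85.0
"""

print(max_temp(sample, "VRM Temperature [C]"))  # prints 91.0
```

Obviously this only works for sensors the card actually exposes; on cards without a VRM sensor there's simply no such column to parse.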


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Doubtful on the temp monitoring. I'd love it, though. That's one of the things I loved about the 290X I had, was the ability to monitor the VRM temps right in GPUz.


I wish nvidia would implement it


----------



## ironhide138

Quote:


> Originally Posted by *ROKUGAN*
> 
> I wouldn´t be surprised if Nvidia decided to wait for AMD to release VEGA being slightly faster than the 1080 only to launch the 1080ti afterwards being 20% faster than Vega, similarly what they did with the Fury and the 980ti.


Honestly, if Vega isn't out till the summer, they may not even bother to do a 1080ti. If Vega comes out in June/July and goes toe to toe with the 1080, nvidia could put out Volta Oct-Dec (18-19 months after the 1080/1070 isn't unreasonable) and just take the lead again. 4-5 months of competition from Vega isn't going to do much, unless AMD plans to put Vega against Volta.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Doubtful on the temp monitoring. I'd love it, though. That's one of the things I loved about the 290X I had, was the ability to monitor the VRM temps right in GPUz.


How on earth do they not add VRAM and VRM temp monitoring features on top tier cards? Ok, don't add it on FTWs, but on Classys? They really don't get the word "enthusiast"...

But after all the burn from the whole overheating VRM situation, I believe this new system may have direct contact between the mosfets and the heatspreaders


----------



## Derek1

Quote:


> Originally Posted by *juniordnz*
> 
> How on earth do they not add VRAM and VRM temp monitoring features on top tier cards? Ok, don't add it on FTWs, but on Classys? They really don't get the word "enthusiast"...
> 
> But after all the burn from the whole overheating VRM situation, I believe this new system may have direct contact between the mosfets and the heatspreaders


Ya, I am interested in the monitoring issue as well.
The "i" gives that impression, since the Corsair stuff (the coolers and PSUs associated with Link) all has that designation.


----------



## Derek1

More info.


__ https://twitter.com/i/web/status/817064619076251648%2Fphoto%2F1


----------



## juniordnz

Quote:


> Originally Posted by *Derek1*
> 
> More info.
> 
> 
> __ https://twitter.com/i/web/status/817064619076251648%2Fphoto%2F1


See the solid portion of the backplate right where the thermal mod tells us to apply the thermal pad?

I'm guessing this iCX is a "what we should've done from the start" version of the FTW.


----------



## nrpeyton

Quote:


> Originally Posted by *juniordnz*
> 
> See the solid portion of the backplate right where the thermal mod tells us to apply the thermal pad?
> 
> I'm guessing this iCX is a "what we should've done from the start" version of the FTW.


Why change the name then? Just an advertising gimmick, so people considering buying a card aren't put off by an ACX 3.0.

So just add a few thermal pads and change the entire name of the cooler for all cards sold afterwards??

What about the resale value of all the regular ACX 3.0 cards??

Good news for EVGA profits, terrible news for those looking to eventually sell on eBay and upgrade.


----------



## Vellinious

Meh....more marketing gimmicks. The GPU market is full of them. LOOK AT THIS SHINEY 7000 PHASE VRM YOU WON'T USE UNLESS YOU'RE GOING TO PUT A POT ON IT AND USE LN2, AND THIS AWESOME RGB LED STUFF WE SLAPPED TOGETHER!! WOOHOOO!!!! /smack


----------



## TWiST2k

Quote:


> Originally Posted by *nrpeyton*
> 
> Why change the name then? Just an advertising gimmick, so people considering buying a card aren't put off by an ACX 3.0.
> 
> So just add a few thermal pads and change the entire name of the cooler for all cards sold afterwards??
> 
> What about the resale value of all the regular ACX 3.0 cards??
> 
> Good news for EVGA profits, terrible news for those looking to eventually sell on eBay and upgrade.


We will have to wait and see what it's gonna be about. My stuff usually gets handed down to other systems here; my last main 4790k + 980Ti Classy build is now my GF's PC haha.


----------



## Nicklas0912

My GTX 1080 sucks, badly.

It can only do 2000MHz on the core; anything over that crashes.

Power target limit is 120%.

It's an EVGA GTX 1080 ACX 3.0.

Temps never get over 61C, but it hits the power target so fast.


----------



## Derek1

Use Precision X to change it to prefer the temp limit.


----------



## Vellinious

It'll still hit the power limit.....and at 61c he's already past 3 of the thermal points for boost 3.0. Keep it cooler....bout the only thing that'll help at this point. The cooler they run, the more efficient they run.


----------



## Nicklas0912

Quote:


> Originally Posted by *Vellinious*
> 
> It'll still hit the power limit.....and at 61c he's already past 3 of the thermal points for boost 3.0. Keep it cooler....bout the only thing that'll help at this point. The cooler they run, the more efficient they run.


Not really, because I have to turn the fan speed down to 60%; that's what gets me 61C max. If I turn it up to 100%, I need to lower my OC, else I hit the power target faster.

So I just need more power target.... maybe a bios mod?
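For context on what the power target slider actually allows: it's a percentage of the card's base board power limit. A quick sketch, assuming the 180 W reference GTX 1080 figure as the base; partner cards may ship with a different base limit in their BIOS, so treat the numbers as illustrative:

```python
# Sketch: translate the power target slider into watts.
# 180 W is the reference GTX 1080 board power; an assumption here,
# since partner BIOSes can set a different base limit.

def power_limit_watts(base_watts: float, target_percent: float) -> float:
    """Board power the card is allowed to draw at a given target %."""
    return base_watts * target_percent / 100.0

print(power_limit_watts(180.0, 100.0))  # stock: 180.0 W
print(power_limit_watts(180.0, 120.0))  # maxed slider: 216.0 W
```

Which is why a BIOS mod is the usual answer once the 120% cap itself is the wall: the slider can't go past whatever maximum the BIOS encodes.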


----------



## golfergolfer

Wondering if anyone has an opinion / could help me out here? I'm looking at using an EVGA GTX 1080 Hybrid in my next build, primarily because it's quieter and I'm really restricted on airflow due to a small form factor case. My question is: would I be better off using a Hybrid to achieve my noise and temperature goals (but mainly noise), or a custom air-cooled card? I have searched the web with no result on this


----------



## Derek1

Quote:


> Originally Posted by *golfergolfer*
> 
> Wondering if anyone has an opinion / could help me out here? I'm looking at using an EVGA GTX 1080 Hybrid in my next build, primarily because it's quieter and I'm really restricted on airflow due to a small form factor case. My question is: would I be better off using a Hybrid to achieve my noise and temperature goals (but mainly noise), or a custom air-cooled card? I have searched the web with no result on this


I am using the Hybrid FTW. I can recommend it for temps as mine never goes over 50C.
I don't hear the fans either. Replaced the rad fan with Corsair fans in push/pull too, and they are set to never go over 900rpm.


----------



## golfergolfer

Quote:


> Originally Posted by *Derek1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *golfergolfer*
> 
> Wondering if anyone had an opinion/could help me out some here? I am looking at using EVGA GTX 1080 Hybrid in my next build for the primary purpose of it is more quiet and I am really restricted in air due to the fact of a small form factor case. My question is that would I be better of using a Hybrid to achieve noise and temperature goals (but mainly noise) or a custom aircooled? I have searched the web to no result for this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am using the Hybrid FTW. I can recommend it for temps as mine never goes over 50C.
> I don't hear the fans either. Replaced the rad fan with a Corsair push/pull too and they are set to never go over 900rpm.
Click to expand...

That sounds pretty good. I was planning on using a single Noctua fan at around 1200 rpm. How about the pump though? That's what I was more curious about.


----------



## Derek1

Quote:


> Originally Posted by *golfergolfer*
> 
> That sounds pretty good, I was planning on using a single noctua fan at around 1200 rpm, how about the pump though? thats what I was more curious about


I am sure a single Noctua will be fine. I used 2 Corsairs mainly for the pretty lights.
The pump you will never hear. At least I don't, lol. But I have 10 fans in my case.


----------



## golfergolfer

10 fans haha. Yeah, with the setup I was aiming for I would only be running the single Noctua fan, and then another 92mm Noctua on a cpu cooler, so airflow isn't going to be very high :/


----------



## Derek1

Quote:


> Originally Posted by *golfergolfer*
> 
> 10 fans haha. Yeah, with the setup I was aiming for I would only be running the single Noctua fan, and then another 92mm Noctua on a cpu cooler, so airflow isn't going to be very high :/


EVGA is set to release a bunch of Quick Connect stuff I believe for CPU and GPU.
Might be worth looking into.
Or look into a Corsair H75 or H55 for your CPU (cpu and case dependant of course).


----------



## golfergolfer

Quote:


> Originally Posted by *Derek1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *golfergolfer*
> 
> 10 fans haha yeah with the set up I was aiming for I would only be running the single noctua fan and then another 92mm noctua on a cpu cooler, airflow isnt going to be very high :/
> 
> 
> 
> EVGA is set to release a bunch of Quick Connect stuff I believe for CPU and GPU.
> Might be worth looking into.
> Or look into a Corsair H75 or H55 for your CPU (cpu and case dependant of course).
Click to expand...

Quick Connect stuff is pretty interesting. I really am trying to cram a lot into a small case; I have another thread here trying to figure it out, if you wanna know more haha


----------



## ACM

Does this seem right for a 6700k & GTX1080 that are OC'd?

http://www.3dmark.com/3dm/17222992

Thanks!


----------



## hertz9753

Quote:


> Originally Posted by *ACM*
> 
> Does this seem right for a 6700k & GTX1080 that are OC'd?
> 
> http://www.3dmark.com/3dm/17222992
> 
> Thanks!


It looks okay to me, but I'm not much of a gamer. I did get the free benchmark from Steam, though.









http://www.3dmark.com/3dm/16841067?


----------



## Unnatural

Hello!
What performance can I expect from an OC'd watercooled GTX 1080 paired with a 1440p @144Hz G-Sync screen (ASUS ROG Swift PG278Q)? I'm currently on 980ti SLI; I was really hoping to switch back to a single card with the rumored 1080ti, but nope... (I'd use the old 980tis to upgrade another system anyway, so not much money lost)


----------



## juniordnz

Quote:


> Originally Posted by *ACM*
> 
> Does this seem right for a 6700k & GTX1080 that are OC'd?
> 
> http://www.3dmark.com/3dm/17222992
> 
> Thanks!


You should be getting a little bit more on graphics I guess. Try pushing it a little bit more. Also, is it heating up a lot?

This is my last run with stock 1.062V: http://www.3dmark.com/fs/11335631


----------



## Derek1

Quote:


> Originally Posted by *Unnatural*
> 
> Hello!
> What performance can I expect from an OC'd watercooled GTX 1080 paired with a 1440p @144Hz G-Sync screen (ASUS ROG Swift PG278Q)? I'm currently on 980ti SLI; I was really hoping to switch back to a single card with the rumored 1080ti, but nope... (I'd use the old 980tis to upgrade another system anyway, so not much money lost)


http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores

http://www.overclock.net/t/1443196/fire-strike-extreme-top-30

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30

I would say that generally speaking 980Ti in SLi will outperform a single 1080.

Look at the other benchmark threads ie Timespy, FS Extreme to see. They all have charts on the first page listing results.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Unnatural*
> 
> Hello!
> What performance can I expect from an OC'd watercooled GTX 1080 paired with a 1440p @144Hz G-Sync screen (ASUS ROG Swift PG278Q)? I'm currently on 980ti SLI; I was really hoping to switch back to a single card with the rumored 1080ti, but nope... (I'd use the old 980tis to upgrade another system anyway, so not much money lost)


I had a water cooled 980 Ti overclocked to 1556MHz core and 8GHz memory. I got a 1080 through EVGA's Step-Up program and I get somewhere between 10 and 20% higher fps out of the box. I've got a water block for the 1080 in shipping, so I'm hoping that getting it under water will increase performance more.

If I was sitting on 980 Ti SLI, I wouldn't move to a single 1080. From a single 980 Ti to a single 1080, sure, but that would depend on several things.


----------



## ACM

Quote:


> Originally Posted by *juniordnz*
> 
> You should be getting a little bit more on graphics I guess. Try pushing it a little bit more. Also, is it heating up a lot?
> 
> This is my last run with stock 1.062V: http://www.3dmark.com/fs/11335631


Pushed it a little harder on the core with a more aggressive fan profile (never went over 60c). I'll try adding more on the memory clock too.

http://www.3dmark.com/3dm/17237956

Edit:
Pushing the memory more
http://www.3dmark.com/3dm/17238531

Edit 2:
Hit the sweet spot I think.
85% core voltage - 120/92 power limit - +135 core - +550 Memory (Going over 550 didn't make a difference for me)
http://www.3dmark.com/3dm/17238907


----------



## SmackHisFace

Hi guys, I've had my 1080 about a month and I'm noticing that it no longer boosts to 1.093v, but rather caps out at 1.062v. I have max power limit, max voltage, and max temp limit. Doesn't matter what temp the card is at, it's topping out at 1.062. I know the cards throttle and voltages lower, but they used to always be at least 1.075v during gaming. What gives, why the sudden change? Driver 376.33.


----------



## rev0luci0n

I'm having very similar issues to the below with my new MSI GTX 1080 Gaming X. Games will crash and I get purple and green space-invader artifacts all over the screen.

It only started occurring since reinstalling my OS, after running the 1080 for two weeks; particularly in Dishonored 2, Titanfall 2 and Watch Dogs 2.

Any ideas?


Quote:


> Originally Posted by *x7007*
> 
> I posted it on nvidia customer service.
> 
> Does anyone here play with 3D TV and Tridef? Can you please tell me if COD Special Ops III works for you? Windows 10 x64 + 368.51 + GTX1080 + Tridef 7.0 3DTV worked fine. Also, GTA V crashes instantly as soon as I start a new game and get to the hostages, ALWAYS the SAME PLACE and SPOT. I can't figure it out!
> 
> Game issue - Call of Duty Special Ops III newest updated or fresh installed without the newest updates.
> 
> Someone else had the same issue with a 970 recently in the Tridef forums; he could only manage to get 3D working using 3DVision at 720P, using the game's internal renderer to upscale it to a higher resolution.
> Here is a link to the forum https://www.tridef.com/forum/viewtopic.php?f=2&t=6607
> 
> Using the GTX1080, or the 970 I had before, with Tridef 7.0 (the newest version), any profile that's supposed to work doesn't work with the latest driver.
> 
> The game works fine in 2D mode, but when I launch it from the Tridef menu the game goes to a black screen right after the logo, where you need to press Enter to Continue.
> 
> What comes after is "Display driver nvlddmkm stopped responding and has successfully recovered", and the clock gets stuck at 753 MHz or so until a restart, or a driver reset (which can be done using CRU 1.2.6), which restores the clock and driver condition as if the system had been restarted.
> 
> I get all these errors in the event viewer as soon as the Press Enter to Continue screen is shown and the game actually needs to start rendering:
> 
> \Device\Video3
> NVRM: Graphics TEX Exception on (GPC 0, TPC 3): TEX NACK / Page Fault
> 
> \Device\Video3
> Variable String too Large
> 
> \Device\Video3
> Graphics Exception: ESR 0x505a24=0x80000000 0x505a28=0x0 0x505a2c=0x0 0x505a34=0x0
> 
> \Device\Video3
> Graphics Exception: ESR 0x50da24=0x80000000 0x50da28=0x0 0x50da2c=0x0 0x50da34=0x0
> 
> \Device\Video3
> NVRM: Graphics TEX Exception on (GPC 1, TPC 3): TEX NACK / Page Fault
> 
> \Device\Video3
> Variable String too Large
> 
> \Device\Video3
> NVRM: Graphics TEX Exception on (GPC 2, TPC 3): TEX NACK / Page Fault
> 
> \Device\Video3
> Variable String too Large
> 
> \Device\Video3
> Graphics Exception: ESR 0x515a24=0x80000000 0x515a28=0x0 0x515a2c=0x0 0x515a34=0x0
> 
> \Device\Video3
> Graphics Exception: ESR 0x51da24=0x80000000 0x51da28=0x0 0x51da2c=0x0 0x51da34=0x0
> 
> \Device\Video3
> NVRM: Graphics TEX Exception on (GPC 3, TPC 3): TEX NACK / Page Fault
> 
> \Device\Video3
> Variable String too Large
> 
> Faulting application name: BlackOps3.exe, version: 0.0.0.0, time stamp: 0x5765991f
> Faulting module name: unknown, version: 0.0.0.0, time stamp: 0x00000000
> Exception code: 0xc0000005
> Fault offset: 0x0000000000000000
> Faulting process id: 0x2658
> Faulting application start time: 0x01d1d10fcc74a5d3
> Faulting application path: C:\Gamez\Call of Duty Black Ops III\BlackOps3.exe
> Faulting module path: unknown
> Report Id: 674d8a47-c3aa-4a3d-afb3-17bde1758926
> Faulting package full name:
> Faulting package-relative application ID:
> 
> Display driver nvlddmkm stopped responding and has successfully recovered.
> 
> It was working on older drivers; I'm not really sure which, because I waited for the GTX1080 to arrive and 368.39 is the newest driver currently installed, and I didn't run the game with the 970 before to check. I just don't want to test all kinds of drivers, but I'm sure one of the older drivers works.


----------



## nrpeyton

1080 Memory overclocking

*Memory Loves low temps*
example:
http://www.overclock.net/t/1609397/msi-left-the-plastic-on-the-1080-seahawk-ek-x

P.S.
_(not to be confused with core temp)_


----------



## x7007

Quote:


> Originally Posted by *rev0luci0n*
> 
> I'm having very similar issues to the below with my new MSI GTX 1080 Gaming X. Games will crash and I get purple and green space-invader artifacts all over the screen.
> 
> It only started occurring since reinstalling my OS, after running the 1080 for two weeks; particularly in Dishonored 2, Titanfall 2 and Watch Dogs 2.
> 
> Any ideas?


For me the issue in COD 3 was SMAA x2TX Filmic, if I remember correctly. Using a different SMAA mode worked fine.


----------



## ANN1H1L1ST

Does anyone here have an EVGA 1080 Classified? If so, would it be worth it to sell my two 980's in SLI and get one 1080 Classified? Thanks!


----------



## Vellinious

Did some near 0c testing. Didn't hit my target, but.....until I drop below 0c, I think this is as good as it gets.

6950X -- 4.5 / 4.6 @ 1.475
2 x 1080 FTW -- 2240 / 5556 @ 1.093v

Ambient: 0.6c
GPU loop in: 2.7c
GPU loop out: 2.1c
CPU loop in: 3.8c
CPU loop out: 3.2c
CPU peak temp: 68c
GPU peak temp: 24c
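Those loop in/out numbers are enough for a back-of-the-envelope heat figure via Q = ṁ·c·ΔT for water. A sketch; the 4 L/min flow rate below is an assumed number for illustration, not a measured one, so plug in your pump's actual flow:

```python
# Sketch: heat carried by the coolant from loop delta-T.
# Q = m_dot * c_p * dT. Flow rate is an assumption for illustration.

WATER_CP = 4186.0     # J/(kg*K), specific heat of water
WATER_DENSITY = 1.0   # kg/L, close enough at these temps

def loop_heat_watts(flow_l_per_min: float, delta_t_c: float) -> float:
    """Heat carried away by the coolant, in watts."""
    kg_per_s = flow_l_per_min * WATER_DENSITY / 60.0
    return kg_per_s * WATER_CP * delta_t_c

# GPU loop above: 2.7c in, 2.1c out -> 0.6c delta
print(round(loop_heat_watts(4.0, 0.6)))  # ~167 W at the assumed 4 L/min
```

The small delta-T is exactly what you want to see: at a decent flow rate, even a few tenths of a degree across the loop moves a couple hundred watts.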


----------



## Derek1

Quote:


> Originally Posted by *ANN1H1L1ST*
> 
> Does anyone here have an EVGA 1080 Classified? If so, would it be worth it to sell my two 980's in SLI and get one 1080 Classified? Thanks!


Why?
Look back a page where the guy asked if he should trade his 2 980 Ti's for a 1080.
980's in SLi have better performance than a single 1080.


----------



## Menthol

There are other reasons to go 1080 or TXP, as long as it satisfies your needs: you will consume less energy, translating into much less heat generated; with less heat you can run case fans at a lower speed, which generates less noise, making a better environment for general computing use and gaming. Along with new features you may or may not use.


----------



## Derek1

Quote:


> Originally Posted by *Menthol*
> 
> There are other reasons to go 1080 or TXP, as long as it satisfies your needs: you will consume less energy, translating into much less heat generated; with less heat you can run case fans at a lower speed, which generates less noise, making a better environment for general computing use and gaming. Along with new features you may or may not use.


That's why I asked why first.
If he has reasons other than merely performance, poor SLI support for one, then fine; but doing it thinking you will get equivalent performance, that ain't gonna happen.


----------



## nrpeyton

Quote:


> Originally Posted by *ANN1H1L1ST*
> 
> Does anyone here have an EVGA 1080 Classified? If so, would it be worth it to sell my two 980's in SLI and get one 1080 Classified? Thanks!


I'm getting roughly the same frames.

Or twice the frames on non-sli games.

I also had GTX 980 SLI. With PG278Q G-sync.

But when you consider how badly SLI is supported in so many games, and the extra memory (twice as much), I'd definitely upgrade.

I had the exact same dilemma as you 6 months ago, and I haven't looked back.. not for a second 

The Classified is a beast of a card. Also; you can get an EK block for it now too 

P.S.

Summary:
If you were talking a 980Ti it would be different; I'd say stick with the 980Ti. But plain 980 SLI vs a 1080? Definitely upgrade.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm getting roughly the same frames.
> 
> Or twice the frames on non-sli games.
> 
> I also had GTX 980 SLI. With PG278Q G-sync.
> 
> But when you consider how badly SLI is supported in so many games, and the extra memory (twice as much), I'd definitely upgrade.
> 
> I had the exact same dilemma as you 6 months ago, and I haven't looked back.. not for a second
> 
> The Classified is a beast of a card. Also; you can get an EK block for it now too
> 
> P.S.
> 
> Summary:
> If you were talking a 980Ti it would be different; I'd say stick with the 980Ti. But plain 980 SLI vs a 1080? Definitely upgrade.


Ya, I just noticed that now. It was late last night for me and I was sure he'd said Ti in there, because I had just answered the same question 2 pages back.

In any case, don't let Nick here get you down that long road; you'll be buying chillers and exotic TIMs and pads and busting holes through your walls to get more cooling, wiring the card up to monitor all aspects while running. LOL


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Ya, I just noticed that now. It was late last night for me and I was sure he'd said Ti in there, because I had just answered the same question 2 pages back.
> 
> In any case, don't let Nick here get you down that long road; you'll be buying chillers and exotic TIMs and pads and busting holes through your walls to get more cooling, wiring the card up to monitor all aspects while running. LOL


haha 

glad someone appreciates me lol 

*\/ \/ \/And speaking of more cooling \/ \/ \/ \/ \/*
Quote:


> Originally Posted by *Vellinious*
> 
> Did some near 0c testing. Didn't hit my target, but.....until I drop below 0c, I think this is as good as it gets.
> 
> 6950X -- 4.5 / 4.6 @ 1.475
> 2 x 1080 FTW -- 2240 / 5556 @ 1.093v
> 
> Ambient: .6c
> GPU loop in: 2.7c
> GPU loop out: 2.1c
> CPU loop in: 3.8c
> CPU loop out: 3.2c
> CPU peak temp: 68c
> GPU peak temp: 24c


Those are damn good water temps, core seems a *little* high for such a low water temp though? (2.7c water and 24c core)


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> haha
> 
> glad someone appreciates me lol












Have you got your Classy website up and running yet?
Every Classy owner needs to get there if you do, as well as engineers from EVGA and all the waterblock manufacturers. LOL


----------



## nrpeyton

lol

This new ICX is a bit of a kick in the teeth. Anyone else seen the EVGA news?

I should have been able to sit back comfortably and enjoy shopping for ZEN, but instead, now I'm worrying about a huge drop in my card's resale price.

It won't affect me personally performance wise; but it *will* affect the resale value of my card. As it will with ALL EVGA ACX 3.0 cards.

_n00b = I don't want broken, outdated ACX 3.0, I only want the newer, better ICX!_

Kind of ****ty to be an EVGA customer right now. I haven't said that over there on the EVGA forums, because I don't want to contribute to an already tense situation. But it's definitely how I feel.

I do hope EVGA has a pleasant surprise for us. It won't be free, but I'm hoping it will at least be cheap and straight-forward (and without compromise) in terms of upgrade options to bring our cards into line with ICX.

The point is, this only came about due to reduced confidence in ACX 3.0 and lost sales. However, I don't blame EVGA for that; I blame naive n00bs who pay too much attention to the headlines and not enough to the detail. ACX 3.0 was never broken. People just *thought* it was broken, but this ICX makes it *look* like it *was*, because that's what people thought.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> lol
> 
> This new ICX is a bit of a kick in the teeth. Anyone else seen the EVGA news?
> 
> I should have been able to sit back comfortably and enjoy shopping for ZEN, but instead, now I'm worrying about a huge drop in my card's resale price.
> 
> It won't affect me personally performance wise; but it *will* affect the resale value of my card. As it will with ALL EVGA ACX 3.0 cards.
> 
> Kind of ****ty to be an EVGA customer right now. I haven't said that over there on the EVGA forums, because I don't want to contribute to an already tense situation. But it's definitely how I feel.
> 
> I do hope EVGA has a pleasant surprise for us. It won't be free, but I'm hoping it will at least be cheap and straight-forward (and without compromise).


Ya, I went through the thread over there.
It's hard for me to be concerned, really. I have never done a resale on one of my outdated cards, as I didn't feel it was worth the effort. But mind you, none of those cards had the potential to fetch a half decent dollar value, due to their age (usually 3+ years old when I upgrade) and the fact that they were usually midrange to begin with.
Now, however, since mine, like yours, is watercooled, the issue seems moot. I will be re-selling a watercooled card, not an ACX.
For others who aren't under water, there aren't any definitive reports yet on actual cooling differences, other than the fact the card has a few more screws in it. Is there going to be a real difference in the architecture of the sink? Or is it just a rebrand with the new pads, bios and backplate?
If they released a Ti there wouldn't be this outcry, would there? To me the situation is equivalent in that regard. If something gets released that is an increase in performance, whether it's cooling or clock rate, I certainly don't feel like I deserve to be compensated by the company.
Just a lot of whiney babies with entitlement issues.
Screw em.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Ya, I went through the thread over there.
> It's hard for me to be concerned, really. I have never done a resale on one of my outdated cards, as I didn't feel it was worth the effort. But mind you, none of those cards had the potential to fetch a half decent dollar value, due to their age (usually 3+ years old when I upgrade) and the fact that they were usually midrange to begin with.
> Now, however, since mine, like yours, is watercooled, the issue seems moot. I will be re-selling a watercooled card, not an ACX.
> For others who aren't under water, there aren't any definitive reports yet on actual cooling differences, other than the fact the card has a few more screws in it. Is there going to be a real difference in the architecture of the sink? Or is it just a rebrand with the new pads, bios and backplate?
> If they released a Ti there wouldn't be this outcry, would there? To me the situation is equivalent in that regard. If something gets released that is an increase in performance, whether it's cooling or clock rate, I certainly don't feel like I deserve to be compensated by the company.
> Just a lot of whiney babies with entitlement issues.
> Screw em.


lol anyway todays edition; especially just for you: 
http://www.overclock.net/t/1618752/designing-a-new-line-of-high-quality-water-cooling-components-looking-for-your-input/100#post_25761950


----------



## Joenc

I'm glad EVGA will be refreshing the FTW cards, they need to... and if they add or

change a few things, maybe I'll buy the FTW2! I wanted an FTW since they were released, but

they had so many problems, I decided to wait and wait. Now maybe the wait will be

over in 3-4 months, but by then I might get a Vega card! haha


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> haha
> 
> glad someone appreciates me lol
> 
> *\/ \/ \/And speaking of more cooling \/ \/ \/ \/ \/*
> Those are damn good water temps, core seems a *little* high for such a low water temp though? (2.7c water and 24c core)


Yeah...the other card always runs exactly 2c cooler. I think I'm going to take the hotter one apart and reseat the block, see if I can even it up a bit. That was peak temp....the hot one was usually sitting around 21 or 22. I count 3 spots in Timespy where the temps spike a little bit.

I did a run at 2252 last night. Score dropped a tiny bit. Wouldn't do 2270. lol


----------



## Widde

Does anyone know why I'm getting hit by a voltage limit while overclocked? I cranked the voltage to max in Precision, as well as the power and temp targets. Running +125 / +350 atm on a 1080 FTW; temps never went above 57C.

(This was in Heaven)


----------



## Vellinious

Use GPUz sensors tab.


----------



## Widde

It said perfcap reason: VRel


----------



## Vellinious

Quote:


> Originally Posted by *Widde*
> 
> It said at perfcap reason: VRel


The temps it's showing are getting up there.....every 10c or so, there's a "layer", where boost 3.0 will either try to downclock the GPU, or bump the voltage. I would imagine at 61c, it's hitting one of those points and giving you the vrel perfcap reason. VREL is reliability voltage. Meaning, it's probably not 100% stable at those clocks/voltages/temps.

Try running a more aggressive fan curve.
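That layered behavior can be sketched as a tiny model. To be clear, the base clock, first-layer temperature and layer width below are illustrative assumptions (Nvidia doesn't publish the exact boost tables); the ~13 MHz step is the granularity Pascal is generally observed to use.

```python
# Toy model of GPU Boost 3.0's temperature "layers" (illustrative only --
# the real thresholds are not published; numbers here are assumptions).
BASE_CLOCK_MHZ = 2100        # hypothetical stable clock at low temp
STEP_MHZ = 13                # Pascal adjusts clocks in ~13 MHz bins
LAYER_WIDTH_C = 10           # assumed width of each thermal "layer"
FIRST_LAYER_C = 40           # assumed temp where stepping begins

def boosted_clock(temp_c):
    """Drop one ~13 MHz bin for every thermal layer crossed."""
    if temp_c < FIRST_LAYER_C:
        return BASE_CLOCK_MHZ
    layers = (temp_c - FIRST_LAYER_C) // LAYER_WIDTH_C + 1
    return BASE_CLOCK_MHZ - layers * STEP_MHZ

for t in (35, 45, 55, 65):
    print(t, boosted_clock(t))
```

With these made-up thresholds, a card crossing from the 30s into the 40s sheds one 13 MHz bin, which matches the "every 10c or so" behavior people report.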


----------



## nrpeyton

Quote:


> Originally Posted by *Widde*
> 
> It said at perfcap reason: VRel


You're at the maximum voltage for Pascal at 1.093v (that's the highest stock voltage with the voltage slider at 100%). If you reduced the voltage slider to 0% you'd only be at 1.042v

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah...the other card always runs exactly 2c cooler. I think I'm going to take the hotter one apart and reseat the block, see if I can even it up a bit. That was peak temp....the hot one was usually sitting around 21 or 22. I count 3 spots in Timespy where the temps spike a little bit.
> 
> I did a run at 2252 last night. Score dropped a tiny bit. Wouldn't do 2270. lol


I'll do a wee run just now on the chiller and compare water temps with load, see what I'm getting.

You should still be seeing the same difference between the two cards as me, I'd think 

And at least you got to use 2277 in your post (in name, at least).. I suppose you did get close enough to have earned that lol 

I couldn't get over 2264 without it crashing, and even 2264 was a bumpy ride lol.

Interesting how we're both stopping out at 2264 though. I wonder at what temp/clock it would start accepting a few more millivolts.


----------



## Vellinious

Possibly. The chiller is likely a little more efficient at removing the heat from the coolant.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Possibly. The chiller is likely a little more efficient at removing the heat from the coolant.


Aye, but if you've got a gauge reading your [in] water temp, that should still give you a comparable result.. I could be wrong, but I don't know what the science behind it would be lol if I was..

I edited last post:
_Interesting how we're both stopping out at 2264 though. I wonder at what temp/clock it would start accepting a few more milli volts._

You running T4 or FTW BIOS?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye but if you've got a guage reading your [in] water temp that should still give you a comparable result.. I could be wrong; but don't know what the science behind that would be lol if I was..
> 
> I edited last post:
> _Interesting how we're both stopping out at 2264 though. I wonder at what temp/clock it would start accepting a few more milli volts._
> 
> You running T4 or FTW BIOS?


That'd be hard to gauge, really. With like coolant temps, clocks and voltages, the GPU temps should be relatively the same. Should be.....power delivery from the PSU may play a part in that.

2252 was good, 2270 wouldn't run at all....just a few seconds, then it crashed. I'm thinking that a little more voltage, OR, cooler temps will get it done.

I'm running the stock FTW bios.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> That'd be hard to gauge, really. With like coolant temps, clocks and voltages, the GPU temps should be relatively the same. Should be.....power delivery from the PSU may play a part in that.
> 
> 2252 was good, 2270 wouldn't run at all....just a few seconds, then it crashed. I'm thinking that a little more voltage, OR, cooler temps will get it done.
> 
> I'm running the stock FTW bios.


Right, I've done the run. Just a simple 2202MHz run at 130% power, 1.093v

*(TimeSpy)*

Water temp was between 8c and 11.6c

GPU core temp went pretty much like this through the run:
17, 17, 17, 17, 17, 17, 17, 17, 18, 19, 20, 19, 18, 18, 18, 18, 18, 17, 17, 18, 19, 19, 19, 19, 20, 19, 19, 19, 19, 18

The chiller was only running for about 1/3rd of the run.

It cools down to the thermostat setting (8.0c) then waits until the temp gets 2.5c hotter then automatically switches on again. _(5 Litres in loop)._

Max power draw was 248w

So summary:
*Water Temp:* 8.0c to 11.6c
*Avg. Core Temp*: 18c
*Highest Core Temp (split second only):* 20c

Juniorz re-pasted his FTW (and changed pads) and dropped his core temp by 15c, which was astonishing.

At 3c water temp you should be maxing out at 12c in a TimeSpy run

I would have gone lower, but humidity is quite high today and I still haven't insulated yet lol.
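The chiller behavior described above (cool to the 8.0c setpoint, switch off, switch back on 2.5c above it) is a classic on/off hysteresis thermostat. A minimal sketch, using the numbers from the post:

```python
# On/off hysteresis thermostat, as described for the chiller:
# cool the loop to the setpoint (8.0c), switch off, then switch
# back on once the coolant drifts 2.5c above the setpoint.
class ChillerThermostat:
    def __init__(self, setpoint_c=8.0, hysteresis_c=2.5):
        self.setpoint = setpoint_c
        self.on_above = setpoint_c + hysteresis_c
        self.running = False

    def update(self, coolant_temp_c):
        if coolant_temp_c >= self.on_above:
            self.running = True       # too warm: start chilling
        elif coolant_temp_c <= self.setpoint:
            self.running = False      # reached setpoint: stop
        return self.running           # in the deadband: keep prior state

chiller = ChillerThermostat()
for temp in (9.0, 10.6, 9.5, 8.0, 9.9):
    print(temp, chiller.update(temp))
```

The 2.5c deadband is what stops the compressor from short-cycling; that's why the chiller only ran for about a third of the benchmark.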


----------



## Vellinious

Doubtful....12c even with those coolant temps is likely out of reach. I'd have to see it to believe it. That'd be a 9c or 10c difference between coolant temp, and GPU core temp under load......that's not going to happen. lol


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Doubtful....12c even with those coolant temps is likely out of reach. I'd have to see it to believe it. That'd be a 9c or 10c difference between coolant temp, and GPU core temp under load......that's not going to happen. lol


That's exactly what I'm getting though.

Once I've insulated up, I'll do a run and let you see.... this is just more motivation for me to get my act together with the insulation lol.

+1000 memory / 2277 core... watch out, coz here I come.

Also thinking of grabbing a few of these for the *back* of the PCB (they're only £12 each):


----------



## Vellinious

Where are you reading the temps from? GPUz sensors tab? And what are you using to get your coolant temps?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Where are you reading the temps from? GPUz sensors tab? And what are you using to get your coolant temps?


My loop has 5L of coolant in it.... constantly circulating. So it takes a lot to raise that temperature (especially with the chiller on).

Even with the chiller switched off I can maintain a coolant temp of 28c (at a 22c room) without the chiller even switching on, up to about 150w system load, 24/7.
The large amount of coolant in my loop (over a gallon) is able to dissipate about 150w without a radiator.

The coolant temperature is measured by the chiller as the coolant passes through, and is displayed on the front panel all the time. It's accurate to 0.1c

The core temps I'm measuring with GPU-Z / Afterburner / HWiNFO64; they all match up.

I'm using Kryonaut paste (it's as close to liquid metal performance as you can get without actually using LM).

I could use a temp probe to verify the coolant temperature inside the PC if you want, too... but fact is, I was still maxing out at 19c average at a coolant temp of about 10c.

Even a Furmark stress test (constantly drawing 285w) won't put the core temp above 24c with 11c water. And it takes a minute to get to 24c.
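The "takes a lot to raise that temperature" point is just water's heat capacity at work. A quick back-of-envelope (idealized: it ignores heat shed through tubing, reservoir walls and ambient):

```python
# Why 5 L of coolant warms slowly: water's heat capacity is
# ~4186 J/(kg*C), so a steady 150 W input with no radiator takes
# minutes to move the loop a single degree. Idealized calculation.
WATER_HEAT_CAPACITY = 4186.0   # J per kg per degree C
COOLANT_KG = 5.0               # ~5 L of water, ~5 kg

def seconds_per_degree(power_w, mass_kg=COOLANT_KG):
    """Time for the whole loop to warm by 1c at a given heat input."""
    return mass_kg * WATER_HEAT_CAPACITY / power_w

print(round(seconds_per_degree(150.0)))  # ~140 s per degree at 150 W
```

So at the quoted 150w load the loop drifts up by roughly a degree every two and a half minutes even with no radiator, which fits the slow-warming behavior described.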


----------



## Vellinious

I'm using Kryonaut as well. No idea how you're staying within 10c of coolant temps. Doesn't seem possible, especially considering that you're gonna be running 1c to 2c above coolant temps even at idle. No block is 100% efficient at removing heat. Which means your GPU is only creating 8c more heat under load, than it does at idle? Something's not right there.
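The disagreement here boils down to the block's effective thermal resistance: the die-to-coolant delta divided by the power going through it. Plugging in the figures from this exchange (the 0.06 c/W comparison value is an assumed "typical" number for illustration, not a measurement):

```python
# Back-of-envelope: effective die-to-coolant thermal resistance implied
# by the numbers in this thread. The real value depends on block,
# mount pressure, paste and sensor placement.
def thermal_resistance(core_c, coolant_c, power_w):
    """R = deltaT / P, in degrees C per watt."""
    return (core_c - coolant_c) / power_w

# nrpeyton's Furmark figures: 19c core, 9.4c water, ~275 W draw
r = thermal_resistance(19.0, 9.4, 275.0)
print(round(r, 3))  # ~0.035 c/W -- unusually low for a GPU block

# For comparison, an assumed more typical 0.06 c/W block at the same load:
print(round(0.06 * 275.0, 1))  # ~16.5c above coolant
```

That ~0.035 c/W figure is what Vellinious finds hard to believe; a block twice as resistive would put the core roughly 16c over coolant at the same load.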


----------



## nrpeyton

I'll take a probe to the water inside EK reservoir and confirm temp. 2 mins.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'm using Kryonaut as well. No idea how you're staying within 10c of coolant temps. Doesn't seem possible, especially considering that you're gonna be running 1c to 2c above coolant temps even at idle. No block is 100% efficient at removing heat. Which means your GPU is only creating 8c more heat under load, than it does at idle? Something's not right there.


Even with Furmark drawing 275w continuously for 6 minutes, the temp never went over 19c on the core with a 9.4c water temp. (see the last pic at the bottom and analyse to your heart's content) 







*How are you measuring your coolant temp?*


----------



## Vellinious

XSPC temp sensor. One before the rad and one after. Ambient temp is measured via XSPC sensor as well. I wanted to make sure that all temps I took, except for core temp, were measured by the same type of sensors.

I have a digital thermometer.....I'll have to check to see just how accurate those XSPC sensors are. Though, I can't imagine they're very far off....if at all.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> XSPC temp sensor. One before the rad and one after. Ambient temp is measured via XSPC sensor as well. I wanted to make sure that all temps I took, except for core temp, were measured by the same type of sensors.
> 
> I have a digital thermometer.....I'll have to check to see just how accurate those XSPC sensors are. Though, I can't imagine they're very far off....if at all.


There must be some explanation.

But considering some of these GPUs can run at 55c full load on air, and they have no IHS on top of the die, 10c above water temp must be possible. OR the Nvidia temp sensor in my GPU is faulty lol.

Still seems quite surprising the more I think about it; but then again, I've also heard the colder something is, the better it conducts. So the colder the water is, the more heat it will absorb, and more quickly. If I touch any of the metal bits in my loop it does feel like COLD metal.. if the GPU die is getting constant 8c water flowing over it at 1000L/hr, it maybe becomes more and more plausible.

I've also added extra pads to more VRM components (beyond what is asked of us in an EK manual) to improve heat dissipation.

Also what size of pads are you using? Don't know if the FTW is different.

But in EK manual it suggested 0.5mm memory and 1.0mm VRM. I got much better temps (20c) with 1.0mm memory and 1.5mm VRM as I was able to *really* squeeze those pads hard between block & component. They shrink to 50% their original height under good pressure.

Don't know if your VRM temps could be conducting along and affecting the core; but if that was happening you'd probably be seeing it hit your memory overclock long before you felt it on the core (opposite end of card).

Anyway just speculating lol; sometimes you have to speculate to accumulate 

I've also got no back-plate on. Some backplates have been known to trap heat, rather than remove it.


----------



## Vellinious

Both of these cards would stay under 55c or so with 20c ambient on air, with the fans running 100% at 2189 @ 1.093v. That's a 30c variance. Again, the question should be: will a 1080 only go up 8c or 9c under full load, as compared to idle? As I said, I find that highly unlikely. Maybe there are other examples of it, and maybe I'm wrong, but.....from my experiences with overclocking and watercooling? The magic 8 ball says, "The outlook is not good". lol /shrug

All that really matters is if your cards are performing the way you wish them to.


----------



## nrpeyton

lol true, i'm happy anyway


----------



## Vellinious

Me too....these cards have offered a unique challenge. = )


----------



## ucode

Quote:


> Originally Posted by *Widde*
> 
> Does anyone know why I'm getting hit by a voltage limit while overclocked?


Not so much a limit, more of a flag AFAIK. FWIW Nvidia can be really flaky with its reporting.



Can only really see the colors for two of them here; it's claiming thermal limiting at 35C and power limiting at ~7W. I don't think so. :/


----------



## Vellinious

That's a pretty old version of GPUz......could be that's the reason you're seeing that stuff.....just a thought.


----------



## hertz9753

You guys are making my GTX 980 Ti hot. That is a picture of the Ti folding and it also has a GTX 960 FTW under it. The Kraken G10 is fine but the old Antec 620 cooler is going to be replaced.

I would love to have a 1080 or the mystery 1080 Ti that hasn't been released, but I paid big money for the 980 Ti and I only have a 1080p monitor.


----------



## Vellinious

That's pretty high for a Maxwell....


----------



## hertz9753

Quote:


> Originally Posted by *Vellinious*
> 
> That's pretty high for a Maxwell....


That is true, but folding is a 24/7 stress benchmark. I have a push-pull H55 that I will install soon. It's not the best cooling solution, but it will be better than the failing Antec with only a push fan.

It used to fold and play games at 1491 with the stock 06G-P4-4991-KR bios. It's not an SC or even an FTW.









I just like to watch this thread so don't get mad at me for posting.


----------



## Vellinious

lol, post anywhere ya want.


----------



## ucode

Quote:


> Originally Posted by *Vellinious*
> 
> That's a pretty old version of GPUz......


Nothing wrong with the software; it just relays what the Nvidia driver tells it. Needs a driver fix; whether that will come or not is another question.



Quote:


> Originally Posted by *hertz9753*
> 
> It used to fold and play games at 1491 with the stock 06G-P4-4991-KR bios.


Strange with the 1080s. IIRC the first Pascal driver had P-states P0, P5 and P8; later drivers added P2. With P2 (usually used for compute / CUDA), the default memory clock got lowered to run at ~9000MT/s instead of the P0 default of ~10000MT/s. I don't know of any good reason for doing that unless there are memory overheating problems.


----------



## Vellinious

I've never seen my 1080s register those readings in the perf cap reason line......


----------



## c0ld

So is it viable to upgrade to the 1080 with the 1080 Ti not yet announced, or should I wait? Hmmmm


----------



## hertz9753

Folding will also put your memory into the P2 state on Maxwell or Pascal; it works the same as in games. It's a built-in thing and it doesn't matter which driver you are using, and I'm too lazy to mod my bios.


----------



## ucode

Quote:


> Originally Posted by *hertz9753*
> 
> Folding will also give you the P2 state on your memory on Maxwell or Pascal, it works the same is games. It's a built in thing and it doesn't matter which driver you are using and I'm to lazy to mod my bios.


Here's with the Pascal driver that was available when the 1080 FE's were launched.



As you can see there isn't a P2 state which means compute and CUDA apps are run in P0.

Later drivers, same VBIOS



Now P2 is introduced, resulting in memory clock speeds ~1000MT/s lower than P0. AFAIK AB adjusts both P2 and P0 by the same amount instead of separately, for whatever reason, which might be problematic for anyone wanting to maximize their P2 memory clock.


----------



## juniordnz

What are those P states? Never heard about it.


----------



## ucode

Quote:


> Originally Posted by *juniordnz*
> 
> What are those P states? Never heard about it.


Maybe this helps
Quote:


> The GPU performance state APIs are used to get and set various performance levels on a per-GPU basis. P-States are GPU active/executing performance capability and power consumption states.
> 
> P-States range from P0 to P15, with P0 being the highest performance/power state, and P15 being the lowest performance/power state. Each P-State maps to a performance level. Not all P-States are available on a given system. The definition of each P-States are currently as follows:
> 
> P0/P1 - Maximum 3D performance
> P2/P3 - Balanced 3D performance-power
> P8 - Basic HD video playback
> P10 - DVD playback
> P12 - Minimum idle power consumption


http://docs.nvidia.com/gameworks/content/gameworkslibrary/coresdk/nvapi/group__gpupstate.html
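The quoted NVAPI documentation can be boiled down to a small lookup. This helper is just a convenience built from the quoted table (on a live system, the current state can also be read with `nvidia-smi --query-gpu=pstate --format=csv,noheader`):

```python
# P-state descriptions, taken from the NVAPI docs quoted above.
P_STATES = {
    "P0": "Maximum 3D performance",
    "P1": "Maximum 3D performance",
    "P2": "Balanced 3D performance-power",
    "P3": "Balanced 3D performance-power",
    "P8": "Basic HD video playback",
    "P10": "DVD playback",
    "P12": "Minimum idle power consumption",
}

def is_full_performance(pstate):
    """True only for the maximum-performance states (P0/P1)."""
    return P_STATES.get(pstate) == "Maximum 3D performance"

print(is_full_performance("P0"))   # True
print(is_full_performance("P2"))   # False -- compute/CUDA often lands here
```

This is why the P2 memory-clock discussion above matters: a CUDA or folding workload sitting in P2 runs the balanced state's lower memory clock, not the P0 maximum.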


----------



## 6u4rdi4n

So I finally got a water block for my GTX 1080. Oh, what a difference in noise and temperatures! I did a 15 minute run of the Heaven benchmark to get a baseline before and after. Room temp stayed at 24.6C for both tests. On the stock ACX 3.0 cooler, the card would max out at 71C with the fans on auto, peaking at 65%. Same test on water: max 45C and no additional noise.

The loop consists of two EK XTX radiators, one 360 and one 120 with EK Vardar F3 fans in pull, D5 pump, EVGA GTX 1080 SC ACX added today (EK Acetal/Nickel with backplate) and an i7 2600K @ 4.5GHz with a Swiftech Apogee HD block.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> That is why it melted then.
> 
> You should have been monitoring your power draw through HWINFO64. (If cross-flashing this BIOS doesn't give you an accurate HWINFO64 reading there are other ways to make a rough estimate with wall-socket watt meters etc).
> 
> The normal max power limit of a FE is 180w or (or 216w when overclocked)
> 
> And for a '1 power cable card'; the max draw shouldn't exceed 225w. (150w cable, 75w lane).
> 
> As I said, each power cable is capable of 150w. And the lane 75w. That's 225w. If you're going over that, then expect the wire to heat up a little...
> 
> ...if you're _frequently_ going over that *for extended periods of time*
> then you either need to find a way to cool that cable (and plug), or simply don't use the T4 with a FE..
> or...
> just don't overclock quite as hard... I mean.. draw 250/260w but anything higher I'd be worried....
> 
> That being said, I'm sure many of these cables are made a lot "more equal" than others.
> 
> If you're overclocking your FE to 2150 - 2200 at 1.093v you're going to be drawing 275w +
> 
> It's possible your cable already had an issue, obviously there is some safety margin built into the 225w max of a one cable card.
> 
> Nvidia only put one cable on it because they never intended more than 216w (9w within spec of only one cable).
> 
> On an FTW (which has 2 connectors) the max safe power draw would be 375w. (you'd be hard pushed to draw that on ANY overclock) maybe furmark/LN2
> 
> Anyway I am sorry to hear what happened. I hope your card is okay?


Not entirely true; a single 8-pin connector is capable of way more than just 150w. If you have a modular power supply you'll find a double 8-pin PCI-E cable is actually connected to the PSU by a single 6-pin Molex connector. According to Molex, a 6-pin connector is more than enough for 300w. 8-pin for 150w and 6-pin for 75w is just the PCI-E standard and has nothing to do with what the connector is physically capable of.
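For what it's worth, the spec-level figures the two posts are juggling (75w from the slot, 75w per 6-pin, 150w per 8-pin) sum like this; as noted above, the physical connectors tolerate considerably more than the spec number, so this is a budget, not a melting point:

```python
# Spec-level PCI-E power budget per the figures discussed above
# (75 W slot, 75 W per 6-pin, 150 W per 8-pin).
SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def spec_power_budget(connectors):
    """Total in-spec board power for a list of aux power connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(spec_power_budget(["8pin"]))          # 1080 FE (one 8-pin): 225 W
print(spec_power_budget(["8pin", "8pin"]))  # 1080 FTW (two 8-pin): 375 W
```

Those are the 225w and 375w ceilings quoted earlier in the thread for the FE and FTW respectively.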


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Not entirely true, a single 8pin molex connector is capable of way more than just 150w. If you had a modular power supply you'll find a double 8-pin PCI-E cable is actually connect to the PSU by a single 6pin molex connector. According to molex, a 6pin connector is more than enough for 300w. 8pin for 150w and 6pin for 75w is just the PCIE standard and has nothing to do with what the connector is capable of.


I was taking the numbers from the ATX standard specifications as listed on various sites.

What is your theory behind why it melted, then? Dodgy cable? It's certainly odd given how many people with an FE have flashed this BIOS (and also done physical power mods) and never experienced this issue.

Faulty cable?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I was taking the numbers from the ATX standard specifications as listed on various sites.
> 
> What is your theory behind why it melted , dodgy cable? It was certainly odd how many with a FE have flashed this BIOS and also done physical power mods too, and never experienced this issue.
> 
> Faulty cable?


Could just be that the gauge of the wire was too small.


----------



## IronAge

I don't know why somebody would want to buy an FE any more, since there are enough full-cover water blocks available for custom designs, inexpensive ones too.

That is my Gainward Phoenix with Strix XOC T4 Bios @ 2228/5562 VDDC 1.125V

http://www.3dmark.com/3dm/17295759



It won't do much more though... probably 20-30 MHz more with a full-cover block... it does not seem to like too much overvoltage.

With the most recent Phoenix GLH BIOS @ 2190/5562, VDDC 1.093V: something like 12056 FSE graphics score.

It gets too warm when I do full FSE runs with the stock cooler... so I just ran the graphics tests.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> I was taking the numbers from the ATX standard specifications as listed on various sites.
> 
> What is your theory behind why it melted , dodgy cable? It was certainly odd how many with a FE have flashed this BIOS and also done physical power mods too, and never experienced this issue.
> 
> Faulty cable?


It looks like a faulty connector to me. Or the connector wasn't seated properly.
Very similar to some SATA connector meltdown cases..

Here's a picture of a standard Corsair PCI-E cable.
You can see that the two 8-pin connectors actually come from a single 8-pin connector. The single 8-pin connector at the PSU side carries 300w.

BTW, the T4 BIOS does not add stress to the PCI-E slot on 1080 FE cards. The GTX 1080 FE takes 100% of its core power from the 8-pin connector; there's no physical connection between PCI-E slot power and the GPU core VRM.


----------



## juniordnz

After changing the paste and all the thermal pads, I decided to give T4 another chance (the card would get too hot before).

I was able to get to 2151MHz / 1.075V. Temps were pretty much in control, like they were with the stock BIOS. Also, this T4 BIOS draws less power than the stock BIOS, or am I seeing things? I'm pretty sure I'm seeing fewer watts being pulled for the same voltage than I used to see with the stock EVGA BIOS...

2151MHz core
+575MHz mem
1.075V

25,656 points Firestrike graphics: http://www.3dmark.com/fs/11370372


----------



## ROKUGAN

Quote:


> Originally Posted by *6u4rdi4n*
> 
> So I finally got a water block for my GTX 1080. Oh what a difference in noise and temperatures! I did a 15 minute run of Heaven benchmark just to get a baseline before and after. Room temp stayed at 24,6C for both test. On the stock ACX 3.0 cooler, the card would max at 71C with the fans on auto, maxing at 65%. Same test on water, max 45C and no additional noise.
> 
> The loop consists of two EK XTX radiators, one 360 and one 120 with EK Vardar F3 fans in pull, D5 pump, EVGA GTX 1080 SC ACX added today (EK Acetal/Nickel with backplate) and an i7 2600K @ 4.5GHz with a Swiftech Apogee HD block.


I'm curious... what kind of GPU frequency increase are you getting from that 25C reduction?


----------



## juniordnz

Is there a way to undervolt Pascal without using K-Boost to lock the voltage/clock point on the curve? I.e., undervolt without losing the idle states.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Is there a way to undervolt pascal without using k boost to lock the voltage/clock point on the curve? Ie: undervolt without loosing the idle states.


Create a curve for it, and it'll try to run it at your specified voltages.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *ROKUGAN*
> 
> I´m curious...what kind of GPU frequency increase are you getting from that 25C reduction?


GPU Boost 3.0 did very little on its own... I've set the power limit to 120%, the temp limit to 92C, and core voltage to 100% so it can use all of its stock voltage. Haven't really gotten around to testing it far and wide, but in the Heaven benchmark it boosted 13mhz higher than before. One step, I guess? It also seems to be holding a step higher in frequency at the same voltage all over. Before, it didn't really want to go to 1.093V, but now it stays there most of the time. When it drops voltage, it keeps the frequency, so I guess it drops voltage because of TDP, or because it's smart enough to realize that it doesn't need that much voltage for that frequency all the time.


----------



## Vellinious

Quote:


> Originally Posted by *6u4rdi4n*
> 
> GPU Boost 3.0 did very little on its own... I've set power limit to 120%, temp limit to 92C and I've set core voltage to 100% so it can use all of its stock voltage. Haven't really gotten around to testing it far and wide, but in Heaven benchmark, it boosted 13mhz higher than before. 1 step I guess? It also seems to be holding a step higher in frequency at the same voltage all over. Before, it didn't really want to go to 1.093V, but now it stays there most of the time. When it drops volt, it keeps the frequency, so I guess it drops voltage because of TDP or because it's smart enough to realize that it doesn't need that much voltage for that frequency all the time.


It's probably dropping voltage when the temps drop a little. I'd bet it's hitting that "thermal layer" at 45c or 55c or wherever you're at, and dropping the voltage a step. As temps increase, and you cross one of those "layers" so to speak, it'll either raise voltage or drop the clock to keep the GPU stable. Depending on where you're at in the curve, that is. Everything goes off of the curve...even at stock clocks.


----------



## 6u4rdi4n

Exactly. Think I gotta tweak my fans or something to keep it below 45. Seems like both 40 and 45 are steps. Haven't gotten around to tweaking the curve or anything yet tho.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> Create a curve for it, and it'll try to run it at your specified voltages.


I can run my card stable at 2100MHz/1.025V. But when it reaches its first throttle point (49ºC) it goes to 2100MHz/1.063V, and then at the second throttle point (53ºC) it goes to 2088MHz/1.025V.

This curve thing is crazy. It doesn't obey what you tell it to do, nor does it follow any logic. Wish I could run at 2100/1.025, then throttle to the lowest stable voltage for 2088MHz, which is 0.993V.


----------



## Vellinious

Does anyone else have trouble getting Firestrike to run at the same clocks that Timespy does? I can run 2240 and 2252 on Timespy all night, but 2202 is the best I can do on Firestrike.

Anyone?


----------



## IronAge

Have you monitored the GPU usage during Timespy? Is it @ 99% all the time?

The Fire Strike Extreme graphics tests put the highest load on the GPU, so most likely it fails when Graphics Test #2 is running?


----------



## Vellinious

Quote:


> Originally Posted by *IronAge*
> 
> Have you monitored the GPU usage during Timespy ? Its @ 99% all the time ?
> 
> Fire Strike Extreme Graphics Tests put the hightest load on the GPU, so most likely it fails when Graphics Test #2 is running ?


At 2240 it fails in graphics test 1. At 2202 it runs great. 2214 runs all the way through, but scores drop. Same with 2227.


----------



## IronAge

Usually scores only drop when the VRAM clock rate is set too high.

Have you tried decreasing your VMEM clock, or setting the VMEM offset to zero, and seeing if it runs without crashing @ 2240?


----------



## Vellinious

Quote:


> Originally Posted by *IronAge*
> 
> Usually Scores only drop when VRAM clock rates is set too high
> 
> Have you tried decreasing your VMEM clock or set VMEM offset to zero and see if its running without crashing @ 2240 ?


Scores will drop on Pascal based GPUs when clocks get a little bit too high and the temps are a tad too warm.

I'm just asking if anyone has seen the same things....that clocks on Timespy tend to go higher than on Firestrike.

The scores are pretty high already. I just want more. lol


----------



## IronAge

Never had that... it always crashed/halted in FSE when trying to find the highest stable core clock... until then, scores just went up.

How much VDDC do you have to give for 2202? Probably I should use Timespy to make my GPU look better.









Your GPUs are under 45 degrees Celsius during FSE, or have you tried FSU?

When you want higher scores, try the Strix T4 BIOS... it should help a little.


----------



## fat4l

Quote:


> Originally Posted by *Vellinious*
> 
> Does anyone else have trouble getting Firestrike to run at the same clocks that Timespy does? I can run 2240 and 2252 on Timespy all night, but 2202 is the best I can do on Firestrike.
> 
> Anyone?


yeah, same thing; it's maybe about a 20MHz difference


----------



## 6u4rdi4n

I currently have an i7 2600k @ 4.5GHz with my GTX 1080. I'm getting an i7 4770k with a z87 motherboard very cheap. Is it worth dismantling my water cooling loop and swapping, or should I just resell it and save up for Ryzen, i7 7700K, x99 platform or something?

I'm gaming at 2560x1440.


----------



## ROKUGAN

Quote:


> Originally Posted by *juniordnz*
> 
> I can run my card stable at 2100mhz/1.025V. But when it reaches it's first throttle point (49ºC) it goes to 2100mhz/1.063V and then on the second throttle point (53ºC) it goes to 2.088mhz/1.025V.
> 
> This curve thing is crazy. It doesn't obey what you tell it to do nor does it follows the logic. WIsh I could run at 2100/1.025 than throttle to the lowest stable 2.088mhz clock, which is 0.993V).


Exactly my own experience here. Actually, I gave up trying to sustain over 2100MHz stable. The temp sensitivity of Pascal is almost ridiculous; the card could run >25C hotter without a problem.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> Does anyone else have trouble getting Firestrike to run at the same clocks that Timespy does? I can run 2240 and 2252 on Timespy all night, but 2202 is the best I can do on Firestrike.
> 
> Anyone?


I get low GPU utilization in Timespy (as low as 60%), so maybe you can use higher clocks in it because it's not stressing your GPU as hard as Firestrike does.


----------



## Dragonsyph

Quote:


> Originally Posted by *6u4rdi4n*
> 
> I currently have an i7 2600k @ 4.5GHz with my GTX 1080. I'm getting an i7 4770k with a z87 motherboard very cheap. Is it worth dismantling my water cooling loop and swapping, or should I just resell it and save up for Ryzen, i7 7700K, x99 platform or something?
> 
> I'm gaming at 2560x1440.


How cheap is very cheap? If your CPU is not bottlenecking your GPU in games, and your CPU is not hitting 100% on all cores when you are working or gaming, then I see no reason to need anything better than a 4770K. But a 7700K at, say, 5.2GHz has a fun factor to it, along with the newer CPUs having things like DDR4, etc.


----------



## Vellinious

Quote:


> Originally Posted by *fat4l*
> 
> yeah samething, maybe its about 20MHz difference


Hmm. I'm seeing a little bit more than that. Not a whole lot, though. 50mhz. I'm riding the GPUs on the ragged edge in COLD temps, though. Oddly...the cold didn't seem to help increase core clocks in Firestrike, but it did help in Timespy. Eh....doesn't matter I guess.

Quote:


> Originally Posted by *IronAge*
> 
> Never had that ... it always crashed/halted in FSE when trying to find the highest stable core clock.... until that scores just went up.
> 
> How much VDDC you have to give for 2202 ? Probably i should use Timespy to have my GPU look better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your GPUs are under 45 Degree Celcius during FSE or have you tried FSU ?
> 
> When you want higher scores try the Strix T4 Bios ... this should help a little.


Yes, I've run FS Ultra. It runs at 2202 / 5500.

http://www.3dmark.com/fs/11379686


----------



## max883

I used Thermal Grizzly Kryonaut and the thermal mod, with an updated BIOS, on my EVGA GTX 1080 ACX 3.0 SC. Fan on a custom curve in MSI Afterburner, 120% power and +50 on the GPU.

Before, I had an MSI 980 Ti Gaming. Temps were 75c and fan speed 66%.

Now with the EVGA GTX 1080 ACX 3.0 SC, temps are 60c and fan speed 45%!


----------



## juniordnz

Also, the curve changes some points after a stress session or after restarting the computer. I had it all set up to run 2100MHz from 1.025V to 1.064V, and when I started my computer today it was set to 2113MHz.

Oh, how I miss the BIOS editor...


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Also, the curve changes some points after a session of stress or restarting the computer. I had it all set up to run 2100mhz from 1.025V to 1.064V and when I started my computer today it was set to 2113mhz.
> 
> Oh, how I miss the BIOS editor...


I heard a rumor that there may be a bios editor after the 1080ti release. Just a rumor, and....I certainly don't put much hope in it, but.....eh


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Dragonsyph*
> 
> How cheap is very cheap? If your cpu is not bottlencking ur gpu in games and or your cpu is not hitting 100% on cores when you are doing work or gaming then i see no reason to need anything better then a 4770k. But a 7700k at say 5.2ghz has a fun factor to it. Along with the newer cpus having things like ddr4 etc.


€110 for the i7 4770K, Z87 motherboard and 16GB of RAM. I'm looking to get the minimum FPS in games like GTA V up, and better single-threaded performance for a few games.


----------



## GreedyMuffin

Quote:


> Originally Posted by *6u4rdi4n*
> 
> €110 for the i7 4770k, z87 motherboard and 16GB ram. I'm looking to get the minimum FPS in games like GTA V up. Also better singlethread performance because of a few games.


You're getting it that cheap, yes ^^ (Hint - CT)

I would do it considering the low price.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> I heard a rumor that there may be a bios editor after the 1080ti release. Just a rumor, and....I certainly don't put much hope in it, but.....eh


well, it's ok to dream...


----------



## 6u4rdi4n

Quote:


> Originally Posted by *GreedyMuffin*
> 
> You're getting it that cheap, yes ^^ (Hint - CT)
> 
> I would do it considering the low price.


Saw you here a long time ago (Hint - you're not as anonymous as you think^^)

I'm torn between upgrading or just reselling it - maybe building a complete PC out of it and then selling that. If the performance increase is insignificant, I'd rather not drain my loop and pick apart most of my computer just a couple of days after rebuilding the loop.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I heard a rumor that there may be a bios editor after the 1080ti release. Just a rumor, and....I certainly don't put much hope in it, but.....eh


Very interesting, mate; have you got a link to where you read the rumour?

It is quite possible that the guys who enjoy writing such an editor had 980 Tis and probably didn't see the point in upgrading until a 1080 Ti comes out (as the 980 Ti is only 10% slower than the 1080).

That's my *theory* 

But in all honesty.. I do think it's a dream...

Nvidia have a MASSIVE Research & Development budget to support; they don't want people overclocking because it slightly jeopardises sales of their next release. Not to mention potential RMA costs.

*A lot of hardware rumours have turned out to be untrue lately....*

Nvidia never even announced the 1080 Ti at CES!

I pulled this off a website:
_"In the months leading up to CES, there were several rumours floating around claiming that Nvidia would be showing off the GTX 1080Ti- something that ultimately turned out to be untrue. However, the rumour mill doesn't stop spinning, leading to new reports that the card will now be shown at PAX East on March 10th instead.
Given that past reports turned out to be unreliable, I would recommend taking this information with a dose of scepticism but according to a add-in Board partner source speaking with Tech Buyer's Guide, the GTX 1080Ti will make its appearance at PAX East in March with variants from MSI.

Apparently Nvidia has chosen to sit on the GTX 1080Ti for a few extra months in order to see what AMD has brewing with Vega 10. Judging from the report, board partners are already preparing variants of the GPU for launch but a founder's edition wasn't mentioned at all.
Obviously, the source on this information is anonymous but supposedly this person works for MSI, assuming the source is legitimate to begin with."_


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Very interesting, mate; have you got a link to where you read the rumour?
> 
> It is quite possible that the guys who enjoy writing such an editor had 980 Tis and probably didn't see the point in upgrading until a 1080 Ti comes out (as the 980 Ti is only 10% slower than the 1080).
> 
> That's my *theory*
> 
> But in all honesty.. I do think it's a dream...
> 
> Nvidia have a MASSIVE Research & Development budget to support; they don't want people overclocking because it slightly jeopardises sales of their next release. Not to mention potential RMA costs.
> 
> *A lot of hardware rumours have turned out to be untrue lately....*
> 
> Nvidia never even announced the 1080 Ti at CES!
> 
> I pulled this off a website:
> _"In the months leading up to CES, there were several rumours floating around claiming that Nvidia would be showing off the GTX 1080Ti- something that ultimately turned out to be untrue. However, the rumour mill doesn't stop spinning, leading to new reports that the card will now be shown at PAX East on March 10th instead.
> Given that past reports turned out to be unreliable, I would recommend taking this information with a dose of scepticism but according to a add-in Board partner source speaking with Tech Buyer's Guide, the GTX 1080Ti will make its appearance at PAX East in March with variants from MSI.
> 
> Apparently Nvidia has chosen to sit on the GTX 1080Ti for a few extra months in order to see what AMD has brewing with Vega 10. Judging from the report, board partners are already preparing variants of the GPU for launch but a founder's edition wasn't mentioned at all.
> Obviously, the source on this information is anonymous but supposedly this person works for MSI, assuming the source is legitimate to begin with."_


Don't even remember why I heard it. Like I said....probably not a very reliable source.....the guy has a friend with a cousin, who's uncle knows a guy that works as a janitor at NVIDIA, and he says "X" type of thing, most likely.

It's my understanding that the guy who makes the editors is the same guy that does GPU-Z. W1zzard? Something like that. He makes his home at the TPU forums. He'd been working on one, but NVIDIA locked it down pretty tight, I guess. Not sure if there's been any progress, but.....if one wanted to go straight to the source, that's where I'd start.

I'm not even really concerned about it anymore.....until someone starts beating my scores again, then I'll get interested. haha


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> until someone starts beating my scores again, then I'll get interested. haha


**** sake lol, hehe 

well I'm never going SLI again so I doubt that day will ever come but after Ry*zen* if you're up for a few single GPU runs against me, i'd be up for that lol

you can even pick your best one.. lol

~DreaM MachinE~


----------



## Vellinious

I haven't done single card runs in so long.....I'd need to check to see which GPU runs the best alone. lol


----------



## nrpeyton

;-)


----------



## juniordnz

I think I've finally tamed the curve. Got my card to run 2100MHz at 1.013V and score 25,433 points in Fire Strike.

http://www.3dmark.com/3dm/17324436?

One thing I noticed is how great the impact of the whole curve is, not just the point where you want your card to run. I gained 400+ points just by tweaking the curve to make it smoother, making sure all points below my max clock have 101-114MHz offsets. That eliminates peaks on the curve and yields more performance AND stability. It seems it's not only about the voltage/clock you're running at the moment, but its surroundings as well.

Is 25,400 points for an undervolted (-50mV) card any good?
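The curve-smoothing idea above can be sketched in code. This is a rough illustration only - the point list and the clamp band are hypothetical examples, not values read out of Afterburner or any real tool:

```python
# Sketch: smooth a GPU Boost 3.0 voltage/frequency curve by clamping the
# per-point clock offsets (MHz) below the target point into a narrow band,
# so no single point spikes above its neighbours.

def smooth_offsets(offsets, lo=101, hi=114):
    """Clamp every offset (MHz) into [lo, hi] to remove peaks in the curve."""
    return [min(max(o, lo), hi) for o in offsets]

# Hypothetical offsets for the curve points below the max-clock point:
raw = [90, 130, 101, 150, 110]
print(smooth_offsets(raw))  # [101, 114, 101, 114, 110]
```

The idea is just that no point below your max-clock point sticks out of the band, which is what "eliminating peaks" means here.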


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> I think I've finally tamed the curve. Got my card to run 2100MHz at 1.013V and score 25,433 points in Fire Strike.
> 
> http://www.3dmark.com/3dm/17324436?
> 
> One thing I noticed is how great the impact of the whole curve is, not just the point where you want your card to run. I gained 400+ points just by tweaking the curve to make it smoother, making sure all points below my max clock have 101-114MHz offsets. That eliminates peaks on the curve and yields more performance AND stability. It seems it's not only about the voltage/clock you're running at the moment, but its surroundings as well.
> 
> Is 25,400 points for an undervolted (-50mV) card any good?


What driver are you using? I don't see a 376.6 listed on NVIDIA's website


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> What driver are you using? I don't see a 376.6 listed on NVIDIA's website


This one:

Official Nvidia Geforce 376.60 Hotfix Driver

http://nvidia.custhelp.com/app/answers/detail/a_id/4293

Battlefield 1 crash on some Kepler based GPUs
Dark puddles in Battlefield 1
Random black screen in DOTA 2


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> This one:
> 
> Official Nvidia Geforce 376.60 Hotfix Driver
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4293
> 
> Battlefield 1 crash on some Kepler based GPUs
> Dark puddles in Battlefield 1
> Random black screen in DOTA 2


24586 at those clocks / voltage, but....I can't overclock my memory that high. Most mine will do without dropping FPS is +495, which puts me at about 5500.


----------



## juniordnz

Quote:


> Originally Posted by *Vellinious*
> 
> 24586 at those clocks / voltage, but....I can't overclock my memory that high. Most mine will do without dropping FPS is +495, which puts me at about 5500.


Mine will lose performance after +625, but BF1 won't take more than +575. It also didn't take 2100MHz/1.013V that well; I'm working on 2100/1.025 now.


----------



## Vellinious

Quote:


> Originally Posted by *juniordnz*
> 
> Mine will lose performance after +625, but BF1 won't take more than +575. It also didn't take 2100MHz/1.013V that well; I'm working on 2100/1.025 now.


I got mine to run as low as 1.000v at 2100 / +495 memory offset. Same score as I ran at 1.031v.

Probably the difference between water cooling and air cooling, I'd guess.


----------



## Dragonsyph

So has anyone broken FS graphics score of 26,405 yet?


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> So has anyone broken FS graphics score of 26,405 yet?


Haven't tried. Might give it a go this weekend. This rig doesn't like to run single GPU setups very well. Runs SLI like a beast, but single card? I dunno.....maybe I'm doin something wrong.


----------



## Tristanguy1224

So I'm new to a 1080 since Christmas.... Got the EVGA Classified 1080. Anyway I had two 970s in SLI and.... I'm actually seeing a performance LOSS in some places. Is there something wrong with my card?
Now I DID have Mr. Dark do up a custom BIOS for my 970 a few months ago to raise the TDP limits and clock speed but.... to beat a 1080? a CLASSIFIED 1080???? at ~2176-2150Mhz?
I feel like that shouldn't be. Now on games that didn't support SLI or did so poorly it's amazing but.... I feel like there's gotta be something I can do like I did with my 970s. Has anyone unlocked Pascal BIOS yet?


----------



## Dragonsyph

Quote:


> Originally Posted by *Tristanguy1224*
> 
> So I'm new to a 1080 since Christmas.... Got the EVGA Classified 1080. Anyway I had two 970s in SLI and.... I'm actually seeing a performance LOSS in some places. Is there something wrong with my card?
> Now I DID have Mr. Dark do up a custom BIOS for my 970 a few months ago to raise the TDP limits and clock speed but.... to beat a 1080? a CLASSIFIED 1080???? at ~2176-2150Mhz?
> I feel like that shouldn't be. Now on games that didn't support SLI or did so poorly it's amazing but.... I feel like there's gotta be something I can do like I did with my 970s. Has anyone unlocked Pascal BIOS yet?


Your 1080 is only getting 22-23k graphics scores; at those clocks you should be at 25-26k. It might be a CPU bottleneck with that FX CPU.


----------



## Tristanguy1224

Sure, I know it is to some degree, but still LESS than 2x 970s? With the same setup, though?


----------



## nrpeyton

Quote:


> Originally Posted by *Dragonsyph*
> 
> Your 1080 is only getting 22-23k graphics scores when at those clocks you should be 25-26k, might be a cpu bottleneck with that FX cpu.


Quote:


> Originally Posted by *Tristanguy1224*
> 
> So I'm new to a 1080 since Christmas.... Got the EVGA Classified 1080.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anyway I had two 970s in SLI and.... I'm actually seeing a performance LOSS in some places. Is there something wrong with my card?
> Now I DID have Mr. Dark do up a custom BIOS for my 970 a few months ago to raise the TDP limits and clock speed but.... to beat a 1080? a CLASSIFIED 1080???? at ~2176-2150Mhz?
> I feel like that shouldn't be. Now on games that didn't support SLI or did so poorly it's amazing but.... I feel like there's gotta be something I can do like I did with my 970s. Has anyone unlocked Pascal BIOS yet?


It won't be less; it's probably about 10% more. You're maybe just expecting a big boost and not seeing it.. but unless you have a technical problem, I doubt you're seeing less....

I agree with Dragonsyph.

I also have the EVGA 1080 Classified and an FX CPU (overclocked to 9590 clocks) and I am highly capped due to CPU.

Can't wait for ZEN to come out.

Anyway welcome to the 1080 Owners Club mate, I am so pleased to see another Classified owner here  welcome welcome 

There are no BIOS edits for Pascal yet, but there is an updated Classified voltage tool that works with core + memory, so you are luckier than most 

You can get to 320 watts just by using the 'slave' BIOS switch, which will give you 130% on your power slider in MSI Afterburner / Precision X.
For comparison, a Founders Edition's max power draw is 216W fully overclocked.

Your 1080 should be roughly 10% faster than 970 SLI (minimum). That's still *a lot* when you think about it.. Your Classified is *more* powerful than two of your old cards put together....!!

Your system is also much more modern now; you're "back in the game" so to speak.. in another year's time you'll still get a lot of money if you sell that Classified.. but in another year's time a 970 isn't going to be worth very much at all.

I upgraded to my 1080 from SLI 980s, on an FX-8350, and I haven't looked back... I'm absolutely delighted with it 
Best GPU I have ever owned 
Performance wise, it's pretty much equal to before, but consuming 1/3rd of the power and generating 1/4th of the heat.

In terms of raw overclocking, you might find you can go further (higher/faster) using the new "Curve" overclocking method.... instead of just a traditional +core bump in the main window of your O/C'ing app. _Try setting voltage to 50%_
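As a back-of-the-envelope check on those power-limit numbers (the 320W-at-130% figure comes from the post above; the function name and the implied base TDP are just illustrative arithmetic, not anything read from a BIOS):

```python
def board_power(base_tdp_w, slider_pct):
    """Board power target implied by a percentage power slider."""
    return base_tdp_w * slider_pct / 100

# If 130% on the slider corresponds to 320W, the implied 100% base is:
base = 320 / 1.30
print(round(base))  # 246 (watts)

# And sliding back to 130% recovers the 320W ceiling:
print(board_power(base, 130))
```

Against the quoted 216W max for a fully overclocked Founders Edition, that gap is the whole point of the slave BIOS switch.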


----------



## Tristanguy1224

Thanks for the welcome.
I'm starting to think my 970s were just beastly when unlocked as they were..... I was running them 24/7 at 1544 core 4010 memory. With the modified BIOS I got from Mr. Dark it removed GPU Boost 2.0 to lock the clocks at what I set them to as well as raising the TDP. I'm already running on the LN2 BIOS on my Classy and moved the slider to 130% on EVGA Precision.

Also after seeing the huge difference between stock and modified BIOS on the old cards I'm sure when someone figures out the Pascal BIOS I'll get an appreciable bump in performance... I just expected more from the "top dog" card.


----------



## nrpeyton

Quote:


> Originally Posted by *Tristanguy1224*
> 
> Thanks for the welcome.
> I'm starting to think my 970s were just beastly when unlocked as they were..... I was running them 24/7 at 1544 core 4010 memory. With the modified BIOS I got from Mr. Dark it removed GPU Boost 2.0 to lock the clocks at what I set them to as well as raising the TDP. I'm already running on the LN2 BIOS on my Classy and moved the slider to 130% on EVGA Precision.
> 
> Also after seeing the huge difference between stock and modified BIOS on the old cards I'm sure when someone figures out the Pascal BIOS I'll get an appreciable bump in performance... I just expected more from the "top dog" card.


I agree it's the top dog card 

But are you being power throttled in games? I would doubt it. I've yet to play a game (or benchmark) that draws more than around 275 watts.

Only Furmark / OCCT seem capable of drawing the card's maximum power....

You can check that using GPU-Z, watching the "PerfCap Reason" on the Sensors tab...

If you bumped the voltage up you might start to see more power usage... but voltage isn't doing an awful lot for Pascal.

It's got *SOME* people extra.. but only a select few....

I'm running on a water chiller and still haven't managed to get temps low enough to add in any more voltage. (Except on memory, where I got nice gains with lower temps + a little voltage boost.)

Pascal just doesn't seem to like voltage at all... so even if the BIOS editor existed, it probably wouldn't help.
If you really wanted, you could test this yourself using the updated Classified voltage tool and see if it gets you anything extra.... I'd be interested if you did  you'd be one of the rare few


----------



## Tristanguy1224

I see what you're saying. But the difference it made on my 970s (WITHOUT extra voltage - that didn't help my Maxwells either) was literally night and day. There HAS to be SOMETHING like that I can do with Pascal


----------



## nrpeyton

Quote:


> Originally Posted by *Tristanguy1224*
> 
> I see what you're saying. But the difference it made on my 970s (WITHOUT extra voltage- didn't help my Maxwells either) it was literally night and day. There HAS to be SOMETHING like that I can do with Pascal


The only way to make your card run faster without crashing is more cooling...

The scaling is about 100MHz for every 50 degrees C on Pascal (*very* approximately).

At 50 degrees C I was maxing out at about 2151 --> 2176 (occasionally I could get a few minutes at 2184MHz).

At 15 degrees C I am maxing out at about 2252MHz - 2264MHz (2264 is very bumpy).
I haven't played around with the voltage tool much since getting the water chiller, though..... there might be more I can do here, I just haven't had the time yet.. but if it does get me anything at all, it's probably going to be a few MHz for a few millivolts... if I'm even lucky....

/\ that's still a *massive* overclock, considering the stock Nvidia boost for an FE 1080 is 1733MHz. People sometimes forget that when their card comes with a factory O/C 
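That rough scaling rule (about 100MHz per 50C, i.e. ~2MHz per degree, anchored at the numbers quoted above) can be written down as a quick estimator. Purely illustrative; the function and its defaults just restate the rule of thumb and the 2151MHz-at-50C anchor point from this post:

```python
def est_max_clock(temp_c, base_clock=2151, base_temp=50, mhz_per_c=2.0):
    """Very rough Pascal max-clock estimate: ~100MHz per 50C (2MHz/C)."""
    return base_clock + mhz_per_c * (base_temp - temp_c)

print(est_max_clock(50))  # 2151.0 (the anchor point)
print(est_max_clock(15))  # 2221.0 - the post saw ~2252, so the rule is rough
```

It undershoots the observed 15C result by ~30MHz, which is about what "*very* approximately" deserves.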

Updated EVGA 1080 Classified Voltage Tool:
*BE CAREFUL:*
This allows altering voltage past the Nvidia BIOS limit of 1.093v.

The only protection will be at the hardware level while this tool is active.

The second row down is memory.

The third (you won't need to touch this) is PCI-E voltage.

*Warning:*
Temp, power & strain on your card will all rise exponentially using this tool..

Classified1080voltagetool.zip 934k .zip file


----------



## Tristanguy1224

SOOOO let's say I build another rig like the one I posted years ago ("A Mini Fridge CAN Cool a PC") and chill my GPU sub-zero - it'll scale? Not necessarily perfectly, but the colder I get it, the faster it'll go? I'm seriously not above building a completely ridiculous cooling setup....


----------



## nrpeyton

Quote:


> Originally Posted by *Tristanguy1224*
> 
> SOOOO let's say I build another rig like the one I posted years ago "A Mini Fridge CAN cool a PC" and chill my GPU sub zero it'll scale? Not necessarily perfectly but the colder I get it the faster it'll go? I'm seriously not above building a completely ridiculous cooling setup....


Exactly 

Here's a few of my other posts about classified 1080:

http://forum.kingpincooling.com/showthread.php?t=3938

http://forums.evga.com/Must-Have-changes-to-EK-manual-RE-pad-sizes-For-780TI-Classy-Block-1080-Classy-m2598540.aspx


----------



## Tristanguy1224

I don't like GPU Boost 3.0...


----------



## Tristanguy1224

Quote:


> Originally Posted by *nrpeyton*
> 
> Exactly


What do you do to fight condensation?


----------



## nrpeyton

Quote:


> Originally Posted by *Tristanguy1224*
> 
> What do you do to fight condensation?


Here's a few of my other posts about classified 1080:

http://forum.kingpincooling.com/showthread.php?t=3938

http://forums.evga.com/Must-Have-changes-to-EK-manual-RE-pad-sizes-For-780TI-Classy-Block-1080-Classy-m2598540.aspx

Using this:


It's a Dew Point / Relative Humidity meter.

The reading that you see is the *lowest water temp* I can go to before condensation would start kicking in.. it changes every day 

That picture was taken about 7pm tonight 

I've actually just finished modifying the chiller for sub-zero tonight.. in fact that's why I'm so excited and still active on here at 04:39am, my time.. lol
https://www.techpowerup.com/forums/threads/water-chiller-thats-her-modified-for-subzero.229527/

Tonight I was fiddling with the electrics to bypass the thermostat.. tomorrow I begin researching how to insulate.. and then I can go *below* the reading on my dew point meter.. to wherever I want 

The chiller only cost me £200 (about $250 in U.S. dollars) on eBay _(used)_
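The dew-point reading such a meter shows can also be computed from air temperature and relative humidity with the Magnus approximation - a sketch, not necessarily the meter's actual formula (the constants are the commonly used Magnus coefficients):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct, b=17.62, c=243.12):
    """Magnus approximation: lowest coolant temp before condensation forms."""
    gamma = math.log(rel_humidity_pct / 100) + (b * temp_c) / (c + temp_c)
    return c * gamma / (b - gamma)

# e.g. 20C room air at 50% RH -> dew point around 9.3C, so chilled water
# below ~9C would start sweating the block:
print(round(dew_point_c(20, 50), 1))
```

At 100% humidity the dew point equals the air temperature, which is why the safe margin shrinks on damp days and "it changes every day".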


----------



## Tristanguy1224

Ok, so your water chiller has granular control.. THAT'S the way to go. I appreciate the information. I actually work at a Micro Center now, so I'm going to be getting an Intel setup as soon as I can. I just really expected more from a 1080, especially the one with the highest stock TDP.


----------



## nrpeyton

Quote:


> Originally Posted by *Tristanguy1224*
> 
> Ok so your water chiller has granular control.. THAT'S the way to go. I appreciate the information. I actually work at a microcenter now so I'm going to be getting an Intel setup as soon as I can I just really expected more from a 1080 especially the one with the highest stock TDP.


The lithography on these GPUs is getting smaller and the whole game is beginning to shift... the old 100c is the new 50c..... the new 50c is the new 0c.. or -1c ;-)

Chips are so much more fragile; they use less power... therefore they can't take all the power you still want to throw at them like before.. not without more cooling ;-)

This is the future; I actually believe this is why water-cooling came about.. if you think back 10 years.. you could do a lot more on air than what you are able to do now...

Not in terms of actual MHz per MHz.. but on a % scale...

People will still always utilize the difference in silicon to gain every last drop of performance... overclocking is still as strong as ever -- and always will be -- but if you want the same gains as before, you have to push harder and spend a bit more $$$

There's a guy on here I speak to frequently... he's been wheeling his entire rig outside into -20c temps, just to get the *winter's edge* ;-)


----------



## Vellinious

Quote:


> Originally Posted by *Tristanguy1224*
> 
> So I'm new to a 1080 since Christmas.... Got the EVGA Classified 1080. Anyway I had two 970s in SLI and.... I'm actually seeing a performance LOSS in some places. Is there something wrong with my card?
> Now I DID have Mr. Dark do up a custom BIOS for my 970 a few months ago to raise the TDP limits and clock speed but.... to beat a 1080? a CLASSIFIED 1080???? at ~2176-2150Mhz?
> I feel like that shouldn't be. Now on games that didn't support SLI or did so poorly it's amazing but.... I feel like there's gotta be something I can do like I did with my 970s. Has anyone unlocked Pascal BIOS yet?


2 x 970s should be beating a single 1080 pretty handily. The 970s I had made a single 1080 look pretty regular.

http://www.3dmark.com/fs/6368664


----------



## Tristanguy1224

So.... unrelated question. In A LOT of games - ANY time I play JUST THESE games in 4K - I get insane keyboard lag. I'll press W and there will be close to a .25 - .5 sec. delay before I move OR before I stop moving. My mouse isn't delayed like this. My mouse clicks are not delayed like this. My Xbox One controller is not affected AT ALL. Any ideas?


----------



## Tristanguy1224

Quote:


> Originally Posted by *Vellinious*
> 
> 2 x 970s should be beating a single 1080 pretty handily. The 970s I had made a single 1080 look pretty regular.
> 
> http://www.3dmark.com/fs/6368664


Holy Im-CPU-limited-Batman!


----------



## Vellinious

Quote:


> Originally Posted by *Tristanguy1224*
> 
> Holy Im-CPU-limited-Batman!


Yeah....the FX processor is killin ya.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah....the FX processor is killin ya.


:-( lol

You really need to start abusing that FX some more if you want to get those frames up lol (Tristanguy, I'm the same... had her at 5.3GHz the other night -- stock on mine is 4.0GHz). _I was forced to use 6500 RPM worth of fan to cool the mobo VRMs, to stop mobo-initiated CPU throttling_

What's the most you've had out of her?
I've not really seen or heard much about 9590 overclocking....


----------



## TK421

Hey guys, what would be a retailer that doesn't charge tax when shipping to NY? I'm looking to buy a 2nd 1080.


----------



## hertz9753

Maybe newegg or superbiiz?


----------



## TK421

Quote:


> Originally Posted by *hertz9753*
> 
> Maybe newegg or superbiiz?


Ah, never heard of SuperBiiz. I just checked out their FAQ and they do seem not to charge tax when shipping outside of CA.

Has anyone bought from SuperBiiz and had any experience with the seller? How they handle orders, etc.?


----------



## hertz9753

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hertz9753*
> 
> Maybe newegg or superbiiz?
> 
> 
> 
> Ah, never heard of SuperBiiz. I just checked out their FAQ and they do seem not to charge tax when shipping outside of CA.
> 
> Has anyone bought from SuperBiiz and had any experience with the seller? How they handle orders, etc.?
Click to expand...

I have ordered from them many times; they used to be named something like eWiz. Unlike Newegg, they also sell on eBay under two other seller names, and you wouldn't know it's them until you see the shipping invoice.


----------



## fat4l

Does the tool work with FE card too ?


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> Does the tool work with FE card too ?


Which tool?

If you're referring to the 1080 Classy voltage tool, I've only tested it on the Classy 1080.
I doubt the FE uses the same voltage controller as the Classy, so my guess would probably have to be no. Sorry.

But I could be wrong; there's only one way to find out.

I verified using a multimeter.

But others can verify by watching the power draw.
Your reported voltage won't go up in Windows, but if it works, your actual voltage will still be what you set.
You can tell because the idle power draw reported in HWiNFO64 will increase a little the moment you apply a higher voltage, or vice versa.


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> Which tool?
> 
> If you're referring to the 1080 Classy voltage tool, I've only tested it on the Classy 1080.
> I doubt the FE uses the same voltage controller as the Classy, so my guess would probably have to be no. Sorry.
> 
> But I could be wrong; there's only one way to find out.
> 
> I verified using a multimeter.
> 
> But others can verify by watching the power draw.
> Your reported voltage won't go up in Windows, but if it works, your actual voltage will still be what you set.
> You can tell because the idle power draw reported in HWiNFO64 will increase a little the moment you apply a higher voltage, or vice versa.


Aha. Well, can I test it by just seeing "if the card clocks higher"?
It shows a voltage of 1.0V in the tool.


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> aha. well can I try it the way that...."if the card clocks higher" ?
> It shows the voltage of 1.0v in the tool.


Not necessarily, because there aren't many 1080s that actually _like_ any extra voltage. Most don't play nice lol. I'm running chilled water, and with the exception of memory (which responded nicely to a bit of extra voltage and lower temps) I haven't really been able to fully utilize the tool yet.

Going subzero (mild subzero) soon though, so I'll let you all know how I get on with it then.

Don't get me wrong; the extra cooling certainly did unlock extra frequency (got up to 2252MHz), which was a nice jump. But that was at 1.093v (the normal max).

That being said, I've not spent an entire night playing with it, or done the sort of trial and error I would have when I first bought it and joined the club here... someone else may have successfully traded a few extra millivolts for a few MHz


----------



## Buzzard1

Quote:


> Originally Posted by *Vellinious*
> 
> 2 x 970s should be beating a single 1080 pretty handily. The 970s I had made a single 1080 look pretty regular.
> 
> http://www.3dmark.com/fs/6368664


I moved from 2 x 970 SLI and got a 10-15% boost in performance. From everything I gather, the 970s were very good overclockers, whereas the 1080 is **** for overclocking.

You will still notice in actual gameplay that FPS will be more steady and reliable than with the 2 970s.

I just saw this post and felt compelled to share my results.


----------



## 6u4rdi4n

It's like Sandy Bridge vs newer CPUs. You won't get that much more performance, but FPS seems way more stable with newer ones, 6700K for example.

I would much rather have one card with the same performance as 2 cards, even slightly below. I feel the only reasons to go with 2 (or more cards) are either that you have one powerful card and get another one cheap or used down the line, or the most powerful single card you can buy isn't enough.


----------



## Buzzard1

So I flashed my 1080 FTW with the Strix T4 BIOS. It's kinda interesting: with the stock BIOS I was only able to stay at 2088, while with the T4 BIOS I was stable and staying at 2113. The weird thing is, my temps are 4-6C colder than with the stock EVGA BIOS, and I'm curious why. I have been running Heaven for 4 hours now and my temps hover between 52C and 58C; usually my temps are 58C-64C.

Everyone told me that my card would run hotter, and that seemed to make sense, so why would my temps be getting lower?

P.S. My benchmark scores went up too.


----------



## Buzzard1

Quote:


> Originally Posted by *6u4rdi4n*
> 
> It's like Sandy Bridge vs newer CPUs. You won't get that much more performance, but FPS seems way more stable with newer ones, 6700K for example.
> 
> I would much rather have one card with the same performance as 2 cards, even slightly below. I feel the only reasons to go with 2 (or more cards) are either that you have one powerful card and get another one cheap or used down the line, or the most powerful single card you can buy isn't enough.


Why not have 2 x 1080s


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Buzzard1*
> 
> Why not have 2 x 1080s


I would rather have one GTX Titan X (Pascal) than 2 GTX 1080s. But of course, before the new Titan X was launched, I would have bought 2 GTX 1080s if one wasn't enough. See my point?


----------



## Buzzard1

Quote:


> Originally Posted by *6u4rdi4n*
> 
> I would rather have one GTX Titan X (Pascal) than 2 GTX 1080s. But of course, before the new Titan X was launched, I would have bought 2 GTX 1080s if one wasn't enough. See my point?


I guess it comes down to a matter of opinion. Personally, I would rather have 2 x 1080s than the latest Titan X Pascal. My buddy has the Titan XP and his frames are not that much higher than mine. 2 x 1080s would blow the doors off any Titan, as long as the game is not a disaster for SLI scaling.


----------



## fat4l

Guys.

Is there any AIO that is fully compatible with the 1080 FE?
Or any that would be compatible after some mods? We have a Dremel ready.
Not more than £100 / $100.

We would like to keep the stock fan and shroud and just replace the core cooling.

Does anyone know of something? Thanks


----------



## 6u4rdi4n

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> 
> Is there any AIO that is fully compatible with 1080 FE?
> Or any that would be compatible after some mods? We have dremel ready.
> Not more than 100£ / 100$.
> 
> We would like to use the stock fan and shroud and just replace the core cooling.
> 
> Anyone knows about something ? Thanks


The only one I know of that is fully compatible is the EVGA Hybrid cooler. It comes with its own shroud and costs $120, though.


----------



## arrow0309

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> 
> Is there any AIO that is fully compatible with the 1080 FE?
> Or any that would be compatible after some mods? We have a Dremel ready.
> Not more than £100 / $100.
> 
> We would like to keep the stock fan and shroud and just replace the core cooling.
> 
> Does anyone know of something? Thanks


Get this one; strongly advised:

http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


----------



## lanofsong

Hello GTX 1080 owners,

We are having our monthly Foldathon from Monday the 16th to Wednesday the 18th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

January 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726

later
lanofsong


----------



## Krzych04650

Just got the second card on Wednesday. So far I am more satisfied than I expected. All AAA titles I play support it, all MMOs I play support it, and even games like ATS can be fixed with Nvidia Inspector, so I can now run the game at 400% resolution scale and get rid of those horrendous amounts of aliasing. Overall, considerably more support than I expected.

Scaling at 3440x1440 is not as good as at 4K; it varies from 50 to 90%, but in the games I have tested so far it averages about 75%, which is very acceptable. I was playing a bit with video editing recently just out of curiosity, so I even made a video if someone is interested:




Now the last upgrade to make is i7 and additional 8 GB of RAM, but I will do that in March because for now I am broke









Power draw is up to 600W with overclocked 4690K, usually around 550W if you push the cards without v-sync.

The biggest concern will be temperatures. The top card is already getting to around 65 C with 80% fan speed, and that is with a 12 C ambient (PC in a different room) and a case with additional side and bottom intakes. So during hot days it will require 100% fan speed to stay a bit below 80 C, meaning such a setup is definitely not usable if you have your PC next to you, or even in the same room.

Overall I am satisfied and it was worth it. Even just for a smile on my face after seeing what this second card allowed me to do with Witcher 3, basically doubling ultra settings on shadows and foliage, game looks just unbelievably beautiful right now.


----------



## webhito

Hey fellas! I recently purchased a gtx 1080, I do however have a question/problem.

For some reason my benchmarks are giving me random results, some topping 20k and some 18.9k, with no settings changed; in 2 different runs I can get the same type of result, or I can run the same benchmark 4 times, get good results, and then it will drop... Driver-wise I am using the official 376.33; I tried the 2 newer ones with hotfixes but they do the same thing. Mind you, the high performance option in Windows and the balanced one yield the same results.

This is driving me crazy lol.

Upon further investigation, it seems that only the normal firestrike benchmark is doing this, extreme has proven to be much more reliable and is giving me closer results.
Still I have no idea why it would be doing this.


----------



## Vellinious

Got a single card to run 2278. Couldn't get them to run in SLI at that speed, though. 2252 in SLI was the best run of the night. The 2278 score was worse than 2240, but...it ran it. Made me smile. Gonna try higher in a bit. Gotta let the loop cool back down.


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Vellinious*
> 
> Got a single card to run 2278. Couldn't get them to run in SLI at that speed, though. 2252 in SLI was the best run of the night. The 2278 score was worse than 2240, but...it ran it. Made me smile. Gonna try higher in a bit. Gotta let the loop cool back down.






What is your mem clock again? The offset.
I have been wondering what the memory clock reading in GPU-Z means when it reads, as in your case, 1390.5. How does that correlate to the offset?


----------



## 6u4rdi4n

I'm guessing that's +555 on memory?


----------



## Derek1

Quote:


> Originally Posted by *6u4rdi4n*
> 
> I'm guessing that's +555 on memory?


Ya it's hard to tell. I know when mine is set at a +800 offset, or about 11600 effective, GPU-Z reads the clock as around 1425 or something.
Why does it do that? How is it getting that number?

ETA: I think I figured it out. It is the total effective memory clock divided by 8 (it only coincidentally matches the 8 GB memory size).
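Derek1's deduction can be sketched in a couple of lines of arithmetic. This is only a sketch of the numbers used in this thread: it assumes the GTX 1080's stock GDDR5X effective rate of 10008 MT/s and that the Precision X / Afterburner offset is applied to the double-data-rate clock (so each +1 of offset adds 2 MT/s); the small mismatches against the readings quoted above come down to clock-step granularity.

```python
# Sketch of the GPU-Z memory clock arithmetic discussed above.
# Assumptions (not official figures): stock effective rate is 10008 MT/s,
# and the Precision X / Afterburner offset applies to the DDR clock,
# so each +1 of offset adds 2 MT/s to the effective rate.

STOCK_EFFECTIVE_MTS = 10008  # GTX 1080 stock "10 Gbps" GDDR5X

def effective_rate(offset_mhz):
    """Effective transfer rate (MT/s) after an Afterburner-style offset."""
    return STOCK_EFFECTIVE_MTS + 2 * offset_mhz

def gpuz_reading(offset_mhz):
    """GPU-Z shows the command clock: the effective rate divided by 8."""
    return effective_rate(offset_mhz) / 8

print(gpuz_reading(0))    # 1251.0 MHz at stock
print(gpuz_reading(560))  # 1391.0 MHz -- close to the 1390.5 quoted above
```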


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Got a single card to run 2278. Couldn't get them to run in SLI at that speed, though. 2252 in SLI was the best run of the night. The 2278 score was worse than 2240, but...it ran it. Made me smile. Gonna try higher in a bit. Gotta let the loop cool back down.


little *7* millivolt bump up to 1.1v?

you never know... what is the highest people are getting at stock voltage (the highest 10% of cards)... i wonder

I doubt Nvidia expected people to be running any cooler than an overkill standard water setup when they decided on 1.093v.

I could be wrong though.

But at the clocks you're getting you're probably in that 10%.

I know you've never agreed with any extra voltage, but that's only going to be true up to a point. Your water temp is at 7c. So doing well.

I've got my insulation in the post so I can go lower.

P.S.
Av got a Skype appointment with someone tonight to discuss overclocking these
Got ma dinner etc to get first.


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> little *7* millivolt bump up to 1.1v?
> 
> you never know... what is the highest people are getting at stock voltage (the highest 10% of cards)... i wonder
> 
> I doubt Nvidia expected people to be running any cooler than an overkill standard water setup when they decided on 1.093v.
> 
> I could be wrong though.
> 
> But at the clocks you're getting you're probably in that 10%.
> 
> I know you've never agreed with any extra voltage, but that's only going to be true up to a point. Your water temp is at 7c. So doing well.
> 
> I've got my insulation in the post so I can go lower.
> 
> P.S.
> Av got a Skype appointment with someone tonight to discuss overclocking these
> Got ma dinner etc to get first.






My general impression from looking around various places over the last 6 months or so is that anything over 2250 is top 5%.
I have still only seen verified scores of 2300+ from 2 people, and they were in this thread. One was on air at 1.093v at 60C; I believe the other was under water, I would need to check.
2200 - 2250 would be the next tier, at 10%.
2150 - 2200 next, and around 25% of cards do that.
2100 - 2150 next, and I am guessing 50% - 75% will do that.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> My general impression from looking around various places over the last 6 months or so is that anything over 2250 is top 5%.
> I have still only
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> seen verified scores of 2300+ from only 2 people, and they were in this thread. One on air at 1.093v at 60C. I believe the other was underwater, would need to check.
> 2200 - 2250 would be next tier at 10%.
> 2150 - 2200 next and around 25% of cards do that
> 2100 - 2150 next and I am guessing 50% - 75% will do that


good post


----------



## Derek1

You and Vellinious have certainly controlled for the temps aspect of the volts x power x temperature equation when OCing these cards.
I was just reading Kingpin's write-up again and he mentions that overvolting is not recommended due to temperature limitations on air or water.
Surely that doesn't apply to you guys now.
You both should be able to punch through 2300 to my mind, barring the limitations of the silicon of course.
If you can both run at 2250+ and stay under 30C while doing it, then you have not crossed the thermal barrier for those clocks. For example, my card throttles one step at 43C and then again at around 55-60C, I believe, at 1.093v with ~110% power. That is with the GPU starting at 2164. I can get a bench to load at 2178 but then I crash and see VRel, so I do need to add some volts to get it stable. (Just as an aside, I have never seen power go over 120% despite having the slave BIOS with 130 set.)
You may not see increased FPS, higher graphics scores, or significant enough gains to warrant the extra volts, but I believe you guys can get your clocks over 2300 with the T4.

But wth do I know. lol
I have no electronics expertise at all other than putting the fat side of the plug into the proper hole in the outlet.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> You and Vellinious have certainly controlled for the temps aspect of the volts x power x temperature equation when OCing these cards.
> I was just reading Kingpin's write-up again and he mentions that overvolting is not recommended due to temperature limitations on air or water.
> Surely that doesn't apply to you guys now.
> You both should be able to punch through 2300 to my mind, barring the limitations of the silicon of course.
> If you can both run at 2250+ and stay under 30C while doing it, then you have not crossed the thermal barrier for those clocks. For example, my card throttles one step at 43C and then again at around 55-60C, I believe, at 1.093v with ~110% power. That is with the GPU starting at 2164. I can get a bench to load at 2178 but then I crash and see VRel, so I do need to add some volts to get it stable. (Just as an aside, I have never seen power go over 120% despite having the slave BIOS with 130 set.)
> You may not see increased FPS, higher graphics scores, or significant enough gains to warrant the extra volts, but I believe you guys can get your clocks over 2300 with the T4.
> 
> But wth do I know. lol
> I have no electronics expertise at all other than putting the fat side of the plug into the proper hole in the outlet.


hmm very interesting indeed, I might give the updated classy voltage tool another go later and see if it yields anything on the core..... not really *properly* tried it yet...

I can keep the core temp at 20c max even at high power/volts


----------



## Vellinious

The offset is +560, so 5556 or something right there. Temps in the room only dropped to 3.8c, so I was a little bit limited. It was supposed to get down to -3c last night, but it never quite got there.

There's the big disadvantage, Nick.....I'm at the mercy of mother nature. lol

Almost had 2278 in SLI. It ran through all the way to the 2nd space station in graphics test 2 and crashed. I think a few more degrees c under where I was at, may have got it done. 2265 was running though. I did lose about 1 FPS there, under what it was running at 2252.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> The offset is +560, so 5556 or something right there. Temps in the room only dropped to 3.8c, so I was a little bit limited. It was supposed to get down to -3c last night, but it never quite got there.
> 
> There's the big disadvantage, Nick.....I'm at the mercy of mother nature. lol
> 
> Almost had 2278 in SLI. It ran through all the way to the 2nd space station in graphics test 2 and crashed. I think a few more degrees c under where I was at, may have got it done. 2265 was running though. I did lose about 1 FPS there, under what it was running at 2252.


Do you have the PC outside, in a garage or something, while you sit comfortably indoors, controlling it with a USB keyboard/mouse and a very long HDMI cable to the monitor?

Sounds like a good idea lol... but imagine if you had a water chiller outside too... your dew point (to avoid condensation) would probably be about -35c in -20c ambient temps.

Top your coolant up with XT-1 at a 30% mixture and away you go.  No need to run any anti-freeze or anything that would cause corrosion. Mayhems make XT-1 specifically for chillers, with biocide, anti-corrosion inhibitors and frost protection included.

I've got all my orders for insulation and XT-1 under way. Looking forward to getting it all set up this week.

Already finished modifying the electrics on my Chiller to allow it to run regardless of thermostat. Was actually really easy.

I've already flooded TechPowerUp with it all, so I'll just wait until it's all set up and I've got proper temps/results to show before I post any pictures up here.

But I do definitely intend to aim for 2300MHz by Thursday when it's all up and running. (Hoping I get away *without* adding another pump due to the increased thickness of the coolant.)

I wanted to get a flow meter but they're too expensive or have stupid added extras I really don't want (or you had to already know your flow and pick from a selection of three, each only able to read a certain range).


----------



## Vellinious

I have a "sun room" off the back of the house. It's got near 20ft ceilings, with a ventilation fan in the top to draw hot air out in the summer. Works great in winter too. Open up the windows in the bottom and let the fan pull the cold air in. Anyway....I roll the PC out there, I've run a monitor cable through the wall, and use a wireless keyboard and mouse. Just a quick little setup I did near the door.


----------



## smicha

Done. Thank you for watching


----------



## 6u4rdi4n

Awesome work







What do you think of the Core X9? Build quality etc. Been thinking about getting one since it's got lots of space, looks decent and doesn't cost as much as the parts themselves for a decent PC. That the motherboard mounts horizontally also attracts me


----------



## smicha

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Awesome work
> 
> 
> 
> 
> 
> 
> 
> What do you think of the Core X9? Build quality etc. Been thinking about getting one since it's got lots of space, looks decent and doesn't cost as much as the parts themselves for a decent PC. That the motherboard mounts horizontally also attracts me


Quality is OK for the price, but functionality is great. The 5.25 mounts are cheap, and the server mobo screw holes are not drilled properly. For the money this is the best case for such systems.


----------



## XCalinX

Just a quick question, how do I update my entry in the club? I have 2 now, had one when I submitted my form. Also one is from MSI, the other one from Zotac...


----------



## 6u4rdi4n

Quote:


> Originally Posted by *smicha*
> 
> Quality is OK for the price, but functionality is great. The 5.25 mounts are cheap, and the server mobo screw holes are not drilled properly. For the money this is the best case for such systems.


I wasn't thinking about going as extreme as you. More of a normal system, but I want room for more/larger radiators.

Currently have a HAF X, but I'm looking at the Thermaltake Core X9 and X71 (Tempered glass edition) because they are actually affordable and have room for more/larger radiators.


----------



## Coopiklaani

The best stability test, as far as I know, is Firestrike Ultra Scene 2. I could OC to 2300MHz and complete all the other tests, but not an FSU S2 loop. I wonder what's the max core clock people get that is stable enough to pass a 1-hour FSU S2 loop.


----------



## Vellinious

Quote:


> Originally Posted by *Coopiklaani*
> 
> The best stability test, as far as I know, is Firestrike Ultra Scene 2. I could OC to 2300MHz and complete all the other tests, but not an FSU S2 loop. I wonder what's the max core clock people get that is stable enough to pass a 1-hour FSU S2 loop.


Must run it pretty cold to get those kinds of clocks.

Mine will run 2202 all day long @ 1.075v.


----------



## juniordnz

The best stability test I've found so far is Battlefield 1 lol

Overclocks that are stable in Firestrike show artifacts within seconds in BF1, even with low GPU utilization (50-60%) and low graphics settings on a 1080p screen.

Wonder if that's really an unstable OC or just a poorly designed game...


----------



## GRABibus

I just bought a Gigabyte GTX 1080 Xtreme Gaming Waterforce.

I played with Precision X OC and the voltage seems limited.

I mean, even though the V/F curve in Precision X (OC Scanner) shows a maximum voltage of 1.162V, I can only reach 1.09V maximum (with voltage pushed to +100% in Precision X).

Is this normal?
Why don't I get 1.162V when putting +100% on voltage in Precision X?
Is it an NVIDIA limitation for safety on the GTX 1080?

I would also like to know if there are modded BIOSes to increase the voltage.

Thank you!


----------



## 6u4rdi4n

Because 1.093V is maximum. By setting the voltage to 100%, you are telling the card that it's allowed to use up to its maximum voltage according to bios, which is 1.093V stock.
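As a toy illustration of that explanation: the slider can be thought of as linearly unlocking headroom between a default ceiling and the BIOS maximum. This is purely a sketch; the placeholder default ceiling below is a made-up value, and real Pascal cards step through discrete voltage bins rather than a continuous range.

```python
# Toy model of the overvoltage slider described above. Purely illustrative:
# DEFAULT_MAX_V is a hypothetical placeholder, and real Pascal cards step
# through discrete voltage bins rather than a continuous range.

DEFAULT_MAX_V = 1.062  # hypothetical ceiling with the slider at 0%
BIOS_MAX_V = 1.093     # stock BIOS maximum discussed in this thread

def allowed_voltage(slider_pct):
    """Highest voltage the card may request at a given slider setting (0-100%)."""
    return DEFAULT_MAX_V + (slider_pct / 100) * (BIOS_MAX_V - DEFAULT_MAX_V)

print(round(allowed_voltage(100), 3))  # 1.093 -- the "+100%" case never exceeds the BIOS max
```

The point is that the percentage slider only scales within what the BIOS allows; to go past 1.093V you need a modded BIOS, as discussed below in the thread.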


----------



## GRABibus

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Because 1.093V is maximum. By setting the voltage to 100%, you are telling the card that it's allowed to use up to its maximum voltage according to bios, which is 1.093V stock.


Thanks.

Are there any Bios to unlock voltage ?

I didn't see this on OCN.


----------



## hertz9753

The GTX 1080 doesn't like higher voltage. What are you guys running that a GTX 1080 is not good enough for? Is it benchmarking or a 4K monitor?

I still have a GTX 980 Ti and 1080P monitor though.


----------



## Spiriva

Quote:


> Originally Posted by *GRABibus*
> 
> Thanks.
> 
> Are there any Bios to unlock voltage ?
> 
> I didn't see this on OCN.


Quote:


> Originally Posted by *nrpeyton*
> 
> strix1080xoc_t4version2withhigherfirestrikescore.zip 148k .zip file
> 
> 
> -Removed Power Limit
> -Voltage up to 1.2v (over stock 1.093v)
> *-No Temperature limit (^warning^)*
> -May cause a 'reduced fan speed' on FE cards. FE cards should only use this when under water.
> -Won't work with monitors plugged into the 3rd display port
> 
> *Be careful* (monitor your temps when using this BIOS)
> 
> -Remember ALWAYS *disable display driver in "device manager" first BEFORE flashing*.
> -*Always backup your original BIOS* first using GPU-Z (and keep a backup of your original in safe place)


To up the volt you need to do it via MSI Afterburner "graph", in AB press ctrl F to bring it up.
I use this bios on both my cards (under water) @ 2250mhz, 1.150v (Evga FE 1080´s)


----------



## GRABibus

Quote:


> Originally Posted by *Spiriva*
> 
> To up the volt you need to do it via MSI Afterburner "graph", in AB press ctrl F to bring it up.
> I use this bios on both my cards (under water) @ 2250mhz, 1.150v (Evga FE 1080´s)


Thanks.

Is there any tutorial ?
Does it work for all GTX 1080 brands?
Which utility for flashing ?

I formerly used nvflash in DOS command for my former GTX TITAN X with command "nvflash -6 Bios.rom".

I imagine this is completely different here.


----------



## GRABibus

Quote:


> Originally Posted by *hertz9753*
> 
> What are you guys running that a GTX 1080 is not good enough for?


not enough for OCN and "pursuit of performance" concept...
We are on overclock.net and not atstock.net


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Spiriva*
> 
> To up the volt you need to do it via MSI Afterburner "graph", in AB press ctrl F to bring it up.
> I use this bios on both my cards (under water) @ 2250mhz, 1.150v (Evga FE 1080´s)


I thought the t4 bios on FE cards, or a single 8-pin, was a no-no?


----------



## Vellinious

Quote:


> Originally Posted by *Spiriva*
> 
> To up the volt you need to do it via MSI Afterburner "graph", in AB press ctrl F to bring it up.
> I use this bios on both my cards (under water) @ 2250mhz, 1.150v (Evga FE 1080´s)


What kind of graphics scores are you seeing in Timespy with those clocks? I've run mine in SLI @ 2252, and hit 16.7k graphics score, but I did it with cold water, super low ambients, and 1.093v. I just wonder if 2250 with higher volts and higher temps ends up getting the same kind of performance.

http://www.3dmark.com/spy/1062530


----------



## Spiriva

Quote:


> Originally Posted by *GRABibus*
> 
> Thanks.
> 
> Is there any tutorial ?
> Does it work for all GTX 1080 brands?
> Which utility for flashing ?
> 
> I formerly used nvflash in DOS command for my former GTX TITAN X with command "nvflash -6 Bios.rom".
> 
> I imagine this is completely different here.


nvflash --index=0 --save 1080org.rom
nvflash --index=0 --protectoff
nvflash --index=0 -6 strix1080xoc_t4.rom

--index=0 is because i have two cards, next card would be --index=1 and do the same thing again.

Pretty much exactly the same way as it was with the Titan X.
Quote:


> Originally Posted by *6u4rdi4n*
> 
> I thought t4 bios on FE cards, or single 8 pin was a no-no?


I've been using the t4 bios on my FE cards since it first came out, no problems so far.
Quote:


> Originally Posted by *Vellinious*
> 
> What kind of graphics scores are you seeing in Timespy with those clocks? I've run mine in SLI @ 2252, and hit 16.7k graphics score, but I did it with cold water super low ambients and 1.093v. I just wonder if 2250 with higher volts and higher temps ends up getting the same kind of performance.
> 
> http://www.3dmark.com/spy/1062530


I'm currently not at home so I can't check right now, but in Fire Strike I think it was around 48,000 graphics score; in Time Spy I can't recall at all.


----------



## 6u4rdi4n

Good to hear! Maybe I gotta test it on my card now that I got it on water. (EVGA GTX 1080 SC ACX 3.0)


----------



## Vellinious

Quote:


> Originally Posted by *Spiriva*
> 
> nvflash --index=0 --save 1080org.rom
> nvflash --index=0 --protectoff
> nvflash --index=0 -6 strix1080xoc_t4.rom
> 
> --index=0 is because i have two cards, next card would be --index=1 and do the same thing again.
> 
> Pretty much exact the same way as it was with the Titan X.
> I been using the t4 bios on my FE cards since it first came out, no problems so far.
> Im currently not at home so i cant check right now, but in fire strike i think it was around 48.000 graphics score, in time spy i cant recall at all.


I can't run above 2202 @ 1.081v on the GPU in Firestrike with the memory offset at +495, but I'm above 48k in every run at those clocks, as long as the GPU temp stays below 30c. Any higher and the scores drop off. Highest thus far is 48.3k, with about 48.2k the average.

http://www.3dmark.com/fs/11349771


----------



## Buzzard1

Quote:


> Originally Posted by *hertz9753*
> 
> The GTX 1080 doesn't like higher voltage. What are you guys running that a GTX 1080 is not good enough for? Is it benchmarking or a 4K monitor?
> 
> I still have a GTX 980 Ti and 1080P monitor though.


I've been playing Watch Dogs 2 and Witcher 3 in 4K and one 1080 is not enough lol. It gets the job done, but I didn't buy one of the fastest cards to just run games OK. I'll be getting another 1080 soon.


----------



## hertz9753

Quote:


> Originally Posted by *Buzzard1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hertz9753*
> 
> The GTX 1080 doesn't like higher voltage. What are you guys running that a GTX 1080 is not good enough for? Is it benchmarking or a 4K monitor?
> 
> I still have a GTX 980 Ti and 1080P monitor though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been playing Watch Dogs 2 and Witcher 3 in 4K and one 1080 is not enough lol. It gets the job done, but I didn't buy one of the fastest cards to just run games OK. I'll be getting another 1080 soon.

That's why I questioned whether somebody was trying to use one GTX 1080 to run a 4K monitor.


----------



## Vellinious

A single 1080 is enough to run 4K... if you drop some settings. Even the Titan XP I had wasn't enough to make me happy on my 4K monitor... as of now, there is still no single-card solution for ultra/high settings in every game at 4K.


----------



## nrpeyton

I get 35 FPS on 4k at Ultra (the Witcher 3) which is good enough for single-player

remember the eye can't tell the difference at anything above 25

And that's also paired with an old AMD FX-8350 chip, by the way

Anyway my journey into sub-zero begins tomorrow/thursday...

_Hope all my deliveries come on time_ (included in that is about £60 worth of insulation goodies) and £50 worth of coolant.

Hoping to break the 2300MHz barrier on the 1080


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I get 35 FPS on 4k at Ultra (the Witcher 3) which is good enough for single-player
> 
> remember the eye can't tell the difference at anything above 25


No....just no. A trained eye can pick out distinct objects in a single frame at 255 FPS. Most people can do so up to about 200. So don't bring that console stupidity in here.


----------



## juniordnz

I'm with vellinous on that one.

Best thing I did was going for a 144Hz screen. I can't even play below 80fps now, and it baffles me how I was able to enjoy such low framerates before. I configure every game I own to get a minimum FPS of 120 (my screen's set refresh rate), no matter what "eye candy" I have to drop for that.

Fluidity > eye candy


----------



## jleslie246

Quote:


> Originally Posted by *nrpeyton*
> 
> I get 35 FPS on 4k at Ultra (the Witcher 3) which is good enough for single-player
> 
> remember the eye can't tell the difference at anything above 25
> 
> And that's also paired with an old AMD FX-8350 chip, by the way
> 
> Anyway my journey into sub-zero begins tomorrow/thursday...
> 
> _Hope all my deliveries come on time_ (included in that is about £60 worth of insulation goodies) and £50 worth of coolant.
> 
> Hoping to break the 2300MHz barrier on the 1080


What are you getting? Making the switch to intel?


----------



## juniordnz

Quote:


> Originally Posted by *jleslie246*
> 
> What are you getting? Making the switch to intel?


Lots of FPS. Especially on CPU-bound games like BF1, R6S, etc...


----------



## nrpeyton

I suppose being able to game at 60+ FPS at 4k Ultra would be *nice*. Very nice 

Think I've already given enough money to Nvidia, though.


----------



## juniordnz

Quote:


> Originally Posted by *nrpeyton*
> 
> Think I've already given enough money to Nvidia.


You think? I paid 1400usd for my 1080 FTW in local currency


----------



## trippinonprozac

Hey guys,

Does anyone know if sli is working in bf1 under dx12?

I am thinking about getting a second 1080 but mostly play bf1. There is so much mixed info around the web that I wanted to see what people on here thought...


----------



## nrpeyton

Anyone know if it matters "where" in your loop you place a 2nd pump?

My existing pump is combined with my reservoir in a combo.

The reason I'm asking is that the "fins" the coolant travels across in the GPU block (and the CPU block too) are going to be the biggest challenge for a *thicker* coolant.

A thicker coolant is required for anti-freeze protection on my quest for a 2.3GHz 1080.

So I'm trying to figure out whether I'll need a 2nd pump, which one to get, and where to place it...

Do I place it halfway around the loop _to get the coolant flowing again_, or does the pressure 'equalise' so it doesn't matter?


----------



## Vellinious

Shouldn't matter where it's placed; it's still just a 2nd pump in series.


----------



## trippinonprozac

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone know if it matters "where" in your loop; you place a 2nd pump?
> 
> My existing pump is combined with my reservoir in a combo.
> 
> Reason I'm asking, is the "fins" the coolant travels across in the GPU block (and CPU block too) are going to be the biggest challenge for a *thicker* coolant.
> 
> Thicker coolant is required for anti-freeze protection on my quest for 2.3GHZ 1080.
> 
> So trying to figure out whether I'll need a 2nd pump and what one to get... and where to place it....
> 
> Do I place it half way around the loop _to get the coolant flowing again_ or does the pressure 'equalise', so it doesn't matter?


Just keep filling in mind, as if it's much further through the loop it can be a pain. You won't be able to prime your other pump until coolant has been distributed. It also means it will rely on flow and not gravity to fill.


----------



## nrpeyton

okay thanks, it would be interesting to see how much faster my flow is with a 2nd, 3rd or 4th pump and how that would affect the GPU temperature

You'd assume that the faster the water travels over the GPU, the faster it pulls heat away (making your GPU match the water temp more closely)

what's your pump rated at, Vellinious? and are you going through the chipset and motherboard VRM too?


----------



## nrpeyton

Quote:


> Originally Posted by *trippinonprozac*
> 
> Just keep filling in mind as if its much further through the loop it can be a pain. You wont be able to prime your other pump until coolant has been distributed. Also means it will rely on flow and not gravity to fill it.


Right; so the best setup would be Pump 1 ---> Pump 2 ---> loop

got it, thanks.

edit: and that won't affect performance? wouldn't one pump just restrict the other if they are so close?


----------



## trippinonprozac

Quote:


> Originally Posted by *nrpeyton*
> 
> Right; so the best setup would be Pump 1 ---> Pump 2---> loop
> 
> got it, thanks.
> 
> edit: and that won't affect performance? wouldn't 1 pump just restrict the other--- if they are so close:?


Make sure the pumps are the same, with the same heads. If pump 2 flows more than pump 1 you can starve pump 1.

Best to match them, but if you need to have one pump with a higher flow rate, I believe (chime in if I am wrong) you want that as the first pump in the series.


----------



## nrpeyton

I think if I do it, I'll get the same rating (1500 L/hr & 3.9 m head)

Then just connect them in series with as short a bit of tubing as possible?

Still seems a bit of an 'awkward' way to do it... wonder if it will actually work as expected...

Anyway thanks


----------



## trippinonprozac

yeah should work fine. Also gives you redundancy if one pump fails.


----------



## nrpeyton

Quote:


> Originally Posted by *trippinonprozac*
> 
> yeah should work fine. Also gives you redundancy if one pump fails.


I really need to invest in a flow meter to test if it actually improves anything.

It'd be a shame to spend all that money on a new pump AND flow meter and then find out it doesn't do very much, though :-(

Only one way to find out, I suppose.

Someone said to me on here a few months ago, _"welcome to always being broke"_ (when I first put my 1080 under water)... Now I understand what he meant, lol..

Anyway good night, I should have been away to sleep ages ago....

Subzero tomorrow, I'll let you'z know how it goes with the 2300


----------



## Dragonsyph

Why do I get higher graphics scores than you guys when you are running 2200+ MHz vs my 2164? I wish I got a card that did 2250.


----------



## Dragonsyph

Quote:


> Originally Posted by *trippinonprozac*
> 
> Hey guys,
> 
> Does anyone know if sli is working in bf1 under dx12?
> 
> I am thinking about getting a second 1080 but mostly play bf1. There is so much mixed info around the web that I wanted to see what people on here thought...


Two 1080s just for BF1? Are you doing 5K or 8K resolution? At 4K I never drop below 60 fps in BF1 at max settings with a single 1080.


----------



## Buzzard1

Quote:


> Originally Posted by *nrpeyton*
> 
> I suppose being able to game at 60+ FPS at 4k Ultra would be *nice*. Very nice
> 
> Think I've already given enough money to Nvidia, though.


If games are pushing cards to their limits now, imagine how much they will push them by the end of this year. I'm either gonna trade my 1080 in for a 1080 Ti via EVGA, or just buy another 1080. I'm pretty sure 1080s will drop in price a little once the 1080 Ti comes out, especially if you buy the second card used.

Personally I am getting 45-60 fps ultra maxed out (4K) on Witcher 3. But Watch Dogs 2, even @1080p, will bring your PC to its knees.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Why do i get higher graphics scores then you guys when you are running 2200+ mhz vs my 2164? I wish i got a card that did 2250.


Could be memory clocks, could be the drivers you're using, NVIDIA control panel settings, NVIDIA inspector settings.....could be a lot of things causing your scores to be that much higher. You're hitting what, 26k graphics in Firestrike? If I had to guess, I'd say LOD tweaks, maybe? Hard to tell.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> okay thanks, would be interesting to see how much faster my flow is with a 2nd, 3rd or 4th pump and how that would affect temperature of GPU
> 
> You'd assume that the faster the water travels over the GPU, the faster it pulls heat away (making your GPU match the water temp more closely)
> 
> whats your pump rated at vellinious? and are you going through the chipset and motherboard VRM too?


After a certain point, higher flow rates can actually cause poor performance in the loop. This is why I run the pumps in serial: it keeps the flow rate consistent with what 1 pump does, but adds head pressure to push through the restrictions better. Parallel flow would allow for higher flow rates, but wouldn't do anything for head pressure.

Each one of my loops goes through a MO-RA3 420. The GPU loop has 2 blocks, and the CPU loop has 1. I don't cool memory or the motherboard. Pretty much useless....it's just for show....I made these loops to GO. lol
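For anyone curious, the series-vs-parallel point can be sketched with a toy pump curve (all numbers here are invented for illustration, not real pump specs):

```python
# Toy illustration of the series-vs-parallel pump point above.
# Assume a simplified linear pump curve: head = H_MAX - K * flow.
# Two identical pumps in series double the head at a given flow;
# in parallel they double the flow at a given head.

H_MAX = 4.0   # meters of head for one pump at zero flow (made up)
K = 2.0       # head lost per unit of flow (made up)

def head_single(flow):
    """Head one pump delivers at a given flow rate."""
    return H_MAX - K * flow

def head_series(flow):
    """Two pumps in series: heads add at the same flow rate."""
    return 2 * head_single(flow)

def flow_parallel_at(head):
    """Two pumps in parallel: flows add at the same head."""
    single_flow = (H_MAX - head) / K
    return 2 * single_flow

print(head_single(1.0))      # 2.0
print(head_series(1.0))      # 4.0 (twice the head at the same flow)
print(flow_parallel_at(2.0))  # 2.0 (twice the flow at the same head)
```

Real pump and loop curves aren't linear, but the qualitative point holds: series stacks head pressure for a restrictive loop, parallel stacks flow.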


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> Could be memory clocks, could be the drivers you're using, NVIDIA control panel settings, NVIDIA inspector settings.....could be a lot of things causing your scores to be that much higher. You're hitting what, 26k graphics in Firestrike? If I had to guess, I'd say LOD tweaks, maybe? Hard to tell.


On driver 376.33, all settings default, don't have Inspector. Hitting 26,405. Might be the memory OC, I guess. I'd probably rather have higher core than memory, but it seems memory is producing the higher scores.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> On driver 376.33 all settings default, dont have inspector. Hitting 26,405. Might be the memory oc i guess. I'd probably rather have higher core then memory but seems memory is producing higher scores.


For some....some people are getting +900 and +1000 on the memory. I've never seen it in person, so I'm still skeptical that FS and TS aren't just bugging out at those clocks, but....eh, whatever.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> For some....some people are getting +900 and +1000 on the memory. I've never seen it in person, so I'm still skeptical that FS and TS aren't just bugging out at those clocks, but....eh, whatever.


You're getting almost 2300MHz but you can't believe a good memory OC?


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Your getting almost 2300mhz but you can't believe a good memory OC?


I've seen a lot of things in my time overclocking....+1000 on ANY memory isn't one of them.....and especially on air cooling. Makes me giggle.

I'll say this: I've owned 5 1080s and a Titan XP. None of them would do over a +575 offset on the memory without causing dropped textures in Time Spy. Would it run and complete? Yes. It'd even validate online. Doesn't mean I'd keep them, though.


----------



## ucode

FWIW @Dragonsyph I believe you







or at the very least I don't have any reason to not believe.


----------



## TWiST2k

Quote:


> Originally Posted by *juniordnz*
> 
> I'm with vellinous on that one.
> 
> Best thing I did was going for a 144hz screen. I can't even play below 80fps and it baffles me how I was able to enjoy such low framerates like that before. I configure every game I own to get a minimum FPS of 120 (my screen set refresh rate), no matter what "eye candy" I have to drop for that.
> 
> Fluidity > eye candy


All the way dude, I have the Asus PG279Q and it is the most amazing thing ever; even just running Windows at 144Hz is SOO noticeable and smooth. Sorry to disagree with anyone, but that "the eye only sees XX" stuff is total bull lol.


----------



## Vellinious

I did a little playing. Driver version 368.81. I did 3 runs at each setting, and kept the best 2. I'll try the good FS driver tomorrow (375.7), and then the newest driver at some point in the next few days. I don't imagine I'm going to see much of anything different though.

With the GPU at 2151 @ 1.050v, I used these memory offsets (best two of three runs at each):

Offset - Run 1 | Run 2
495 - 117.80 | 118.01
525 - 118.85 | 119.01
555 - 120.12 | 120.67
655 - 115.68 | 114.09
755 - 115.75 | 115.06
855 - 118.89 | 119.52
955 - Froze | Froze

It's not magic. Something's buggy.


----------



## hertz9753

http://nvidia.custhelp.com/app/answers/detail/a_id/4288

Will you try those?


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> I did a little playing. Driver version 368.81. I did 3 runs at each setting, and kept the best 2. I'll try the good FS driver tomorrow (375.7), and then the newest driver at some point in the next few days. I don't imagine I'm going to see much of anything different though.
> 
> With the GPU at 2151 @ 1.050v, I used these memory offsets
> 
> 495 - 117.80 | 118.01
> 525 - 118.85 | 119.01
> 555 - 120.12 | 120.67
> 655 - 115.68 | 114.09
> 755 - 115.75 | 115.06
> 855 - 118.89 | 119.52
> 955 - Froze | Froze
> 
> It's not magic. Something's buggy.


Memory:

600 - 122.59
625 - 124.96
650 - 122.97
700 - 126.57
750 - 124.3
800 - 126.84
850 - 125.02
900 - 127.58
950 - 124.78
1000 - 128.59


----------



## emperium85

Does anybody know where I can find the bios of this card:

*EVGA GeForce GTX 1080 HYBRID GAMING*
http://www.evga.com/products/product.aspx?pn=08G-P4-6188-KR

I have a GTX 1080 FE and installed the Hybrid cooler; I want to flash it with the EVGA BIOS.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Memory.
> 600 122.59
> 625 124.96
> 650 122.97
> 700 126.57
> 750 124.3
> 800 126.84
> 850 125.02
> 900 127.58
> 950 124.78
> 1000 128.59


Yeah....magic.


----------



## ucode

Quote:


> Originally Posted by *Vellinious*
> 
> I've seen a lot of things in my time overclocking....+1000 on ANY memory isn't one of them.....and especially on air cooling.


Well, what is +1000 anyway? Is it 1000MHz? No. Is it 1000MT/s? No. So what is it?

GDDR5X works as QDR at full performance, so that +1000 is 500MHz of memory clock, or 2000MT/s; taken as MT/s, that's an increase from the default 10000MT/s to 12000MT/s, or 20%.

As for the dips, I already posted a graph previously that shows it happens on Pascal GDDR5 as well, although GDDR5X has its own extra quirk.



Here's an air-cooled GTX 1050 Ti result that uses memory with a default transfer rate of 7000MT/s, overclocked without artifacts to 9200MT/s. That's an increase of over 2000MT/s, or over 31%.
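If anyone wants to sanity-check those numbers, here's a quick sketch. The offset-to-MT/s conversion is just this post's reading of the Afterburner offset for GDDR5X (offset counts double-data-rate MHz, QDR gives 4 transfers per memory clock), not anything official:

```python
# Sanity check of the numbers in the post above.
# Assumption (per the reading above): an Afterburner "+N" memory offset
# on GDDR5X adds N/2 MHz of real memory clock, and QDR means 4 transfers
# per clock, so a +N offset adds 2*N MT/s of effective rate.

def effective_rate(default_mts, offset):
    """Return (new MT/s, percent increase) for a given AB offset."""
    new_mts = default_mts + 2 * offset
    return new_mts, 100 * (new_mts - default_mts) / default_mts

# GTX 1080 GDDR5X: 10000 MT/s stock, +1000 offset
print(effective_rate(10000, 1000))   # (12000, 20.0)

# GTX 1050 Ti GDDR5: 7000 MT/s stock overclocked to 9200 MT/s
print(round(100 * (9200 - 7000) / 7000, 1))  # 31.4
```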


----------



## Vellinious

Quote:


> Originally Posted by *ucode*
> 
> Well what is 1000 anyway. Is it 1000MHz, no. Is it 1000MT/s, no. So what is it?
> 
> GDDR5X works as QDR at full performance so that 1000 is 500MHz memory clock or 2000MT/s which if we take MT/s is an increase from the default 10000MT/s to 12000MT/s or 20%.
> 
> As for the dips I already posted a graph previously that also shows it happens on Pascal GDDR5 as well although GDDR5X has it's special extra quirk as well.
> 
> 
> 
> Here's an air cooled GTX 1050Ti result that uses memory with a default transfer rate of 7000MT/s overclocked without artifacts to 9200MT/s. That's an increase of over 2000MT/s or over 31%.


But to see it scale up and down every +50 offset? 124 and then 127 and then back to 124 and then up to 128. Scaling just doesn't work like that.

Sure, I can set +855 on my memory on a single card, but in SLI? Not happening. Possibly because it's so unstable and buggy that SLI just exposes the weakness enough to crash.

I can make mine bug out too. Just set a super high memory clock and a super high core clock, and...bam, bugged run. No settings tweaked, just straight graphics tests 1 and 2, just to see what the graphics score ends up being. I'm 99.9% sure that had I run the physics and combined tests with it, it would have validated, because I got no warnings other than the fact that I ran only the graphics tests. But I wouldn't keep this run on a bet....because I know it was buggy.

Futuremark has done an extraordinarily poor job as of late keeping bugged runs / exploits out of the "valid run" category. The bottom picture shows this clear as day. That's a picture of the Hall of Fame, btw.


----------



## Vellinious

Quote:


> Originally Posted by *emperium85*
> 
> Does anybody know where I can find the bios of this card:
> 
> *EVGA GeForce GTX 1080 HYBRID GAMING*
> http://www.evga.com/products/product.aspx?pn=08G-P4-6188-KR
> 
> I have an GTX1080 FE and installed the Hybrid koeler, I want to flash it to an EVGA card


The power limits on the two cards you're talking about are exactly the same: 180 watts. There won't be any difference.


----------



## GRABibus

Quote:


> Originally Posted by *Spiriva*
> 
> nvflash --index=0 --save 1080org.rom
> nvflash --index=0 --protectoff
> nvflash --index=0 -6 strix1080xoc_t4.rom
> 
> --index=0 is because i have two cards, next card would be --index=1 and do the same thing again.
> 
> Pretty much exact the same way as it was with the Titan X.
> I been using the t4 bios on my FE cards since it first came out, no problems so far.
> Im currently not at home so i cant check right now, but in fire strike i think it was around 48.000 graphics score, in time spy i cant recall at all.


Quote:


> Originally Posted by *Vellinious*
> 
> What kind of graphics scores are you seeing in Timespy with those clocks? I've run mine in SLI @ 2252, and hit 16.7k graphics score, but I did it with cold water super low ambients and 1.093v. I just wonder if 2250 with higher volts and higher temps ends up getting the same kind of performance.
> 
> http://www.3dmark.com/spy/1062530


I've been using my GTX 1080 for two days and am trying to get familiar with Boost 3.0.

Questions:
- How can we force the voltage to go to 1.09V (max value)?
In all my games and benchmarks I get 1.06 or 1.07 maximum.
If I set the power limit to 150% it doesn't change anything: neither the voltage nor the power, which sits around 75% in my games and benchmarks. Voltage adjustment is set to +100%.
Is it due to a locked power limit?

How can you guys get 1.1V or 1.15V, even with a BIOS mod?

Does the BIOS mentioned by Spiriva here fix the voltage at 1.2V, or does it unlock the power limit?


----------



## Spiriva

Quote:


> Originally Posted by *GRABibus*
> 
> How guys can you get 1.1V or 1.15V even with a Bios mod ?
> 
> Does the Bios mentionned by Spriva here fix the voltage to 1.2V or does the Bios unlock a limit of the power ?


Yes, the T4 BIOS will allow you to set it to 1.2v; you need to use the MSI Afterburner graph mode though, like someone shows in this video:


----------



## GRABibus

Quote:


> Originally Posted by *Spiriva*
> 
> To up the volt you need to do it via MSI Afterburner "graph", in AB press ctrl F to bring it up.
> I use this bios on both my cards (under water) @ 2250mhz, 1.150v (Evga FE 1080´s)


Quote:


> Originally Posted by *Spiriva*
> 
> Yes, the t4 bios will allow you to set it to 1.2v, you need to use the msi afterburner graph mode tho, like someone shows in this video:


You mean: fixed voltage at 1.2V under graphics loads (games, benchmarks, etc.)?
So this BIOS disables the Boost 3.0 process?

The V/F curve in AB is only used to apply frequency offsets at given voltages.
But how do you set the voltages?


----------



## Spiriva

Quote:


> Originally Posted by *GRABibus*
> 
> You mean : fixed voltage at 1.2V under graphic loads (Games, benchmarks...etc...) ?
> This bios disables Boost 3.0 process so ?
> 
> The use of the V/F curve of AB is only used to put frequiency offsets on voltages.
> But how do you set voltages ?


Just pull the graph from 1.2v (or whatever volt you wanna go for up to 1.2v) to whatever mhz you wanna go for, and it will use that volt.










If you do it like this it will give you ~2200mhz @ 1.100v


----------



## Vellinious

Are the fujipoly pads still the best? I've kinda looked around, and saw that Thermal Grizzly has some, but they're a lower rating than the fujipoly pads.

Opinions?


----------



## GRABibus

Quote:


> Originally Posted by *Spiriva*
> 
> Just pull the graph from 1.2v (or whatever volt you wanna go for up to 1.2v) to whatever mhz you wanna go for, and it will use that volt.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you do it like this it will give you ~2200mhz @ 1.100v


Can you show me how you set 1.2V manually on the GPU?
How can you set the voltage?

I mean, you don't have control over the voltage, only the frequency.
This is what I don't get


----------



## Vellinious

Quote:


> Originally Posted by *GRABibus*
> 
> Can you show me how you set 1.2V manually on GPU ?
> How can you set voltage ?
> 
> I mean, you don't have contrrol on voltage, only on frequency.
> This is what I don't get


Voltage is at the bottom. The way he has that curve set, it's going to run 2200 at 1.1v.


----------



## GRABibus

Thanks to all.
I did what you advised and was able to set a curve from 1.09V up to 1.2V at 2200MHz.
I didn't flash; I'm still on the stock Gigabyte BIOS.

Whatever game or benchmark I launch now, the voltage is always 1.09V and I get 2200MHz.

I am stable in benchmarks (Time spy).

Let's see in games now


----------



## Vellinious

Quote:


> Originally Posted by *GRABibus*
> 
> Thanks to all.
> I did what you adviced and could set a curve from 1.09V until 1.2V with 2200MHz
> I didn't flash, I am still with stock Bios Gigabyte.
> 
> Whatever I launch as game or benchmark now, Voltage is always = 1.09V and then I have 2200MHz.
> 
> I am stable in benchmarks (Time spy).
> 
> Let's see in games now


No sense in setting 1.2v without the T4 bios. It's only gonna run 1.093v. Just set your clock to run at that voltage, and tweak the other points to make sure you're running well. They're pretty sensitive to those settings, and especially sensitive to heat. The hotter they get, the worse they'll run.


----------



## nrpeyton

*GPU Insulation for sub-zero coolant*

Thermal pads will be on memory and VRM


And
*With block on*
(Wrapped in a bag, with 'kneadable erasers' used to fill in the gaps (holes) I cut for connectors etc.)

The bag keeps fresh air and fresh humidity out (so my insulation on the card only has to deal with any moisture already inside the bag).


The block doesn't cover the memory VRM (notice my blue heatsinks).

Not sure whether to cut the bag back to where the block ends, leaving the memory VRM out of the bag? (Then filling it in with putty/erasers to try & seal, as I did with the PCI-E connector.)

*Anyone?*

P.S. EK & EVGA both say the memory VRM doesn't produce a lot of heat, so that's why the block doesn't cover it. But I've still seen the heatsinks I placed on it reach 45-50c.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> *GPU Insulation for sub-zero coolant*
> 
> Thermal pads will be on memory and VRM
> 
> 
> And
> *With block on*
> (Wrapped in bag and 'kneadable erasers' used to fill in gaps (holes) i cut for connectors etc).
> 
> Bag keeps fresh air and fresh humidity out (so my insulation on the card only has to deal with any moisture (air) already inside the bag.
> 
> 
> Block doesnt cover memory VRM (notice my blue heatsinks).
> 
> Not sure whether to cut bag back, up to where the block ends, leaving the memory VRM out the bag or not? (Then filling it in with putty/erasers to try & seal as i done with PCI-E connector).
> 
> *Anyone?*
> 
> P.S EK & EVGA both say memory VRM doesn't produce a lot of heat so thats why block doesnt cover it. But I've still seem those heatsinks i placed on it reach 45-50c


I dunno if I'd use a plastic bag around the GPU.....do guys running sub-ambient do that? Seems like that would be asking for trouble, rather than helping to avoid it......


----------



## looniam

oh, this is gonna be good. @Jpmboy ya gotta watch this!


----------



## sirleeofroy

Quote:


> Originally Posted by *Vellinious*
> 
> Are the fujipoly pads still the best? I've kinda looked around, and saw that Thermal Grizzly has some, but they're a lower rating than the fujipoly pads.
> 
> Opinions?


They're the best I've used, I used them on my recently sold Alienware 15 R2 Laptop.

I had 1mm Fujipoly 14W/mK pads replace the rubbish that Dell put in there, as well as some on the VRMs, and they enabled a higher stable overclock with better temps (i7 [email protected] 72c max under 100% CPU stress).

If I ever need thermal pads, Fujipoly is my first port of call.

They do 17W/mK pads, if you can lay your hands on them......


----------



## Vellinious

Quote:


> Originally Posted by *sirleeofroy*
> 
> They're the best I've used, I used them on my recently sold Alienware 15 R2 Laptop.
> 
> I had 1mm Fujipoly 14w/mk pads replace the rubbish that Dell put in there as well as some on the VRM's and they enabled a higher stable overclock with better temps (i7 [email protected] 72c max under 100% CPU stress).
> 
> If I ever need thermal pads, Fujipoly is my first port of call.
> 
> They do 17w/mk pads, if you can lay your hands on them......


Yeah, those are the ones I ordered. Found them on PPCS and Amazon.


----------



## juniordnz

Those Fuji are great. I used 11W/mK on the heatplate and 6W/mK on the backplate, and now both get so hot under extreme stress I could cook an egg on them.


----------



## OZrevhead

Guys, does the T4 BIOS work with cards other than the Asus Strix? Is there much gain at 1.20v? The guy getting 2100 at 1.0v is doing well.


----------



## Jpmboy

Quote:


> Originally Posted by *looniam*
> 
> oh, this is gonna be good. @Jpmboy ya gotta watch this!


lol - grab the snacks...


----------



## nrpeyton

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - grab the snacks...


What would u recommend then?

I thought it would help keep fresh humidity out, so the insulation I did on the card only has to deal with the moisture already in the bag.

It's a full-cover block I'm insulating.

Never done this before.

The CPU was easy.

Just finished insulating the chiller's tank, laying tubing with armaflex, and insulating around the CPU socket.

Almost ready to begin filling.

But one thing I'm not confident on is the GPU insulation. It's so bloody fiddly.


----------



## EDK-TheONE

Does EVGA 1080 Classified Voltage Tool works with zotac 1070 amp extreme?


----------



## nrpeyton

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Does EVGA 1080 Classified Voltage Tool works with zotac 1070 amp extreme?


No idea; I doubt they share the same voltage controller, so probably not.


----------



## Jpmboy

Quote:


> Originally Posted by *nrpeyton*
> 
> Would would u recommend then?
> 
> I thought it would help keep fresh humidity out, so the insulation I done on the card only has to deal with the moisture already in the bag.
> 
> Its a full cover block i am insulating with.
> 
> Never done this before.
> 
> CPU was easy.
> 
> Just finished insulating the chillers tanks and laying tubing with armaflex and insulating around CPU socket.
> 
> Almost ready to begin filling.
> 
> But one thing I'm not confident on is the GPU insulation. Its so bloody fiddly.


already said what I recommend.









Lol - is that an oven bag or some high-temp plastic? You sure you want to go that direction?


----------



## looniam

i am sooo getting thrown out of here now.


----------



## nrpeyton

But most of the heat will be carried away by the full-cover block, no?

So what about the layer I did before putting the block on? Is that right?


----------



## nrpeyton

I am using my mobile phone to get on here, guys; it's not easy to navigate the site.

What did u already recommend, Jpmboy?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> No idea; i doubt they share the same voltage controller so probably, not.


I'd walk right out on the tiny limb and say, "PFFT, HAHAHAHAHA......NO"


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> But most of the heat will be carried away by the full-cover block ?
> 
> So what about the layer i done before putting block on? Is that right?


Not most. A lot of heat is still dissipated into the air around the GPU and the back of the card. I'd almost bet my left nut that it just ends up a miniature oven inside that plastic bag, in turn creating an environment where condensation forms MORE easily, not less.

Just my two copper.


----------



## OccamRazor

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - grab the snacks...


Quote:


> Originally Posted by *looniam*
> 
> 
> 
> i am sooo getting thown out of here now.










Na, no one is thrown out, my good friend!







Hope all is well with you two and your families! Nick Peyton really is one of the good guys and has done a lot for the 1080 crowd!









Cheers

Occam


----------



## OZrevhead

What volts can the EVGA 1080 Classified get to? I haven't seen any hype on this; is it as limited as the rest?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Not most. A lot of heat is still dissipated into the air around the GPU, and the back of the card. I'd almost bet my left nut, that it just ends up a miniature oven inside that plastic bag, in turn creating an environment where condensation would form MORE easily, and not less.
> 
> Just my two copper.


I see what u mean. I'll remove the bag.

It was someone's idea in another thread, and he was also talking about injecting it with dry air (oxygen).

I just hope the layer of insulation underneath is enough then.

If I was doing this with a universal block it would be simple.


----------



## OccamRazor

Quote:


> Originally Posted by *Vellinious*
> 
> Not most. A lot of heat is still dissipated into the air around the GPU, and the back of the card. I'd almost bet my left nut, that it just ends up a miniature oven inside that plastic bag, in turn creating an environment where condensation would form MORE easily, and not less.
> 
> Just my two copper.


And you're right! Heat is transmitted by conduction, convection, and in this case radiation, heating the air inside the plastic bag and worsening the condensation scenario!


----------



## nrpeyton

Quote:


> Originally Posted by *OccamRazor*
> 
> And you are right! Heat is transmited by conduction, convection and in this case radiation, heating the air inside the plastic bag, worsening the condensation scenario!


Okay.

Start again with the GPU then.

Everything else is done (including insulating the chiller's tank and all my tubing with armaflex).

And the CPU was simple too.

I've got a can of liquid electrical tape.
But I'm worried about being able to remove it.

I used waterproof duct tape between the block and GPU (except at the thermal-pad spots), then applied a tiny bit of liquid electrical tape at the edges of the rows of tape.

----------



## looniam

Quote:


> Originally Posted by *OccamRazor*
> 
> 
> 
> 
> 
> 
> 
> 
> | Na, no one is thrown out my good friend!
> 
> 
> 
> 
> 
> 
> 
> Hope all is well with you two and your families! Nick Peyton really is one of the good guys and done a lot for the 1080 crowd!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> Occam


doing well, back at you buddy.









yup, i know! been following his experiments for a while







but couldn't help being mischievous.









going back to loonyuberluckermode.

see you later, good friend


----------



## Vellinious

Quote:


> Originally Posted by *OZrevhead*
> 
> What volts can the evga 1080 classified get to? I havent seen any hype on this, is it as limited as the rest?


Adding voltage without REALLY good cooling is an absolute waste of time. It'd be more productive, and better for your overclock / benchmark scores, if you went outside and asked everyone you saw to kick you in the groin.


----------



## OZrevhead

Haha ya wingnut ...

How good is really good? Dedicated loop with a 240 rad? Or cold?


----------



## Vellinious

Quote:


> Originally Posted by *OZrevhead*
> 
> Haha ya wingnut ...
> 
> How good is really good? Dedicated loop with a 240 rad? Or cold?


Really good loop and cold...the colder the better.


----------



## Jpmboy

Quote:


> Originally Posted by *nrpeyton*
> 
> I am using my mobile phone to get on here guys its not easy to navigate the site.
> 
> What did u already recommend? Jpmboy


erm.. your own thread on this and PMs.









Honestly, now I'm waiting to see the outcome of this shrink-wrap experiment.


----------



## Menthol

We're all waiting.
I would check the sub-zero threads here and elsewhere if needed, look for some preparation articles, and maybe scale down from them.
You probably won't get frost, you're not likely to get low enough temps, but you will have moisture.


----------



## nrpeyton

Just checked PMs.
Got one or two from peeps asking me stuff that I've not had a chance to look at yet, but nothing about my own issue.

Most people aren't trying to do this on a full-cover block (from pics I've seen).
The whole block is likely to be soaking.

I'm going to have to re-think the GPU.

Called it a night yesterday and went to bed.

Spent a small fortune on coolant, in order to have enough glycol in the mix to stop it freezing.

So I want to get it right first time.

I'll be removing the bag tonight (not doing that now), taking the block off, and intensifying the insulation I've already done. Or maybe switching to liquid electrical tape.

I sprayed a bit on some metal yesterday and it peeled off easily. But it didn't peel off as 'cleanly' from plastic. Not sure about a PCB.


----------



## Wyllliam

Hi
Anybody here feel like putting their 1080s to use for a good cause?
Join the forum folding war.








Team Intel could use some people with 1080s.
For more info, follow the link:
Forum Folding War Team Intel


----------



## hertz9753

Or they could join the Second Hand Hounds in the 2017 Forum Folding War...









http://folding.axihub.ca/ffw.php

http://www.overclock.net/t/1618628/forum-folding-war-2017-second-hand-hounds-lets-end-2016-finally/0_20


----------



## Wyllliam

Quote:


> Originally Posted by *hertz9753*
> 
> Or they could join the Second Hand Hounds in the 2017 Forum Folding War...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://folding.axihub.ca/ffw.php
> 
> http://www.overclock.net/t/1618628/forum-folding-war-2017-second-hand-hounds-lets-end-2016-finally/0_20


No!!!
Go for Team Intel!!!!
We will


----------



## looniam

Quote:


> Or maybe switch to liquid electrical tape.


----------



## nrpeyton

Quote:


> Originally Posted by *looniam*


Right; I think I've decided how I'm going to do this then:

So I'll place thermal pads onto all the relevant areas on the card, then spray the card with liquid tape and wipe the liquid tape clean from the top of the pads (before it dries).
That should help create a seal between the sides of the pads & the tape so no moisture escapes down the edges of the pads (while still allowing the pads full contact with the block).

First I'll spray a tester on the edge of the PCB to see how easily it peels off.
If it's not peeling off properly then I could switch to petroleum jelly.

Petroleum jelly won't be a permanent solution, but should be enough to get me a validation (and prove it's possible with a full-cover block).
_Until I can order a universal block and an industrial gymnasium fan as a more practical solution._

That's my thoughts anyway; and I'm just home from work. 

I've also got a few tubes of dielectric grease I could smear around the edges of the liquid tape and pads as an extra line of defence.


----------



## sirleeofroy

Quote:


> Originally Posted by *Wyllliam*
> 
> No!!!
> Go for Team Intel!!!!
> We will


Never folded before but I have a 1080, I'll give it a go.....


----------



## Wyllliam

Quote:


> Originally Posted by *sirleeofroy*
> 
> Never folded before but I have a 1080, I'll give it a go.....


Welcome to the club


----------



## nrpeyton

Can't believe I'm about to do this to an £800 GPU

Hope it comes off again :-(


Lol

*Edit*


----------



## Jpmboy

oh my.


----------



## looniam

well, practice makes perfect!


----------



## nrpeyton

2nd coat?


----------



## looniam

i was going to suggest the brush method but you seem to have quickly gotten some pretty good coverage there.


----------



## nrpeyton

Quote:


> Originally Posted by *looniam*
> 
> i was going to suggest the brush method but you seem to have quickly gotten some pretty good coverage there.


Aye, that might be a good idea. I wish I'd grabbed both.

I left the waterproof duct tape underneath as an extra line of defence.

Should ease removal too.

Edit:
Coat the top of the liquid tape (once dried) with dielectric grease or petroleum jelly?
I intend to pack the screw holes with it.


----------



## looniam

check out some liquid tape application threads/posts; the tape is enough. also keep in mind the PCB itself is coated with a polymer (ya know, that's why it's green/black/whatever). it can stand getting wet; it's waterproof.

what you want to cover is the solder points: where the components are soldered to the power/signal planes (copper) underneath, and the components (resistors/transistors etc.) themselves.


----------



## nrpeyton

Okay great. Thank you.

Edit.
There's a lil bit of solder between the VRM and memory VRM that looks vulnerable, so I've added a second, deeper coat to the area around the second-last row of capacitors.


----------



## pez

So is anyone here running ultra wide 1440p and a single 1080? About to give this a try soon and just kinda wondering what experience you guys are having. How are your OCs scaling with it as well?


----------



## looniam

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay great. Thank you.


i ought to have made sure i clarified _you do want a coating on the whole pcb_ - it's actually easier to peel off. i'm trying to point out it isn't necessary to go overkill with putting grease on top.

sorry i didn't reply sooner, got distracted with the whole trump thing in the media.


----------



## nrpeyton

Quote:


> Originally Posted by *looniam*
> 
> i ought to made sure i clarified _you do want a coating on the whole pcb_ - it's actually easier to peel off. i'm trying to point out it isn't necessary to go overkill with putting grease on top.
> 
> sorry i didn't reply sooner, got distracted with the whole trump thing on the media.


Thanks 

I appreciate the help 

It's nearing bedtime (past midnight here); up early for work.

*Here's how it looks so far:*


Still got a lil bit of insulation to do around the PCI-e connector (in case of any droplets near the mobo, as it's a full-cover block).

P.S
_Was just thinking:
I'll also have gravity on my side, due to the gap between most non-thermal-padded areas and the block._

Insulating CPU


Insulating inside chiller (tank & pipes)


*Re-wiring electrics in chiller to bypass thermostat and allow continuous running = sub-zero;-)*

_Oops, what a donut, never cut the wire long enough lol_


_That's better. Outside right to outside left (blue wire)_ 


£60 worth of coolant, ready to go tomorrow
the distilled is clearly desperate for some EVGA Classified ;-)


Quote:


> Originally Posted by *pez*
> 
> So is anyone here running ultra wide 1440p and a single 1080? About to give this a try soon and just kinda wondering what experience you guys are having. How are your OCs scaling with it as well?


I'm running 1440p with a 1080 on an old FX CPU at Ultra with 80+ FPS (Witcher 3), so a 1080 is more than enough at that res.
Most are getting 2100+ on the core and +500 on the mem.

Scaling with cooling is roughly 100MHz per 50 degrees C, but it varies card to card.
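That rule of thumb as a quick back-of-envelope sketch (the 100MHz-per-50C slope is only the rough figure from this post and varies card to card; the example numbers below are made up, not measurements):

```python
# Back-of-envelope Pascal clock estimate from the rough rule above:
# roughly +100 MHz of attainable core clock per 50 C drop in GPU temp.

SLOPE_MHZ_PER_C = 100 / 50  # ~2 MHz per degree, per the rule of thumb

def estimated_clock(base_mhz, base_temp_c, new_temp_c):
    """Estimate attainable clock after a cooling change."""
    return base_mhz + SLOPE_MHZ_PER_C * (base_temp_c - new_temp_c)

# e.g. a card doing 2100 MHz at 45 C, chilled down to 5 C:
print(estimated_clock(2100, 45, 5))  # 2180.0
```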


----------



## Jpmboy

Quote:


> Originally Posted by *looniam*
> 
> i ought to have made sure i clarified _you do want a coating on the whole pcb_ - it's actually easier to peel off. i'm trying to point out it isn't necessary to go overkill with putting grease on top.
> 
> sorry i didn't reply sooner, *got distracted with the whole trump thing on the media*.


----------



## pez

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm running 1440p with a 1080 on an old FX CPU at Ultra with 80+ FPS (Witcher 3), so the 1080 is more than enough at that res.
> Most are getting 2100+ core and +500 mem.
> 
> Scaling with cooling = 100 MHz per 50 degrees C (approx.), but it varies card to card


Is this standard 16:9 1440p or 21:9 1440p? I know the 1080 holds its own at 1440p, but I was just trying to get people's experiences with ultrawide beforehand.









----------



## pantsoftime

Just a PSA about the Fuji Sarcon XR-m (17W/mK) pads that people bring up a lot in this thread. Those pads are fantastic, but are very sensitive to the amount they get compressed. They work best when they're compressed between 25% and 65%. When you overcompress or undercompress them their performance degrades significantly. If you want the best performance be sure to pick the proper thickness pad and don't overtighten your screws. To add insult to injury, there is something like a +/- 0.5mm tolerance on the sheets that Fuji sells. This can have a noticeable impact on your expected compression and thus thermal performance.

When using these pads for a significantly hot component it's usually best to buy spares. It won't matter much for parts with lowish power (under 10 watts) but if you're trying to use a pad like this on a super hot component then you'll want to dial it in precisely.
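To put rough numbers on that: compression is just (pad thickness - compressed gap) / pad thickness. A quick sketch — the pad and gap dimensions below are made-up examples, and the 25-65% window is the figure from this post, not something off a datasheet:

```python
# Thermal pad compression check. The 25-65% "sweet spot" is the range
# quoted in the post above; the pad/gap dimensions here are made up.

def compression_pct(pad_thickness_mm: float, gap_mm: float) -> float:
    """How much the pad is squeezed when filling a component-to-block gap."""
    return (pad_thickness_mm - gap_mm) / pad_thickness_mm * 100.0

def in_sweet_spot(pad_thickness_mm: float, gap_mm: float,
                  lo: float = 25.0, hi: float = 65.0) -> bool:
    return lo <= compression_pct(pad_thickness_mm, gap_mm) <= hi

# A 1.0 mm pad filling a 0.6 mm gap is compressed 40% -- inside the window.
print(round(compression_pct(1.0, 0.6), 1), in_sweet_spot(1.0, 0.6))  # 40.0 True
# The same pad in a 0.9 mm gap is only compressed 10% -- too loose.
print(in_sweet_spot(1.0, 0.9))  # False
```

The +/- 0.5mm sheet tolerance mentioned above is exactly why spares matter: on a 1.0mm pad, that tolerance alone can swing you right out of the window.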


----------



## nrpeyton

Quote:


> Originally Posted by *pantsoftime*
> 
> Just a PSA about the Fuji Sarcon XR-m (17W/mK) pads that people bring up a lot in this thread. Those pads are fantastic, but are very sensitive to the amount they get compressed. They work best when they're compressed between 25% and 65%. When you overcompress or undercompress them their performance degrades significantly. If you want the best performance be sure to pick the proper thickness pad and don't overtighten your screws. To add insult to injury, there is something like a +/- 0.5mm tolerance on the sheets that Fuji sells. This can have a noticeable impact on your expected compression and thus thermal performance.
> 
> When using these pads for a significantly hot component it's usually best to buy spares. It won't matter much for parts with lowish power (under 10 watts) but if you're trying to use a pad like this on a super hot component then you'll want to dial it in precisely.


I would definitely second that 

Good post,

Quote:


> Originally Posted by *pez*
> 
> Is this standard 16:9 1440p or 21:9 1440p? I know the 1080 holds its own at 1440p, but I was just trying to get people's experiences with ultrawide beforehand.
> 
> 
> 
> 
> 
> 
> 


ahh sorry, so 3440 x 1440.

I'm running 2560 x 1440.

So 2560 * 1440 = 3,686,400
and
3440 * 1440 = 4,953,600
then
3,686,400 / 4,953,600 * 100 = 74%

*So I'd say 1440p ultrawide would be roughly 25% slower.

Maybe 75 FPS instead of 100 FPS at Ultra?*
I could be wrong.... but that's how I'd see it...
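For what it's worth, that back-of-envelope maths fits in a few lines (assuming FPS scales roughly linearly with pixel count, which real games only approximate):

```python
# Pixel-count comparison between 16:9 QHD and 21:9 ultrawide QHD.
# Assumes GPU-bound, linear-with-pixels scaling -- a rough estimate only.

def pixels(w: int, h: int) -> int:
    return w * h

qhd = pixels(2560, 1440)     # 3,686,400
uw = pixels(3440, 1440)      # 4,953,600

ratio = qhd / uw             # ~0.744: ultrawide pushes ~34% more pixels
print(f"{ratio:.0%}")        # 74%

# A game hitting 100 FPS at 2560x1440 would naively land around:
print(round(100 * ratio))    # 74
```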

*1440 Ultra Wide could be a very good 'medium' between standard 1440 and 4K.

Sounds great *

yeeeehaa!!!!!!!!!

_*FULL SIZE* - right click & select 'open new tab'_


*First time sub-zero*

Not tried to go further yet... the chiller was still dropping fast, so there's definitely more headroom to go lower 

VRM temperature a little high though, idling at 24c at a water temp of 10c... it was definitely better before.

Might need to re-seat the card or try a slightly bigger pad since insulating (think some liquid tape must have dribbled underneath the pads, or the block isn't compressing the VRM pad enough now).

Memory idle temp good, idling at 11c VRM side, 9c I/O side.


----------



## emperium85

-


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Thanks
> 
> I appreciate the help
> 
> Its nearing bed time (past midnight here) up early for work.
> 
> *Heres how it looks so far:*
> 
> 
> Still got a lil bit insulation to do around the PCI-e connector (incase of any droplets near mobo as its full-cover block).
> 
> 
> 
> 
> 
> P.S
> _Was just thinking:
> I'll also have gravity on my side due to the gap between most non thermal padded areas and block. _
> 
> Insulating CPU
> 
> 
> Insulating inside chiller (tank & pipes)
> 
> 
> *Re-wiring electrics in chiller to bypass thermostat and allow continuous running = sub-zero;-)*
> 
> _Oops what a donut, never cut wire long enough lol_
> 
> 
> _Thats better. Outside right to outside left (blue wire)_
> 
> 
> £60 worth of coolant, ready to go tomorrow
> the distilled is clearly desperate for some EVGA Classified ;-)
> 
> 
> I'm running 1440p with a 1080 on an old FX CPU at Ultra with 80+ FPS (Witcher 3), so the 1080 is more than enough at that res.
> Most are getting 2100+ core and +500 mem.
> 
> Scaling with cooling = 100 MHz per 50 degrees C (approx.), but it varies card to card


ever considered this saves insulating components
\
http://www.overclock.net/t/1533164/the-24-7-sub-zero-liquid-chillbox-club/0_20


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> Ever considered this? It saves insulating components:
> http://www.overclock.net/t/1533164/the-24-7-sub-zero-liquid-chillbox-club/0_20


I asked for help on that thread a few months ago (while trying to understand it all) as you have to completely build it yourself, from scratch.

Some of those guys are refrigeration engineers.

Anyway I got 'ignored', completely. Nobody even bothered to acknowledge my post... so I was forced to make my own path.



*I see frost, lol *





It began to get so cold that moisture began to appear on the *BACK* of the PCB. _And not just where the block touches on the opposite side._

My memory temps were even sitting at 0c, idling lol.
I never insulated the back because the block is on the front. But it cooled the *entire* PCB so well that the whole card was beginning to go sub-zero.

I'm going to need to think about that lol.

It's also making me think again about that "bag" idea from the other day (obviously not for prolonged use)...?

God knows, but I'm impressed by the results.. and that was only at -8 too.

Chiller was still dropping but beginning to slow.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> I asked for help on that thread a few months ago (while trying to understand it all) as you have to completely build it yourself, from scratch.
> 
> Some of those guys are refrigeration engineers.
> 
> Anyway I got 'ignored', completely. Nobody even bothered to acknowledge my post... so I was forced to make my own path.
> 
> 
> 
> 
> 
> 
> 
> *I see frost, lol *
> 
> 
> 
> 
> 
> 
> It began to get so cold that moisture began to appear on the *BACK* of the PCB. _And not just where the block touches on the opposite side._
> 
> My memory temps were even sitting at 0c, idling lol.
> I never insulated the back because the block is on the front. But it cooled the *entire* PCB so well that the whole card was beginning to go sub-zero.
> 
> I'm going to need to think about that lol.
> 
> It's also making me think again about that "bag" idea from the other day (obviously not for prolonged use)...?
> 
> God knows, but I'm impressed by the results.. and that was only at -8 too.
> 
> Chiller was still dropping but beginning to slow.


There are many ways you could make an airtight box; the design is for 24/7 use. I see you've got everything you need apart from an insulated box,
which in reality, for benching purposes, you could just wrap your whole PC in shrink wrap and put a few blankets over the top for insulation.
The whole concept is to prevent any air circulation, which prevents any condensation forming.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> There are many ways you could make an airtight box; the design is for 24/7 use. I see you've got everything you need apart from an insulated box,
> which in reality, for benching purposes, you could just wrap your whole PC in shrink wrap and put a few blankets over the top for insulation.
> The whole concept is to prevent any air circulation, which prevents any condensation forming.


Originally I had bagged the block (and filled in the holes I made with kneadable erasers/putty), but people were suggesting it would create a warm air bubble around the card (making condensation even *more* likely to form), so I took the dunt to my pride lol, and removed the bag.

My thoughts were that if there was no air circulation then condensation wouldn't be able to form anyway..

And that how *much* formed was kind of irrelevant because at -10 or -20 any humidity is going to turn into frost/moisture immediately anyway.

Maybe bagging the entire system (as you say) would work better? But then there's _already_ more humidity in 1.5 cubic metres than there is in 30 cubic cm... but then doing the entire system also wouldn't create such a "hot box hazard"...

Its all good stuff to think about and ponder.....

I definitely appreciate your post, good things to think about 

The chill-box idea did kind of inspire me when I thought about using the bag (and the block IS a full-cover one after-all)...

I wouldn't be running at those temps during gaming for prolonged periods.....

I also thought about adding a block (_with no component attached_) to the inside of the PC to draw the humidity out and away from other components (it's kind of how dehumidifiers work).

P.S.

How does the chillbox idea keep the air inside cool? With the radiator.. how does that work?


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Originally I had bagged the block (and filled in the holes I made with kneadable erasers/putty), but people were suggesting it would create a warm air bubble around the card (making condensation even *more* likely to form), so I took the dunt to my pride lol, and removed the bag.
> 
> My thoughts were that if there was no air circulation then condensation wouldn't be able to form anyway..
> 
> And that how *much* formed was kind of irrelevant because at -10 or -20 any humidity is going to turn into frost/moisture immediately anyway.
> 
> Maybe bagging the entire system (as you say) would work better? But then there's _already_ more humidity in 1.5 cubic metres than there is in 30 cubic cm... but then doing the entire system also wouldn't create such a "hot box hazard"...
> 
> Its all good stuff to think about and ponder.....
> 
> I definitely appreciate your post, good things to think about
> 
> The chill-box idea did kind of inspire me when I thought about using the bag (and the block IS a full-cover one after-all)...
> 
> I wouldn't be running at those temps during gaming for prolonged periods.....
> 
> I also thought about adding a block (_with no component attached_) to the inside of the PC to draw the humidity out and away from other components (it's kind of how dehumidifiers work).
> 
> P.S.
> 
> How does the chillbox idea keep the air inside cool? With the radiator.. how does that work?


Yes, there is a radiator inside the chillbox cooling the inside of the box, to remove heat from the memory etc.
The whole concept being: there will be a little humidity in 1m³, but how many m³ are in your room? 100m³ on average. That is a lot of moisture that could circulate; cut the air circulation and almost all of that moisture is taken out of play.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> Yes, there is a radiator inside the chillbox cooling the inside of the box, to remove heat from the memory etc.
> The whole concept being: there will be a little humidity in 1m³, but how many m³ are in your room? 100m³ on average. That is a lot of moisture that could circulate; cut the air circulation and almost all of that moisture is taken out of play.


Hmm, true... so the cold radiator (with freezing coolant travelling through it) actually cools the air inside the box as the air passes across it.

Then seal the case, with the radiator inside, with a bag... hmm, that could work...

Have I got that understanding right?

P.S.

So all I need to do is add the radiator back into my "freezing loop" (it is disconnected at the moment) and seal the case, or create a new, sealed case to house the PC?
_
Edit: Wouldn't the radiator just put warmth back into the loop?_

Edit 2:
I also noticed my flow dropped considerably the colder the coolant got (the coolant thickens as it cools), despite the fact it won't freeze until -15. It still pumps around very fast when it's warm.

I think I'd definitely need a 2nd pump if I added the radiator back in (I quite like the idea of having an excuse to get a second one lol) ;-)


----------



## feznz

Bingo. You need to size your chiller to remove more heat than your total PC heat output.
So, for example, if your PC draws 750W from the power socket then you will need a chiller with at least 1000W of cooling power,
and that is a very conservative estimate because of heat loss from poor insulation, pumps, etc.

I had great plans, but reality hit and I sold my 1.5kW condensing unit.
Pumps? I have four of these: http://www.overclock.net/t/1514355/speck-my2-8000/0_20 - there's another 80W+ of heat output to deal with.
The cooler you go, the more hurdles you need to deal with; in the end the gains were getting marginal even with the 7xx cards, IMHO.

It's probably best to talk this over in the chiller club; there are people with a lot more experience than me there.
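To put numbers on that rule of thumb (the 1.33 headroom factor below is an illustrative choice that matches the "750W needs at least 1000W" example, not a refrigeration standard):

```python
# Chiller sizing rule of thumb: cooling power must beat total heat input
# (PC draw + pump heat) with headroom for insulation losses. The 1.33
# factor is illustrative, chosen to match the 750W -> ~1000W example.

def required_chiller_watts(pc_draw_w: float,
                           pump_heat_w: float = 0.0,
                           headroom: float = 1.33) -> float:
    return (pc_draw_w + pump_heat_w) * headroom

print(round(required_chiller_watts(750)))      # ~1000 W for a 750 W PC
print(round(required_chiller_watts(750, 80)))  # four pumps add ~80 W more heat
```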


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> Bingo. You need to size your chiller to remove more heat than your total PC heat output.
> So, for example, if your PC draws 750W from the power socket then you will need a chiller with at least 1000W of cooling power,
> and that is a very conservative estimate because of heat loss from poor insulation, pumps, etc.
> 
> I had great plans, but reality hit and I sold my 1.5kW condensing unit.
> Pumps? I have four of these: http://www.overclock.net/t/1514355/speck-my2-8000/0_20 - there's another 80W+ of heat output to deal with.
> The cooler you go, the more hurdles you need to deal with; in the end the gains were getting marginal even with the 7xx cards, IMHO.
> 
> It's probably best to talk this over in the chiller club; there are people with a lot more experience than me there.


Okay, I'll try again... they might take me more seriously now that I can actually prove I am capable and already have most of the equipment (and am already hitting sub-zero).

To be fair, I was starting from scratch as a n00b when I approached them before... now I'm far past that. Maybe they'll listen.

I'll give it a bash, thanks 

*BTW lads, we finally broke through: lol \/*



*2303 MHz*, not entirely stable, but it was a quick and dirty run at stock voltage and only -4 water temp

I need to revisit the back of the PCB before I push harder.


----------



## Synntx

Hey all, just wanted to share my all-time high record on firestrike with my 1080.

LOOK AT DAT!

http://www.3dmark.com/3dm/17548166?


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> okay, i'll try again.. they might take me more seriously now that I can actually prove I am capable and already have most of the equipment (and am already hitting sub-zero).
> 
> To be fair, I was starting from scratch as a n00b when I confronted them before.. now I'm far past that. maybe they'll listen
> 
> I'll give it a bash, thanks
> 
> *BTW lads, we finally broke through: lol \/*
> 
> 
> 
> *2303 MHZ*, not entirely stable but it was a quick dirty run, at stock voltages and only -4 water temp
> 
> I need to revisit the back of the PCB before I push harder.


Can you run FS at 2300MHz? I'd like to see that score haha.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> okay, i'll try again.. they might take me more seriously now that I can actually prove I am capable and already have most of the equipment (and am already hitting sub-zero).
> 
> To be fair, I was starting from scratch as a n00b when I confronted them before.. now I'm far past that. maybe they'll listen
> 
> I'll give it a bash, thanks
> 
> *BTW lads, we finally broke through: lol \/*
> 
> 
> 
> *2303 MHZ*, not entirely stable but it was a quick dirty run, at stock voltages and only -4 water temp
> 
> I need to revisit the back of the PCB before I push harder.


I'll be out of town all next week, but hopefully Mother Nature will provide me some cold weather when I get back, and I'll make another go at 2300. Congrats on hitting it. Now run some benches. = )


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I'll be out of town all next week, but, hopefully mother nature will provide me some cold weather when I get back, and I'll make another go at 2300. Congrats on hitting it. Now run some benches. = )


Quote:


> Originally Posted by *Dragonsyph*
> 
> Can you run FS with 2300mhz? Id like to see that score haha.


Not quite stable enough yet to get through a full bench.

My card was getting so cold that the bare *back* of the PCB was even beginning to accumulate moisture, as the entire I/O side of the card (all around) was below the dew point.

I need to have another look at that before I go any further.
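For anyone wanting to predict when that happens: the dew point can be estimated with the standard Magnus approximation. The coefficients below are the commonly cited ones; treat it as a rough guide, not a calibrated instrument:

```python
import math

# Magnus-formula dew point: any surface colder than this temperature will
# collect condensation, which is why an uninsulated PCB back gets wet
# well before it reaches 0c.

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    a, b = 17.62, 243.12  # common Magnus coefficients
    gamma = a * temp_c / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return b * gamma / (a - gamma)

# A 21c room at 50% RH condenses on anything below about 10c:
print(round(dew_point_c(21.0, 50.0), 1))  # 10.2
```

So with -10 water in the block, essentially the whole card sits far below the dew point of any normal room.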

Apart from that; no problems and no shorts. So I must have done a decent job insulating the front 

Don't worry; I'll get there lol... just a matter of _time_.

Think I'm going to use putty to seal it tonight then go for another run 

I won't even need to disconnect it from the computer to do the back.


----------



## Radmanhs

Somewhere in this thread there was a link to a special BIOS that allowed you to increase the voltage past the stock max - can someone link it for me please? Also, how do you load this BIOS onto the card? I'm kinda dumb when it comes to these sorts of things.


----------



## Dragonsyph

Like the other guy was asking: do any of you have a single 1080 with a 3440x1440 ultrawide? Are you able to get max settings at 75-100 fps in most games?

Reason I ask is I ordered an X34 monitor.


----------



## Xristo

So I bit the bullet and went and bought myself a GTX 1080. Coming from the 780 Ti it's not something that I needed, but it's been itching at me the last week or so, and life is too short, so I figured why not. I recently purchased a 144Hz monitor and wasn't getting the ideal frame rates in the newer games at high detail; now I have a much smoother playing experience and should be future-proof for the next few years at least.


----------



## Dragonsyph

Quote:


> Originally Posted by *Xristo*
> 
> So I bit the bullet and went and bought myself a GTX 1080. Coming from the 780 Ti it's not something that I needed, but it's been itching at me the last week or so, and life is too short, so I figured why not. I recently purchased a 144Hz monitor and wasn't getting the ideal frame rates in the newer games at high detail; now I have a much smoother playing experience and should be future-proof for the next few years at least.


Woot, grats bro. I always say build your PC around your monitor, so I think it was a good move to upgrade and get the FPS to match your monitor's refresh rate.

That's why I'm hoping mine can push the monitor I just got.

Let us know how it overclocks 8).


----------



## Xristo

Thanks mate =) It cost an arm and a leg! The 780 Ti did a good job on my old 60Hz monitor, but I recently got this 144Hz monitor and in most games I was struggling to see the frame rates I would have liked.

I've seen more than double the frame rate of the 780 Ti in most games, except Watch Dogs 2 - surprisingly I didn't see much of an increase in performance =( It struggles to keep above 60 on Ultra in dense areas, but I'll blame that on poor optimization or driver support.

In 3DMark Fire Strike and Time Spy I saw double the frame rate... day and night difference. It's really smooth compared to the 780 Ti, which was struggling a lot in these benchmarks and really began to show its age.

Overclocked to 2100MHz core, 5150MHz mem... 61c was the highest temperature I saw after benchmarking, and it's also silent compared to my reference 780 Ti, which I'm very pleased about. Not planning to go ridiculous with this thing until I read some reviews on how capable it is with voltages and temps... I know it has more potential in it; I'm sure it will get the better of me eventually.

My 780 Ti sounded like a hair dryer and got very hot during gaming, almost 80 degrees... I basically got double the frame rate, lower temps and quiet fans. Very pleased with my purchase; definitely worth the upgrade for me.


----------



## Spiriva

Quote:


> Originally Posted by *Radmanhs*
> 
> Somewhere in this thread there was a link to a special bios that allowed you to increase the voltage past the stock max, can someone link it for me please? Also, how do you load this bios to the card? I'm kinda dumb when it comes to these sort of things


Quote:


> Originally Posted by *nrpeyton*
> 
> strix1080xoc_t4version2withhigherfirestrikescore.zip 148k .zip file
> 
> 
> -Removed Power Limit
> -Voltage up to 1.2v (over stock 1.093v)
> *-No Temperature limit (^warning^)*
> -May cause a 'reduced fan speed' on FE cards. FE cards should only use this when under water.
> -Won't work with monitors plugged into the 3rd display port
> 
> *Be careful* (monitor your temps when using this BIOS)
> 
> -Remember ALWAYS *disable display driver in "device manager" first BEFORE flashing*.
> -*Always backup your original BIOS* first using GPU-Z (and keep a backup of your original in safe place)


To up the voltage you need to do it via the MSI Afterburner curve ("graph"); in AB, press Ctrl+F to bring it up.
I use this BIOS on both my cards (under water) @ 2250MHz, 1.150v (EVGA FE 1080s).

nvflash --index=0 --save 1080org.rom
nvflash --index=0 --protectoff
nvflash --index=0 -6 strix1080xoc_t4.rom

--index=0 is because I have two cards; the next card would be --index=1, then do the same thing again.


----------



## Radmanhs

I'm getting an error saying no display adapter was detected, how do I overcome this?


----------



## Dragonsyph

Quote:


> Originally Posted by *Xristo*
> 
> Thanks mate =) It cost an arm and a leg! The 780 Ti did a good job on my old 60Hz monitor, but I recently got this 144Hz monitor and in most games I was struggling to see the frame rates I would have liked.
> 
> I've seen more than double the frame rate of the 780 Ti in most games, except Watch Dogs 2 - surprisingly I didn't see much of an increase in performance =( It struggles to keep above 60 on Ultra in dense areas, but I'll blame that on poor optimization or driver support.
> 
> In 3DMark Fire Strike and Time Spy I saw double the frame rate... day and night difference. It's really smooth compared to the 780 Ti, which was struggling a lot in these benchmarks and really began to show its age.
> 
> Overclocked to 2100MHz core, 5150MHz mem... 61c was the highest temperature I saw after benchmarking, and it's also silent compared to my reference 780 Ti, which I'm very pleased about. Not planning to go ridiculous with this thing until I read some reviews on how capable it is with voltages and temps... I know it has more potential in it; I'm sure it will get the better of me eventually.
> 
> My 780 Ti sounded like a hair dryer and got very hot during gaming, almost 80 degrees... I basically got double the frame rate, lower temps and quiet fans. Very pleased with my purchase; definitely worth the upgrade for me.


Nice, haha ya them older cards were so loud.


----------



## Radmanhs

Now I'm getting "no longer supports WoW64 flasher".


----------



## Radmanhs

Think I got it to work; it booted, but I can't directly change anything in Afterburner. Is there another way to bring up the graph? Ctrl+F isn't working.


----------



## Radmanhs

AAANNNNNNDDDDDD... it works I think. I'll spend tomorrow tweaking it around and see what I can get


----------



## Xristo

The best OC I could manage is 2150MHz core, 10,360MHz memory... any higher crashes in Fire Strike. I didn't really play with the memory too much, but it's a decent overclock nonetheless. Temps are reasonable, sitting around the mid-to-high 60s under load... I'm happy with that =) It's summer here in Australia at the moment, so ambient temps are pretty high to begin with.

I unlocked voltage control and tried adding a bit more voltage at a slightly higher overclock, but it crashes regardless above 2150MHz. I'll need to play around with it some more, but I don't think I will achieve much... what sort of overclocks are you guys getting?


----------



## Dragonsyph

Quote:


> Originally Posted by *Xristo*
> 
> The best oc i could manage is 2150 mhz core, 10,360 mhz memory .. any higher crashes in fire strike, I didnt really play with the memory too much but its a decent overclock none the less.. temps are considerable sitting around the mid, high 60's under load .. im happy with that =) its summer here in Australia at the moment so ambient temps are pretty high to begin with.
> 
> I unlocked voltage control and tried adding abit more voltage at a slightly higher overclock but it crashes regardless of anything higher than 2150mhz ? ill need to play around with it some more but i dont think I will achieve much .. what sort of overclock are you guys getting ?


Almost all cards are locked to 1.09v max; the voltage % slider at 100% will just allow the card to go that high. Your core clock is pretty good; I'm only at 2164, but with a +1000 memory OC. Memory OCs help a lot in scores like FS.

There are people getting up to 2300MHz, but with chillers, I think they're called.
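The reason memory OC moves Fire Strike so much is bandwidth. A rough sketch using the 1080's public specs (256-bit bus, 10 Gbps effective GDDR5X at stock; a +500 Afterburner offset is commonly reported as roughly 11 Gbps effective):

```python
# Memory bandwidth for a 256-bit bus at a given effective rate (MT/s).
BUS_WIDTH_BITS = 256

def bandwidth_gbs(effective_mtps: float) -> float:
    return effective_mtps * BUS_WIDTH_BITS / 8 / 1000  # GB/s

stock = bandwidth_gbs(10_000)    # stock GTX 1080
oc = bandwidth_gbs(11_000)       # with a typical +500 AB memory offset
print(stock, oc)                 # 320.0 352.0
print(f"+{oc / stock - 1:.0%}")  # +10% more bandwidth
```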


----------



## Xristo

Thanks man =) Is it safe to run the voltage slider at 100% all the time? I'll have to give that a go and bump up the clock a bit... also I'll have to play around with the memory overclock, because yeah, that's pretty weak =( I'm sure it has more in it; I'll get back to ya on that.







thanks for the help.


----------



## Dragonsyph

Quote:


> Originally Posted by *Xristo*
> 
> Thanks man =) Is it safe to run the voltage slider at 100% all the time? I'll have to give that a go and bump up the clock a bit... also I'll have to play around with the memory overclock, because yeah, that's pretty weak =( I'm sure it has more in it; I'll get back to ya on that.
> 
> 
> 
> 
> 
> 
> 
> thanks for the help.


Ya, pretty much everyone on here who is overclocking runs it at 100%. I would put the OSD up so you can see the voltage; if your card is not stable above a given MHz and not limited by vcore, you can tell, and then turn the slider back down for lower voltage and lower temps.


----------



## dgateles

Where can I find the TWIN FROZR VI cooler from MSI's Gaming models, to replace the Armor cooling system?


----------



## Xristo

OK, so I set the slider to 100% and it still crashes past a certain point regardless, but that's fine, I'm happy with that overclock anyway; realistically anything more is negligible. I've kept my eye on the FPS counter during certain scenes and it seems like it's only a 1 or 2 FPS increase at higher clocks until it crashes, so I'm done with it as it is.

2100MHz core and 10,500MHz memory is the best it can do without crashing the program... unless I throw a lot more voltage at it, and it's simply not worth it unless you're on water cooling.

Also, you were right about the memory clock: with that increase I bumped the score up past 18k.


----------



## zGunBLADEz

I don't understand why people want to add more volts when it's proven Pascal doesn't like it. I can manage 2126 with a fixed 1.000v; trying to lower your power-limiter wall is the best approach on Founders Editions. Then you fix the curve so when it reaches the limit it downclocks gracefully instead of with drastic drops, e.g. from 2114 your lowest would be 2101.


----------



## zGunBLADEz

Anything beyond 2.1GHz is kind of useless; I'd prefer an extra 50MHz on the memory for better gains.


----------



## Spiriva

Quote:


> Originally Posted by *zGunBLADEz*
> 
> I don't understand why people want to add more volts when it's proven Pascal doesn't like it. I can manage 2126 with a fixed 1.000v; trying to lower your power-limiter wall is the best approach on Founders Editions. Then you fix the curve so when it reaches the limit it downclocks gracefully instead of with drastic drops, e.g. from 2114 your lowest would be 2101.


Mine never drops below 2250mhz tho


----------



## fat4l

So once again, guys, what nvflash do we use(version) and what's the proper commands to flash that T4 bios, on Fe card, if you run just 1 card ?
thanks


----------



## zGunBLADEz

Quote:


> Originally Posted by *Spiriva*
> 
> Mine never drops below 2250mhz tho


That wasn't what I was saying. BUT GOOD FOR YOU XD


----------



## zGunBLADEz

In this case you get better gains from your mem overclock than from the core itself.


----------



## Spiriva

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Then you fix the curve so when it reaches the limit it downclocks gracefully instead of with drastic drops, e.g. from 2114 your lowest would be 2101.


Quote:


> Originally Posted by *zGunBLADEz*
> 
> That wasn't what I'm saying. BUT GOOD FOR YOU XD


kk....


----------



## MonarchX

Damn, the Gigabyte GeForce GTX 1080 G1 Gaming is not the best clocker. After a month of playing games, it does about 2000MHz on the GPU and 11000MHz on the VRAM 100% stable. I think it is because Gigabyte used a single 8+2 power plug on this card, while other manufacturers and later revisions of Gigabyte's 1080 cards have 2x 8+2 power plugs. Aside from not being stable above 2000MHz, it is a pretty good card! Quiet, and temps never go past 66C with the stock fan profile.


----------



## Radmanhs

I flashed to the T4 BIOS, but it won't let me get over 1.093v.

Should I just reflash the BIOS?


----------



## invincible20xx

I noticed that increasing the voltage in Afterburner doesn't really give me any better stable OC - what gives?


----------



## nrpeyton

Beautifully *stable at 2265 MHZ* but can't get a fully _stable_ run over that.

Can get a stable 2278 using Precision X, but not Afterburner, for some strange reason. *http://www.3dmark.com/3dm/17574951* BTW, non-default settings were NOT used in the run; I think it's because I need to update it. http://www.3dmark.com/3dm/17575002 <<-- score is still obviously being held back by my old AMD FX CPU.

Can get to 2303 MHZ but it crashes every time!

*Can't seem to get my bloody chiller to go below -10* (and then it has a *hard* time keeping up with that temperature the second I put *any* load on either the CPU or GPU).

*So ******* close*. _*Two bloody steps (steps are 13 MHz each)*_ from a stable (or at least non-crashing) 2300 MHz.

So frustrated with my chiller's inability to get below -10.

I even under-volted my CPU to 1.0v at 800 MHz and left the GPU idling, and could still only get the water temp down to -10.

*Still don't understand how a 200L freezer with a 1/5 HP compressor manages to get to -22c, when I can only get to -10 on a system with a 1/2 HP compressor, a huge evaporator, and a condenser that is aided by a fan!!*

SO F******G CLOSE! !!









Another 10c (to -20c) would have done it.
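For anyone counting along: Pascal's boost clock moves in fixed ~13 MHz bins, so "two steps" is literal. A trivial sketch (13 MHz per step matches what's reported here; exact bin anchors vary card to card):

```python
import math

# Distance to a target clock in Pascal's ~13 MHz boost bins.
STEP_MHZ = 13

def bins_between(current_mhz: float, target_mhz: float) -> int:
    return math.ceil((target_mhz - current_mhz) / STEP_MHZ)

print(bins_between(2278, 2300))  # 2 -- the "two bloody steps"
print(bins_between(2265, 2303))  # 3
```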


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> Beautifully *stable at 2265 MHZ* but can't get a fully stable run over that.
> 
> Can get a stable 2278 using Precision X, but not Afterburner, for some strange reason. *http://www.3dmark.com/3dm/17574951* BTW, non-default settings were NOT used in the run; I think it's because I need to update it. http://www.3dmark.com/3dm/17575002 <<-- score is still obviously being held back by my old AMD FX CPU.
> 
> Can get to 2303 MHZ but it crashes every time!
> 
> *Can't seem to get my bloody chiller to go below -10* (then it has a *hard* time keeping up with that temperature the second I put *any* load on either CPU or GPU either.
> 
> *So ******* close*. *Two bloody steps. (steps are 13 MHZ each)* towards stable (or at least non-crashing) 2300 MHZ.
> 
> So frustrated with my chillers lack of getting below -10.
> 
> I even under-volted my CPU to 1.0v at 800 MHZ to and left the GPU idling and could still only get water temp down to -10
> 
> *Still don't understand how a 200L freezer, with a 1/5th HP compressor, manages to get to -22c. When I can only get to -10, on a system with a 1/2 HP compressor, (a huge evaporator) and a condenser that is aided by a fan!!*
> 
> SO F******G CLOSE! !!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another 10c (to -20c would of done it).


That's pretty sick. Do those temps help memory OC at all?


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> Beautifully *stable at 2265 MHZ* but can't get a fully _stable_ run over that.
> 
> Can get stable 2278 using Precision X, but not Afterburner for stranger reason.*http://www.3dmark.com/3dm/17574951* btw non-default settings were NOT used in the run. I think its because I need to update it. http://www.3dmark.com/3dm/17575002 <<-- score is still obviously being held back by my old AMD FX, CPU.
> 
> Can get to 2303 MHZ but it crashes every time!
> 
> *Can't seem to get my bloody chiller to go below -10* (then it has a *hard* time keeping up with that temperature the second I put *any* load on either CPU or GPU either.
> 
> *So ******* close*. _*Two bloody steps. (steps are 13 MHZ each)*_ towards stable (or at least non-crashing) 2300 MHZ.
> 
> So frustrated with my chillers lack of getting below -10.
> 
> I even under-volted my CPU to 1.0v at 800 MHZ to and left the GPU idling and could still only get water temp down to -10
> 
> *Still don't understand how a 200L freezer, with a 1/5th HP compressor, manages to get to -22c. When I can only get to -10, on a system with a 1/2 HP compressor, (a huge evaporator) and a condenser that is aided by a fan!!*
> 
> SO F******G CLOSE! !!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another 10c (to -20c would of done it).






Because a freezer does not have a proximal heat source the way your system does.

Remember 'coldness' is not a thing in and of itself. It is the absence of energy.


----------



## OZrevhead

I really like the whole chilled-water idea; phase change is great, but it only covers a CPU or a GPU, whereas chilled water can cool everything you care to plumb in.

Your CPU should have really responded to the cold too. It's not just chilling your GPU, is it?


----------



## hparks

Is there a way to safely undervolt this card when idling, below the standard 625 mV?

It doesn't seem to matter whether it's running at 200 MHz or 1200 MHz; it's always 625 mV.


----------



## looniam

Quote:


> Originally Posted by *Derek1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Beautifully *stable at 2265 MHZ* but can't get a fully _stable_ run over that.
> 
> Can get stable 2278 using Precision X, but not Afterburner for stranger reason.*http://www.3dmark.com/3dm/17574951* btw non-default settings were NOT used in the run. I think its because I need to update it. http://www.3dmark.com/3dm/17575002 <<-- score is still obviously being held back by my old AMD FX, CPU.
> 
> Can get to 2303 MHZ but it crashes every time!
> 
> *Can't seem to get my bloody chiller to go below -10* (then it has a *hard* time keeping up with that temperature the second I put *any* load on either CPU or GPU either.
> 
> *So ******* close*. _*Two bloody steps. (steps are 13 MHZ each)*_ towards stable (or at least non-crashing) 2300 MHZ.
> 
> So frustrated with my chillers lack of getting below -10.
> 
> I even under-volted my CPU to 1.0v at 800 MHZ to and left the GPU idling and could still only get water temp down to -10
> 
> *Still don't understand how a 200L freezer, with a 1/5th HP compressor, manages to get to -22c. When I can only get to -10, on a system with a 1/2 HP compressor, (a huge evaporator) and a condenser that is aided by a fan!!*
> 
> SO F******G CLOSE! !!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another 10c (to -20c would of done it).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Because a freezer does not have a proximal heat source the way you have.
> 
> Remember 'coldness' is not a thing in and of itself. It is the absence of energy *heat*.
Click to expand...

FTFY.

and if you don't know why; please refresh your school lessons on _mass being energy at rest_.


----------



## Valtava

Quote:


> Originally Posted by *nrpeyton*
> http://www.3dmark.com/3dm/17575002 <<-- score is still obviously being held back by my old AMD FX, CPU.


My run with fx 8350 @ stock speed and 1080 gaming z @ 2114 -> http://www.3dmark.com/3dm/17600128 (Graphics Score 24 174)

In 3DMark, Pascal's best performance peak is not at the card's maximum boost clock.


----------



## snow cakes

Haven't been around in a while... a question, since I've been ATI my entire PC-building days: will two of these in SLI be sufficient for maxing out most games at 4K?


----------



## BrainSplatter

Quote:


> Originally Posted by *KickAssCop*
> 
> will 2 of these in SLI be sufficient enough for 4k maxing out most games?


*IF* the game supports proper SLI scaling, then you should see at least 60 fps @ 4K in most of today's games.


----------



## snow cakes

Quote:


> Originally Posted by *BrainSplatter*
> 
> *IF* the game supports proper SLI scaling then u should see at least 60 fps @ 4K in most of todays games.


do the majority of today's games support SLI?


----------



## keikei

Quote:


> Originally Posted by *snow cakes*
> 
> do the majority of today's games support SLI?


It depends on what you tend to play. If it's a modern AAA game, then you should expect support. If it's a port or an indie game, it may not have it, and if it does, it may be buggy. That's what I've found. SLI support also tends to be an afterthought, unfortunately.


----------



## BrainSplatter

Quote:


> Originally Posted by *snow cakes*
> 
> do the majority of today's games support SLI?


I would say that maybe about 75% will get some SLI/CF support eventually, but don't expect it to work properly on release day.


----------



## nrpeyton

Quote:


> Originally Posted by *looniam*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> FTFY.
> 
> and if you don't know why; please refresh your school lessons on _mass being energy at rest_.


Quote:


> Originally Posted by *Dragonsyph*
> 
> That's pretty sick. Does those temps help memory OC at all?


Yes, they do indeed (very much so), as I'm running it with a full-cover block, so the subzero conditions extend to my memory. I can even monitor my memory temps using temp probes I positioned *directly behind* the memory on the back of the PCB. ;-)

In reality, though, since I can only reach -10 by seriously undervolting and idling my components, then quickly resuming normal operating specs to start a benchmark, the lowest temperature I can hope to achieve at LOAD with my current water chiller is 0°C -> 2°C (GPU core temp),
and a memory load temp of about 10°C - 15°C (due to the thermal pad).

*Update on the water chiller situation*

I've done some research, and it appears the reason I can't get below -10°C is the gas used in my water chiller: R134a, which has a boiling point of only -26°C.

The colder the chiller runs, the closer the gas is to its boiling point, and the less efficient it becomes.

You see, the 'temperature drop' or 'cooling' happens when the gas evaporates in the evaporator (as it absorbs heat energy to change into its more energetic gas form).

In the same way that water evaporates more slowly the lower the temperature (the further it is below 100°C), the gas in a chiller/freezer/fridge evaporates more slowly the closer it is to its boiling point. In my case, -26°C.

In other words, the lower the temperature, the more evaporation/condensing cycles my chiller has to do to expel the same heat _(less efficient the lower the temperature)_. Eventually (in my case at -10°C) the system can no longer expel enough watts of heat energy to get any lower.

*I need to upgrade/buy a new chiller which runs on a gas with a lower boiling point. Maybe R404A, which has a boiling point of -46°C.*

This will allow me to get much lower, *and reach my GTX 1080 goal of a stable 2303 MHz & +1000 memory.*

P.S. Alternatively I could look at converting an air-conditioning condenser unit, but at this stage in my research... I've barely got the slightest notion how I'd practically accomplish that. (I don't even have any buddies who work in HVAC.) I don't know if anyone with my level of knowledge has ever even managed such a project...

If anyone is able to contribute, or even knows where I might buy a non-industrial water chiller that runs on R404A, please let me know
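A rough way to see the trend described above, assuming ideal (Carnot) behaviour and a nominal 1/2 HP (~373 W) compressor (both assumptions for illustration; real chillers achieve only a fraction of the ideal COP, but the direction of the effect is the same):

```python
def carnot_cop(t_evap_c, t_cond_c=40.0):
    """Ideal (Carnot) coefficient of performance for a chiller.

    The colder the evaporator relative to the condenser, the less
    heat is moved per watt of compressor work.
    """
    t_evap = t_evap_c + 273.15  # evaporator temp in kelvin
    t_cond = t_cond_c + 273.15  # condenser temp in kelvin
    return t_evap / (t_cond - t_evap)

# 1/2 HP of shaft work is roughly 373 W (assumed figure).
compressor_w = 373.0

for t in (5, 0, -10, -20):
    cooling_w = compressor_w * carnot_cop(t)
    print(f"{t:>4} C evaporator: ideal cooling capacity ~ {cooling_w:.0f} W")
```

So even in the ideal case, cooling capacity falls steadily as the loop gets colder, which is why the system stalls once the removable heat no longer exceeds the load plus insulation losses.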


----------



## Dragonsyph

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes; they do indeed (very much so) as I'm running it with a full-cover block. So the subzero conditions extend to my memory. I can even monitor my memory temps by using temp probes I positioned *directly behind* the memory on the back of the PCB. ;-)
> 
> However in reality; due to only being able to reach -10 by seriously undervolting and idling my components. Then quickly resuming them at normal operating specs to initiate a benchmark. The lowest possible temperature at LOAD, I can hope to achieve using my current Water Chiller is 0c -> 2c. (GPU core Temp).
> And Memory load Temp about 10c - 15c (due to thermal pad).
> 
> *Update on the Water Chiller situation*
> 
> Done some research; & it appears the reason I can't get below -10c is due to the gas used in my Water Chiller. R134a. Which only has a boiling point of -26c.
> 
> The colder the chiller runs the closer the gas is to its minimum boiling point. And the less efficient.
> 
> You see; the 'temperature drop' or 'cooling' happens, when the gas evaporates in the evaporator (as it requires heat energy to change into a more energetic gas form).
> 
> In the same way that water evaporates slower, the lower the temperature (or closer to 100c), the gas in a chiller/freezer/fridge evaporates slower the closer it is to its boiling point. In my case. -26c.
> 
> In other words, my chiller has to do more eveporation/condensing cycles to expel the same heat; the lower the temperature. (Less efficient the lower the temperature). Eventually (in my case -10c) the system can no longer expel enough watts of heat energy to continue getting lower.
> 
> *I need to upgrade/buy a new Chiller, which runs off a gas with a lower boiling point. Maybe R404 which has a boiling point of -46c.*
> 
> This will allow me to get much lower. *And reach my GTX 1080 goal of stable 2303 MHZ & +1000 memory.*
> 
> P.S alternatively I could look at converting an air conditioning condenser unit; but at this stage in my research... I've barely got the slightest notion how I'd practically accomplish that. (I don't even have any buddies who work in HVAC). I don't know if anyone with my level of knowledge has ever even managed to achieve such a project...
> 
> Anyone who is able to contribute or if anyone even knows; where I might buy a non-industrial water chiller that runs off R404.. please let me know


Very impressive, mate. If you reach your goal, 2303 with a 12,000 memory clock would net some epic benchmark scores.


----------



## nikuk

Quote:


> Originally Posted by *pantsoftime*
> 
> Just a PSA about the Fuji Sarcon XR-m (17W/mK) pads that people bring up a lot in this thread. Those pads are fantastic, but are very sensitive to the amount they get compressed. They work best when they're compressed between 25% and 65%. When you overcompress or undercompress them their performance degrades significantly. If you want the best performance be sure to pick the proper thickness pad and don't overtighten your screws. To add insult to injury, there is something like a +/- 0.5mm tolerance on the sheets that Fuji sells. This can have a noticeable impact on your expected compression and thus thermal performance.
> 
> When using these pads for a significantly hot component it's usually best to buy spares. It won't matter much for parts with lowish power (under 10 watts) but if you're trying to use a pad like this on a super hot component then you'll want to dial it in precisely.


Thanks for this; I have these in a cart while I cross-shop.
I have a depth gauge and calipers that I can use, so I will double-check before ordering (guess I'll pull the card again, lol)... but my questions are:
How much of a performance drop are you taking when out of spec?
Does the 11 W/mK stuff have a wider working range?


----------



## fat4l

Quote:


> Originally Posted by *nikuk*
> 
> Thanks For this, I have these in a cart while I cross shop.
> I have a depth gauge and calipers that I can use and so I will double check before ordering (guess I'll pull the card again lol)... but my question is:
> How much of a performance drop are taking when out of spec?
> Does the 11w/mk stuff have better work spectrum?


It is good, yes. I wouldn't use it for the memory though, as you will gain nothing there, but for the VRMs, yes; use it, or even the 14 W/mK pads. I used the 17 W/mK ones...


----------



## nVIDIASLiRig

Feel good to join 1080 club!


----------



## DennyCorsa86




----------



## pantsoftime

Quote:


> Originally Posted by *nikuk*
> 
> How much of a performance drop are taking when out of spec?
> Does the 11w/mk stuff have better work spectrum?


When it's undercompressed, the performance drops off a cliff: at 15% compression you'll get roughly 20% of the rated conductivity. In my experience overcompression has been harder to quantify, but I've seen worse results than nominal (maybe a performance reduction of 10-20%).

I haven't used XR-Um, but it seems to have a wider compression range than XR-m and the same thermal conductivity. They also offer a variant with an aluminum layer incorporated.

I don't have any experience with the 14 W/mK stuff, so I can't provide any thoughts there.
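Since the sweet spot is a compression window rather than a single thickness, a tiny helper (hypothetical function names; the 25-65% window is from the post above) can sanity-check a pad choice against a measured gap before ordering:

```python
def compression_pct(pad_thickness_mm, gap_mm):
    """Percent compression of a thermal pad squeezed into a gap."""
    return 100.0 * (pad_thickness_mm - gap_mm) / pad_thickness_mm

def in_sweet_spot(pad_thickness_mm, gap_mm, lo=25.0, hi=65.0):
    """True if the pad lands inside the 25-65% compression window."""
    return lo <= compression_pct(pad_thickness_mm, gap_mm) <= hi

# A 1.0 mm pad in a 0.6 mm gap compresses 40% -- in spec.
print(in_sweet_spot(1.0, 0.6))
# A 1.5 mm pad in a 1.3 mm gap compresses only ~13% -- undercompressed.
print(in_sweet_spot(1.5, 1.3))
```

Remember the +/- 0.5 mm sheet tolerance mentioned earlier: it's worth measuring the actual pad as well as the gap.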


----------



## nrpeyton

Quote:


> Originally Posted by *OZrevhead*
> 
> I really like the whole chilled water idea, phase is great but its only for a cpu or a gpu where chilled water can cool everything you care to plumb to.
> 
> Your cpu should have really responded to the cold too, its not just chilling your gpu is it?


CPU & GPU. I've not done a lot of experimenting with the CPU yet (I still have that to look forward to) 

Quote:


> Originally Posted by *Spiriva*
> 
> Mine never drops below 2250mhz tho


You are very lucky then. You must be in the top 1% of cards that can do that at normal temperatures? You are just using regular water-cooling? What kind of temperatures are you running for that. What voltages? And are you using the 'curve' overclocking method (new to pascal)... or just a traditional core overclock, in the main window?
Quote:


> Originally Posted by *fat4l*
> 
> So once again, guys, what nvflash do we use(version) and what's the proper commands to flash that T4 bios, on Fe card, if you run just 1 card ?
> thanks


Don't use the certificate-bypassed version.

Use the regular version (but the one updated for Pascal).

The command is nvflash --overridesub [filename], without the []'s.

Disable the graphics driver in Device Manager first.

That is from memory /\ if it doesn't work, come back to me and I'll look up my notes.


----------



## nrpeyton

Double post sorry (error)

Microsoft Edge (is so damn buggy with overclock.net)


----------



## Spiriva

Quote:


> Originally Posted by *nrpeyton*
> 
> You are very lucky then. You must be in the top 1% of cards that can do that at normal temperatures? You are just using regular water-cooling? What kind of temperatures are you running for that. What voltages? And are you using the 'curve' overclocking method (new to pascal)... or just a traditional core overclock, in the main window?


I use regular water cooling: an EK block and 2x480 rads. I'm using the T4 BIOS, running at ~1.150 V during games; both cards max out at 41°C (at least now, in winter).








Using the curve overclock in MSI-AB.


----------



## Vellinious

Quote:


> Originally Posted by *Spiriva*
> 
> I use regular water cooling, block from EK, 2*480 rads. Im using the t4 bios, running at ~1.150v during games both cards max out at 41c (atleast now on the winter)
> 
> 
> 
> 
> 
> 
> 
> 
> Using the curve overclock in MSI-AB.


Yeah, the additional voltage is helping out a lot there. At stock voltage and those temps you were seeing what....2214 for benchmark runs, but 2202 or 2193 probably ran better?

I'd be curious to see a timespy run done at the 2250 with the 1.15v, and then a run done at 2193 @ 1.093v for comparison purposes. I know mine at 2193 with a +560 memory offset hits 8400-8500 graphics score at stock volts and running no more than 25c.


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> CPU & GPU, I've not done a lot of experimenting with CPU yet (I still have that to look forward to)
> You are very lucky then. You must be in the top 1% of cards that can do that at normal temperatures? You are just using regular water-cooling? What kind of temperatures are you running for that. What voltages? And are you using the 'curve' overclocking method (new to pascal)... or just a traditional core overclock, in the main window?
> Don't use the certificates bypassed version.
> 
> Use the regular version (but the one updated for Pascal).
> 
> The command is nvflash --overridesub [filename] without the []'s
> 
> Disable graphics driver in device manager first.
> 
> That is from memory /\ if it doesn't work come back to me and i'll look up my notes.


I need a link please, and the full commands.

Don't want to screw up my card.

So can someone tell me the exact commands to flash my 1080 FE with the T4 BIOS, and give a link to an nvflash version that works for that?
Thanks!


----------



## fat4l

Is this it ?

_"Then use these 3 commands WITH THE ADAPTER DISABLED IN DEVICE MANAGER:

nvflash -i0 --protectoff

nvflash -i0 -6 strix1080xoc_t4.rom

nvflash -i0 --protecton"_

Version 5.287.0 ??


----------



## GRABibus

Quote:


> Originally Posted by *Spiriva*
> 
> Just pull the graph from 1.2v (or whatever volt you wanna go for up to 1.2v) to whatever mhz you wanna go for, and it will use that volt.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you do it like this it will give you ~2200mhz @ 1.100v


I flashed my Gigabyte GTX 1080 Xtreme Gaming WATERFORCE 8G with the Strix T4 BIOS.
Works perfectly.

I am stable (no artefacts, no crashes) at 2202MHz/5500MHz at 1.1V in BF1, DOOM, OVERWATCH, TITANFALL 2.
I am also stable, with no crashes and no artefacts, in Fire Strike and Time Spy.

Max GPU temp is 50°C at 21°C ambient.

But I didn't tweak the V/F curve as you show. Why?

Because if I do it as you show here, meaning I only increase the frequency at 1.15V and above, then my Time Spy and Fire Strike scores remain the same whatever the frequency at 1.15V!
I mean: the same score with 2126MHz at 1.15V as with 2202MHz at the same 1.15V, if I tweak the curve as you do.

To increase Time Spy and Fire Strike scores, and also fps in games, as the frequency increases, I first apply an offset in MSI Afterburner (e.g. +210MHz).

=> Then the whole curve rises.
You can see I start above 1700MHz at 800mV; you are below 1600MHz at 800mV.

And then I tweak at high voltage to get my 2202MHz at 1.1V.

In this case, scores in Time Spy and Fire Strike increase a lot, and fps in games also.

Here is my curve:

http://www.casimages.com/img.php?i=17013109204717369814816598.png

As you can see, there is first an offset to get above the "stock" curve at all frequencies, here +210MHz.
And after applying this global +210MHz offset, I tweak to get 2202MHz at 1.1V.
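The two-step method above can be sketched as a toy calculation (the stock-curve numbers below are made up for illustration; real curves come from Afterburner's V/F editor):

```python
# Hypothetical V/F curve: voltage (V) -> boost clock (MHz).
stock_curve = {0.800: 1595, 0.900: 1835, 1.000: 2000, 1.050: 2050, 1.100: 2088}

def apply_offset(curve, offset_mhz):
    """Step 1 -- global core offset: every point of the curve rises together."""
    return {v: mhz + offset_mhz for v, mhz in curve.items()}

def flatten_above(curve, v_max, target_mhz):
    """Step 2 -- pin the clock at and above v_max to the target frequency."""
    return {v: (target_mhz if v >= v_max else mhz) for v, mhz in curve.items()}

# GRABibus's method: +210 MHz everywhere first, then 2202 MHz at 1.100 V.
curve = flatten_above(apply_offset(stock_curve, 210), 1.100, 2202)
```

The point of step 1 is that the mid-voltage points (which the card actually runs at under boost and power limits) rise too, instead of only the seldom-reached top of the curve.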


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> Is this it ?
> 
> _"Then use these 3 commands WITH THE ADAPTER DISABLED IN DEVICE MANAGER:
> 
> nvflash -i0 --protectoff
> 
> nvflash -i0 -6 strix1080xoc_t4.rom
> 
> nvflash -i0 --protecton"_
> 
> Version 5.287.0 ??


It's a single "-" if you're using the numbers or a double "--" if you're using the text command

I just flashed it to my card to double check.

I took screenshots as I went.

(You may need to use the --protectoff command before, as you mentioned) 

*This is the easiest way:* _(remember to disable display driver first & backup your original BIOS to a disc)_
1. Open folder where nvflash.exe is
2. Go to File & select 'open command prompt - administrator'
3. Command prompt will conveniently appear with directory already present

4. nvflash --protectoff _only required if your card's BIOS is write-protected by default_
5. nvflash --overridesub filename.rom
6. press y
7. press y again to confirm
8. reboot & check display driver is re-enabled
9. You still need 100% on the voltage slider + tweak the curve for the voltage/core offset you want. The last post above by GRABibus explains a very nice method 

-You can check the BIOS was successfully flashed because the power/temp sliders on MSI Afterburner will be disabled.
-The vendor will also be different in GPU-Z
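The numbered steps above, condensed into a command sketch (the backup filename is an assumption, the ROM name is from the attachment below, and this assumes you've renamed the 64-bit executable to `nvflash.exe` as noted later; flash at your own risk):

```shell
# Run from an administrator command prompt in the nvflash folder,
# with the display driver disabled in Device Manager first.
nvflash --save original_backup.rom          # back up the stock BIOS
nvflash --protectoff                        # only if the BIOS is write-protected
nvflash --overridesub strix1080xoc_t4.rom   # answer 'y' twice to confirm
```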

nvflash_5.319.0-win.zip 2819k .zip file


strix1080xoc_t4version2.zip 148k .zip file











*/\ running at 1.125v ;-)*

P.S. Remember to either disregard the 32-bit version and rename the 64-bit .exe to simply "nvflash.exe", or change the command I gave above to include "64".

Note: if your monitor is plugged into the 3rd DisplayPort it won't work under the STRIX BIOS, so change that before you flash. (People have mistakenly thought they'd bricked their cards due to this oversight.)

*For anyone reading this and suddenly deciding to flash this BIOS to your card: DON'T.* This was for fat4l. I won't be held responsible if you overheat and blow up your card by giving it too much voltage and drawing too much power.

Your card's temperature limits are also disabled with this BIOS, so your card won't throttle down to save itself if your fans/pump should fail.

A user on here with an FE recently melted his PCI-E power cable using this BIOS, as he was drawing more power than the cable could handle. The cable was probably the wrong gauge already, and using this BIOS just made the problem appear faster. But it's still something to be aware of.

Also, EVGA Classified 1080 owners should use the updated Classified Voltage Tool instead:

Classified1080voltagetool.zip 934k .zip file


----------



## GRABibus

Quote:


> Originally Posted by *nrpeyton*
> 
> It's a single "-" if you're using the numbers or a double "--" if you're using the text command
> 
> I just flashed it to my card to double check.
> 
> I took screenshots as I went.
> 
> (You may need to use the --protectoff command before, as you mentioned)
> 
> *This is the easiest way:* _(remember to disable display driver first & backup your original BIOS to a disc)_
> 1. Open folder where nvflash.exe is
> 2. Go to File & select 'open command prompt - administrator'
> 3. Command prompt will conveniently appear with directory already present
> 
> 4. nvflash --protectoff _only required if your cards BIOS is write protected by default_
> 5. nvflash --overridesub filename.rom
> 6. press y
> 7. press y again to confirm
> 8. reboot & check display driver is re-enabled
> 9. You still need 100% on the voltage slider + tweak the curve for the voltage/core offset you want. The last post above by GRABibus explains a very nice method
> 
> -You can check the BIOS was successfully flashed because the power/temp sliders on MSI Afterburner will be disabled.
> -The vendor will also be different in GPU-Z
> 
> nvflash_5.319.0-win.zip 2819k .zip file
> 
> 
> strix1080xoc_t4version2.zip 148k .zip file
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> */\ running at 1.125v ;-)*
> 
> P.S. remember to either disregard the 32bit version and rename the 64 bit .exe to simply "nvflash.exe" or change the command I gave above to include "64" .
> 
> Note: If your monitor is plugged into the 3rd display port it won't work under the STRIX, BIOS.. so change that before you flash. (people have mistakingly thought they'd bricked their cards due to this oversight)
> 
> *For anyone reading this and suddenly deciding to flash this BIOS to your card. DON'T*. This was for fat4l. I won't be held responsible if you overheat and blow up your card due to giving your card too much voltage and drawing too much power.
> 
> Your card's temperature limits are also disabled using this BIOS so your card won't throttle down to save it's self if your fans/pump should fail.
> 
> A user on here with a FE,recently melted the PCI-E power cable, using this BIOS, as he was drawing more power than the cable could handle. The cable was probably already the wrong gauge of cable and using this BIOS just made the problem happen faster. But still something to be aware of.


Thanks for your comment concerning my post









At idle, with the T4 Strix BIOS on my Gigabyte, the card's fan speed = 0 rpm.

Is that "OK"?


----------



## nrpeyton

Quote:


> Originally Posted by *GRABibus*
> 
> Thanks for your comment concerning my post
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At idle, with t4 strix Bios on my Gigabyte, fan speed of the card = 0rpm
> 
> Is it "ok" ?


At idle, many cards' fans are at 0 rpm (whether you have the STRIX BIOS or not). In actual fact, many of them "_report_" 0 RPM, as below a certain RPM no value is reported by _some_ fans, but often the fan is still spinning slowly, at up to ~15%.

Just monitor the temps; as long as the temp is fine it shouldn't be a problem. The airflow in your case (passing over the GPU heatsink) is often enough to keep the temperature in check while idling.

The 'power management mode' in the Nvidia control panel also has an impact. If you have it set to 'optimal power', most cards will idle at only 0.625 V.

Just be careful though.


----------



## GRABibus

Quote:


> Originally Posted by *nrpeyton*
> 
> at idle many cards fans are at 0rpm. (whether you have the STRIX, BIOS or not). In actual fact many of them "_report_" 0RPM, as below a certain RPM no value is reported from _some_ fans. But often its still spinning slowly at up to 15% ?
> 
> Just monitor the temps, as long as the temp is fine it shouldn't be a problem. The airflow in your case (passing over the GPU heatsink) is often enough to keep the temperature in check, while idling.
> 
> The 'power management mode' in the Nvidia control panel also has an impact. If you have it set to 'optimal power', most cards will only idle at 0.625v.
> 
> Just be careful though.


The fan of my GIGABYTE Xtreme Gaming Waterforce is not at 0 rpm at idle.
It is at 25%.

Concerning idle temp with fan = 0 rpm => 36°C at 20°C ambient.
That's OK.


----------



## jura11

Hi guys,

I currently have a GTX 1080 FE, and the max OC on my card is 2151MHz (a few days back 2181MHz was stable for a period of two renders) with 5505MHz on the VRAM. Temps are 36°C max under water; only once have I seen 38°C, with a slow fan profile on Aquaero/Aquasuite.

The strange thing is I'm unable to go beyond 1.05v on the core. What is the max voltage on an FE?

Previously I owned a Zotac GTX1080 AMP, with which I ran I think 2088MHz max on the core, and anything above +50MHz on the VRAM caused freezing.

I'm thinking of trying the T4 BIOS; what bothers me is that my FE only has one PCI-E power connector.

I'm running a custom water loop, so temps shouldn't be an issue.

Thanks, Jura


----------



## ucode

Max is whatever you make it; otherwise it's limited to 1.093V.


----------



## fat4l

Thanks @nrpeyton and @GRABibus !








Both + rep

I will try that later today and report back. My card can do 2202MHz at 1.094v as it is; it's under water and hitting around 35°C under load.

What voltage should I aim for on my FE (watercooled)? 1.125? 1.1? 1.15?

@GRABibus, when setting the offset first, should I try to find the maximum offset for my card, or just use a round number like +200MHz?
And basically, how does it work? I set the offset, let's say +200MHz, press apply, then open the frequency/voltage tab and change the curve (for example at 1.1v, set all the "dots" to be on one line?), then press apply again?

I tried this T4 BIOS some time ago, but back then I got low scores with the higher frequency... so basically what you just said:
Quote:


> But, i didn't tweak the V/F curve as you show. Why ?
> 
> Because if I do as you show here, means only increase frequency at 1.15V and above, then my Time Spy, Firestrike scores remain the same whatever the frequency at 1.15V !
> I mean : same score with 2126MHz at 1.15V and 2202MHz at same voltage 1.15V if I tweak the curve as you do.


----------



## Dragonsyph

My GPU is getting way hotter at 3440x1440 100Hz than at 4K 60Hz. Why??


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> My GPU is getting way hotter at 1440x3440 100hz vs 4k 60hz. ?????????????????????


Were you using vsync with 4K prior? I know my ACX 3.0 gets up to around 70°C, and 75°C in super-intensive titles.

Off-topic somewhat, but I meant to ask how you were enjoying that X34 with your 1080. I moved the 16:9 G-Sync panel over to my GF's computer and ended up going back to the X34. I couldn't go back to the noise of the Titan, however, so I've made my compromises







.


----------



## Dragonsyph

Quote:


> Originally Posted by *pez*
> 
> Were you using vsync with 4K prior? I know my ACX 3.0 is getting up to around 70C and then 75C in super intensive titles.
> 
> Off-topic somewhat, but I meant to ask how you were enjoying that x34 with your 1080. I moved the 16:9 Gsync panel over to my GFs computer and ended up going back to the x34. I couldn't go back to the noise of the Titan, however, so I've made my compromises
> 
> 
> 
> 
> 
> 
> 
> .


Yeah, I used vsync with 4K. So you think the extra 40Hz might be the cause of the heat?

I'm loving the X34; going from 60 to 100Hz was instantly noticeable. G-Sync is also a nice addition, and I like the on-screen FPS counter from the monitor. Every game I've tried so far has supported the ultrawide, so I've been having a lot of fun.

But yeah, with this X34 my GPU is getting about 10°C hotter.


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ya i used v sync with 4k, so you think the extra 40hz might be the cause of the heat?
> 
> I'm loving the x34, going from 60 to 100hz was instantly noticeable. And g sync is also a nice addition, i also like the on-screen FPS from the monitor. Every game iv tried so far supported the ultrawide, so iv been having alot of fun.
> 
> But ya with this x34 my GPU is getting about 10C hotter.


What actual temps are you seeing?

And yeah, if you were capping your FPS previously and hitting the cap quite a bit (i.e. staying at 60 FPS consistently), your GPU was most likely not sitting at 99%. 21:9 1440p is a middle point between 16:9 1440p and 4K, but trying to get 100Hz out of it can be just as taxing for the GPU, if not slightly more. I'm no expert on how to calculate that out exactly, but I have a decent feeling that that's what you're seeing here.
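A quick back-of-the-envelope check of that intuition, counting raw pixel throughput only (this ignores per-frame CPU and fixed costs, so it's a rough proxy for GPU load, not an exact one):

```python
def pixel_rate(width, height, hz):
    """Pixels per second the GPU must produce at a given refresh rate."""
    return width * height * hz

uw_100 = pixel_rate(3440, 1440, 100)  # 21:9 1440p at 100 Hz
uhd_60 = pixel_rate(3840, 2160, 60)   # 4K at 60 Hz

# The two are nearly identical in raw pixel throughput, so running the
# ultrawide uncapped at ~100 fps (99% GPU usage) versus 4K capped at 60
# can plausibly produce similar or higher heat output.
print(uw_100, uhd_60, uw_100 / uhd_60)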


----------



## Dragonsyph

Quote:


> Originally Posted by *pez*
> 
> What actual temps are you seeing?
> 
> And yeah, if you were capping your FPS previously and hitting the top end quite a bit (i.e. staying at 60FPS consistently) your GPU was most likely either not sitting at 99%. 21:9 1440p is a middle point of 1440p and 4K, but trying to get 100hz from it might be just as taxing for the GPU if not slightly more. I'm no expert on how to calculate that out exactly, but I have a decent feeling that that's what you're seeing here.


Went from around 42°C max to seeing up to 55°C.


----------



## adam07510

Why can't you tri-SLI the GTX 1080?!


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> Went from around 42c max to seeing up to 55c.


Is this on water or air?

Also, could you take a screenshot of the extended graph on MSI Afterburner?


----------



## GRABibus

Quote:


> Originally Posted by *fat4l*
> 
> Thanks @nrpeyton and @GRABibus !
> 
> 
> 
> 
> 
> 
> 
> 
> Both + rep


Thanks !









Quote:


> Originally Posted by *fat4l*
> 
> @GRABibus , when setting the offset first, should I try to find maximum offset for my card ? or just use some random one like...+200MHz ?
> And basically, how does it work? I set the offset, lets say +200MHz, press apply, then open frequency/voltage tab and change the curve(for example 1.1v, set all "dots" to be on one line?), then press apply again ?
> 
> I tried this T4 bios some time ago but that time, with higher frequency I had low scores ....so basically what you just said :


Before choosing the offset (which is applied to the whole voltage range, i.e. all points of the curve), I ran hours of Time Spy and gaming in order to find the best compromise between voltage and frequency, without game crashes, artefacts, etc.

And also, from reading what other people have experimented with, we know that 2202MHz at 1.1V is really one of the best results we can get.

By doing a lot of tests, I concluded that the higher the offset before tweaking the voltage and frequency point by point, the higher the Time Spy scores and FPS in games.

If you go back to my post #9232, the first curve is typically made with offset = 0 in MSI Afterburner for the GPU core.
The tweaking is done starting at 1.1V at 2202MHz.

If I set this curve, I get exactly the same score in Time Spy as if I had tweaked, for example, 1.05V at 2100MHz!

To get an FPS improvement as a function of GPU core frequency, the only way I found is to first set an offset in MSI Afterburner (or Precision X) for the GPU core frequency, so that all the points of the V/F curve rise together.

And then I tweak as on the first curve to get 2202MHz at a voltage higher than or equal to 1.1V.

You can see this method in my second curve of post #9232, which also gives 1.1V at 2202MHz, but the graphics score there in Time Spy is 8500 points: http://www.3dmark.com/3dm/17627544

With the first curve, I get only 7800!

And both give 2202MHz at 1.1V...
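To make that two-step recipe concrete, here's a toy model of a V/F curve. This is not MSI Afterburner's actual API, and the point values are made up for illustration; it just shows why a global offset followed by flattening at 1.1V/2202MHz differs from flattening a zero-offset curve: the points *below* 1.1V end up at higher clocks.

```python
# Toy (voltage, MHz) curve illustrating the offset-then-flatten method.
def apply_offset(curve, offset_mhz):
    """Raise every point of the V/F curve together (the global core offset)."""
    return [(v, f + offset_mhz) for v, f in curve]

def flatten_at(curve, v_target, f_target):
    """Pin v_target to f_target and cap every point at that frequency."""
    return [(v, f_target if v >= v_target else min(f, f_target)) for v, f in curve]

stock = [(0.90, 1900), (1.00, 2000), (1.05, 2050), (1.10, 2100)]

no_offset = flatten_at(stock, 1.10, 2202)
with_offset = flatten_at(apply_offset(stock, 150), 1.10, 2202)

# Both curves top out at 2202 MHz @ 1.10 V, but the offset curve clocks
# higher at the lower-voltage points the card actually spends time at:
print(no_offset[1], with_offset[1])  # (1.0, 2000) (1.0, 2150)
```

Since Boost 3.0 moves along the whole curve as power and temperature limits bite, raising the sub-1.1V points is what turns the same 2202MHz peak into a higher benchmark score.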


----------



## 6u4rdi4n

Pascal seems to be a mystical little creature


----------



## Vellinious

It's tough to figure out, that's for sure. Boost 3.0 changed the game entirely.


----------



## jprovido

Just saw a listing of used Founders Edition 1080s for $500 each (no tax). Pulled the trigger on two. I know the timing is bad with the 1080 Ti and Vega coming, but I couldn't help it.


----------



## KickAssCop

Never posted my build here so let me just spam it on this forum.


----------



## fat4l

Ok here is the progress.

With the new BIOS, I'm able to keep up with the scores compared to my stock BIOS.
2202MHz + 1.094V on stock and on Strix XOC v2 give me the same scores (I have a TDP hard mod; that's one of the reasons why).

Now I tried to push for the max at 1.15V: I could get 2240MHz, and the max temp during Fire Strike Extreme was 35C, while mostly it was 33-34C.

What do you think, guys? Should we push for 1.2V?
Regarding the cooling, I use liquid metal on the core + Fujipoly 17W/mK pads on the VRMs + a full-cover EK block, so temps should not be an issue at all.


----------



## DennyCorsa86

I am waiting to get out of the hospital to try my new build with these graphics cards under water.


----------



## Vellinious

Quote:


> Originally Posted by *fat4l*
> 
> Ok here is the progress.
> 
> With the new BIOS, I'm able to keep up with the scores compared to my stock BIOS.
> 2202MHz + 1.094V on stock and on Strix XOC v2 give me the same scores (I have a TDP hard mod; that's one of the reasons why).
> 
> Now I tried to push for the max at 1.15V: I could get 2240MHz, and the max temp during Fire Strike Extreme was 35C, while mostly it was 33-34C.
> 
> What do you think, guys? Should we push for 1.2V?
> Regarding the cooling, I use liquid metal on the core + Fujipoly 17W/mK pads on the VRMs + a full-cover EK block, so temps should not be an issue at all.


Where are you scoring higher? When the scores start to fall off, it pretty much means the GPU isn't running cool enough.


----------



## GRABibus

Quote:


> Originally Posted by *KickAssCop*
> 
> Never posted my build here so let me just spam it on this forum.


Nice rig !
But in your signature, you wrote : "i7-5930K at 4.6GWatts"


----------



## Vellinious

Quote:


> Originally Posted by *GRABibus*
> 
> Nice rig !
> But in your signature, you wrote : "i7-5930K at 4.6GWatts"


lol, I noticed that too, but I wasn't going to say anything.


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> Ok here is the progress.
> 
> With the new BIOS, I'm able to keep up with the scores compared to my stock BIOS.
> 2202MHz + 1.094V on stock and on Strix XOC v2 give me the same scores (I have a TDP hard mod; that's one of the reasons why).
> 
> Now I tried to push for the max at 1.15V: I could get 2240MHz, and the max temp during Fire Strike Extreme was 35C, while mostly it was 33-34C.
> 
> What do you think, guys? Should we push for 1.2V?
> Regarding the cooling, I use liquid metal on the core + Fujipoly 17W/mK pads on the VRMs + a full-cover EK block, so temps should not be an issue at all.


You can try it, but above 25C Pascal doesn't like anything over 1.2v.

So *at* 1.2v you might be okay... it depends on your card. Some cards will take it, most won't, but you might be in the lucky 25%?

You certainly won't do any immediate damage to the chip (just watch your VRM temps).

I think after 1.25v the *hard limit* kicks in... _(I can't remember.)_

I also heard it was a myth, so only one way to find out? What does the T4 max out at anyway, 1.2v or 1.25v?

Quote:


> Originally Posted by *DennyCorsa86*
> 
> I am waiting to get out of the hospital to try my new build with these graphics cards under water.


Very nice.

That's the first HOF 1080 I've seen on this thread... and this thread is BIG. lol.

Very interested to see what kind of clocks you get out of her.

I have an EVGA 1080 Classified (which has a similarly overkill VRM like your HOF).. hence my interest 

VRM looks beautiful  all those phases


----------



## Mago

I just got a 1080 HOF and I am using Xtreme Tuner Plus to OC it. I am quite new to OCing graphics cards, but here's what I did.
In the Heaven benchmark, with power target = 116 and temp target = 92, I got a stable +120 core and +805 on memory (out of the box with no adjustments it was 2038, settling at 2025 core and 5005 mem).

Max temp was 54C. HWMonitor showed 1.63v (did not manually adjust it, left it at 0.800 in Tuner), 97.22%.

Time Spy was a little different: same settings, but core +110 and mem +740.

I'm scared to mess with the core voltage and don't have the balls to mess with the BIOS at all.

I was wondering: should I manually adjust the volts to 1.1 and see how she bangs, and will I get a stable higher OC?

I'm a gamer, so a 24/7 OC of +75 core and +500 mem is good for me; I'm just curious to find the card's potential.

Any advice is welcome.

IMG_0455.JPG 1923k .JPG file

I hope the picture is not too big.


----------



## Mago

1.063 v sorry typo


----------



## Mago

Should be 1.063 v sorry typo


----------



## hertz9753

You just did a Triple Lindy.


----------



## Vellinious

Quote:


> Originally Posted by *Mago*
> 
> I just got a 1080 HOF and I am using Xtreme Tuner Plus to OC it. I am quite new to OCing graphics cards, but here's what I did.
> In the Heaven benchmark, with power target = 116 and temp target = 92, I got a stable +120 core and +805 on memory (out of the box with no adjustments it was 2038, settling at 2025 core and 5005 mem).
> 
> Max temp was 54C. HWMonitor showed 1.63v (did not manually adjust it, left it at 0.800 in Tuner), 97.22%.
> 
> Time Spy was a little different: same settings, but core +110 and mem +740.
> 
> I'm scared to mess with the core voltage and don't have the balls to mess with the BIOS at all.
> 
> I was wondering: should I manually adjust the volts to 1.1 and see how she bangs, and will I get a stable higher OC?
> 
> I'm a gamer, so a 24/7 OC of +75 core and +500 mem is good for me; I'm just curious to find the card's potential.
> 
> Any advice is welcome.
> 
> IMG_0455.JPG 1923k .JPG file
> 
> I hope the picture is not too big.


That's not bad. I believe the highest voltage you'll see is 1.093v. 54c is a little bit high for pursuing really high clocks, though.


----------



## DennyCorsa86

I will put the cards under water with a water chiller; I have the XOC BIOS and a voltage tool for these cards. If you look under the backplate of the 1080 HOF, there is a microswitch. That switch turns off the temperature protection and the overvolt protection, and increases the power limit, for running the card under LN2.


----------



## Vellinious

Cool. I wish you luck, and look forward to seeing your results.


----------



## ucode

@Mago FWIW I've run my FE 1080 at 1.2V on both air and water (water preferred) without problems, and I've run a 1050 Ti at over 1.3V on air, again without problems, other than in the 1050 Ti's case it made little difference to the OC. Just be sensible and start with light loads rather than full-on FurMark to get an idea of how it's going to behave.


----------



## Dragonsyph

When games use 8GB of VRAM with my card, it seems to skyrocket the temps lol. So I'm guessing the GPU and VRAM plate are producing more heat than the Hybrid can get rid of.


----------



## Derek1

Quote:


> Originally Posted by *Dragonsyph*
> 
> When games use 8gb of vram with my card it seems to skyrocket the temps lol. So im guessing the gpu and vram plate are producing more heat than the hybrid can get rid of.


You should read this.
http://www.gamersnexus.net/hwreviews/2582-evga-gtx-1080-ftw-hybrid-review-vs-sea-hawk-x?showall=1

I have talked about this before here, I believe. Yes, according to the article, the plate does inhibit cooling in their comparisons.
BUT in my case, leaving the plate on and using GC Extreme on the core with a push/pull setup kept my temps the same as or lower than the temps reported in the article when they tested the FTW without the plate.
Initially, after reading the article, my idea was to leave the plate off and put heatsinks on the VRAM so that the block would be dedicated to cooling the core only. But I figured (see the above sentence) that it would not produce a significant enough reduction in temps to keep my card under 40C; it seemed I might only gain about 5-8C, if that. HOWEVER, if I did do the heatsink mod, I felt the VRAM would not be cooled as effectively, because the fan was not powerful enough to draw heat from the sinks, thereby making the VRAM appreciably hotter and reducing my ability to OC the memory.

Bottom line is I didn't do the mod... yet. lol
I may go ahead and try it to see if I can squeeze some more out of the core while still maintaining a healthy mem OC.

But the mod would have to keep my core temps below 40C to stop the card throttling down at that 40C threshold, otherwise it makes no difference.


----------



## nrpeyton

Quote:


> Originally Posted by *DennyCorsa86*
> 
> I will put the cards under water and water chiller, i have the bios for xoc and voltage's tool for this cards. If You see under the backplate of 1080 HOF, there is one microswitch. That switch is for put off temperatures protection, overvolt protection and increment the power limit for use the cards under ln2


interesting. ;-)

I never knew the HOF 1080 had a voltage tool.

I wonder if it works properly with Boost 3.0, because the Classified 1080's voltage tool doesn't *appear* to. *However*, I still haven't been able to work out whether that's *only* due to not getting cold enough.

The Classy's voltage tool was done by the community, not EVGA.

If Galax is different and has done it themselves, hmm... interesting... very... *thinks about Boost 3.0 and the tool being done secretly by GALAX*







_I am of course only speculating..._

Either GALAX can afford to 'opt out' of their warranty with Nvidia for all HOF cards, or they have indeed done it in secret... or it's been done by the community (like the Classified's), in which case I'll be interested to see if it works with Boost 3.0.


----------



## Raisingx

I think my MSI 1080 Gaming X died after 6 months of use..

All of a sudden, while browsing the internet, my PC turned off and wouldn't turn back on. I decided to disconnect all cables besides the motherboard and CPU; the PC started up then, but it started to smell funny, so I quickly turned it off.

After removing the graphics card, I noticed the burnt smell comes from the rear end of it.

Tried my old AMD 5850 and it's working peachy; using it right now.

Should I try the GTX 1080 again? Did I cause the problem by starting up the computer during testing without having the 2 power connectors plugged into the GPU?

How good is MSI's RMA service?


----------



## nrpeyton

Quote:


> Originally Posted by *Raisingx*
> 
> I think my MSI 1080 Gaming X died after 6 months of use..
> 
> All of a sudden, while browsing the internet, my PC turned off and wouldn't turn back on. I decided to disconnect all cables besides the motherboard and CPU; the PC started up then, but it started to smell funny, so I quickly turned it off.
> 
> After removing the graphics card, I noticed the burnt smell comes from the rear end of it.
> 
> Tried my old AMD 5850 and it's working peachy; using it right now.
> 
> Should I try the GTX 1080 again? Did I cause the problem by starting up the computer during testing without having the 2 power connectors plugged into the GPU?
> 
> How good is MSI's RMA service?


I tried to start my PC up without the GPU power cables connected once, a long time ago (just to see what happened, _I was curious_). I got a message displayed on-screen telling me to connect the power cables. It never did any damage; I just wasn't able to get into Windows.


----------



## nikuk

Looking for input please. Asus 1080 OC edition.

I was previously running a single 140mm AIO, and I got her to run & validate @ 2202 core clock / +400 mem clock, with max temps of 56C in the Heaven benchmark. I was hitting the 1.093v voltage limit per GPU-Z. I lived with this OC for a few weeks with no problems.

Today I've just finished setting up & testing a full water loop. 1st custom loop.
_*I had to drop my memory OC to keep a 2200 MHZ core clock and I can't even get TO the voltage limit now...?*_
Max temp in Heaven benchmark of 38C. Hitting 1.062 voltage, not hitting ANY limiters per GPUZ. Same card, same 379.xx driver. Full EK waterblock & back plate. Fujipoly 11w/mk thermal pads installed per EK's instructions. Kryonaut thermal paste on the GPU as well as dabs beneath the thermal tape.

My CPU 7700K & BIOS are running back at stock right now, but otherwise all the hardware is the same other than cooling.

I don't understand.

I also noticed a high-pitched stuttering whine when I closed out of Heaven. I don't know if this was present before; the fans were a lot louder before.


----------



## Raisingx

Quote:


> Originally Posted by *nrpeyton*
> 
> I tried to start my PC up without the GPU power cables connected once, a long time ago (just to see what happened, _I was curious_). I got a message displayed on-screen telling me to connect the power cables. It never did any damage; I just wasn't able to get into Windows.


Yeah... if the card could blow up just from the power cables not being plugged in, there would be a giant warning about it. Guess it's RMA time; I hope I don't get a squealer.


----------



## KickAssCop

Quote:


> Originally Posted by *GRABibus*
> 
> Nice rig !
> But in your signature, you wrote : "i7-5930K at 4.6GWatts"


Quote:


> Originally Posted by *Vellinious*
> 
> lol, I noticed that too, but I wasn't going to say anything.


Because it is at 4.6 GIGAWATTS!


----------



## Derek1

Finally picked up a new CPU, a 4930K from a batch listed on HWBOT (3326B652: 4.5GHz Cinebench/Vantage @ 1.18v on H2O, 5.6GHz @ 1.65v on LN2, no CB/CBB), and have it OC'd to 4.5 at the moment.

Ran 1st FS @ 2152/11560, my new high overall score.



and a comparison with my 2 previous high scores.



I have already had the CPU validated to 4.7 and will be seeing how far it goes throughout the weekend.

Pushing on to 20K! lol

Will be posting new highs on the relevant threads.


----------



## Derek1

Well...that didn't take long. lol





http://www.3dmark.com/fs/11613431


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Well...that didn't take long. lol
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/11613431


Nice scores; have you noticed a big FPS difference in games too?


----------



## Derek1

Haven't played any yet.
Still tinkering and stabilizing the OC.

Damn, TS is giving me a Time Inconsistency error message. Not sure what that's about.
http://www.3dmark.com/spy/1165859


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Haven't played any yet.
> Still tinkering and stabillizing the OC.
> 
> Damn TS is giving me Time Inconsistency error message. Not sure what that is about.
> http://www.3dmark.com/spy/1165859


Seeing your scores up there with your new CPU reminds me of the upcoming Zen release and my plans for a new CPU... this kind of thing gets me excited lol ;-)

If you were on an old chip like me, with a 1080 connected to it, and you were as HUNGRY as me...
imagine how excited you'd be about Zen... _;-) <---that's me_

So excited, I think I'm going to fire the chiller up tonight and do a few benches just for the hell of it lol...

I'll see how close I can get to you, lol, on my old FX...

Which CPU is scoring highest on 3DMark at the moment anyway? _Or is it *simpler than that*_, just the i7 6950X?

P.S.
I see you broke through 8000 on Time Spy ;-)

The inconsistency errors can be caused by something as small as a background task.
Usually a reboot fixes it, for me.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Seeing your scores up there with your new CPU reminds me of the upcoming ZEN release and my plans for a new CPU... this kind of thing gets me excited lol ;-)
> 
> If you were on an old chip like me, with a 1080 connected to it and you were as HUNGRY as me..
> imagine how excited you'd be about ZEN.. _;-) <---thats me_
> 
> So excited, I think I'm going to fire the Chiller up tonight and do a few benches just for the hell of it lol...
> 
> I'll see how close I can get to you, lol on my old FX...
> 
> Which CPU is scoring highest on 3dmark at the moment anyway? _Or is it *more simple than that*_, just the i7 6950?
> 
> P.S.
> I see you broke through 8000 on timespy ;-)
> 
> The inconsistency errors can be caused by something as small as a background task.
> Usually a reboot fixes it, for me.


Ya, the TS score is not valid, but as you say it's probably something in the background, or the OC is still a little unstable.

Just moved up a few spots in the FS Extreme list too. You better hurry lol.

http://www.3dmark.com/fs/11614301

FS Ultra next. Broke 6K there.

http://www.3dmark.com/3dm/17794033

So far I think it is the 6950X; I'll have to look. Haven't seen any 7700s yet, but we'll have to wait for the release of the X versions, I suppose.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Ya, the TS score is not valid, but as you say it's probably something in the background, or the OC is still a little unstable.
> 
> Just moved up a few spots in the FS Extreme list too. You better hurry lol.
> 
> http://www.3dmark.com/fs/11614301
> 
> FS Ultra next. Broke 6K there.
> 
> http://www.3dmark.com/3dm/17794033
> 
> So far I think it is the 6950X; I'll have to look. Haven't seen any 7700s yet, but we'll have to wait for the release of the X versions, I suppose.


*everything at stock, lol... just for laughs * \/



*Next; I'll awake the overclocking beast in me, brb.*


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> *everything at stock, lol... just for laughs * \/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *Next; I'll awake the overclocking beast in me, brb.*


I was gonna add earlier, when you mentioned Ryzen, that you had better hope it doesn't turn out to be a RAISIN!
(Get some Intel in ya!!!) lmao


----------



## shilka

So after having put it off more times than I care for, I am finally going to upgrade to a GTX 1080.
The only thing I am worried about is the rumors I hear about the GTX 1080 Ti and a possible price drop for the GTX 1080.

I would be really pissed if I went out and bought a GTX 1080 just to have them drop in price.
So should I grab a GTX 1080 here in March like I plan, or wait and see?

As for which GTX 1080, I have settled on either the Gigabyte GTX 1080 G1 or the Gigabyte GTX 1080 Extreme Gaming / Gigabyte GTX 1080 Aorus.
The new EVGA GTX 1080 FTW2 also interests me, but I have yet to see any news of it.

What I would like to ask is: is the Extreme Gaming or the new Aorus worth spending so much extra on?

Looking at reviews, all I see is less than 5 FPS more for the Extreme Gaming, and here where I live it's $100 more for the Extreme, which I find outrageous.
What it comes down to then is noise and temps, and there the Extreme is a little bit better, but not by much.

I should also mention that I don't play as much as I used to, so what I will be using the card for most of the time is video editing.
Any thoughts and suggestions would be welcome, thanks.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> I was gonna add earlier, when you mentioned Ryzen, that you had better hope it doesn't turn out to be a RAISIN!
> (Get some Intel in ya!!!) lmao


lol....

*STOCK*
:

http://www.3dmark.com/3dm/17794690


*OVERCLOCKED*

http://www.3dmark.com/3dm/17797512


*COMPARISON*



25% improvement in overall score

42% improvement in physics score

Not a bad improvement, just for making something do more than it's designed to.









It also shows just how badly my poor old FX CPU is lagging behind. :-(

I had the poor thing overclocked at 5.3GHz @ 1.7v on water (stock 4.0GHz @ 1.28v). It's like feeding your granny speed and racing her up the hill; she's still never keeping up.


----------



## jprovido

Temps are surprisingly good with my teeny weeny s340 case


----------



## nrpeyton

Quote:


> Originally Posted by *jprovido*
> 
> Temps are surprisingly good with my teeny weeny s340 case


wow, very small.. also very 'tidy' looking  nice

like everything is perfectly in its place. 

what is that black thing hanging from the front?

And what temps are you getting?


----------



## jprovido

Quote:


> Originally Posted by *nrpeyton*
> 
> wow, very small.. also very 'tidy' looking  nice
> 
> like everything is perfectly in its place.
> 
> what is that black thing hanging from the front?
> 
> And what temps are you getting?


thanks! that's the oculus rift CV1









Temps were surprisingly good. With a custom fan curve (I wanted it to be as quiet as possible but with decent temps), I was getting 74 and 70 degrees Celsius from the two GPUs while running Heaven for like 20 minutes. I had a Strix 1080 before; sold it and got these two FE cards. I think it was the right decision. I didn't want to change my case; for some reason I really love the s340 lol


----------



## nrpeyton

Quote:


> Originally Posted by *jprovido*
> 
> thanks! that's the oculus rift CV1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Temps were surprisingly good. With a custom fan curve (I wanted it to be as quiet as possible but with decent temps), I was getting 74 and 70 degrees Celsius from the two GPUs while running Heaven for like 20 minutes. I had a Strix 1080 before; sold it and got these two FE cards. I think it was the right decision. I didn't want to change my case; for some reason I really love the s340 lol


Ahh I see, makes sense... yeah, those temps actually sound good for FEs on the stock cooler in such a small case.

As long as you're not throttling due to temps, you'll be A-okay.

Have you done any overclocking on them? I'm always curious how FEs overclock.

What is the VR experience like? I haven't had the pleasure yet. Do you *need* both of the cards for it?


----------



## jprovido

Quote:


> Originally Posted by *nrpeyton*
> 
> Ahh I see, makes sense... yeah, those temps actually sound good for FEs on the stock cooler in such a small case.
> 
> As long as you're not throttling due to temps, you'll be A-okay.
> 
> Have you done any overclocking on them? I'm always curious how FEs overclock.
> 
> What is the VR experience like? I haven't had the pleasure yet. Do you *need* both of the cards for it?


Just a modest 2000MHz OC and 11000MHz memory OC. I see it downclocking a little (1950-1970) when it gets warmer, but I guess that's normal because my case is small.

VR is awesome; I'm in love with my Rift + Touch. You're actually fine with a GTX 970 for VR: I had to wait a few days for my 1080s, and the 970 was good enough.


----------



## Derek1



Quote:


> Originally Posted by *nrpeyton*
> 
> lol....
> 
> *STOCK*
> :
> 
> http://www.3dmark.com/3dm/17794690
> 
> 
> *OVERCLOCKED*
> 
> http://www.3dmark.com/3dm/17797512
> 
> 
> *COMPARISON*
> 
> 
> 
> 25% improvement in overall score
> 
> 42% improvement in physics score
> 
> not a bad improvement, just for making something do more than its designed to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It also shows just how badly my poor old FX CPU is lagging behind. :-(
> 
> I had the poor thing overclocked at 5.3GHz @ 1.7v on water (stock 4.0GHz @ 1.28v). It's like feeding your granny speed and racing her up the hill; she's still never keeping up.


5.3 @ 1.7???








What were your temps?


----------



## emreonal69

Quote:


> Originally Posted by *Derek1*
> 
> Well...that didn't take long. lol
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/11613431


Is this score a little low for that speed?

Here is my GTX 1080 FTW @ 2113MHz core / 1451MHz mem @ 1.093v (stock slave BIOS).


----------



## Derek1



Quote:


> Originally Posted by *emreonal69*
> 
> Is this score a little low for that speed?
> 
> Here is my GTX 1080 FTW @ 2113MHz core / 1451MHz mem @ 1.093v (stock slave BIOS).






No idea.
What is the rest of your system?
Your sig rig seems outdated. Still running the same stuff? (Just noticed it's still the 3930K, but what is it clocked at?) RAM?
Variance is 3%. That means at those scores there is a swing of about 600 points either way.
What driver version are you running?
Are you on water or air?
My CPU OC still wasn't 100% stable either when I did that run.
Allllso, as Vellinious will be along to point out, with Pascal, higher clocks at warmer temps equal lower performance. lol
Could be any number of reasons.


----------



## nrpeyton

Quote:


> Originally Posted by *jprovido*
> 
> thanks! that's the oculus rift CV1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Temps were surprisingly good. With a custom fan curve (I wanted it to be as quiet as possible but with decent temps), I was getting 74 and 70 degrees Celsius from the two GPUs while running Heaven for like 20 minutes. I had a Strix 1080 before; sold it and got these two FE cards. I think it was the right decision. I didn't want to change my case; for some reason I really love the s340 lol


If you like fiddling around with things, you could try dropping the voltage and overclocking using the "curve" in MSI Afterburner.

Last night I had mine clocked at 2101MHz at 1.0v, and she was only drawing 160w in 'The Witcher 3' (everything ultra at 1440p).

Helps keep temps nice and low too 

There was only a 5C difference between my water temp (in the loop) and my GPU at load ;-)

Probably helps preserve the GPU too... at those volts it would last forever ;-)

Quote:


> Originally Posted by *Derek1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 5.3 @1.7???
> 
> 
> 
> 
> 
> 
> 
> 
> What were your temps?


CPU temp was only about 35c on the socket


----------



## jprovido

Quote:


> Originally Posted by *nrpeyton*
> 
> If you like fiddling around with things; you could try dropping the voltage and overclocking using the "curve" in MSI Afterburner.
> 
> Last night I had mine clocked to 2101MHZ at 1.0v and she was only drawing 160w on 'The Witcher 3' (everything ultra at 1440p).
> 
> Helps keep temps nice and low too
> 
> There was only a 5c difference between my Water Temp (in water loop) and my GPU at load ;-)
> 
> Probably helps to preserve GPU too.. at those volts it would last forever ;-)
> CPU temp was only about 35c on the socket


You can undervolt the GPU? I didn't know / didn't really feel the need to try until now.

I actually have a conservative OC right now: stock voltage, 2000MHz core, 11000MHz memory, with a custom fan curve. Rock solid stable, no downclocking.

Do you think I can get away with the same overclock at a lower voltage? I'd like the lower temps, of course.









MSI Afterburner doesn't seem to have an undervolt feature; I'm guessing it's in EVGA Precision? I will install it now.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *jprovido*
> 
> you can downclock the gpu I didn't know/didn't really felt the need to try until now.
> 
> I actually have a conservative OC right now. stock voltage 2000mhz 11000mhz memory with a custom fan curve rock solid stable no downclocking.
> 
> you think I can get away with the same overclock with lower voltage? I'd like the lower temps of course
> 
> 
> 
> 
> 
> 
> 
> 
> 
> msi afterburner doesn't have the undervolt feature I'm guessing it is with evga precision? I will install it now


CTRL + F brings up the curve tool in MSI Afterburner. I guess that's where people set the voltage.


----------



## jprovido

Quote:


> Originally Posted by *6u4rdi4n*
> 
> CTRL + F to bring up the curve tool in MSI Afterburner. I guess that's where people set the voltage.


i see it thanks!

edit:

nvm i figured it out


----------



## nrpeyton

Quote:


> Originally Posted by *jprovido*
> 
> you can downclock the gpu I didn't know/didn't really felt the need to try until now.
> 
> I actually have a conservative OC right now. stock voltage 2000mhz 11000mhz memory with a custom fan curve rock solid stable no downclocking.
> 
> you think I can get away with the same overclock with lower voltage? I'd like the lower temps of course
> 
> 
> 
> 
> 
> 
> 
> 
> 
> msi afterburner doesn't have the undervolt feature I'm guessing it is with evga precision? I will install it now


----------



## jprovido

I can't get it to 1v at 2000MHz; am I doing something wrong?


----------



## nrpeyton

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> I can't get it to 1v at 2000MHz; am I doing something wrong?


That looks right; it's just so you can enjoy fiddling around with the settings a little and get used to the new 'curve' method ;-)

You don't necessarily need to run with it if it doesn't get you the results you want ;-)

Try a lower clock.

Bear in mind my GPU temperature is only 20-odd degrees C; if it were higher I might need to reduce the clock rate a bit (maybe 1900MHz, then tweak it from there) ;-)

Anything that's a plus on the core frequency at an undervolt is good ;-) It depends what you're doing too... I know people who 'fold', and who still want to prolong the life of their GPUs, find this useful: it gives more control over power usage while folding. It's a better solution than watching the core jump around ridiculously as poor Boost 3.0 desperately tries to compensate (which is what happens if you just lower the power target) ;-)

Setting it at 1.0v and still watching it stable at 1800MHz+ is nice to see... it lets you appreciate how beautifully power-efficient these chips are compared to their predecessors.

You're running SLI too, so you've got plenty of headroom to undervolt without losing performance on-screen ;-)

Lastly, at the other end of the spectrum, most people are able to achieve *higher* overclocks using this 'curve' method than by simply overclocking the core the traditional way in the main window (Boost 3.0 prefers this method).

;-)

P.S.
I've been invited to a folding event next week, so I'll definitely be utilising something like this myself.
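A rough sense of why the undervolt pays off: CMOS dynamic power scales roughly with frequency times voltage squared. Applying that approximation to the 160W / 1.0v / 2101MHz figure mentioned above, and taking 1.093v as a typical Pascal stock peak voltage for comparison (both the model and that comparison point are simplifying assumptions, not measurements):

```python
# Back-of-envelope dynamic power model: P ~ k * f * V^2.
# The 160 W @ 1.0 V / 2101 MHz point is from the post above; 1.093 V is
# assumed as a typical Pascal stock peak voltage for comparison.
def scaled_power(p_ref, f_ref, v_ref, f_new, v_new):
    """Scale a measured power draw to another frequency/voltage point."""
    return p_ref * (f_new / f_ref) * (v_new / v_ref) ** 2

same_clock_stock_volts = scaled_power(160, 2101, 1.0, 2101, 1.093)
print(f"{same_clock_stock_volts:.0f} W")  # ~191 W, i.e. ~30 W saved by the undervolt
```

That ~15% power reduction at the same clock is also why the undervolted card stays so close to water temperature under load.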


----------



## Vellinious

That's a pretty steep climb to 2100. Usually if you need the curve that aggressive to hit a specific clock, it's because your GPU is too warm to be running it, and you should probably back the clock off just a shade.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> That's a pretty steep climb to 2100. Usually if you need the curve that aggressive to hit a specific clock, it's because your GPU is too warm to be running it, and you should probably back the clock off just a shade.


Vellinious have you done any 'single' GPU 3dmark runs?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Vellinious have you done any 'single' GPU 3dmark runs?


Yeah. I'm hitting mid to low 25k graphics scores in FS, and just shy of 8.5k in Timespy. I put the wrong card in the primary slot last time I changed my motherboard out. I'll be tearing the GPU loop apart again in the next few weeks to reseat the GPU blocks and replace the thermal pads. I'm thinking about taking the backplates off too, see if it doesn't help with temps a little bit. When I'm done, I'll start doing some more single card runs.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah. I'm hitting mid to low 25k graphics scores in FS, and just shy of 8.5k in Timespy. I put the wrong card in the primary slot last time I changed my motherboard out. I'll be tearing the GPU loop apart again in the next few weeks to reseat the GPU blocks and replace the thermal pads. I'm thinking about taking the backplates off too, see if it doesn't help with temps a little bit. When I'm done, I'll start doing some more single card runs.


Excellent; Zen is only 4 weeks away now.

I've begun saving.

Not sure what route I'm going down yet.

The i7-7700K, for example, has about 20% better single-core performance than an i7-6950X, but the 6950X is obviously much faster when utilising all cores.

I'm only benching and gaming, so I'm not sure I'd benefit from 6950X performance (except in benches), and I don't think I can justify spending extra just to get higher scores in multi-core benchmarks when I'm never going to actually *use* the performance.

I am of course talking about the equivalent AMD CPUs... I believe each flagship Intel CPU (at each tier) will have an equivalent targeted AMD chip with Zen's release.

If Zen doesn't live up to the hype, it will still be the best time for me to buy, as Intel will still be forced to lower prices a bit whatever happens. So an Intel purchase is still possible.

Any advice?

And why did you change your mobo out?

P.S. I'm very interested to see if reseating your GPU blocks helps your water/GPU temp differential (I'll be waiting on that one with anticipation).

If I can offer some advice on that one too: _use a length of long, soft tubing to connect your GPU to your loop (temporarily) so you can keep pulling the block off to reseat it and swap different pad heights, without disconnecting or draining the loop._

_I got 20°C off the memory and VRMs by working out the best heights myself, instead of simply living with the instructions in the EK manual. Once you've got it perfect on one GPU, you can copy it for the next._

_I bought loads of different sizes for as little as $3 each for 100mm x 100mm on eBay. W/mK ratings over 3 W/mK didn't affect temps; compression/heights did!_ It was painstaking but well worth it in the end ;-)

_If you're leaving the backplate off like me, it will be easy to monitor memory and VRM temps from the back of the PCB (most components' metal sticks out the other side)._

https://www.gamegrin.com/hardware/element-gaming-temperature-fan-controller-review/


£25, including 4 temperature probes


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Excellent, ZEN is only 4 weeks away now.
> 
> I've begun saving
> 
> Not sure what route I'm going down yet.
> 
> i7 7700k for example has 20% better single-core performance than a i7 6950k
> 
> But the 6950k is obviously twice as fast when utilising all cores.
> 
> I'm only benching and gaming so I'm not sure I'd benefit from i7 6950k performance (except in benches) and I don't think I can justify spending extra just to get higher scores in multi-core benchmarks when I'm never going to actually *use* the performance.
> 
> I am of course talking about equivilant AMD CPU's.... I believe each flagship intel CPU (at each tier) will have an equivalnt targetted AMD chip. (with ZEN's release).
> 
> If ZEN doesn't live up to the hype then it will still be the best time for me to buy, as intel will still be forced to lower prices a bit, what ever happens. So an intel purchase is still possible.
> 
> Any advice?
> 
> And why did you change your mobo out?
> 
> P.S. I'm very interested to see if re-seating your GPU blocks helps your water / GPU temp differential. (i'll be waiting from you, on that one, with anticipation
> 
> If I can offer some advice on that one too, if I may: _Use a bit of longggg soft tubing to connect your GPU to your loop (temporarily) so you can pull off the block and re-seat to fiddle around with different pad heights. (without disconnecting loop/draining)
> 
> I got -20c off memory and VRM by working out the bests heights myself, in contrast to simply living with instructions in EK manual._
> 
> _Once you've got it perfect on one GPU you could copy it for the next.
> 
> I bought loads of different sizes for as little as $3 each for 100mm x 100mm.
> W/mk ratings didn't affect temps. compression/heights did!_
> It was painstaking but well worth it, in the end ;-)


Clock for clock, you won't see a 20% IPC increase from the 6950X to the 7700K...maybe 5%. MAYBE. The 7700K gets the boost in single-thread performance because it can clock higher, because it has fewer cores creating heat. Bout it.

Both of my loops are quite large...they each hold almost 3 liters of coolant, with a radiator the size of a small Honda. I'm really beginning to wonder if the backplate, while it looks awesome, is holding in some heat there.

I had to change out the mobo because the X99A-II I had went tits up. Replaced it with an ASRock Taichi, then when the X99A-II came back from ASUS, I went back to it, and sold the ASRock.

I'll be doing a Zen build in May, probably. I'm really looking forward to seeing how they perform in a normal user's hands. I'll probably keep the 6950X, just because.


----------



## ucode

Quote:


> Originally Posted by *Vellinious*
> 
> Clock for clock, you won't see 20% IPC increase on the 7700k to the 6950X...maybe 5%. MAYBE. The 7700 gets the boost in IPC, because it can clock higher, because it has less cores to create heat. Bout it


IPC should be measured per clock, not by frequency (it's Instructions Per Clock), but the meaning seems to have been bastardized somewhat, as have many other computing terms. So yes, there's not that much difference in true IPC. Comparatively, though, Broadwell is a terrible overclocker; otherwise one might get some of that single-thread performance back through stepped turbo bins that take advantage of idle cores.
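ucode's distinction can be shown with toy arithmetic: single-thread performance is IPC times clock, so a large "single-core" gap can exist even between chips with near-identical IPC. All numbers below are hypothetical, purely to separate the two factors:

```python
# Toy illustration of the IPC-vs-frequency point: "single-thread
# performance" conflates IPC (instructions per clock) with clock speed.
# All figures are invented for illustration.

def single_thread_perf(ipc, ghz):
    # Billions of instructions per second
    return ipc * ghz

ipc = 1.0
perf_high_clock = single_thread_perf(ipc * 1.05, 4.5)  # ~5% IPC edge, 4.5GHz
perf_low_clock = single_thread_perf(ipc, 3.7)          # same core, 3.7GHz

# The headline gap is mostly frequency, not IPC:
print(perf_high_clock / perf_low_clock)  # ~1.28
```

Run the same comparison at equal clocks and the gap collapses to the IPC difference alone (here, 5%), which is the point being made about the 7700K versus the 6950X.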


----------



## nrpeyton

I resent paying for onboard graphics on a CPU when I'll never use them.

I don't know how much of the price tag it accounts for, but it's one thing that has always made me turn away from Intel (6700K/7700K etc.).

Every time I've been close to buying an Intel chip, I read a review somewhere that mentions the onboard graphics (and then I remember why I decided against it last time).

And the bigger chips (without onboard graphics) are twice the price without twice the performance, and the price doesn't seem to reflect the fact that they don't have onboard graphics.

It's something that has always grated on me.

The space could be used for more raw CPU performance without hurting manufacturing costs, giving better price/performance for the end user.

Someone, please change my opinion on this...


----------



## ucode

Quote:


> Originally Posted by *nrpeyton*
> 
> And the bigger chips (without onboard graphics) are twice the price, without twice the performance and don't seem to reflect the fact they don't have the on-board graphics (in the price).
> 
> Its something thats always just really grinded at me.


Welcome to market segmentation, your concerns are shared by many.


----------



## emreonal69

Quote:


> Originally Posted by *Derek1*
> 
> 
> No idea.
> What is the rest of your system?
> Your sig rig seems outdated. Still running the same stuff? (Just noticed it was still the 3930K but what were clocked at?) Ram?
> Variance is 3%. That means at those scores there is a swing of 600 pts either way
> What driver version are you running?
> You on water or air?
> My cpu OC still wasn't 100% stable either when I did that run.
> Allllso, as Vellinious will be in to point out, with Pascal higher clocks with warmer temps equals lower performance. lol
> Could be any number of reasons.


The system is:
3930K @ 4.8GHz (SpeedStep off, HT on), 1.33V (BIOS) / 1.376V (load), fully stable, almost 4 years old
2400MHz CL9 4x4GB RAM
Rampage IV Extreme motherboard
1TB SSD
AX1200i PSU
9x140mm external radiator + 2x D5 pumps; every component has a water block
The GTX 1080 FTW has a universal block and a PCIe fan bracket for the memory and VRMs


----------



## fat4l

I was just doing some extensive testing of my card.

At around 2189MHz it is stable with around 1.094V.
However, 2202MHz is not game stable even with 1.15V?

Huh?

35°C during load, lol.

Also, what's weird is that when I was testing 2240MHz, I could do it with my CPU at stock 4.4GHz.
When I loaded my BIOS profile for 5.1GHz, I couldn't do 2240MHz anymore? lol


----------



## Derek1

Quote:


> Originally Posted by *emreonal69*
> 
> the system is :
> 3930k @4,8Ghz (ss off) , (ht on) 1,33v(bios) - 1,376v(load) (full stable , almost 4 years old)
> 2400mhz cl9 4x4 GB
> Rampage 4 ext MB.
> 1tb ssd
> AX1200i
> 9x140mm external radiator + 2x D5 pump + all component have water block.
> GTX1080 FTW have universal block and pci-ex fan holder for mems and vrms.


Well then, the difference is probably down to you having a custom loop and getting cooler temps.
Mine is a Hybrid conversion.
Cooler temps = better performance.

My temps during a FS run hover around 40-43C.
Yours will probably be much lower.


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nrpeyton*
> 
> Excellent, ZEN is only 4 weeks away now.
> 
> I've begun saving
> 
> Not sure what route I'm going down yet.
> 
> i7 7700k for example has 20% better single-core performance than a i7 6950k
> 
> But the 6950k is obviously twice as fast when utilising all cores.
> 
> I'm only benching and gaming so I'm not sure I'd benefit from i7 6950k performance (except in benches) and I don't think I can justify spending extra just to get higher scores in multi-core benchmarks when I'm never going to actually *use* the performance.
> 
> I am of course talking about equivilant AMD CPU's.... I believe each flagship intel CPU (at each tier) will have an equivalnt targetted AMD chip. (with ZEN's release).
> 
> If ZEN doesn't live up to the hype then it will still be the best time for me to buy, as intel will still be forced to lower prices a bit, what ever happens. So an intel purchase is still possible.
> 
> Any advice?
> 
> And why did you change your mobo out?
> 
> P.S. I'm very interested to see if re-seating your GPU blocks helps your water / GPU temp differential. (i'll be waiting from you, on that one, with anticipation
> 
> If I can offer some advice on that one too, if I may: _Use a bit of longggg soft tubing to connect your GPU to your loop (temporarily) so you can keep pulling the block off, to re-seat, & swapping different pad heights. (without disconnecting loop/draining)
> 
> I got -20c off memory and VRM by working out the bests heights myself, in contrast to simply living with instructions in EK manual._
> 
> _Once you've got it perfect on one GPU you could copy it for the next.
> 
> I bought loads of different sizes for as little as $3 each for 100mm x 100mm, on ebay
> W/mk ratings over 3 w/mk didn't affect temps. compression/heights did!_
> It was painstaking but well worth it, in the end ;-)
> 
> _If you're leaving the backplate off like me, it will be easy for you to monitor temps from the back of the PCB for memory & VRM (most components metal stick out the other side)_
> 
> https://www.gamegrin.com/hardware/element-gaming-temperature-fan-controller-review/
> 
> 
> £25 includes (4 temperature probes)






I am assuming you have seen this article about your Ryzens.

As always, consider the source; YMMV.

http://wccftech.com/amd-ryzen-am4-processor-family-leak-r7-1800x-flagship/

Apparently the top of the line will match the i7-6900K.

BTW, I had a quick look at the top 100 worldwide 3DMark scores and the 6950X figures prominently, as you mentioned.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I resent paying for onboard graphics on a CPU which I'll never use.
> 
> I don't know how much of the price tag is involved; but its one thing which has always made me turn away from intel. (6700k/7700k etc)
> 
> Every-time I've been close to buying an Intel chip, I read a review somewhere that mentions the on-board graphics (then I remember why I decided against it last time).
> 
> And the bigger chips (without onboard graphics) are twice the price, without twice the performance and don't seem to reflect the fact they don't have the on-board graphics (in the price).
> 
> Its something thats always just really grinded at me.
> 
> The space could be used for more raw CPU performance without hurting manufacturing costs, and better price/performance for the end user.
> 
> Someone, please change my opinion on this...


The 6-core processors, like the 5820K at the low end of the "extreme" series, are only a few bucks more expensive than the highest-end "consumer" processors.


----------



## emreonal69

Quote:


> Originally Posted by *Derek1*
> 
> Well then the difference is probably due to the fact you have a custom loop and are getting cooler temps.
> Mine is a Hybrid conversion.
> Cooler temps = better performance.
> 
> My temps during a FS run hover around 40-43C.
> Yours will probably be much lower.


No, I'm at about the same temps, even hotter than yours: my GTX 1080 runs around 40-45°C in the FS bench. In a 20-loop stability test it reaches about 48-49°C.


----------



## Derek1

Quote:


> Originally Posted by *emreonal69*
> 
> No, Im also about same temps, even hotter than yours, my GTX1080 temp around 40-45 on FS bench. On 20 loop stability test it reach about 48-49.


Really?
That seems high to me from what I have seen posted around from other people with full blocks and custom loops.
You should at least be under 40C I think.
High ambients? Are you in the Sahara? lol


----------



## nrpeyton

Quote:


> Originally Posted by *fat4l*
> 
> I was just doing some excessive testing of my card.
> 
> At around 2189 it is stable with around 1.094V.
> However 2202MHz is not game stable even with 1.15v ?
> 
> Huh ?
> 
> 35C during load lol
> 
> Also, whats weird is, then I was testing 2240 MHz, I could do this but my CPU was at stock 4.4G.
> When Ioaded my bios profile to 5.1G I couldnt do 2240MHz anymore ? lol


These cards don't like being over 1.094V at anything over 25°C.

You need to find a way to get another 10°C out of her.

That translates to roughly 20MHz more stability.

Or drop the voltage/clock speed two steps.

Also, how stable are you at 5.1GHz (Prime95)? If the card is already having to error-correct or drop frames when it's clocked at the bleeding edge, any instability *at all* from the CPU might be enough to tip it over the edge; if the CPU feeds a miscalculation into the GPU's error-correction path, it's going to crash.

Alternatively, it might just be that the GPU has less wait time between frames (at the nanosecond level), which means it has to work slightly harder... again, just enough to tip it into instability.

Just my theory 

...It also begs the question: I wonder if the Classy will clock as high when I upgrade from my old FX soon ;-) lol


----------



## emreonal69

Quote:


> Originally Posted by *Derek1*
> 
> Really?
> That seems high to me from what I have seen posted around from other people with full blocks and custom loops.
> You should at least be under 40C I think.
> High ambients? Are you in the Sahara? lol


The liquid temp at idle is about 28°C, and about 33°C under long-term load; flow rate is 2.9 LPM. But the GPU's universal block is at the end of the loop, so the hottest water flows through it. Ambient temp is about 24-25°C.


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> No, Im also about same temps, even hotter than yours, my GTX1080 temp around 40-45 on FS bench. On 20 loop stability test it reach about 48-49.


Agreed with the above; my GTX 1080 at a 2164MHz OC can go up to 36-38°C in benches. In a single Firestrike run temps are under that, usually 34-36°C.

I've only seen 40°C when folding or during a very long render (a 12-hour render).

How many radiators do you have, what GPU block are you using, and what's your ambient temperature?

Those temps are high.

Hope this helps

Thanks, Jura


----------



## Derek1

Quote:


> Originally Posted by *emreonal69*
> 
> the liquid temp on idle about 28 , long term load about 33 , flow rate is 2,9 LPM . but the gpu's universal block at end of the loop so the hottest water looping inside of it, ambient temp about 24-25.


Ah, so just one Rad then?
A 280mm? In push/pull?


----------



## nrpeyton

A GPU temp of 40°C at an ambient of 25°C isn't bad at all.

Difference between water and GPU = 10-12°C.

That only leaves about a 5°C margin for the water temperature to rise; you'd need a good amount of radiator space to hold water at ambient under full load.

If you have multiple components in the loop, the water also rises 1-2°C after each component (two components and that 5°C is spent already) before it reaches your radiator.
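The temperature budget above can be written as a quick sum: water temperature is ambient plus the radiator delta plus the heat picked up from each upstream component. Every delta here is an assumed round number taken from the discussion, not a measurement:

```python
# Rough bookkeeping of the loop-temperature argument:
# water temp at the GPU block = ambient + radiator delta + upstream pickup.
# All deltas are illustrative assumptions, not measured values.

def water_temp_at_gpu(ambient_c, radiator_delta_c, upstream_deltas_c):
    return ambient_c + radiator_delta_c + sum(upstream_deltas_c)

# Ambient 25C, 3C water-over-ambient at the radiator, CPU and mobo
# blocks each adding ~1.5C before the water reaches the GPU block:
t_water = water_temp_at_gpu(25.0, 3.0, [1.5, 1.5])
t_gpu = t_water + 11.0  # the ~10-12C block-to-core differential from the thread
print(t_water, t_gpu)   # 31.0 42.0
```

This is why a 40°C GPU at 25°C ambient is unremarkable on water: most of the headroom is consumed before the coolant ever reaches the GPU block.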


----------



## fat4l

Quote:


> Originally Posted by *nrpeyton*
> 
> Cards don't like being over 1.094v at anything over 25 degrees C.
> 
> You need to find a way to get another -10c out of her.
> 
> That translates to 20mhz more stability.
> 
> Or drop the voltage/clock speed 2 steps.
> 
> Also how stable are you at 5.1G (prime 95)? If the card is already having to error-correct or drop frames when its clocked at the 'bleeding edge', any instability *at all* from the CPU might just be enough to 'tip it over the edge'... if the CPU causes a miscalculation in the GPU's error-correction algorithm it's going to crash.
> 
> Alternatively; it might just be the GPU has less 'wait time' between frames (at the nano-second level) which means its got to work slightly harder.. again.. just enough to tip it over the edge into 'instability'.
> 
> Just my theory
> 
> ...Also begs the question, I wonder if the Classy will clock as high when I upgrade from my old FX soon ;-) lol


Well, I just checked and..... it's the same at 5.1GHz and 4.4GHz.

It's all weird...
Anyway, I think I'll just stay with 2190 and that's it.

The CPU is 8 hours Realbench stable and 1 hour OCCT large data set stable, and I've pretty much never had a BSOD or anything with it. 1.36V is what I use atm.


----------



## emreonal69

Quote:


> Originally Posted by *Derek1*
> 
> Ah, so just one Rad then?
> A 280mm? In push/pull?


1x Watercool MO-RA3 420 radiator (9x140mm), push only, with 140mm fans

here is my rig :


----------



## emreonal69

Quote:


> Originally Posted by *jura11*
> 
> Agree with above, my GTX1080 with 2164MHz OC can go in benches up to 36-38°C, in single Firestrike bench temps are under that, usually at 34-36°C
> 
> I've saw 40°C only when I do [email protected] or in very long rendering(12 hour render)
> 
> How many radiators do you have and what GPU blocks do you have and what ambient temperature do you have?
> 
> These temps are high there
> 
> Hope this helps
> 
> Thanks, Jura


I have 1 big external radiator (9x140mm) , and the gpu block is EK Supremacy nickel


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> I have 1 big external radiator (9x140mm) , and the gpu block is EK Supremacy nickel


Ahh, a MO-RA3 420. Are you cooling only the GPU, or the CPU as well?

Looks great, and your water temps are pretty awesome. A flow rate of 2.9 LPM is around 0.77 GPM, which is not bad; many people would say you want to be close to 1 GPM.

Not sure what pump you're running; have you tried running the pump faster?

Hope this helps

Thanks, Jura
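The 2.9 LPM to ~0.77 GPM figure above is just a litres-to-US-gallons conversion (1 US gallon = 3.785411784 L), which is easy to sanity-check:

```python
# Convert coolant flow from litres per minute to US gallons per minute.
US_GALLON_LITRES = 3.785411784

def lpm_to_gpm(lpm):
    return lpm / US_GALLON_LITRES

print(round(lpm_to_gpm(2.9), 2))  # 0.77
```

By the same conversion, the often-quoted 1 GPM target works out to roughly 3.8 LPM.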


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> CPU temp of 40c at an ambient of 25c isn't bad at all.
> 
> Difference between water and GPU = 10c - 12c
> 
> That's only leaving a 5c differential for water temp up/down. You'd have to have a good amount of radiator space to maintain an "at ambient" water temp at full load.
> 
> If you have multiple components in the loop its also going to rise 1c-2c after each component (2 components and that's your 5c spent already) before it reaches your radiator.


Even with a HUGE radiator, you won't see loop temps at ambient. Even with a MO-RA3 420 on each loop, I still see a delta of between 2-5°C. I do run silent fans, though... I could probably cut that down by going with better fans.


----------



## emreonal69

Quote:


> Originally Posted by *jura11*
> 
> Ahh MO-RA3 420mm,are you cooling only GPU or are you cooling CPU as well?
> 
> Looks great there and yours temps of water are pretty awesome and water flow of 2.9LPM is around 0.77GPM which is not bad,many people would say you want be close to 1GPM
> 
> Not sure what pump are you running, did you tried to run pump faster
> 
> Hope this helps
> 
> Thanks, Jura


It's cooling every part of the rig (MB chipset, MB VRM, CPU, RAM, GPU).

I have dual D5 pumps with an EK dual pump top (each pump set to speed level 4; max is 5, but it's a bit loud at max, so I prefer level 4 for the best performance/noise balance).
I also have 9x BitFenix Spectre Pro PWM 140mm fans (122 CFM / 2.8 mmH2O) on the radiator. They run at almost their lowest speed, where you can't hear any noise from them, and the PWM control is connected to the motherboard's CPU fan header, so if the CPU gets hot, the fans speed up.
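The temperature-driven fan behaviour described above (fans near minimum duty until the CPU warms up, then ramping) can be sketched as a simple linear fan curve. The breakpoints here are invented for illustration; the real control lives in the motherboard's fan-curve settings:

```python
# Sketch of a PWM fan curve: hold an idle duty cycle below a ramp-start
# temperature, ramp linearly to max duty by a ramp-end temperature.
# Breakpoints are hypothetical, not emreonal69's actual BIOS settings.

def fan_duty(cpu_temp_c, idle_duty=20, max_duty=100,
             ramp_start=45.0, ramp_end=75.0):
    if cpu_temp_c <= ramp_start:
        return idle_duty
    if cpu_temp_c >= ramp_end:
        return max_duty
    frac = (cpu_temp_c - ramp_start) / (ramp_end - ramp_start)
    return idle_duty + frac * (max_duty - idle_duty)

print(fan_duty(40), fan_duty(60), fan_duty(80))  # 20 60.0 100
```

Hysteresis (a few degrees of dead band on the way back down) is usually added on top of this so the fans don't oscillate around a breakpoint.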


----------



## Derek1

Quote:


> Originally Posted by *emreonal69*
> 
> I have 1 big external radiator (9x140mm) , and the gpu block is EK Supremacy nickel


Just getting back to your original question about the reason for the difference in our scores.
It's probably because your CPU is OC'd to 4.8. Mine was at 4.7 and not really stable. Also, my RAM (if that makes a difference) was at 2133 and yours at 2400.
That may account for it.
I just did a few runs at my now-stable CPU OC of 4.6; the highest score was 19732, and my Physics dropped to 16720 from 17K.
Also, I'm not on the latest driver version (still one behind, I believe), so that may make a difference as well.

ETA: it could also just be minor variation in chip quality on your FTW.


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> its cooling all parts of rig ( MB chipset , MB vrm , cpu , rams , gpu)
> 
> I have Dual D5 pump + EK Dual pump top (each pump set to 4. speed level ) max level is 5 but its a bit loud at max level so I prefer 4. level for optimum performance/noise balance.
> Also I have 9 x bitfenix spectre pro pwm 140mm fans (122cfm / 2,8mm/hg) (for radiator) and they running almost the lowest speed which you are not hear any noise from them and the pwm control connected to MB cpu socket, if cpu get hot the fans speed up.


That's a nice setup. I'm running a single EK DDC 3.2 PWM with an EK XE 360 and a Mayhems Havoc 240mm, an EK Supremacy CPU block, and two EK water blocks for the GTX 1080 and Titan X. To keep the delta at 5°C I need to run the fans almost at max, and those EK Vardar F3 1850RPM fans are quite loud; even then I usually end up at a 7-8°C water delta.

Because of this, I'll be redoing my loop: adding an extra 240mm radiator at the front, or switching to a bigger case.

In your case, have a look at an Aquaero for controlling the fans and pumps, and for fans look at the Thermalright TY-147SQ, the Phanteks PH-F140MP or XP, or possibly Noiseblocker; those are pretty good fans as well.

The BitFenix fans aren't bad, but you can get fans that push a lot more air at the same speed/noise.

I would have expected better temperatures from your setup.

Hope this helps

Thanks, Jura


----------



## jura11

Here is my Firestrike result at 2164MHz/+500Mhz on VRAM



Hope this helps

Thanks,Jura


----------



## Derek1

Quote:


> Originally Posted by *jura11*
> 
> Here is my Firestrike result at 2164MHz/+500Mhz on VRAM
> 
> 
> 
> Hope this helps
> 
> Thanks,Jura


And what are you running your CPU at?


----------



## jura11

Quote:


> Originally Posted by *Derek1*
> 
> And what are you running your CPU at?


Ahh forgot to add and sorry , I'm running i7-5820k at 4.5Ghz OC

Hope this helps

Thanks, Jura


----------



## Derek1

Quote:


> Originally Posted by *jura11*
> 
> Ahh forgot to add and sorry , I'm running i7-5820k at 4.5Ghz OC
> 
> Hope this helps
> 
> Thanks, Jura


So are you running SLI or not? You've got two cards in there, but I'm assuming you're not, given the low graphics score.


----------



## emreonal69

Quote:


> Originally Posted by *jura11*
> 
> Here is my Firestrike result at 2164MHz/+500Mhz on VRAM
> 
> 
> 
> Hope this helps
> 
> Thanks,Jura


Here's an interesting thing: if I lock the GPU clock well above my 24/7 stable clock, e.g. 2152MHz @ 1.093V, the score is lower than at 2113MHz (my 24/7 stable clock). The curve OC also seems less efficient than a normal OC. I'll bench my GPU at +100/+500 to compare with your score. (My GPU's stock boost is 2012MHz.)


----------



## jura11

Double post


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> there is a interesting thing, if I lock gpu clock to really higher than 7/24 stable clock, exp; 2152Mhz @1,093v the score lowered than 2113Mhz (7/24 stable clock) , also curved OC less efficient than normal OC, İ will bench my gpu at +100/+500 for compare your score. (my gpu stock boost is 2012mhz)


I'll try posting my curve OC; I'm running 1.093V or 1.094V as the max. Previously, without the curve, the max I could run was [email protected].

My stock boost is 1984MHz I think, but I'll check that.

Hope this helps

Thanks, Jura


----------



## Derek1

Quote:


> Originally Posted by *jura11*
> 
> You can't run as SLI Titan X Maxwell and GTX1080
> 
> This PC is used more for rendering than benching
> 
> My score not sure if its OK or not or low or high what I'm running, as I said, I do rather render than bench
> 
> Hope this helps
> 
> Thanks, Jura


Ya, I didn't think that was possible.

Thanks


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> there is a interesting thing, if I lock gpu clock to really higher than 7/24 stable clock, exp; 2152Mhz @1,093v the score lowered than 2113Mhz (7/24 stable clock) , also curved OC less efficient than normal OC, İ will bench my gpu at +100/+500 for compare your score. (my gpu stock boost is 2012mhz)


Here is my MSI AB Curve OC



Hope this helps

Thanks,Jura


----------



## emreonal69

Quote:


> Originally Posted by *jura11*
> 
> I will try post my curved OC,running
> 1.093v or 1.094v as max,previously without curved I could run max as [email protected]
> 
> My stock boost is 1984MHz I think,but I will check that
> 
> Hope this helps
> 
> Thanks, Jura


It's 2113MHz @ 1.093V and +500 mem.


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> its 2113 Mhz @1,093v and +500 mem.


Nice result there.

The result above is my best; I just couldn't break the 20k barrier.

I'm only running 2133MHz on my DDR4 and my CPU OC is 4.5GHz. I'll try later with a 4.6GHz OC to see whether that changes anything; no cache OC etc. (running only 32x).

At 2152MHz, my results are around the 19250 mark.

Hope this helps

Thanks, Jura


----------



## emreonal69

Quote:


> Originally Posted by *jura11*
> 
> Nice result there
> 
> This above result is my best,I just couldn't break 20k barrier
> 
> I'm running only 2133MHz on my RAM DDR4 and my OC on CPU is 4.5Ghz, I will try later on with 4.6Ghz OC if this does change or not, no cache OC etc(running only 32x)
> 
> With 2152Mhz on mine my results are around 19250 mark
> 
> Hope this helps
> 
> Thanks, Jura


If you take the Titan X out of your rig, the score may rise.


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> if you take out titan x from your rig, the score may raise


I can only disable the Titan X; physically removing it isn't worth it, as I'd need to redo the loop.

Hope this helps

Thanks, Jura


----------



## Vellinious

Quote:


> Originally Posted by *emreonal69*
> 
> if you take out titan x from your rig, the score may raise


Do you think having a 2nd card in the PC makes a difference on single card benchmark scores? I guess I would have never even considered that......

My single card runs score.....ok, I guess, but not what I expect. But, my SLI scores are really, very, very good. I might have to experiment with this, the next time I tear down my loop.


----------



## emreonal69

Quote:


> Originally Posted by *Vellinious*
> 
> Do you think having a 2nd card in the PC makes a difference on single card benchmark scores? I guess I would have never even considered that......
> 
> My single card runs score.....ok, I guess, but not what I expect. But, my SLI scores are really, very, very good. I might have to experiment with this, the next time I tear down my loop.


I've experienced that before: I had a GTX 780 Ti + GTX 750 Ti (PhysX) setup, and the 3DMark and Valley scores were lower than with a single card. I think it's related to PCIe bus bandwidth occupation.


----------



## Vellinious

Quote:


> Originally Posted by *emreonal69*
> 
> I have experience that before, I have GTX780 Ti + GTX750 Ti (PhysX) setup before, and 3DMark , Valley scores are lower than single card, I think its related to PCI-E bus bandwith occupation.


Even with the 2nd GPU disabled? Hmm......


----------



## jura11

Quote:


> Originally Posted by *Vellinious*
> 
> Do you think having a 2nd card in the PC makes a difference on single card benchmark scores? I guess I would have never even considered that......
> 
> My single card runs score.....ok, I guess, but not what I expect. But, my SLI scores are really, very, very good. I might have to experiment with this, the next time I tear down my loop.


I did a few benches a few weeks ago...

I removed the Titan X and ran only the GTX 1080 (this time a Zotac GTX 1080 AMP Edition); the difference between my best results with the Titan X in place and without it was minimal.

I've never seen big differences in any bench (I tried TimeSpy and Firestrike). I've also previously run Win10 and compared it to Win7; with Win10 I saw slightly better results.

Hope this helps

Thanks, Jura


----------



## jura11

Quote:


> Originally Posted by *emreonal69*
> 
> I have experience that before, I have GTX780 Ti + GTX750 Ti (PhysX) setup before, and 3DMark , Valley scores are lower than single card, I think its related to PCI-E bus bandwith occupation.


Not sure about that; I've tried it in the past and there were only small differences at most, nothing big.

I'll check my older results and post them later, but in those results my main GPU is the Titan X.

Hope this helps

Thanks, Jura


----------



## emreonal69

Quote:


> Originally Posted by *Vellinious*
> 
> Even with the 2nd GPU disabled? Hmm......


No, not disabled; it was always dedicated to PhysX.


----------



## 6u4rdi4n

Lots of good stuff in this thread!

Pascal is definitely sensitive to voltage. Tried using the OC Scanner in Precision XOC for lols, just to test the waters of V/F curve tweaking and kind of set a little guideline for my own card. At 1.093V, it didn't even want to set +10MHz, 1.081V no problem at all. After fiddling around, I managed to run my card at 2114MHz through fire strike, but it wasn't game stable.

Think I might have to lower my temps somehow, as I top out at 46°C. I have 1 EK XTX 120 and 1 EK XTX 360 rad with EK Vardar fans, an i7 7700K with a Swiftech Apogee HD block, and of course the GTX 1080 with an EK block. I've tried running my fans at max speed (~1800-1900rpm), but it doesn't seem to make a difference over 600-1000rpm.

Maybe I should try a different TIM for my graphics card? Currently using AS5.


----------



## hertz9753

I use Arctic MX-4 right now. Many other people use Thermal Grizzly.


----------



## jura11

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Lots of good stuff in this thread!
> 
> Pascal is definitely sensitive to voltage. Tried using the OC Scanner in Precision XOC for lols, just to test the waters of V/F curve tweaking and kind of set a little guideline for my own card. At 1.093V, it didn't even want to set +10MHz, 1.081V no problem at all. After fiddling around, I managed to run my card at 2114MHz through fire strike, but it wasn't game stable.
> 
> Think I might have to lower my temps somehow, as I top out at 46°C. I have 1 EK XTX 120 and 1 EK XTX 360 rad with EK Vardar fans, an i7 7700K with a Swiftech Apogee HD block, and of course the GTX 1080 with an EK block. I've tried running my fans at max speed (~1800-1900rpm), but it doesn't seem to make a difference over 600-1000rpm.
> 
> Maybe I should try a different TIM for my graphics card? Currently using AS5.


Not sure a better TIM will gain you better temps. I'm using Noctua NT-H1 on my cards and previously tried Kryonaut on the Titan X as well, but I wasn't very happy with the results and temps.

Regarding your temps, it all depends on ambient (room) temperature and then your water temperature; from those you can work out what your delta is.

Sometimes running faster fans won't lower temps and you simply need more rad space, but in your case you are running a 360mm and a 120mm, which should be OK.

Hope this helps

Thanks, Jura


----------



## 6u4rdi4n

Room temp is mostly pinned at 22°C according to my "old style" thermometer. Water temperature is between 29.9°C and 32°C under load, if my digital thermometer is correct. Flow should be sufficient, using a D5 pump; not at full speed, but enough to actually see some movement in the reservoir (Swiftech micro res).

Maybe it's time to go with a larger case and get more/larger rads.

I have my fans in a pull configuration, but push vs pull is a rather negligible difference? I could fit a push/pull configuration on the 120 rad, but the 360 is already just a few mm from hitting the top of the motherboard, so that one is a no-go.
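To make the troubleshooting split concrete, here's a quick Python sketch of the two deltas worth tracking in a loop (the temperatures below are illustrative example values, not readings from any particular rig): water-to-ambient responds to radiator area and airflow, core-to-water responds to the mount and TIM.

```python
# Split a loaded GPU temperature into the two deltas that matter in a loop.
# All temperatures in °C; the values below are illustrative examples.
ambient = 22.0      # room temperature
water_load = 31.0   # loop water temperature under load
gpu_load = 46.0     # GPU core temperature under load

water_delta = water_load - ambient    # reduced by more rad area / airflow
core_delta = gpu_load - water_load    # reduced by a better mount / TIM

print(f"water-to-ambient delta: {water_delta:.1f} C")
print(f"core-to-water delta: {core_delta:.1f} C")
```

With a small water delta and a large core delta, a remount or different TIM is the more promising fix than extra rads or faster fans.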


----------



## Dragonsyph

Weird how the hybrid cards with a tiny 120mm rad are running cooler than some of these custom loops.

Moving from +1000 on memory to +925 netted me a 26,501 graphics score. TimeSpy seems not to care, as I'm always around an 8500-8600 graphics score there. Watching the GPU utilization during TimeSpy showed low usage. Is my 4-core CPU the cause of this?


----------



## Vellinious

The magic memory again. lol


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Dragonsyph*
> 
> Weird how the hybrid cards with a tiny 120mm rad are running cooler than some of these custom loops.
> 
> Moving from +1000 on memory to +925 netted me a 26,501 graphics score. TimeSpy seems not to care, as I'm always around an 8500-8600 graphics score there. Watching the GPU utilization during TimeSpy showed low usage. Is my 4-core CPU the cause of this?


The hybrid cards only have water cooling for the gpu tho? Most, if not all, loops here have full cover blocks which cool vrms and vram as well.

I think this may be the cause, but I could be wrong.


----------



## Dragonsyph

Quote:


> Originally Posted by *6u4rdi4n*
> 
> The hybrid cards only have water cooling for the gpu tho? Most, if not all, loops here have full cover blocks which cool vrms and vram as well.
> 
> I think this may be the cause, but I could be wrong.


Ours cool the VRAM and the GPU, with the VRMs cooled by the fan. But yeah, no VRM cooling could be it. Still, a single 120mm does pretty well for a 1080.


----------



## Vellinious

Quote:


> Originally Posted by *6u4rdi4n*
> 
> The hybrid cards only have water cooling for the gpu tho? Most, if not all, loops here have full cover blocks which cool vrms and vram as well.
> 
> I think this may be the cause, but I could be wrong.


Doubtful. More likely it's just a bad mount, poor use of TIM, flow rates not optimized....could be a million different things.


----------



## Dragonsyph

What game have you all been playing lately? Looking for something good. Been playing Tomb Raider; the game maxes out my VRAM lol.


----------



## jura11

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Room temp is mostly pinned at 22°C according to my "old style" thermometer. Water temperature between 29.9°C and 32°C under load, if my digital thermometer is correct. Flow should be sufficient, using a D5 pump. Not at full speed, but enough to actually see some cruising in the reservoir (Swiftech micro res).
> 
> Maybe it's time to go with a larger case and get more/larger rads.
> 
> I have my fans in a pull configuration, but push vs pull is a rather negligible difference? I could fit a push/pull configuration on the 120 rad, but the 360 is already just a few mm from hitting the top of the motherboard, so that one is a no-go.


Hi there

From your reply it looks like your water delta is 8-10°C; I would think your flow will be good with a single D5 pump.

Regarding push/pull, that largely depends on the radiator's thickness and FPI.

I will be putting extra EK F3 1850RPM fans in push/pull on my 360 rad and will see if push/pull is worth it.

On an AIO it makes a negligible difference; on a custom loop it depends on more factors.

Hope this helps

Thanks, Jura


----------



## nikuk

Man this card is temperature sensitive. Asus Strix 1080 OC. I've got Heaven 4.0 running windowed for about 45 minutes now and the card has been great @ 2202 core using the Curve editor. Holding steady at 35c with the occasional blip to 36c.

I have started running an x264 stress test concurrently (testing the capacity of the new water loop) and tuning the SpeedFan profiles... if the card's temp levels out at 38c I'll start getting occasional purple stars in Heaven.

Ambient 22c
GPU in Heaven water temps 30c
GPU & CPU water temps 31c


----------



## Vellinious

Quote:


> Originally Posted by *nikuk*
> 
> Man this card is temperature sensitive. Asus Strix 1080 OC. I've got Heaven 4.0 running windowed for about 45 minutes now and the card has been great @ 2202 core using the Curve editor. Holding steady at 35c with the occasional blip to 36c.
> 
> I have started running an x264 stress test concurrently (testing the capacity of the new water loop) and tuning the SpeedFan profiles... if the card's temp levels out at 38c I'll start getting occasional purple stars in Heaven.
> 
> Ambient 22c
> GPU in Heaven water temps 30c
> GPU & CPU water temps 31c


Welcome to Pascal.


----------



## nikuk

LOL.

In the last 3 months I've gone from an ATX case with (7) 140mm fans to a cube case with (2) AIOs, now to a mid tower with a custom loop. ****in' Pascal.


----------



## swingarm

Folding temp on my Asus Strix 1080 is 66C; is that ok for an extended period of time?


----------



## Vellinious

Quote:


> Originally Posted by *swingarm*
> 
> Folding temp on Asus Strix 1080 at 66C, that ok for extended period of time?


Must be on air. Just monitor it. I wouldn't want mine running that warm for an extended period, but......that's why I watercool.


----------



## 6u4rdi4n

Would be fine. If the card burns out at only 66C, you have a perfectly good reason to RMA it.


----------



## nrpeyton

Which CPU scores higher in 3DMark:

A CPU with much better "single core" performance,
or
The CPU with "slightly better" overclocked multi-core performance?

Which would win between two i7s?

So for example, comparing the i7 4930K and the i7 7700K:

i7 7700k:
Faster "effective speed" (17%)
Faster Single Core (34%)
Faster Quad Core speed (24%)

But the
i7 4930k:
Faster multi-core speed (8%) _or 16% faster if overclocked._

The "faster than" percentages are relative to the other CPU (when compared head to head).

The 7700K looks like it would win hands down... yet when the 4930K has all its cores utilised, it's 8-16% faster. Which would yield a higher score in 3DMark?


----------



## KickAssCop

One with the highest overclock.


----------



## pez

Lol if your card burns out at 66C, something else happened.


----------



## Derek1

Quote:


> Originally Posted by *KickAssCop*
> 
> One with the highest overclock.


nrpeyton asked "What CPU scores the highest on 3dmark.

A CPU with much better "single core" performance.
or
The CPU with "slightly better" overclocked multi-core performance.

What would win, between two i7's?

So for example, comparing the I7 4930k and the i7 7700k

i7 7700k:
Faster "effective speed" (17%)
Faster Single Core (34%)
Faster Quad Core speed (24%)

But the
i7 4930k:
Faster multi-core speed (8%) or 16% faster if overclocked.

The "faster than" scores mean faster than the other CPU. (when vs. each other)

The 7700k looks like it would win hands down... yet when the 4930k has all its cores utilised its 8-16% faster. Which would yield a higher score in 3dmark?"

Not according to this.

http://www.overclock.net/t/1518806/fire-strike-ultra-top-30/1720#post_25825861

Look for my score and Preim's, 2 from the bottom of the page.

And a comparison between me and the next top 2 FS Ultra scores. Both 7700K were OC to 5.0.

http://www.3dmark.com/compare/fs/11614588/fs/11615304/fs/11470905#

Interestingly, only TXP's with a 7700K score higher than me in FSU.


----------



## Vellinious

More cores is usually better for the 3DMark CPU portions of the test.


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> More cores is usually better for the 3DMark CPU portions of the test.


Substantially, I might add.
My 4820K would score around 12K on Physics @ 4.7; this 4930K will break 17K @ 4.7. That's in FS. Overall scores went from 17,800 to over 20K.
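A quick sanity check on that jump, assuming the Physics test scales close to linearly with core count at equal clocks (an assumption for illustration, not something 3DMark guarantees):

```python
# Naive core-count scaling estimate for a 3DMark Physics score.
def scaled_score(base_score, base_cores, target_cores):
    """Upper-bound estimate assuming near-linear scaling with core count."""
    return base_score * target_cores / base_cores

# 4-core 4820K at ~12000 Physics -> predicted 6-core score at the same clock
predicted = scaled_score(12000, 4, 6)
print(predicted)  # 18000.0 predicted vs ~17000 observed
```

The observed ~17K sits a bit under the linear prediction, which is about what you'd expect once shared cache and memory bandwidth come into play.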


----------



## swingarm

The card is not burning out at 66C (it's been higher); I'm just concerned about it folding for an extended amount of time at that temp.


----------



## jura11

Quote:


> Originally Posted by *swingarm*
> 
> Card is not burning out at 66C(it's been higher), just concerned about it folding for an extended amount of time at that temp.


Hi there

I have rendered up to 12-14 hours per day on my Zotac GTX 1080 AMP, and then sold the card after a while when I went with a custom water loop, mainly because I couldn't get a good water block for it. Only an Alphacool block was available, and that block is not the best, plus it's very restrictive.

Hope this helps

Thanks, Jura


----------



## jura11

Forgot to say: my Zotac GTX 1080 AMP sat at a constant 76-78°C in rendering, while the Titan X sat at 42-45°C with a Raijintek Morpheus.

The renderers used were mainly LuxRender, Iray, or Cycles.

Hope this helps

Thanks, Jura


----------



## fat4l

I should maybe show you my scores... getting over 26K graphics in 3DMark Fire Strike.


----------



## Vellinious

Quote:


> Originally Posted by *fat4l*
> 
> I should maybe show you my scores ....getting over 26K in graphics in 3D FS


Clocks / driver?


----------



## zGunBLADEz

Quote:


> Originally Posted by *Vellinious*
> 
> Must be on air. Just monitor it. *I wouldn't want mine* running that warm for an extended period, but......that's why I watercool.


This is why people look at posts like this and get afraid..

66c on a GPU is perfectly fine, especially with all the throttling features on the 1080 XD

I mean not like my 30c fully loaded 1080 is burning itself up...


----------



## Vellinious

Quote:


> Originally Posted by *zGunBLADEz*
> 
> This is why people look at posts like this and get afraid..
> 
> 66c on a gpu is perfectly fine specially with all the throttling features on the 1080 XD
> 
> I mean not like my 30c fully loaded 1080 is burning itself up...


Afraid of what? Because I said I wouldn't want mine running that high for an extended period of time? If they're going to take that and turn it into the fear of god, that's their own fault. roflmao


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> nrpeyton asked "What CPU scores the highest on 3dmark.
> 
> A CPU with much better "single core" performance.
> or
> The CPU with "slightly better" overclocked multi-core performance.
> 
> What would win, between two i7's?
> 
> So for example, comparing the I7 4930k and the i7 7700k
> 
> i7 7700k:
> Faster "effective speed" (17%)
> Faster Single Core (34%)
> Faster Quad Core speed (24%)
> 
> But the
> i7 4930k:
> Faster multi-core speed (8%) or 16% faster if overclocked.
> 
> The "faster than" scores mean faster than the other CPU. (when vs. each other)
> 
> The 7700k looks like it would win hands down... yet when the 4930k has all its cores utilised its 8-16% faster. Which would yield a higher score in 3dmark?"
> 
> Not according to this.
> 
> http://www.overclock.net/t/1518806/fire-strike-ultra-top-30/1720#post_25825861
> 
> Look for my and Preim's score 2 from the bottom of the page.
> 
> And a comparison between me and the next top 2 FS Ultra scores. Both 7700K were OC to 5.0.
> 
> http://www.3dmark.com/compare/fs/11614588/fs/11615304/fs/11470905#
> 
> Interestingly, only TXP's with a 7700K score higher than me in FSU.


I see, makes perfect sense now.

3dmark utilises the extra cores quite nicely.

Your 4930K is 7% faster than a 7700K when properly utilising all of its 6 cores ;-) getting you better scores in 3DMark.

However, in raw gaming or software only utilising 1-4 cores, the 7700K would win.

Hmm.. so.. I need to decide what's more important to me... 3DMark scores or gaming FPS.

To be honest, I doubt any of the latest CPUs would bottleneck games, so I think I'll opt for more cores. ;-)

Thanks again for the info 

Argh, this complicates things now though, because the simple choice would have been to go with the 7700K (it's listed as the 3rd most powerful CPU on the 'CPU UserBenchmark' website, only topped by the ridiculously overpriced 6900K & 6950X).

BTW, I noticed on that website that the difference between you and a 7700K doubles after you overclock your CPU... to +15% in your favour ;-)

It also explains how my score scales so nicely when I overclock my CPU (as it's an 8-core), even if it's old.


----------



## ucode

I posted my mobile 4700MQ on 'CPU User Benchmark' some time ago and they removed it from the listings. http://cpu.userbenchmark.com/SpeedTest/149/IntelR-CoreTM-i7-4700MQ-CPU---240GHz#

The actual benchmark is still there: http://www.userbenchmark.com/UserRun/613476 It went from a listed best place of 76% to 105%. Sure, it was overclocked, but not on any exotic cooling. Moral of the story: don't rely on UserBenchmark to tell you how well a CPU can operate unless it's running stock.


----------



## kevindd992002

Which is the most practical GTX 1080 to buy if I'm considering slapping a waterblock onto it? Are the Gainward or Palit variants any good? They seem to be the cheapest of the bunch.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Which is the most practical GTX 1080 to buy if I'm considering slapping a waterblock onto it? Are the Gainward or Palit variants any good? They seem to be the cheapest of the bunch.


The Founders Edition / reference designs have the cheapest waterblocks.

Palit & gainward:
https://www.ekwb.com/news/new-ek-water-blocks-for-multiple-palit-and-gainward-graphics-cards/

Palit:
http://www.palit.com/palit/vgapro.php?id=2604&lang=en&pn=NEB1080015P2-1040J&tab=sp

So:
*GPU:* Palit GEF GTX 1080 8GB JETSTREAM *£574.93*
*Block:* EK-FC1080 GTX JetStream - Nickel *122.95€*

Founders Edition:
*GPU:* EVGA GeForce GTX 1080 Founders Edition *£522.06*
*Block:* *EK*-FC1080 GTX *€ 98.32*


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> Founders Edition / Reference designs feature the cheapest waterblocks
> 
> Palit & gainward:
> https://www.ekwb.com/news/new-ek-water-blocks-for-multiple-palit-and-gainward-graphics-cards/
> 
> Palit:
> http://www.palit.com/palit/vgapro.php?id=2604&lang=en&pn=NEB1080015P2-1040J&tab=sp
> 
> So:
> EK-FC1080 GTX JetStream - Nickel *122.95€*
> Palit GEF GTX 1080 8GB JETSTREAM *£574.93*


But don't the reference card designs have less OC potential compared to AIB cards? Or does it still boil down to silicon lottery?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> But don't the reference card designs have less OC potential compared to AIB cards? Or does it still boil down to silicon lottery?


That depends on a few things:

-Maximum O/C potential for the GPU's core on FE's has been reportedly good (even better than many partner boards) _<-- silicon lottery applies here_
-But it's let down massively by poor stock cooler. But that's irrelevant if you're slapping a water-block on it.

-The only other limiting factor is you're more power limited on the cheaper cards. For example a fully overclocked FE can't draw more than 216w.
-A fully overclocked EVGA Classified can draw up to 320w _(although you'd be lucky to ever draw that much in regular games)_

You could cross-flash a BIOS from another card for a higher power limit, or short out a resistor on the board yourself to bypass the power sensing.

You can use the BIOS database at www.techpowerup.com to find the max power draw of each card:
https://www.techpowerup.com/gpudb/

That being said, look at my power draw in The Witcher 3 while I was under-volting (notice I was still clocked at 2100MHz & only drawing 165w) with all settings maxed out at Ultra, 1440p.

Power draw in watts is in the top left corner of this picture:
http://cdn.overclock.net/0/0d/0dc00686_witcher3undervolting.jpeg


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> That depends on a few things:
> 
> -Maximum O/C potential for the GPU's core on FE's has been reportedly good (even better than many partner boards) _<-- silicon lottery applies here_
> -But it's let down massively by poor stock cooler. But that's irrelevant if you're slapping a water-block on it.
> 
> -The only other limiting factor is you're more power limited on the cheaper cards. For example a fully overclocked FE can't draw more than 216w.
> -A fully overclocked EVGA Classified can draw up to 320w _(although you'd be lucky to ever draw that much in regular games)_
> 
> You could cross-flash a BIOS from another card for a higher power limit, or short out a resistor on the board yourself to bypass the power sensing.
> 
> You can use the BIOS database at www.techpowerup.com to find the max power draw of each card:
> https://www.techpowerup.com/gpudb/


I see. I did read about cross-flashing although I haven't personally tried it with my 1070 yet.

So are you saying that even though the FE's are power limited, they can still outperform partner cards with higher limits? Aren't they also limited by the single 8-pin power cable? Does a higher clock speed, regardless of the power draw, automatically mean better performance for the same card model but among the different variants?

Oh, and would it be wiser if I were to wait until March if ever the 1080Ti is unveiled and the 1080's get a bit cheaper across the board?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> I see. I did read about cross-flashing although I haven't personally tried it with my 1070 yet.
> 
> So are you saying that even though the FE's are power limited, they can still outperform partner cards with higher limits? Aren't they also limited by the single 8-pin power cable? Does a higher clock speed, regardless of the power draw, automatically mean better performance for the same card model but among the different variants?


I've edited my last post 

A higher core clock speed = more FPS.

But the power draw in watts rises the higher the voltage. And higher voltages are required for higher GPU core clocks.
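To put rough numbers on that relationship: dynamic power scales roughly with frequency times voltage squared. Here's a quick sketch; the 216w figure is the FE power limit mentioned above, the voltages are hypothetical examples, and the scaling law is a ballpark approximation, not a measurement.

```python
# Rough dynamic-power scaling: P ≈ k * f * V^2.
# Ignores static leakage, memory and fan power, so treat the result
# as a ballpark estimate only.
def scale_power(p_ref, f_ref_mhz, v_ref, f_new_mhz, v_new):
    """Estimate a new power draw from a reference point via P ∝ f * V^2."""
    return p_ref * (f_new_mhz / f_ref_mhz) * (v_new / v_ref) ** 2

# Hypothetical example: a card drawing 216 W at 2100 MHz / 1.093 V,
# undervolted to 1.00 V at the same clock.
est = scale_power(216, 2100, 1.093, 2100, 1.000)
print(f"estimated draw: {est:.0f} W")  # roughly 181 W
```

That lines up loosely with the 165-175w figures people report when undervolting at 2100MHz, with the gap plausibly down to workload and the leakage term the sketch ignores.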


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> T
> -The only other limiting factor is you're more power limited on the cheaper cards. For example a fully overclocked FE can't draw more than 216w.
> -A fully overclocked EVGA Classified can draw up to 320w _(although you'd be lucky to ever draw that much in regular games)_


Doubtful anyone, short of maybe some guys on LN2, is even getting close to 300 watts. The power limit on the FTW at 130% is roughly 280 watts, and I've never seen any power limiting on mine with them running 2252. I would guess that most people will maybe, possibly see 260 or so, but not much more than that.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Doubtful anyone short of maybe some guys with LN2 even getting close to 300 watts. The power limit on the FTW at 130% is 280 watts, roughly, and I've never seen any power limiting on it with them running 2252. I would guess that most people will maybe, possibly see 260 or so, but not much more than that.


I agree.

And if you keep the volts nice and low (my card can do 2100MHz at 1.0v), it will only draw 165-175w at 1440p.

In that instance, you wouldn't even need to increase the power slider in your overclocking app.

For general gaming an FE is absolutely fine.. you've still got up to 216w to play with, and you can get the EK block + card for £625.

Or pay an extra 80-100 for a slightly better card and a bigger block, for more headroom (between 260-280w, as Vellinious pointed out).


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> I've edited my last post
> 
> A higher core clock speed = more FPS.
> 
> But the power draw in watts rises the higher the voltage. And higher voltages are required for higher GPU core clocks.


Ok, so are there even instances where the power limit of the FE's is reached? If so, what are those?

I also edited my post to add another question


----------



## Vellinious

Quote:


> Originally Posted by *kevindd992002*
> 
> Ok, so are there even instances where the power limit of the FE's is reached? If so, what are those?
> 
> I also edited my post to add another question


Yes, the FE's sometimes hit the power limit and throttle when overclocked. How high that overclock is, will depend on the GPU, the voltage you're running, and how cool you're keeping it.


----------



## nrpeyton

Aye, it basically comes down to this:

Save 100 bux and maybe power throttle a bit (losing a maximum of 10% performance), though that's only the worst case scenario; in real terms maybe 5-7%?

or

Spend an extra 100 bux and give yourself a bit of extra overclocking headroom: 7%, or 10% if you're really lucky in the silicon lottery and are able to actually utilize it.


----------



## nikuk

Basically, yes.

That's why I bought an Asus card.


----------



## kevindd992002

Ok, that makes sense.

But what then is the sense of cross-flashing to a BIOS that has a higher power limit if you are hardware-limited by the single 8-pin power cable in the FE's?

Would it be wiser if I were to wait until March if ever the 1080Ti is unveiled and the 1080's get a bit cheaper across the board?


----------



## Vellinious

Quote:


> Originally Posted by *kevindd992002*
> 
> Ok, that makes sense.
> 
> But what then is the sense of cross-flashing to a BIOS that has a higher power limit if you are hardware-limited by the single 8-pin power cable in the FE's?
> 
> Would it be wiser if I were to wait until March if ever the 1080Ti is unveiled and the 1080's get a bit cheaper across the board?


A single 8-pin can deliver 150-175 watts. With 75-85 coming from the PCIe slot, you're looking at roughly 255 watts max for a single 8-pin. IF the BIOS is set up right to allow for that, AND your PSU / PSU cables can deliver it. I'm not a big fan of pushing a single 8-pin past 150, but.....I've had single 6-pins pulling 150 each and never had any issues. /shrug

Unfortunately, with Pascal it's impossible to tell how the BIOS is set up for power draw from the 8/6-pin connectors. Sure wish someone could get the BIOS editor working.
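For reference, the in-spec budget math behind those numbers can be sketched like this, using the PCI-SIG nominal figures (75 W for the x16 slot, 75 W per 6-pin, 150 W per 8-pin); real cables often tolerate more, as noted above.

```python
# In-spec PCIe power budget for a graphics card, per PCI-SIG nominal values.
PCIE_SLOT_W = 75    # x16 slot
SIX_PIN_W = 75      # per 6-pin auxiliary connector
EIGHT_PIN_W = 150   # per 8-pin auxiliary connector

def board_budget(n_8pin=1, n_6pin=0):
    """Total in-spec power budget for the given connector layout."""
    return PCIE_SLOT_W + n_8pin * EIGHT_PIN_W + n_6pin * SIX_PIN_W

print(board_budget())          # 225 W: slot + one 8-pin (FE layout)
print(board_budget(n_8pin=2))  # 375 W: slot + two 8-pins
```

The ~255w figure above comes from the looser real-world tolerances (175w cable plus ~80w slot) rather than the nominal spec values used here.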


----------



## ucode

@kevindd992002
The 8-pin connector itself and its wiring should support 3 circuits, giving almost 300W of power, and a 6-pin connector a little over 200W. But be aware it's not just the connector that determines how much power can be drawn within operational specification.

If you can wait until March and if that's for sure when 1080Ti hits the streets then IMO wait.

@Vellinious
It's been possible to tell power limit settings from Pascal VBIOS for each Mini-Fit connector, the MB slot and total power limit since a long time ago.

http://www.overclock.net/t/1601329/gtx-1070-1080-titan-x-2nd-gen-bios-who-has-it/110#post_25319052


----------



## kevindd992002

Quote:


> Originally Posted by *ucode*
> 
> @kevindd992002
> The 8-pin connector itself and wire should support 3 circuits to give almost 300W of power. A 6-pin connector a little over 200W but be aware it's not just the connector that determines how much power can be drawn within operational specification.
> 
> If you can wait until March and if that's for sure when 1080Ti hits the streets then IMO wait.


I see. So technically, all Pascal cards could get away with just one 8-pin connector and not be power-limited by the connector itself. What then is the significance of the 150-175W power-draw figure for the 8-pin connector?

Yeah, I'd definitely wait. But of course, I should sell my GTX 1070 ASAP.


----------



## alpsie

Question

Has anyone done any in-game testing to see what FPS gains or FPS stability they get after overclocking the GPU?
I'm wondering if it's worth it compared to the temp increase etc.


----------



## ucode

I don't know, Kevin; a large safety margin perhaps, or accounting for low-quality products. How can some say a 6-pin connector with 2 circuits (4 pins) is 75W while an 8-pin with 3 circuits (6 pins) is 150W? The math doesn't appear to work.
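The mismatch here is easy to show with the per-circuit numbers (taking the 2-circuit / 3-circuit reading of the connectors as given above):

```python
# Per-circuit wattage implied by the nominal PCIe connector ratings.
def per_circuit(total_w, circuits):
    """Watts per +12V circuit for a connector's nominal rating."""
    return total_w / circuits

print(per_circuit(75, 2))   # 37.5 W per circuit for a 6-pin rated 75 W
print(per_circuit(150, 3))  # 50.0 W per circuit for an 8-pin rated 150 W
```

If the circuits were rated consistently, both figures would match; the difference is essentially the extra safety margin baked into the 6-pin rating.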


----------



## Derek1

Quote:


> Originally Posted by *kevindd992002*
> 
> I see. So technically, all Pascal cards can just go away with just using one 8-pin connector and not be power-limited by the connector itself. What is the significance of the 150-175W of power draw from the 8-pin connector?
> 
> Yeah, I'd definitely wait. But of course, I should sell my GTX 1070 ASAP.


The 1080Ti MIGHT be announced in March.
When they will be available for purchase could be April or longer.
And with demand? It may be another 2 months before you actually get one.
Still better to wait I suppose rather than going from 1070 to 1080 at this point.
And it's probably a good idea to sell now, as the market will be saturated with people wanting to get rid of their cards to upgrade.


----------



## kevindd992002

Quote:


> Originally Posted by *Derek1*
> 
> The 1080Ti MIGHT be announced in March.
> When they will be available for purchase could be April or longer.
> And with demand? It may be another 2 months before you actually get one.
> Still better to wait I suppose rather than going from 1070 to 1080 at this point.
> And probably a good idea to sell now as the market will be saturated with people wanting get rid of their cards to upgrade.


Right. Well, the reason why I want to wait for the 1080Ti would be a possible decrease in price of the 1080. If the 1080Ti gets released in March, when would the price of the 1080 possibly drop?


----------



## Derek1

Quote:


> Originally Posted by *kevindd992002*
> 
> Right. Well, the reason why I want to wait for the 1080Ti would be a possible decrease in price of the 1080. If the 1080Ti gets released in March, when would the price of the 1080 possibly drop?


I don't have enough experience with Nvidia release cycles to give you an answer to that 64K dollar question.
The Ti's placement price may mean only a minor price drop in the 1080. Everyone is guessing at this point on that.
Right now, my FTW is selling for 910 Can. The Titan XP is selling for 1400 Can.
Looks to me like there is room to put the Ti in the middle there somewhere without lowering the price of the 1080.


----------



## ondoy

any idea when version 2 of FTW or SC coming out ?


----------



## GRABibus

In case of 2-way SLI on my Deluxe II, I would use 2 GTX 1080 on slots PCIEX16_1 and PCIEX16_3.

http://www.casimages.com/img.php?i=17020911110917369814835902.png

Question: which HB SLI bridge should I use?
What dimensions should this HB SLI bridge have?


----------



## GRABibus

I found :

http://www.casimages.com/img.php?i=17020911300917369814835926.png

So a 2-slot, 60mm one.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Right. Well, the reason why I want to wait for the 1080Ti would be a possible decrease in price of the 1080. If the 1080Ti gets released in March, when would the price of the 1080 possibly drop?


I agree with Derek. I highly doubt you're looking at a significant price drop.

At best maybe 50 bux.

If we were talking about a half-price drop, like what is rumoured with the new Zen release (i7 6900K performance for HALF the price), then yes, by all means, you'd be insane not to wait!

But to wait another maybe 6 months.. for a 50-100 bux drop (if even that)... I wouldn't say it's worth the wait.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree, with Derek. I highly doubt your looking at a significant price drop.
> 
> At best maybe 50 bux.
> 
> If we were talking about a 1/2 price drop like what is rumoured with the new ZEN release (I7 6900K performance for HALF the price) then yes, by all means you'd be insane, not to wait!
> 
> But to wait another maybe, 6 months.. for 50-100 bux drop (if even that)... ...... I wouldn't say it's worth the wait.


Alright, that makes sense. Do you think it would be worth it to upgrade to a 1080 if I just play at 1080p, though?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Alright, that makes sense. Do you think it would be worth it to upgrade to a 1080 if I just play at 1080p, though?


Not necessarily, there's different ways to look at it.

In terms of performance, probably not.

However personally, I always like to keep my system up-to-date.

I sold two STRIX GTX 980s (in SLI) for a 1080. Performance is similar. However, in a year's time I can sell my 1080 and upgrade again (at a much lower cost to myself).

It's my way of 'spreading the cost' and being content that my system is up to date.

Also keeps avenues open to me if I do want to upgrade a monitor.

I game at 1440p usually, and sometimes I'll connect my 5m HDMI cable to my 4K television for 'The Witcher 3', which is beautiful.

Sometimes I love just walking around admiring the beauty of the game at that resolution.

I remember the first day I got the TV; it was almost a surreal experience.. the mind even tricks you into believing you're playing in 3D due to the intensity of the detail (which doesn't get any worse even if I sit within 2 feet of the screen) ;-)


----------



## nrpeyton

*IMPORTANT NEWS FOR EVGA CUSTOMERS:*

EVGA is now offering an 'upgrade' to the new ICX cooling for a fee of $100:

U.S. Customers:
http://www.evga.com/icxupgrade/

European Customers:
http://eu.evga.com/icxupgrade/


----------



## pez

My ACX 3.0 is performing so well I'm not sure this is really worth it. The only thing that makes the deal rather enticing is that you basically get a fresh step up period when doing this. Might be worth it considering the mysterious 1080 Ti might come out in that period of time.


----------



## looniam

just mind any WB compatibility with those new PCBs that have the temp sensors:


Spoiler: Warning:NSFW *naked* pcbs!



OLD FTW PCB


NEW FTW2 PCB


EKWB




They moved a fan header, which was replaced by the LED readout for the temp sensors on the shroud.

back to


----------



## Vellinious

Quote:


> Originally Posted by *looniam*
> 
> just mind any WB compatibility with those new PCBs that have the temp sensors:
> 
> 
> Spoiler: Warning:NSFW *naked* pcbs!
> 
> 
> 
> OLD FTW PCB
> 
> 
> NEW FTW2 PCB
> 
> 
> EKWB
> 
> 
> 
> 
> They moved a fan header, which was replaced by the LED readout for the temp sensors on the shroud.
> 
> back to


I wonder how high that is off the pcb.......


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> I wonder how high that is off the pcb.......


There is a pic of it here with the midplate on. You might be able to get a sense of it if you expand the pic out.


----------



## Dissolution187

I have a question. I have an EVGA 1080 SC and was wondering if +100 core and +350 memory boost is low; if so, is there any way to raise that, or am I screwed on the stock cooler?

Thanks


----------



## SiriusLeo

*Picked up my fourth GTX 1080 Founders Edition card yesterday and it looks like I hit the jackpot. Got it running at 2228 core (edited from the earlier stated 2200), stable with zero artifacts or crashing. It feels like it's at its limit but I'm still doing manual tweaks with EVGA Precision. I've also had A LOT of issues with coil whine on these Founders Edition cards. Two of the four were so bad I exchanged them, and one of the four was flat-out defective, showing signs of all-out failure after only three months of use.*



*
GPU and CPU setup in a Parallel configuration.*



*Running two AlphaCool XT45 X-Flow 360mm Rads With EK Vardar Fans and a EK-D5 Pump set to 80% speed. GPU temp never exceeds 41c.*


----------



## looniam

Quote:


> Originally Posted by *Vellinious*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> just mind any WB compatibility with those new PCBs that have the temp sensors:
> 
> 
> Spoiler: Warning:NSFW *naked* pcbs!
> 
> 
> 
> OLD FTW PCB
> 
> 
> NEW FTW2 PCB
> 
> 
> EKWB
> 
> 
> 
> 
> They moved a fan header, which was replaced by the LED readout for the temp sensors on the shroud.
> 
> back to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder how high that is off the pcb.......
Click to expand...

the other two headers have grooves in the acrylic, so it sits higher than the metal.


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> *Picked up my fourth GTX 1080 Founders Edition card yesterday and it looks like I hit the jackpot. Got it running at 2228 core (edited from the earlier stated 2200), stable with zero artifacts or crashing. It feels like it's at its limit but I'm still doing manual tweaks with EVGA Precision. I've also had A LOT of issues with coil whine on these Founders Edition cards. Two of the four were so bad I exchanged them, and one of the four was flat-out defective, showing signs of all-out failure after only three months of use.*
> 
> 
> 
> *
> GPU and CPU setup in a Parallel configuration.*
> 
> 
> 
> *Running two AlphaCool XT45 X-Flow 360mm Rads With EK Vardar Fans and a EK-D5 Pump set to 80% speed. GPU temp never exceeds 41c.*


It may be running at 2228, but it's not performing like 2200+ should. You should be seeing 8.4k or 8.5k graphics scores with that kind of core clock. Just because you can, doesn't mean you should. lol, I'd lower temps and try again.

Quote:


> Originally Posted by *looniam*
> 
> the other two headers have grooves in the acrylic, so it sits higher than the metal.


Yeah, it's the one under the block itself that worries me.


----------



## SiriusLeo

Quote:


> Originally Posted by *Vellinious*
> 
> It may be running at 2228, but it's not performing like 2200+ should. You should be seeing 8.4k or 8.5k graphics scores with that kind of core clock. Just because you can, doesn't mean you should. lol, I'd lower temps and try again.
> Yeah, it's the one under the block itself that worries me.


I get what you're saying. However, it's more of a proof of concept at the moment. As it sits right now I can benchmark stable at 2250MHz. It's stable running Valley, Heaven, Firestrike (Ultra and Extreme), Timespy and Witcher 3. Thus far I've only run into driver crashes. I've seen ZERO artifacts even all the way up to 2300MHz. That points much more towards a lack of power feeding the core than the core itself hitting its limit... and that's great news. I've had the card for 24hrs and it's looking VERY promising if I can feed the starved core more volts with a volt mod. If the core were bad and artifacting at lower clocks, then pumping more volts into it might do very little or nothing at all.

Yes, I can and have scored higher if I drop the core down to 2150MHz, but that's not the end goal. If I can mod it and actually hit 2300MHz+ without hitting a power limit, then I'd say that's a pretty good jackpot of a Founders Edition GTX 1080.


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> I get what you're saying. However, it's more of a proof of concept at the moment. As it sits right now I can stable benchmark at 2250mhz. It's stable running, Valley, Heaven, Firestrike, (Ultra and Extreme) Timespy and Witcher 3. Thus far, I've only ran into driver crashes. I've seen ZERO artifacts even all the way up to 2300mhz. That points much more towards lack of power feeding the core than the core itself hitting it's limit... and that's great news. I've had the card for 24hrs and it's looking VERY promising if I can feed the starved core more volts with a volt mod. If the core was bad and artifacting at lower clocks then pumping more volts into it may do very little or nothing at all.
> 
> Yes, I can and have scored higher if I drop the core down to 2150mhz, but that's not the end goal. If I can mod it and actually hit 2300mhz+ without hitting a power limit then I'd say that's a pretty good jackpot of a Founders Edition GTX1080.


You don't need more volts, you need cooler temps. But....do what ya like. rofl


----------



## SiriusLeo

Quote:


> Originally Posted by *Vellinious*
> 
> You don't need more volts, you need cooler temps. But....do what ya like. rofl


Most benchmark runs never exceed 37C. Once temps top out at 40C I'll let it cool down for a few minutes before adjusting for the next run. I understand that GPU Boost 2.0 favors lower temps, but it's not realistic to ask for lower than 37C unless you're running LN2 or chilling your water.

The reason for the lower scores is GPU Boost kicking in and tuning the core volts down because of the pathetic power limit on the Founders Edition cards. If I can remove that wall then my card will run a stable clock, and if that clock is, say, 2250-2300 (which I now know this core is capable of), then the scores will rise. If the core is set to 2150 then it'll usually finish the benchmark without variations, and scoring favors that over a bouncy, higher peak clock.


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> Most benchmark runs never exceed 37c. Once temps top out at 40c I'll let it cool down for a few minutes before adjusting for the next run. I understand that GPU Boost 2.0 favors lower temps but it's not realistic to ask for lower temps than 37c unless you're running LN2 or chilling your water.


Or opening a window. My best runs come at 2252 in SLI with peak core temps never exceeding 24c. Any higher, and the frames start dropping. Adding voltage will just create more heat, causing even more inefficiency in the core. You're working the problem back to front.

Like I said, though... to each his own. You'll learn, just like the rest of us, that adding extra voltage really won't do all that much for ya, whereas dropping temps will. Enjoy.


----------



## Dissolution187

I have a question. I have an EVGA 1080 SC and was wondering if +100 core and +350 memory boost is low; if so, is there any way to raise that, or am I screwed on the stock cooler?

Anyone? Thanks.


----------



## Vellinious

Quote:


> Originally Posted by *Dissolution187*
> 
> I have a question. I have a 1080 SC EVGA card and I was wondering if +100 core and +350 memory boost is low, and if so is there any possible way to raise that or am I screwed if on the stock cooler?
> 
> Anyone? Thanks.


Try to use the voltage / frequency curve tool instead of just setting an offset. Keep the voltage as low as possible for the core clock you're trying to reach, and create a custom fan curve to keep the temps as low as possible.


----------



## looniam

Quote:


> Originally Posted by *Vellinious*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> the other two headers have grooves in the acrylic so its higher than the metal.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, it's the one under the block itself that worries me.
Click to expand...

that's what I thought I pointed out: the acrylic is manufactured to accommodate the height of the fan headers (those divots, holes, pits or whatever), but the metal heatsink won't allow it since it sits closer to the card than the acrylic.


----------



## Vellinious

Quote:


> Originally Posted by *looniam*
> 
> that's what I thought I pointed out: the acrylic is manufactured to accommodate the height of the fan headers (those divots, holes, pits or whatever), but the metal heatsink won't allow it since it sits closer to the card than the acrylic.


How much is the question. Doesn't take much to do a little machining to gain some clearance.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> How much is the question. Doesn't take much to do a little machining to gain some clearance.


They haven't announced the Classified yet.

Which means they could still be designing the new PCB on the Classy. (I hope).

So I figured a pre-emptive strike was in order:

I emailed the director of customer relations at EVGA tonight regarding water-block compatibility and the struggle classy owners have already had.

I also added some fruit to the email regarding sales (and how within 2 weeks of EK beginning *OFFICIAL* support of 780TI block for 1080 Classy the block was out of stock on the EK site). I do believe they actually had to do another order at their factory to compensate.

I'm sure EVGA benefited too... many classy customers opted out due to lack of water block support in the beginning.

Anyhow I'll wait and see what happens 

Personally I think this is great.

We've all wanted VRM & memory temp monitoring for years.

Most companies would have left us hanging, but EVGA is giving us an upgrade route. (Can't imagine ASUS or any of the others doing something like that.)

People ought to be encouraging the guys at EVGA who fought for this (in terms of not leaving existing customers stranded), not moaning about it on the forums.
Yeah it's 80 bux... yeah, it could have been a bit cheaper... but it's not the end of the world.

The process has also had time to mature a bit, and FE cards are also going to be even less popular (two reasons why there's every chance we'll all end up with better-binned cards).

EVGA isn't charging any more for the upgrade than the money you would have lost to eBay devaluation. So it's a choice. Companies shouldn't have to explain themselves when they decide to improve a product line... the fact that EVGA is nice enough to let us upgrade is a bloody amazing benefit no other company would have offered.

We're getting a brand new card in a sealed box with all the extra bells and whistles for a measly 80 bux. For the tinkerers, overclockers and hardware enthusiasts, that's great news.

Imagine having to risk selling on eBay to get the benefits of the new card? I'd take the EVGA upgrade route any day


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> They haven't announced the Classified yet.
> 
> Which means they could still be designing the new PCB on the Classy. (I hope).
> 
> So I figured a pre-emptive strike was in order:
> 
> I emailed the director of customer relations at EVGA tonight regarding water-block compatibility and the struggle classy owners have already had.
> 
> I also added some fruit to the email regarding sales (and how within 2 weeks of EK beginning *OFFICIAL* support of 780TI block for 1080 Classy the block was out of stock on the EK site). I do believe they actually had to do another order at their factory to compensate.
> 
> I'm sure EVGA benefited too... many classy customers opted out due to lack of water block support in the beginning.
> 
> Anyhow I'll wait and see what happens
> 
> Personally I think this is great.
> 
> We've all wanted VRM & memory temp monitoring for years.
> 
> Most companies would have left us hanging, but EVGA are giving us an upgrade route. (Can't imagine ASUS or any of the others would have done something like that).
> 
> People ought to be encouraging the guys at EVGA who fought for this. Not moaning about it on the forums.
> 
> Yeah it's 80 bux... yeah, it could have been a bit cheaper... but it's not the end of the world.
> 
> The process has also had time to mature a bit, FE cards are also going to be even less popular (that's two reasons why there's every chance we'll all end up with better binned cards).


EVGA screwed the pooch this gen....there's no mistaking that, and certainly no excuse. "Sorry, we messed up the thermal pad issue, here's some free ones.....oh, and btw, we improved the air cooler, slapped a new name on it, called it the FTW2, killing your resale value, and....we changed just enough on the pcb that your waterblock won't fit on it....oh, and sorry, one more thing....upgrading to the new cooling solution will also cost you $99".

Paint all the rosy pictures you want to.....I just don't see it that way.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> EVGA screwed the pooch this gen....there's no mistaking that, and certainly no excuse. "Sorry, we messed up the heating pad issue, here's some free ones.....oh, and btw, we improved the air cooler, slapped a new name on it, called it the FTW2, killing your resale value, and....we changed just enough on the pcb that your waterblock won't fit on it....oh, and sorry, one more thing....upgrading to the new cooling solution will also cost you $99".
> 
> Paint all the rosy pictures you want to.....I just don't see it that way.


I agree 100% on the waterblock situation; they definitely could have had a bit more consideration for us in that respect.

However, they may have taken the view that cards under water wouldn't benefit anyway...?

Think of it this way then:
-Normally, to get something as exciting as sensors on the VRM and memory (a fundamental change we've all been begging for for years), you'd have to wait for the next generation of NVIDIA cards.

Instead.. we're getting it for 80 bux without waiting for it....?

Don't get me wrong, I hear what you're saying... but would you rather EVGA went bankrupt? It's a small company that only really deals in NVIDIA GPUs... this could have broken them. I'm surprised if they haven't had to borrow money this round in order to support upgrading all our GPUs for only 80 bux.

The press went mental at them over the thermal issue... it all got blown totally out of proportion and the problems were made out to be a lot worse than they are. EVGA did what they had to do to save their reputation and their company's future.

I like EVGA and what they have to offer (and how they contribute to the overclocking community) with releases such as EVBot, EPOWER and Kingpin edition cards....


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree 100% on the waterblock situation, they definitely could of had a bit consideration for us in that respect.
> 
> However they may of taken the opinion that cards under water wouldn't benefit anyway.....?
> 
> Think of it this way then:
> -Normally to get something as exciting as sensors on VRM and MEMORY (a fundamental change we've all been begging for for years).. you'd have to normally wait for the next generation of nvidia cards for that.
> 
> Instead.. we're getting it for 80 bux without waiting for it....?
> 
> Don't get me wrong, I hear what you're saying... but would you rather EVGA went bankrupt? It's a small company and only really does with nvidia GPU's.. this could of broken them.. i'm surprised if they haven't had to borrow money this round in order to support upgrading all our GPU's for only 80 bux.
> 
> The press went mental at them on this (the thermal issue).. and it all got blown totally out of proportion and the problems were made out to be a lot worse than they are. EVGA done what they had to do, to save their reputation and their companies future.
> 
> I like EVGA, and what they have to offer (and how they attempt to contribute to the overclocking community) with releases such as evbot, epower and Kingpin edition cards....


And they didn't take into consideration that it would kill any resale value the ACX cards had.

EVGA isn't going bankrupt...not even close. I doubt it even put a dent in their yearly earnings report. Hurt Q3, sure, but.....not the annual. It was barely a ding.

Their customer service has always been outstanding. They just really fell on the fail button with pascal.....they fell so hard on it, that they bounced and hit it twice.


----------



## ucode

Quote:


> Originally Posted by *SiriusLeo*
> 
> I get what you're saying. However, it's more of a proof of concept at the moment. As it sits right now I can stable benchmark at 2250mhz. It's stable running, Valley, Heaven, Firestrike, (Ultra and Extreme) Timespy and Witcher 3. Thus far, I've only ran into driver crashes. I've seen ZERO artifacts even all the way up to 2300mhz. That points much more towards lack of power feeding the core than the core itself hitting it's limit... and that's great news. I've had the card for 24hrs and it's looking VERY promising if I can feed the starved core more volts with a volt mod. If the core was bad and artifacting at lower clocks then pumping more volts into it may do very little or nothing at all.
> 
> Yes, I can and have scored higher if I drop the core down to 2150mhz, but that's not the end goal. If I can mod it and actually hit 2300mhz+ without hitting a power limit then I'd say that's a pretty good jackpot of a Founders Edition GTX1080.


I keep meaning to mod my own FE as well. Artifacts generally appear to be a result of unstable memory clocks; GPU core instability usually shows up as driver resets. If you want a feel for how the extra volts might work, and to remove the power limitation, you could try cross-flashing the Strix T4 VBIOS. Be aware, though, that power draws of over 300W @ 2200MHz / 1.2V will happen with FSU and the like, so keep an eye on things. Results seem to vary a lot from chip to chip, so YMMV. Also, clock for clock, the FE VBIOS appears better.
Quote:


> Originally Posted by *Vellinious*
> 
> Or opening a window.










Lol, reminded me of an old Norwegian overclocker I know. Sometimes I wish I could do that, but it would result in 30C+ temps and 80%+ humidity most of the time. :/


----------



## Dragonsyph

Almost 2300MHz core and you only score 8100 graphics? What's the point of running such a high core when it's netting you lower scores/fps in games?

Me at 2164 core hit about 8600 graphics score.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Almost 2300mhz core and you only score 8100 graphics? What's the point of using such high core when it's netting you lower scores/fps in games?
> 
> Me at 2164 core hit about 8600 graphics score.


Not everyone has magic memory.

8100 is low, though....


----------



## nrpeyton

Memory loves low temps.

Yields on GDDR5X memory have been very poor and costly for manufacturers, so we're very lucky to be getting the memory overclocking headroom we have been getting.


----------



## nrpeyton

@Vellinious

Did you re-seat your blocks etc. yet (like you were talking about last week)?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> @Vellinious
> 
> Did you re-seat your blocks etc. yet (like you were talking about last week)?


Not yet. I ended up on a business trip, and have three more scheduled for the next month and a half. Afraid I won't have time to do much of anything for a while.


----------



## ssgwright

what's the best bios for the 1080 FE card? I'm currently running the t4 at 2140


----------



## nrpeyton

Quote:


> Originally Posted by *ssgwright*
> 
> what's the best bios for the 1080 FE card? I'm currently running the t4 at 2140


You could try searching the GPU BIOS database at TechPowerUp for a 5-phase-VRM card with a higher power limit.

Just be careful: have your original BIOS backed up and a way to re-flash the card if it turns out to be incompatible.

There are guides available that allow you to do a "blind flash".

You could also continue with T4 BIOS and see how you get on with it. It has no temp limits, no power limits and allows voltage up to 1.2v.

I am NOT, however, "recommending" that you do it. You do so entirely at your own risk and after considerable research.

*INFO ON THE T4 BIOS BELOW (REVEAL THE SPOILER)*


Spoiler: Warning: Spoiler!



If you do decide to jump on the "T4 BIOS bandwagon" and flash it to your card, be careful:

- It has no temp or power limit.
- If your fans or pump failed, the card would melt down before it throttled.
- FE cards on air using this BIOS will have slower fan speeds.
- The third DisplayPort on your card won't work anymore (nothing to worry about, but don't let that trick you into thinking you've bricked your card).
- Don't use the 'certificates bypassed' version of nvflash to flash it (that seems back to front, I know, but there's a good reason).

strix1080xoc_t4version2.zip 148k .zip file


-Allows voltage increase up to 1.2v (stock locked limit is 1.093v, so a 107-millivolt max increase)
-Power limit removed
-Temp limit removed (warning!)
-Could be dangerous on cards with only one power connector and lower-quality cables. Max _"recommended"_ power draw for single-cable cards is 225 watts. <--- search the forum for more info/discussion on this

P.S.
-Many are getting higher benchmark scores using this BIOS without even using the increased voltage it offers. Two reasons for this: 1) the removed power limit, and 2) it is rumored to have 'tighter' memory timings.

-Not compatible with the EVGA 1080 Classified; it won't brick the card, you'll just get a lower score and the numbers won't report correctly.

-People on EVGA 1080 FTW have been getting good results, however 



-EVGA Classified owners should use the Classified 1080 Voltage Tool instead:

Classified1080voltagetool.zip 934k .zip file


Let us know how you get on


----------



## TK421

Hey guys, question. If you take a 200W TDP-limit 1080 and make it a 100W TDP-limit 1080, are you effectively getting 50% reduced performance out of that card?

@nrpeyton do you have a voltage tool and OC bios for the ZOTAC AMP/AMP Extreme cards? These cards don't have a TDP limit but have voltage restriction on 1.093v which is odd.


----------



## nrpeyton

Quote:


> Originally Posted by *TK421*
> 
> Hey guys, question. If you take a 200W TDP-limit 1080 and make it a 100W TDP-limit 1080, are you effectively getting 50% reduced performance out of that card?
> 
> @nrpeyton do you have a voltage tool and OC bios for the ZOTAC AMP/AMP Extreme cards? These cards don't have a TDP limit but have voltage restriction on 1.093v which is odd.


*Q1*
Correct; however, if you lower the voltage while maintaining the same clock rate (not absolutely the same clock rate, but to an extent), you could close the performance-loss margin slightly..

Or considerably, but only *IF* you lower temperatures a lot..

But, to answer your question in simple terms (for all intents and purposes), I'd have to say YES.

But it also depends on the gaming environment. In an older game that isn't graphically demanding, the GPU might only need to draw 100W anyway, so you wouldn't see a performance drop. But in any decent, modern game at Ultra? Yes, absolutely, you're correct 

*Q2.*
I'm afraid not, the only option is to try the T4 BIOS. But I am in no way personally recommending that you go ahead with that.

I've recommended it before and someone complained because their card was drawing so much power it literally melted the power cable. The card was fine. But, well, you get the picture 

Check out the spoiler in my last message for info on the T4 BIOS (for benefits/warnings). 

ZOTAC doesn't have any world-class LN2 sponsors (they're pretty much non-existent in the extreme cooling scene) as far as I know.

So for that card, your only options are physical hardware mods, or experimenting by cross-flashing BIOSes from other manufacturers until you find what works best.

Also, ZOTAC cards *do* have TDP limits:

ZOTAC AMP: 276 watts max
ZOTAC AMP EXTREME: 386 watts max

For comparison, a Founders Edition has a max power draw of 216W (fully overclocked with stock BIOS and no hardware mods).

Most games at 4k Ultra still won't draw more than about 250w.
Furmark will draw as much as you want it to. ;-)


----------



## Dissolution187

Quote:


> Originally Posted by *Vellinious*
> 
> Try to use the voltage / frequency curve tool instead of just setting an offset. Keep the voltage as low as possible for the core clock you're trying to reach, and create a custom fan curve to keep the temps as low as possible.


Can you inbox me how to do it? I only know how to use MSI AB.


----------



## Vellinious

Quote:


> Originally Posted by *Dissolution187*
> 
> Can you inbox me how to do it? I only know how to use MSI AB.


It's in MSI AB.


----------



## Vellinious

Quote:


> Originally Posted by *TK421*
> 
> Hey guys, question. If you take a 200W TDP-limit 1080 and make it a 100W TDP-limit 1080, are you effectively getting 50% reduced performance out of that card?
> 
> @nrpeyton do you have a voltage tool and OC bios for the ZOTAC AMP/AMP Extreme cards? These cards don't have a TDP limit but have voltage restriction on 1.093v which is odd.


TDP and power limit are different things; you're talking about the power limit. At a 100-watt power limit, you'd be greatly reducing its ability to run even at stock clocks. As for a percentage? Who knows.


----------



## looniam

Quote:


> Originally Posted by *TK421*
> 
> Hey guys, question. If you take a 200W TDP-limit 1080 and make it a 100W TDP-limit 1080, are you effectively getting 50% reduced performance out of that card?


short answer:
NO.

longer answer:
first of all, consider that both clock speed and voltage affect power consumption. Clock speed affects dynamic power roughly linearly and voltage roughly quadratically, so a -50% in clock speed or about a -29% in voltage would each give roughly a 50% power reduction. Now consider that clock speed and voltage affect each other: a lower clock speed also allows a lower voltage and vice versa, so you'd never be reducing only clock speed by 50% for a 50% decrease in power. And lastly, _performance doesn't scale linearly with clock speed_ (though I wouldn't doubt there's an efficiency range where it comes close), so even if you did lower the clock speed 50%, it wouldn't bring a 50% decrease in performance.
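As a rough sanity check of the scaling above, here's a toy model (not measured data; it assumes dynamic power goes as f·V², which is the usual CMOS approximation):

```python
# Toy model of GPU dynamic power: P ~ f * V^2 (clock linear, voltage quadratic).
def dynamic_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

base = dynamic_power(2000, 1.0)

# Halving the clock alone halves dynamic power in this model...
half_clock_ratio = dynamic_power(1000, 1.0) / base
print(half_clock_ratio)  # 0.5

# ...while cutting power 50% via voltage alone only needs ~29% less voltage,
# because of the squared term:
v_cut = 1 - 0.5 ** 0.5
print(round(v_cut, 3))  # 0.293
```

In practice the two move together (lower clocks allow lower voltage), which is exactly why a 100W cap doesn't translate to a straight 50% performance loss.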

imma going back to watch narcos and


----------



## ucode

Quote:


> Originally Posted by *TK421*
> 
> Hey guys, question. If you take a 200W TDP-limit 1080 and make it a 100W TDP-limit 1080, are you effectively getting 50% reduced performance out of that card?


Are you Mobius1?

As looniam says.

Example: an increase from 1600MHz @ 0.8V to 2200MHz @ 1.2V with the same graphics workload is a 37.5% increase in GPU clock but over a 3x increase in dynamic power.
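Plugging those numbers into the same f·V² dynamic-power approximation (a back-of-the-envelope check, not a measurement):

```python
# Dynamic power scales roughly with f * V^2.
def power_ratio(f1, v1, f2, v2):
    return (f2 / f1) * (v2 / v1) ** 2

clock_gain = 2200 / 1600 - 1                    # 37.5% more clock
power_gain = power_ratio(1600, 0.8, 2200, 1.2)  # 1.375 * 2.25
print(round(clock_gain * 100, 1))  # 37.5
print(round(power_gain, 2))        # 3.09 -> "over 3x" dynamic power
```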


----------



## TK421

Quote:


> Originally Posted by *nrpeyton*
> 
> *Q1*
> Correct, however if you lower the voltage while maintaining the same clock rate (not ABSOLUETELY the same clock rate; but to an extent) you could close the performance loss margin slightly..
> 
> OR CONSIDERABLY, but only *IF* by lower temperatures a lot..
> 
> But, to answer your question in simple terms (for all intensive purposes) I'd have to say YES.
> 
> But it also depends on the gaming environment. In an older game that isn't graphically demanding the GPU might only need to draw 100w anyway, so you wouldn't see a performance drop. But in any decent, modern game at Ultra. Yes, absolutely. You're correct
> 
> *Q2.*
> I'm afraid not, the only option is to try the T4 BIOS. But I am in no way personally recommending that you go ahead with that.
> 
> I've recommended it before and someone complained because their card was running so fast it literally melted the power cable. The card was fine. But, well, you get the picture
> 
> Check out the spoiler in my last message for info on the T4 BIOS (for benefits/warnings).
> 
> ZOTAC don't have any world-class LN2 sponsors (they're pretty much non-existant in the extreme cooling scene) as far as i know?
> 
> So for that card, your only option is going to be physical hardware mods, or experimenting by cross-flashing BIOS's from other manufacturers until you find something that works best.
> 
> Also, ZOTAC cards *do* have TDP limits:
> 
> ZOTAC AMP: 276 watts max
> ZOTAC AMP EXTREME: 386 watts max
> 
> For comparison a Founders Edition as a max power draw of: 216W (fully overclocked with stock BIOS and no hardware mods)
> 
> Most games at 4k Ultra still won't draw more than about 250w.
> Furmark will draw as much as you want it to. ;-)


Hmm, that's interesting; I never saw the card hit a power limit in GPU-Z.

But oh well, I guess 276W is plenty at 1.093v

By any chance, what card is that T4 BIOS based on? The Strix 1080?


----------



## nrpeyton

Quote:


> Originally Posted by *TK421*
> 
> Hmm that's interesting, never saw the card have power limit on gpuz.
> 
> But oh well, I guess 276w is plenty to have on 1.093v
> 
> By any chance what card is that t4 bios based on? Strix 1080?


Yes, it's based on the ASUS STRIX.

At 1.093v in normal gaming environments, yes, absolutely, it's 'more than enough' 

You'd be very hard pressed to hit 276W in any game consistently... you might SPIKE to 276W for split seconds... in fact I believe there are a few *frames* during FireStrike Ultra that use around 270W, but only momentarily, not enough to cause any noticeable drop in performance.

The score might drop by 50 points or so, but that's barely even worth noting.

On a card with a power limit lower than 270W, GPU Boost 3.0 simply kicks in and momentarily lowers the voltage/clock to keep the card within its maximum power envelope.

That's why a driver is more likely to crash with a traditional overclock (offset) than with the curve method.

+150 at 1.0v might not be enough voltage to maintain the clock rate at +150, but it might be enough to maintain it at a different/higher point along the voltage/frequency curve.

Example: 1.0v
Stock Clock at 1.0v = 1850 MHZ
+150mhz (offset method) = 2000 MHZ @ 1.0v
1.0v isn't enough to maintain stability at 2000 MHZ = crash

Stock Clock at 1.093v = 1900 MHZ
+150mhz (offset method) = 2050 MHZ @ 1.093v
1.093v is enough to maintain stability at 2050 MHZ = stable

The stock curve isn't as steep at the higher end of the voltage/frequency range, giving more overclocking headroom at the top of the curve.

That's why people are able to overclock further using the 'curve' method, as opposed to the traditional method of simply adding a +150 offset across the entire voltage range.

I.e. you add a small offset at the lower voltage points and a bigger offset at the higher voltage points. It's at the end of the curve, where stock isn't as steep, that people who won the silicon lottery have all the overclocking headroom.

Now.. the point to all of this, is with a HIGHER power limit (which the STRIX BIOS brings). The card won't be forced to drop to a lower voltage. Which is why it seems to allow people to overclock further (with more stability) using the traditional method. _<--- the card stays at the end of the voltage/frequency curve for the whole run.
_

Thats why some people claim the T4 gets them higher scores even when they're not using the higher voltages which the BIOS offers.

Personally, I always run with the power slider at 130% regardless.
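The flat-offset vs per-point curve logic above can be sketched as a toy model. All the clocks and stability limits below are made-up illustrative numbers, not measurements from any real card:

```python
# Sketch of the flat-offset vs per-point curve overclock described above.
# All numbers are illustrative, not measured from any real card.

# Stock voltage/frequency curve (V -> MHz) and the max stable clock
# this hypothetical chip can actually hold at each voltage point.
stock_curve = {0.950: 1750, 1.000: 1850, 1.050: 1885, 1.093: 1900}
max_stable  = {0.950: 1890, 1.000: 1990, 1.050: 2040, 1.093: 2060}

def flat_offset(offset):
    """Traditional overclock: the same offset applied at every voltage point."""
    return {v: clk + offset for v, clk in stock_curve.items()}

def curve_offset(offsets):
    """Curve method: a different offset per voltage point."""
    return {v: stock_curve[v] + offsets[v] for v in stock_curve}

def unstable_points(curve):
    """Voltage points pushed past what the chip can hold (= driver crash)."""
    return [v for v, clk in curve.items() if clk > max_stable[v]]

# +150 everywhere: fine at the top of the curve, crashes at the low points
print(unstable_points(flat_offset(150)))          # -> [0.95, 1.0]

# Smaller offsets at low voltage, bigger at the top: stable everywhere
per_point = {0.950: 100, 1.000: 130, 1.050: 150, 1.093: 160}
print(unstable_points(curve_offset(per_point)))   # -> []
```

Same idea as the worked example: the flat +150 overshoots what the lower voltage points can hold, while the per-point offsets keep every point inside its own stability limit.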


----------



## nrpeyton

Nvidia Voltage Limit! Stop Killing me!

*Running my EVGA GTX 1080 Classified at 2300 MHz @ 1.2 volts*

(Above the nvidia BIOS limit of 1.093v).

When I *lower* the voltage back towards the stock limit, the software *crashes*!

This PROVES our PASCAL cards are voltage starved!

video:


----------



## SiriusLeo

Quote:


> Originally Posted by *Dragonsyph*
> 
> Almost 2300mhz core and you only score 8100 graphics? What's the point of using such high core when it's netting you lower scores/fps in games?
> 
> Me at 2164 core hit about 8600 graphics score.


That run wasn't about the score - it was proof of stability up to *2200mhz* core, not 2300mhz core.

I think a lot of people reading my posts are getting stuck on the performance aspect when I clearly stated that it's proof of concept at this point. I can easily bang out a much higher score by lowering my core to 2100mhz-2150mhz... The reason the score is only 8150 is because when I set the core to 2200mhz it will stay steady at 2200mhz throughout the entire first test, but the second test will fluctuate wildly between 1980mhz and 2200mhz due to the card hitting its power limit. When I set the core to 2150mhz it'll maintain that throughout all of the first test and MOST of the second test, THUS resulting in a higher overall score, which is GREAT if that's what you're looking for, but that doesn't tell me the limitations of anything... also, I've not even begun to touch the memory yet - that test result was at stock...

I've spent the better part of the past 24 hours determining the maximum stable core frequency PER voltage point. Once complete it will give me a MUCH better understanding of this card's overall capabilities. It's worth doing this BEFORE I volt mod, because if I got a dud then the volt mod may not even matter and I'd waste my time on a poor quality card, (like the last 3 I've owned).

*Here is where I'm at right now. 325mhz offset at 1.050v seems to be stable. 2202mhz at 1.043 / 1.031v / 1.025v is definitely stable. At 1.012v / 1.000v it drops down to 2189mhz / 2176mhz...*


----------



## nrpeyton

Quote:


> Originally Posted by *SiriusLeo*
> 
> That run wasn't about the score - it was proof of stability up to *2200mhz* core, not 2300mhz core.
> 
> I think a lot of people reading my posts are getting stuck on the performance aspect when I clearly stated that it's proof of concept at this point. I can easily bang out a much higher score by lowering my core to 2100mhz-2150mhz... The reason the score is only 8150 is because when I set the core to 2200mhz it will stay steady at 2200mhz throughout the entire first test, but the second test will fluctuate wildly between 1980mhz and 2200mhz due to the card hitting its power limit. When I set the core to 2150mhz it'll maintain that throughout all of the first test and MOST of the second test, THUS resulting in a higher overall score, which is GREAT if that's what you're looking for, but that doesn't tell me the limitations of anything... also, I've not even begun to touch the memory yet - that test result was at stock...
> 
> I've spent the better part of the past 24 hours determining the maximum stable core frequency PER voltage point. Once complete it will give me a MUCH better understanding of this card's overall capabilities. It's worth doing this BEFORE I volt mod, because if I got a dud then the volt mod may not even matter and I'd waste my time on a poor quality card, (like the last 3 I've owned).
> 
> *Here is where I'm at right now. 325mhz offset at 1.050v seems to be stable. 2202mhz at 1.043 / 1.031v / 1.025v is definitely stable. At 1.012v / 1.000v it drops down to 2189mhz / 2176mhz...*


If you can do 2189 at 1.0v you've got a very, very good card there 

Don't know if this is off topic; but do you have HWINFO64? I use that to monitor power draw (in watts) in real time. You can set it up to log power usage/voltage/clock & even FPS throughout the entire run, then view it in a spreadsheet.. you can also get it to send all the data to your screen beside your FPS counter.
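For anyone wanting to crunch a log like that, here's a minimal sketch of summarising a CSV export. The column name "GPU Clock [MHz]" is an assumption; match it to whatever your HWiNFO64 log actually writes:

```python
# Summarise a logged run, e.g. "what percent of samples held the target clock".
# The column name "GPU Clock [MHz]" is an assumption; adjust to your log.
import csv
import io

def pct_at_clock(csv_text, column, target_mhz, tolerance=13):
    """Percent of samples within `tolerance` MHz of the target clock.

    A 13 MHz tolerance covers one GPU Boost bin; widen it if you only
    care about 'roughly at target'."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    hits = sum(1 for r in rows
               if abs(float(r[column]) - target_mhz) <= tolerance)
    return 100.0 * hits / len(rows)

# Tiny fake log standing in for a real HWiNFO64 CSV export
log = """GPU Clock [MHz],GPU Power [W]
2151,190.5
2151,191.7
2126,188.2
2151,190.1
"""
print(pct_at_clock(log, "GPU Clock [MHz]", 2151))  # -> 75.0
```

For a real log you'd read the file with `open(path)` instead of the inline string; the rest stays the same.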


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> That run wasn't about the score - it was proof of stability up to *2200mhz* core, not 2300mhz core.
> 
> I think a lot of people reading my posts are getting stuck on the performance aspect when I clearly stated that it's proof of concept at this point. I can easily bang out a much higher score by lowering my core to 2100mhz-2150mhz... The reason the score is only 8150 is because when I set the core to 2200mhz it will stay steady at 2200mhz throughout the entire first test, but the second test will fluctuate wildly between 1980mhz and 2200mhz due to the card hitting its power limit. When I set the core to 2150mhz it'll maintain that throughout all of the first test and MOST of the second test, THUS resulting in a higher overall score, which is GREAT if that's what you're looking for, but that doesn't tell me the limitations of anything... also, I've not even begun to touch the memory yet - that test result was at stock...
> 
> I've spent the better part of the past 24 hours determining the maximum stable core frequency PER voltage point. Once complete it will give me a MUCH better understanding of this card's overall capabilities. It's worth doing this BEFORE I volt mod, because if I got a dud then the volt mod may not even matter and I'd waste my time on a poor quality card, (like the last 3 I've owned).
> 
> *Here is where I'm at right now. 325mhz offset at 1.050v seems to be stable. 2202mhz at 1.043 / 1.031v / 1.025v is definitely stable. At 1.012v / 1.000v it drops down to 2189mhz / 2176mhz...*


If your frames are dropping, I wouldn't call that stable. lol It may not be tossing artifacts around or crashing the driver, but stable? You have a very different idea of what's stable, than I do.


----------



## SiriusLeo

Quote:


> Originally Posted by *Vellinious*
> 
> If your frames are dropping, I wouldn't call that stable. lol It may not be tossing artifacts around or crashing the driver, but stable? You have a very different idea of what's stable, than I do.


I don't quite get why you're being so unnecessarily aggressive. I've had the card for less than 48 hours and I'm posting my initial findings. Of course I haven't found the pinnacle of stability yet - that's literally what I'm doing now.

The drops are due to power limits being reached, which I can attempt to solve by one of many means, (volt mod, bios flash, hard mod to PCB). What I can't fix is a poor overclocking core, (which is definitely not what I'm working with here). This core has a lot of potential and I'm trying to find its limits. I don't understand why you can't respect that and lay off the non-useful jabs.

I don't mean any disrespect and I've already stated that I understood where you were coming from. I even took your advice and cooled down the GPU a few degrees just as an added measure. It's running 33c-34c now.


----------



## SiriusLeo

Quote:


> Originally Posted by *nrpeyton*
> 
> If you can do 2189 at 1.0v you've got a very, very good card there
> 
> Don't know if this is off topic; but do you have HWINFO64? I use that to monitor power draw (in watts) in real time. You can set it up to log power usage/voltage/clock & even FPS throughout the entire run, then view it in a spreadsheet.. you can also get it to send all the data to your screen beside your FPS counter.


Thanks for the advice. Downloaded and logging now. Currently stability testing 1.062v at 2252mhz. Max GPU power draw is 191.7 watts.


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> I don't quite get why you're being so unnecessarily aggressive. I've had the card for less than 48 hours and I'm posting my initial findings. Of course I haven't found the pinnacle of stability yet - that's literally what I'm doing now.
> 
> The drops are due to power limits being reached, which I can attempt to solve by one of many means, (volt mod, bios flash, hard mod to PCB). What I can't fix is a poor overclocking core, (which is definitely not what I'm working with here). This core has a lot of potential and I'm trying to find its limits. I don't understand why you can't respect that and lay off the non-useful jabs.
> 
> I don't mean any disrespect and I've already stated that I understood where you were coming from. I even took your advice and cooled down the GPU a few degrees just as an added measure. It's running 33c-34c now.


Not sure what you think is aggressive about what I said.....I'm merely pointing out the flaw in your overclocking logic. I shall refrain from doing so in the future. You are free to continue your hunt for lower frame rates at higher clocks unhindered and unchecked. lol


----------



## nrpeyton

Quote:


> Originally Posted by *SiriusLeo*
> 
> Thanks for the advice. Downloaded and logging now. Currently stability testing 1.062v at 2252mhz. Max GPU power draw is 191.7 watts.


When I first got my 1080 Classified I did the exact same as you, spent days tweaking the entire voltage curve.. in fact I still have mine saved lol.

Vellinious will probably back me up, he probably remembers me flooding the thread with posts full of all these numbers, I even got told off a few times haha.

here:
800mv - 200 - 1797
850mv - 185 - 1898
875mv - 165 - 1911
881mv - 180 - 1949
893mv - 160 - 1949
900mv - 160 - 1961
912mv - 150 - 1961
925mv - 135 - 1974
931mv - 150 - 1987
943mv - 160 - 2025
950mv - 135 - 2012 - 12505
125 - 1999 - 12521, 12731
110 - 1987 - 12513

140 - 2138 - 13587
1.062v - 115 - 2113 - 13668

1.075v - 115 - 2126 - 13556, 13690

140 - 2151 - 13642, 13612 = 13627
130 - 2138 - 13658, 13653 = 13656
115 - 2126 - 13628, 13707 = 13668
1.081v - 110 - 2113 - 13555, 13665 = 13610
100 - 2100 - 13658, 13546 = 13602
95 - 2100 - 13538, 13609 = 13574
93 - 2100 - 13687, 13579 = 13633
85 - 2088 - 13589, 13608 = 13599

175 - 2176 - 13816
1.093v - 140 - 2164 -
135 - 2151 -
122 - 2126 - 13795
105 - 2113 -
100 - 2100 -
90 - 2100 -
85 - 2088 -

So I totally get what you're going through, 

It's all part of the hobby and the enjoyment of discovering your new card 

We all crazy lol ;-)


----------



## nrpeyton

@Vellinious i desperately need some back up lol

getting annihilated by all the care bear landers over at tech-powerup who think overclocking is 'dangerous'.

wait til you see how my poll is doing regarding getting more voltage... could make u sick lol

really need someone to jump in and at least cast a vote on no.1 lol if it's not too much trouble haha 

https://www.techpowerup.com/forums/threads/poll-nvidia-voltage-limit-stop-killing-me.230588/


----------



## TK421

nrpeyton, do you really think that running above 1.093v is beneficial for everyday use though?

I believe Pascal only benefits from a voltage bump at very low temps.


----------



## GRABibus

Quote:


> Originally Posted by *TK421*
> 
> nrpeyton, do you really think that running above 1.093v is beneficial for everyday use though?
> 
> I believe Pascal only benefits from a voltage bump at very low temps.


With Strix OC t4 Bios, I can be stable in all benchmarks (Time Spy, Firestrike) and all my games except Battlefront 2015 at 2202MHz/5500MHz with 1.1V.
So yes, it is beneficial, at least for me


----------



## TK421

Hmm do you think that the next generation nvidia volta / big pascal will have a modded bios from ASUS again? Because if so I will buy ASUS strix.

And I suppose the T4 bios works the best in strix cards right?


----------



## Vellinious

Quote:


> Originally Posted by *TK421*
> 
> Hmm do you think that the next generation nvidia volta / big pascal will have a modded bios from ASUS again? Because if so I will buy ASUS strix.
> 
> And I suppose the T4 bios works the best in strix cards right?


Doubtful. NVIDIA put the clamps down because the smaller processes react much worse to extra voltage than previous gens did, yet the mentality was then, and still is, unfortunately, that if you just add more volts, everything will be just fine and everyone will be dancing with unicorns and farting rainbows..... Which is just catastrophic stupidity. Temps first, THEN worry about voltages.


----------



## nrpeyton

Quote:


> Originally Posted by *TK421*
> 
> nrpeyton, do you really think that running above 1.093v is beneficial for everyday use though?
> 
> I believe Pascal only benefits from a voltage bump at very low temps.


I agree.

But it would just be nice to have the same control over our GPU's, that we have on our CPU's.

The whole sub-zero thing is just a hobby.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree.
> 
> But it would just be nice to have the same control over our GPU's, that we have on our CPU's.
> 
> The whole sub-zero thing is a hobby for a lot of people these days (people compete at hwbot.org) for example.
> 
> Imagine how much easier it would be for people less experienced in electronics to 'get into' on the GPU side if Nvidia never had everything locked down so hard


I can get on board with that.....the way it is now, you have to have money to burn to get into the extreme side of overclocking, and then have to hope your money doesn't run out, before someone decides to throw some sponsor money at ya.

I'm not going to pretend that the enthusiasts control the market, because we don't......but we're a big enough group that if they continue to take a giant crap on our heads, we're gonna start to take offense.


----------



## SiriusLeo

*Baseline results for 12 voltage points on my GTX 1080 Founders Edition...*


The image above shows the voltage points found to be obtainable for extended periods of time with no crashes or artifacts. Testing per voltage point consisted of 30-minute loops of Heaven Benchmark at 1440p and maximum settings, as well as Time Spy loops of Graphics Test One and Two for 30 minutes each. So, roughly 1.5 hours of continuous testing per voltage point, (this has been my past 48 hours).










The image above shows the lowest voltage with the highest core clock I've tested, (*0.981v / 2151mhz*). Core clock and voltage mostly stay locked throughout the tests. Test one is locked at both the desired core clock and voltage 100% of the time. In test two, it's locked at the desired core clock 62% of the time and at the desired voltage 100% of the time.

I'll add, while running Heaven Benchmark it will stay locked at both *0.981v* and *2151mhz* 100% of the time throughout many passes, (looped for one hour).


----------



## TK421

Damn that's a very well binned card there.

My 1080 AMP only does 2012 stable at 1.093v


----------



## constantine741

Just got a gtx 1080. When do u think they will come out with a gtx 1080ti? Should I have waited


----------



## nrpeyton

Quote:


> Originally Posted by *SiriusLeo*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *Baseline results for 12 voltage points on my GTX 1080 Founders Edition...*
> 
> 
> 
> 
> Above image is the voltage points found to be obtainable for extended periods of time with no crashes or artifacts. Testing per voltage point consisted of 30 minute loops of Heaven Benchmark at 1440p and maximum settings; as well as Timespy loops of Graphics Test One and Two for 30 minutes each. So, roughly 1.5 hours of continuous testing per voltage point, (this has been my past 48 hours).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The image above shows the lowest voltage with the highest core clock I've tested, (*0.981v / 2151mhz*). Core clock and voltage mostly stay locked throughout the tests. Test one is locked at both the desired core clock and voltage 100% of the time. In test two, it's locked at the desired core clock 62% of the time and at the desired voltage 100% of the time.
> 
> I'll add, while running Heaven Benchmark it will stay locked at both *0.981v* and *2151mhz* 100% of the time throughout many passes, (looped for one hour).


One of the best cards I've seen in a while, (and I thought mine was good, lol).

I need 5 degrees C or lower to hit 2278 at 1.093v. You've clearly won the silicon lottery


----------



## zipper17

Quote:


> Originally Posted by *SiriusLeo*
> 
> *Baseline results for 12 voltage points on my GTX 1080 Founders Edition...*
> 
> 
> Above image is the voltage points found to be obtainable for extended periods of time with no crashes or artifacts. Testing per voltage point consisted of 30 minute loops of Heaven Benchmark at 1440p and maximum settings; as well as Timespy loops of Graphics Test One and Two for 30 minutes each. So, roughly 1.5 hours of continuous testing per voltage point, (this has been my past 48 hours).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The image above shows the lowest voltage with the highest core clock I've tested, (*0.981v / 2151mhz*). Core clock and voltage mostly stay locked throughout the tests. Test one is locked at both the desired core clock and voltage 100% of the time. In test two, it's locked at the desired core clock 62% of the time and at the desired voltage 100% of the time.
> 
> I'll add, while running Heaven Benchmark it will stay locked at both *0.981v* and *2151mhz* 100% of the time throughout many passes, (looped for one hour).


Very, very interesting observations with the Time Spy custom loop; these might help 1070 owners get some ideas too.
For stability testing I use the Firestrike Extreme/Time Spy stress test, then go play games at 1440p.
How about a stability test with 1.093V next?


----------



## SiriusLeo

Quote:


> Originally Posted by *zipper17*
> 
> How about stability test with 1.093V next?


Will do! I plan on getting some Liquid Metal on Wednesday, (hopefully) so I can perform the shunt mod. When I start adding any kind of additional voltage to the card past *1.062v, (1.075v -> 1.081v -> 1.093v)* I maintain the maximum clock less and less.

*Long-winded example being...* While looping 1440p Heaven at *1.093v* I'll see drops down from *2272mhz* for about 10% of the run. It'll drop down to the next voltage stepping and force whatever the max core is for that point, so *1.081v / 2252mhz*, and it will continue down till it hits as low as *2164mhz* at some points. The card is hitting its power limit and GPU Boost is forcing a down clock. I'm hoping the shunt mod will reduce those fluctuations. If that makes any sense, not sure if I explained it well?

Had the same issue with my 980ti's. Luckily with them it was easy to load a custom bios that allowed stable voltage to the core. One of my 980ti's was stuck at *1400mhz* and the other at *1500mhz*. After the custom bios fixed the voltage fluctuation they'd hit *1550mhz* and *1605mhz*
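The down-stepping behaviour described above can be modelled roughly like this. The first two voltage/clock pairs are from the post; the rest of the points, the power trace, and the step-up/step-down thresholds are invented for illustration:

```python
# Rough model of GPU Boost down-stepping: when a power sample exceeds the
# limit, drop to the next voltage point (and the max clock that point
# supports); when there's clear headroom again, step back up.
# First two points are from the post; everything else is illustrative.
V_POINTS = [(1.093, 2272), (1.081, 2252), (1.062, 2239), (1.050, 2227)]

def boost_step(power_samples, power_limit):
    """Return the (voltage, clock) point chosen for each power sample."""
    idx, out = 0, []
    for p in power_samples:
        if p > power_limit and idx < len(V_POINTS) - 1:
            idx += 1            # over the limit: step down a voltage point
        elif p < power_limit * 0.95 and idx > 0:
            idx -= 1            # clear headroom: step back up
        out.append(V_POINTS[idx])
    return out

trace = boost_step([180, 225, 230, 210, 170], power_limit=217)
print([clk for _, clk in trace])   # -> [2272, 2252, 2239, 2239, 2252]
```

That fluctuation between steps is exactly what shows up as the clock "dropping down to the next voltage stepping" in the logs; raising the power limit (or a shunt mod) keeps `idx` pinned at the top point for more of the run.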


----------



## SiriusLeo

Here is my top Time Spy score thus far. This was achieved running *2202mhz* on the core and *1377mhz, (5508mhz)* on the ram.

This card seems to favor two voltage points WAY more than others. I've had the most success at *0.981v* and *1.031v*. I'll be performing the shunt mod at some point this week, which should help level out the core frequency and keep the card from hitting its power limit and down clocking.

*8619 Graphics Score*


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> Here is my top Time Spy score thus far. This was achieved running *2202mhz* on the core and *1377mhz, (5508mhz)* on the ram.
> 
> This card seems to favor two voltage points WAY more than others. I've had the most success on *0.981v* and *1.031v*. I'll be performing the shunt mod at some point this week so that should help level out the core frequency and reduce the card from hitting its power limit and down clocking.
> 
> *8619 Graphics Score*


Much better


----------



## SiriusLeo

Quote:


> Originally Posted by *zipper17*
> 
> How about stability test with 1.093V next?


Here is a full run of Heaven at 1440p running at *1.093v* and maintaining *2273mhz* 91% of the run.
https://docs.google.com/spreadsheets/d/1vKm3--PCBVcMHBdxFEY0gL3qW5QuUfgtikuyGtoi6_M/pubhtml


----------



## ucode

Quote:


> Originally Posted by *SiriusLeo*
> 
> Here is my top Time Spy score thus far. This was achieved running *2202mhz* on the core and *1377mhz, (5508mhz)* on the ram.
> 
> This card seems to favor two voltage points WAY more than others. I've had the most success on *0.981v* and *1.031v*. I'll be performing the shunt mod at some point this week so that should help level out the core frequency and reduce the card from hitting its power limit and down clocking.
> 
> *8619 Graphics Score*


That's an awesome score for a non-modified FE card.

Here's a graphed Time Spy run (just the demo section) with a 1080 FE cross-flashed to T4, using a fixed voltage of 1.2V and starting off at ~2200MHz. Power peaks at 380W.



Unfortunately my GPU needs a lot more voltage than yours to hit 2200MHz. At least with a component-level hardware mod it's possible to change how much current each circuit takes, if done that way. With a firmware mod we can only change thresholds, not the current balance, which IMO isn't as safe. I don't know how much of that power draw is coming from the slot, which is typically rated for 75W.


----------



## TK421

my gpu can't even hold anything above 2025 on 1.093v ***


----------



## pez

I'm assuming you're worried about this for benchmarking purposes? Otherwise, your results/differences will be negligible.


----------



## TK421

On game play it doesn't really matter?


----------



## pez

I'm pretty sure gaming differences after 2000MHz across the entire 10-series (mind 1070 vs 1070, 1080 vs 1080, etc) are pretty negligible to the point it'd be down to margin of error. That and almost all Pascal cards generally OC the same outside of the hidden gems and the ones under some serious water, etc.


----------



## SiriusLeo

Quote:


> Originally Posted by *pez*
> 
> I'm pretty sure gaming differences after 2000MHz across the entire 10-series (mind 1070 vs 1070, 1080 vs 1080, etc) are pretty negligible to the point it'd be down to margin of error. That and almost all Pascal cards generally OC the same outside of the hidden gems and the ones under some serious water, etc.


Of the four GTX 1080 Founders Edition cards I've owned the overall results look as follows.

1) *2100mhz*
2) *2070mhz* (horrible coil whine)
3) *1970mhz* (horrible coil whine and defective after 3 months)
4) *2200mhz*+(current card can hit *2270mhz*)

You're correct though, no really big overall difference. If you sat me down in front of 4 machines with each card, I seriously doubt I could tell you which card was in which machine. The only real exception would be with my last two cards. The GTX 1080 that only hit *1970mhz* would drop below *60fps*, (to around *56fps*) in Witcher 3 at 1440p with max settings, (NO HAIRWORKS), while my current card running at *2200mhz* never goes below *60fps*, (lowest was *62fps* in the same area). That's a pretty extreme case though, we're talking about a *230mhz* difference. I run with G-Sync so it's not an issue, but I could see how someone without a G-Sync panel would enjoy not dropping below *60fps* and experience more fluid gameplay.

I've owned two GTX 1070's. One was the Asus Strix and the other was the EVGA Hybrid. The Asus got returned because I was getting artifacts at stock settings. It would boost to over *2000mhz* out of the box but I had to underclock it. The EVGA Hybrid was great. It overclocked over *2200mhz* with no coil whine.


----------



## GRABibus

One thing I noticed is that if I increase fan speed to a fixed value, or through an aggressive temperature curve in MSI AB for example, I can get a stable 2202MHz.

If I let the fan on the card run in auto, then my frequency is not stable during benches or gaming.


----------



## GRABibus

What I said is explained here:


----------



## Coopiklaani

Oh no, my EVGA GTX 1080 SC can easily hit 2200+ @1.093v. But that new ICX card with all the sensors, soooo tempting....


----------



## TK421

Why pay extra for a card that only fixes mistakes in the first place?


----------



## Vellinious

Quote:


> Originally Posted by *Coopiklaani*
> 
> Oh no, my EVGA GTX 1080 SC can easily hit 2200+ @1.093v. But that new ICX card with all the sensors, soooo tempting....


I'm really tempted to upgrade. My GPUs overclock so well, though.....and I really don't want to have to try to machine my waterblocks or buy new ones. The 90 day extension for the step up to the 1080ti is rather tempting too....


----------



## Coopiklaani

Quote:


> Originally Posted by *Vellinious*
> 
> I'm really tempted to upgrade. My GPUs overclock so well, though.....and I really don't want to have to try to machine my waterblocks or buy new ones. The 90 day extension for the step up to the 1080ti is rather tempting too....


I also use a full-cover EKWB block on my GPU. Not sure if it's compatible with all the new sensors. But what concerns me the most is that fuse thing. I'm using the T4 bios atm, and it works great for me, removing that annoying TDP limit. I'm not sure what that fuse will do with the unlocked TDP... Nevertheless, the T4 bios would render all the sensors useless anyway... I really, really wish there'll be a BIOS tweaker!


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> I'm really tempted to upgrade. My GPUs overclock so well, though.....and I really don't want to have to try to machine my waterblocks or buy new ones. The 90 day extension for the step up to the 1080ti is rather tempting too....


I have also noticed that not only is there an additional 4-pin header near the GPU, but the header that was at the end of the card, which was a 4-pin, is now a 6-pin and therefore larger.
Maybe the best solution is to sell your current blocks, upgrade the cards, and HOPE that a Ti is even released, and IF it is, that EVGA will allow a Step Up to it.

ETA: Just saw your pic over at the website.
It seems the 6-pin will be OK, and it would be easy enough to check by getting a measurement of one for certainty.
The other one, though, yes, is a pita.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> I have also noticed that not only is there an additioal 4 pin head near the GPU but the header that was at the end of the card which was a 4 pin is now a 6 pin and therefore larger.
> Maybe the best solution is to sell your current blocks upgrade the cards and HOPE that a Ti is even released and IF it is that EVGA will allow a Step Up to it.


The one on the end shouldn't be an issue. The cutout in the acrylic is already quite large. The only one that's going to be an issue, is the new header on the bottom left. Shouldn't be too hard to just cut that corner of the block off. Sand it smooth, polish it up and call it good. IF, the new 6 pin causes problems, it's stupid easy to machine acrylic.


----------



## hotrod717

Didn't think I would be here, but just picked up a Strix 1080 OC open box. Playing catch up. First impressions - 1926 stock clock jumps 100mhz easily at stock voltage. Bumping to 1978 on stock volts boosts to 2100. Very little time on the card, but man, looking good as far as I can tell. Looks like 2300mhz is about tops on these without serious mods? Where are the top clocks for mem?


----------



## SiriusLeo

Quote:


> Originally Posted by *hotrod717*
> 
> Didnt think i would be here, but just picked up a Strix 1080 OC open box. Playing catch up. First impressions - 1926 stock clock jumps 100mhz easily at stock voltage. Bumping to 1978 on stock volts boosts to 2100. Very little time on card, but man, looking good so far as i can tell. Looks like 2300 mhz is about tops on these without serious mods? Where is top clocks for mem?


Memory is tricky. Most of my Founders Edition cards will go to *+500mhz* on the memory. One of my cards could benchmark at around *+900mhz*. HOWEVER - overclocking the memory past *+400mhz* usually results in lower scores because it starts to fight the core for power. Since these cards are pretty power limited, you usually want to give the core as much headroom as possible. Find the core overclock first, then start tweaking the ram to see how it reacts.


----------



## hotrod717

Quote:


> Originally Posted by *SiriusLeo*
> 
> Memory is tricky. Most of my Founders Edition cards will go to *+500mhz* on the memory. One of my cards could benchmark at around *+900mhz*. HOWEVER - overclocking the memory past *+400mhz* usually results in lower scores because it starts to fight the core for power. Since these cards are pretty power limited, you usually want to give the core as much headroom as possible. Find the core overclock first, then start tweaking the ram to see how it reacts.


Thanks, core first and then mem, pretty much like any other card. See you guys are shunt modding. Good results? Seems like standard GPU Tweak is perfect for this gen. No bridging the LN2 pad necessary, like with the Matrix on the previous gen.


----------



## pez

Quote:


> Originally Posted by *SiriusLeo*
> 
> Of the four GTX 1080 Founders Edition cards I've owned the overall results look as follows.
> 
> 1) *2100mhz*
> 2) *2070mhz* (horrible coil whine)
> 3) *1970mhz* (horrible coil whine and defective after 3 months)
> 4) *2200mhz*+(current card can hit *2270mhz*)
> 
> You're correct though, no really big overall difference. If you sat me down in front of 4 machines with each card, I seriously doubt I could tell you which card was in which machine. The only real exception would be with my last two cards. The GTX 1080 that only hit *1970mhz* would drop below *60fps*, (to around *56fps*) in Witcher 3 at 1440p with max settings, (NO HAIRWORKS), while my current card running at *2200mhz* never goes below *60fps*, (lowest was *62fps* in the same area). That's a pretty extreme case though, we're talking about a *230mhz* difference. I run with G-Sync so it's not an issue, but I could see how someone without a G-Sync panel would enjoy not dropping below *60fps* and experience more fluid gameplay.
> 
> I've owned two GTX 1070's. One was the Asus Strix and the other was the EVGA Hybrid. The Asus got returned because I was getting artifacts at stock settings. It would boost to over *2000mhz* out of the box but I had to underclock it. The EVGA Hybrid was great. It overclocked over *2200mhz* with no coil whine.


Yeah, that's a good OC, too. I modestly OC'd my 1080 and it sticks around 2050-2100. The same goes for the TXP. It likes the 2088 mark, but I mean... it runs through most games like silk already, so its OC benefits are seen at high resolutions like 21:9 1440p or 4K. Still not determined whether I want to keep the 1080 in my rig over it. The Mrs. might be getting a free upgrade if I decide to keep the TXP.


----------



## Coopiklaani

Quote:


> Originally Posted by *SiriusLeo*
> 
> Memory is tricky. Most of my Founders Edition cards will go to *+500mhz* on the memory. One of my cards could benchmark at around *+900mhz*. HOWEVER - overclocking the memory past *+400mhz* usually results in lower scores because it starts to fight the core for power. Since these cards are pretty power limited, you usually want to give the core as much headroom as possible. Find the core overclock first, then start tweaking the ram to see how it reacts.


The GTX 1080 has separate power rails for the core and the memory. The core gets its power 100% from the 8-pin (or 8+8-pin) PCIE power. The memory and other components get their power 100% from the PCIE slot on the motherboard. So it's not fighting for power.
Lower scores at higher memory clocks are due to the VRAM error correction mechanism. I think it uses automatic repeat request (ARQ) for error correction. Basically, every block of data received is checked using the error detection code, and if the check fails, a retransmission of the data is requested. So the GPU has to wait for the retransmission until all data are correct, hence lower scores...
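
For anyone curious what that retransmit cost looks like, here's a purely illustrative Python sketch of a stop-and-wait ARQ loop (the CRC choice, block size, and error rates are invented for the example; this is not how GDDR5X actually implements its error detection):

```python
import random
import zlib

def send_block(data: bytes, error_rate: float, rng: random.Random) -> int:
    """Return how many transmissions a block needs before the
    receiver's CRC check passes (stop-and-wait ARQ)."""
    attempts = 0
    while True:
        attempts += 1
        # Sender appends a CRC32 of the payload to the frame.
        frame = data + zlib.crc32(data).to_bytes(4, "big")
        # The link may corrupt the frame (pushing the memory clock
        # higher would raise this probability).
        if rng.random() < error_rate:
            frame = b"\x00" + frame[1:]  # corrupt the first byte
        payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
        if zlib.crc32(payload) == crc:
            return attempts  # receiver ACKs, block is done

rng = random.Random(42)
block = b"\xAA" * 32
stable = sum(send_block(block, 0.01, rng) for _ in range(1000)) / 1000
unstable = sum(send_block(block, 0.30, rng) for _ in range(1000)) / 1000
# Effective bandwidth scales as 1 / (average transmissions per block).
print(f"avg transmissions per block: {stable:.2f} vs {unstable:.2f}")
```

The data still arrives intact in both cases; the unstable clock just pays for it in extra transfers, which matches scores dropping rather than the card crashing.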


----------



## SiriusLeo

Quote:


> Originally Posted by *Coopiklaani*
> 
> The GTX 1080 has separate power rails for the core and the memory. The core gets its power 100% from the 8-pin (or 8+8-pin) PCIE power. The memory and other components get their power 100% from the PCIE slot on the motherboard. So it's not fighting for power.
> Lower scores at higher memory clocks are due to the VRAM error correction mechanism. I think it uses automatic repeat request (ARQ) for error correction. Basically, every block of data received is checked using the error detection code, and if the check fails, a retransmission of the data is requested. So the GPU has to wait for the retransmission until all data are correct, hence lower scores...


Interesting. My information is based solely on my personal observations. What I have observed is my power limits being hit much more frequently when I overclock my RAM, resulting in my core being downclocked slightly as GPU Boost compensates to stay within the power limit.

This is a chart; the numbers *aren't accurate*, it's just a *rounded example* of what I observe.



Running a benchmark that is more stressful will result in greater droops over time due to power limits being met / exceeded more often. If this theory is correct - when I mod my card later this week, (shunt mod) then I should see FAR less droops on my core, even when the ram is over clocked due to the card being tricked into thinking it's using less power than it actually is.

I could be 100% wrong on all of this - again - I'm just basing my info on my observations. Some of my terminology may be incorrect but I feel that my methods are pretty sound and I enjoy tinkering with this kind of stuff.
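
That theory can be expressed as a toy model (all wattages and the linear clock-to-power scaling below are invented for illustration; real GPU Boost also bins by voltage and temperature):

```python
def boost_clock(requested_mhz: float, core_w: float, mem_w: float,
                power_limit_w: float) -> float:
    """Toy GPU Boost: if total board power exceeds the limit, scale the
    core clock back proportionally (assumes core power ~ core clock)."""
    if core_w + mem_w <= power_limit_w:
        return requested_mhz
    core_budget = power_limit_w - mem_w  # memory power is not reducible
    return requested_mhz * (core_budget / core_w)

# Same 180 W limit: a +10 W memory overclock eats core headroom.
print(boost_clock(2100, core_w=160, mem_w=20, power_limit_w=180))  # 2100
print(boost_clock(2100, core_w=160, mem_w=30, power_limit_w=180))  # 1968.75
```

Under this model a shunt mod, which makes the card under-report its draw, simply moves `power_limit_w` out of reach, so the core stops being scaled back even with the RAM overclocked.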


----------



## Vellinious

Quote:


> Originally Posted by *SiriusLeo*
> 
> Interesting. My information is solely based off my personal observations. What I have observed is my power limits being hit much more frequently when I overclock my ram, resulting in my core being down clocked slightly as GPU Boost compensates to keep within the power limit design.
> 
> This is a chart, these numbers *aren't accurate* it's just a *rounded example* of what I observe.
> 
> 
> 
> Running a benchmark that is more stressful will result in greater droops over time due to power limits being met / exceeded more often. If this theory is correct - when I mod my card later this week, (shunt mod) then I should see FAR less droops on my core, even when the ram is over clocked due to the card being tricked into thinking it's using less power than it actually is.
> 
> I could be 100% wrong on all of this - again - I'm just basing my info on my observations. Some of my terminology may be incorrect but I feel that my methods are pretty sound and I enjoy tinkering with this kind of stuff.


You're correct. Any power the GPU is using will impact the power limit, which will affect performance...specifically, when the power limit is exceeded.


----------



## Coopiklaani

Quote:


> Originally Posted by *SiriusLeo*
> 
> Interesting. My information is solely based off my personal observations. What I have observed is my power limits being hit much more frequently when I overclock my ram, resulting in my core being down clocked slightly as GPU Boost compensates to keep within the power limit design.
> 
> This is a chart, these numbers *aren't accurate* it's just a *rounded example* of what I observe.
> 
> 
> 
> Running a benchmark that is more stressful will result in greater droops over time due to power limits being met / exceeded more often. If this theory is correct - when I mod my card later this week, (shunt mod) then I should see FAR less droops on my core, even when the ram is over clocked due to the card being tricked into thinking it's using less power than it actually is.
> 
> I could be 100% wrong on all of this - again - I'm just basing my info on my observations. Some of my terminology may be incorrect but I feel that my methods are pretty sound and I enjoy tinkering with this kind of stuff.


The additional memory power from overclocking does add to the total TDP, but it is usually very small, maybe an extra 5w to the memory when overclocking from +0 to +500MHz. I think it may be because a lower VRAM clock bottlenecks the core, so the core runs less busy, hence lower TDP. Only my theory.

I run my card with the T4 BIOS, so no TDP limit. I can see a big dip in performance when I overclock my memory from +575 to +600.

TS: +575Mem, Graphics Score: 8595 http://www.3dmark.com/spy/1210827
TS: +600Mem, Graphics Score: 8319 http://www.3dmark.com/spy/1210974

almost -300 TS score...


----------



## ucode

Quote:


> Originally Posted by *Coopiklaani*
> 
> The GTX 1080 has separate power rails for the core and the memory. The core gets its power 100% from the 8-pin (or 8+8-pin) PCIE power. The memory and other components get their power 100% from the PCIE slot on the motherboard. So it's not fighting for power.
> Lower scores at higher memory clocks are due to the VRAM error correction mechanism. I think it uses automatic repeat request (ARQ) for error correction. Basically, every block of data received is checked using the error detection code, and if the check fails, a retransmission of the data is requested. So the GPU has to wait for the retransmission until all data are correct, hence lower scores...


I guess you've never forgotten to plug the power connector into the graphics card. It comes up with an on-screen message asking you to power off and plug it in, IIRC (my own memory is not so good). If the GPU gets all its power from the 8-pin, then how would it do that?

AFAIK error retransmission, if enabled, is for the transmission path between the memory and the controller, not for actual memory errors. The drop in performance is likely due to re-training. Check out the 'eye'.

Typical Pascal memory performance.


You can see there are drops beforehand, and that after the drop, performance starts increasing again rather than getting worse. Usually at a certain point artifacts start appearing (no retransmission helping there), which clearly indicates instability.


----------



## Coopiklaani

Quote:


> Originally Posted by *ucode*
> 
> I guess you've never forgotten to plug the power connector into the graphics card. It comes up with an on-screen message asking you to power off and plug it in, IIRC (my own memory is not so good). If the GPU gets all its power from the 8-pin, then how would it do that?
> 
> AFAIK error re-transmission, if enabled, is for the transmission path between memory and controller and not for actual memory errors. The drop in performance is likely due to re-training. Check out the 'eye'.
> 
> Typical Pascal memory performance.
> 
> 
> You can see there are drops beforehand and that after the drop performance starts increasing again rather than getting worse. Usually at a certain point artifacts start appearing, no retransmission helping here, that clearly indicates instability.


I actually checked the power delivery myself with a multimeter when I did my shunt mod. A GTX 1080 FE has 3 shunts: 1 for 8-pin power, 1 for PCIE-slot power, and 1 for core VRM power.

There's no physical connection from the PCIE slot power to the core VRM, and vice versa, no connection from the 8-pin power to the memory VRM. I guess the screen message comes from a 2D auxiliary chip or maybe a part of the main chip.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I actually checked the power delivery myself with a multimeter when I did my shunt mod. A GTX 1080 FE has 3 shunts: 1 for 8-pin power, 1 for PCIE-slot power, and 1 for core VRM power.
> 
> There's no physical connection from the PCIE slot power to the core VRM, and vice versa, no connection from the 8-pin power to the memory VRM. I guess the screen message comes from a 2D auxiliary chip or maybe a part of the main chip.


Are you able to use that to check the power draw in watts? I'd love to see how the onscreen reading in hwinfo64 compares to my multimeter.

So far I've only used it to check vcore and mem voltage using the pins provided specifically for that on the Classified.

*Anyone able to hazard a guess what "Loadline Trim" and "Switching Period" does in this 1080 voltage tool?
*

full size - right click & open new tab

-At idle at 200 kHz (switching period) the card is drawing 15w at 0.625v
-At idle at 2000 kHz (switching period) the card is drawing 30w but the voltage is still 0.625v (confirmed with multimeter)

power draw measured with hwinfo64 in watts

Doesn't make sense to me yet,

Anyone?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Are you able to use that to check the power draw in watts? I'd love to see how the onscreen reading in hwinfo64 compares to my multimeter.
> 
> so far I've only used it to check vcore and mem voltage using the pins especially for it on the classified.
> 
> *Anyone able to hazard a guess what "Loadline Trim" and "Switching Period" does in this 1080 voltage tool?
> *
> 
> full size - right click & open new tab
> 
> -At idle at 200 KHz (switching period) card is drawing 15w at 0.625v
> -At idle at 2000Khz (switching period) card is drawing 30w but voltage is still 0.625v (confirmed with multi-meter)
> 
> power draw measured with hwinfo64 in watts
> 
> Doesn't make sense to me yet,
> 
> Anyone?


HWinfo64 readings are surprisingly accurate. I have a power meter that measures the overall power consumption of the entire computer, including my monitor. I compared the HWinfo64 readings to my power meter; here are the results:
Idle:
Power meter : 142w
HWinfo64 CPU: 27w
HWinfo64 GPU: 10w

Furmark 1080p, NO AA:
Power meter : 524w
HWinfo64 CPU: 61w
HWinfo64 GPU: 350w

So if you compare the readings from HWinfo64 and the power meter,
Furmark - Idle
HWinfo64: (350+61) - (27+10) = 374w
Powermeter: 382w

The readings only differ by 8w! I'd say this is within the margin of error.
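
The delta arithmetic above, spelled out:

```python
# Figures from the measurements above (watts).
idle = {"wall": 142, "cpu": 27, "gpu": 10}
load = {"wall": 524, "cpu": 61, "gpu": 350}

# Idle-to-Furmark delta seen by the software sensors vs. at the wall.
hwinfo_delta = (load["cpu"] + load["gpu"]) - (idle["cpu"] + idle["gpu"])
wall_delta = load["wall"] - idle["wall"]
print(hwinfo_delta, wall_delta, wall_delta - hwinfo_delta)  # 374 382 8
```

Comparing deltas cancels out the monitor and the idle baseline, but the wall figure still includes PSU conversion losses on the extra draw, so an 8w gap arguably flatters the sensors even more.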

some pics i took for the measurement.
Idle:




Furmark:


----------



## Dragonsyph

Ok since all of you seem to be nuts on power and voltage im gonna ask a question.

Do you see GPU temps drop with same clocks using 1.09 vs 1.06?


----------



## Coopiklaani

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ok since all of you seem to be nuts on power and voltage im gonna ask a question.
> 
> Do you see GPU temps drop with same clocks using 1.09 vs 1.06?


Yes, but not by much. Maybe it's just me, using a custom loop. But I do see about a 30w drop in power consumption with Furmark. Why Furmark? Furmark is very consistent, with very little variation in power consumption over time...


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> HWinfo64 readings are surprisingly accurate. I have a power meter that measures the overall power consumption of the entire computer, including my monitor. I compared the HWinfo64 readings to my power meter; here are the results:
> Idle:
> Power meter : 142w
> HWinfo64 CPU: 27w
> HWinfo64 GPU: 10w
> 
> Furmark 1080p, NO AA:
> Power meter : 524w
> HWinfo64 CPU: 61w
> HWinfo64 GPU: 350w
> 
> So if you compare the readings from HWinfo64 and the power meter,
> Furmark - Idle
> HWinfo64: (350+61) - (27+10) = 374w
> Powermeter: 382w
> 
> The readings only differ by 8w! I'd say this is within the margin of error.
> 
> some pics i took for the measurement.
> Idle:
> 
> 
> 
> 
> Furmark:


I also have a watt meter plugged into the socket in the wall.

The only problem, due to my old AMD FX CPU and 990FX mobo, is that the only accurate power reading I get is on the GPU.. my mobo and CPU power readings are practically non-existent.

Can't wait to upgrade my CPU 

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ok since all of you seem to be nuts on power and voltage im gonna ask a question.
> 
> Do you see GPU temps drop with same clocks using 1.09 vs 1.06?


yes, slightly ;-)

The less power your GPU draws, the less heat it generates.

Power is the product of voltage and amps.

I know this is kind of going off-topic now.. but I understand voltage like this:

Voltage = how much energy is in each electron.
Amps = how many electrons are flowing (past a fixed point)
Power = the product of the two
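
In code form, P = V × I (the 165 A figure below is just an invented example current):

```python
def power_watts(volts: float, amps: float) -> float:
    """Electrical power is the product of voltage and current."""
    return volts * amps

# A core at 1.093 V drawing ~165 A dissipates roughly 180 W:
print(f"{power_watts(1.093, 165):.1f} W")  # 180.3 W
```

Which is also why dropping from 1.09v to 1.06v at the same current shaves a few watts, and a degree or two, off the card.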


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ok since all of you seem to be nuts on power and voltage im gonna ask a question.
> 
> Do you see GPU temps drop with same clocks using 1.09 vs 1.06?


Yup. Small, but noticeable. It's probably more pronounced with a stock air cooler.


----------



## SiriusLeo

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ok since all of you seem to be nuts on power and voltage im gonna ask a question.
> Do you see GPU temps drop with same clocks using 1.09 vs 1.06?


Between 0.981v and 1.06v there's nearly no difference at all; 1.06v to 1.09v is about a 2c difference. I'm hovering anywhere between 33c and 40c during stress testing, and that's almost entirely driven by the ambient room temperature. My loop liquid temp never exceeds 31c. I'm running two 45mm-thick 360 rads with EK Vardar fans set to 1200rpm. One rad is push and the other is pull.


----------



## Dragonsyph

Ok cool, then I won't worry about it, thanks everyone for responding.


----------



## nrpeyton

no probs ;-)

Anyone got the slightest inkling about "Loadline Trim" and "Switching Period"?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> no probs ;-)
> 
> Anyone got the slightest inkling about "Loadline Trim" and "Switching Period"?


Not sure how they work on the Classy card. But "Loadline Trim" usually refers to loadline calibration or compensation, meaning the VRM raises the voltage under heavy load to compensate for the vdroop caused by high current.
As for the switching period: GPU VRMs usually run at a 500kHz switching frequency. A higher frequency usually gives smoother voltage but more heat in the VRM (worse VRM efficiency)
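
A rough sketch of how those two knobs could interact with vdroop (the loadline resistance and the trim scaling here are invented for illustration; this is not pulled from the Classified tool's actual behaviour):

```python
def vrm_output(v_set: float, current_a: float,
               loadline_mohm: float, llc_trim: float) -> float:
    """Toy loadline model: the output droops by I*R under load, and
    'loadline trim' (LLC) cancels a fraction of that droop
    (0.0 = no compensation, 1.0 = full compensation)."""
    droop_v = current_a * (loadline_mohm / 1000.0)  # V = I * R
    return v_set - droop_v * (1.0 - llc_trim)

# 150 A through a 0.5 mOhm loadline droops 75 mV without trim:
print(round(vrm_output(1.093, 150, 0.5, 0.0), 4))  # 1.018
print(round(vrm_output(1.093, 150, 0.5, 1.0), 4))  # 1.093
```

Switching period is a separate knob: raising the VRM's switching frequency tightens the ripple on that output at the cost of extra switching losses, which lines up with the higher idle draw reported earlier in the thread.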


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Not sure how they work on the Classy card. But "Loadline Trim" usually refers to loadline calibration or compensation, meaning the VRM raises the voltage under heavy load to compensate for the vdroop caused by high current.
> As for the switching period: GPU VRMs usually run at a 500kHz switching frequency. A higher frequency usually gives smoother voltage but more heat in the VRM (worse VRM efficiency)


Right, so it runs less efficiently due to more heat (higher switching frequency).. but under exotic cooling, could it potentially allow higher stable clocks?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Right so it runs less efficiently due to more heat (switching period).. but under exotic cooling, could potentially allow higher stable clocks?


In theory, yes... but in practice you probably won't notice the difference..


----------



## hotrod717

Quote:


> Originally Posted by *nrpeyton*
> 
> *Anyone able to hazard a guess what "Loadline Trim" and "Switching Period" does in this 1080 voltage tool?
> *
> 
> full size - right click & open new tab
> 
> -At idle at 200 KHz (switching period) card is drawing 15w at 0.625v
> -At idle at 2000Khz (switching period) card is drawing 30w but voltage is still 0.625v (confirmed with multi-meter)
> 
> power draw measured with hwinfo64 in watts
> 
> Doesn't make sense to me yet,
> 
> Anyone?


Loadline trim is LLC at my best guess, and switching is, well, the switching frequency for the VRM. Loadline would control vdroop, and theoretically a faster switching frequency on the VRM should improve voltage stability but would generate more heat. Nothing new here in terms of aftermarket cards. Asus has these adjustments and more in various versions of GPU Tweak for Strix and Matrix cards.

What tool is this and by whom? EVGA?


----------



## nrpeyton

Quote:


> Originally Posted by *hotrod717*
> 
> Loadline trim is LLC at my best guess, and switching is, well, the switching frequency for the VRM. Loadline would control vdroop, and theoretically a faster switching frequency on the VRM should improve voltage stability but would generate more heat. Nothing new here in terms of aftermarket cards. Asus has these adjustments and more in various versions of GPU Tweak for Strix and Matrix cards.
> 
> What tool is this and by whom? EVGA?


hmm interesting, I never knew those tools existed in the STRIX overclocking software (coming from SLI STRIX 980).

Very interesting indeed ;-)

PM me for details, the guy who sent me the file was quite adamant (hence why I had to remove my link to it I posted a few pages back).

Here's a *comparison* of the maximum overclock I was able to achieve using 1.25v vcore versus the maximum achievable at 1.093v _(the normal locked-down maximum)._





2379 MHZ at 1.25v

No exotic cooling, just a £200 aquarium water chiller rigged up to my custom loop.

-Current overclocking & voltage controls are adequate for mainstream users, but Enthusiasts want more!

-I believe we should be able to tweak unlocked BIOSes on Nvidia GPUs (the same way we can buy an unlocked 'K'-series CPU from Intel & pair it with an enthusiast motherboard)

-I understand it's more complicated with GPUs (I learnt my lesson at TechPowerUp the other day when I got a lot of opposition due to my less-than-diplomatic way of wording it). But I still believe we should have more flexibility. Nvidia locks down harder and harder with every new generation.

-There must be some more amicable answers out there to make it easier for the 'average joe' to break through and at least compete in the same league as the elite, who have untold access to unlimited hardware and engineering/electrical experience to draw from.

-I'm not saying for a minute that manufacturers should pay out on warranty cover when we fry a card, but I do think there should be more flexibility with BIOS tweaks and voltage for those just 'starting out' with 'extreme cooling'.

-If CPUs were all locked down the same way GPUs are.. the sport at hwbot would be reserved solely for those with soldering-iron and micro-electronics knowledge.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm interesting, I never knew those tools existed in the STRIX overclocking software (coming from SLI STRIX 980).
> 
> Very interesting indeed ;-)
> 
> PM me for details, the guy who sent me the file was quite adamant (hence why I had to remove my link to it I posted a few pages back).
> 
> Here's a *comparison* between the maximum overclock I was able to achieve using 1.25v v core. compared with maximum able to achieve at 1.093v _(normal locked down maximum)._
> 
> 
> 
> 
> 
> 2379 MHZ at 1.25v
> 
> No exotic cooling, just a £200 aquarium water chiller rigged up to my custom loop.
> 
> -Current overclocking & voltage controls are adequate for mainstream users, but Enthusiasts want more!
> 
> -I believe we should be able to tweak unlocked BIOSes on Nvidia GPUs (the same way we can buy an unlocked 'K'-series CPU from Intel & pair it with an enthusiast motherboard)
> 
> -I understand it's more complicated with GPUs (I learnt my lesson at TechPowerUp the other day when I got a lot of opposition due to my less-than-diplomatic way of wording it). But I still believe we should have more flexibility. Nvidia locks down harder and harder with every new generation.
> 
> -There must be some more amicable answers out there to make it easier for the 'average joe' to break through and at least compete in the same league as the elite, who have untold access to unlimited hardware and engineering/electrical experience to draw from.
> 
> -I'm not saying for a minute that manufacturers should pay out on warranty cover when we fry a card, but I do think there should be more flexibility with BIOS tweaks and voltage for those just 'starting out' with 'extreme cooling'.
> 
> -If CPUs were all locked down the same way GPUs are.. the sport at hwbot would be reserved solely for those with soldering-iron and micro-electronics knowledge.


I agree with all of that


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I agree with all of that


Thank you, that means a lot


----------



## ucode

Quote:


> Originally Posted by *Coopiklaani*
> 
> There's no physical connection from the PCIE slot power to the core VRM, and vice versa, no connection from the 8-pin power to the memory VRM. I guess the screen message comes from a 2D auxiliary chip or maybe a part of the main chip.


That's very interesting, thanks for sharing.


----------



## hotrod717

GPU Tweak. Actually you are correct, Matrix cards featured more control than Strix. It looked the same at first glance, but lo and behold, no LLC, PEX, or VRM freq. adjustments are available on the 1080 Strix as there were on previous Matrix cards.
In reality, this is the first mainstream gen that has been locked down. Lightning, Hawk, Classified, and Matrix cards all the way back to the 470s could be unlocked without a soldering iron.


----------



## Buzzard1

Has anyone tried the BIOS off the Gigabyte GeForce GTX 1080 AORUS Xtreme? I have an EVGA 1080 FTW and was thinking of trying it. It should allow power to go from 130% to 150%. The T4 BIOS worked OK for me but was a little buggy every now and then.

Thank you

https://www.techpowerup.com/gpudb/b4192/gigabyte-gtx-1080-aorus-xtreme-edition


----------



## pez

So long as they don't have the same QC issues that the original Xtreme Gaming had, I think it would be a good choice. The OG Xtreme Gaming was a nice card already, but quite a few reports popped up of damaged heatsink fins, badly installed fans, etc. This could be an attempt at a rebrand to kinda make people forget (while fixing the issue). The added goodies are pretty cool, too, if that's your thing.


----------



## fat4l

I see you guys are testing Heaven etc. for stability.
I don't find it that stressful.

Go and loop 3DMark Firestrike Extreme/Ultra's 2nd test... try to pass it 20 times, for example.
No crashes; it just cancels the test


----------



## Coopiklaani

Quote:


> Originally Posted by *fat4l*
> 
> I see you guys are testing Heaven etc. for stability.
> I don't find it that stressful.
> 
> Go and loop 3DMark Firestrike Extreme/Ultra's 2nd test... try to pass it 20 times, for example.
> No crashes; it just cancels the test


I agree, the FSU 2nd scene is the best stability test as far as I know. I can pass all the other tests with +50MHz more on the core than a 1-hour loop of the FSU 2nd scene..


----------



## kevindd992002

Can you cross-flash an FTW BIOS to an SC card to increase its power limit even though it just uses a single 8pin connector?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Can you cross-flash an FTW BIOS to an SC card to increase its power limit even though it just uses a single 8pin connector?


hmm pass on that one, really not sure.

does the SC have dual BIOS switch?

*Edit:* If I were you I'd grab the FTW, great card and peace of mind.

EVGA is also the only company (I believe) that fully supports your warranty if you decide to rip the stock cooler off and water cool your card.

They're known in the industry as caring about customers more than any other company, partly due to being a bit smaller compared to ASUS.

ASUS cards (just for one example) are massively over-priced for the same or less.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm pass on that one, really not sure.
> 
> does the SC have dual BIOS switch?


No, it doesn't.

And no, you really shouldn't.


----------



## Coopiklaani

Quote:


> Originally Posted by *kevindd992002*
> 
> Can you cross-flash an FTW BIOS to an SC card to increase its power limit even though it just uses a single 8pin connector?


You should use the T4 BIOS if you want to increase the TDP on an SC card, not the FTW BIOS. FTW cards use a different power controller than FE cards (SC cards use FE PCBs). In fact, you get a REDUCTION in the power limit if you use the FTW BIOS on your SC card.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm pass on that one, really not sure.
> 
> does the SC have dual BIOS switch?
> 
> *Edit:* If I were you i'd grab the FTW, great card and peace of mind.
> 
> EVGA are also the only company (I believe) that fully support your warranty if you decide to rip the stock cooler off and water cool your card
> 
> they're known in the industry as caring about customers more than any other company, partly due to being a bit smaller, compared to ASUS.
> 
> ASUS cards (just for one example) are massively over-priced for the same / less.


Yeah, I'm kind of leaning towards that route already. What do you say about the FTW Hydro Copper Gaming though? It seems to be cheaper compared to the total price of getting an FTW plus an EK WB and a backplate. Although I'm not sure who made the Hydro Copper block itself?
Quote:


> Originally Posted by *Coopiklaani*
> 
> You should use the T4 BIOS if you want to increase the TDP on an SC card, not the FTW BIOS. FTW cards use a different power controller than FE cards (SC cards use FE PCBs). In fact, you get a REDUCTION in the power limit if you use the FTW BIOS on your SC card.


Is the T4 BIOS only applicable for cards that use FE PCB's?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Yeah, I'm kind of leaning towards that route already. What do you say about the FTW Hydro Copper Gaming though? It seems to be cheaper compared to the total price of getting an FTW plus an EK WB and a backplate. Although I'm not sure who made the Hydro Copper block itself?
> Is the T4 BIOS only applicable for cards that use FE PCB's?


The T4 BIOS was designed for ASUS Strix cards... unless you're flashing it to an ASUS Strix, it's a cross-flash no matter which card you're flashing.


----------



## tiramoko

Is it still a bad time to buy a 1080 right now? Is $550 for an EVGA SC 1080 worth it? Still don't know when the 1080 Ti is coming out.


----------



## nrpeyton

Quote:


> Originally Posted by *tiramoko*
> 
> Is it still a bad time to buy a 1080 right now? Is $550 for an EVGA SC 1080 worth it? Still don't know when the 1080 Ti is coming out.


There's room in the middle between the 1080 and the Titan for a 1080 Ti, so I doubt you'll see prices come down on the 1080 when the Ti is released.

Titan prices may come down, however; but even if I'm wrong, you're still only going to see a 1080 price drop of 25-50 bucks.

For how many months' wait? And not even an official announcement yet?

Up to you of course, but that's just my 2 pence.

If you're waiting for the 1080 Ti (instead of the hassle of selling a 1080), maybe that's different? But then, still no announcement?

I'd base it on your 'needs' now.

It took me 2 weeks to decide which GPU to buy when I was shopping for my 1080; I can't imagine what I'd have gone through if the 1080 Ti was also on the horizon lol.

So I certainly empathise lol ;-)


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> The T4 BIOS was designed for ASUS, STRIX cards... unless you're flashing it to an ASUS STRIX it's a cross-flash no matter what card you're flashing.


But which EVGA GTX 1080 card can accept the T4 BIOS without any repercussions?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> But which EVGA GTX 1080 card can accept the T4 BIOS without any repercussions?


I believe your only option would be the FTW.

I heard FTW owners were getting good results with the T4.

Doesn't the FTW Hydro Copper and FTW Hybrid also use the same BIOS as the normal FTW?
(I remember reading that somewhere on the EVGA forums a few weeks ago) but I'd double check, first.

What about the EVGA Classified?

I spent weeks trying to decide which card to get and got really frustrated (I can sense you're going through the same dilemma I did), so I empathise.. in the end, the only way I could satisfy myself was to take the extra 50 bucks hit on my wallet and grab the Classified.

I have the 780TI waterblock on mine and it works like a charm. The difference between water temp and GPU idle temp is within 1-2c, and the difference between water temp and GPU load temp is 10c (at about 200w / normal gaming). Furmark might take me to 12c over water temp on the core. The memory overclocks great too, and my VRM temps are fantastic.


----------



## SiriusLeo

Quote:


> Originally Posted by *fat4l*
> 
> I see you guys are testing Heaven etc. for stability.
> I don't find it that stressful.
> 
> Go and loop 3DMark Firestrike Extreme/Ultra's 2nd test... try to pass it 20 times, for example.
> No crashes; it just cancels the test


I usually start with Heaven because it's easy to run. Really allows me to get a feel for how things are going to overclock. Once I've dialed in an overclock that will loop Heaven for at least 30 minutes I'll move over to Time Spy Test One and Test Two. I'll start with Test one loop for at least 30 minutes then move to Test Two for 30 minutes. I still use Firestrike Ultra and Extreme but not as much as Time Spy. *IF* it passes Heaven, Time Spy Test One and Time Spy Test Two I'll then load up Witcher III and head to the Bog. I've found the Bog area in Witcher 3 to be exceptionally taxing. I've had overclocks pass hours of looping Firestrike Ultra Test One and Test Two only to crash while running around in the bog in less than 30 minutes.

This is how I found the maximum overclock for all my voltage points for my card.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> I believe your only option would be the FTW.
> 
> I heard FTW owners were getting good results with the T4.
> 
> Doesn't the FTW Hydro Copper and FTW Hybrid also use the same BIOS as the normal FTW?
> (I remember reading that somewhere on the EVGA forums a few weeks ago) but I'd double check, first.
> 
> What about the EVGA Classified?
> 
> I spent weeks trying to decide which card to get and got really frustrated (I can sense you're going through the same dilemma I did), so I empathise.. in the end, the only way I could satisfy myself was to take the extra 50 bucks hit on my wallet and grab the Classified.
> 
> I have the 780TI waterblock on mine and it works like a charm. The difference between water temp and GPU idle temp is within 1-2c and difference between water and GPU load is 10c (at about 200w / normal gaming). Furmark might take me to 12c over water temp on the core. Memory overclocks great too and my VRM temps are fantastic.


I see. Did you try the T4 BIOS for your Classified though?

I would assume that the HC and Hybrid FTW also use the same BIOS as the FTW but, yeah, I'd have to confirm.

I'm trying to keep the total price to a minimum (as we always do), so I'm not sure about the Classified. With all these choices it really gets frustrating. How different is the Classified compared to the FTW anyway? Just more power phases and a higher power limit?

Do you happen to know who the manufacturer of the HC block is?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> I see. Did you try the T4 BIOS for your Classified though?
> 
> I would assume that the HC and Hybrid FTW also use the same BIOS as the FTW but, yeah, I'd have to confirm.
> 
> I'm trying to keep the total price to a minimum (as we always do), so I'm not sure about the Classified. With all these choices it really gets frustrating. How different is the Classified compared to the FTW anyway? Just more power phases and a higher power limit?
> 
> Do you happen to know who the manufacturer of the HC block is?


The T4 isn't compatible with the Classified. It doesn't brick the card, but it just doesn't work properly because the Classified has a massive 14-phase VRM; numbers get reported incorrectly, etc.

-The Classified has its own voltage tool.
-Pins on the side of the card let you use a multimeter to take physical (hardware-level) readings for vcore, vmem and a few others.
-Also has an EVBot connection (EVGA doesn't make the EVBot anymore, but there is a new "EPower board" on the horizon with EVBot built in).
-Bigger power limit.
-Not affected by the EVGA thermal mod issue.

EVBot lets you connect a little controller directly to your card (it plugs in near the PCI-E power connectors) and control voltages directly.

They're hard to find on eBay, and if you do find one it won't be cheap (since they're rare). But hopefully, if this EPower comes out anytime soon, my dreams will be fulfilled. lol You can get updates on its release from kingpincooling.com

HC block, not sure. I don't think it's EK though. Maybe Koolance? You'd have to do a bit of research on that, I'm afraid. :-(


----------



## kevindd992002

Interesting. So with EVBOT, you can bypass the voltage limit imposed in the Classified's BIOS? Anybody tried using that yet?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Interesting. So with EVBOT, you can bypass the voltage limit imposed in the Classified's BIOS? Anybody tried using that yet?


Yes,

But:
They are incredibly difficult to find.

If you do manage to get hold of one on eBay it won't be cheap (as they're rare).

EVGA doesn't make them anymore, not since Nvidia ordered them to cut all EVBot support back in the 7-series days.

However, they've found a way around it by sneaking EVBot into the new EPower board (so Kingpin says on his forums), but we're still waiting on its release.

If you manage to find an EVBot I'd probably buy it off you lol (unless the new EPower is released by then and the rumours of built-in EVBot are true).

But I can't think why they wouldn't be... I mean, EVGA must have included EVBot support on the 1080 for a reason, right?


----------



## TK421

How do you set the 3DMark 10-minute stress test to run indefinitely?

I have the paid edition.


----------



## Dragonsyph

Quote:


> Originally Posted by *TK421*
> 
> How do you set 3dmark 10 min stress test to run indefinitely?
> 
> I have paid edition.


Go to the custom run tab and hit the loop button.


----------



## hotrod717

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes,
> 
> But:
> They are incredibly difficult to find.
> 
> If you do manage to get a hold of one on Ebay it won't be cheap (as they're rare)
> 
> EVGA doesn't make them anymore. Not since Nvidia ordered them to cut all evbot support in the 7 series days.
> 
> However they're found a way around it, by sneaking evbot into new epower board (so Kingpin says at his forums) but we're still waiting on its release
> 
> If you manage to find an evbot i'd probably buy it off you lol (unless new Epower is released by then) and rumours of built-in evbot are true.
> 
> but I can't think why they wouldn't be.. I mean EVGA must have included evbot support on 1080 for a reason, right?


??? EVBots were still being used on 980 Tis with a firmware update, as well as with additional control through the OS via the Classified voltage tool.

Here is the firmware for the 980 KP/Classy: http://forum.kingpincooling.com/showthread.php?t=2977

and the 980 Ti KP/Classy firmware at the bottom (980i57hex): http://forum.kingpincooling.com/showthread.php?t=3820

FYI: from the 470 through the 980 Ti (or the 6970 through the 290X in terms of AMD), Matrix and/or Lightning/Hawk cards could be fully unlocked through an OS app (Afterburner/GPU Tweak) and/or hotwire in the case of Matrix cards, where available for that generation. No solder needed. Perhaps a conductive pen in the case of the Matrix.


----------



## KickAssCop

Time to sell these cards for the potential to nab a Ti in March?


----------



## pez

Quote:


> Originally Posted by *fat4l*
> 
> I see you guys are testing Heaven etc for stability.
> I dont find it that stressful.
> 
> Go and loop 3dMark Firestrike Extreme/ultra 2.nd test...try to pass 20 times for example.
> No crashes, it just cancels the test


Sometimes games are more stressful to GPUs than benchmarks. Crysis Warhead was always a good way to test my GPU OCs (and even CPU OCs) previously. It seemed to stress the system as a whole better than CPU or GPU stress tests did.

You might pass 3,000 tests in Heaven or FSU only to find Crysis 3 will bring your system to its knees.









----------



## TK421

Quote:


> Originally Posted by *Dragonsyph*
> 
> Go to custom run tab and hit the loop button.


thanks


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes,
> 
> But:
> They are incredibly difficult to find.
> 
> If you do manage to get a hold of one on Ebay it won't be cheap (as they're rare)
> 
> EVGA doesn't make them anymore. Not since Nvidia ordered them to cut all evbot support in the 7 series days.
> 
> However they're found a way around it, by sneaking evbot into new epower board (so Kingpin says at his forums) but we're still waiting on its release
> 
> If you manage to find an evbot i'd probably buy it off you lol (unless new Epower is released by then) and rumours of built-in evbot are true.
> 
> but I can't think why they wouldn't be.. I mean EVGA must have included evbot support on 1080 for a reason, right?


Ok, gotcha.

@All

Which is a better option?

1.) EVGA GTX 1080 FTW + EK Waterblock + EK Backplate
2.) EVGA GTX 1080 FTW Hydro Copper

Here are my concerns:

1.) In terms of resale value, won't option 1 be easier to sell in the future, since I could sell the 1080 FTW (with the stock air cooler) faster than a 1080 FTW Hydro Copper?

2.) Won't option 1 have better cooling performance in general because it has an EK WB?


----------



## jura11

Hi there

The earlier Hydro Copper was made by EK, but I'm not sure who is making them this round; maybe Swiftech or another company.

Hope this helps

Thanks, Jura


----------



## pfinch

So, has anyone tried the T4 with a Zotac GTX 1080 AMP Extreme?
With the exact same settings I sometimes get 'red flashing dots'.
Only a reboot removes them.


----------



## kevindd992002

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Earlier Hydro Copper has been made by EK but in this round not sure who is making them maybe Swiftech or other company
> 
> Hope this helps
> 
> Thanks, Jura


Yeah. I emailed them just now and this is the response I got:

Code:



Hi,

Unfortunately we do not have any info to provide on the maker of the waterblock. I apologize for any inconvenience.

Regards,
EVGA

I don't get it. Why would they not divulge that info?


----------



## jura11

Quote:


> Originally Posted by *kevindd992002*
> 
> Yeah. I emailed them just now and this is the response I got:
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> Hi,
> 
> Unfortunately we do not have any info to provide on the maker of the waterblock. I apologize for any inconvenience.
> 
> Regards,
> EVGA
> 
> I don't get it. Why would they not divulge that info?


Hi there

Not sure why they don't disclose this information or why they're so secretive about this

I would ask people who are using these blocks what temps they're getting.

As for the EK block, my temps are pretty good: I haven't seen higher than 40-42°C while folding, and that's a GTX 1080 at 2164MHz and a Titan X Maxwell at 1469MHz with a water delta of 8-9°C. The CPU has some load, around 16-19%, and CPU temperature has been around 55-58°C on the PKG; I will post a screenshot later on.

In gaming the CPU is in the mid 50s with 45% load on the PKG, and the GPU never breaks 38°C; 36-37°C is the usual temperature for the GTX 1080.

Hope this helps

Thanks, Jura


----------



## Coopiklaani

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Not sure why they don't disclose this information or why they're so secretive about this
> 
> I would ask people who using these blocks what temps they're getting
> 
> As per EK block, my temps are pretty much good, not seen higher than 40-42°C with [email protected] and that's GTX1080 with 2164MHz and Titan X Maxwell with 1469MHz with water delta at 8-9°C,CPU has have some load around 16-19% and CPU temperature has been around 55-58°C on PKG ,will post screenshot later on
> 
> In gaming CPU is at mid 50's with 45% load on PKG and GPU never breaks 38°C, usually around 36-37°C is temperature for GTX1080
> 
> Hope this helps
> 
> Thanks, Jura


I'm using an EKWB full-cover block with a D5 pump.
At 3800 RPM pump speed, here are the GPU - water delta temps I get:
@~180w, ~6c
@~220w, ~8c
@~280w, ~10c
@~350w, ~13c

water out temp- water in temp is about 1c
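Those four data points can be sanity-checked with a quick calculation: the GPU-water delta should grow roughly linearly with power, and the slope is the block's effective thermal resistance in °C/W. A rough sketch in Python (the helper name and the 300 W example are mine, not from the post):

```python
# Sanity check on the reported numbers: the GPU-to-water delta scales roughly
# linearly with power, and the slope is the block's effective thermal
# resistance in degrees C per watt.

def thermal_resistance(power_w, delta_c):
    """Effective thermal resistance (C/W) implied by one data point."""
    return delta_c / power_w

# (power in watts, GPU - water delta in C) as reported above
points = [(180, 6), (220, 8), (280, 10), (350, 13)]

resistances = [thermal_resistance(p, d) for p, d in points]

# All four points land in a narrow band, which is what you'd expect from a
# decent full-cover block behaving linearly.
avg_r = sum(resistances) / len(resistances)
print(f"average effective resistance: {avg_r:.3f} C/W")

# With that figure you can estimate the delta at some other load:
est_delta_300w = avg_r * 300
print(f"estimated GPU - water delta at 300 W: {est_delta_300w:.1f} C")
```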


----------



## jura11

Here is a screenshot from today's folding, during the morning



I've never seen such GPU power draw on my GTX 1080, and I do renders with Iray, Octane or Poser SuperFly; it's usually 400W for both GPUs, not for a single one.









Hope this helps

Thanks,Jura


----------



## hertz9753

Quote:


> Originally Posted by *jura11*
> 
> Here is screenshot from today [email protected],during the morning
> 
> 
> 
> Such GPU Power I've never seen on my GTX1080 and I do renders with IRAY,Octane or Poser SuperFly,usually 400W for both GPU,not for single one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps
> 
> Thanks,Jura


Folding? You should join my team for the FFW competition. It ends on Sunday, and you don't have to fold 24/7.

http://folding.extremeoverclocking.com/user_summary.php?s=&u=725761

http://folding.axihub.ca/ffw_signup.php

All you have to do is change the team name to Second Hand Hounds and enter your username and folding name.


----------



## jura11

Quote:


> Originally Posted by *hertz9753*
> 
> Folding? you should join my team on the FFW competition. It ends on Sunday and you don't have to fold 24/7.
> 
> http://folding.extremeoverclocking.com/user_summary.php?s=&u=725761
> 
> http://folding.axihub.ca/ffw_signup.php
> 
> All you have to do is change the team name to Second Hand Hounds and enter your username and folding name.


Hi there

I'm already folding with Team BBQ, and I'm under the OCN team, I think.

http://folding.axihub.ca/displayone.php?user=jura11

Hope this helps

Thanks,Jura


----------



## hertz9753

You are on the list...

http://folding.axihub.ca/ffw_teams.php


----------



## fat4l

Quote:


> Originally Posted by *pez*
> 
> Sometimes games are more stressful to GPUs than benchmarks. Crysis Warhead was always a good way to test my GPU OC's (and even CPU OC's) previously. It seemed to stress the system wholly better than CPU or GPU stress tests did.
> 
> You might pass 3,000 tests in Heaven or FSU only to find Crysis 3 will bring your system to its' knees
> 
> 
> 
> 
> 
> 
> 
> .


This is exactly why I say: test the SECOND test of 3DMark Firestrike Extreme/Ultra, "only" that.
Loop it for 30 mins (custom tab, 2nd test, loop on).
If you pass this, then your games should be pretty much stable, or very close to stable.

Other tests or benchmarks are easy to pass. There's no real point spending time there and getting excited that you can loop Heaven for 7 days when the OC will still fail in-game.


----------



## emreonal69

Quote:


> Originally Posted by *nrpeyton*
> 
> Are you able to use that to check the power draw in watts? i'd love to see how it compares to the onscreen reading in hwinfo64, using my multimeter.
> 
> so far I've only used it to check vcore and mem voltage using the pins especially for it on the classified.
> 
> *Anyone able to hazard a guess what "Loadline Trim" and "Switching Period" does in this 1080 voltage tool?
> *
> 
> full size - right click & open new tab
> 
> -At idle at 200 KHz (switching period) card is drawing 15w at 0.625v
> -At idle at 2000Khz (switching period) card is drawing 30w but voltage is still 0.625v (confirmed with multi-meter)
> 
> power draw measured with hwinfo64 in watts
> 
> Doesn't make sense to me yet,
> 
> Anyone?


Where can I find this voltage tool, and does it work on the FTW?


----------



## Loladinas

The topic has probably been discussed before, but trawling through nearly a thousand pages is a bit much.

I have a chance to get my hands on an EVGA SC 1080 rather cheaply. How big a disadvantage are the five-phase VRM, the lack of VRM cooling, and the single 8-pin for power delivery when overclocking? Can it get enough juice to hit the same overclocks as other cards (without blowing up, preferably)? Does replacing the thermal pads fix the cooling issues, or only mitigate them slightly?


----------



## GamerDork

Quote:


> Originally Posted by *Loladinas*
> 
> The topic has probably been discussed before, but trawling through nearly a thousand pages is a bit much.
> 
> I have a chance to get my hands on an EVGA SC 1080 rather cheaply; how big of a disadvantage the whole five phase VRM, lack of VRM cooling and only one 8pin for power delivery is, when overclocking? Can it get enough juice to hit the same overclocks as other cards (without blowing up, preferably)? Does replacing thermal pads fix the cooling issues or only mitigate it slightly?


I had GTX 1080 SCs in SLI; they hit 2050MHz each in SLI, though one card OC'd better than the other. The better of the two could handle 2150MHz.

After selling those I decided to try the FTW edition 1080s. I now have those in SLI and am still limited to a 2050MHz OC in SLI. One of those overclocks much better than the other as well: it can handle over 2200MHz @ 1053mv and stays under 65°C, but the other card uses 1095mv, runs much warmer and cannot handle anything above the 2050MHz area.

I don't see any advantage to the FTW cards with the extra power phases/plugs.

*Edit to add*

I performed the thermal pad fix/mod on the SCs, and now on the FTW cards, and saw no change in temps before and after.


----------



## Vellinious

Quote:


> Originally Posted by *GamerDork*
> 
> I had GTX 1080 SC's in SLi, they hit 2050MHZ each in SLI, one card OC'd better than the other. The better of the two could handle 2150MHZ.
> 
> After selling those I decided to try the FTW edition 1080's, I now have those in SLI and amd still limited to 2050Mhz OC in SLi. One of those overclocks much better than the other as well.. can handle over 2200Mhz @ 1053mv and stays under 65*C, but the other card uses 1095mv runs much warmer and cannot handle anything above the 2050mhz area.
> 
> I don't see any advantage to the FTW cards with the extra power phases/plugs.
> 
> *Edit to add*
> 
> I performed the thermal pad fix/mod to the SC's and now the FTW cards and saw no change in temp's before and after.


The only real advantage the FTW has over the SC is the increased power limit. If we had a Pascal BIOS editor, then the dual BIOS would have come in handy, but......we don't.


----------



## Coopiklaani

Quote:


> Originally Posted by *Vellinious*
> 
> The only real advantage the FTW has over the SC, is the increased power limit. If we had a Pascal Bios Editor, then the dual bios would have come in handy, but......we don't.


But we have the T4 BIOS. The best GTX 1080 ever, if you don't mind losing one DP port.


----------



## Dragonsyph

Anyone ever use K mode and does it provide any real benefits?


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Anyone ever use K mode and does it provide any real benefits?


You mean KBoost in PCX? It doesn't really add any benefits anymore.


----------



## Dragonsyph

Quote:


> Originally Posted by *Vellinious*
> 
> You mean KBoost in PCX? It doesn't really add any benefits any more.


Yeah, it's just a K button in EVGA Precision.


----------



## Vellinious

Quote:


> Originally Posted by *Dragonsyph*
> 
> Ya its just a K button in EVGA precision.


It locks the GPU at the boost clocks that you have set. I'm not sure how boost 3.0 acts with it, though.


----------



## TechSilver13

The K button in EVGA's PX does lock the core and memory at max speed, which helps when going for those suicide benchmark runs. Other than that, it is pointless in my eyes.


----------



## SiriusLeo

Just performed the shunt mod on my FE GTX 1080, (my Before and After tests show a reduction of *34%* TDP). At first it all appeared to be working well but after a few Time Spy runs I noticed the scores were low, (around *7800* running the same core/mem clocks that got me *8600* BEFORE the shunt mod).

Here is the log from Time Spy Test One running *1.094v* @ *2278core*


Yeah, I can no longer bench over 8000 after the mod, no matter what I do. There's no way these numbers can be correct. I can loop through Test One and Test Two in Time Spy without dropping core frequency (even at 2278mhz on the core), yet performance has decreased?
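For anyone following along, the arithmetic behind why a shunt mod changes the *reported* TDP can be sketched like this. This is a back-of-the-envelope model, not a how-to, and the 5 mOhm stock / 10 mOhm added resistor values are assumptions for illustration, not measurements from SiriusLeo's card:

```python
# Back-of-the-envelope shunt mod math (a sketch, not a guide): the power
# controller infers current from the voltage drop across a known shunt
# resistor.  Putting a resistor in parallel lowers the effective shunt
# resistance, so the sensed voltage -- and the reported power -- scale down
# by R_eff / R_stock while the actual draw is unchanged.

def parallel(r1_mohm, r2_mohm):
    """Equivalent resistance of two resistors in parallel (milliohms)."""
    return (r1_mohm * r2_mohm) / (r1_mohm + r2_mohm)

def reported_power(actual_w, r_stock_mohm, r_added_mohm):
    """Power the card *reports* after the mod, for a given actual draw."""
    r_eff = parallel(r_stock_mohm, r_added_mohm)
    return actual_w * r_eff / r_stock_mohm

# Assumed example values: a 5 mOhm stock shunt with 10 mOhm placed across it
# gives a ~33% drop in reported power, close to the ~34% seen above.
print(parallel(5, 10) / 5)            # sense ratio, ~0.67
print(reported_power(216, 5, 10))     # a true 216 W draw reads as 144 W
```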


----------



## GamerDork

Quote:


> Originally Posted by *Dragonsyph*
> 
> Anyone ever use K mode and does it provide any real benefits?


It did absolutely nothing for me; my Time Spy/Firestrike scores were exactly the same, as was in-game FPS, and my temps were considerably higher.


----------



## schoolofmonkey

Hey guys.

I'm wondering: is it safe to use one of those split PCI-E power cables on a GTX 1080 Strix (non-OC model)? They have an 8 + 6 power connector. Or should I just use two separate cables?
It'll be running from a Corsair HX850i PSU.


----------



## Insan1tyOne

Hello Everyone,

I have come across a fairly intriguing issue, so I figured I would share it. A couple of days ago, various HTML5-based players started showing up for me like THIS (basically all cyan and pink):



Disabling the HTML5 player solves the issue on Twitch.TV, but for other HTML5 players like YouTube, OpenLoad and Mp4Upload there is no solution. Just for reference, I am using Google Chrome version 58.0.3013.3 (Dev) 64-bit, and am obviously running an Nvidia GTX 1080 with the latest drivers (378.72 Hotfix). I tried rolling back to an earlier driver where I know for a fact that I did _not_ experience this issue (378.49), and it did not solve the problem.

I have spoken with one other user so far who is experiencing this issue and we have no idea what to make of it / who to report it to. At this point I do not think this is a driver-related issue. What do you all think / can any of you replicate this issue with your setups?

Thanks!

- Insan1tyOne


----------



## TechSilver13

It will be more than fine. Testing showed that the only card you probably shouldn't use it on is a single card with dual AMD GPUs (and this was years ago, when power draw was insane). You are good to go, my friend.


----------



## Coopiklaani

Quote:


> Originally Posted by *Dragonsyph*
> 
> Anyone ever use K mode and does it provide any real benefits?


It does nothing to GPU Boost 3.0. It essentially forces your card into max performance mode, i.e. no power saving. It cannot override the temp limit, voltage limit or TDP limit...


----------



## Coopiklaani

Quote:


> Originally Posted by *SiriusLeo*
> 
> Just performed the shunt mod on my FE GTX 1080, (my Before and After tests show a reduction of *34%* TDP). At first it all appeared to be working well but after a few Time Spy runs I noticed the scores were low, (around *7800* running the same core/mem clocks that got me *8600* BEFORE the shunt mod).
> 
> Here is the log from Time Spy Test One running *1.094v* @ *2278core*
> 
> 
> Yeah, I can no longer bench over 8000 after the mod no matter what I do. There's no way these numbers can be correct. I can loop through Test One and Test Two in Time Spy without dropping core frequency, (even at 2278mhz on the core) yet performance has decreased?


Maybe your card isn't stable at 2278MHz? How much does it score at 2100MHz?


----------



## SiriusLeo

Quote:


> Originally Posted by *Coopiklaani*
> 
> Maybe your card isn't stable at 2278MHz? How much does it score at 2100MHz?


*2273* is definitely stable. I'll post more after I gather more information but, apparently, performing the shunt mod changes the foundation of how overclocking works (it certainly did for me at least). The method I was using before the shunt mod was no longer effective *AT ALL* after the mod. I mean, my numbers were registering in all my logs, but the results weren't reflecting those numbers. An example... I have two logs, both registering the *SAME* numbers for the *SAME* timestamps, but the Time Spy scores vary by nearly *1000*... O_O ... I was so confused...

Now that I've figured it out though, I can start to push it further.

Here is my first *REAL* run after the shunt mod. *8700* Graphics Score with *2265* Core.


----------



## kevindd992002

Quote:


> Originally Posted by *Vellinious*
> 
> The only real advantage the FTW has over the SC, is the increased power limit. If we had a Pascal Bios Editor, then the dual bios would have come in handy, but......we don't.


Does that mean that if the SC and FTW are both watercooled, and given they both do well in the silicon lottery, the FTW would still overclock further?


----------



## Dragonsyph

Thanks everyone for responding; looks like K mode is no good and I won't use it.


----------



## SiriusLeo

This is the log for my full validated Time Spy run at *2290mhz* core and *5508mhz* ram.

https://docs.google.com/spreadsheets/d/12JNfo2AxpO2RTIENaiSJvQ4rhT20T2lUb6XunHoQRVU/pubhtml

This is the log for my full validated Time Spy run at *2278mhz* core and *5508mhz* ram.

https://docs.google.com/spreadsheets/d/1P0g_GjD2seSH66eJDdBCG4dpib2uxpaB5Py8FqhXaSw/pubhtml

Overall this is about the limit of my card currently. It seems like *2290mhz* on the core is the maximum stable clock I can achieve at the max stock voltage of *1.094v*. After the shunt mod all my clocks have stabilized dramatically, even up to *2278mhz*. I saw roughly a *30-35%* reduction in reported TDP, the overall power draw of my system while running Time Spy went up about *80watts*, and temps while running Time Spy at maximum sustained clocks went up about *3c*.

Had some issues overclocking after the shunt mod but I found that arching the voltage curve per voltage point in MSI Afterburner resolved the issue as seen in the image below...



What confused me was that BOTH those curves resulted in identical logs. BOTH logs would read stable voltage and stable clocks, yet the scores would differ drastically. Another thing I noticed was that MSI Afterburner was reading my TDP about *15-20* points higher than GPU-Z and HWiNFO64. I also noticed that when MSI Afterburner registered my TDP going over *100%*, my core clocks would dip (even though at that SAME point GPU-Z and HWiNFO64 read the card's TDP at a much lower *85%*). So the card was reacting to MSI's readings - not GPU-Z's or HWiNFO64's.
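One *possible* explanation for the tools disagreeing (a guess on my part, not a confirmed account of how Afterburner or GPU-Z work internally): a "TDP %" readout is just measured watts divided by whatever baseline power limit the tool assumes, so two tools with different assumed baselines will disagree on the percentage even when they see the exact same draw. A quick sketch with made-up numbers:

```python
# Hypothetical illustration of how two monitoring tools can show TDP figures
# 15-20 points apart for the same board power: each converts watts to a
# percentage against its own assumed baseline.

def tdp_percent(measured_w, assumed_limit_w):
    """TDP as a percentage of an assumed base power limit."""
    return 100.0 * measured_w / assumed_limit_w

draw = 183.0                      # hypothetical sensed board power in watts
print(tdp_percent(draw, 180.0))   # tool assuming a 180 W base: >100%, throttles
print(tdp_percent(draw, 215.0))   # tool assuming a 215 W base: ~85%, headroom
```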


----------



## Vellinious

Quote:


> Originally Posted by *kevindd992002*
> 
> Does that mean that if the SC and FTW are both watercooled and given they are both good in the silicon lottery, the FTW would still overclock further?


No. It just wouldn't be limited as much by the power limit, if at all. I haven't been able to push either of my GPUs hard enough to reach the power limit yet. The SCs I had originally would, though, with air cooling.


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> The only real advantage the FTW has over the SC, is the increased power limit. If we had a Pascal Bios Editor, then the dual bios would have come in handy, but......we don't.
> 
> 
> 
> Does that mean that if the SC and FTW are both watercooled and given they are both good in the silicon lottery, the FTW would still overclock further?

What it means is that with water cooling your card won't throttle from high temps and, yes, it will effectively overclock better with the increased voltages of a custom BIOS.


----------



## nrpeyton

Stuck using an old Radeon 5770 due to my 1080 still being "wet" after having to literally leave it to "thaw out" after my experiment yesterday.

Card was still running perfectly stable before I switched it off.

I knew as soon as I switched it off the ice would melt and possibly cause a short.

Here's the video lol:




_(Thick slab of ice in the gap between block and card)_


----------



## kevindd992002

Quote:


> Originally Posted by *SiriusLeo*
> 
> This is the log for my full validated Time Spy run at *2290mhz* core and *5508mhz* ram.
> 
> https://docs.google.com/spreadsheets/d/12JNfo2AxpO2RTIENaiSJvQ4rhT20T2lUb6XunHoQRVU/pubhtml
> 
> This is the log for my full validated Time Spy run at *2278mhz* core and *5508mhz* ram.
> 
> https://docs.google.com/spreadsheets/d/1P0g_GjD2seSH66eJDdBCG4dpib2uxpaB5Py8FqhXaSw/pubhtml
> 
> Overall this is about the limit of my card currently. Seems like *2290mhz* on the core is the maximum stable clock I can achieve at the max stock voltage of *1.094v*. After the shunt mod all my clocks have stabled out dramatically, even up to *2278mhz*. I witnessed overall about a *30-35%* reduction of my power limit and overall power draw of my system while running Time Spy went up about *80watts*. Temps while running Time Spy at maximum sustained clocks after the shunt mod went up about *3c*.
> 
> Had some issues overclocking after the shunt mod but I found that arching the voltage curve per voltage point in MSI Afterburner resolved the issue as seen in the image below...
> 
> 
> 
> What confused me was the fact that BOTH those curves resulted in my logs being identical. BOTH logs would read stable voltage and stable clocks yet the scores would differ drastically. Another thing I noticed was that MSI Afterburner was reading my TDP about *15-20* points higher than GPUZ and HWiNFO64. I also noticed that when MSI Afterburner registered my TDP hit over *100%* my core clocks would dip, (even though at that SAME point GPUZ and HWiNFO64 read the card TDP at a much lower *85%*). So, the card was reacting to MSI's readings - not GPUZ or HWiNFO64's.


How did arcing the voltage curve per voltage point solve the issue? I'm curious about the details.








Quote:


> Originally Posted by *Vellinious*
> 
> No. It just wouldn't be limited as much by the power limit. If it all. I haven't been able to push either of my GPUs hard enough to reach the power limit yet. The SCs I had originally would, though. WIth air cooling.


Quote:


> Originally Posted by *KedarWolf*
> 
> What it means with water cooling is your card won't throttle with high temps and yes, effectively overclock better with the increased voltages of a custom BIOS.


Ok, so I guess to be safe it's still better to get the FTW. The FTW Hydro Copper was $650 a few days ago, but in "ships within 1 to 2 months" status. I asked Amazon support if that was the final price; they said yes, but there was still no exact shipping date. Today there are a few units left, but with a price tag of $799.99! I should've pulled the trigger a few days ago when I had the chance, dammit.


----------



## nrpeyton

Quote:


> Originally Posted by *SiriusLeo*
> 
> Just performed the shunt mod on my FE GTX 1080, (my Before and After tests show a reduction of *34%* TDP). At first it all appeared to be working well but after a few Time Spy runs I noticed the scores were low, (around *7800* running the same core/mem clocks that got me *8600* BEFORE the shunt mod).
> 
> Here is the log from Time Spy Test One running *1.094v* @ *2278core*
> 
> 
> Yeah, I can no longer bench over 8000 after the mod no matter what I do. There's no way these numbers can be correct. I can loop through Test One and Test Two in Time Spy without dropping core frequency, (even at 2278mhz on the core) yet performance has decreased?


Very interesting.

I've noticed that behaviour since I got my 1080.

So what you are saying is: even though your voltage stayed at 1.093v the entire run (and your clock stayed at 2278/2290), _the lower points on the voltage/frequency curve still impacted your score_, even though the card never 'used' those voltages/frequencies during the run?

That sounds familiar.

Not only that, 'removing' the power limit on your card (with the shunt mod) actually LOWERED your score?

That's how I've understood your 2 posts.. if you can clarify my understanding, then I think I might have a theory about 'why'... it's a long shot, but a theory nonetheless.

Quote:


> Originally Posted by *GamerDork*
> 
> I had GTX 1080 SC's in SLi, they hit 2050MHZ each in SLI, one card OC'd better than the other. The better of the two could handle 2150MHZ.
> 
> After selling those I decided to try the FTW edition 1080's, I now have those in SLI and amd still limited to 2050Mhz OC in SLi. One of those overclocks much better than the other as well.. can handle over 2200Mhz @ 1053mv and stays under 65*C, but the other card uses 1095mv runs much warmer and cannot handle anything above the 2050mhz area.
> 
> I don't see any advantage to the FTW cards with the extra power phases/plugs.
> 
> *Edit to add*
> 
> I performed the thermal pad fix/mod to the SC's and now the FTW cards and saw no change in temp's before and after.


How were you measuring the temperature?

The thermal mod was not meant to help CORE temps (the temp your card displays in Windows). It was mainly meant to help VRM temps (and, to a small extent, memory temps too), which could help with memory overclocking a little.

It will also prolong the life of your card, since the VRM runs cooler.

Under EXTREME stress conditions (i.e. prolonged Furmark) it *may* be possible to get a difference in core temp from a cooler VRM, since less heat will be conducted across the circuit board of your card over to the core.

Most cards (except EVGA's new ICX cards) don't have temperature sensors for the memory & VRM. None of our cards have that.

I believe this will all be changing very soon, as other companies follow EVGA.

Up until now (EVGA's new ICX), only AMD offered temp monitoring of mem & VRM in Windows.

_P.S.
If you really wanted to measure the difference, you could stick temp probes to the corresponding components on the back of your PCB (but that would be a bit pointless now, as you've already done the mod, so you'd have no baseline)._


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Does that mean that if the SC and FTW are both watercooled and given they are both good in the silicon lottery, the FTW would still overclock further?


It may overclock the same (i.e. you'll be able to achieve the same frequencies).

However, depending on how demanding the software you're running is, if you hit the power limit GPU Boost 3.0 will automatically (and momentarily) lower your overclock to bring the card back within its maximum power envelope.

If you were running an app that continuously draws more power than your limit, you'd never see your overclock at all (despite setting it), as GPU Boost 3.0 would keep you throttled (down-clocked) the entire time.

However, there's not a lot of software out there that fully utilises the power limit on any card except the Founders Edition (reference), which only has a max of 180w, or 216w (max) with the power slider at 120%.

The safest bet is to go for something (*anything*) with a power limit GREATER than 217w and a compatible EK waterblock.

Even 215w would be fine, as once you apply the power overclock (power slider in MSI Afterburner) a card advertised with a 215w limit would then allow up to 258 watts, which is more than enough for *anything*.
_
*Unless* you wanted to do LN2 on the card in future._
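The slider arithmetic above is simple enough to sketch (the helper name is mine):

```python
# The effective power ceiling is just the card's advertised limit scaled by
# the power slider; GPU Boost throttles whenever draw would exceed it.

def effective_limit_w(base_limit_w, slider_percent):
    """Power ceiling in watts for a given power-slider setting."""
    return base_limit_w * slider_percent / 100.0

print(effective_limit_w(180, 120))   # Founders Edition at 120%: 216 W
print(effective_limit_w(215, 120))   # 215 W card at 120%: 258 W
```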


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> It may overclock the same, (I.E you'll be able to achieve the same frequencies).
> 
> However depending on the how demanding the software is that you're running, if you hit the power limit... GPU BOOST 3.0 will automatically (and momentarily) lower your overclock to bring the card back within its maximum power envelope.
> 
> If you were running an app that draws more power than your limit continuously, you'd never see your overclock at all (despite setting it), as GPU BOOST 3.0 would have you throttled (down-clocked) the entire time.
> 
> However there's not a lot of software out there that fully utilises the power limits on any card except the Founders Edition (reference). which only has a max of 180w, or 216w (max) with power slider at 120%.
> 
> Safest bet is go for something (*anything*) with a power limit GREATER than 217w with compatible EK waterblock.
> 
> Even a little greater, 225w would be fine.. as once you apply the power overclock (power slider in MSI afterburner) a card advertised with a 225w limit would then allow up to 270w, which is more than enough, for *anything*.
> _
> *Unless* you wanted to do LN2 on the card, in future._


Right.

Is there an AIB card that uses a reference board yet the power limit is more than 180W?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Right.
> 
> Is there an AIB card that uses a reference board yet the power limit is more than 180W?


I've edited my last post slightly: look for something with an advertised power limit over 215w (which allows up to 258 watts with the power slider at 120%).

Best way to find that out is make a list of maybe 10 reference models. Write down the power limit (if advertised).

If not advertised, head over to the BIOS database at TechPowerUp and look the card's BIOS up.

I know all EVGA's reference designs only have 180w limits due to 5-phase VRM.

Not sure about other companies (you could double-check though), but I think you'll be hard pressed to find a reference design (5-phase VRM) with a power limit greater than 180w.

The reason for that is safety of PCI-E power cable draw.

An 8-pin cable allows a draw of 150w. The PCI-E slot itself allows 75w. That's 225w total (only 9 watts of headroom over the 216w max) for a graphics card with just one power connector cable from the PSU.

And *all* reference designs have only one power connector (if they had two, they wouldn't be a reference design).
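The budget arithmetic, spelled out (spec figures as quoted above):

```python
# PCI-E power budget for a single-connector reference card,
# using the spec figures quoted in the post above.
PCIE_SLOT_W   = 75    # power the x16 slot itself may supply
EIGHT_PIN_W   = 150   # one 8-pin PCI-E cable
FE_MAX_DRAW_W = 216   # Founders Edition limit with slider at 120%

budget = PCIE_SLOT_W + EIGHT_PIN_W
print(budget)                   # 225
print(budget - FE_MAX_DRAW_W)   # 9  -> the "9 watts within spec" headroom
```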

This page shows that *all* of EVGA's reference designs feature only 180w limits:

http://www.evga.com/products/product.aspx?pn=08G-P4-5180-KR

*Edited twice* /\

P.S.
If I were you, I'd just grab the FTW and get an EK waterblock on it at a later date.

You get to enjoy and test the card with its stock heatsink/fan from the factory.

Then later (next month, after you get paid again) throw an EK block on it. That way you get the enjoyment of comparing the difference in performance you're able to achieve with watercooling.

You'll also learn something about the card when you take it apart.

EVGA is also the only company that gives you full warranty coverage with watercooling (if you took the stock heatsink off any other manufacturer's card you'd void the warranty).

Just do it mate, honestly - you'll be glad in the long run. Spend an extra 50 bux and grab the FTW, or spend an extra 100 bux and get the Classified.

You might be strapped for cash for 3 weeks, but you're going to spend most of that time by your PC, enjoying your new card anyway! 

*i7 6700k owners*

Hi lads, need a favour from one of you:

Could you do a Firestrike (1st, basic 1080p *physics* test) please, at *stock clocks?*

Make it a custom for quickness if you want. ;-) Just the *physics* test, is all I need.









And post the score:
*1)* With hyper-threading on
*2)* With hyper-threading off

_Rep+ if anyone can do this lil favour for me, please ;-)_


----------



## ucode

Quote:


> Originally Posted by *SiriusLeo*
> 
> Here is my first *REAL* run after the shunt mod. *8700* Graphics Score with *2265* Core.


Nice. Wish mine could clock that high at 1.094V. A quick run with x-flash T4 and GPU clock at 2240MHz needs 1.2V to get me near.
Hitting 55C.



With a mod instead of the x-flash it should hopefully break a 9000 overall score. Not so bad for an FE and a $300 CPU; however, the physics test doesn't seem to fully load the cores. Shame that Intel limited its clock, but at least MT is nice, and there's the option of adding another CPU with the right board.


----------



## SiriusLeo

Quote:


> Originally Posted by *kevindd992002*
> 
> How did arching the voltage curve per voltage point solve the issue? I'm curious on the details


I'm not entirely sure how it fixed it - I only know that it did, or at least that it was a step in the right direction. As shown in the voltage curve pictures, my logs would read the same: the card thinks it's running a steady voltage and core clock in both tests. The only difference is the variation in Time Spy scores. My guess is that the logs are wrong and the card is actually running slower than displayed when I don't adjust the voltage curve per voltage point.


----------



## SiriusLeo

Quote:


> Originally Posted by *nrpeyton*
> 
> *So what you are saying is, even though your voltage stayed at 1.093v the entire run (and your clock stayed at 2278/2290), the lower points on the voltage/frequency curve still impacted your score? Even though the card never 'used' those voltages/frequencies during the run.
> 
> Not only that, 'removing' the power limit on your card (with the SHUNT mod) actually LOWERED your score?*


Yes, this is exactly what I don't understand. This issue only happened after the shunt mod. Before the mod I was able to switch between a subtle voltage curve or a spiked voltage curve, (like the example in the picture I posted) and my scores would only vary slightly. BEFORE the shunt mod lower voltages like *0.981* preferred a spiked voltage curve and higher voltages like *1.093* preferred a subtle voltage curve. After the shunt mod I have to produce a subtle voltage curve or my performance/scores will suffer.

Another thing I don't get is why GPUZ and HWiNFO64 are reading my TDP lower than MSI (by about *10-15%*), and the card is reacting to MSI's reading, not to GPUZ's and HWiNFO64's. I would assume the actual TDP is closer to GPUZ and HWiNFO, but the card seems to think MSI's readings are more accurate. For example, if I set the max power limit to *90%* and clock the card up to *2200core*, the readings from GPUZ and HWiNFO will never exceed *90%* after the shunt mod. However, if I look at the readout in MSI Afterburner, I find that not only is the power limit exceeding *90%*, the card will downclock itself every time it nears that *90%* mark, even though GPUZ and HWiNFO aren't reading anywhere near *90%* TDP in my logs.

I might just make a video - it's hard to explain these things. All in all, the shunt mod definitely increased performance and stability. I just feel like all my readings are misleading and I'm having to adjust everything blindly, based solely on scores and FPS counters. I've managed to do that; I'm just curious as to why my readings are wrong. Even HWiNFO is reading my power draw in watts wrong now: it states the card is pulling about *80w* less, even though my power draw from the wall has clearly gone up about *80w* when checked with an external meter.
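One plausible reason for the under-reading (speculation on my part, not from the post): the shunt mod parallels extra resistance across the current-sense shunts, so the same real current produces a smaller voltage drop and the card under-reports its draw. A sketch with assumed 5mΩ values (not the 1080's actual shunts):

```python
# Hypothetical illustration of why reported power drops after a shunt mod.
# The card infers current from the voltage drop across a sense resistor;
# paralleling it lowers the effective resistance, so the same real current
# produces a smaller drop and the card reports less power than it draws.
# The 5 mOhm figures are assumptions for illustration.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock_shunt = 0.005                      # ohms (assumed)
modded = parallel(stock_shunt, 0.005)    # ~0.0025 ohms after the mod

real_draw_w = 260.0
reported_w = real_draw_w * modded / stock_shunt
print(round(reported_w, 1))   # 130.0 -> card "sees" half the real draw
```

Which would line up with a card that genuinely draws more from the wall while its own sensors (and anything reading them) report less.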


----------



## SiriusLeo

Here is the best example I can give you regarding the performance/score anomaly. Look at the log and how the core, memory and voltage are all static through the tests, yet the FPS is off by 4! The logs are even showing less TDP and power draw for the "spiked" curve, yet it's holding steady at the same core clock and voltage.


----------



## Coopiklaani

Quote:


> Originally Posted by *SiriusLeo*
> 
> This is the log for my full validated Time Spy run at *2290mhz* core and *5508mhz* ram.
> 
> https://docs.google.com/spreadsheets/d/12JNfo2AxpO2RTIENaiSJvQ4rhT20T2lUb6XunHoQRVU/pubhtml
> 
> This is the log for my full validated Time Spy run at *2278mhz* core and *5508mhz* ram.
> 
> https://docs.google.com/spreadsheets/d/1P0g_GjD2seSH66eJDdBCG4dpib2uxpaB5Py8FqhXaSw/pubhtml
> 
> Overall this is about the limit of my card currently. Seems like *2290mhz* on the core is the maximum stable clock I can achieve at the max stock voltage of *1.094v*. After the shunt mod all my clocks have stabled out dramatically, even up to *2278mhz*. I witnessed overall about a *30-35%* reduction of my power limit and overall power draw of my system while running Time Spy went up about *80watts*. Temps while running Time Spy at maximum sustained clocks after the shunt mod went up about *3c*.
> 
> Had some issues overclocking after the shunt mod but I found that arching the voltage curve per voltage point in MSI Afterburner resolved the issue as seen in the image below...
> 
> 
> 
> What confused me was the fact that BOTH those curves resulted in my logs being identical. BOTH logs would read stable voltage and stable clocks yet the scores would differ drastically. Another thing I noticed was that MSI Afterburner was reading my TDP about *15-20* points higher than GPUZ and HWiNFO64. I also noticed that when MSI Afterburner registered my TDP hit over *100%* my core clocks would dip, (even though at that SAME point GPUZ and HWiNFO64 read the card TDP at a much lower *85%*). So, the card was reacting to MSI's readings - not GPUZ or HWiNFO64's.


I have noticed this "problem" for quite a while, actually. It's not a problem with the shunt mod; it also happens to cards using the T4 bios. My theory is that although both curves give you the same solid core frequency, the GPU actually has a certain amount of flexibility in the actual frequency. The GPU frequency usually jumps between two adjacent frequency/voltage points randomly. In the "smooth" case, this means the GPU jumps between 2278 and 2264MHz; in the steep case, the GPU jumps between 1700 and 2278MHz... GPU-Z only reports the target GPU frequency, at a rather slow interval. The actual GPU frequency is sometimes different from the target frequency the GPU reports.
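If that theory is right, the score gap follows directly from the arithmetic. A toy model (clock values from the posts; the 50/50 split between adjacent points is an assumption):

```python
# Toy model of the theory above: the GPU randomly alternates between two
# adjacent V/F curve points while monitoring tools report only the target.
# Clock values come from the posts; the 50/50 duty split is assumed.

def avg_clock(high_mhz, low_mhz, frac_high=0.5):
    """Time-averaged effective clock while bouncing between two points."""
    return high_mhz * frac_high + low_mhz * (1 - frac_high)

smooth = avg_clock(2278, 2264)   # adjacent points on a "smooth" curve
steep  = avg_clock(2278, 1700)   # big drop on a "spiked" curve
print(smooth)   # 2271.0
print(steep)    # 1989.0 -> much lower real throughput, identical logs
```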


----------



## zipper17

Quote:


> Originally Posted by *SiriusLeo*
> 
> Here is the best example I can give you regarding the performance / score anomaly. Look at the log and how the core, memory and voltage are all static through the tests yet the FPS is off by 4! The logs are even showing less TDP and power draw for the, "spiked" curve yet it's holding steady at the same core clock and voltage.


This seems to happen to all Pascal cards: a significant drop in the curve points will make performance lower. It also happened to my 1070:
http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6540#post_25703006

This is a short speculative explanation quoted from another user:








http://www.overclock.net/t/1601546/official-nvidia-gtx-1070-owners-club/6560#post_25704868


----------



## pez

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Hey guys.
> 
> I'm wondering, is it safe to use one of those split PCI-e power cables on a GTX 1080 Strix (non-OC model)? They have an 8+6 power connector. Or should I just use 2 separate ones?
> It'll be running from a Corsair HX850i PSU.


Quote:


> Originally Posted by *TechSilver13*
> 
> It will be more than fine. Testing showed that the only card you probably shouldn't use it on is a single card with dual AMD GPUs (and this was years ago when power draw was insane). You are good to go, my friend.


To add to that further, this is less of an issue if you're using a PSU with a single rail vs. multi-rail. I don't think there's a modern GPU out now that would realistically overload (i.e. overheat) the actual wires.


----------



## TK421

Looking for some knowledgeable people about BIOS flashing here.

I have obtained a hardware flasher and can flash a GPU vbios without software.

I want to know if anyone could mod my vbios; it doesn't matter whether you can flash it with software or not, as I will force-flash it with hardware.

It can brick as many times as necessary; it doesn't have to work on the 1st try.

I can provide compensation for your work if needed!


----------



## ucode

@TK421 I did this some time ago with a Pascal GTX 1050Ti. No one seemed interested, but that seems to be the general case for me, so I'm not so interested in trying to explain things anymore. FWIW the nVidia driver does not like a modified Pascal VBIOS, or at least those regions that are sensitive to modification.



External EEPROM chip mod, the other 2 larger wires below were used to measure GPU voltage while using a dynamic voltage mod.

Hope you can find a VBIOS mod that will work. IIRC you know Johnksss from NBR and he had some modified VBIOS he wanted to flash shortly after Pascal launch. Might be worth sending him a PM.


----------



## TK421

Quote:


> Originally Posted by *ucode*
> 
> @TK421 I did this some time ago with a Pascal GTX 1050Ti. No one seemed interested, but that seems to be the general case for me, so I'm not so interested in trying to explain things anymore. FWIW the nVidia driver does not like a modified Pascal VBIOS, or at least those regions that are sensitive to modification.
> 
> 
> 
> External EEPROM chip mod, the other 2 larger wires below were used to measure GPU voltage while using a dynamic voltage mod.
> 
> Hope you can find a VBIOS mod that will work. IIRC you know Johnksss from NBR and he had some modified VBIOS he wanted to flash shortly after Pascal launch. Might be worth sending him a PM.


Oh yes, I know john very well.

I have one of those clamp type flasher, not really sure what's the technical term of it.


----------



## Loladinas

Quote:


> Originally Posted by *TK421*
> 
> not really sure what's the technical term of it.


EEPROM programmer, EEPROM clip?


----------



## TK421

Quote:


> Originally Posted by *Loladinas*
> 
> EEPROM programmer, EEPROM clip?


Probably that.


----------



## jmr71

Hello

I have a wrong clock reading in the monitor window.
Reinstalling the Nvidia drivers and GPU Tweak II does not solve the problem.

What is the real GPU Boost clock: 2100 MHz or 1980 MHz?

BF1test2.jpg 574k .jpg file


The problem described below, but it is not solved:

https://rog.asus.com/forum/showthread.php?90526-Asus-GPU-Tweak-and-Nvidia-378-49-Drivers-Bug

Please help. Thanks in advance.


----------



## GRABibus

Quote:


> Originally Posted by *jmr71*
> 
> Hello
> 
> I have Wrong clock reading in monitor window.
> Reinstalling the drivers Nvidia and GPUTweak II does not solve the problem.
> 
> What is the real GPU Boost 2100 MHz or 1980 MHz?
> 
> BF1test2.jpg 574k .jpg file
> 
> 
> The problem described below, but it is not solved:
> 
> https://rog.asus.com/forum/showthread.php?90526-Asus-GPU-Tweak-and-Nvidia-378-49-Drivers-Bug
> 
> Please help. Thanks in advance.


Did you try overclocking with another tool, such as MSI AB v4.3.0, for example?
Same issue?


----------



## Dragonsyph

Quote:


> Originally Posted by *jmr71*
> 
> Hello
> 
> I have Wrong clock reading in monitor window.
> Reinstalling the drivers Nvidia and GPUTweak II does not solve the problem.
> 
> What is the real GPU Boost 2100 MHz or 1980 MHz?
> 
> BF1test2.jpg 574k .jpg file
> 
> 
> The problem described below, but it is not solved:
> 
> https://rog.asus.com/forum/showthread.php?90526-Asus-GPU-Tweak-and-Nvidia-378-49-Drivers-Bug
> 
> Please help. Thanks in advance.


You are at 2100MHz, as you can see in GPU-Z (GPU-Z is a real-time reading). With GPU Boost 3.0 your GPU will shift clock speeds based on which voltage it's at and what the environmental conditions are. It will boost as high as it can depending on the current voltage point.


----------



## pfinch

Hey guys,

For some time now I've been getting 'red popups' (artifacts) as soon as I set over 1.050v in intensive games/benchmarks (Time Spy, theHunter). Up to 1.05v everything works fine (2075 core at that voltage).
I didn't have this problem previously.
I get these glitches with both the T4 and the stock BIOS.

Do you know the problem or do you know what could cause it?

Already reinstalled the driver and MSI Afterburner (DDU).


----------



## Loladinas

Any thoughts regarding the KFA2 1080 EXOC? I'm having a hard time finding high-res pictures of it, but at least it has a 6+8 pin power input and a 6+2 phase VRM.


----------



## nrpeyton

Quote:


> Originally Posted by *TK421*
> 
> Looking for some knowledgable people about bios flashing here.
> 
> I have obtained myself a hardware flasher and can flash GPU vbios without software
> 
> I want to know if anyone could mod my vbios, it doesn't matter if you can or cannot flash it with software. I will force flash it with hardware.
> 
> Can brick as many times necessary, don't have to work on the 1st try.
> 
> I can provide a compensation for your work if needed!


No one has been able to mod a Pascal BIOS yet.

Nvidia has them locked down so hard that no one has been able to decrypt them yet (this is the rumour).

The only benefit I can see to you having the "hardware vbios flasher" is that you could try different BIOSes from other cards to see which one gets you better results, without worrying about bricking, as you can hardware-flash something else to fix it.
However, a lot of cards are shipped these days with dual-BIOS switches anyway, so many people already have this safety net. There still isn't a way to edit the BIOS.

Unfortunately, there are only two known BIOSes that go over 1.093v: an XOC HOF BIOS (new), and one that's been around for quite a while now, known as the XOC STRIX T4. The T4 (the latter) is compatible with most cards, from Founders Editions right up to the likes of the EVGA FTW... but it isn't compatible with the EVGA Classified or HOFs (I think it may be something to do with the PCB designs being so different, due to having so many more VRM phases). Those (the bigger cards) use the HOF BIOS or their own overclocking tools.


----------



## pfinch

Hey,
someone got T4 working on an EVGA FTW HYBRID?


----------



## Vellinious

Quote:


> Originally Posted by *pfinch*
> 
> Hey,
> someone got T4 working on an EVGA FTW HYBRID?


It works on the FTW, it should work on the FTW hybrid....they're the exact same card.


----------



## pfinch

Quote:


> Originally Posted by *Vellinious*
> 
> It works on the FTW, it should work on the FTW hybrid....they're the exact same card.


So it's a good decision to get one instead of an ASUS Strix? (option to step up to a 1080 Ti, etc.)


----------



## ssgwright

what overclocks are you guys getting? I did the shunt mod and I'm running the t4v2 bios and I'm able to hit 2164 game stable. Haven't messed around with benches yet since the shunt mod.


----------



## GRABibus

Quote:


> Originally Posted by *ssgwright*
> 
> what overclocks are you guys getting? I did the shunt mod and I'm running the t4v2 bios and I'm able to hit 2164 game stable. Haven't messed around with benches yet since the shunt mod.


I am stable 2202MHz without shunt mod and with t4 bios.

What is this t4v2 ?


----------



## ssgwright

Not sure what the difference is... it's linked a couple of pages back. Is there a trick to using the curve that I don't know about? What voltage are you running at 2202?


----------



## GRABibus

Quote:


> Originally Posted by *ssgwright*
> 
> not sure what the difference is... it's linked a couple pages back. Is there a trick to using the curve that I don't know about? What voltage are you running at 2202?


I am at 1.1V.

For the curve: I first set an offset of 210MHz in MSI AB.
Then I tweaked it, starting at 1.1V, to have a flat curve at 2202MHz until 1.2V.

Here is the curve:

http://www.casimages.com/img.php?i=17022210560417369814868622.png

The one below the curve is the default curve.
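For anyone wanting to picture the edit, it can be sketched as a transform on the V/F point list. The point values below are invented for illustration and Afterburner exposes no such API; this only shows the shape of the two steps (offset, then flatten):

```python
# Sketch of the curve edit described above: apply a global offset,
# then clamp everything at/above 1.1V to a flat 2202MHz.
# The voltage/clock points are made up for illustration only.

def edit_curve(points, offset_mhz=210, flat_from_v=1.100, flat_mhz=2202):
    out = []
    for volts, mhz in points:
        mhz += offset_mhz            # step 1: +210MHz offset everywhere
        if volts >= flat_from_v:
            mhz = flat_mhz           # step 2: flatten the top of the curve
        out.append((volts, mhz))
    return out

stock = [(1.000, 1885), (1.050, 1936), (1.100, 1975), (1.200, 2025)]
for v, f in edit_curve(stock):
    print(v, f)
# 1.0 2095 / 1.05 2146 / 1.1 2202 / 1.2 2202
```

The flat top is the point: above 1.1V the card can't request a higher (unstable) clock, so it holds 2202MHz instead of bouncing around.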


----------



## Vellinious

Quote:


> Originally Posted by *ssgwright*
> 
> not sure what the difference is... it's linked a couple pages back. Is there a trick to using the curve that I don't know about? What voltage are you running at 2202?


At 2202 with normal ambients (18-19c), I'm running 1.075v. The lower the ambients, the lower the voltage it takes for 2202 to run well.


----------



## ssgwright

Quote:


> Originally Posted by *GRABibus*
> 
> I am at 1.1V
> 
> For the curve : i have first set an offset of 210MHz in MSI AB.
> Then, I have tweaked starting at 1.1V to have a flat curve at 2202MHz until 1.2V
> 
> Here is the curve :
> 
> http://www.casimages.com/img.php?i=17022210560417369814868622.png
> 
> The one below the curve is the default curve


why do you set an offset of 210?


----------



## ucode

Quote:


> Originally Posted by *TK421*
> 
> Oh yes, I know john very well.


And of course Prema too









FYI these VBIOS are not encrypted, just signed. That means it's easy enough to find where to change things, but without signing, those changes get rejected. There is a mechanism to apply for a user key (Hulk certificate) from the manufacturer with nvflash, so that VBIOS tweaks can be applied by using the Hulk certificate to bypass the original signing, but as that never seemed to take off, it may well be abandoned now.

Do note there are something like 80 voltage points with Pascal, of which software such as Afterburner only shows 40, and it limits the top voltage point to 1.2V while the firmware actually goes to 1.24375V, so you would likely need to find some other software for the higher voltage points. I don't know if that 1.24375V firmware maximum can be increased.

Other than the possibility of increasing the video clock over the GPU clock, I'd personally go for a hardware mod.
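Those numbers imply how much of the range Afterburner hides. Assuming the common 6.25mV step (an assumption on my part, not stated in the post):

```python
# How many voltage points sit above Afterburner's 1.2V display cap,
# given the 1.24375V firmware maximum mentioned above and an
# ASSUMED 6.25mV step size (not confirmed in the post).
STEP_V = 0.00625
AB_CAP = 1.20000
FW_MAX = 1.24375

hidden = round((FW_MAX - AB_CAP) / STEP_V)
print(hidden)   # 7 points Afterburner never shows
print([round(AB_CAP + i * STEP_V, 5) for i in range(1, hidden + 1)])
```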


----------



## alucardis666

Deleted


----------



## KickAssCop

So which one of you guys are *upgrading to the 1080 Ti?


----------



## GRABibus

Quote:


> Originally Posted by *ssgwright*
> 
> why do you set an offset of 210?


Because if I don't do it, and tweak only starting at 1.1v, then I get a Time Spy graphics score of roughly 7800.
If I do it as explained above (with the offset), the graphics score jumps to 8500.

I already explained this some days ago in this same thread... please check.
I am outside and on an iPhone, so I can't find my post again.


----------



## GRABibus

Quote:


> Originally Posted by *KickAssCop*
> 
> So which one of you guys are *upgrading to the 1080 Ti?


Maybe....
Or SLI GTX 1080.....
Let's wait for benchmarks


----------



## alucardis666

Hey guys, can I please get a little help over here?

http://www.overclock.net/t/1623986/help-understanding-voltage-frequency-curve-editor-in-msi-afterburner#post_25866112

Thanks!


----------



## mbm

Sorry for asking this, maybe it's covered somewhere in the thread.

Is there a BIOS editor for the GTX 1080? I would like to set my boost clock to a fixed number.


----------



## alucardis666

Quote:


> Originally Posted by *mbm*
> 
> Sorry for asking this,, maybe I could read it somewhere in the post.
> 
> Is there a bios editor for the 1080 GTX.. I would like to set my boost clock to a fixed number.


Currently there's no BIOS editor; your best bet is playing with the freq/voltage curve editor in MSI AB. It's the little icon to the left of where you set your core offset.


----------



## ucode

Quote:


> Originally Posted by *alucardis666*
> 
> Hey guys, can I please get a little help over here?
> 
> http://www.overclock.net/t/1623986/help-understanding-voltage-frequency-curve-editor-in-msi-afterburner#post_25866112
> 
> Thanks!


You could look through the 1070 thread, they have been posting about 'the curve' lately. Just a thought as I don't use MSI Afterburner myself.


----------



## alucardis666

Quote:


> Originally Posted by *ucode*
> 
> You could look through the 1070 thread, they have been posting about 'the curve' lately. Just a thought as I don't use MSI Afterburner myself.


Thanks!


----------



## tiramoko

I'm getting my EVGA 1080 today. Someone posted about using Precision X or the curve for the temps, but I can't remember what it was for.


----------



## Dragonsyph

Is there any way to save the curve to a file so it can be uploaded, so I can use it? Hahaha


----------



## 6u4rdi4n

I believe it gets saved in the .cfg file in the profiles folder of Afterburner.


----------



## tiramoko

I think my i5 4670K (4.4GHz) is bottlenecking my GTX 1080. In Battlefield 1 I only get 60-70fps at 1440p, and the same at 1080p.
I'm not sure why; I've seen Battlefield gameplay on YouTube with fps going above 110 at 1440p (i7 4670k + GTX 1080).


----------



## sirleeofroy

Quote:


> Originally Posted by *tiramoko*
> 
> i think my gtx 1080 is bottle necking my i5 4670k(4.4ghz). on battlefield 1, i only get 60-70fps on 1440p and same for 1080p.
> im not sure why? i've seen battlefield gameplay on youtube its fps goes more than 110 on 1440p (i7 4670k+ gtx 1080)


I know this might sound really obvious, but: V-Sync?

Check not only the game setting but also the Nvidia Control Panel, under "Manage 3D Settings" make sure to check:

Multi-display/mixed-GPU acceleration - set to "Single Display Performance Mode"

Power Management Mode - set to "Prefer Maximum Performance"

Ensure "DSR - Factors" is off.

And finally, now that I think about it, this could be your issue (a wild guess based on nothing!): in the BF1 game settings, make sure your "resolution scale" is set to 100.

Anything above that forces the GPU to render at a higher resolution and then scale down to fit your current resolution, which uses loads of power for no visual gain (IMO), especially if you're running 1440p.

Hope that helps ya


----------



## owikhan

So after the Zotac 1080, I've now bought an MSI GTX 1080 Gaming X.

Is there an updated/latest BIOS, and what is the procedure for upgrading the BIOS?


----------



## Alberello

Quote:


> Originally Posted by *tiramoko*
> 
> i think my gtx 1080 is bottle necking my i5 4670k(4.4ghz). on battlefield 1, i only get 60-70fps on 1440p and same for 1080p.
> im not sure why? i've seen battlefield gameplay on youtube its fps goes more than 110 on 1440p (i7 4670k+ gtx 1080)


At what settings? What point/map?
Use this procedure to reinstall/upgrade the drivers:

*How to clean install the driver:*
1. Download the new driver: http://www.guru3d.com/files-details/geforce-378-72-driver-download.html
2. Download the latest version of "Display Driver Uninstaller (DDU)", v17.0.5.4, here: http://www.wagnardsoft.com/content/display-driver-uninstaller-ddu-v17054-released
3. Reboot into safe mode; if you don't know how, search on YouTube.
4. Use DDU to uninstall from safe mode, then reboot.
5. Start the driver installation and select the "Custom (Advanced)" mode.
6. If you don't have the 3D glasses, don't install the two related components.
7. Don't install GeForce Experience if you don't need it; *if needed, download the latest version separately, because it's newer than the version included in the drivers.*


----------



## TK421

I'm not sure who asked this, but the EVGA waterblocks are an in-house design, not EK.

An EVGA rep told me about this himself.


----------



## Dragonsyph

Is 378.88 worth updating to?


----------



## ucode

@Dragonsyph Dunno. What have you found best so far, especially with respect to Time Spy?


----------



## wholeeo

Is anyone cooling their 1080 with an H100i by any chance? Thinking about using one that's been abandoned to cool mine, with a Kraken G10.


----------



## invincible20xx

Quote:


> Originally Posted by *wholeeo*
> 
> Is anyone cooling their 1080 with a H100i by any chance? Thinking about using one that's been abandoned to cool mines with a Kraken G10..


You won't gain much by doing that, as all 1080s, even reference ones, top out at around 2.1GHz.


----------



## TK421

Keeping it cool under 50C is the goal though!


----------



## pez

Quote:


> Originally Posted by *tiramoko*
> 
> i think my gtx 1080 is bottle necking my i5 4670k(4.4ghz). on battlefield 1, i only get 60-70fps on 1440p and same for 1080p.
> im not sure why? i've seen battlefield gameplay on youtube its fps goes more than 110 on 1440p (i7 4670k+ gtx 1080)


It could possibly be due to DX12 and BF1's love for cores. I see 60-70FPS at 21:9 1440p with a 4770K at 4.2GHz. There's a 4670K in my GF's rig, but I haven't even thought to test BF1 on it, as she'll probably never play BF1.

Let me know if you continue to see issues and I'll throw it on her PC to at least see how it does with her 1070.


----------



## invincible20xx

Quote:


> Originally Posted by *tiramoko*
> 
> i think my gtx 1080 is bottle necking my i5 4670k(4.4ghz). on battlefield 1, i only get 60-70fps on 1440p and same for 1080p.
> im not sure why? i've seen battlefield gameplay on youtube its fps goes more than 110 on 1440p (i7 4670k+ gtx 1080)


It could be the lack of extra CPU threads. An i5 is fine, but not for very high refresh-rate monitors, as it will usually max out before any modern GPU at lower resolutions.

Modern games utilise the extra threads of an i7 now, and you won't hit very high frame rates in a modern game without one, because any i5 will cap before a GPU like the GTX 1080 can hit the 100+ frames it can manage at lower resolutions. At higher resolutions, though, the 1080 will cap before the i5.


----------



## owikhan

Can anybody please help with how to install the latest BIOS on an MSI GTX 1080 Gaming X?

Also, where do I get it?


----------



## wholeeo

Quote:


> Originally Posted by *invincible20xx*
> 
> you won't gain much by doing that as all 1080s even ref ones top out at around 2.1 ghz


It's more for noise than anything. I had my reference card on a full loop with an EK block but have since gone back to air/stock. Figured I'd make use of an H100i I have lying around in my closet.
Quote:


> Originally Posted by *owikhan*
> 
> Any body help please how to install latest bios MSI GTX 1080 Gaming X
> 
> and also where from i get


Look up how to use nvflash. For the BIOS look for it in techpowerup.com BIOS database.


----------



## jmr71

Quote:


> Originally Posted by *owikhan*
> 
> Any body help please how to install latest bios MSI GTX 1080 Gaming X
> 
> and also where from i get


Guide:




http://www.overclock.net/t/1523391/easy-nvflash-guide-with-pictures-for-gtx-970-980





Bios:
https://www.techpowerup.com/vgabios/

NVIDIA NVFlash 5.353.0
https://www.techpowerup.com/download/nvidia-nvflash/

This is my cmd output, on an Asus Strix GTX 1080:

1080cmdflashing.jpg 180k .jpg file


Before flashing, make a copy of the original BIOS:
https://www.techpowerup.com/download/gpu-z/

Hope this helps.









FLASH AT YOUR OWN RISK!
I AM NOT RESPONSIBLE FOR ANY DAMAGES THAT MAY OCCUR WHILE ATTEMPTING THIS!


----------



## HaykOC

Figured I'd get around to trying out a couple of 1080s, but I've been out of the loop for a while. How's the air cooling on these cards in SLI, with stock or aftermarket coolers? I ask because I'm coming from Maxwell Titan Xs, which ran pretty hot. Appreciate any help.


----------



## alucardis666

Quote:


> Originally Posted by *HaykOC*
> 
> Figured Id get around to trying out a couple 1080s but Ive been out of the loop for awhile. Hows the air cooling on these cards in SLI? Stock or aftermarket coolers. I ask because Im coming from the Maxwell Titan Xs which ran pretty hot. Appreciate any help.


Dunno if you're aware or have got the 1080s yet, but the 1080 Ti will be announced tomorrow; maybe hold off if you can?


----------



## HaykOC

Quote:


> Originally Posted by *alucardis666*
> 
> Dunno if you're aware/got the 1080's yet, but the 1080 Ti will be announced tomorrow, maybe hold off if you can?


Thanks for the heads up. Ill look out for that announcement.


----------



## alucardis666

Quote:


> Originally Posted by *HaykOC*
> 
> Thanks for the heads up. Ill look out for that announcement.


Glad I could help 

If nothing else, it should hopefully drive the prices of the regular 1080 down. So win-win!


----------



## Sean McCargar

Hey guys. I wanted to add this here because I couldn't find anyone anywhere who had done it: I own 2x EVGA GTX 1080 FTW Hybrids in SLI.

I really wanted the extra voltage on the cards, so I flashed over the T4 bios.

I don't know about regular FTW cards, but my FTW Hybrids did not like that vbios at all.

First of all, the vbios was causing both my cards' VRMs to squeal like mad, to the point that the first time I heard it I shut off my computer, it was so loud. I don't know why it's doing that; maybe because of the different timings on the RAM, or the fact that my cards have the 10+2 phase design, but nevertheless it makes my VRMs howl.

Secondly, when I was testing out the vbios I jacked the core up to 1.2v at 2200MHz.

When I did this: instant restart as soon as I got into a game or benched the cards. I have no idea why that is happening, but it is.

So I tried all the way up to 1.175v, and that wouldn't cause my computer to restart.

Also, the T4 bios literally shuts off the fans on the VRMs. If you're going to try it, set the fan speed to 50% manually, or your fans are not going to turn at all, even on auto.

I have a 1200w power supply, so that's not the problem; maybe it's due to my motherboard, I'm not too sure, it's really weird. I've never heard of anyone else flashing the T4 bios having either of these issues. Can anyone who reads this post say if there is a fix for this, or if it's just an incompatibility with the vbios?

If so, EVGA GTX 1080 FTW Hybrid owners: do not use the Asus T4 vbios.

With prolonged use I could seriously see damage to your card's VRMs. They should not howl like crazy.

I went back to my original vbios and the howling went away.


----------



## ucode

Quote:


> Originally Posted by *TK421*
> 
> keeping it cool under 50c is the goal though!


Yep, might help me get that magic 9000.

Still, I should be able to get there with a proper HW mod even at 60C.

Quote:


> Originally Posted by *Sean McCargar*
> 
> Can anyone who reads this tell me if there's a fix, or if it's just an incompatibility between the vBIOS and these cards?


Sounds like incompatibility or those cards are a couple of dogs.


----------



## MonarchX

Still no word on how to edit the GTX 1080 BIOS to get more OC/performance out of it?

My Gigabyte G1 only uses a single 8-pin power plug. Does this limit my OC? I know the higher-power cards use 2x 8-pin, although they clock about the same as my card...


----------



## TK421

Quote:


> Originally Posted by *MonarchX*
> 
> Still no word on how to edit the GTX 1080 BIOS to get more OC/performance out of it?
> 
> My Gigabyte G1 only uses a single 8-pin power plug. Does this limit my OC? I know the higher-power cards use 2x 8-pin, although they clock about the same as my card...


Short the R005 shunt resistor with a thin layer of liquid metal (or pencil graphite).
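For anyone wondering why that mod raises the power limit: the card senses current as the voltage drop across the shunt, so lowering the effective shunt resistance makes it under-report its own power draw. Here's a minimal sketch of that relationship; the resistor values and wattage are hypothetical round numbers for illustration, not measurements from any card (and this obviously defeats a safety mechanism, so flash/mod at your own risk).

```python
# Illustrative sketch (not measured data): how shorting/paralleling a shunt
# resistor skews the power reading the GPU uses for throttling decisions.
# The controller infers current as V_drop / R_assumed; with a parallel path,
# V_drop = I * R_effective, so the inferred power scales by R_eff / R_orig.

def reported_power(actual_watts: float, r_orig: float, r_effective: float) -> float:
    """Power the card *thinks* it draws after the shunt mod."""
    return actual_watts * (r_effective / r_orig)

# Hypothetical numbers: a 0.005 ohm (R005) shunt halved by a parallel path.
print(reported_power(200.0, 0.005, 0.0025))  # card reports ~100 W at an actual 200 W
```

In other words, halving the sensed resistance makes the board's power-limit headroom look twice as large, which is why the pencil trick is so hard to control precisely.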


----------



## JTJJ

I have an itx case and asus trix 1080. I havebeen fiddling with undervolting with asus gpu tweak software. So far I found 0.912v stable at 1900mhz. My goal is to getthe coolest and most silent setup with air. Case is rvz02. What are your undervolt results?
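As a rough sanity check on why undervolting pays off so well thermally: dynamic power scales roughly with frequency times voltage squared. The sketch below is first-order only (it ignores leakage) and the 1.05 V "stock-ish" comparison point is an assumption of mine, not a measured Strix value.

```python
# Rough, first-order estimate of undervolting gains: dynamic power ~ f * V^2.
# The reference voltage below is illustrative, not a measurement.

def relative_power(freq_mhz: float, volts: float,
                   ref_freq_mhz: float, ref_volts: float) -> float:
    """Estimated power relative to a reference operating point (P ~ f * V^2)."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Hypothetical comparison: 1900 MHz at 0.912 V vs the same clock at 1.05 V.
savings = 1.0 - relative_power(1900, 0.912, 1900, 1.050)
print(f"~{savings:.0%} less dynamic power at the same clock")
```

Even with the caveats, that's why a ~0.1 V undervolt at the same clock can knock a large chunk off heat output, and therefore fan speed.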


----------



## pez

Depending on how far it is from you, you shouldn't have to undervolt to get the silence you want. Lower your power target, or leave it at 100% and set a custom fan profile. On the EVGA ACX 3.0, complete silence from about 2-3 feet away is around 50-55% fan. I have my fan profile maxed at 60% and I'm sitting at 67-70C in all games with a consistent boost of 2 GHz or higher.

It seems I've got slightly better airflow, but you should be fine provided ambients aren't too high and you've got at least some semblance of airflow to the machine.
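The custom fan profile idea above is just a set of (temperature, fan%) points with linear interpolation between them, which is what tools like MSI Afterburner and EVGA Precision let you draw. Here's a minimal sketch; the curve points are hypothetical, loosely modeled on the "cap at 60%" profile described above.

```python
# Minimal piecewise-linear fan curve, like the ones drawn in OC utilities.
# (temp_C, fan_percent) points are hypothetical examples.
CURVE = [(30, 0), (50, 30), (65, 50), (75, 60)]  # sorted by temperature

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate over CURVE, clamping at the endpoints."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if tem_c := None:  # placeholder removed below
        pass
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # halfway between the 65C and 75C points -> 55.0
```

With a curve like this, the card idles silently and only ramps toward the 60% cap as it approaches the mid-70s.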


----------



## kevindd992002

Quote:


> Originally Posted by *TK421*
> 
> I'm not sure who asked this, but, the EVGA waterblocks are in-house design. Not EK.
> 
> An EVGA rep told me about this himself.


I'm the one who asked this. Is this EVGA rep your friend, or did he just feel like spilling the beans?

@All

Now that the GTX 1080 Ti is out and the price of the 1080 is lower by $100, when will we see all these changes in the market? When will the Ti be available?


----------



## KedarWolf

Next question: what would you do?

Two 1080s in SLI when they drop in price, or one 1080 Ti?


----------



## ironhide138

ALWAYS 1 card over 2. SLI problems aren't worth it.


----------



## JTJJ

Quote:


> Originally Posted by *pez*
> 
> Depending on how far it is from you, you shouldn't have to undervolt to get the silence you want. Lower your power target, or leave it at 100% and set a custom fan profile. On the EVGA ACX 3.0, complete silence from about 2-3 feet away is around 50-55% fan. I have my fan profile maxed at 60% and I'm sitting at 67-70C in all games with a consistent boost of 2 GHz or higher.
> 
> It seems I've got slightly better airflow, but you should be fine provided ambients aren't too high and you've got at least some semblance of airflow to the machine.


I am using a dust cover over the GPU. Before, I did not, and taking the dust cover off lowers temps by about 5 Celsius.

I found that undervolting, or overclocking at lower voltage, holds the max clock more stably. I'm at stock clocks.

Now with the dust cover on, my max temp is 72C with the fan maxing out at 52 percent. With the dust cover on but without undervolting, temps were the same with the fan maxing at 75 percent, which was audible. Now it is not.

I haven't tried higher clocks yet, and wanted to hear what I can expect at what clocks, to get some picture of what to aim for.


----------



## pez

Quote:


> Originally Posted by *JTJJ*
> 
> I am using a dust cover over the GPU. Before, I did not, and taking the dust cover off lowers temps by about 5 Celsius.
> 
> I found that undervolting, or overclocking at lower voltage, holds the max clock more stably. I'm at stock clocks.
> 
> Now with the dust cover on, my max temp is 72C with the fan maxing out at 52 percent. With the dust cover on but without undervolting, temps were the same with the fan maxing at 75 percent, which was audible. Now it is not.
> 
> I haven't tried higher clocks yet, and wanted to hear what I can expect at what clocks, to get some picture of what to aim for.


I imagine that cooler is pretty quiet at 52%. You're still looking for quieter? I'd imagine if it's louder than that, it's simply because it's 3 smaller fans vs something like the ACX 3.0 or the Gaming X cooler from MSI.


----------



## jmr71

What you think about the new GTX 1080Ti?


----------



## alucardis666

Quote:


> Originally Posted by *jmr71*
> 
> 
> 
> What you think about the new GTX 1080Ti?


Looks great, dunno about the 11gb of ram... Hoping it's not the story of the 670 all over again.

Other than that what's the eta for the AIB cards? lol I wanna upgrade my 1080.


----------



## kevindd992002

Quote:


> Originally Posted by *alucardis666*
> 
> Looks great, dunno about the 11gb of ram... Hoping it's not the story of the 670 all over again.
> 
> Other than that what's the eta for the AIB cards? lol I wanna upgrade my 1080.


What is the story of the 670?


----------



## Krzych04650

Quote:


> Originally Posted by *ironhide138*
> 
> ALWAYS 1 card over 2. SLI problems aren't worth it.


Generalizing like that is for the mentally limited. Everything depends on your use case: what you need, what you expect, what compromises you're willing to make, what games you play, and how long you plan to use those GPUs. Everybody should know their own needs and choose what's good for them, and not even look at opinions from blockheads who can't understand that someone might have different needs than theirs. All they can do is generalize based on something they heard somewhere, especially about things they have no clue about, never tried, and never used: "let's just repeat what I heard somewhere, with a definitive statement, to pretend I know something."


----------



## pez

Quote:


> Originally Posted by *Krzych04650*
> 
> Generalizing like that is for the mentally limited. Everything depends on your use case: what you need, what you expect, what compromises you're willing to make, what games you play, and how long you plan to use those GPUs. Everybody should know their own needs and choose what's good for them, and not even look at opinions from blockheads who can't understand that someone might have different needs than theirs. All they can do is generalize based on something they heard somewhere, especially about things they have no clue about, never tried, and never used: "let's just repeat what I heard somewhere, with a definitive statement, to pretend I know something."


Unfortunately this argument falls on deaf ears many times over on this forum. It's generally a losing battle that I've given up on fighting. I just give people my $0.02 when they ask and tell them about my personal experiences, rather than the garbage most people spew with no experience or evidence to back it up.


----------



## Barterlos

Hi guys, I have a problem with my GTX 1080 Founders Edition: from time to time I'm getting random driver crashes. Maybe my OC is not 100% stable, but it's strange. Last week I spent almost 100 hours in HITMAN (2016) without a single driver crash; yesterday I launched Witcher 3 and bam, driver crash after 5 hours.

Could those of you with a GTX 1080 FE test these settings for me: +100 on the core, 120% power target, in Witcher 3 at max settings at 4K (or 4K DSR) with uncapped framerate? I'm curious what voltage you'll be running and how much power the card will be drawing.

At those settings in Witcher 3 at 4K, my GTX 1080 FE draws 200-235 W at 1800-1880 MHz and 0.950-0.975 V after 25 minutes of playtime. The fan is fixed at 74%, temps 78C max.

After your testing I'll know whether my GTX 1080 FE is a very poor overclocker; if someone can get around 2 GHz at 1 V drawing 200 W max, then that person probably won the silicon lottery.

Thanks in advance!

Greetings


----------



## mtbiker033

Anyone have a 1080 and a 1440p 144Hz monitor? I'm curious whether a single 1080 can give the same performance as my two 780s.


----------



## Barterlos

Quote:


> Originally Posted by *mtbiker033*
> 
> Anyone have a 1080 and a 1440p 144Hz monitor? I'm curious whether a single 1080 can give the same performance as my two 780s.


I have a 1440p 144Hz BenQ XL2730Z, and a GTX 1080 is plenty for that res: BF1 at 120fps/120Hz, no problem, though with two settings (post-processing and shadows) dropped to High. Other games run very nicely. I think 1440p 144Hz is the sweet spot for the GTX 1080/1070/980 Ti, and with the GTX 1080 you'll get the most performance. I had GTX 970 G1 Gaming SLI before the GTX 1080, and in games the single GTX 1080 is faster.

But there is a map in BF1, called Amiens I think, where no Intel 4-core/8-thread i7 is enough for 120fps with 64 players. My i7 4790K at 4.8 GHz bottlenecks the GTX 1080 on that map: results are 70 to 90fps with the CPU pegged at almost 100%.


----------



## mtbiker033

Quote:


> Originally Posted by *Barterlos*
> 
> I have a 1440p 144Hz BenQ XL2730Z, and a GTX 1080 is plenty for that res: BF1 at 120fps/120Hz, no problem, though with two settings (post-processing and shadows) dropped to High. Other games run very nicely. I think 1440p 144Hz is the sweet spot for the GTX 1080/1070/980 Ti, and with the GTX 1080 you'll get the most performance. I had GTX 970 G1 Gaming SLI before the GTX 1080, and in games the single GTX 1080 is faster.
> 
> But there is a map in BF1, called Amiens I think, where no Intel 4-core/8-thread i7 is enough for 120fps with 64 players. My i7 4790K at 4.8 GHz bottlenecks the GTX 1080 on that map: results are 70 to 90fps with the CPU pegged at almost 100%.


thank you very much!


----------



## alton brown

Hey guys, hope all is good with everyone. I need some advice. I'm sure all of you know the 1080 Ti is going to be released in a few days. I have an Acer Predator 1440p 144Hz monitor. I've been waiting for the GTX 1080 Ti to be released so the 1080's price will drop, and I was planning on buying two 1080s in SLI. But now I'm starting to rethink things. Should I get a single 1080 Ti for my monitor, or two 1080s?


----------



## x-apoc

I would pick a single 1080 Ti for better consistency; SLI does not guarantee that every game will benefit from a dual-card setup.
Witcher 3, on the other hand, scales nicely in SLI as an example; you can hit 60+ fps on Ultra at 4K. With my single 1080 at 4K I can only hit around 44-52 fps. The 1080 Ti is supposed to be 30-40% faster than the 1080, at which point you should hit 60fps at 4K in a game like Witcher 3.
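The projection above is easy to sanity-check: scale the observed single-1080 frame rates by the rumored uplift. The inputs are the figures quoted in the post, not benchmarks of mine, and the 30-40% uplift was still a rumor at the time.

```python
# Scale observed single-GPU fps by a rumored generational uplift.
# Inputs are the 44-52 fps figures quoted above; uplift is the rumored 30-40%.

def projected_fps(base_fps: float, uplift_fraction: float) -> float:
    return base_fps * (1 + uplift_fraction)

for base in (44, 52):
    low, high = projected_fps(base, 0.30), projected_fps(base, 0.40)
    print(f"{base} fps -> {low:.0f}-{high:.0f} fps")
```

Even the worst case (44 fps + 30%) lands at roughly 57 fps, which is why a 60 fps 4K target looked plausible for the Ti.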


----------



## Barterlos

Yeah, I had two SLI setups in the past, and I highly recommend going the single-GPU route. It's the best way, and the cheapest, because not every game utilizes multi-GPU setups, and in that situation your second GPU is doing nothing; as a result you're wasting money. Better to spend that money on other parts: a better monitor/TV, SSD, CPU, CPU cooler, RAM, etc.


----------



## Joshwaa

Just thought I would throw this out there. Titan X Pascal waterblocks from EK will fit the GTX 1080 TI. Think I am going to order one now and hope I can get a GTX 1080 ti on release date.


----------



## JTJJ

Quote:


> Originally Posted by *pez*
> 
> I imagine that cooler is pretty quiet at 52%. You're still looking for quieter? I'd imagine if it's louder than that, it's simply because it's 3 smaller fans vs something like the ACX 3.0 or the Gaming X cooler from MSI.


Yeah, it is quiet now.

Now I am wondering about the OC portion, but maybe 100-200 MHz is irrelevant. Then again, I have an HTC Vive and some VR games are demanding, so maybe 200 MHz is relevant.

So I will start testing when the weekend comes.


----------



## Kriant

Quote:


> Originally Posted by *alton brown*
> 
> Hey guys, hope all is good with everyone. I need some advice. I'm sure all of you know the 1080 Ti is going to be released in a few days. I have an Acer Predator 1440p 144Hz monitor. I've been waiting for the GTX 1080 Ti to be released so the 1080's price will drop, and I was planning on buying two 1080s in SLI. But now I'm starting to rethink things. Should I get a single 1080 Ti for my monitor, or two 1080s?


Don't.
Over the years I had:
4850x2
2x4870
480 SLI
580 SLI
5970
7970 x4
r9 290x x4
Titan X tri-sli
980ti SLI
1080 SLI

And I can tell you that while both CrossFire and SLI have their good moments, and I am generally OK with belated SLI/CrossFire support, the recent trend is that DX12 games (for the most part) do not support multi-GPU setups, and performance/scaling with SLI can be inconsistent. A single 1080 Ti will be a better fit for your needs, as it will provide a more consistent experience in general.


----------



## kevindd992002

Is there any reason one would get a 1080Ti for a 1080p setup? Or would that purely be for future monitor upgrades?


----------



## Kriant

Quote:


> Originally Posted by *kevindd992002*
> 
> Is there any reason one would get a 1080Ti for a 1080p setup? Or would that purely be for future monitor upgrades?


144Hz+ gaming, maybe, but it sounds like overkill.


----------



## lilchronic

Quote:


> Originally Posted by *Joshwaa*
> 
> Just thought I would throw this out there. Titan X Pascal waterblocks from EK will fit the GTX 1080 TI. Think I am going to order one now and hope I can get a GTX 1080 ti on release date.


Who said that?


----------



## alton brown

Thanks for the input! Right now I have two 780s in SLI, and I feel they cause issues in Elite Dangerous; I've also seen some reviews showing bad scaling with SLI and DirectX 12. So that leads me to ask: will a GTX 1080 Ti push my G-Sync 1440p monitor at Ultra settings?


----------



## Barterlos

Quote:


> Originally Posted by *alton brown*
> 
> Thanks for the input! Right now I have two 780s in SLI, and I feel they cause issues in Elite Dangerous; I've also seen some reviews showing bad scaling with SLI and DirectX 12. So that leads me to ask: will a GTX 1080 Ti push my G-Sync 1440p monitor at Ultra settings?


Of course! If cash isn't a problem, go for the 1080 Ti; it will be an amazing GPU, like the Titan X (Pascal). A 1080 Ti overclocked to at least 1.9 GHz will be a monster, the second true 4K GPU. You can find benchmark videos of the Titan X OC'd to 2 GHz; the 1080 Ti will have the same performance or even better. The 1080 Ti and Titan X (Pascal) are almost identical.

And don't forget that you need to wait for the custom 1080 Tis; don't buy the FE. With the FE you will run into thermal issues. It's been shown that the blower cooler is not sufficient to cool a 250 W TDP at low noise; at 80% fan speed you can counter the thermals, but with high noise.


----------



## kevindd992002

Quote:


> Originally Posted by *Kriant*
> 
> 144Hz+ gaming, maybe, but it sounds like overkill.


Yeah, I do use a 144Hz monitor so at least I can be stable at 144fps.


----------



## Joshwaa

Quote:


> Originally Posted by *lilchronic*
> 
> Who said that?


Right on the front page of their shop.

https://www.ekwb.com/shop/


----------



## lilchronic

Quote:


> Originally Posted by *Joshwaa*
> 
> Right on the front page of their shop.
> 
> https://www.ekwb.com/shop/











So I wonder if the backplate the 1080 Ti comes with is compatible with their waterblock.


----------



## alton brown

Agreed! I plan on slapping on my NZXT G10 to an aftermarket card.


----------



## nrpeyton

Quote:


> Originally Posted by *Barterlos*
> 
> Of course! If cash isn't a problem, go for the 1080 Ti; it will be an amazing GPU, like the Titan X (Pascal). A 1080 Ti overclocked to at least 1.9 GHz will be a monster, the second true 4K GPU. You can find benchmark videos of the Titan X OC'd to 2 GHz; the 1080 Ti will have the same performance or even better. The 1080 Ti and Titan X (Pascal) are almost identical.
> 
> And don't forget that you need to wait for the custom 1080 Tis; don't buy the FE. With the FE you will run into thermal issues. It's been shown that the blower cooler is not sufficient to cool a 250 W TDP at low noise; at 80% fan speed you can counter the thermals, but with high noise.


I agree.

Imagine trying to cool an AMD FX-9590 (220 W CPU) on air: impossible to stop throttling. And with a blower-style cooler? OMG.

_AMD actually officially recommends a decent liquid cooling system, and many mobo manufacturers do the same. Asrock, for example, and I'll quote: "please install a decent liquid cooler."_

*The 1080 Ti's TDP is even bigger: 250 W.*

The other problem with reference cards is that the power limit is always too low.

At least the temp issue can be corrected by installing a water block. But the power limit cap on reference designs is the biggest problem for the average user who likes to OC.

GPU Boost 3.0 is already a nightmare for overclockers.

I wonder if the voltage on these new 1080 Tis will be the same (i.e. max 1.093 V).

I also predict the memory will OC comparably well to the 1080's (maybe with a small gain).

Micron produces GDDR5X in 3 part options:
-10 Gb/s
-11 Gb/s
-12 Gb/s

Nvidia chose the middle one for the 1080 Ti (one step up from the 1080).

The max operating frequency for the 10 Gb/s parts (in our 1080s) on the spec sheet is 1250 MHz, yet we were seeing many people running stable memory OCs up to 1445 (+775). The max operating frequency for the 12 Gb/s parts is 1500 MHz on Micron's spec sheet, and Dragonspy got beyond that on only the first-tier 10 Gb/s memory with his stable +1000.

So do the math: better-binned memory chips. We could be seeing 1600+ on the lucky cards.

Core overclocking should be nice too. Nvidia claims the FEs will OC to 2000 MHz (from a stock speed of 1582 MHz).

Imagine what a custom PCB could do with that, with GPU Boost 3.0's limitations lifted via a higher max power limit in the BIOS.

Can't say I'm not excited.
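For anyone doing the bandwidth math on those memory bins: bandwidth is just the effective per-pin rate times the bus width. The bus widths below are the public specs (256-bit on the 1080, 352-bit on the 1080 Ti); the 12 Gb/s case is a hypothetical overclock, not a shipping configuration.

```python
# Back-of-envelope memory bandwidth for the GDDR5X bins mentioned above.
# bandwidth (GB/s) = effective rate (Gb/s per pin) * bus width (bits) / 8

def mem_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    return effective_gbps * bus_width_bits / 8

print(mem_bandwidth_gbs(10, 256))  # GTX 1080 stock: 320.0 GB/s
print(mem_bandwidth_gbs(11, 352))  # GTX 1080 Ti stock: 484.0 GB/s
# Each +1 Gb/s of effective rate adds bus_width/8 GB/s:
print(mem_bandwidth_gbs(12, 256))  # 384.0 GB/s if a 1080's memory reached 12 Gb/s
```

That's why the Ti's wider bus matters as much as the faster bin: it gets roughly 50% more bandwidth than a stock 1080 despite only a 10% faster memory clock.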


----------



## hotrod717

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm the one who asked this. Is this EVGA rep your friend, or did he just feel like spilling the beans?
> 
> @All
> 
> Now that the GTX 1080 Ti is out and the price of the 1080 is lower by $100, when will we see all these changes in the market? When will the Ti be available?


Last I heard Swiftech had been making EVGA's hydro block for years. Maybe EVGA designed the sticker.


----------



## ironhide138

5-6 years of future proofing?


----------



## kevindd992002

Quote:


> Originally Posted by *ironhide138*
> 
> 5-6 years of future proofing?


Is this directed to my question a few posts back?


----------



## pez

Quote:


> Originally Posted by *hotrod717*
> 
> Last I heard Swiftech had been making EVGA's hydro block for years. Maybe EVGA designed the sticker.


I'm afraid I don't know which thread I saw this in, but someone contacted EVGA or someone credible and said EVGA were doing the blocks in-house now.


----------



## Vellinious

Quote:


> Originally Posted by *pez*
> 
> I'm afraid I don't know which thread I saw this in, but someone contacted EVGA or someone credible and said EVGA were doing the blocks in-house now.


From what I've read, the EVGA Pascal blocks are made in house now. Was never the case before. I have a feeling that this is possibly what the little tiff with EK was over.


----------



## pez

Quote:


> Originally Posted by *Vellinious*
> 
> From what I've read, the EVGA Pascal blocks are made in house now. Was never the case before. I have a feeling that this is possibly what the little tiff with EK was over.


Yeah, it would make sense as well with the introduction of their AIOs...but not sure if those are in-house or if they have an OEM. Haven't looked into it enough.


----------



## Joshwaa

Woo hoo! Made it in on the 1080 Ti pre-order! EK waterblock on the way also!


----------



## Vellinious

Not sure why anyone would want a reference board, but.....ENJOY! lol


----------



## KedarWolf

On Nvidia's website I clicked 'Pre-order 1080 Ti' and went to checkout, but it said 'Not available.' Going back to the site, the pre-order button had changed to 'Notify When Available.' Between the time I clicked pre-order and the time I checked out, the last one available was sold.


----------



## Joshwaa

Quote:


> Originally Posted by *Vellinious*
> 
> Not sure why anyone would want a reference board, but.....ENJOY! lol


For any GPU up until the 1080, I would have agreed with you. Can you point out any 1080 partner card that totally outshines the Founders Edition 1080?


----------



## Vellinious

Quote:


> Originally Posted by *Joshwaa*
> 
> For any GPU up until the 1080, I would have agreed with you. Can you point out any 1080 partner card that totally outshines the Founders Edition 1080?


With watercooling? Yeah...the FTW has higher power limits, so no power limit throttling, and does it without the need to hardware mod. Then there's the Classy and HOF, that have the ability to adjust voltage above 1.093v, without the need to flash to a bios not made for their GPU, thus killing video outs, and possibly bricking a card.

So...yeah. Definite advantages in the aftermarket boards.


----------



## Joshwaa

Quote:


> Originally Posted by *Vellinious*
> 
> With watercooling? Yeah...the FTW has higher power limits, so no power limit throttling, and does it without the need to hardware mod. Then there's the Classy and HOF, that have the ability to adjust voltage above 1.093v, without the need to flash to a bios not made for their GPU, thus killing video outs, and possibly bricking a card.
> 
> So...yeah. Definite advantages in the aftermarket boards.


I kinda agree with your points. I have the FTW with an EK FC-WB, and my power limit has never gone above 110%. Also, I have not seen the Classy or the HOF put up any staggering numbers over the FE. If we had a BIOS editor, I would completely agree with you.


----------



## Osirus23

Any idea when the announced 1080 price cut will hit? I just ordered one from Amazon last week and it arrived yesterday. I believe their price drop guarantee covers any permanent price drop within 7 days of delivery date.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> With watercooling? Yeah...the FTW has higher power limits, so no power limit throttling, and does it without the need to hardware mod. Then there's the Classy and HOF, that have the ability to adjust voltage above 1.093v, without the need to flash to a bios not made for their GPU, thus killing video outs, and possibly bricking a card.
> 
> So...yeah. Definite advantages in the aftermarket boards.


You planning on upgrading then, Vellinious?

1080 Ti FTW or 1080 Ti Classy? Or some other manufacturer?

The reference 1080 Ti has 14 dual-FETs (7 phases).

I wonder if that is equivalent to how some mobo manufacturers claim double the phase count with dual stacked MOSFETs (1 driver running 2 FETs). In other words, could Nvidia have gotten away with claiming the reference design was actually 14-phase?

Interesting, because that would indicate quite nice power delivery for a reference model.


----------



## pez

I don't think a lot of people realize that the FE AND the AIB cards all have an MSRP of $699. With NVIDIA getting rid of the FE price premium now, I won't be surprised to see another MSI Gouging X or Z card for $800 or $850. With no FE price premium, and for those going for water, I see no reason to spend another $100+ for a 1-5% performance difference.

If the Ti ends up with an unlocked BIOS or something incredibly different from the 1070, 1080, and TXP OC'ing numbers we've been seeing constantly, I'll be happy to eat my words.


----------



## Bal3Wolf

Looks like Newegg has dropped prices on a lot of the 1080s they sell by 60-100 dollars. The IRS needs to get me my money lol, so I can snag a water-cooled 1080.


----------



## tiramoko

I just got my EVGA SC 1080 last week from Amazon, for $510 + tax, bought used. I don't know if I should return it and wait for them to lower their prices on the 1080. Does Amazon charge to return a video card?


----------



## Outcasst

Does anybody know what size the tiny screws are on the backplate of the Founder's Edition cards? And if they can be purchased anywhere?

Somehow lost three of mine...


----------



## Bal3Wolf

Quote:


> Originally Posted by *tiramoko*
> 
> I just got my EVGA SC 1080 last week from Amazon, for $510 + tax, bought used. I don't know if I should return it and wait for them to lower their prices on the 1080. Does Amazon charge to return a video card?


From what I can find, nothing mentions a restocking fee. You could also order from Newegg; they have dropped prices on a ton of the 1080s they sell.

https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601194948%20601203901
Quote:


> Amazon.com Return Policy
> 
> Items shipped from Amazon.com, including Warehouse Deals, can be returned within 30 days of receipt of shipment in most cases. Some products have different policies or requirements associated with them.


https://www.amazon.com/gp/help/customer/display.html?ie=UTF8&nodeId=15015721#aag_returns


----------



## FedericoUY

When will the new non-Ti 1080 (the $499 price tag and new memory) be available? I think a Ti is overkill for a 1440p monitor...


----------



## Bal3Wolf

Quote:


> Originally Posted by *FedericoUY*
> 
> When will the new non-Ti 1080 (the $499 price tag and new memory) be available? I think a Ti is overkill for a 1440p monitor...


Prices are already dropping. I don't think they gave a timetable for when the new 1080s with faster memory will come out.


----------



## hertz9753

That reminds me of when the GTX 680 turned into the GTX 770.


----------



## hotrod717

Quote:


> Originally Posted by *pez*
> 
> I'm afraid I don't know which thread I saw this in, but someone contacted EVGA or someone credible and said EVGA were doing the blocks in-house now.


Makes sense with the AIOs in mind, although I would think they are contracting it out. Putting money into both development and fabrication for these would be a huge undertaking for the return.


----------



## Cozmo85

EVGA uses Asetek for their AIOs.


----------



## Bal3Wolf

Quote:


> Originally Posted by *hotrod717*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pez*
> 
> I'm afraid I don't know which thread I saw this in, but someone contacted EVGA or someone credible and said EVGA were doing the blocks in-house now.
> 
> 
> 
> Makes sense with the AIOs in mind, although I would think they are contracting it out. Putting money into both development and fabrication for these would be a huge undertaking for the return.

Yeah, from the users I've talked to who bought the Hydro Copper blocks, they aren't bad; they cool pretty decently on the core and VRMs. I've been eyeing a GTX 1080 Hydro Copper card for when I get my taxes in; it just dropped to 699 from 779.


----------



## ucode

Quote:


> Originally Posted by *Joshwaa*
> 
> I kinda agree with your points. I have the FTW with an EK FC-WB, and my power limit has never gone above 110%. Also, I have not seen the Classy or the HOF put up any staggering numbers over the FE. If we had a BIOS editor, I would completely agree with you.


I don't think a vBIOS mod is going to make much difference with 1080s, and personally I don't understand the aversion to HW modding; after all, changing the cooler is a HW mod, and an electrical HW mod can be superior to changing a reference level via a vBIOS mod.

Never having gone over 110% means what, exactly? Percentages tend to be misleading. For instance, my 1080 FE can go over 200%; it would be better if watts were used, IMHO.

Yeah, I haven't seen a great difference between what can be achieved on a 1080 FE and on AIB cards. Then again, I am a little surprised the likes of Galax don't go all out advertising that their HOF cards support extra voltage. Maybe there just aren't enough people trying to see that possibly big difference with the right silicon. Or perhaps it's being kept quiet because some results are potentially embarrassing. While I can get some gain (not huge) with my 1080 FE and extra voltage, the 1050 Ti I tried is pretty poor, with an increase of only 50 MHz for a voltage increase from 1.118 V to a little over 1.3 V. Looking at a review of a 1070 HOF set to 1.25 V (max 1.3 V), that didn't seem that impressive either.
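The point about percentages being misleading is worth spelling out: the same power-target percentage maps to different watts on different boards, because each vBIOS defines its own base power limit. The base limits below are round numbers for illustration (the 1080 FE's reference board power is commonly cited as 180 W; the 215 W figure is hypothetical).

```python
# Convert a "power target %" slider value into watts for a given board.
# Base limits are illustrative; every vBIOS defines its own.

def power_target_watts(base_limit_w: float, target_percent: float) -> float:
    return base_limit_w * target_percent / 100

# 120% on a 180 W board vs 110% on a hypothetical 215 W board:
print(power_target_watts(180, 120))  # 216.0 W
print(power_target_watts(215, 110))  # 236.5 W -> the "lower" percentage allows more watts
```

So a card that "never exceeds 110%" on a high base limit may still be drawing more power than one pegged at 120% of a lower limit, which is why comparing in watts is more informative.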


----------



## Vellinious

Most people don't keep their cards cool enough to make extra voltage beneficial. Some voltage may gain them a little bit in clock speed, but without keeping the GPU core really cool, it's not going to run as well as it should. For most people out there, adding extra voltage may make them feel a little better about the overclock they get, but....that's about all they're gonna get out of it.


----------



## Joshwaa

Quote:


> Originally Posted by *ucode*
> 
> Never having gone over 110% means what, exactly? Percentages tend to be misleading. For instance, my 1080 FE can go over 200%; it would be better if watts were used, IMHO.


I was just stating that I have never seen the power on my FTW go over 110%, meaning that a 130% or 150% adjustment is not needed with this card. However, I am water-cooled and do not have fans or LEDs adding to my power draw. If we had a BIOS mod that allowed the card to actually clock higher and use the extra voltage, then maybe it might come into play. Even that is speculation, though, as these chips seem to be limited in what clocks they can achieve no matter the voltage.


----------



## Derek1

Will either MSI or EVGA be releasing a Lightning or Kingpin edition of the Ti?
Or is that just their normal branding of the Ti?


----------



## Vellinious

Quote:


> Originally Posted by *Joshwaa*
> 
> I was just stating that I have never seen the power on my FTW go over 110%, meaning that a 130% or 150% adjustment is not needed with this card. However, I am water-cooled and do not have fans or LEDs adding to my power draw. If we had a BIOS mod that allowed the card to actually clock higher and use the extra voltage, then maybe it might come into play. Even that is speculation, though, as these chips seem to be limited in what clocks they can achieve no matter the voltage.


The real limiter is temps, not voltage. For instance: on air, my GPUs ran their best scores at 2139. On water, with ambients around 20C, their best runs are at 2189 to 2202, depending on the benchmark. With the coolant dropped to near or just slightly under 0C, they hit their best runs at 2252. All at the same 1.093 V.

I'm honestly not sure whether lowering temps further or adding extra voltage is what's needed at this point, but from everything I've seen so far, I'm gonna go with lower temps. That said, eventually you reach the point where lowering temps isn't going to do a whole lot and you'll have to add voltage to get any further, but... I don't think I've even reached that place yet.


----------



## mbm

Thinking of buying the Arctic Accelero Xtreme IV cooler to put on my graphics card.
The standard cooler runs at 60% fan speed (too loud) to keep my card (core = 2050MHz) under 75C.
Hopefully the Accelero could keep my card at 70C and quieter?

Has anyone tried attaching this cooler to this specific card, and what were your results?
I can see a lot of thermal pads are provided to fit on the back. Where are the hot spots on the card so the pads can do the most good?


----------



## Derek1

Some major price slashing going on here.

http://www.canadacomputers.com/search_result.php?checkVal0=0&subcat01=2&checkVal1=1&checkVal2=0&subcat233=38&checkVal3=1&checkVal4=1&pagePos=0&keywords=&manu=0&search=1&ccid=1200&cPath=43_1200

I paid close to $1300 for my FTW and Hybrid conversion kit (taxes and shipping for the kit included), and here the FTW ACX is on sale for $750!
Even if I got one and bought another Hybrid kit, it would still be less than what I originally paid just for my FTW back in September ($1000).


----------



## Bal3Wolf

Ordered an EVGA GTX 1080 FTW Hydro Copper for $669, with a free game of course.


----------



## hertz9753

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Ordered an EVGA GTX 1080 FTW Hydro Copper for $669, with a free game of course.


That's very close to the $699.99 retail price of the GTX 1080 Ti.


----------



## Bal3Wolf

Quote:


> Originally Posted by *hertz9753*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> Ordered an EVGA GTX 1080 FTW Hydro Copper for $669, with a free game of course.
> 
> 
> 
> That's very close to the $699.99 retail price of the GTX 1080 Ti.
Click to expand...

$699 is for the air-cooled version, though; add another $150+ for a waterblock and backplate. And I'll have EVGA Step-Up if I decide to move up to a Ti, assuming they sell one with a waterblock. I play at 1080p right now and might go up to 1440p, so I don't really need a Ti yet; other parts of my computer need upgrades more.


----------



## ucode

Quote:


> Originally Posted by *Vellinious*
> 
> Most people don't keep their cards cool enough to make extra voltage beneficial


I appreciate your comment, but if I run my 1080 at 1.093V my best clock is about 2100MHz. With 1.2V I get a little over 2202MHz and better performance; a 2240MHz start in Time Spy can break 8800, even though the GPU temperature is hotter, as expected. Of course, if I ran 50C lower I could probably expect another 100MHz, or 400MHz with some LN2 perhaps. But I don't have the finances for such things, on top of which the current dew point where I am (25C) is higher than your 20C ambient temperature.

So I'll enjoy some "free" performance increase for some benches. IMO the increase is not big enough with respect to power and temperature increases to bother running 24/7. But that's me, others are welcome to do it however they please.

The point I was trying to make (in case it wasn't evident) is that it seems very much YMMV. Some might get better performance, some might not.


----------



## Vellinious

Quote:


> Originally Posted by *ucode*
> 
> I appreciate your comment but If I run my 1080 at 1.093V my best clock is about 2100MHz. With 1.2V I get a little over 2202MHz and better performance, 2240MHz start in Time Spy can break 8800 even though GPU temperature is hotter as expected. Of course if I ran 50C lower I could probably expect another 100MHz or 400MHz with some LN2 perhaps. But I don't have the finances to do such things on top of which my current dew temperature for where I am is higher than your 20C ambient temperature (25C).
> 
> So I'll enjoy some "free" performance increase for some benches. IMO the increase is not big enough with respect to power and temperature increases to bother running 24/7. But that's me, others are welcome to do it however they please.
> 
> The point I was trying to make (in case it wasn't evident) is that it seems very much YMMV. Some might get better performance, some might not.


And you could have done the same thing with cooler temps and stock voltage... maybe even with better performance. We all know this architecture doesn't really react well to additional voltage.

Sure, you can throw a ton of volts at something and make it work eventually, but... I run 2200+ on stock voltage because I drop temps. And they run pretty damn good when they do. lol

Temps first, then voltage. If you're doing it the other way, you're working against yourself. High ambient temps would be difficult to get by, though... I fully concede that point.


----------



## Fidelity21

I'm thinking of adding another 1080 to my setup for SLI, as I've never tried SLI before. The question is: can I use an EVGA Hydro Copper card with my currently installed ASUS 1080 FE card that's equipped with an XSPC water block and backplate? I'm sure that sounds like a foolish question to those of you who have done this several times. I just can't seem to find an ASUS 1080 FE at a good price right now; they seem to be more expensive than even the Strix card... which seems crazy.

I need to reconfigure my water loop at some point anyway because the water flows directly from the CPU to the GPU, which works fine right now, but I'm sure I could keep the GPU below 50C if I'm feeding it cooler water. Right now, it peaks at 56C fully overclocked and 34C at idle.


----------



## Himura88

Hi, six months ago I bought a Gigabyte GTX 1080 Xtreme Gaming, and I'm seeing some weird behavior in GTA 5. I don't know if these are artifacts/memory errors or some other issue with the card.
Here is a link showing the artifacts/errors, or whatever this is:




So can you tell me if these are artifacts? Is my card faulty?
Can you give me your opinion, please?
Thank you.


----------



## x7007

Quote:


> Originally Posted by *Himura88*
> 
> Hi, six months ago I bought a Gigabyte GTX 1080 Xtreme Gaming, and I'm seeing some weird behavior in GTA 5. I don't know if these are artifacts/memory errors or some other issue with the card.
> Here is a link showing the artifacts/errors, or whatever this is:
> 
> 
> 
> 
> So can you tell me if these are artifacts? Is my card faulty?
> Can you give me your opinion, please?
> Thank you.


Do other games have those artifacts?
Do you use ReShade, SweetFX, or mods? Try disabling or removing them.
Does the Heaven benchmark or similar show the same issues? It seems like hardware, especially with the small freezes every time, but to be sure, check those other things first.


----------



## Himura88

Yes, I run ReShade and ENB... could this be from ReShade? I will try without.
The freezing in the video is from Vegas/rendering, not the game.
I don't have those freezes in-game.


----------



## Himura88

I tried without ReShade and it's the same... and always in that spot and a few other spots on the map.
My card is not overclocked, and if I underclock the core or memory nothing changes; I still get them.
I play BF1, Assetto Corsa, Project CARS, and some racing games and didn't notice them there, but I want to try something that uses the GTA 5 engine or something similar.
Do those "artifacts", or whatever they are, point to a faulty card?


----------



## Bal3Wolf

I will be getting my 1080 card Tuesday from FedEx, I hope. I haven't owned an NVIDIA card in many years, since my 8800 GT. What's some good info to read up on so I know how to overclock it? My card will be watercooled. Also, which driver is best to run: the newest, or is an older one better?


----------



## pez

Quote:


> Originally Posted by *Himura88*
> 
> i tried without reshade and it s the same....but everytime in that spot and other spots on the map
> My card is not OC and if i underclock speed or memory nothing happens i still have those..
> I play BF1 assetto corsa p cars and some racing games i didn t notice those but i want to try something that use gta 5 engine or something similar...
> But those "artefacts" or what are they seems like a faulty card ?


What is GTA V estimating your VRAM usage at? Are you using resolution scaling?


----------



## KickAssCop

Sold one of my 1080s. Waiting for the other to sell.


----------



## KedarWolf

Quote:


> Originally Posted by *KickAssCop*
> 
> Sold one of my 1080s. Waiting for the other to sell.


How much did you get for it?


----------



## fat4l

I see the MSI EK 1080 (watercooled) is selling for a good price now...
I guess it clocks worse than the FE, right?


----------



## KedarWolf

Quote:


> Originally Posted by *fat4l*
> 
> I see, MSI EK 1080 watercooled, is selling for some good price now ....
> I gues, it clocks worse than FE right ?


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> I see, MSI EK 1080 watercooled, is selling for some good price now ....
> I gues, it clocks worse than FE right ?
Click to expand...

Sorry about the double post. My PC messed up and I had to reboot.

I was saying you'd be better off getting a 1080 Ti for a bit more than a regular 1080, or at least waiting until the regular 1080s with 11Gbps GDDR5X memory are released. They're releasing them soon with the same type of memory as the 1080 Ti.


----------



## FedericoUY

Waiting for my EVGA GTX 1080 SC2 iCX to arrive... I hope it behaves well. I couldn't wait for the 11Gbps version since I had to buy now.


----------



## feznz

Quote:


> Originally Posted by *Fidelity21*
> 
> I'm thinking of adding another 1080 to my setup for SLI as I've never tried SLI before. The question is, can I use an EVGA hydro copper card card with my currently installed ASUS 1080 FE card that's equipped with an XSPC water block and backplate? I'm sure that sounds like a foolish question for those of you that have done this several times. I just can't seem to find an Asus 1080 FE at a good price right now as they seem to be more expensive than even the Strix card...which seems crazy.
> 
> I need to reconfigure my water loop at some point anyway because the water flows directly from the CPU to the GPU, which works fine right now, but I'm sure I could keep the GPU below 50C if I'm feeding it cooler water. Right now, it peaks at 56C fully overclocked and 34C at idle.


I remember this coming up in the GTX 770 forums: there was a case where two different models of the EVGA 2GB 770 (I think it was an early and a later model Classified) had different PCB architectures and weren't compatible in SLI.
It's a very rare incompatibility, but it's best to keep the cards identical; 99.9% of the time you will be fine.

BTW, loop order won't make a huge difference in temps, as the water temperature equalises with only a 1-2C variance between blocks and radiators.


----------



## fat4l

Quote:


> Originally Posted by *KedarWolf*
> 
> Sorry about the double post. My PC messed up, had to reboot.
> 
> I was saying you'd be better off getting a 1080 Ti for a bit more than a regular 1080 or at least wait until the 11GBPS GDDR5 memory regular 1080s are released. They are releasing them with the same type of memory as the 1080 Ti soon.


Yeah, but the question is: what if you already have a non-Ti 1080 and could get one more for SLI, gaining ~80% instead of 35%?
Yeah... I know... SLI... but still.







... it at least looks cool.


----------



## Bal3Wolf

Got my 1080 Hydro Copper installed; I like it. Now I'm learning to overclock it. It does +106 with no voltage bump, just raising the power limit. What limits on voltage and power should we stay under?


----------



## KedarWolf

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Sorry about the double post. My PC messed up, had to reboot.
> 
> I was saying you'd be better off getting a 1080 Ti for a bit more than a regular 1080 or at least wait until the 11GBPS GDDR5 memory regular 1080s are released. They are releasing them with the same type of memory as the 1080 Ti soon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeha but the question is, what if you already have 1080 non Ti, and you could get one more for SLi, getting ~80% more, instead of 35%.
> yeah ..I know ..SLI ........ but still
> 
> 
> 
> 
> 
> 
> 
> ...it alteast loooks cool
Click to expand...

Yes, 1080 SLI would be the way to go. Just be sure the price drop has hit where you buy it from; they're supposed to go to around $499 USD.


----------



## KedarWolf

Quote:


> Originally Posted by *FedericoUY*
> 
> Waiting for my Evga Gtx 1080 sc2 icx to arrive... I hope it behaves good. Couldn't wait for the 11gb version since I had to buy now mandatory.


I'm pretty sure it's an 11Gbps memory speed, not 11GB of RAM. It will still be 8GB of memory.


----------



## mbm

I would like to add small heatsinks to the VRAM and other components.
The VRAM is obvious, but I'm not sure what else should be cooled?


----------



## FedericoUY

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure it's 11gbps RAM speed and not 11gb of RAM. Still will be 8gb of memory.


Yeah, still 8GB, but a little faster. I missed the "Gbps"; that's what I meant to say. I would have preferred to wait for that version of the card, but I think I'll be happy with this one pushing a 1440p monitor. Anyone care to share experiences with iCX cards?


----------



## toncij

Quote:


> Originally Posted by *fat4l*
> 
> yeha but the question is, what if you already have 1080 non Ti, and you could get one more for SLi, getting ~80% more, instead of 35%.
> yeah ..I know ..SLI ........ but still
> 
> 
> 
> 
> 
> 
> 
> ...it alteast loooks cool


Why not 1080Ti SLI?


----------



## Bal3Wolf

Looks like I found the max overclock for my EVGA 1080 FTW Hydro Copper: the core does 2114MHz and the memory does 5580MHz.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> looks like i found my max overclock for my EVGA 1080 FTW HYDRO COPPER the Core does 2114Mhz and the Memory does 5580Mhz.


Did you use the frequency/voltage curve? You really should be able to get higher than that. Don't just set an offset.


----------



## 6u4rdi4n

How? Or do you mean the FTW model? My SC card seems to start crapping out around 2080-2090. I've had it running through Fire Strike just fine at over 2100, but suddenly it didn't want to.


----------



## Vellinious

Quote:


> Originally Posted by *6u4rdi4n*
> 
> How? Or do you mean the FTW model? My SC card seems to start crapping itself around 2080-2090. I've had it running through FS just fine at over 2100, but suddenly it didn't want to.


I've had five 1080s and haven't found a single one yet that wouldn't do at least 2170 using the frequency/voltage curve. Of the three I've had under water, all of them would do 2200+. But you have to use the curve... the offset method won't usually get you the same kind of results.

Drivers can affect achievable clocks.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> looks like i found my max overclock for my EVGA 1080 FTW HYDRO COPPER the Core does 2114Mhz and the Memory does 5580Mhz.
> 
> 
> 
> Did you use the frequency / voltage curve? Really should be able to get higher than that. Don't just set an offset.
Click to expand...

Hmm, I just used the offset. My card won't seem to go above 1.063v, either.


----------



## KedarWolf

Quote:


> Originally Posted by *toncij*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> yeha but the question is, what if you already have 1080 non Ti, and you could get one more for SLi, getting ~80% more, instead of 35%.
> yeah ..I know ..SLI ........ but still
> 
> 
> 
> 
> 
> 
> 
> ...it alteast loooks cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why not 1080Ti SLI?
Click to expand...

He already has one 1080; that's why. Two regular 1080s are superior to one 1080 Ti.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *6u4rdi4n*
> 
> How? Or do you mean the FTW model? My SC card seems to start crapping itself around 2080-2090. I've had it running through FS just fine at over 2100, but suddenly it didn't want to.
> 
> 
> 
> I've had 5 1080s and haven't found a single one yet, that wouldn't do at least 2170 using the frequency / voltage curve. Of the 3 I've had under water, all of them would do 2200+. But you have to use the curve.....offset method won't usually get you the same kinds of results.
> 
> Drivers can affect achievable clocks.
Click to expand...

I tried the voltage curve and sadly didn't get any better overclock. Anything over +100 crashes 3DMark, and my card never goes up to the 1.093v I see people say these cards can reach.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> hmm i just used offset my card wont seem to go above 1.063 either.


It won't pull more voltage unless it needs to. You'll need to move the voltage slider all the way up as well.

At the clocks you're running, I'm shocked it's pulling more than 1.03v.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> hmm i just used offset my card wont seem to go above 1.063 either.
> 
> 
> 
> It won't pull more voltage unless it needs to. You'll need to move the voltage slider all the way up as well.
> 
> At the clocks you're running, I'm shocked it's pulling more than 1.03v
Click to expand...

Using the frequency offset, I think I'll be stable at +150 for a core clock of 2164MHz, but nothing more. It passed a few 3DMark tests, but the next step up freezes 3DMark. After a few more runs, 2164MHz was unstable too using the offset.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Using freq offset for voltage i think i will be stable at +150 for a core clock of but nothing more 2164mhz it passed a few 3dmark tests but the next jump freezes 3dmark. after a few more runs 2164mhz was unstable also using the freq offset.


I prefer MSI's Afterburner. The curve is laid out better and easier to use.

I've found that the tighter the curve, the better.

Something like this:



Opposed to this:


----------



## FedericoUY

Is that curve present on a gtx 1070? I never saw it...


----------



## mbm

I have been playing with this curve, but really I have no idea what to do.
What is the goal? More or less voltage per MHz?


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> Using freq offset for voltage i think i will be stable at +150 for a core clock of but nothing more 2164mhz it passed a few 3dmark tests but the next jump freezes 3dmark. after a few more runs 2164mhz was unstable also using the freq offset.
> 
> 
> 
> 
> 
> I prefer MSI's AB. The curve is laid out better, and easier to use.
> 
> I've found that the tighter the curve, the better.
> 
> Something like this:
> 
> 
> 
> Opposed to this:
Click to expand...

Oh, I'd never even seen that; I'm using the basic skin in MSI AB. I'll give that curve a shot. I also just did a PSU swap: when I had my 7970s I had tons of coil whine, and it was still there when I put in my 1080, so I switched my Corsair TX950 for my SeaSonic M12D SS-850 850W. I think the whine is gone; maybe I'll get a more stable overclock.

OK, even using MSI AB and doing the same type of thing as in your pic, anything over +100 crashes 3DMark. 2114 seems to be the max it will run.


----------



## JunkaDK

Question: when overclocking the GTX 1080 FTW, could it be a power bottleneck that my cards use only one 8-pin output from the PSU, daisy-chained to the two 8-pin inputs on the card?
Quote:


> Originally Posted by *Bal3Wolf*
> 
> oh i never even seen that im using the basic skin on msi ab will give that curve a shot, i also just did a psu swap when i had my 7970s i had tons of coil whine when i put my 1080 still had it so switched my corsair tx950 with my SeaSonic M12D SS-850 850W and i think the whines gone maybe i will have a more stable overclock.
> 
> Ok even using msi and doing same type thing you did in your pic anything over +100 crashes 3dmark 2114 is max it will run it seems like.


Yeah, my cards also crash around the +100 point; temps are around 35-40C on the core. Is it a power bottleneck that my cards are fed from a single 8-pin output on the PSU that loops into both 8-pin inputs on the card?
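On the daisy-chain question, the nominal connector ratings make the budget easy to sketch. The 75 W and 150 W figures below are the standard PCIe ratings; the usual worry with a single daisy-chained cable is that one set of PSU-side wires carries the current of both 8-pin plugs, not that the connectors themselves lack budget. A rough sanity check, as a sketch only:

```python
# Rough power-budget sketch for a 2x8-pin card like the FTW
# (nominal spec ratings, not measured values).
PCIE_SLOT_W = 75    # PCIe x16 slot, per spec
EIGHT_PIN_W = 150   # one 8-pin PEG connector, per spec

def board_power_budget(num_8pin: int) -> int:
    """Maximum board power by connector spec, in watts."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

print(board_power_budget(2))  # -> 375

# A single daisy-chained PSU cable still carries the current of both
# 8-pin plugs through one set of PSU-side wires, so the cable, not
# the connectors, can become the weak point under heavy overclocks.
```

By the spec numbers alone a 2x8-pin card has plenty of headroom, so a crash at +100 is more likely a silicon limit than a power one, though running two separate cables rules the cable out.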


----------



## DrFreeman35

I was having issues with mine today. I thought I had mine overclocked and stable at 2114, with +75MHz on the core clock and +150 on memory; if I try to go above that, Heaven or Fire Strike crashes. Voltage is all the way up, and I'm on the slave BIOS for both cards. The main GPU in my SLI hits the voltage limit, but neither one hits the power limit. Any idea what I can change? I'm on air as well, BTW... EVGA GTX 1080 FTW ACX. I never knew about the upgrade to iCX, as I built my computer recently.


----------



## DrFreeman35

When doing an SLI setup, is it recommended to OC one card at a time, or both cards together? I'm not sure why my main GPU is hitting the voltage limit, and I wonder if that's a problem. I normally don't hit anything higher than 65C, so thermals can't be the issue.


----------



## JunkaDK

Just wanted to share a few pics from the build I finished last weekend.







Just LOVE the chrome backplates on my GTX 1080 FTWs.


----------



## mbm

That is just a fuc.... nice build there, JunkaDK


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> That is just a fuc.... nice build there, JunkaDK


THANKS


----------



## Bal3Wolf

Quote:


> Originally Posted by *JunkaDK*
> 
> Question, when overclocking the GTX1080 FTW could it be a power bottleneck that my cards are using only one 8-pin output from the PSU that daisy chains to the two 8 port inputs on the card?
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> oh i never even seen that im using the basic skin on msi ab will give that curve a shot, i also just did a psu swap when i had my 7970s i had tons of coil whine when i put my 1080 still had it so switched my corsair tx950 with my SeaSonic M12D SS-850 850W and i think the whines gone maybe i will have a more stable overclock.
> 
> Ok even using msi and doing same type thing you did in your pic anything over +100 crashes 3dmark 2114 is max it will run it seems like.
> 
> 
> 
> Yeah my cards also crash around the 100+ point, temps are around 35-40 on the core. Is it a power bottleneck that my cards are power fed from a single 8-pin output on the PSU that loops into both 8 port inputs on the card?
Click to expand...

I don't think mine can be a power issue; I've tried two PSUs and I'm only using one card.


----------



## mbm

JunkaDK, what are your load temps and boost clocks on those cards?


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> JunkaDK
> what are your load temps and boost clock on those cards?


Power target 130%
Temp target 92
GPU clock offset +100
Mem clock offset +0

Boosting to 2050mhz
GPU temps 42-44c after 15 mins of max load.

GPU 1 hovers around 44 and GPU 2 around 40


----------



## mbm

Quote:


> Originally Posted by *JunkaDK*
> 
> Power target 130%
> Temp target 92
> GPU clock offset +100
> Mem clock offset +0
> 
> Boosting to 2050mhz
> GPU temps 42-44c after 15 mins of max load.
> 
> GPU 1 hovers around 44 and GPU 2 around 40


why no mem OC ?


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> why no mem OC ?


First I want to find the max clocks for the cards; then I can focus on memory speeds.







Or is that not a good theory?


----------



## Vellinious

Y'all should really get over using the offset method...


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Ya'll should really get over using the offset method.....


Sadly, my card clocks no better in curve mode; it still tops out at around +100.


----------



## DrFreeman35

Quote:


> Originally Posted by *Bal3Wolf*
> 
> sadly my card clocks no better in curve mode still tops out at around +100.


I did not get any benefit from using the voltage curve in MSI Afterburner, but it did help me find out what to shoot for. I finally got mine stable and it's not downclocking at all. One of my cards is holding me back, which sucks, but that's the way it goes: 2088 on the main card using 1.075v and 2088 on the other using 1.043v. I can go higher, but I don't see any real reason to, considering my main GPU already needs considerably more voltage than the 2nd.


----------



## JunkaDK

Quote:


> Originally Posted by *DrFreeman35*
> 
> I did not get any benefit from using the voltage curve in MSI Afterburner, but it did help me find out what to shoot for. I finally got mine stable and its down clocking at all. One of my cards is holding me back, which sucks, but it is the way it goes. 2088 on main card using 1.75v and 2088 on other using 1.43..... I can go higher, but do not see any real reason to considering my main gpu is already considerably higher than the 2nd.


How do you set the voltage ? I only see the option to max out voltage %....


----------



## DrFreeman35

Quote:


> Originally Posted by *JunkaDK*
> 
> How do you set the voltage ? I only see the option to max out voltage %....





Spoiler: Warning: Spoiler!







By clicking the icon next to Core Clock, the voltage/frequency graph will come up, and you can create your own curve. Pic for reference; hope it helps.


----------



## mbm

Quote:


> Originally Posted by *Vellinious*
> 
> Ya'll should really get over using the offset method.....


I wish you could be more specific on what to do with the curve, because I have no idea how to benefit from it versus the offset.


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> I wish you could be more specific on what to do with the curve, because I have no idea how to benefit from it versus the offset.


Each point represents a voltage setting. Grab the one for 1.093v, or whichever voltage you'd like to test, and drag it up until it's at the clock you're trying to run.



Now, as you experiment, you're likely to find, like the rest of us have, that running higher clocks does not always equal better performance... in fact, sometimes it means worse performance. The curve has a lot to do with that. Many of us have played with these voltage/frequency curves for months... let me say that again: we've played with them for MONTHS. Try harder...

Finding the place where your card runs best takes time and patience. It's testing voltage points at certain clocks and certain temps... because a 5C or 10C drop in temps can mean you have to start over from scratch. Every GPU is different. I have done this on five cards. For the two FTWs I have now, I keep 12 different curves for different clocks, voltages, and temps. And I'm still finding more performance here and there.


----------



## nrpeyton

Quote:


> Originally Posted by *mbm*
> 
> I wish you could be more specific on what to do with the curve, because I have no idea how to benefit from it versus the offset.


Using the offset method simply increases the clock at every single voltage point.

Some of those clocks might be too high for their voltage point, which can cause a crash.

The curve method allows finer tuning.

So while +75 on the core might be fine at 2025MHz/1.093v (giving 2100),
it could crash at 1900/1.000v (as 1975 is too much for 1.0v).

The clocks/voltages above are just examples, not true stock values.

Those of us who spent months tweaking it went through each stock voltage/frequency point and tested how high we could go at every single voltage point along the curve before crashing. Knock it back two frequency steps (each step is 13MHz) for stability, then move up to the next voltage point, and repeat.

The higher the voltage, the more conservative the stock curve becomes, obviously for stability at stock. So what we found is that most of the gains are at the top end of the curve.

To be fair, you don't need to do all that: just play around with the end of the curve and leave the first three quarters of it at stock.

But you also have to keep temps down to get gains at the high end of the voltage curve. If you go too high, the card might not crash straight away, but, unknown to you, it will go into "error-correcting mode" in an effort to stop itself crashing. It drops frames to do this.
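The offset-versus-curve distinction above can be sketched with a toy model. All voltage/frequency values and function names below are made up for illustration; a real Boost 3.0 table has far more points:

```python
# Toy model of a Pascal-style voltage/frequency curve.
# All values are illustrative, not real Boost 3.0 data.
STEP = 13  # Boost 3.0 moves clocks in ~13 MHz bins

stock_curve = {0.800: 1700, 0.900: 1850, 1.000: 1950, 1.093: 2025}

def apply_offset(curve, offset):
    """Offset method: raises the clock at EVERY voltage point."""
    return {v: mhz + offset for v, mhz in curve.items()}

def edit_point(curve, volt, new_mhz):
    """Curve method: raises only the chosen voltage point."""
    edited = dict(curve)
    edited[volt] = new_mhz
    return edited

def step_back(mhz, steps=2):
    """Back a clock off by a couple of 13 MHz bins for stability."""
    return mhz - steps * STEP

# A +75 offset also pushes the 1.000 V point to 2025 MHz, which that
# voltage may not sustain: a mid-curve crash waiting to happen.
offset_curve = apply_offset(stock_curve, 75)

# Editing the curve lifts only the 1.093 V point, leaving the lower
# bins at their stock (stable) clocks.
tuned_curve = edit_point(stock_curve, 1.093, 2100)

print(offset_curve[1.000], tuned_curve[1.000], step_back(2100))
# -> 2025 1950 2074
```

The `step_back` helper mirrors the "knock it back two 13 MHz steps" rule of thumb described above; in practice you do all of this by dragging points in Afterburner, not in code.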


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Using the offset method simply increases the clock at every single voltage point.
> 
> Some of those clocks might be too high for that voltage point, which could cause a crash.
> 
> The curve method allows finer tuning.
> 
> So while a +75 on the core might be fine at 2025MHZ/1.093. (2100)
> it could crash 1900/1.000v (as 1975 at 1.0v is too much for 1.0v)
> 
> the above /\ clocks/voltages were just examples not true stock values.


Are you working more or something? You haven't been around as much. Ever hit 2300?


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> You working more or something, you haven't been around as much? Ever hit 2300?


Not really. I still get the odd PM or two every week (without fail) re: the YouTube videos I posted a while back. Someone at the EVGA forums started sharing them (so I was told by a moderator there). lol

_Usually just people asking how to get past 1.093v._

I did my first ever extreme-cooling run on my old FX chip last month, which was fun. (Dry ice: I broke the record at HWBOT for the fastest CPU frequency ever recorded on my mobo.)

Kind of excited to see what EVGA comes up with in terms of a Classified iCX/Ti step-up (as you know, Classified customers are still waiting for this elusive announcement, lol).

Other than that, I grabbed a new mobo for my FX. It's stable at the same CPU clocks with up to 150mV less voltage: for example, I used to need about 1.35v for Prime95, and on the new mobo I only need 1.21v. (Everything else overclocks better too.) So I'm still playing around with that.

I haven't got rid of my FX, as it's apparently the only modern platform that isn't affected by the "cold bug", and I'm just getting into extreme cooling.

I'm thinking of grabbing a new Zen setup, but I want to wait until the bugs are ironed out first. When they are, I'll be going for the 1700, as they *all* overclock by up to 1GHz on air, and once at those clocks they equal the performance of the most expensive 1800X within 1%.

i7 6900K performance for 300 bucks ;-)


----------



## DrFreeman35

Quote:


> Originally Posted by *nrpeyton*
> 
> Not really, I still get the odd PM or two every week (without fail) r.e. my youtube videos I posted a while back. Someone at EVGA forums started sharing them (so I was told by a moderator there). lol
> 
> _Usually just people asking how to get past 1.093v._
> 
> Done my first ever Extreme Cooling run on my old FX chip last month. Which was fun. (Dry ICE - I broke the record for fastest CPU frequency ever recorded using my mobo at HWBOT).
> 
> Kind of excited to see what EVGA comes up with in terms of Classified ICX/TI step-up. (as you know classified customers are still waiting for this elusive announcement lol).


Holy Scotland! I found your video on YouTube before I built my PC last week. That was some insane stuff, and I can't imagine all the questions you get about how you did it. Awesome video, congrats.


----------



## nrpeyton

Quote:


> Originally Posted by *DrFreeman35*
> 
> Holy Scotland, found your video on Youtube before i built my PC last week. That was some insane stuff going on, and I can not imagine all the questions you get about how you did it. Awesome video, congrats.


Thanks. ;-)

@Vellinious what you been up to yourself?


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> Thanks. ;-)
> 
> @Vellinious what you been up to yourself?


Traveling a lot for work. It's a lot busier than I had even planned on. I play as much as I can,

I made a post in the EVGA forums to explain the offset / curve method a little bit better. This might help some of you understand what you're doing.

See, when you set an offset, the voltage curve always keeps the top of the curve at a lower voltage. That allows Boost 3.0 to do its thing and lower the voltage if at all possible. Trouble is, it also adds instability, because it's always trying to lower the voltage. Like in the shot below: see how the +160 offset has set the voltage/frequency curve? 1031mv... it's never going to run that frequency at that voltage without dropping temps down around 0c.



BUT... modify that curve a little bit, like pictured below, so the lowest-voltage points don't tempt Boost 3.0 into doing its thing, and it'll lock in on that voltage/core clock. With this curve, the voltage won't try to drop below 1.093v, and it'll lock into the 2202 core clock and not try to drop even when temps get too high... it'll just drop frames, or crash outright.



It's all in the curve... and that curve will shift with every 8-10c drop or rise in core temps.
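A toy Python sketch of the idea above (my own simplification -- the real boost algorithm is undocumented, and the numbers are just the ones from the example):

```python
# Toy model of a GPU Boost 3.0 voltage/frequency curve -- an illustration
# of the idea above, NOT actual driver behavior. Points are (millivolts, MHz);
# assume boost settles on the lowest-voltage point that carries the curve's
# top clock (it always tries to shed voltage if the clock allows).

def operating_point(curve):
    """(mV, MHz) point boost settles on under this toy model."""
    top_clock = max(mhz for _, mhz in curve)
    return min((mv, mhz) for mv, mhz in curve if mhz == top_clock)

# Plain +160 offset: the top clock ends up promised at a low-voltage point,
# so boost chases 2202 MHz at 1031 mV -- unstable unless temps are near 0c.
offset_curve = [(1031, 2202), (1062, 2189), (1093, 2164)]

# Hand-edited curve: lower-voltage points are kept conservative, so the only
# place 2202 MHz lives is the 1093 mV point -- the card locks in there.
edited_curve = [(1031, 2100), (1062, 2150), (1093, 2202)]
```

Toy model only, but it shows why the edited curve "locks in" at max voltage while the plain offset keeps sagging.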


----------



## JunkaDK

Quote:


> Originally Posted by *nrpeyton*
> 
> Not really, I still get the odd PM or two every week (without fail) r.e. my youtube videos I posted a while back. Someone at EVGA forums started sharing them (so I was told by a moderator there). lol
> 
> _Usually just people asking how to get past 1.093v._
> 
> Done my first ever Extreme Cooling run on my old FX chip last month. Which was fun. (Dry ICE - I broke the record for fastest CPU frequency ever recorded using my mobo at HWBOT).
> 
> Kind of excited to see what EVGA comes up with in terms of Classified ICX/TI step-up. (as you know classified customers are still waiting for this elusive announcement lol).
> 
> Other than that, I grabbed a new mobo for my FX. It resulted in being stable at same CPU clocks for up to 150mv less voltage. So for example I used to need about 1.35v for prime95. On new mobo I only need 1.21v. (other things all overclock better too) so still playing around with that.
> 
> Not got rid of my FX as its the only modern chipset that apparently isn't affected by the "cold bug" and I'm just getting into the extreme cooling.
> 
> Thinking of grabbing a new ZEN setup, but I want to wait until the bugs are ironed out first. When they are.. I'll be going for the 1700. As they *all* overclock by up to 1GHZ on air. And once at those clocks, equal the performance of the most expensive 1800x within 1%.
> 
> i7 6900k performance for 300 bux ;-)


So can someone tell me, for instance, what's wrong here? Why isn't it going to the next voltage step? Too hot?


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> So can someone tell me .. for instance.. whats wrong here? why isn't it going to the next voltage step? too hot?


Looks like that curve is set to run something like 2050ish at 1.081v. Boost 3.0 usually won't try to run voltage higher, especially if temps are above a certain point. If you want the GPU to run at 1.093v, you need to set the curve to run it there.


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> Looks like that curve is set to run something like 2050ish at 1.081v. Boost 3.0 usually won't try to run voltage higher, especially if temps are above a certain point. If you want the GPU to run at 1.093v, you need to set the curve to run it there.


Man, I feel so stupid right now







Maybe I just don't understand Boost 3.0 and need to read up on that...


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> Man i feel so stupid right now
> 
> 
> 
> 
> 
> 
> 
> Maybe i just don't understand boost 3.0 and need to read up on that...


Man, click on that dot and it'll tell you exactly what voltage marker you're on. The next one to the right is the one for 1.093v.


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> Man, click on that dot, and it'll tell you exactly what voltage marker you're on. The next one to the right, is the one for 1.093v.




Now the curve is at 2100MHz at 1.093v, but it stays at 2000MHz (44c core)


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> 
> 
> Now the curve is at 2100mhz at 1.093v but it stays at 2000mhz (44c core)


Did you apply the overclock? Still gotta do that. Hit the checkmark on the main screen.


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> Did you apply the overclock? Still gotta do that. Hit the checkmark on the main screen.


i did yes


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> i did yes


Is the voltage slider all the way up?


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> Is the voltage slider all the way up?


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*


Kick your fan speed all the way up. I can't think that'd be it, but...if it's already 40c under load.....


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> Kick your fan speed all the way up. I can't think that'd be it, but...if it's already 40c under load.....


All fans and the pump are maxed out on the loop... perhaps temps will drop next week when I delid my i7-7700K


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> all fans and pump are maxed out on the loop.. perhaps temps will drop next when when i delid my i7-7700k


Wait...you're on water and hitting 44c at idle? What are your ambients?


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> Wait...you're on water and hitting 44c at idle? What are your ambients?


Not idle... MAX load after 20 mins is 48c

Idle 28-30c


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> Not idle.. MAX load efter 20 mins 48c
> 
> Idle 28-30c


That load temp isn't high enough to make it not kick in....

Try 2153 @ 1.093v.


----------



## Vellinious

Quote:


> Originally Posted by *Vellinious*
> 
> That load temp isn't high enough to make it not kick in....
> 
> Try 2153 @ 1.093v.


OH....ha, sorry. You're running SLI. If SLI is enabled, it won't set the curve for each card. Save that curve as a profile...unlink the GPUs to overclock them individually, and then use that same profile for the 2nd GPU. Took me a few tries to figure this out as well.


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> OH....ha, sorry. You're running SLI. If SLI is enabled, it won't set the curve for each card. Save that curve as a profile...unlink the GPUs to overclock them individually, and then use that same profile for the 2nd GPU. Took me a few tries to figure this out as well.


Thanks a lot, will try.







Pregnant GF yelling for me to rub her feet


----------



## mbm

I can't get it to boost higher than 2076MHz stable... offset or curve.
It can boost to 2100+ but it's not stable.


----------



## seabiscuit68

I have one of the EVGA ACX 3.0 cards that supposedly burst into flames. EVGA is offering to upgrade to the iCX cooler for $100. Wondering if I just sell the ACX one for $400 or whatever I can get for it and take the $100 I would have spent on the iCX cooler and put the $500 towards a GTX 1080Ti. Been playing more and more on my Nvidia Shield hooked up to the 4k TV, aka...more 4k gaming. Plus I have the Vive. Seems like the 1080Ti might be a good idea...

If you were in my position, would you guys do the $99 upgrade to the iCX card and then wait for the 1180Ti (probably won't get more than $150 selling the 1080 at that point) or just cut my losses and drop the GTX 1080 and go for the GTX 1080Ti?

For those who are going to tell me about the reset of the Step-Up program with the iCX upgrade, they use the MSRP of the iCX GTX 1080, which I'm sure will drop to about $500 or $520. Subtract shipping to them ($20) and then shipping again (to send in for the 1080Ti) and then the $100 on the upgrade....and I'm only really getting $360 for it. I can get more on the used market.


----------



## Vellinious

Quote:


> Originally Posted by *seabiscuit68*
> 
> I have one of the EVGA ACX 3.0 cards that supposedly burst into flames. EVGA is offering to upgrade to the iCX cooler for $100. Wondering if I just sell the ACX one for $400 or whatever I can get for it and take the $100 I would have spent on the iCX cooler and put the $500 towards a GTX 1080Ti. Been playing more and more on my Nvidia Shield hooked up to the 4k TV, aka...more 4k gaming. Plus I have the Vive. Seems like the 1080Ti might be a good idea...
> 
> If you were in my position, would you guys do the $99 upgrade to the iCX card and then wait for the 1180Ti (probably won't get more than $150 selling the 1080 at that point) or just cut my losses and drop the GTX 1080 and go for the GTX 1080Ti?
> 
> For those who are going to tell me about the reset of the Step-Up program with the iCX upgrade, they use the MSRP of the iCX GTX 1080, which I'm sure will drop to about $500 or $520. Subtract shipping to them ($20) and then shipping again (to send in for the 1080Ti) and then the $100 on the upgrade....and I'm only really getting $360 for it. I can get more on the used market.


Pretty sure you already missed that upgrade window. As far as I know, it ended on 2/28....may wanna check on it.


----------



## Bal3Wolf

I haven't played with my card too much today, but I just noticed something: if I only adjust 1.093 and up in the curve, my card uses that voltage, whereas if I start at 1.05 it never goes above 1.063. Might be able to get some better clocks now.


----------



## seabiscuit68

Quote:


> Originally Posted by *Vellinious*
> 
> Pretty sure you already missed that upgrade window. As far as I know, it ended on 2/28....may wanna check on it.


I signed up for it (no money up front) so I will have the option to do it if I want to.


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> Traveling a lot for work. It's a lot busier than I had even planned on. I play as much as I can,
> 
> I made a post in the EVGA forums to explain the offset / curve method a little bit better. This might help some of you understand what you're doing.
> 
> See, when you set an offset, the voltage curve always keeps the top of the curve at a lower voltage. This would allow for boost 3.0 to do it's thing, and lower the voltage if at all possible. Trouble is, it also adds instability, because it's always trying to lower the voltage. Like in this below. See how the +160 offset has set the voltage / frequency curve? 1031mv.....it's never going to run that frequency at that voltage, without dropping temps down around 0c.
> 
> 
> 
> BUT.....modify that curve a little bit, like pictured below here, and the lowest voltage point, at the highest clock point, will avoid boost 3.0 trying to do it's thing, and lock in on that voltage / core clock. With this curve, the voltage won't try to drop below the 1.093v, and it'll lock into the 2202 core clock, and not try to drop even when temps get too high....it'll just drop frames, or it'll crash outright.
> 
> 
> 
> It's all in the curve....and that curve will change with every 8-10c drop, or rise in core temps


You got a link to it on the EVGA forums?
(I like the way you explain it)


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> You got a link to it on EVGA forums ?
> (I like the way you explain it)


It was in a different kind of thread. Someone was asking why their voltage would only go to 1.063v. Then, of course, some other clown jumps in and says, "well, the voltage will never read right, relative to what the core actually receives". Which... is technically true, but at the same time had ABSOLUTELY NOTHING to do with the question the guy was asking.

When responding to this stupidity, I realized that I had just written a pretty good overclocking summary as well... so I decided to go ahead and post it. lol


----------



## KedarWolf

Midnight tonight, scour newegg and amazon for my 1080 Ti.


----------



## Vellinious

I just bought another 1080 FTW for $439. = )


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> You got a link to it on EVGA forums ?
> (I like the way you explain it)
> 
> 
> 
> It was in a different kind of thread. Someone was asking why their voltage would only go to 1.063v. Then of course, some other clown jumps in and says, "well, voltage won't ever read right, according to what the core actually receives". Which....is technically true, but at the same time....had ABSOLUTELY NOTHING to do with the question the guy was asking.
> 
> When responding to this stupidity, I realized, that I had just written a pretty good overclocking summary as well...so, decided to go ahead and post it. lol

lol, your help has been the only easy-to-understand help I've found. Thanks a lot; I got the card up to [email protected], but sadly it won't go any higher without failing 3DMark.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> lol your help has been the only easy to really understand help i have found thanks alot i got card up to [email protected] sadly it wont go any higher without failing 3dmark.


Glad I could help out, man.


----------



## Bal3Wolf

lol, and I spoke too soon. I noticed my scores were lower even though it was stable; scores were dropping. I swear this card doesn't wanna be over 2114 without either kicking in ECC or crashing. It's holding 1.093 like it should.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> lol and i spoke to soon i noticed my scores were lower even tho it was stable scores were droping i swear this card dont wanna be over 2114 without either kicking in Ecc or crashing its holding 1.093 like it should.


Higher temps do that....if it'll run at 2153....Lower the core temp by 10c, and you'll see the increase in performance that should go with the increased clock. The colder you keep the core, the better it'll perform at higher clocks. The warmer it gets, the more voltage hungry it'll be, and perform worse, or...become unstable and crash.

It's a never ending battle between clock speeds, voltage requirements, and keeping temps low enough to make them "GO"


----------



## Bal3Wolf

Haven't seen temps go above like 45c. My room's a little warm, but even cold I think I'm at the limits of cooling on this EVGA block; pretty sure there's no way I can get it down to 35c, lol. My room would need to be like 10-15c to do that. OK, this is very weird: if I use, say, a +100 offset with +588 memory, I get around 18k in 3DMark, but if I use the curve with the same +100 for the same core frequency of 2114, the score drops 300 points, even though it uses more volts, so you'd think it would be more stable, lol. Figuring out this card is like putting together a puzzle that you keep losing a piece to.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> havet seen temps go above like 45c my rooms a little warm but even cold i thk im at limits of cooling on this evga block pretty sure no way i can get it down to 35c lol my room would need to be like 10-15c to do do that likely. Ok this is very wierd if i use say +100 offset with +588memory i get around 18k in 3dmark if i use curve with same +100 for same core freq of 2114 score drops 300 points but uses more volts so you would think it would be more stable lol. Lol figuring out this card is like putting a puzzle together that you keep losing a piece to lol.


Try that clock with lower voltage on the curve.


----------



## Bal3Wolf

OK, I will. I don't even think I'm hitting ECC now; something else is up, since the same clock works fine with the offset.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> ok i will, i dont even think im hitting Ecc now something else is up sence same clock works fine in offset.


Boost 3.0 works awfully strange sometimes. Sometimes it's the difference of 1c... a run where the core goes over 45c crashes or runs worse FPS than the previous run where the core stayed under 44c. With mine, @ 2252, if the core goes over 25c, the run is gonna suck. If I can keep it under 24c, it's gonna run great.

Learn where those temp layers are and use voltage and clocks to keep from crossing over them. They're there at about every 10c or so.
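Those temp layers, sketched as code (the 13MHz bin size, 5c band width, and 35c starting point are rough numbers people report, not a spec -- they vary per card and BIOS):

```python
# Toy model of the temperature "layers": assume boost drops the effective
# clock one ~13 MHz bin each time the core crosses a temp band. Bin size,
# band width, and the 35c starting point are rough observations, not spec.

BIN_MHZ = 13       # approximate Pascal clock-bin size
BAND_C = 5         # approximate width of one temp band
FIRST_BAND_C = 35  # temp where stepping starts (varies per card/BIOS)

def effective_clock(set_clock_mhz, core_temp_c):
    """Clock after temperature bin drops, under this toy model."""
    if core_temp_c <= FIRST_BAND_C:
        return set_clock_mhz
    bands = -(-(core_temp_c - FIRST_BAND_C) // BAND_C)  # ceil division
    return set_clock_mhz - BIN_MHZ * bands
```

So a curve set for 2152 would, in this model, show ~2126 once the core hits 44c, which is why dropping 10c of core temp buys back a bin or two.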


----------



## Bal3Wolf

Well, at 2114 anything below or above 1.062 turned in a crappy 3DMark; it has to be exactly 1.062, lol. So maybe some of my higher clocks are really picky too.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> well at 2114 anything below 1.062 or above it turned in a crappy 3dmark it has to be exacty 1.062 lol so maybe some of my higher clocks are really picky to lol.


oh...they are.


----------



## DrFreeman35

I'm guessing I got one good card and one bad one. GPU 1 is hitting 2076 with 1.081v and GPU 2 is hitting the same MHz with 1.043v. I'm only hitting 58c on air when stressing it, so I'm not sure what else to think. New to all of this, so it's been a pain. I can hit 2114 on my GPUs, but they always throttle back to 2076 when stressing; one card is always hitting the voltage limit. Should I back it off? I tried the curve, yet it still didn't help any.


----------



## DrFreeman35

Noob question: I've got my GPUs in slots 1 & 3, yet only GPU 1 will display anything when connected to monitors. I have both of my monitors plugged into it; is that something I would need to change in the BIOS, or am I missing something? Wondered why I couldn't have my second GPU hooked up to the monitors.


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> OH....ha, sorry. You're running SLI. If SLI is enabled, it won't set the curve for each card. Save that curve as a profile...unlink the GPUs to overclock them individually, and then use that same profile for the 2nd GPU. Took me a few tries to figure this out as well.


Even though I set it to 2153, when I click save it might change to something else close to 2153 @ 1.093v.

I don't get how the profiles work. Doesn't it save all the overclocking and the curve in one profile? If I do it for card one (set the overclock and the curve with the cards unlinked), then choose card 2 and load the profile, the curve is not activated!... or it's flattened out, and then I have to set the curve again for card 2.


----------



## mbm

Just to clarify: when you talk about more performance, is that equal to more MHz? Or are you saying that lowering the MHz or voltage may result in better performance (FPS)?

I have been playing with this curve now. As I said, I can't get it past 2100MHz regardless of voltage; I tried everything up to 1.093v.
BUT I can lower the voltage all the way down to 1.000v and still do 2100MHz.


----------



## max883

EVGA GTX 1080 ACX SC. Max temp: 66c, max fan speed: 40% (1460 RPM).

I did the thermal pad mod with Thermal Grizzly Kryonaut, plus a custom fan curve: 20% idle, 40% max.

Downvolted to 0.950v at 2000MHz on the GPU.


----------



## jase78

Got someone willing to pay $500 now for my Strix 1080. What to do, what to do. How long do you guys think till we see AIB 1080 Tis? I can always use onboard graphics for a bit.


----------



## ondoy

$500 for a used 1080? Lucky.
Gigabyte 1080s are now going for $499 on Amazon, brand new.


----------



## DrFreeman35

Some people are not very smart; new ones are just as cheap, and for a little more you get a Ti... I don't understand how people think sometimes.


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> just to clarify. When you talk more performance is that equal to most mhz? Or do you say that lowering the mhz or voltage may result in better performance (FPS)..
> 
> I have been playing with this curve now.. As I told I cant get it past 2100 mhz regardless of voltage. tried all up to 1.093V.
> BUT I can lower the voltage all the way down to 1.000V and still do 2100 mhz.


More performance, as in, more FPS.

If you're getting 2100 @ 1.0v, you're likely missing something in there somewhere at a higher clock / voltage setting.


----------



## Bal3Wolf

Yeah, I've found how picky they can be, lol. I gave up for a while; too much dang work tweaking them. I left mine at 2115/5600.


----------



## dentnu

Hi, can someone please provide the MSI GeForce GTX 1080 *Gaming X* BIOS? It was on the support page of the MSI website for this card, but it looks like they took it down. Does anyone have a copy by any chance? I need to flash my card back to the original BIOS. Thanks


----------



## owikhan

Quote:


> Originally Posted by *dentnu*
> 
> Hi can someone please provide the MSI Geforce GTX 1080 *Gaming X* bios. It was on the MSI website to download from there support page for this card but it looks like they took it down. Does someone have a copy by any chance ? Need to flash my card back to the original bios. Thanks


I have the same card with the original BIOS. How can I help you? How do I save my GPU BIOS and send it to you?
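For the dump itself, a minimal sketch of the usual route (the filename is made up, and you should double-check the flags against your nvflash version's `--help` before running anything, from an elevated prompt):

```python
# Sketch of dumping / restoring a vBIOS with NVIDIA's nvflash CLI.
# GPU-Z can also save the BIOS from its main tab, which is the easier route.
# The ROM filename here is hypothetical; verify flags for your nvflash build.
import subprocess  # used by the example call at the bottom

def vbios_cmds(rom="gamingx_stock.rom"):
    save = ["nvflash64", "--save", rom]  # dump the currently flashed vBIOS
    flash = ["nvflash64", "-6", rom]     # reflash; -6 overrides ID mismatch
    return save, flash

# To actually run: subprocess.run(vbios_cmds()[0], check=True)
```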


----------



## dmnclocker

Hi all. I just got my Asus Strix GTX 1080. Was wondering if someone with the same card is running a custom BIOS OK, like the one from the OC Edition card?


----------



## 6u4rdi4n

Quote:


> Originally Posted by *dentnu*
> 
> Hi can someone please provide the MSI Geforce GTX 1080 *Gaming X* bios. It was on the MSI website to download from there support page for this card but it looks like they took it down. Does someone have a copy by any chance ? Need to flash my card back to the original bios. Thanks


https://www.techpowerup.com/vgabios/184799/msi-gtx1080-8192-160606


----------



## DrFreeman35

OK, so noob mistake. Not sure if it makes a difference or not, but I had the *SLI* switch on my Rampage V Edition 10 disabled. I forgot to enable it again after messing with my board before putting it in the case, so I have been running this whole time with it 'disabled'... Is that a problem? Does it make a difference? Also curious as to why one of my fans shows a higher RPM in MSI Afterburner. Pic for details...




----------



## mbm

Quote:


> Originally Posted by *Vellinious*
> 
> More performance, as in, more FPS.
> 
> If you're getting 2100 @ 1.0v, you're likely missing something in there somewhere at a higher clock / voltage setting.


Missing what?
Do you think it should clock better? Why?


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> Missing what?
> Do you think it has to clock better? Why?


Don't know.
Yes.
Logic.


----------



## mbm

You don't think there is a limit?
Do all cores clock the same?
What are your clocks?


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> you dont think there is a limit?
> All cores clock the same?
> Whats your clocks?


Are all cores the same? No. Is there a limit? Sure... there's always a limit. My clocks? I can get 2100 stable at 1.093v, and the highest clock I've hit personally is 2278 with coolant temps at 1c. At normal room temps 2240 is doable, but it doesn't run all that well because it's too warm, with temps peaking at 35c.


----------



## mbm

Well, I'm on air... and 1.000v / 2100MHz / 58c seems to be my best setting.


----------



## 6u4rdi4n

My card seems like a "dud". Won't do much over 2000 whatever I do to it. Water cooled, temps around 39-41°C.


----------



## Bal3Wolf

I might have found a bug in 3DMark with 1080 overclocking. I tested +100 as an offset and then in the curve, and even though I know +100 is stable, the 3DMark score dropped in curve mode. Going to do more testing.


----------



## azcrazy

I just got my GTX 1080 today, and let me tell you, I'm disappointed; my RX 480 ran better.


----------



## Bal3Wolf

Quote:


> Originally Posted by *azcrazy*
> 
> I just got my gtx 1080 today and let me tell you I'm disappointed, my RX 480 ran better


How so? Did you clean up the drivers etc. before switching cards? You could also have a bottleneck limiting the 1080.


----------



## azcrazy

Quote:


> Originally Posted by *Bal3Wolf*
> 
> How so ? did you clean up the drivers etc befort switching cards and you could have a bottleneck limiting the the 1080.


I might have a bottleneck with my system for sure. The games I play are a little dated, but in comparison with my 480, the 1080 runs 40 FPS slower and stutters.

The resolution I run is 1440p, because the cards (system) can't handle 4k.


----------



## Bal3Wolf

What games are you trying? Have you tested, say, with 3DMark and Heaven and compared the two cards to make sure nothing is wrong with your 1080? I have an XFX RX 480 RS in my other PC; it's definitely not as fast as my 1080, a good 30 FPS slower in a lot of things.


----------



## azcrazy

I play TF2 and CS:GO mainly. The 480 ran TF2 @ 290fps, CS:[email protected]; the 1080 ran [email protected]/80fps, CS:[email protected]/100, same settings for both cards.

I didn't expect that big of a drop in frames from the 480.


----------



## azcrazy

I also read that the 378.78 driver is not that good; that could be the cause of the low performance.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> I might of found a bug in 3dmark with overclocking 1080 i tested +100 in offset and then in curve and even tho i know +100 is stable 3dmark score droped in curve mode going to do more testing.


Yup. Everything depends on the curve. Two curves with the exact same clock and voltage, but slightly different points set elsewhere, will perform much differently.


----------



## azcrazy

OK, I reinstalled the drivers; it seems 378.78 is not good.

378.49 is working better, but now Afterburner can't read it.


----------



## th3illusiveman

Is the +100% voltage option safe in Afterburner? I have an EVGA Superclocked GTX 1080. Core temps are great, but I don't know about the VRMs.


----------



## mbm

What program do you use to check stability? I use both Heaven and Furmark.

Even if I set the curve, the MHz doesn't maintain its value; it still jumps up and down. Is it supposed to do this?


----------



## hertz9753

I use something called Folding.


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> what program do you use to tjeck stability?
> I use both Heaven and furmark.
> 
> Even if I set the curve the mhz doesnt maintain its value. It still jumps up and down. Is it still suppose to do this?


Every 10c or so, starting as low as 15c, the voltage / clock speeds will adjust with boost 3.0. So, yes...it's normal.


----------



## mbm

So what's the point in setting a different MHz for each voltage in the curve, if it still clocks below the setting?
You can't control the MHz any better than with the normal offset.


----------



## th3illusiveman

Is 2GHz @ 0.950v even possible? I adjusted my voltage curve to try and get some stable clocks (not a fan of the clocks jumping up and down), and it reads 2GHz @ 0.95v throughout Firestrike (22473 graphics score). I also played an hour of Wildlands at Very High quality and saw no artifacts. Then again, I just got this card yesterday and my last Nvidia card was a GTX 570, so I don't know how these things OC these days.
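For anyone wondering why an undervolt like that can hold: dynamic power scales roughly with V² x f, so a small clock sacrifice buys a big heat reduction. A rule-of-thumb sketch (my numbers, not measurements):

```python
# Rule of thumb: dynamic power ~ V^2 * f. Reference point (1.093 v, 2100 MHz)
# is an assumption for comparison, not a measured figure for any card.
def rel_power(volts, mhz, ref_volts=1.093, ref_mhz=2100):
    """Power relative to the reference voltage/clock point."""
    return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

print(round(rel_power(0.950, 2000), 2))  # ~0.72 -> roughly 28% less heat
```

Less heat also means fewer Boost 3.0 temp-bin drops, which is part of why these undervolted points hold so steady.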


----------



## BackHol3

I bought a second-hand Inno3D GTX 1080 Twin X2. The card is based on the reference PCB. The original cooler wasn't the worst I've seen, but it was quite loud when using a custom fan curve, so I bought an Arctic Accelero Xtreme IV (59.90€) and installed it. The results are very nice. With the original cooler + Thermal Grizzly Kryonaut paste I got 70c at 2050MHz/11000MHz; with the Accelero Xtreme IV + Kryonaut, about 54c at 2138.5MHz/11000MHz. With a custom fan curve the core clock stays in the 2126-2138.5MHz range and temps in the 47-49c range when playing BF1 MP. Now the question is: why can't I get the card to go over 2138.5MHz core, no matter what? Is it a BIOS limit, the single 8-pin power limit, or what?
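For reference, the napkin math on the power side (connector figures are spec ratings; the 180W TDP is the reference 1080's, and the 113% slider cap is an assumption borrowed from another Inno3D card in this thread -- check yours in Afterburner):

```python
# Back-of-envelope power headroom for a single-8-pin GTX 1080.
# Spec ratings and assumed values, not measurements for this card.
PCIE_SLOT_W = 75    # PCIe slot is rated to supply up to 75 W
EIGHT_PIN_W = 150   # one 8-pin PEG connector is rated for 150 W
TDP_W = 180         # reference GTX 1080 board power
LIMIT_PCT = 113     # assumed max power-limit slider on this BIOS

max_board_w = TDP_W * LIMIT_PCT / 100      # what the BIOS will allow (~203 W)
connector_w = PCIE_SLOT_W + EIGHT_PIN_W    # what the connectors are rated for
```

If that's right, the ~203W BIOS cap sits below the 225W the connectors could deliver, which would point at a BIOS power/clock limit rather than the 8-pin itself.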


----------



## mbm

Very nice temps.
I have the same cooler, but my temp peaks at 62c.


----------



## L4TINO

Does anyone else in Australia feel screwed over on the resale value of our cards? It infuriates me how they can drop the prices this low when we had to pay $1200 for a new card. Please tell me there are some Aussies here that feel the same way!

Come on, $400 off, when in America they just dropped the price by $100-$200.


----------



## Bal3Wolf

Quote:


> Originally Posted by *L4TINO*
> 
> does anyone else in australia feel screwed over on the re sale value of our cards! it infuriates me how they can drop the prices this low when we had to pay $1200 for a new card. please tell me there's some aussies here that feel the same way!
> 
> come on $400 off! when in america they just dropped the price by $100-$200


Well, you were paying $1200 vs. $600 in the USA, so it seems kinda logical to drop the price more.


----------



## feznz

Quote:


> Originally Posted by *L4TINO*
> 
> does anyone else in australia feel screwed over on the re sale value of our cards! it infuriates me how they can drop the prices this low when we had to pay $1200 for a new card. please tell me there's some aussies here that feel the same way!
> 
> come on $400 off! when in america they just dropped the price by $100-$200


Wait till you're in NZ







You will feel not just screwed but absolutely violated.

Almost got a 1080 Ti: the stock arrived Saturday at 10:30am; I checked at 11:00am, out of stock.


----------



## ondoy

*EVGA GeForce GTX 1080 SC GAMING ACX 3.0, 8GB GDDR5X, LED, DX12 OSD Support (PXOC) Graphics Card 08G-P4-6183-KR @ 499*


----------



## hotrod717

Quote:


> Originally Posted by *Vellinious*
> 
> Are all cores the same? No. Is there a limit? Sure....there's always a limit. My clocks? I can get 2100 stable at 1.093v, and the highest clock I've hit personally is 2278 with coolant temps at 1c. At normal room temps, 2240 is doable, but doesn't run all that well because it's too warm, temps peaking at 35c.


Could you post some bench results? GPU-Z showing 2200+ and running a bench where a 2200MHz clock scales are two different things. I've run just about every prominent GPU bench, and 2062-2088 has been the best-performing clock I've personally seen, depending on the bench. Not saying it isn't possible; I'd just love to see some comparative results.


----------



## Vellinious

Quote:


> Originally Posted by *hotrod717*
> 
> Could you post some bench results? Gpuz showing +2200 and running a bench where 2200mhz clock scales are two different things. I've ran just about every prominent gpu bench and 2062-2088 has been the best performing clock, depending on bench I've personally seen. Not saying it isnt possible, just would love to see some comparative results.


This is the only one I have a picture of. As one might imagine, getting a run done at 2278 with coolant temps near 0c takes some preparation... so not something I'm going to do today. lol

This is 2252... same score as 2240, so frames didn't drop any, but they didn't rise any either...


----------



## x7007

Hi,

I have Inno3D 1080GTX .

I wondered, just for testing, what raising the Power Limit without any overclock could do: whether upping it from 100% to the 113% max might have benefits for whatever reason. Instead, the worst came true. Raising the Power Limit just causes random stuttering without affecting FPS or frametime; there's an annoying, fast, repetitive graphics anomaly that you can barely feel but can see the instant it happens. My gf and I played Dragon Age: Inquisition and, without me changing anything, it ran smooth and fine with Vsync at 60 FPS the whole time. Then I did some testing and played Berserk and the Band of the Hawk with the Power Limit set to 113%, and I got the annoying random stuttering. I didn't check with MSI Afterburner because I was playing in 3D with ReShade, and it was so annoying I just kept playing till I went to sleep; at first I didn't think the Power Limit was the issue. I had also added two commands via BCDEDIT:
bcdedit /set disabledynamictick yes
bcdedit /set tscsyncpolicy Enhanced
I have HPET disabled.

So I thought it might be those commands, but it still happened.

Then when we went back to Dragon Age: Inquisition, we both saw exactly the same anomalies I had in Berserk, so I set the Power Limit back to the default 100% and the stuttering stopped....

Why and how can I use a higher power limit without that stuttering? Why does only this setting cause it? How can I overclock if just raising the power limit for an overclock causes stuttering....

It's not just me; just by searching a bit I've found more people with it:

http://www.tomshardware.com/answers/id-3082847/gtx-1080-power-throttling-temp-throttling.html
Quote:


> "So are you saying that it's the number of pins causing the issue?
> 
> Also, I turned off the OC completely and left the power limit at 120%. Didn't see the power % getting too high, but the game stuttered the same."


http://www.tomshardware.com/answers/id-3082847/gtx-1080-power-throttling-temp-throttling.html#r18085567
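One way to check whether the power limiter is actually engaging during those stutters is to log throttle reasons alongside power draw. A minimal sketch, assuming an `nvidia-smi` build that supports the `clocks_throttle_reasons.sw_power_cap` query field; the sample lines below are fabricated, not real readings:

```python
# Sketch: check whether the software power cap was active in any logged sample.
# Sample lines are made up; in practice you'd collect them with something like:
#   nvidia-smi --query-gpu=power.draw,clocks_throttle_reasons.sw_power_cap \
#              --format=csv,noheader -l 1
samples = """\
180.52 W, Not Active
214.77 W, Active
198.10 W, Not Active
"""

def power_cap_hits(text):
    """Return the power draw (W) of every sample where the cap engaged."""
    hits = []
    for line in text.strip().splitlines():
        watts, reason = [f.strip() for f in line.split(",")]
        if reason == "Active":
            hits.append(float(watts.split()[0]))
    return hits

print(power_cap_hits(samples))  # [214.77]: the draw when the cap engaged
```

If the cap never shows as active while the stutter happens, the power limit setting is probably not throttling the card, and the hitching is coming from somewhere else.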


----------



## KickAssCop

Quote:


> Originally Posted by *KedarWolf*
> 
> How much did you get for it?


Sold both of my cards for $500 a pop. They were ASUS STRIX AG editions w/ OG BIOS.


----------



## JunkaDK

2088MHz is also my fastest run with my 2x GTX 1080s.. +550MHz on the mem. Quite happy with this performance







I might be able to clock higher this weekend when I delid my i7-7700K, cuz my loop water temps should drop a bit..

http://www.3dmark.com/fs/11957272


----------



## Bal3Wolf

So I was wondering, can we do any BIOS modding on the 1080s yet, or cross-flash a BIOS from other cards to get more MHz out of ours?


----------



## mbm

I can't get my memory past +300MHz.









So my card runs at 1.000V: GPU 2088-2101MHz and memory 5300MHz.


----------



## Transmaniacon

I just finished my new ITX build in a Nano S case and am planning to upgrade to a 1080Ti in the coming days. I think a blower style GPU will be more beneficial in this case, so I am looking at those specifically.

How does the 1080 FE compare with say the Asus 1080 Turbo? Is it worth waiting for an aftermarket blower card for the 1080Ti, or should I stick with the FE that I can get sooner?


----------



## emsj86

Haven't been on this thread in a while. Is there a way now to BIOS hack (NVFlash) my GTX 1080 FTW? Like how the GTX 780s had the MSI Afterburner 1.3V mod and could be NVFlashed to unlock voltage and power delivery?


----------



## Vellinious

There's the ASUS Strix T4 bios, but, there is no Pascal bios editor.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> There's the ASUS Strix T4 bios, but, there is no Pascal bios editor.


Is it OK to run it on an EVGA 1080 FTW, and do you really get any extra overclocking headroom out of it?


----------



## L4TINO

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Well, you were paying 1200 vs 600 in the USA, so it seems kinda logical to drop the price more.


Yes, it was $1200 AUD; drop it to a minimum of $900-$1000 so us 1080 owners can sell our cards for at least $700-$850.
If we're lucky, we'll be able to sell our used 1080s for just half price, now that they've dropped brand-new cards this low...
Do you understand the frustration a little better?


----------



## L4TINO

Quote:


> Originally Posted by *feznz*
> 
> wait till you in NZ
> 
> 
> 
> 
> 
> 
> 
> you will feel not just screwed but absolutely violated
> 
> Almost got a 1080Ti the stock arrived Saturday 10.30am I checked 11.00am out of stock


Those were the premiums we were paying when they first started selling; they even went up to $1300 AUD in some places. But like I said, the US dropped the 1080 only $100 USD, so why are our prices dropping a staggering $400 AUD on brand-new cards? It should drop a minimum of $200 AUD so us 1080 owners can actually resell our cards for a decent price, not get lowballed at half price now that new cards are $800....
Don't know if you see where I'm coming from or not... maybe it's just me... lol


----------



## Bal3Wolf

Well, I understand you hate losing resale value, but sadly that's the nature of the beast with everything nowadays. They really want to price things so NEW looks much better than used, because that's where they make their money; they don't make anything extra off a used product being sold.


----------



## mbm

I have experienced a problem when using the curve.
I made a profile where the curve boosts to 2101MHz from 1.000V and upwards, and the profile starts automatically with Windows.
BUT the curve increases itself after boot and adds approx. 25MHz to the curve, resulting in failures when running 3D apps, of course.








Has anyone experienced this and have a fix?


----------



## Bal3Wolf

Can you post a screenshot of your curve, exactly how you did it? I'd say your issue is that you started at 1.0V; you need to start higher. I doubt 1.0V is going to run 2100MHz.


----------



## mbm

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Can you post a screenshot of your curve, exactly how you did it? I'd say your issue is that you started at 1.0V; you need to start higher. I doubt 1.0V is going to run 2100MHz.


Not sure how to post a screenshot here.
But what I did in Afterburner was raise the 1000mV point to the 2100MHz mark and hit apply. The curve then makes a straight line at 2100MHz from 1000mV to 1200mV.

The on-screen display in games etc. also shows 1.000V at 2100MHz, so it works fine, but somehow it messes up after a reboot and loading of the saved profile.
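What the curve editor is doing here can be sketched as a small transform over (voltage, frequency) points; a toy model, with point values made up rather than read from any real card:

```python
# Sketch of what Afterburner's curve editor does when you raise one point
# and it flattens everything to the right. All values are hypothetical.

def flatten_curve(points, at_mv, target_mhz):
    """Raise the point at `at_mv` to `target_mhz` and clamp every
    higher-voltage point to the same frequency, producing the
    straight line described above."""
    return [(mv, target_mhz if mv >= at_mv else mhz) for mv, mhz in points]

# A toy stock curve: (millivolts, MHz)
stock = [(900, 1936), (950, 1999), (1000, 2038), (1050, 2063), (1093, 2088)]
flat = flatten_curve(stock, 1000, 2100)
print(flat)  # points at 1000mV and above all sit at 2100MHz
```

The flat segment is why the card shows 1.000V at 2100MHz: the driver picks the lowest voltage on the curve that reaches the requested clock.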


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> Not sure how to post a screenshot here.
> But what I did in Afterburner was raise the 1000mV point to the 2100MHz mark and hit apply. The curve then makes a straight line at 2100MHz from 1000mV to 1200mV.
> 
> The on-screen display in games etc. also shows 1.000V at 2100MHz, so it works fine, but somehow it messes up after a reboot and loading of the saved profile.


Use Imgur to host the image, and then post it here.


----------



## mbm

So does your curve in a saved profile stay intact after a reboot?


----------



## feznz

Quote:


> Originally Posted by *L4TINO*
> 
> those where the premiums we where paying when they first started selling they even went up to $1300aud in some places. but like i said the US dropped the 1080 only $100us, so why is our prices dropping a staggering $400aud for brand new cards.. it should drop a minimum of $200aud so us 1080 owners can actually resell our cards for a decent price.. not get low balled half price now that new cards are $800....
> dont know if you see where im coming from or not... maybe its just me... lol


Yeah, I see it; we have to pay an import/reseller premium. It's just that prices here are still steady. I looked last night: a new 1080 is $1000 and second-hand is $800.
As for performance scaling, a 1080 Ti is 35% faster and costs 40% more, so I think I'll forget about 1080 SLI and go for a 1080 Ti.
Honestly, I wouldn't pay more than 60% of retail for a second-hand card, so around $600 to $650. With warranty issues, it's just not worth the risk of a low-binned second-hand card.


----------



## pez

Quote:


> Originally Posted by *max883*
> 
> 
> 
> 
> 
> EVGA GTX 1080 ACX SC. Max temp: 66°C, max fan speed: 40% (1460 RPM).
> 
> I did the thermal pad mod with Grizzly Kryonaut. Custom fan speed: 20% idle, max fan speed 40%.
> 
> Downvolted to 0.950V at 2000MHz on the GPU.


Very nice numbers! Love seeing SFF builds utilize higher end parts







What's the cut in the area around the GPU for? Airflow concerns? Also, got any more pics of the completed build?


----------



## JunkaDK

Quote:


> Originally Posted by *pez*
> 
> Very nice numbers! Love seeing SFF builds utilize higher end parts
> 
> 
> 
> 
> 
> 
> 
> .
> 
> What the cut to the area around the GPU? Airflow concerns? Also, more pics of the build completed and such?


Indeed an amazing build







I am thinking about making a small but VERY powerful LAN box with 1080ti.. just need funding









My current PC is now so heavy that it's very difficult to move. The Thermaltake Suppressor F51 is VERY heavy, and with a full water loop and EK blocks it's like 40kg.









I can stare at those chrome backplates on my 1080's all day





















More pics can be found here







https://pcpartpicker.com/b/x6sZxr


----------



## Motley01

Quote:


> Originally Posted by *JunkaDK*
> 
> I can stare at those chrome backplates on my 1080's all day
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> More pics can be found here
> 
> 
> 
> 
> 
> 
> 
> https://pcpartpicker.com/b/x6sZxr


Holy crap that looks cool. Where did you get the backplates from?


----------



## JunkaDK

They are EK backplates







These: https://www.ekwb.com/shop/ek-fc1080-gtx-ftw-backplate-nickel .. i LOVE the reflection


----------



## pez

Quote:


> Originally Posted by *JunkaDK*
> 
> Indeed an amazing build
> 
> 
> 
> 
> 
> 
> 
> I am thinking about making a small but VERY powerful LAN box with 1080ti.. just need funding
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My current PC is now so heavy that it's very difficult to move. The Thermaltake Suppressor F51 is VERY heavy, and with a full water loop and EK blocks it's like 40kg.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can stare at those chrome backplates on my 1080's all day
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> More pics can be found here
> 
> 
> 
> 
> 
> 
> 
> https://pcpartpicker.com/b/x6sZxr


Very nice. Lovely how they reflect the ROG lighting on the mobo.







And yeah, I love my M1 and cannot wait to be able to step up to a non-FE Ti. I can initiate a step up to an FE Ti, but that defeats the purpose of why I even bought a 1080 in the first place.







I could always step up to the Ti, then sell the TXP to fund a second Ti. But that would spur an entirely new build that I don't need and can't justify.

First world problems







.


----------



## DrFreeman35

I am looking to upgrade to the Ti, and have 2 1080 FTWs, both without the thermal pad update. It took me about a year to purchase and complete my whole PC, and I was not keeping up with the problems of these cards. This is my first build, and I am not watercooling my cards. One of my cards is eligible for the Step-Up program through EVGA, and I can upgrade to the Ti for $279.99. Would you guys upgrade? Or wait for the AIBs to come out and hopefully upgrade then? I am not sure when they will be out, but my eligible card has 71 days left in the program. I want to eventually upgrade to 2 of the Tis, but I will most likely not watercool them anytime soon. Just curious as to some opinions on what I should do. My current cards hit 2075 and seem to not get too hot while gaming, although I cannot see the VRM temps...... Any input would be greatly appreciated, because I have contacted EVGA and tried to ask about sending in my cards for the thermal issue, or upgrading to FTW2s. TIA


----------



## JunkaDK

Quote:


> Originally Posted by *DrFreeman35*
> 
> I am looking to update to the Ti, and have 2 1080 FTW's both without the update to the thermal problems. It took me about a year to purchase and complete my whole PC, and I was not keeping up with the problems of these cards. This is my first build, and I am not WC my cards. One of my cards is available for Step-up program through EVGA, and I can upgrade to the Ti for $279.99. Would you guys upgrade? Or wait for the AIB's to come out and hopefully upgrade then? I am not sure when they will be out, but my card that is available has 71 days left for the program. I want to eventually upgrade to 2 of the Ti's, but I will most likely not WC them anytime soon. Just curious as to some opinions on what i should do? My current cards hit 2075 and seem to not get too hot while gaming, although I cannot see the VRM temps...... Any input would be greatly appreciated, because I have contacted EVGA and tried to ask about sending in my cards for the thermal issue, or upgrading the FTW2's. TIA


Ask yourself.. why are you upgrading?







Just because the Ti is the latest/fastest? or do you have a need for the extra performance?







I would not change 2 GTX1080 for 2 gtx1080ti's.. wait for Volta / Next gen









But if you have the cash and just want the fastest cards, do it







I totally understand, even though its crazy


----------



## pez

Depends on resolution. The difference between the 1080 and TXP/1080 Ti is pretty stark. From his sig rig, it looks to be 3440x1440. I can't say you'd necessarily need it with SLI, though titles that don't use SLI are going to suffer a bit at that res with 'just' a 1080.


----------



## JunkaDK

Quote:


> Originally Posted by *pez*
> 
> Depends on resolution. The difference between the 1080 and TXP/1080Ti is pretty polarizing. From his sig rig, it looks to be 3440x1440. I can't say you'd necessarily need it with SLI, though your titles that don't use SLI are going to suffer a bit at that res with 'just' a 1080.


1 GTX1080ti is fine for that resolution @ 100hz


----------



## DrFreeman35

Quote:


> Originally Posted by *JunkaDK*
> 
> Ask yourself.. why are you upgrading?
> 
> 
> 
> 
> 
> 
> 
> Just because the Ti is the latest/fastest? or do you have a need for the extra performance?
> 
> 
> 
> 
> 
> 
> 
> I would not change 2 GTX1080 for 2 gtx1080ti's.. wait for Volta / Next gen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But if you have the cash and just want the fastest cards, do it
> 
> 
> 
> 
> 
> 
> 
> I totally understand, even though its crazy


Quote:


> Originally Posted by *pez*
> 
> Depends on resolution. The difference between the 1080 and TXP/1080Ti is pretty polarizing. From his sig rig, it looks to be 3440x1440. I can't say you'd necessarily need it with SLI, though your titles that don't use SLI are going to suffer a bit at that res with 'just' a 1080.


Quote:


> Originally Posted by *JunkaDK*
> 
> 1 GTX1080ti is fine for that resolution @ 100hz


Main reason I am upgrading is to stop worrying about these cards' VRMs running too hot. I am not comfortable enough to open them up and replace thermal pads. I have watched videos and see how easy it is, but I just don't feel comfortable. This is my first PC build, and I would rather just upgrade and be done with it. I have contacted EVGA multiple times about sending in my cards to get the ICX upgrade, or fix the VRM issue. No reply, and it's quite annoying. I understand it is partly my fault for not keeping up with GPU news, but I did not receive an email or notification about the VRM problems on my 1080's ACX cooler. I am also upgrading because I am looking to get the new 4K 27" Asus and will be playing some games on that as well. I have no problem with the cards I have, but do not want to worry about the thermal issues. I can most likely sell my oldest card for around $400 and get a new FTW Ti, and do the Step-Up program for the other. I will upgrade to Volta when it comes out, I have no doubt in that. I guess I just feel it's necessary to get rid of the worry in the back of my mind about these cards. Thanks for the input though.


----------



## pez

Quote:


> Originally Posted by *JunkaDK*
> 
> 1 GTX1080ti is fine for that resolution @ 100hz


Sure, but if you're already running SLI, there's no reason to downgrade assuming the titles he may play are utilizing it.


----------



## DrFreeman35

Quote:


> Originally Posted by *pez*
> 
> Sure, but if you're already running SLI, there's no reason to downgrade assuming the titles he may play are utilizing it.


Agreed. If I can manage to get 2 Tis, there is no need for just 1. I will for sure be upgrading, I just don't want to keep worrying about my cards. Guess I kind of answered my own question; if the FTW Tis will be out in a month or so, I will probably try to upgrade. Do you think I will need to sell my other FTW 1080 for cheaper than normal used ones, since it does not have the thermal pad upgrade?


----------



## pez

I believe the second owner can request the thermal pads just based on the serial number and get them from EVGA that way. I don't think that should be a reason you get a lesser price.


----------



## DrFreeman35

Quote:


> Originally Posted by *pez*
> 
> I believe the second owner can request the thermal pads just based on the serial number and get them from EVGA that way. I don't think that should be a reason you get a lesser price.


Oh ok, well hopefully I can manage to sell it for a decent price. Appreciate the info


----------



## pez

Quote:


> Originally Posted by *DrFreeman35*
> 
> Oh ok, well hopefully I can manage to sell it for a decent price. Appreciate the info


No worries, bud







.


----------



## deejaykristoff

My Ti is already swimming... 40-42°C at full load, 36-38°C in some games. I installed an EVGA Hybrid kit modded with an H105 rad and short tubes. Push only; push/pull gives better results but isn't necessary. Excellent results at 1440p on the ROG Swift. When playing on a 1080p LCD TV, temps never reach 33°C even in demanding games. Very happy with the results.


----------



## mbm

What are your startup boost clocks, and how low do they downclock?
Mine starts at 2114MHz, but during gaming it goes down to 2063MHz,
regardless of setting a straight-line curve.
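To pin down when (and at what temperature) the downclock happens, it helps to log clocks and temps over time. A minimal sketch that parses samples in the shape `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader` produces; the sample lines here are made up:

```python
# Sketch: detect clock drops from logged nvidia-smi samples.
# The sample lines below are fabricated; in practice collect them with:
#   nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader -l 1
samples = """\
2114 MHz, 43
2114 MHz, 55
2088 MHz, 58
2063 MHz, 60
"""

def parse(lines):
    """Turn the CSV lines into (clock_mhz, temp_c) tuples."""
    out = []
    for line in lines.strip().splitlines():
        clk, temp = line.split(",")
        out.append((int(clk.split()[0]), int(temp)))
    return out

def first_drop(log):
    """Return the (clock, temp) sample where the clock first fell."""
    for prev, cur in zip(log, log[1:]):
        if cur[0] < prev[0]:
            return cur
    return None

print(first_drop(parse(samples)))  # (2088, 58): the temp where boost first stepped down
```

Correlating the first drop with a temperature like this is how people find the temp thresholds discussed further down the thread.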


----------



## ucode

That's strange. With a straight line my clock goes up. What driver version?


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> What are your startup boost clocks, and how low do they downclock?
> Mine starts at 2114MHz, but during gaming it goes down to 2063MHz,
> regardless of setting a straight-line curve.


Probably because of thermals.. as the card gets hotter it will lower the clocks. That's what my cards do







At least if you are pushing it at the limit. If you lower the curve, then you can make it go up







But if it starts at the top.. then it will only go one way







DOWN


----------



## mbm

Quote:


> Originally Posted by *ucode*
> 
> That's strange. With a straight line my clock goes up. What driver version?


The newest driver..
So let's say your line is set at 2100MHz from the beginning at 1.000V. You will then never drop below 2100MHz? You will actually increase and go beyond 2100MHz?


----------



## mbm

Quote:


> Originally Posted by *JunkaDK*
> 
> Probably because of thermals.. as the card get hotter it will lower the clocks. Thats what my cards do
> 
> 
> 
> 
> 
> 
> 
> At least if you are pushing it at the limit. If you lower the curve than you can make it go up
> 
> 
> 
> 
> 
> 
> 
> But if it starts at the top.. then it will only go one way
> 
> 
> 
> 
> 
> 
> 
> DOWN


Can't be temps. It stays below 60°C, but it drops regardless of the curve.. I can't set it and lock it.. it always decreases.


----------



## JunkaDK

I am below 40°C and also see decreases in MHz.. so 60 is not low. But to be honest, we should wait for some pros to answer







I am still a noob with my 1080's









Did you max out voltage and power ?


----------



## GRABibus

Quote:


> Originally Posted by *JunkaDK*
> 
> I am below 40c and also see decreases in MHZ.. So 60.. is not low. But to be honest, we should wait for some pro's to answer
> 
> 
> 
> 
> 
> 
> 
> I am still a noob with my 1080's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you max out voltage and power ?


In order to avoid frequency decreases, one solution is to increase fan speed or apply an aggressive fan curve to the GTX 1080.

Check this :






I did it for my OC of 2202MHz at 1.1V (with the Asus Strix OC T4 BIOS).
My frequency doesn't decrease anymore and stays stable at 2202MHz.
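The "aggressive fan curve" idea is just a steeper temperature-to-duty mapping. A hypothetical sketch, with breakpoints that are illustrative rather than taken from any real profile:

```python
# Hypothetical aggressive fan curve: map GPU temp (°C) to fan duty (%).
# Breakpoints are illustrative, not taken from any real Afterburner profile.
CURVE = [(30, 40), (45, 60), (55, 80), (65, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Linear interpolation between breakpoints; clamp at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_duty(50))  # 70.0: halfway between the 45°C/60% and 55°C/80% points
```

An "aggressive" curve just shifts those breakpoints left, trading noise for lower steady-state temps so boost holds its bin.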


----------



## mbm

Quote:


> Originally Posted by *GRABibus*
> 
> In order to avoid frequency decreases, one solution is to increase fan speed or to apply an agressive fan curve to the GTX 1080.
> 
> Check this :
> 
> 
> 
> 
> 
> 
> I did it for my OC 2202MHz at 1.1V (With Asus Strix OC t4 Bios).
> My frequency doesn't decrease anymore and stays stable at 2202MHz


Will check it out.. but it seems crazy if it's about fan speed and not temp/voltage or other stuff.
I have made a custom fan curve so I have a silent setup. I can get lower temps, but it doesn't affect the OC.

Maxing out the voltage doesn't give me higher MHz, and it still decreases after a while,
so I run my card at 1.000V;
1.093V just gives me higher temps.


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> will check it out.. but seems crazy if its about fanspeed an not temp/voltage or other stuff.
> I have made a custom fancurve so I have a silent setup. I can get lower temps but it doesnt affect OC.
> 
> maxing out voltage dont give me higher mhz and its still decreses after a while.
> so I run my card at 1.000V
> 1.093v just give me higher temps.


Increasing fan speed IS about temps


----------



## ucode

Quote:


> Originally Posted by *mbm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ucode*
> 
> That's strange. With a straight line my clock goes up. What driver version?
> 
> 
> 
> the newest driver..
> So let say your line is set at 2100 mhz from the begining at 1.000V. You wil then never drop below 2100 mhz. You will actually increase and go beyond 2100 mhz?

Haven't tried the newest driver but here's a pic from an old post showing clock going up from 41C to 52C with straight line.


----------



## Vellinious

60c is awfully warm. There are "layers", so to speak, about every 10c or so, starting as low as 15c (that I have personally seen), that seem to tell the core it needs to reset either voltage or clock. If you watch very carefully, you can find which temp levels really give you problems. My personal nemesis is 24c to 25c. When I can keep the peak core temp under 24c, 2252 runs brilliantly. If the core temp goes above 25c, the frame rates will drop a little. At 35c it's not even worth running that clock, because it runs worse than 2189.

Setting your overclock isn't just about voltage and frequency. It's also about knowing which temps will work best with which clocks.
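The "layers" described above can be thought of as discrete temperature bins, each costing one boost step. A rough toy model, with bin size, floor, and step values that are approximations rather than published specs:

```python
# Toy model of the temperature "layers": every ~10°C bin above a floor costs
# roughly one 13MHz boost step. All numbers here are illustrative guesses,
# not anything published by NVIDIA.
BIN_C = 10      # approximate width of a temperature layer
STEP_MHZ = 13   # approximate clock penalty per layer
FLOOR_C = 15    # lowest layer boundary mentioned above

def effective_clock(base_mhz, temp_c):
    """Clock after temperature binning, under this toy model."""
    bins = max(0, (temp_c - FLOOR_C) // BIN_C)
    return base_mhz - bins * STEP_MHZ

print(effective_clock(2100, 24))  # 2100: still inside the first layer
print(effective_clock(2100, 35))  # 2074: two layers down
```

It's only a sketch, but it captures why crossing a boundary like 24°C to 25°C matters more than a smooth one-degree rise would suggest.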


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> 60c is awfully warm. There are "layers", so to speak, about every 10c or so, starting as low as 15c (that I have personally seen), that seem to tell the core it needs to reset either voltage or clock. If you watch very carefully, you can find which temp levels really give you problems. My personal nemesis is 24c to 25c. When I can keep the peak core temp under 24c, 2252 runs brilliantly. If the core temp goes above 25c, the frame rates will drop a little. At 35c it's not even worth running that clock, because it runs worse than 2189.
> 
> Setting your overclock isn't just about voltage and frequency. It's also about knowing which temps will work best with which clocks.


In your opinion, and from your knowledge, should I not be able to do 2100MHz @ 40°C?


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> In your opinion, and from your knowledge, should i not be able to do 2100mhz @ 40c?


That sounds pretty reasonable to me....just figuring out which voltage level you need to run it there and keep it there, is the trick.


----------



## JunkaDK

Quote:


> Originally Posted by *Vellinious*
> 
> That sounds pretty reasonable to me....just figuring out which voltage level you need to run it there and keep it there, is the trick.


So it's possible that setting a curve where 2100MHz is at 1.093V might not work.. but it could work at a lower voltage?


----------



## mbm

Quote:


> Originally Posted by *JunkaDK*
> 
> Increasing fanspeed IS about temps


Okay, I haven't looked at the link yet, but from what I understood he mentioned fan speed, not temp, as the issue.

Fan speed and temp don't always correlate.


----------



## JunkaDK

It does correlate..

Higher fan speed will lower your temps.

And since you can't adjust the temp directly, the only way to get lower temps is to increase fan speed and lower the voltage.


----------



## mbm

Quote:


> Originally Posted by *Vellinious*
> 
> 60c is awfully warm. There are "layers", so to speak, about every 10c or so, starting as low as 15c (that I have personally seen), that seem to tell the core it needs to reset either voltage or clock. If you watch very carefully, you can find which temp levels really give you problems. My personal nemesis is 24c to 25c. When I can keep the peak core temp under 24c, 2252 runs brilliantly. If the core temp goes above 25c, the frame rates will drop a little. At 35c it's not even worth running that clock, because it runs worse than 2189.
> 
> Setting your overclock isn't just about voltage and frequency. It's also about knowing which temps will work best with which clocks.


60°C is NOT hot.. many cards run at 80°C+.

I don't think any air cooling will bring you much cooler temps.

I'm not talking about OCing my card to 2200MHz. I just want a stable boost clock that doesn't decrease, and I can't seem to find the key to this.


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> 60°C is NOT hot.. many cards run at 80°C+.
> 
> I don't think any air cooling will bring you much cooler temps.
> 
> I'm not talking about OCing my card to 2200MHz. I just want a stable boost clock that doesn't decrease, and I can't seem to find the key to this.


It's not hot if you want to run the normal boost, but if you're trying to go beyond 2100MHz and hold a stable clock, then it's hot









Don't add extra voltage or extra clock speed







then your boost clock should be stable







If not.. then you might wanna look at your case airflow


----------



## Roy360

Quote:


> Originally Posted by *zGunBLADEz*
> 
> I'm wondering if you can measure the space, as I have an EK Supremacy VGA (49.5mm all around) for it, but I don't know if I have to shave some metal to make it fit


How did it go?

Were you able to fit the EK Supremacy VGA on the 1080?

I just got my card last night, and my wallet hurts too much to buy a full cover waterblock.


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> 60°C is NOT hot.. many cards run at 80°C+.
> 
> I don't think any air cooling will bring you much cooler temps.
> 
> I'm not talking about OCing my card to 2200MHz. I just want a stable boost clock that doesn't decrease, and I can't seem to find the key to this.


I didn't say hot, I said warm. And if you read what I said, you'd find that I told you what you needed to look for.


----------



## Vellinious

Quote:


> Originally Posted by *JunkaDK*
> 
> So its possible that setting a curve where 2100 is at 1.093v might not work.. but it could work at a lower volt?


YES! Absolutely. Depending on the temps you're talking about, of course, but yes...


----------



## GreedyMuffin

Hi!

I am running my 1080 at 2116MHz or so at 1.000V, but I'm considering throwing the stock FE cooler back on.

Does anybody have the 1080 with the FE cooler? Can you do some noise and temp testing at 0.900V, 0.950V and 1.000V?

I can do 2000MHz at 0.900V, so no real need to have it in my loop.


----------



## mbm

Quote:


> Originally Posted by *JunkaDK*
> 
> It's not hot if you wanna run will normal boost, but if you're trying to go beyond 2100mhz and reach a stable clock then it's hot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't add extra voltage or extra clock speed
> 
> 
> 
> 
> 
> 
> 
> then your boost clock should be stable
> 
> 
> 
> 
> 
> 
> 
> If not.. then you might wanna look at your case airflow


I can get my temps down to 50°C, but that didn't help either.


----------



## mbm

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Hi!
> 
> I am running my 1080 at 2116 or so at 1.000V, but I consider to trow the stock FE cooler back on.
> 
> Do anybody have the 1080 with the FE cooler? Can you do some noise and temp testing at 0.900V, 0.950 and 1.000V?
> 
> I can do 2000 at 0.900V, so no need in having it in my loop really.


When you say you run 2116, is this a static frequency? It stays at 2116MHz?
It doesn't decrease within 20-30 min of gaming?


----------



## oblivious

Is there any real, concrete information on the "refreshed" versions of the 1080? I am about to pull the trigger on a 1080, but I would hate myself if I got a 1080 next week and then in 2 months the refreshed cards came out and were noticeably better.


----------



## Vellinious

Quote:


> Originally Posted by *oblivious*
> 
> Is there any real concrete information on the "refreshed" versions on the 1080? I am about to pull the trigger on a 1080 but I would hate myself if i got a 1080 next week and then in 2 months the refreshed cards came out and were noticeably better.


I haven't heard anything more since the rumor first surfaced. /shrug


----------



## GreedyMuffin

Quote:


> Originally Posted by *mbm*
> 
> When you say you run 2116, is this a static frequency? It stays at 2116MHz?
> It doesn't decrease within 20-30 min of gaming?


Yep, static. Same speed always.


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> I Can get my temps. Down to 50C but that didnt help either.


Post a screenshot of the curve you're using, please?


----------



## mbm

https://drive.google.com/file/d/0ByT6JwHWuJ_hNXRKT3kzQ2xhX2s/view?usp=sharing


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> https://drive.google.com/file/d/0ByT6JwHWuJ_hNXRKT3kzQ2xhX2s/view?usp=sharing


The Snipping Tool in Windows is your friend......

I'm not sure what's going on with your GPU. I would try various different voltages for that 2100 clock speed and see which one held that clock the best.


----------



## oblivious

Quote:


> Originally Posted by *Vellinious*
> 
> I haven't heard anything more since the rumor first surfaced. /shrug


Me neither, and anything that comes up on Google is rumors and speculation. I really don't see how they could make big increases without it looking like a card slotted between the 1080 and 1080 Ti, which would not make sense to me, since I'm sure NVIDIA is hoping to ship and sell as many 1080 Tis as they possibly can.


----------



## mbm

Quote:


> Originally Posted by *Vellinious*
> 
> Post a screenshot of the curve you're using, please?


Quote:


> Originally Posted by *Vellinious*
> 
> Snipping tool in windows is your friend......
> 
> I'm not sure what's going on with your GPU. I would try various different voltages for that 2100 clock speed and see which one held that clock the best.


Well non actually. Thats my problem.








Using the curve or offset doesnt make a difference either


----------



## Vellinious

Quote:


> Originally Posted by *oblivious*
> 
> Me neither and anything that comes up with Google is rumors and speculation. I really don't see how they could make big increases without it looking like a card suited between the 1080 and 1080ti which would not make sense to me to create such a card since i'm sure Nvidia is hoping to ship and sell as many 1080ti's as they possibly could.


If they even release it, it'll be a 1080v2, or 1080+ or something odd like that. The only thing I've seen is that the memory would perform better and at higher clocks or something. Honestly, I haven't paid much attention. Rumors fly around all the time. Until I see something official from NVIDIA, I don't put much stock in them.


----------



## Vellinious

Quote:


> Originally Posted by *mbm*
> 
> Well non actually. Thats my problem.
> 
> 
> 
> 
> 
> 
> 
> 
> Using the curve or offset doesnt make a difference either


You either have the oddest GPU in the world of Pascal, have a broken GPU, or haven't set your curve correctly. Without being there, standing next to you while you're doing it, I can't tell which one it is. I'm out of suggestions and ideas. I've never seen anyone else have this problem.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *oblivious*
> 
> Me neither and anything that comes up with Google is rumors and speculation. I really don't see how they could make big increases without it looking like a card suited between the 1080 and 1080ti which would not make sense to me to create such a card since i'm sure Nvidia is hoping to ship and sell as many 1080ti's as they possibly could.
> 
> 
> 
> If they even release it, it'll be a 1080v2, or 1080+ or something odd like that. The only thing I've seen is that the memory would perform better and at higher clocks or something. Honestly, I haven't paid much attention. Rumors fly around all the time. Until I see something official from NVIDIA, I don't put much stock in them.
Click to expand...

Yes, it's just going to be faster memory: 11Gbps GDDR5X vs the current 10Gbps GDDR5X. It's up to each partner to do it; NVIDIA has said they aren't going to do it on any of the FE cards.

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mbm*
> 
> Well, none actually. That's my problem.
> 
> 
> 
> 
> 
> 
> 
> 
> Using the curve or offset doesnt make a difference either
> 
> 
> 
> You either have the oddest GPU in the world of Pascal, have a broken GPU, or haven't set your curve correctly. WIthout being there, standing next to you while you're doing it, I can't tell which one it is. I'm out of suggestions and ideas. I've never seen anyone else have this problem.
Click to expand...

lol, my GPU is pretty weird too. For the most part I have to use more curve than needed at lower voltages in 3DMark if I want to keep the same score as the offset, but in other benchmarks the curve is fine.


----------



## mbm

Tried lowering my curve to 2075, 2063, 2050... no matter what, it still drops.
So in my case the offset set to +100 (2114-2088MHz) gives me the same or better MHz than the straight curve.


----------



## JunkaDK

Quote:


> Originally Posted by *mbm*
> 
> tried lowering my curve to 2075, 2063, 2050... no matter what it still drops.
> So in my case the offset set to +100 (2114-2088MHz) gives me the same or better MHz than the straight curve.


What are your idle gpu temps vs load temps after 15mins?


----------



## mbm

Idle temps 43C... load temps 60C.


----------



## Vellinious

The tighter the curve, the better. In other words, the closer you can keep it to looking like the offset curve that it would set if you just set an offset overclock, the better. But there are performance improvements there at higher clocks than what's available with the stock curve.

It took me a month of messing around with it to figure out what I was doing, and why something would perform really well one time, but the next time you went back to it, it wouldn't.

Just takes a lot of experimenting, as each GPU is slightly different.
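To picture what a "tight" curve means: an offset overclock just shifts every point of the stock voltage/frequency curve up by the same amount, and the hand-tuned curve tries to stay close to that shape. A rough sketch (the curve points are made up for illustration, not values from a real card):

```python
# A voltage/frequency curve as (voltage_mV, clock_MHz) points.
# An offset overclock shifts every clock point up by the same amount;
# a "tight" custom curve approximates this shape at higher clocks.
# The points below are illustrative only, not from a real BIOS.
stock_curve = [(800, 1700), (900, 1850), (1000, 1975), (1063, 2050)]

def apply_offset(curve, offset_mhz):
    """Return the curve with every clock raised by offset_mhz."""
    return [(mv, mhz + offset_mhz) for mv, mhz in curve]

for mv, mhz in apply_offset(stock_curve, 170):
    print(f"{mv} mV -> {mhz} MHz")
```

That matches the workflow described later in the thread: start from the curve an offset generates, then nudge individual points.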


----------



## mbm

Okay, I tried the fan issue talked about earlier... there is something going on there.

If I run my fan at 100% it will keep my MHz, but as soon as I run a custom fan curve it decreases my MHz.

This is REGARDLESS of temperature. So there is no link between temp and fan speed.

I will test some more. But maybe it's an issue in the BIOS telling the card not to hold boost if the fan is not at 100%. I have an EVGA card with the latest BIOS fixing some fan speed issues related to all the hype about too-hot VRMs.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> The tighter the curve, the better. In other words, the closer you can keep it to looking like the offset curve that it would set if you just set an offset overclock, the better. But there are performance improvements there at higher clocks than what's available with the stock curve.
> 
> It took me a month of messing around with it to figure out what I was doing, and why something would perform really well one time, but the next time you went back to it, it wouldn't.
> 
> Just takes a lot of experimenting, as each GPU is slightly different.


Could you post some of your best curves?


----------



## GRABibus

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Could you post some of your best curves


Here's mine :

http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/9590#post_25863196

GIGABYTE GTX1080 Xtreme Gaming WATERFORCE flashed with ASUS Strix OC t4 BIOS.


----------



## Vellinious

Quote:


> Originally Posted by *GRABibus*
> 
> Here's mine :
> 
> http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/9590#post_25863196


Looks pretty close to what I end up running, with small variations here and there. The place I always start: if I'm looking for the equivalent of a +200 offset, I set the offset to +170, bring up the curve it creates for that, then adjust from there.


----------



## GRABibus

Quote:


> Originally Posted by *Vellinious*
> 
> Looks pretty close to what I end up running, with small variations here and there. The place I always start, is if I'm looking for the equivalent of a +200 offset, I set the offset to +170, bring up the curve it creates for that, then adjust from there.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Roy360*
> 
> How did it go?
> 
> Where you able to fit the EK Supremacy VGA on the 1080?
> 
> I just got my card last night, and my wallet hurts too much to buy a full cover waterblock.


I had to cut


----------



## Roy360

Quote:


> Originally Posted by *zGunBLADEz*
> 
> I had to cut


Would it be too much trouble to ask for any pictures?

I'm guessing cutting voids the warranty, but can the card be reverted back to air cooling later?


----------



## Bal3Wolf

Well, I spent more time playing with the curve; this GPU tops out at 2115 no matter what setup I try in 3DMark. It might run higher in games though.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Roy360*
> 
> Would it be too much trouble to ask for any pictures?
> 
> I'm guessing cutting voids the warranty, but can the card be reverted back to air cooling later?


It does void the warranty, but you can still put the original heatsink back on, no problems; it doesn't affect anything since the heatsink sits on top of the die with the 4 spring screws.
It cools pretty well.


----------



## GreedyMuffin

Was fun while it lasted.

Upgrading to a Ti.


----------



## Krzych04650

I recently played a bit with voltages on my 1080 SLI. The tweaking results are very satisfying; with some profiles it may even be possible to run this kind of setup while having the PC in the same room where you play.









Stability, temps, performance and power draw (from the wall) tested in Witcher 3 at 5160x2160 resolution through DSR, 15 minutes of gameplay in a GPU-heavy location. Any unstable voltage/clock was crashing inside of 30 seconds, so it's a very effective stability test.

Fan speed locked at 60% (1500 RPM, MSI Gaming cards), which is somewhat reasonable, although I wouldn't use speeds that high if I had the PC next to me.

*Standard voltage OC:*

2076/5500 MHz
1.050-1.063 V
56 FPS
620 W
84 C on top card and 69 C on bottom card (crash after 8 minutes) (ambient temp 17 C)
2025 MHz stabilized clock (after those 8 minutes)

*Undervolted OC:*

2050/5500 MHz
0.975 V
55 FPS
560 W
76 C on top card and 65 C on bottom card (after 15 minutes) (ambient temp 17 C)
2038 MHz stabilized clock (after 15 minutes)

*Light OC:*

2000/5500 MHz
0.900 V
53 FPS
483 W
66 C on top card and 58 C on bottom card (after 15 minutes) (ambient temp 15.6 C)
1987 MHz stabilized clock (after 15 minutes)

*Eco mode:*

1800/5500 MHz
0.800 V
45 FPS
395 W
53 C on top card and 48 C on bottom card (after 15 minutes) (ambient temp 15.6 C)
1785 MHz stabilized clock (after 15 minutes)

This 2000/5500 0.900 V mode in particular is very interesting: basically a 20 C temp reduction and 140 W less power draw at the expense of just 5.5% performance.
1800/5500 at 0.800 V is also interesting. This is typical performance for a melting FE that cannot manage even a single card, and here you are basically getting that x2 with no noise and a funny power draw; that's basically the power draw of a PC with a single overclocked 1080 on standard voltage and some power-hungry CPU.
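For anyone weighing these profiles, frames-per-watt makes the trade-off easy to see. A quick sketch using the FPS and wall-draw numbers quoted above (wall draw covers the whole system, so treat this as a rough efficiency comparison only):

```python
# Efficiency comparison of the undervolting profiles listed above.
# FPS and wattage are the posted measurements; wall draw includes
# the whole system, so FPS-per-watt is only a relative comparison.
profiles = {
    "Standard voltage OC": (56, 620),
    "Undervolted OC":      (55, 560),
    "Light OC":            (53, 483),
    "Eco mode":            (45, 395),
}

for name, (fps, watts) in profiles.items():
    eff = fps / watts * 100  # frames per 100 W from the wall
    print(f"{name:19s} {fps} FPS @ {watts} W = {eff:.1f} FPS per 100 W")
```

By this measure the 0.900 V profile is the sweet spot: it delivers about 22% more frames per watt than the standard-voltage OC while giving up only a few FPS.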


----------



## x-apoc

I'm debating if I should get hybrid cooler for my top card and if so, what fan should I upgrade to on the radiator.


----------



## 6u4rdi4n

Gentle Typhoon, EK Vardar, Noctua NF-F12, NB eLoop. Many good choices.


----------



## mbm

I've tried an Arctic Hybrid III on my GTX 1080, but changed it to the Arctic Extreme IV: much better performance.


----------



## DrFreeman35

Curious to see if anyone here is still waiting on the ACX to ICX upgrade program? If so, when did you register? I thought about doing a step-up to a Ti, but considering I'm not watercooling... it doesn't make much sense.


----------



## x-apoc

I ended up ordering the EVGA Hybrid from B&H; it included a game key, for 99 bucks total.
https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1273203&gclid=CMqa7ebo3dICFQIcaQod5MsMWg&is=REG&ap=y&m=Y&c3api=1876%2C92051677682%2C&Q=&A=details

Ran a Fire Strike test last night, no OC, fans set to 60%; GPU score was 40k, not too shabby. Will see how it performs once I get the hybrid cooler next week.

Also ran Witcher 3 at 5K, ultra without HairWorks: low-50s fps.

I need a recommendation for fans, I want to do push/pull on that 120mm rad.


----------



## DarthBaggins

Finally able to join this club, picked up my 1080SC earlier this week and got it installed. Love this card as so far it's done amazingly in Time Spy & FS: Extreme. I can't wait to slap my EK Acetal/Nickel block w/ backplate on.


----------



## mbm

How does it clock?


----------



## Vellinious

Testing the new FTW before I put the block on. So far, it runs almost exactly the same as all the others I've tested. 2150ish on air before it starts dropping frame rates. I'll get the new motherboard installed, block put on this, and get all 3 GPUs in for some benching fun this weekend. = )

Not bad considering I haven't even touched the voltage / frequency curve yet.

http://www.3dmark.com/3dm/18665064


----------



## HyperC

Just got my 1080 Seahawk EK today. LOL, I'm a little lost here because I don't seem to have a power limit or temp slider. I have the latest drivers and Afterburner; here is my overclock so far.


----------



## Vellinious

Quote:


> Originally Posted by *HyperC*
> 
> Just got my 1080 seahawk ek today , LOL I am a little lost here because I don't seem to have a power limit or temp slider I have latest drivers and afterburner here is my overclock so far


Odd....don't think I've ever seen that.

Check the settings, see if there's something in the user interface...maybe change the UI skin.


----------



## jdq1412

Hey, I want to SLI my 1080 FTW Hybrid, but I want to use one of the rigid SLI bridges. Does anyone know which cards will be compatible with the rigid bridge? Since the FTW has a larger PCB than a normal 1080, I know the SLI bridge I have in mind won't work with all cards because of the dimension difference.


----------



## DarthBaggins

I would go with the EVGA HB bridge


----------



## jdq1412

That's the one I'm planning to use. My question was which cards I can use with it. I need a card with the same PCB size as the 1080 FTW.


----------



## Vellinious

Quote:


> Originally Posted by *jdq1412*
> 
> Thats the one im planning to use. My question was which cards can i use with it. I need a card with the same pcb size as the 1080 ftw.


Doubtful you'll find one. The FTW is a one-of-a-kind board....


----------



## Beagle Box

Quote:


> Originally Posted by *HyperC*
> 
> Just got my 1080 seahawk ek today , LOL I am a little lost here because I don't seem to have a power limit or temp slider I have latest drivers and afterburner here is my overclock so far


IIRC, the Seahawk's design takes the water-cooling into account. It's power limited by its single power plug and uses a card-specific BIOS that has different temp/power trim points (or none?), so that might explain the missing AB controls.

I wonder how its performance will compare to a Gaming X with an aftermarket liquid cooling solution.


----------



## Vellinious

No


----------



## Beagle Box

After a closer look at your pic, Vellinious is indeed correct.

Go to Settings -> User Interface -> User Interface skinning properties and choose a V3 skin or any of the MSI skins. You should not be seeing a Shader Clock slide bar.

Please report what you find your power limit to be. I'm curious.


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## DarthBaggins

1080's put down some good numbers in Folding@Home.


----------



## desmopilot

Snagged a deal on a Zotac AMP! 1080 I couldn't refuse. Loving this thing so far, though it did run a little warm at stock. After researching, I've set the power limit to 80% and a custom fan curve. Now it sits at a rock solid 70C @ 2GHz Core with +500 on the Memory!


----------



## pantsoftime

Has anyone found a BIOS from a board with 11GHz GDDR5X yet? Curious to know how a 10GHz card would perform with the updated memory timings.


----------



## nrpeyton

Quote:


> Originally Posted by *pantsoftime*
> 
> Has anyone found a BIOS from a board with 11GHz GDDR5X yet? Curious to know how a 10GHz card would perform with the updated memory timings.


Only the 1080 Tis have the next tier of Micron memory at 11Gbps (as far as I know).


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*


You making the upgrade soon, or sticking with your 1080's?


----------



## pantsoftime

Quote:


> Originally Posted by *nrpeyton*
> 
> Only the 1080 TI's have the next tier of Micron memory at 11ghz. (as far as I know).


When they announced the Ti and the 1080 price cut, they also announced that new 1080s will start coming with 11Gbps memory. I guess they haven't hit the shelves yet though.


----------



## nrpeyton

Quote:


> Originally Posted by *pantsoftime*
> 
> When they announced the Ti and the 1080 price cut they also announced that new 1080's will start coming with 11Gbps memory. I guess they haven't but the shelves yet though.


Oh I see, I wasn't aware of that, very nice. ;-)

Makes sense too, I suppose; a bulk order on one line = cheaper.


----------



## ucode

http://digiworthy.com/2017/03/01/nvidia-gtx-1080-ti-unveil/
Quote:


> GeForce GTX 1080 Price Cut to $499, GTX 1060 to Ship with 9GB/s Memory
> 
> The GeForce GTX 1080 was launched last year at a price of $699 for the Founders Edition and $599 for custom variants. In the latest update, the card has got an official $100 price cut, bringing the MSRP to an *impressive* $499.
> 
> GeForce GTX 1080 Price Cut
> 
> Spec-wise, the Nvidia GTX 1080 is powered by the full-fat GP104 (GP104-400-A1) GPU. The card runs at a base clock of 1607MHz and boost clock of 1733MHz, but the company's latest 16nm FinFET process helps it overclock up to 2.1GHz clock speeds on air cooling.
> 
> The chip sports 8GB of GDDR5X memory spread across a 256-bit bus interface. As part of the update, however, the card will also receive faster G5X memory. The new GTX 1080 models will be operating at 11Gbps, delivering a bandwidth of 352GB/s over the 320GB/s which shipped as reference.
> 
> The GeForce GTX 1070 was originally priced at $499 for the Founders Edition and $379 for custom models. Nvidia has now cut the MSRP by $30 which means starting tomorrow, the card will be available for under $350 although the card didn't receive any memory upgrade.


Having paid 700+ for my 1080 FE, I would use a different word than "impressive"
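The bandwidth figures in the quoted article are easy to sanity-check: bandwidth is just the bus width in bytes times the per-pin data rate. A quick sketch:

```python
# Memory bandwidth check for the figures quoted above:
# a 256-bit bus is 32 bytes wide, so GB/s = 32 * per-pin Gbps.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 10))  # reference GTX 1080: 320.0 GB/s
print(bandwidth_gbs(256, 11))  # 11Gbps refresh: 352.0 GB/s
```

Both numbers match the article, so the 11Gbps refresh is a straight 10% bandwidth bump on the same bus.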


----------



## Bal3Wolf

Quote:


> Originally Posted by *ucode*
> 
> http://digiworthy.com/2017/03/01/nvidia-gtx-1080-ti-unveil/
> Quote:
> 
> 
> 
> GeForce GTX 1080 Price Cut to $499, GTX 1060 to Ship with 9GB/s Memory
> 
> The GeForce GTX 1080 was launched last year at a price of $699 for the Founders Edition and $599 for custom variants. In the latest update, the card has got an official $100 price cut, bringing the MSRP to an *impressive* $499.
> 
> GeForce GTX 1080 Price Cut
> 
> Spec-wise, the Nvidia GTX 1080 is powered by the full-fat GP104 (GP104-400-A1) GPU. The card runs at a base clock of 1607MHz and boost clock of 1733MHz, but the company's latest 16nm FinFET process helps it overclock up to 2.1GHz clock speeds on air cooling.
> 
> The chip sports 8GB of GDDR5X memory spread across a 256-bit bus interface. As part of the update, however, the card will also receive faster G5X memory. The new GTX 1080 models will be operating at 11Gbps, delivering a bandwidth of 352GB/s over the 320GB/s which shipped as reference.
> 
> The GeForce GTX 1070 was originally priced at $499 for the Founders Edition and $379 for custom models. Nvidia has now cut the MSRP by $30 which means starting tomorrow, the card will be available for under $350 although the card didn't receive any memory upgrade.
> 
> 
> 
> Having paid 700+ for my 1080 FE, I would use a different word than "impressive"
Click to expand...

Well, you should know that with computers, every year something better comes out that makes your stuff cheap. Kinda like cars losing value as soon as you buy them.


----------



## DarthBaggins

Still feel happy for only paying $400 for my SuperClocked thanks to a friend offloading his for a deal I couldn't pass up.


----------



## ucode

I don't have a car, only a bicycle. Didn't know cars lose near half their value in 10 months. Guess lesson learned though and will have to be content with old stuff from now on.


----------



## Bal3Wolf

It's worse with cars; you lose a lot of value as soon as you pull it off the lot.


----------



## ZealotKi11er

What is the stock memory clock for a reference GTX 1080? EVGA Precision reports 4500MHz. GPU-Z reports 1250MHz stock, but under load it's 1125MHz.


----------



## ucode

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is stock memory clock for Reference GTX1080? eVGA Pression Report 4500MHz. In GPU-Z it report 1250MHz stock but under load its 1125MHz.


Depends what P-State is running.

For P-State P0 for 3D apps usually ~1250MHz which is ~2500MHz or ~10000MT/s.

For P-State P2 for compute / CUDA usually ~1125MHz which is ~2250MHz or ~9000MT/s.

FWIW I have both memory clock offsets for P0 and P2 set such that both run at 1395MHz, which is 2790MHz or 11160MT/s. Software such as GPU-Z and, IIRC, MSI AB doesn't handle this correctly and reports incorrect clocks.

Memory clock should be 1395MHz
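The clock arithmetic ucode is using can be condensed: tools report the GDDR5X command clock, and the effective transfer rate is 8x that (1250 MHz -> 10000 MT/s, i.e. 10Gbps per pin). A small sketch of the conversion:

```python
# GDDR5X clock conversion, matching the figures in the post above:
# reported command clock (MHz) * 2 gives the data clock, and * 4 more
# gives the transfer rate, so effective MT/s = reported clock * 8.
def effective_mts(reported_mhz):
    return reported_mhz * 8

for mhz in (1250, 1125, 1395):      # P0, P2, and ucode's overclock
    print(f"{mhz} MHz -> {effective_mts(mhz)} MT/s")
```

Handy when a tool shows 1125 MHz under load and you want to confirm that's the 9000MT/s compute P-state rather than a throttled card.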


----------



## Bal3Wolf

Quote:


> Originally Posted by *ucode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> What is stock memory clock for Reference GTX1080? eVGA Pression Report 4500MHz. In GPU-Z it report 1250MHz stock but under load its 1125MHz.
> 
> 
> 
> Depends what P-State is running.
> 
> For P-State P0 for 3D apps usually ~1250MHz which is ~2500MHz or ~10000MT/s.
> 
> For P-State P2 for compute / CUDA usually ~1125MHz which is ~2250MHz or ~9000MT/s.
> 
> FWIW I have both memory clock offsets for P0 and P2 set such that both run at 1395MHz which is 2790MHz or 11160MT/s. Softwares such as GPU-z and IIRC MSI AB don't handle this correctly and report incorrect clocks.
> 
> Memory clock should be 1395MHz
Click to expand...

How'd you get it to hold clocks for P0 and P2? I tried and it kept resetting back to the default.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ucode*
> 
> Depends what P-State is running.
> 
> For P-State P0 for 3D apps usually ~1250MHz which is ~2500MHz or ~10000MT/s.
> 
> For P-State P2 for compute / CUDA usually ~1125MHz which is ~2250MHz or ~9000MT/s.
> 
> FWIW I have both memory clock offsets for P0 and P2 set such that both run at 1395MHz which is 2790MHz or 11160MT/s. Softwares such as GPU-z and IIRC MSI AB don't handle this correctly and report incorrect clocks.
> 
> Memory clock should be 1395MHz


I am running mining, which is probably CUDA. It shows 1125 in GPU-Z sensors; I added +1000 to get 1375.


----------



## ucode

@Bal3Wolf I wrote my own OC software to do this. Nothing user friendly, as it's compiled with hard-coded values, which is fine for my own needs. NvidiaInspector has OC options for different P-States, but I don't remember if it worked with the Pascal memory P-State; might be worth a try.

The problem with using a high memory OC with Afterburner while in P2 is that if the GPU switches to P0 it may crash the card.

A little bit of trivia: the first Pascal driver, 368.25 IIRC, does not have a P2 P-State. P2 was introduced later, with its reduced default memory clock. I don't know the reason for the reduction; the only reason I can think of is a problem with running the memory chips hard for long periods causing overheating/failure, otherwise why run them below their P0 default clock? Maybe someone out there knows and could shed some light?


----------



## Bal3Wolf

Quote:


> Originally Posted by *ucode*
> 
> @Bal3Wolf I write my own OC software to do this. Nothing user friendly as compiled with hard values which is fine for my own needs. NvidiaInspector has OC options for different P-States but I don't remember if it worked with Pascal memory P-State, might be worth a try.
> 
> The problem of using a high memory OC with Afterburner while in P2 is if the GPU switches to P0 it may crash the card.
> 
> A little bit of trivia, the first Pascal driver 368.25 IIRC does not have a P2 P-State. P2 was introduced later with it's reduced default memory clock. I don't know the reason for the reduction, only reason I could think of is if there were a problem running the memory chips hard for long times causing overheating / failure otherwise why run them below their P0 default clock. Maybe someone out there knows and could shed some light?


Yea, I tried NVIDIA Inspector; it wouldn't hold clocks for me in the P2 state while folding.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> You making the upgrade soon, or sticking with your 1080's?


I bought another FTW to do some 3 way runs with. When I'm done with those, I plan on selling 2 of them and getting a TI Classy....assuming someone makes a block for them.

Was messing about this weekend a little bit after I got the new motherboard in. Just running normal room temp ambients (20c), but got some decent scores out of the new GPU. Had a couple 8.6k graphics score runs in Timespy. Almost broke 9k last night.....just need to get the ambients dropped a little bit and should be able to do it without having to push the CPU too hard.

8999.....almost made it. rofl

http://www.3dmark.com/spy/1411021


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> I bought another FTW to do some 3 way runs with. When I'm done with those, I plan on selling 2 of them and getting a TI Classy....assuming someone makes a block for them.
> 
> Was messing about this weekend a little bit after I got the new motherboard in. Just running normal room temp ambients (20c), but got some decent scores out of the new GPU. Had a couple 8.6k graphics score runs in Timespy. Almost broke 9k last night.....just need to get the ambients dropped a little bit and should be able to do it without having to push the CPU too hard.
> 
> 8999.....almost made it. rofl
> 
> http://www.3dmark.com/spy/1411021


I had a sneaking suspicion you might have gone for a Classy next round ;-)

But aye, you're right about the block situation; that's going to affect my choice too. However, if the past is anything to go by, and EVGA keeps the same 14+3 phase VRM design they've stuck with for years, the old ones you already own may fit, as will mine ;-)

That's certainly what I'd hope for, anyway ;-)

All 14 of the phases are top quality, and we know how well it dissipates heat too ;-)

A Classy might have got you that 1 extra point for the 9000, lol.


----------



## Vellinious

Quote:


> Originally Posted by *nrpeyton*
> 
> I had a sneaking suspicion you might of went for a Classy next round ;-)
> 
> But aye you're right about the block situation; that's going to affect my choice too. However if the past is anything to go by; and EVGA keeps the same 14+3 phase VRM design they've stuck with for years; the old ones you already own may fit, as will mine ;-)
> 
> Thats certainly what I'd hope for anyway ;-)
> 
> All 14 of the phases are top quality and we know how well it dissipates heat too ;-)
> 
> Classy might of got u that 1 extra point for the 9000 lol.


Yeah...if they go the ICX route for the Classy cooler on the TI, that'll change things, though......

I looked up the top runs for a single 1080...put the top 8 in a comparison. Looks like the top 5 are LN2 runs, I'm 6th. I need to get the ambients dropped and push over 9k, just for grins. lol

http://www.3dmark.com/compare/spy/171598/spy/159142/spy/159090/spy/494390/spy/122388/spy/1411021/spy/106099/spy/1285084


----------



## shilka

So, as it turns out, the GTX 1070 I bought was defective, and since I was promised a full refund if the RMA can't fix the fault, I really hope they can't fix it.
And now that the GTX 1080 has dropped in price, I'm going to get a GTX 1080 instead of another GTX 1070.

I've narrowed my choices down to either the Asus Strix or the new EVGA FTW2.
The FTW2 costs a bit more than the Strix, so what I want to ask is: is the FTW2 worth the extra cost, or should I just grab a Strix?

Thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *shilka*
> 
> So as it turns out the GTX 1070 i bought was defective and if the RMA cant fix the fault i was promised a full refund so i really hope they cant fix it.
> And now that the GTX 1080 has droppen in price i am going to get a GTX 1080 instead of another GTX 1070
> 
> I have narrowed down my choices to either the Asus Strix or the new EVGA FTW2
> The FTW2 cost a bit more then the Strix so what i want to ask is the FTW2 worth the extra cost or should i just grab a Strix?
> 
> Thanks.


Get the Strix. There's no point in spending the extra $. Also check the warranty in your country. What I like about ASUS/Gigabyte/MSI is serial-based RMA.


----------



## Vellinious

I'd go with the EVGA, just because of the customer service. I live in the US, though......if you're from outside the US, any of them will do.


----------



## hertz9753

I would also go with the EVGA card for the customer service. They also have a guest RMA based on serial number that goes by the date of manufacture if you don't have proof of purchase.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> I'd go with the EVGA, just because of the customer service. I live in the US, though......if you're from outside the US, any of them will do.


Quote:


> Originally Posted by *hertz9753*
> 
> I would also go with the EVGA card for the customer service. They also have a guest RMA based on serial number that goes by the date of manufacture if you don't have proof of purchase.


I do not think shilka lives in the US, hence me suggesting ASUS.


----------



## pez

EVGA has support in EU too, though, don't they? I only say that as I noticed that EU qualifies for things like step-up.


----------



## hertz9753

EVGA also has an Asian website.

http://asia.evga.com/


----------



## DarthBaggins

Yet they don't have a section just for you lol


----------



## hertz9753

I know.


----------



## nrpeyton

Not to mention the new thermal sensors (11 right across the PCB)... that's unprecedented for an NVIDIA GPU.

I'd definitely go with EVGA..


----------



## x7007

It's funny, the stuttering I had was caused by my 3-disk RAID 0 array. One of the hard disks started to have issues, and the stuttering was down to that one disk; the other 2 are working fine. So I disconnected it, the RAID data was lost, and I couldn't recover it, so I had some games backed up but the rest I'll have to redownload.

So, lesson learned: stuttering comes from a lot of things even when everything seems right; the GPU is rarely the actual issue.


----------



## nrpeyton

Very impressed with the quality of the VRM phases, on my EVGA Classified 1080.

I connected my multimeter to the card and monitored core voltage _(accurate to 1mV)_.

Even while running furmark (drawing 250w at 1.093v) _which is more than any game or benchmark would draw_, core voltage wasn't fluctuating more than 5mv _(for the most part)_ with only a few spikes as much as 10mv _(between lowest & highest)._

I've seen medium-high range motherboards fluctuate as much as 100mv (or more) under load.

Would be interesting to see how the VRMs compare with Nvidia's Founders Edition. (Something I hope to put to the test when I receive my new Ti soon.)


----------



## ZealotKi11er

What is easy way to undervolt these cards? I am using MSI AB.


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is easy way to undervolt these cards? I am using MSI AB.


Use the voltage / frequency curve


----------



## Bal3Wolf

Is it normal to get +1000MHz on memory? I was running it for folding, but I got the new Mass Effect game and forgot to change it, and it runs stable in the game, lol. I haven't tested it in other things yet, but I did drop it to +630, which I know is stable, and frames dropped about 6-8 vs +1000, so I'm definitely getting a boost from it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> is it normal to get +1000mhz on memory i was doing it for folding but i got new mass effect game and forgot to change it and it runs stable in the game lol i havet tested it in other things yet tho but i did drop it to +630 that i know is stable and frames droped about 6-8 vs +1000 so im def getting a boost from it.


I've not seen any GTX 1080 that does 12Gbps stable. Even those that go over 11Gbps see a negative performance effect. Test it from +500 to +1000 on a synthetic benchmark and stop where you see negative performance.
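The sweep ZealotKi11er describes (step the offset up, benchmark each step, stop once scores regress) amounts to a tiny search for the peak. A sketch with made-up scores; in practice each entry would come from an actual synthetic benchmark run:

```python
# Pick the memory offset where benchmark scores peak; past that point,
# memory error handling and instability cost more than the extra clock.
# The offset -> score table below is hypothetical, not measured data.
def best_offset(results):
    # highest score wins; ties go to the lower (safer) offset
    return min(results, key=lambda off: (-results[off], off))

results = {0: 8500, 250: 8560, 500: 8610, 750: 8640, 1000: 8590}
print(best_offset(results))  # 750: +1000 scores lower despite the higher clock
```

The key point is that "runs without artifacts" isn't the stopping condition; the score regression is, which is why a synthetic benchmark beats eyeballing a game.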


----------



## FedericoUY

My 1080 ICX does +1000; it shows some little artifacts at that speed, but +950 runs perfect. I haven't checked for the negative performance yet, but I will... I use it at +500 for 24/7.


----------



## GRABibus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not seen any GTX1080 that does 12Gbps stable. Even those that go over 11Gbps have negative performance effect.


+1

Mine also posts lower scores in Time Spy above 11Gbps.


----------



## ZealotKi11er

Can't wait to see the new GTX1080 with 11Gbps stock RAM. Should do 12Gbps easily.


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> is it normal to get +1000mhz on memory i was doing it for folding but i got new mass effect game and forgot to change it and it runs stable in the game lol i havet tested it in other things yet tho but i did drop it to +630 that i know is stable and frames droped about 6-8 vs +1000 so im def getting a boost from it.
> 
> 
> 
> Not seen any GTX1080 that does 12Gbps stable. Even those that go over 11Gbps have negative performance effect. Test it with 500 to 1000 on a synthetic benchmark and stop where you see negative performance.
Click to expand...

11.2Gbps was stable in 3DMark; the score scaled up until then, but I didn't go any further. It's working well in Mass Effect, seeing about an 8fps boost from +630 to +1000, no artifacts.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> 11.2gGbps was stable in 3dmark score scaled up till then but i didnt go any further its working good in mass effect seeing about 8fp boost from 630 to 1000 no aritfacts.


How old is the card?


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> 11.2gGbps was stable in 3dmark score scaled up till then but i didnt go any further its working good in mass effect seeing about 8fp boost from 630 to 1000 no aritfacts.
> 
> 
> 
> How old is the card?

2-3 weeks. Just tried 3DMark; 12 Gbps is definitely not stable in it, lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> 2-3 weeks


Maybe it's using the new 11 Gbps memory. If you open the card up, you could check the memory model and compare it to older 1080s.


----------



## Vellinious

FINALLY!

http://www.3dmark.com/3dm/18760445


----------



## nrpeyton

Quote:


> Originally Posted by *Vellinious*
> 
> FINALLY!
> 
> http://www.3dmark.com/3dm/18760445


Congrats ;-)


----------



## nrpeyton

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Is it normal to get +1000 MHz on memory? I was doing it for folding, but I got the new Mass Effect game and forgot to change it, and it runs stable in the game, lol. I haven't tested it in other things yet, though. I did drop it to +630, which I know is stable, and frames dropped about 6-8 vs +1000, so I'm definitely getting a boost from it.


I can do +925 on memory, but only when I add a tiny bit of extra voltage (to the memory) and keep the memory chips very cool.

I can go higher (+1000), but I get errors (artifacts).

OCCT is best for checking stability. It simply compares each frame to the last; any discrepancy is an error. _Simple & elegant_.
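The frame-compare idea is easy to picture: render the same deterministic workload repeatedly and flag any output that differs, since on stable hardware identical inputs must give identical results. A toy CPU-side illustration of the principle (no GPU involved; the "frames" here are just seeded pseudo-random buffers, and the function names are mine):

```python
import random

def render_frame(seed, n=1024):
    """Stand-in for a deterministic GPU render: same seed -> same buffer."""
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(n)]

def count_errors(a, b):
    """OCCT-style check: any mismatch between two identical renders is an error."""
    return sum(1 for x, y in zip(a, b) if x != y)

frame1 = render_frame(42)
frame2 = render_frame(42)            # stable hardware: identical output
print(count_errors(frame1, frame2))  # -> 0

corrupted = frame2[:]
corrupted[100] ^= 0x01                   # simulate one flipped bit (an artifact)
print(count_errors(frame1, corrupted))   # -> 1
```

This is also why memory errors can show up as a score drop rather than a crash: GDDR5X error handling silently retries, costing performance instead of producing visible artifacts.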


----------



## Derek1

This may not be the right place to post this, but I have a question about Nvidia CP limiting clock speed during gaming.
My FTW runs stable at 2151 MHz in all benchmarks. However, when gaming I recently noticed the clock speed never goes above the card's rated 1721 MHz. Also, the fps seems to be stuck at 63. This is at 1440p and 4K.
When I use K-Boost, it does run at 2151. Shouldn't the card do that without K-Boost on? Control Panel is set to Max Performance both in Global and in the game preferences. Why does it run at max in benches but not in games? Why can I get over 100 fps in benches, but the games never go over 63?
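One way to see what the card is actually doing in-game (rather than what the OC tool claims) is to poll `nvidia-smi` while playing. A hedged sketch, assuming a driver where these query fields are available; `parse_smi_line` and `poll_gpu` are illustrative names of my own:

```python
import subprocess

QUERY = "clocks.gr,clocks.mem,utilization.gpu"

def parse_smi_line(line):
    """Parse one CSV line from `nvidia-smi --query-gpu=... --format=csv,noheader,nounits`."""
    core, mem, util = (int(v.strip()) for v in line.split(","))
    return {"core_mhz": core, "mem_mhz": mem, "gpu_util_pct": util}

def poll_gpu():
    """Take one nvidia-smi sample (requires an NVIDIA driver to be installed)."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_smi_line(out.splitlines()[0])

# Parsing a captured line (values are illustrative, not a real measurement):
print(parse_smi_line("1721, 5005, 62"))
```

If the core sits at 1721 MHz with V-Sync off and GPU utilization is well under ~99%, then something other than the GPU (a frame limiter, a power state, or the CPU) is likely holding it back.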


----------



## Beagle Box

Quote:


> Originally Posted by *Derek1*
> 
> This may not be the right place to post this but I have a question about Nvidia CP and limiting clock speed during gaming.
> My FTW runs stable at 2151 in all benchmarks. However when I am gaming I recently saw that the clockspeed never goes above the rated 1721 for the card. Also, the fps seems to be stuck at 63. This is on 1440 and 4k.
> When I use K Boost it does then run at 2151. Shouldnt the card do that without K Boost on? Control Panel is set to Max Performance both in Global and in the game preferences. Why does it run at max in benches but not in the games? Why can I get over 100 fps in benches but the games never go over 63?


Does "Max Performance" include the Vertical Sync setting being 'Off' in your game profiles? What you describe is usually caused by the Vertical Sync setting being set to 'Off' in Global Settings, but some form of 'On' in Program Settings. Some games, like Fallout 4, need some form of fps control to remain stable. YMMV.

If that's not it, check the Vertical Sync settings within your games.


----------



## x-apoc

Got my EVGA hydro cooler, and so far I like what I see: my top card in SLI is about 18°C cooler at 60% fan speed. I was getting 70°C when overclocked during testing before the upgrade. Still waiting for my dual pack of ML120 fans for push/pull on the rad, and yes, I did connect the rad fan to a motherboard header, for the freedom to control it at will.


----------



## Derek1

Quote:


> Originally Posted by *Beagle Box*
> 
> Does "Max Performance" include the Vertical Sync setting being 'Off' in your game profiles? What you describe is usually caused by the Vertical Sync setting being set to 'Off' in Global Settings, but some form of 'On' in Program Settings. Some games, like Fallout 4, need some form of fps control to remain stable. YMMV.
> 
> If that's not it, check the Vertical Sync settings within your games.


Yes, I have V-Sync set to OFF in CP. The other options seemed to limit fps, and that is something I didn't want.
My monitor doesn't have V-Sync controls, and neither do the games I'm playing, from what I've seen in their options menus, so I'm not sure about that. (Some are from 2012.)

I find it really frustrating that I buy a card that will do 2100 MHz no problem, and yet CP is limiting it to 1721 MHz and capping my fps at 63 (though I guess 63 fps at 4K is good for a 1721 MHz clock, isn't it?). But that only happens in games, which is annoying as hell. (I get notices from FS and TS that V-Sync is on, but I asked them and they said that was a bug on their end, but who the hell knows.)

I went to Nvidia chat to ask them, but that was a pointless endeavour, as all I got was nonsense about OCing being bad and that if I'm getting 1721 MHz the card is performing as it should. Idiots.


----------



## Beagle Box

Quote:


> Originally Posted by *Derek1*
> 
> Yes I have V-sync set to OFF in CP. The other options seemed to limit fps and that is something I didn't want.
> My monitor doesn't have V sync controls and neither do the games I am playing from what I have seen in the Options menus. So not sure about that. (some are from 2012)
> 
> I find it really frustrating that I buy a card that will do 2100 mghz no problem and yet CP is limiting it to 1721 and capping my fps @ 63 (though I guess 63 fps at 4k is good for 1721 mghz clock, isn't it?). But that only happens in games which is annoying as hell. (I get notices from FS and TS that V sync is on but I asked them and they said that was a bug on their end, but who the hell knows)
> 
> I went to Nvidia chat to ask them but that was a pointless endeavour as all I got was nonsense abut OCing being bad and that if I am getting 1721 then it is performing as it should. Idiots.


I had an issue with 3DMark benchmarks telling me V-Sync was set on and that it might affect my score. It was caused by registry entries related to my old AMD cards, though I'm not sure it actually affected my frame rates. The warning went away when I used a registry cleaner to remove all things AMD, removed my Nvidia drivers with Revo set to medium, and then did a fresh install of new Nvidia drivers.

Don't know where Nvidia's software currently stores its settings, but you may just have a setting somewhere with V-Sync erroneously set to "1".
Maybe try setting all your vertical sync settings to FAST (it will not limit the number of frames processed) and see if it makes any difference.


----------



## Vellinious

Quote:


> Originally Posted by *Beagle Box*
> 
> I had an issue with 3Dmark benchmarks telling me v-sync was set on and that it might affect my score. It was caused by registry entries related to my old AMD cards. But I'm not sure it actually affected my frame rates. The warning went away when I used a registry cleaner and removed all things AMD, removed my nvidia drivers with REVO set to medium and then did a fresh install of new nvidia drivers.
> 
> Don't know where NVIDIA's software currently stores its settings, but you may just have a setting somewhere that has v-sync erroneously set to "1" .
> Maybe try setting all your vertical sync settings to FAST - it will not limit # of frames processed - and see if it makes any difference.


It's a bug....does it on mine too. Doesn't affect anything unless you truly have VSync on.


----------



## Beagle Box

Quote:


> Originally Posted by *Vellinious*
> 
> It's a bug....does it on mine too. Doesn't affect anything unless you truly have VSync on.


Yeah. I didn't think my scores were being affected. I assumed the message had something to do with all the old AMD stuff because it did stop showing up immediately after I cleaned things up and reinstalled my 1080.


----------



## FedericoUY

Quote:


> Originally Posted by *nrpeyton*
> 
> I can do +925 on memory, but only when I add a tiny little extra voltage (to memory) and keep the memory chips very very cool.
> 
> I can go higher (+1000) but I get errors. (artifacts)
> 
> OCCT is best for checking stability. It simply compares one frame to the last, any discrepancies = errors. _Simple & elegant_.


Where are you setting the VRAM voltage, and to what value?


----------



## Vellinious

Quote:


> Originally Posted by *FedericoUY*
> 
> Where are you setting the VRAM voltage, and to what value?


He's got a Classy....


----------



## nrpeyton

Quote:


> Originally Posted by *FedericoUY*
> 
> Where are you setting the VRAM voltage, and to what value?


FBVDD - using the Classy voltage tool.

Stock is 1.37 V.

1.44 V for +925 _(max is 1.5 V)_


----------



## Bal3Wolf

I did find something out: on the newer drivers, my core clock is not as stable in 3DMark as it was on the last version. I find 3DMark to be very picky; I used Heaven before and had scores scale up and stay stable, while 3DMark can be hit or miss for me. It might make 15 passes, then freeze on the 16th.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> I did find something out: on the newer drivers, my core clock is not as stable in 3DMark as it was on the last version. I find 3DMark to be very picky; I used Heaven before and had scores scale up and stay stable, while 3DMark can be hit or miss for me. It might make 15 passes, then freeze on the 16th.


Yup, I noticed the same thing on the latest drivers. I roll back for Firestrike runs. Use the newest for Timespy, though.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> I did find something out: on the newer drivers, my core clock is not as stable in 3DMark as it was on the last version. I find 3DMark to be very picky; I used Heaven before and had scores scale up and stay stable, while 3DMark can be hit or miss for me. It might make 15 passes, then freeze on the 16th.
> 
> 
> 
> Yup, I noticed the same thing on the latest drivers. I roll back for Firestrike runs. Use the newest for Timespy, though.

I came from 2 x 7970s in CrossFire. I will say I like Nvidia's drivers better overall: they behave better when they crash, and they crash less. One thing about AMD's drivers, though: they never really helped or hurt my overclock, though I was on older-gen cards.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> I came from 2 x 7970s in CrossFire. I will say I like Nvidia's drivers better overall: they behave better when they crash, and they crash less. One thing about AMD's drivers, though: they never really helped or hurt my overclock, though I was on older-gen cards.


Optimizations can cause heavier loading on the core....early drivers usually allow for really high clocks, but as the performance drivers continue, they tend to allow for less core clock. Saw it with Kepler on up. Kepler is when I REALLY got into overclocking.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> I came from 2 x 7970s in CrossFire. I will say I like Nvidia's drivers better overall: they behave better when they crash, and they crash less. One thing about AMD's drivers, though: they never really helped or hurt my overclock, though I was on older-gen cards.
> 
> 
> 
> Optimizations can cause heavier loading on the core....early drivers usually allow for really high clocks, but as the performance drivers continue, they tend to allow for less core clock. Saw it with Kepler on up. Kepler is when I REALLY got into overclocking.

Ah, so my core could run higher clocks in games without freezing or triggering any ECC, but have issues in 3DMark.


----------



## Vellinious

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Ah, so my core could run higher clocks in games without freezing or triggering any ECC, but have issues in 3DMark.


Possible....drivers are all a bit different. For scoring runs in benchmarks, I roll back to whatever driver gives me the best performance, regardless of clocks.


----------



## pez

Quote:


> Originally Posted by *Derek1*
> 
> Yes I have V-sync set to OFF in CP. The other options seemed to limit fps and that is something I didn't want.
> My monitor doesn't have V sync controls and neither do the games I am playing from what I have seen in the Options menus. So not sure about that. (some are from 2012)
> 
> I find it really frustrating that I buy a card that will do 2100 mghz no problem and yet CP is limiting it to 1721 and capping my fps @ 63 (though I guess 63 fps at 4k is good for 1721 mghz clock, isn't it?). But that only happens in games which is annoying as hell. (I get notices from FS and TS that V sync is on but I asked them and they said that was a bug on their end, but who the hell knows)
> 
> I went to Nvidia chat to ask them but that was a pointless endeavour as all I got was nonsense abut OCing being bad and that if I am getting 1721 then it is performing as it should. Idiots.


What's your GPU usage sitting at in Fallout 4? That was my biggest issue for the longest time. I couldn't get my 1080 or TXP to utilize the GPU and I was getting stupid drops. Fine tuning GPU settings is what worked for me, but using DSR might help as well.

Strangely, I never had the issue while running 2 x 1080s at 4K.


----------



## mbm

Quote:


> Originally Posted by *nrpeyton*
> 
> FBVDD - using the Classy voltage tool.
> 
> Stock is 1.37v.
> 
> 1.44v for + 925 _(max is 1.5v)_


Where can RAM voltage be set?
I only know about vcore.


----------



## FedericoUY

I also had some performance issues with the latest drivers, including lag and fps drops, mostly in BF1. Also, the card's power usage is not reading correctly. I reverted to the previous WHQL driver (378.78).


----------



## DarthBaggins

I have yet to update to the newest driver set, mainly due to the chance of having issues with Folding@Home again.


----------



## Bal3Wolf

Quote:


> Originally Posted by *DarthBaggins*
> 
> I have yet to update to the newest driver set, mainly due to the chance of having issues with Folding@Home again.


I have been folding on the new drivers problem-free. Not sure if PPD has changed; I don't keep a close eye on mine.


----------



## nrpeyton

Quote:


> Originally Posted by *mbm*
> 
> Where can RAM voltage be set?
> I only know about vcore.


Only certain cards have the option.

EVGA Classifieds & Galax HOFs, using their respective voltage tools; and I believe Gigabyte allows a minor adjustment to memory voltage through their overclocking app (but it only works on their flagship card). I'm not sure how much of a boost it allows.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Ah, so my core could run higher clocks in games without freezing or triggering any ECC, but have issues in 3DMark.


I too am coming from AMD, and from testing the card for everything but gaming, I must say I do not like how you OC. I like a fixed type of OC. I also do not like the fact that they have 2 apps instead of a combined one.
There are also 13 processes under Nvidia that I have no clue what they do.


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> Ah, so my core could run higher clocks in games without freezing or triggering any ECC, but have issues in 3DMark.
> 
> 
> 
> I too am coming from AMD, and from testing the card for everything but gaming, I must say I do not like how you OC. I like a fixed type of OC. I also do not like the fact that they have 2 apps instead of a combined one.
> There are also 13 processes under Nvidia that I have no clue what they do.

I agree, I liked AMD's overclocking better, though the 480 I have clocks sort of like Nvidia: set your clock, then play with the power limit and raise or lower the voltage to get it to stay at that clock. Not as much of a hassle, though. One thing I do like better: for me, the Nvidia drivers are far more stable. If it crashes, you come back to your desktop over half the time; when my AMD crashed, it froze Windows.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> I agree, I liked AMD's overclocking better, though the 480 I have clocks sort of like Nvidia: set your clock, then play with the power limit and raise or lower the voltage to get it to stay at that clock. Not as much of a hassle, though. One thing I do like better: for me, the Nvidia drivers are far more stable. If it crashes, you come back to your desktop over half the time; when my AMD crashed, it froze Windows.


As far as I can remember, an unstable OC on my 290X just crashed the game or gave a "display driver has been reset" message. I do not remember the last time it crashed Windows.


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> I agree, I liked AMD's overclocking better, though the 480 I have clocks sort of like Nvidia: set your clock, then play with the power limit and raise or lower the voltage to get it to stay at that clock. Not as much of a hassle, though. One thing I do like better: for me, the Nvidia drivers are far more stable. If it crashes, you come back to your desktop over half the time; when my AMD crashed, it froze Windows.
> 
> 
> 
> As far as I can remember, an unstable OC on my 290X just crashed the game or gave a "display driver has been reset" message. I do not remember the last time it crashed Windows.

Well, I was on the older 7970s; if the driver crashed on them, it usually froze Windows for me. I hardly ever got a driver reset.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Well, I was on the older 7970s; if the driver crashed on them, it usually froze Windows for me. I hardly ever got a driver reset.


Just tried an "unstable OC" and the system crashed to a black screen.


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> Well, I was on the older 7970s; if the driver crashed on them, it usually froze Windows for me. I hardly ever got a driver reset.
> 
> 
> 
> Just tried an "unstable OC" and system crashed and black screen.

Funny, I've yet to crash Windows with my 1080 on a core or memory overclock; it usually resets in 2 minutes or less.


----------



## BigBeard86

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Funny, I've yet to crash Windows with my 1080 on a core or memory overclock; it usually resets in 2 minutes or less.


same here; I have pushed the card ridiculously far, and the worst I got was a driver reset.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BigBeard86*
> 
> same here; I have pushed the card ridiculously far, and the worst I got was a driver reset.


Were you running CUDA?


----------



## jehovah3003

Anyone here having issues with the new Ryzen boards with their 1080?


----------



## mbm

Crazy mem speeds you've got.
I can only run 5400 MHz.


----------



## x7007

Quote:


> Originally Posted by *jehovah3003*
> 
> Anyone here having issues with the new Ryzen boards with their 1080 ?


What issues?


----------



## Vellinious

Quote:


> Originally Posted by *BigBeard86*
> 
> same here; I have pushed the card ridiculously far, and the worst I got was a driver reset.


I crashed mine trying for 2300....usually, though, the driver just resets.


----------



## Spartoi

I couldn't find any 1080 Tis in stock, so I ended up getting another GTX 1080 for an SLI build, in hopes of "future-proofing" (as best as possible). Anyway, I'm new to SLI and was wondering if anyone knows of/has a beginner's guide for using SLI?


----------



## shilka

Quote:


> Originally Posted by *Spartoi*
> 
> I couldn't find any 1080 Tis in stock, so I ended up getting another GTX 1080 for an SLI build, in hopes of "future-proofing" it (as best as possible). Anyway, I'm new to SLI and was wondering if anyone knows of/has a beginner's guide for using SLI?


SLI support is getting worse and worse, so unless you really, really need more GPU power, I say don't bother; it's not worth the cost, the headache, and all the problems involved.
I have been a long-time SLI user with many setups, and the support flat-out sucks in many newer games; it's worse than it has been in a very long time.

Even Nvidia cares less about SLI than ever before.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Spartoi*
> 
> I couldn't find any 1080 Tis in stock, so I ended up getting another GTX 1080 for an SLI build, in hopes of "future-proofing" (as best as possible). Anyway, I'm new to SLI and was wondering if anyone knows of/has a beginner's guide for using SLI?


SLI? Return it.


----------



## Bal3Wolf

I agree, SLI and CrossFire just aren't being worked on and optimized nowadays like in the past; it can sometimes take months to years to get working profiles.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> I agree, SLI and CrossFire just aren't being worked on and optimized nowadays like in the past; it can sometimes take months to years to get working profiles.


I think the problem is devs are trying to move to DX12/Vulkan, and you also have more buggy games at release than ever. They do not have time for the 0.01% of dual-GPU users.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *DrFreeman35*
> 
> Curious to see if anyone is still waiting for the ACX to ICX upgrade program on here? If so, when did you register? I thought about doing a step-up to a Ti, but considering I am not watercooling..... it doesn't make much sense.


Unless you're using a Hybrid, WC makes a lot of sense! Start looking at your memory temps. Custom WC blocks will cool the memory, and what I've seen so far (people speaking of a 60°C throttle) indicates memory temps through the roof.


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> I agree, SLI and CrossFire just aren't being worked on and optimized nowadays like in the past; it can sometimes take months to years to get working profiles.
> 
> 
> 
> I think the problem is devs are trying to move to DX12/Vulkan, and you also have more buggy games at release than ever. They do not have time for the 0.01% of dual-GPU users.

True, but even games that come out on DX11 don't usually have working support, even when it's an engine that supports it. Most times it takes months to get it working right in new games, if ever.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bal3Wolf*
> 
> True, but even games that come out on DX11 don't usually have working support, even when it's an engine that supports it. Most times it takes months to get it working right in new games, if ever.


I basically wanted to get Witcher 3 to work with 2 x 290X. The power was there: one GPU was getting 30-35 fps at 4K, so with two cards and no CPU limitations it should have hit 60 fps. It never did. The problem was AMD could not do anything about it, and based on a support ticket, the Witcher 3 devs told me there was "no official support" for CFX/SLI. This tells me all SLI/CFX can do is try to run in games. Really not worth bothering with. I think if you want max fps, just keep buying the latest $700 GPU every year; that's enough.


----------



## shilka

I spent 7000 kr, which is about $1000 US, on two GTX 970 cards right after they came out, and many of the games I wanted to play ran like crap.
Or, in the case of Company of Heroes 2 and Just Cause 3, had zero SLI support.

It has only gotten worse since then, with more games like Witcher 3 having very limited or no SLI support at all.
SLI is a waste of time and money, and even if it works, it's not going to run all that great.

Forget about SLI; return the GTX 1080 and get a GTX 1080 Ti instead.

Edit: I had a GTX 680 SLI setup before the GTX 970 cards, and back then SLI support was much better and broader than it is today.
I wonder why it has gone so far downhill?


----------



## ZealotKi11er

Quote:


> Originally Posted by *shilka*
> 
> I spent 7000 kr which is about $1000 US on two GTX 970 cards right after they came out and many of the games i wanted to play ran like crap
> Or in the case of Company of Heroes 2 and Just Cause 3 had zero SLI support
> 
> It has only gotten worse since then with more games like Witcher 3 with very limited or no SLI support at all
> SLI is a waste of time and money and even if it works its not going to run at all that great
> 
> Forget about SLI return the GTX 1080 and get a GTX 1080 Ti instead.
> 
> Edit: i had a GTX 680 SLI setup before the GTX 970 cards and back then SLI support was much better and broader then it is today
> Wonder why it has gone so far downhill?


Yeah, CFX was much better before 2015; for two years now it has been terrible. I think the main reason is that games have so many other problems at launch that dual GPUs never get a turn.


----------



## Bal3Wolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> true but even games that come out on dx11 dont usualy have working support even when its a engine that has support most times it takes months to get it working right in new games if ever.
> 
> 
> 
> I basically wanted to get Witcher 3 to work with 2 x 290X. The power was there. 1 GPU was getting 30-35 fps at 4K so with 2 cards and no CPU limitations it was going to hit 60 fps. It never did. The problem was AMD could not do anything about it and Witcher 3 based on support ticket told me "no official support" for CFX/SLI. This tells me all SLI/CFX can do is try to run on games. Really not worth bothering. Think if want the MAX fps just keep buying the latest $700 GPU every year and I think its enough.

My whole reason I left AMD to come to a 1080: one powerful card seems like the only good choice nowadays. The 7970s were even worse; it seemed like some newer cards had working CrossFire sometimes, but the 7970s didn't, even in some games that worked before AMD broke them and never fixed them.


----------



## mbm

SLI depends on the games...
I had GTX 970 SLI and loved them. SLI scaled to almost 100% in Battlefield 4.
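"Almost 100% scaling" has a simple definition: the extra fps the second card buys you, relative to a single card. A quick sketch of the arithmetic (the fps numbers are made up for illustration):

```python
def sli_scaling(fps_single, fps_sli):
    """Scaling of the second GPU as a percentage: 100% means fps doubled."""
    return (fps_sli / fps_single - 1) * 100

print(sli_scaling(60, 118))  # second card added ~97%: near-perfect scaling
print(sli_scaling(60, 66))   # ~10%: the profile barely works
```

Anything near zero (or negative, as reported with some profiles above) means the second card is mostly idle, or actively hurting frame pacing.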


----------



## ZealotKi11er

Quote:


> Originally Posted by *mbm*
> 
> SLI depends on the games...
> I had GTX 970 SLI and loved them. SLI scaled to almost 100% in Battlefield 4.


Yeah, BF4. I loved 2 x 290X for BF4. That was the only game that ran like that. I had 2 x 290X just for BF1, but BF1 was much worse than BF4 in terms of dual GPUs.


----------



## ryanallan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah BF4. I loved 2x290X for BF4. That was the only game that ran like that. Had 2x290X just for BF1 but BF1 was much worse than BF4 in terms of Dual GPUs.


word.

I sold my 1070 SLI + HB bridge for a 1080, and my BF1 experience actually got better.

Same with BF4. SLI 1070s made BF4 unplayable: piss-poor frame rates, stutters, etc. A single 1080 fixed that.


----------



## shilka

Battlefield 4 is over 3 years old now, and while the support for SLI and CrossFire might be good in that game, almost all newer games have crap to non-existent support.
Unless the support gets better, multiple video cards are a waste of money; you might as well take your money and throw it out the window, because the results are going to be the same.


----------



## ZealotKi11er

Quote:


> Originally Posted by *shilka*
> 
> Battlefield 4 is over 3 years old now, and while the support for SLI and CrossFire might be good in that game, almost all newer games have crap to non-existent support.
> Unless the support gets better, multiple video cards are a waste of money; you might as well take your money and throw it out the window, because the results are going to be the same.


CFX/SLI is dead. The only hope is dual GPU via DX12/Vulkan; that is how dual GPUs should be done. If that becomes the norm, dual GPUs will come back.


----------



## pez

With the Tis out, getting 1070 SLI or 1080 SLI at this point is a bit backwards. You'd be better off with the better minimum frames of the Ti than with what you would have from 1070 or 1080 SLI. However, I can't agree that SLI is dead just because games like Just Cause 3 and the new Deus Ex don't fully support it.


----------



## jehovah3003

Quote:


> Originally Posted by *x7007*
> 
> what issues ?


Stuttering in all games with freezes even in Windows.


----------



## Motley01

Well add me to this list. I just got the Asus Strix 1080. This thing is wicked awesome! Holy crap batman.

I'm using the Asus Tweak II tool to OC it.

I have it set for user defined OC.

The GPU Boost clock is maxed out at +162 (1972 MHz), but when I'm playing games, the monitor shows the clock reaching 2072 MHz. Is this OK? (New to Nvidia; I've had AMD cards for the last 6 years.)

Other settings:

GPU Voltage: +46
Memory clock: +272
Power Target: 120
Fan Speed: Auto


----------



## Spartoi

Well, I wasn't expecting everyone to bash choosing 1080 SLI over a 1080 Ti. I'm going to keep the SLI setup because I still believe it gives me more performance in the long run than a single 1080 Ti, and I rarely play games at release, so no SLI support at launch is less of an issue for me. I am surprised that almost everyone thinks a 1080 Ti is better than 1080 SLI for a "future-proof" build. SLI might be "dying", but it will be replaced with superior DX12 and Vulkan multi-GPU support. And for the games that don't support SLI or multi-GPU, I'll just disable it and game on a single 1080. I honestly think 1080 SLI is a better setup for me than a single 1080 Ti, but I understand everyone's concerns.


----------



## Vellinious

Well...finally found a 1080 that doesn't clock very well. This FTW I've been messing with simply won't do anything over 2151 with normal ambient temps, and 2139 seems to be the limit for performance. I'll take it outside this weekend, see if I can get a little more out of it, but.....I'm betting 2151, maybe 2164 will be all I gain from it.

That's really going to limit the 3 way SLI runs I wanted to do. Meh


----------



## xartic1

Quote:


> Originally Posted by *SmackHisFace*
> 
> Hi guys, I've had my 1080 for about a month, and I'm noticing that it no longer boosts to 1.093 V but rather caps out at 1.062 V. I have max power limit, max voltage, and max temp limit. It doesn't matter what temp the card is at; it's topping out at 1.062 V. I know the cards throttle and voltages drop, but they used to always be at least 1.075 V during gaming. What gives; why the sudden change? Driver 376.33.


Quote:


> Originally Posted by *Vellinious*
> 
> Well...finally found a 1080 that doesn't clock very well. This FTW I've been messing with simply won't do anything over 2151 with normal ambient temps, and 2139 seems to be the limit for performance. I'll take it outside this weekend, see if I can get a little more out of it, but.....I'm betting 2151, maybe 2164 will be all I gain from it.
> 
> That's really going to limit the 3 way SLI runs I wanted to do. Meh


How much more performance do you realistically get from 2151 MHz to, say, 2300?


----------



## Motley01

Quote:


> Originally Posted by *Vellinious*
> 
> Well...finally found a 1080 that doesn't clock very well. This FTW I've been messing with simply won't do anything over 2151 with normal ambient temps, and 2139 seems to be the limit for performance. I'll take it outside this weekend, see if I can get a little more out of it, but.....I'm betting 2151, maybe 2164 will be all I gain from it.
> 
> That's really going to limit the 3 way SLI runs I wanted to do. Meh


What are you going for a world record or something? Do you actually play any games, or do you just run benchmarks all day?


----------



## Bal3Wolf

Quote:


> Originally Posted by *Vellinious*
> 
> Well...finally found a 1080 that doesn't clock very well. This FTW I've been messing with simply won't do anything over 2151 with normal ambient temps, and 2139 seems to be the limit for performance. I'll take it outside this weekend, see if I can get a little more out of it, but.....I'm betting 2151, maybe 2164 will be all I gain from it.
> 
> That's really going to limit the 3 way SLI runs I wanted to do. Meh


Well, I'd say you're quite lucky, lol, as my FTW Hydro Copper won't do over 2100.


----------



## Vellinious

Quote:


> Originally Posted by *xartic1*
> 
> How much more performance do you realistically get going from 2151MHz to, say, 2300?


For benchmarks? Enough to make a difference.
Quote:


> Originally Posted by *Motley01*
> 
> What are you going for a world record or something? Do you actually play any games, or do you just run benchmarks all day?


I do game. But I also submit scores to HWBot. So, I push my hardware as far as I possibly can.


----------



## philhalo66

I'm planning to sell off my 1070 and get an EVGA 1080 Classified. Do you guys think my 3570K will cause a substantial bottleneck?


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> I'm planning to sell off my 1070 and get an EVGA 1080 Classified. Do you guys think my 3570K will cause a substantial bottleneck?


At 5GHz, not really. I do see *some* bottlenecking from the i5 in my GF's rig (second sig rig) with a 1070, but I think it's due more to pushing a high refresh rate than anything. We're also not sitting at 5GHz on that thing. Also, of course, at 1440p a 1070 or 1080 with an i5 is going to be super solid. Performance loss would be negligible at worst.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> At 5GHz, not really. I do see *some* bottlenecking from the i5 in my GF's rig (second sig rig) with a 1070, but I think it's due more to pushing a high refresh rate than anything. We're also not sitting at 5GHz on that thing. Also, of course, at 1440p a 1070 or 1080 with an i5 is going to be super solid. Performance loss would be negligible at worst.


I usually run it at 4.8GHz due to heat limitations. So basically there will be some, but not enough to worry about? If so, then I'll go ahead and grab one, then see if I can't find a cheap 4930K or something similar. Most of the games I play are heavily single-threaded, like the original Crysis, DOOM 3, and a few Source games. The only game I see a bottleneck in currently is Rise of the Tomb Raider (Geothermal Valley), and a very mild one in ARK, but the vast majority of the time every game has my 1070 pegged at 99%, even at 2100MHz.


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> I usually run it at 4.8GHz due to heat limitations. So basically there will be some, but not enough to worry about? If so, then I'll go ahead and grab one, then see if I can't find a cheap 4930K or something similar. Most of the games I play are heavily single-threaded, like the original Crysis, DOOM 3, and a few Source games. The only game I see a bottleneck in currently is Rise of the Tomb Raider (Geothermal Valley), and a very mild one in ARK, but the vast majority of the time every game has my 1070 pegged at 99%, even at 2100MHz.


Yeah, unless you're on 1080p, I always agree with doing a GPU upgrade first. However, that Ivy Bridge-E i7 might give a nice boost as well.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> Yeah, unless you're on 1080p, I always agree with doing a GPU upgrade first. However, that Ivy Bridge-E i7 might give a nice boost as well.


Unfortunately I am on 1080p. How badly do you think it will bottleneck me at 1080p?


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> Unfortunately I am on 1080p. How badly do you think it will bottleneck me at 1080p?


Hmm, my question is more along the lines of 'what aren't you maxing out at 1080p?'

I find that a 1070 is great for 1440p 144Hz. You won't get 144fps in every single game with the 1070, but I can't imagine you're not maxing everything out currently and getting 60+. I would say a CPU upgrade is more worthwhile for you...maybe even a monitor upgrade.


----------



## Astreon

So how's my Gainward doing?

http://www.3dmark.com/3dm/18795305?

GLH BIOS, memory OC +350 in Afterburner (1399MHz in GPU-Z) and +50 to core.

I kinda forgot to OC my i5-4690K; it was at 4.2GHz during the benchmark.

It does buzz (like every other GPU I've owned) in 3DMark, and whine at 500+ fps (loudly at 2000+), but if it's great, keeping it is an option.

I have to decide by Tuesday; the alternative is to buy AMD Vega instead. Maybe it won't coil-whine, hah.


----------



## Motley01

Quote:


> Originally Posted by *philhalo66*
> 
> Unfortunately I am on 1080p. How badly do you think it will bottleneck me at 1080p?


Instead of buying a new GTX 1080, just buy yourself a nice 1440p monitor. Then later, after you save up again, get the 1080.

At 1080p a GTX 1080 would be overkill.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> Hmm, my question is more along the lines of 'what aren't you maxing out at 1080p?'
> 
> I find that a 1070 is great for 1440p 144hz. You won't get 144fps in every single game with the 1070, but I can't imagine you're not maxing everything out currently and getting 60+. I would say that a CPU upgrade is more worthwhile for you...maybe even a monitor upgrade.


Rise of the Tomb Raider is one of them; it dips to the mid-20s with the AA turned up, and I watched it, it's the 1070 holding me back. ARK drops to the low teens on Epic settings; again it's the 1070 holding me back, GPU load pegged at 100%. Crysis 3 and a couple of other games I play regularly too. The processor would be a waste of money right now; I only bottleneck in 3 games, and it's only a very small bottleneck, hence why I was wondering how bad it would be for the 1080.
Quote:


> Originally Posted by *Motley01*
> 
> Instead of buying a new GTX 1080, just buy yourself a nice 1440p monitor. Then later, after you save up again, get the 1080.
> 
> At 1080p a GTX 1080 would be overkill.


I'm not interested in 1440p. Most of the games I play now barely run maxed out because the 1070 isn't fast enough; trying to run them at 1440p would be a slideshow. You seem to be way overestimating the 1070. Tomb Raider barely runs at 1080p with 2xAA; 1440p would be single digits. I refuse to play on medium or low; I might as well get it for my PS4, know what I mean?


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> Rise of the Tomb Raider is one of them; it dips to the mid-20s with the AA turned up, and I watched it, it's the 1070 holding me back. ARK drops to the low teens on Epic settings; again it's the 1070 holding me back, GPU load pegged at 100%. Crysis 3 and a couple of other games I play regularly too. The processor would be a waste of money right now; I only bottleneck in 3 games, and it's only a very small bottleneck, hence why I was wondering how bad it would be for the 1080.
> I'm not interested in 1440p. Most of the games I play now barely run maxed out because the 1070 isn't fast enough; trying to run them at 1440p would be a slideshow. You seem to be way overestimating the 1070. Tomb Raider barely runs at 1080p with 2xAA; 1440p would be single digits. I refuse to play on medium or low; I might as well get it for my PS4, know what I mean?


What load percentages are you seeing on the CPU? Just based on the Ars Technica review (below), they're not hitting below 69 on a stock card. Of course, this might be the built-in benchmark. I wish I had the game or I'd test it on the GF's i5 system.

https://arstechnica.com/gadgets/2016/06/nvidia-gtx-1070-review/

Is RoTR DX12 now? If so, that might be why.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> What load percentages are you seeing on the CPU? Just based on the Ars Technica review (below), they're not hitting below 69 on a stock card. Of course, this might be the built-in benchmark. I wish I had the game or I'd test it on the GF's i5 system.
> 
> https://arstechnica.com/gadgets/2016/06/nvidia-gtx-1070-review/
> 
> Is RoTR DX12 now? If so, that might be why.


Depends on the game, but in RoTTR it sits at about 80-85% CPU load and 100% GPU load. In Geothermal Valley, CPU usage spikes to 90-97% and GPU load drops to 76%, but that's the only area in the game where I see this. That review obviously doesn't have the AA turned up at all; if I leave the AA at FXAA I get well over 100 FPS, but it looks like a console game with huge aliasing all over everything. It does have DX12, but I actually get a hefty FPS drop when I use it; DX11 gives me the best framerates.
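A crude way to read CPU/GPU load numbers like these, as a toy sketch: a GPU pegged near 100% means the GPU is the limiter, while GPU load dropping as CPU load spikes points at a CPU bottleneck in that scene. The thresholds below are arbitrary round numbers picked for illustration, not anything official.

```python
def likely_bottleneck(cpu_pct: float, gpu_pct: float) -> str:
    """Very rough heuristic - real analysis needs per-core load and frametimes."""
    if gpu_pct >= 95:
        return "GPU-bound"
    if cpu_pct >= 90 and gpu_pct < 90:
        return "likely CPU-bound"
    return "mixed / engine- or settings-limited"

print(likely_bottleneck(85, 100))  # typical RoTTR scene above -> GPU-bound
print(likely_bottleneck(95, 76))   # Geothermal Valley numbers -> likely CPU-bound
```

Aggregate CPU percentage hides a lot (one maxed thread on a quad-core reads as 25%), so treat this as a first pass only.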


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> Depends on the game, but in RoTTR it sits at about 80-85% CPU load and 100% GPU load. In Geothermal Valley, CPU usage spikes to 90-97% and GPU load drops to 76%, but that's the only area in the game where I see this. That review obviously doesn't have the AA turned up at all; if I leave the AA at FXAA I get well over 100 FPS, but it looks like a console game with huge aliasing all over everything. It does have DX12, but I actually get a hefty FPS drop when I use it; DX11 gives me the best framerates.


That makes sense. And yeah, AA is definitely still desirable at 1080p.

I won't be the one to stop you on a GPU upgrade, but I'm afraid you won't see as large of a boost as you normally would without also going up in monitor resolution as well.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> That makes sense. And yeah, AA is definitely still desirable at 1080p.
> 
> I won't be one to stop you on a GPU upgrade, but i'm afraid you won't see as large of a boost as you normally would without also going with a monitor upgrade (upgrade as in going up in resolution) as well.


This upgrade is more of a phase one type deal. I was going to get a 1080, then upgrade to a 4960X or a 4930K, whichever is cheaper, then move to a 1440p monitor. Right now the plan is: get the 1080 Classified, then around October/November snag a 6-core Intel.


----------



## motov8

Quote:


> Originally Posted by *Bal3Wolf*
> 
> Well, I'd say you're quite lucky lol, as my FTW Hydro Copper won't do over 2100.


Don't worry mate, it doesn't really matter in my situation as the owner of a Gainward GTX 1080 (GLH BIOS).
My card scores best with memory at 11,200; above this speed I get a lower score (probably looser timings kicking in).
On the core I can get 2076MHz stable, and this speed gives me more FPS than 2114MHz via the curve.


----------



## DarthBaggins

I upgraded to a 1080, and I'm currently running at 1080p since I budgeted more towards the 5930K w/ 32GB of memory, with a 970 in the original build. Now that the Strix 970 is gone and I'm getting ready to mount the block on the 1080, I've been eyeing the Dell S2716DG once I can find a killer deal on it like I did the 1080 (hard to pass on a $400 1080 SC). So far it looks like 2080 is my limit on the air OC on the SC. Now to see what it can do under water - it was looking thirsty lol


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> This upgrade is more of a phase one type deal. I was going to get a 1080, then upgrade to a 4960X or a 4930K, whichever is cheaper, then move to a 1440p monitor. Right now the plan is: get the 1080 Classified, then around October/November snag a 6-core Intel.


With 1080s with faster GDDR5X on the way, you may want to hold out a tad longer. That, or defer and upgrade one of the other items on your list.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> With 1080s with faster GDDR5X on the way, you may want to hold out a tad longer. That, or defer and upgrade one of the other items on your list.


I forgot about that. Maybe I'll upgrade my processor and motherboard while I hold out for the faster 1080s.

Actually thinking about a 7700K and an EVGA Z270 Classified board along with 16GB of RAM. Should be a hefty upgrade and eliminate any and all bottlenecks.


----------



## Blaze051806

Played SC2 for 5 hours, max temp 39C haha, this AIO 1080 is awesome lol 260fps XD

My old 480 ran 80C in SC2 for reference lol


----------



## Derpinheimer

Quote:


> Originally Posted by *Vellinious*
> 
> Well...finally found a 1080 that doesn't clock very well. This FTW I've been messing with simply won't do anything over 2151 with normal ambient temps, and 2139 seems to be the limit for performance. I'll take it outside this weekend, see if I can get a little more out of it, but.....I'm betting 2151, maybe 2164 will be all I gain from it.
> 
> That's really going to limit the 3 way SLI runs I wanted to do. Meh


???

A bad clocker? I'm toying with the T4 BIOS on a 1080 FTW, and the best I can get, error-free on OCCT, is 2138. Games run fine far higher, but I get about 1 error/second at 2154.

On that note, has anyone else tried using error check on OCCT gpu tester?


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> I forgot about that. Maybe I'll upgrade my processor and motherboard while I hold out for the faster 1080s.
> 
> Actually thinking about a 7700K and an EVGA Z270 Classified board along with 16GB of RAM. Should be a hefty upgrade and eliminate any and all bottlenecks.


I agree with this upgrade path.


----------



## mbm

Quote:


> Originally Posted by *Derpinheimer*
> 
> ???
> 
> A bad clocker? I'm toying with the T4 BIOS on a 1080 FTW, and the best I can get, error-free on OCCT, is 2138. Games run fine far higher, but I get about 1 error/second at 2154.
> 
> On that note, has anyone else tried using error check on OCCT gpu tester?


Well, I think that's a great OC, 2138MHz.
You say you are able to game for hours far higher? At what clocks?
How do you get these high clocks?

I myself cannot get any higher than 2038MHz. I can get it to boost to 2100MHz, but it stabilises at 2038MHz when gaming for a while.


----------



## Vellinious

Quote:


> Originally Posted by *Derpinheimer*
> 
> ???
> 
> A bad clocker? I'm toying with the T4 bios on a 1080 FTW and the best I can get, error free on OCCT, is 2138. Games run fine far higher, but I about 1 error/second at 2154.
> 
> On that note, has anyone else tried using error check on OCCT gpu tester?


I don't care if it's stable.....I just need it benchmark stable. lol


----------



## Derpinheimer

Quote:


> Originally Posted by *mbm*
> 
> Well, I think that's a great OC, 2138MHz.
> You say you are able to game for hours far higher? At what clocks?
> How do you get these high clocks?
> 
> I myself cannot get any higher than 2038MHz. I can get it to boost to 2100MHz, but it stabilises at 2038MHz when gaming for a while.


I should be more careful with my choice of words. Oh also, this card is watercooled - don't feel bad.

I thought it was stable at 2202, but I got a crash there, and then again at 2189 after some time. So, it's stable at 2176 gaming, or OCCT error-free at 2138.

Comparing Metro LL framerate across some OC settings,

Stock: 100%
+182/0: 108%
+270/0: 110%
+0/595: 106.9%
+182/595: 113.3%
+270/595: 116.2%

Core
+182 yields 2088 using offset method
+270 yields 2176 using voltage/frequency curve

Memory
+595 memory gives the best performance. It runs fine all the way to +1000, but performance is lower; +600 triggers the performance drop, then it almost recovers it all by +1000.

----

Has it been determined why the curves give lower performance than a plain offset? I remember reading about a hidden "video clock", but I can't find anything about it. I thought someone said that if you increase the frequency at a certain voltage, you get higher framerates even if that bin isn't being used (i.e. compared to setting 1.1v to 2100MHz and leaving every other point at stock, you'll get higher performance if you set 1.0-1.093v to 2088MHz and 1.1v to 2100MHz).

I believe this is true from my limited testing, but it's hard to verify.
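As a quick sanity check on the numbers above, here's a back-of-the-envelope script. The 1906MHz base boost is back-solved from the offsets quoted in the post (2088 - 182 = 2176 - 270 = 1906), and the framerate percentages are the Metro LL figures as posted; treat it as rough arithmetic, not a benchmark tool.

```python
# Rough scaling check using the Metro LL numbers posted above.
# All figures come from the post; nothing here queries the card.

base_core = 1906          # MHz, back-solved from the offsets quoted above
runs = {
    "stock":    (0,   0,   100.0),   # (core offset, mem offset, relative fps %)
    "+182/0":   (182, 0,   108.0),
    "+270/0":   (270, 0,   110.0),
    "+0/595":   (0,   595, 106.9),
    "+182/595": (182, 595, 113.3),
    "+270/595": (270, 595, 116.2),
}

for name, (core, mem, fps) in runs.items():
    clock = base_core + core
    gain = fps - 100.0
    # Only compute %/100MHz for core-only runs, so memory gains don't pollute it
    per_100mhz = (gain / core * 100) if core and not mem else None
    line = f"{name:10s} core {clock} MHz  gain {gain:+5.1f}%"
    if per_100mhz is not None:
        line += f"  ({per_100mhz:.1f}% per 100 MHz core)"
    print(line)
```

The takeaway matches the post: core-only scaling is roughly 4% per 100MHz, so squeezing 2151 up to ~2300 buys a handful of percent at best outside of benchmarks.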


----------



## Beagle Box

In my experience, there are 3 different curve types that give maximum performance on an air-cooled card.

*Maximum Standard Curve* - maximum area beneath the useable stock curve for finicky DX11 benchmarks and general game playing. MAX voltage won't make a lot of difference. Cooling is more important.
*Maximum Flattened Curve* - higher lows and lower highs to level out fps for long-session high graphics game playing. This is my twist based on the concept found in the MSI Gaming BIOS. Lower Voltage than MAX can often be used for better overall performance due to lower temps.
*Maximum Spiked Curve* - a less aggressive version of either of the first two methods in low and mid-range + a high top end spike where you shoot for one maximum voltage/MHZ point on the curve and try to keep it there. This can be an actual spike or a sharp upward curve of a few points. If cooling is a problem for you, your benchmark scores will be worse with this technique than had you used methods 1 or 2.

I've also heard rumors of a *Steep Curve* method. I tried that using Afterburner, and AB's infuriating random point scattering means I'd rather have a root canal than ever, _ever_, try _that_ again.

My conclusion: Heat is the enemy. Water-cooling will always be better than air. I don't know if the above also holds true for those on H2O.

These are just my observations based on my experience with my air-cooled MSI GTX 1080 Gaming X. YMMVGDOYS. And, as always, I could just be doing it all wrong.
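To make the "flattened curve" idea above concrete, here's a toy sketch; every voltage step and clock value is invented for illustration and doesn't come from any particular card or BIOS.

```python
# Toy illustration of the "flattened curve" idea: raise the low-voltage
# bins and cap the top ones so the clock stays steadier as temperature
# pushes the card down the curve. All numbers are invented for the example.

stock_curve = [  # (voltage in V, clock in MHz) - hypothetical stock points
    (0.80, 1700), (0.90, 1850), (1.00, 1975), (1.05, 2050), (1.093, 2100),
]

def flatten(curve, floor_mhz=1900, ceiling_mhz=2025):
    """Clamp every point into a narrow clock band (higher lows, lower highs)."""
    return [(v, min(max(mhz, floor_mhz), ceiling_mhz)) for v, mhz in curve]

for (v, stock), (_, flat) in zip(stock_curve, flatten(stock_curve)):
    print(f"{v:.3f} V: {stock} -> {flat} MHz")
```

The trade-off is exactly as described: you give up the 2100MHz peak at 1.093v, but the lower bins the card throttles into during long sessions now sit higher, so the average fps is steadier.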


----------



## mbm

No matter what curve or voltage I use, the card will settle at 2038-2050MHz after gaming for 30 min.


----------



## philhalo66

EVGA said they have no plans for a 1080 with faster memory. A bit disappointing, TBH.


----------



## pez

They're also the last ones to get their AIB cards out...for the second (well, third if you count 1070/1080 as two different releases) time in a row. So I ask: why are they the last to get their GPU out, when the ACX cooler ended up being faulty in the end? To top it off, I'm still a bit 'salty' about the Ti/TXP hybrid cooler being $160.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> They're also the last ones to get their AIB cards out...for the second (well third if you count 1070/1080 as two different releases) time in a row. So I ask why are they the last to get their GPU out when the ACX cooler ended up being faulty in the end? To top it off, I'm still a bit 'salty' about the Ti/TXP hybrid cooler being $160.


Yeah, I'm a bit salty about them not refreshing the 1080 like everyone else. Maybe I'll jump ship to the 1080 Ti; I really doubt my 7700K will bottleneck that.


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> Yeah, I'm a bit salty about them not refreshing the 1080 like everyone else. Maybe I'll jump ship to the 1080 Ti; I really doubt my 7700K will bottleneck that.


Very nice! You just upgraded, right? Sorry if my memory is escaping me, but I believe you were coming from an i5 or i7 3xxx?

I just ordered a Ti myself and decided that I wasn't going to put up with EVGA this time.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> Very nice! You just upgraded, right? Sorry if my memory is escaping me, but I believe you were coming from an i5 or i7 3xxx?
> 
> I just ordered a Ti myself and decided that I wasn't going to put up with EVGA this time.


Yeah, 7700K. Mobo and RAM should be here Thursday next week. Upgraded from a 3570K.

Honestly, even if EVGA is slower, their legendary customer service makes up for it, for me at least. When the fans on my 1070 started failing, Gigabyte told me to "buy new fans on the internet", or I could pay 70 dollars for shipping and get a used card as a replacement for my 2-month-old card that works fine otherwise.


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> Yeah, 7700K. Mobo and RAM should be here Thursday next week. Upgraded from a 3570K.
> 
> Honestly, even if EVGA is slower, their legendary customer service makes up for it, for me at least. When the fans on my 1070 started failing, Gigabyte told me to "buy new fans on the internet", or I could pay 70 dollars for shipping and get a used card as a replacement for my 2-month-old card that works fine otherwise.


Yeah, I agree that EVGA CS is quite amazing. They've come through for me countless times.

In the age of Twitter and other social media, though, I'm not too worried about bad CS from Asus or GB. Because of that, I had GB come through to set up my RMA on a free item I got from them with my 1080 G1.

EDIT: Good to know my memory isn't that bad. Congrats on your upgrade.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> Yeah, I agree that EVGA CS is quite amazing. They've come through for me countless times.
> 
> In the age of Twitter and other social media, though, I'm not too worried about bad CS from Asus or GB. Because of that, I had GB come through to set up my RMA on a free item I got from them with my 1080 G1.
> 
> EDIT: Good to know my memory isn't that bad. Congrats on your upgrade.


I've only ever dealt with an RMA a few times, and the Gigabyte one left a bad taste in my mouth, so I'm very untrusting of companies now. But on the other hand, Kingston replaced my faulty Cloud 2 headset without any hassle; they even shipped out a replacement a few hours after the faulty one arrived at the RMA facility.

Thanks! I'm pretty hyped to test a few games, and it's going to be funny when I blow away my friend's 6800K in nearly every game, especially considering I paid 400 dollars less for my upgrades.


----------



## netxzero

is there a way to increase the power limit or use a different bios for the 1080 FE?


----------



## mbm

So more voltage equals higher MHz?
On my card it doesn't matter if I run 1.093 or 0.950; the frequency settles at 2050 after a while.
I would love a custom BIOS that boosts to 2050MHz at 0.950v so I could drop Afterburner.


----------



## philhalo66

Welp, I guess Newegg wanted me to buy a 1080. They cancelled my 7700K order because some dummy didn't count the stock right, so I said heck with it and bought the EVGA FTW2 1080 instead.


----------



## mbm

Quote:


> Originally Posted by *philhalo66*
> 
> Welp, I guess Newegg wanted me to buy a 1080. They cancelled my 7700K order because some dummy didn't count the stock right, so I said heck with it and bought the EVGA FTW2 1080 instead.


Don't think it will fit in the CPU socket


----------



## philhalo66

Quote:


> Originally Posted by *mbm*
> 
> Don't think it will fit in the CPU socket


Sure it will; a little duct tape and some elbow grease and anything is possible.


----------



## Bal3Wolf

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mbm*
> 
> Dont Think it will fit in The cpu socket
> 
> 
> 
> Sure it will, little duct tape and some elbow grease and anything is possible.

Don't forget the super glue and electrical tape. Before I had a nice setup, I had cases with electrical tape all over, with my home-made wire extensions.


----------



## philhalo66

Quote:


> Originally Posted by *Bal3Wolf*
> 
> dont forget the super glue and electric tape befor i had a nice setup i had cases that had electric tape all over with my home made wire extensions.


You should have seen my old case; I literally cut out pieces and welded new ones on because it was so badly rusted from the humidity in my area lol.


----------



## Bal3Wolf

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> dont forget the super glue and electric tape befor i had a nice setup i had cases that had electric tape all over with my home made wire extensions.
> 
> 
> 
> you should have seen my old case, i literally cut out pieces and re-welded new ones on because it was so badly rusted from the humidity in my area lol.

lol, if I had a welder I'd probably have a whole case pieced together lol. In the old days we didn't have the easy wire management we have today, or PSUs with tons of extra connectors; you had to add that crap yourself lol.


----------



## philhalo66

Question: should I go for the EVGA FTW ACX 3.0 or the SC2 iCX? The iCX is 10 dollars more expensive and has lower clocks, but it boasts the new iCX cooler.


----------



## pez

The iCX is a cooler they had to release early because of the ACX 3.0's flaws. The FTW wasn't affected, IIRC. The FTW *should* theoretically be the better performer, but really it can go either way.


----------



## philhalo66

Quote:


> Originally Posted by *pez*
> 
> The iCX is a cooler they had to release early because of the ACX 3.0's flaws. The FTW wasn't affected, IIRC. The FTW *should* theoretically be the better performer, but really it can go either way.


I ended up going with the FTW2 iCX. Should be here Wednesday if Newegg doesn't screw up again.


----------



## max883

I recommend the 1080 SC2 iCX. 6 days ago my 1080 SC ACX started artifacting in games and it got really bad!

It looked like this in all games and only got more artifacts!!

Now I have the iCX and it is a better card overall.


----------



## philhalo66

Quote:


> Originally Posted by *max883*
> 
> I recommend the 1080 SC2 iCX. 6 days ago my 1080 SC ACX started artifacting in games and it got really bad!
> 
> 
> It looked like this in all games and only got more artifacts!!
> 
> Now I have the iCX and it is a better card overall.


Ouch! Give EVGA a call or email them; they will take care of it - they're legendary for their customer service. That's the reason I went with an EVGA card. Honestly, I'm a little shocked it started artifacting; more than once I've turned off the fans, forgotten, and played games for hours with the card running well over 95C, and it's still working perfectly.


----------



## ZealotKi11er

Quote:


> Originally Posted by *philhalo66*
> 
> EVGA said they have no plans for a 1080 with faster memory. A bit disappointing, TBH.


I am sure they will release them if Vega does anything. There is no reason for them to say 11Gbps cards are coming when they still have old GTX 1080 stock to move. You can probably Step-Up, hopefully.


----------



## philhalo66

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am sure they will release them if Vega does anything. There is no reason for them to say 11Gbps cards are coming when they still have old GTX 1080 stock to move. You can probably Step-Up, hopefully.


Most likely they will.

Just out of curiosity, does anyone here with a 4930K have any issues with bottlenecking? I found a dirt-cheap 4930K, and I can probably grab an X79 board cheap off eBay or something.

A 4770K/4790K is cheap nowadays too; would any of those have issues with bottlenecking?


----------



## ZealotKi11er

Quote:


> Originally Posted by *philhalo66*
> 
> Most likely they will.
> 
> Just out of curiosity, does anyone here with a 4930K have any issues with bottlenecking? I found a dirt-cheap 4930K, and I can probably grab an X79 board cheap off eBay or something.
> 
> A 4770K/4790K is cheap nowadays too; would any of those have issues with bottlenecking?


You can't grab a cheap X79 off eBay. That's the problem.


----------



## philhalo66

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You can't grab a cheap X79 off eBay. That's the problem.


There are a few under 250, but generally, yeah, pretty expensive. I guess if I do go used it's going to be a 4790K or a 4770K, since I can get a Sabertooth Z87 for about 150. If I can sell my 1070 for at least 300, I should be able to snag a 4790K off here and a Sabertooth off eBay.


----------



## ZealotKi11er

Quote:


> Originally Posted by *philhalo66*
> 
> There are a few under 250, but generally, yeah, pretty expensive. I guess if I do go used it's going to be a 4790K or a 4770K, since I can get a Sabertooth Z87 for about 150. If I can sell my 1070 for at least 300, I should be able to snag a 4790K off here and a Sabertooth off eBay.


Why are you spending so much for Z87? Just man up and get Z270. They go for like $150-250.


----------



## NYU87

Quote:


> Originally Posted by *philhalo66*
> 
> Most likely they will.
> 
> Just out of curiosity, does anyone here with a 4930K have any issues with bottlenecking? I found a dirt-cheap 4930K, and I can probably grab an X79 board cheap off eBay or something.
> 
> A 4770K/4790K is cheap nowadays too; would any of those have issues with bottlenecking?


4930K owner here. Also had an MSI GTX 1080 Gaming X 8G before I moved up to a 1080 Ti.

No bottlenecking whatsoever, even with the GTX 1080 Ti @ 2.1GHz. Currently at 4.4GHz with a -0.010 offset (1.29v at load). I can go up to 4.8GHz for bench runs. I've had this baby for several years now and it's been a great performer, but I'll be moving up to a 6900K and selling this guy off to a buddy of mine.


----------



## philhalo66

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why are you spending so much for Z87? Just man up and get Z270. They go for like $150-250.


Plus 350 for a 7700K, then another 150 for RAM. I can get a 4790K and a Sabertooth for 380 dollars, then recycle my current RAM.


----------



## hertz9753

Maybe it's a money issue with parts.


----------



## NYU87

Quote:


> Originally Posted by *philhalo66*
> 
> Plus 350 for a 7700K, then another 150 for RAM. I can get a 4790K and a Sabertooth for 380 dollars, then recycle my current RAM.


If you can find a cheap X79 motherboard, I say go for the 4930K. I'm seeing it as low as ~$220 on eBay.


----------



## Bal3Wolf

Finding cheap X79 boards that are reliable can be a pain; I looked in the past and struck out at the time.


----------



## philhalo66

Quote:


> Originally Posted by *hertz9753*
> 
> Maybe it's a money issue with parts.


Pretty much this. I didn't expect to get boned by Newegg like that. I only have about 400-425 max, and that's provided I can sell my 1070 for 300-325.
Quote:


> Originally Posted by *NYU87*
> 
> If you can find a cheap X79 motherboard, I say go for the 4930K. I'm seeing it as low as ~$220 on eBay.


Yeah, I've been checking regularly every few weeks for the past couple of months; if you want one that isn't dead or doesn't have something broken, it's going to be expensive. The 4930K is the cheap part; it's the motherboards that are the problem.


----------



## DarthBaggins

If you go 4790K, at least go with a Z97 chipset.


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> If you go 4790K, at least go with a Z97 chipset.


What does Z97 actually add? The reason I ask is that the Sabertooth board I'm looking at is Z87, and the Z97 version is over 100 dollars more expensive.


----------



## DarthBaggins

A lot have more power phases for OC stability; there's also no having to worry about flashing the BIOS for a 4790K to work right from the start. I'm personally not a fan of the Sabertooth boards - just overpaying for plastic shielding, really.


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> A lot have more power phases for OC stability; there's also no having to worry about flashing the BIOS for a 4790K to work right from the start. I'm personally not a fan of the Sabertooth boards - just overpaying for plastic shielding, really.


From what I've seen, both Sabertooth boards are the same as far as phase count, and with the BIOS Flashback feature on Asus boards from the last few years, it takes 10 seconds to flash the BIOS - you don't even need a CPU or RAM installed to do it. I like the Sabertooth boards; I've had a few on AMD's side and one or two Intel, and they seem significantly more stable. Plus, the Thermal Radar is what I'm most interested in: having a real temp sensor on the VRMs is a huge selling point for me.


----------



## Bal3Wolf

Quote:


> Originally Posted by *philhalo66*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DarthBaggins*
> 
> Alot have more power phases for OC stability, also no having to worry about flashing the bios for a 4790k to work right off the start. I'm personally not a fan of the Sabertooth boards, just over paying for plastic shielding really.
> 
> 
> 
> From what I've seen both sabertooth boards are the same as far as phase count. and with the BIOS flashback thing on all asus boards from the last few years it takes 10 seconds to flash the bios and you dont even need to have a cpu or ram installed to do it. I like the sabertooth boards ive had a few on AMD's side and one or two intel they seems to be significantly more stable plus the thermal radar is what im most interested in. having a real temp sensor on the VRM's is a huge selling point to me.
Click to expand...

I will say I liked my P67 Sabertooth; it still works now, very reliable and durable. Most come with 5-year warranties too, so you might be able to pick up a used one with warranty remaining.


----------



## netxzero

Guess nobody knows if an aftermarket BIOS can be flashed on the FE.


----------



## philhalo66

Quote:


> Originally Posted by *netxzero*
> 
> guess nobody knows if an aftermarket bios can be flashed on the FE.


From a logical standpoint, if the PCB is the same as the FE card's, in theory you should be fine.


----------



## netxzero

Quote:


> Originally Posted by *philhalo66*
> 
> from a logical standpoint if the PCB is the same as a FE card in theory you should be fine.


Do you happen to know which partners use the same PCB as the FE?


----------



## philhalo66

Quote:


> Originally Posted by *netxzero*
> 
> Do you happen to know which partners have the same pcb as FE?


Unfortunately I don't, but have a look around Google. I'm by no means an expert, but when I was messing around with the BIOS for my 580 I noticed the board ID always matched as long as it was a reference card, so maybe that will help. Keep in mind how old the 580 is, though; the same rules might not apply anymore. Good luck.
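For anyone who does go down this road, the usual workflow with NVIDIA's nvflash utility looks roughly like the sketch below. This is a hedged outline from memory, not a guaranteed procedure: flag names vary between nvflash builds (check your build's `--help`), flashing the wrong image can brick a card, and the `-6` override exists precisely because the PCI subsystem IDs won't match on a cross-vendor flash.

```shell
# Rough nvflash workflow for flashing a 1080 BIOS (flags as in common
# Pascal-era nvflash builds; verify against your build's --help output).

# 1. Back up the current BIOS first; this is the only way back.
nvflash --save original_1080.rom

# 2. Inspect the replacement image (board ID, device ID) before flashing.
nvflash --version replacement.rom

# 3. Flash. The -6 switch overrides the PCI subsystem ID mismatch check,
#    which a cross-vendor flash will otherwise fail on.
nvflash -6 replacement.rom
```

If anything goes wrong, booting from the iGPU and re-flashing `original_1080.rom` is the usual recovery path.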


----------



## mbm

Been testing a lot with the voltage and MHz curve in Afterburner.
My card seems to do best at 0.996V; there it settles and maintains a steady 2076MHz.


----------



## netxzero

Quote:


> Originally Posted by *philhalo66*
> 
> Unfortunately i don't but have a look around google. I am by no means an expert but when i was messing around with the bios for my 580 i noticed the board ID always matched so long as it was reference so maybe that might help but keep in mind how old the 580 is, same rules might not apply anymore. Good luck.


Been looking around, and I can't find a card with the same PCB design as the FE.


----------



## CyBorg807

So I just got an Asus FE off a buddy. What's the best OC software? I used to use Precision X with my 600/700 series cards.


----------



## V5-aps

Quote:


> Originally Posted by *netxzero*
> 
> been looking around and i can't see a card with the same PCB design as the FE.


Have a look at the Inno3D iChill X3 1080.


----------



## pez

Quote:


> Originally Posted by *philhalo66*
> 
> I ended up going with the FTW2 iCX. should be here Wednesday if newegg doesn't screw up again.


Very nice, congrats!


----------



## AllGamer

Is it worth upgrading from *SLI GTX 1080* on water to the new *SLI GTX 1080 Ti*?

As in, would there be any significant difference to the naked eye?

I'm running a Surround View triple-screen setup, and the SLI GTX 1080s are handling it just fine.

That's why I'm not sure I should bother; besides, I've yet to see any GTX 1080 Ti with a factory water block.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AllGamer*
> 
> Is it worth it to upgrade for *SLI GTX 1080* on water, to the new *SLI GTX 1080 Ti*?
> 
> As in would there be any significant difference for the naked eyes?
> 
> Running Surround View triple screen setup, and the SLI GTX 1080 is handling it just fine.
> 
> That's why I'm not sure if I should bother, besides I've yet to see any GTX 1080 Ti with factory water blocks.


You will see roughly 25-30% more FPS. It's your call.


----------



## nrpeyton

I hate to say it, but SLI is a thing of the past.


----------



## ZealotKi11er

I have always wondered: do Pascal cards have any BIOS editing tool?


----------



## asefsef

HELP! Does anyone have the original BIOS for the MSI 1080 GAMING 8G (not X or Z, just Gaming 8G)? I can't find it in the TPU BIOS database.

*My issue:*
The memory clock runs at a low 4500MHz. I can overclock to 5500MHz using Afterburner with no issues.
*BUT* sometimes the computer boots up at the correct 5000MHz, then adds Afterburner's +1000MHz, and my computer crashes to oblivion.

To solve the issue I flashed the X and Z BIOSes; no luck. Now I want to RMA, but I'm afraid of them discovering the BIOS flash =(


----------



## pez

If all you play are games that support SLI or multi-GPU, it's totally worth it, but that criterion is very important. Lots of games still support SLI, but plenty don't.


----------



## ZealotKi11er

Quote:


> Originally Posted by *asefsef*
> 
> HELP! - Anyone have original bios for: MSI 1080 GAMING 8G ? (not x or z, just gaming 8g). I can't find it on TPU BIOS database.
> 
> *My issue:*
> Memory clock runs at a low 4500mhz. I can over clock to 5500mhz using afterburner no issues.
> *BUT*, sometimes the computer boots up at the correct 5000mhz, then adds the afterburner's +1000mhz and my computer crashes to oblivion.
> 
> To solve the issue I flashed the x and z bios, no luck. Now i want to RMA but I'm afraid of their discovering the bios flash =(


What are you using the card for?


----------



## AllGamer

Quote:


> Originally Posted by *nrpeyton*
> 
> I hate to say it, but SLI is a thing of the past.


Quote:


> Originally Posted by *pez*
> 
> If all you play are games that support SLI or multi-GPU, it's totally worth it, but that criteria is very important. Lots of games still support SLI, but plenty don't.


I don't know about you guys, but plenty of the games I play run fine on SLI (except a couple at most, for which I needed to create a profile to disable it). Almost all the games I play work fine on SLI, and at the same time they all support multiple screens; they're mostly sims: racing, flying, space, etc.

Other good games like the Fallout and Witcher series, one-offs like War Thunder, MechWarrior Online and StarCraft, and even games like Star Trek Online and Star Wars Battlefront play great on SLI and super-wide Surround View setups.


----------



## philhalo66

My 1080 finally showed up. It's a lot faster than I anticipated: I doubled my FPS in Ghost Recon and gained a solid 20 FPS in ARK.
https://www.techpowerup.com/gpuz/details/gmya


----------



## ZealotKi11er

Quote:


> Originally Posted by *philhalo66*
> 
> my 1080 Finally showed up. It's alot faster than i anticipated. I doubled my FPS in Ghost recon and gained a solid 20 FPS in ARK.
> https://www.techpowerup.com/gpuz/details/gmya


What did you have before?


----------



## philhalo66

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What did u have before?


A Gigabyte 1070 G1 Gaming.


----------



## ZealotKi11er

Quote:


> Originally Posted by *philhalo66*
> 
> A gigabyte 1070 G1 gaming


Really? I have not played much with mine, but in Witcher 3 I only got a 50% boost over my 290X. I was really expecting a bit more.


----------



## philhalo66

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Really. I have not played much with mine but in Witcher 3 I only got 50% boost over my 290X. Was really expecting a bit more.


Yeah, I'm pretty happy with this card. I probably should have gone for the Classified, but Newegg doesn't carry it. I don't know much about AMD cards anymore; my last one was a 4870, so I can't say if that's normal or not.


----------



## asefsef

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What are you using the card for?


For gaming and VR


----------



## ZealotKi11er

Quote:


> Originally Posted by *asefsef*
> 
> For gaming and VR


No, I mean: are you running anything in the background? Mining? Folding@home?


----------



## raisethe3

Quote:


> Originally Posted by *philhalo66*
> 
> A gigabyte 1070 G1 gaming


Could you test and compare all the games you have? I'm just curious, because right now you're the only user with actual hands-on time with both cards (GTX 1070 and GTX 1080). I know I posted in the GTX 1070 thread debating whether I should get that card or the GTX 1080, but since the GTX 1070 fits my budget, I may trrryyyy really hard to save a bit more just to get the top card, lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *raisethe3*
> 
> Could you test and compare all the games you have? I am just curious, because right now, you're the only user that has the "actual hands" on both cards (GTX 1070 and GTX 1080). I know I posted in the GTX 1070 debating whether I should get this card or the GTX 1080, but since the GTX 1070 fits my budget, I may trrryyyy really hard to save a bit more just to get the top card, lol.


You really want the GTX 1080, trust me. Wait for Vega and the price will drop more. Don't be afraid to buy used.


----------



## raisethe3

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You really want then GTX1080 trust me. Wait for Vega and price will drop more. Do no be afraid to buy used.


How do you know it will drop??


----------



## ZealotKi11er

Quote:


> Originally Posted by *raisethe3*
> 
> How do you know it will drop??


Otherwise AMD is in trouble, if they can't make it drop.

Or you can buy a Vega.


----------



## asefsef

Quote:


> Originally Posted by *ZealotKi11er*
> 
> No, I mean: are you running anything in the background? Mining? Folding@home?


Nothing is going on in the background. Do you have any suggestions for apps that could affect the clock speed when I game?

I've tried with and without Afterburner in the background, and with and without the Gaming App.


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> Could you test and compare all the games you have? I am just curious, because right now, you're the only user that has the "actual hands" on both cards (GTX 1070 and GTX 1080). I know I posted in the GTX 1070 debating whether I should get this card or the GTX 1080, but since the GTX 1070 fits my budget, I may trrryyyy really hard to save a bit more just to get the top card, lol.


I already have the 1070 packed up; I'm trading it for a 4790K. I say go for a 1080 or wait for Vega. The 1070 is a good card, but if you want to max out games it won't cut it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *asefsef*
> 
> Nothing is going on in background. Do you have any suggestions for apps that could affect the clock speed when I game?
> 
> I done: with and without afterburner in background, with and without gaming app


Well, set the card to Afterburner defaults and check with GPU-Z what clock you have during games. 4500MHz is what you see when the card is doing CUDA work. I too thought the card was only hitting 4500MHz, did the +1000MHz OC, and crashed.
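The confusion in this exchange comes from GDDR5X clock bookkeeping: Afterburner and GPU-Z report a clock that is half the effective data rate, and the card runs a lower memory clock in the compute (P2) state than in the full 3D (P0) state. A tiny sketch of the arithmetic; the P0/P2 figures below are approximate, from memory, and vary by card:

```python
# GDDR5X clock bookkeeping for a roughly stock GTX 1080 (approximate figures).
P0_MHZ = 5005   # full 3D (P0) memory clock as Afterburner reports it
P2_MHZ = 4513   # reduced CUDA/compute (P2) memory clock

def effective_rate_mhz(reported_mhz: float) -> float:
    """Afterburner's reported clock is half the effective GDDR5X data rate."""
    return reported_mhz * 2

def clock_with_offset(base_mhz: float, offset_mhz: float) -> float:
    """An Afterburner offset is added on top of the active P-state's clock."""
    return base_mhz + offset_mhz

# A +1000 MHz offset dialed in while watching the P2 clock looks like a
# modest overclock (4513 -> 5513), but once the card enters P0 the same
# offset lands on top of 5005 MHz instead:
print(clock_with_offset(P2_MHZ, 1000))   # 5513 in P2
print(clock_with_offset(P0_MHZ, 1000))   # 6005 in P0, hence the crash
print(effective_rate_mhz(P0_MHZ))        # 10010, the marketing "10 Gbps"
```

This is why "+1000MHz" judged against the 4500MHz compute clock becomes a wild overclock in games.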


----------



## asefsef

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Well set the card to Afterburner defaults. Check with GPU-Z what clock you have during games. 4500MHz is when card is doing CUDA. I too though card was only hitting 4500MHz and did the +1000MHz OC and crashed.


Wow, that's something new! Doing CUDA? Please elaborate.

With Afterburner set to default, the memory clock goes to 4504MHz running Heaven.

I can run a +1000MHz OC; the card runs Fire Strike at the resulting 5500MHz with no issues.


----------



## raisethe3

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Else AMD is in trouble if they cant make it drop
> 
> 
> 
> 
> 
> 
> 
> or *you can buy a Vega*
> 
> 
> 
> 
> 
> 
> 
> .


No thanks.
Quote:


> Originally Posted by *philhalo66*
> 
> I already have the 1070 packed up, im trading it for a 4790K. I say go for a 1080 or wait for vega. 1070 is a good card but if you want to max out games it wont cut it.


I game on a 1680x1050 though.


----------



## pez

Quote:


> Originally Posted by *AllGamer*
> 
> I don't know about you guys, but plenty of games I play runs fine on SLI (except a couple at most, which I needed to create a profile to disable SLI), almost all the games I play works fine on SLI, but at the same time all the games I play actually supports multi screens, they are mostly Sims-Racing-Flying-Space-etc.
> 
> but other good games like Fallout series and Witcher series, or those one off like War Thunder, MechWarrior Online and StarCraft, even games like Star Trek Online and Star Wars Battlefront plays great on SLI and super wide surround view setups.


I've used SLI for just about every generation for the last 3 or 4 and haven't had issues, but I get flamed every time for saying SLI isn't dead.


----------



## ZealotKi11er

Quote:


> Originally Posted by *raisethe3*
> 
> No thanks.
> I game on a 1680x1050 though.


Elaborate please.


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> No thanks.
> I game on a 1680x1050 though.


Then a 1080 is massive overkill; a 1070 will easily max out almost every game at that res.


----------



## raisethe3

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Elaborate please.


LOL, what?

I owned a 6850 in the past and had nothing but trouble. So yeah, if that's what you're asking.
Quote:


> Originally Posted by *philhalo66*
> 
> then a 1080 is massive overkill. a 1070 will easily max out almost every game at that rez.


That's what I was thinking.


----------



## ZealotKi11er

Quote:


> Originally Posted by *raisethe3*
> 
> LOL, what?
> 
> I owned a 6850 in the past, and had nothing but trouble. So yeah, if that's what you're asking.
> That's what I was thinking.


What are you upgrading from?


----------



## raisethe3

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What are you upgrading from?


An EVGA 8800GT, in the sig rig. (Come on now.)
I had the 6850, like I mentioned, but driver problems frustrated the hell out of me, so I returned it for a refund.

I know I made a thread a while back about the GTX 670 FTW I had, which I sold to a friend because he begged for it. So I dusted off this old card and put it in my rig for now. People here told me to wait for the 1000 series, which has now arrived. So yeah, I'm in the market for a new card to play the games I have sitting on my hard drives but haven't installed, because my current card obviously doesn't meet their graphics requirements. Tried to play ROT (it wouldn't let me).

Now sit back and imagine this: I had that card (EVGA) for almost 9 years (bought in 2008). So yeah, times have changed and so has the hardware.

Thank you for taking the time to reply. I really appreciate it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *raisethe3*
> 
> EVGA 8800GT in the sig rig. (Come on now)
> I had the 6850, like I mentioned, but driver problems frustrated the hell out of me, so I returned it for a refund.
> 
> I know I made a thread a while back that I had the GTX670 FTW which I sold to a friend because he begged for it. So, I dust off this old card and put it in my rig for now. People here told me to wait for the 1000 series, which it now has arrived. So yeah, I am in the market for a new card to play games which I have sitting on the hard drives, but not installing or playing it because they don't meet the graphics requirement obviously. Tried to play ROT (wouldn't let me).
> 
> Now sit back and imagine this. I had that card (EVGA) for almost 9 years (bought in 2008). So yeah, times have changed and so did the hardware.
> 
> Thank you for taking the time to reply. Really appreciate it.


If you're playing at 1680x1050, I suggest you not even spend GTX 1070 money and get a GTX 1060 instead.


----------



## raisethe3

^^Why?? It only has 6GB of VRAM. That's almost like getting a GTX 980. Unless you think it's a good idea that I get the 980 instead?


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> ^^Why?? It has only 6GB of VRAM. That's like almost getting a GTX 980, unless you think its a good idea that I get the 980 instead?


At 1680x1050 you don't need more than 6GB of VRAM, ever.


----------



## TK421

7700K CPU 4.9
x2 GPU +26/+400

System is still limited to 480w (laptop)

Still waiting for mod bios to unlock EC power limit

http://www.3dmark.com/fs/12320719


----------



## TK421

CPU @ 5

+39/+400

http://www.3dmark.com/fs/12320842


----------



## ZealotKi11er

Quote:


> Originally Posted by *raisethe3*
> 
> ^^Why?? It has only 6GB of VRAM. That's like almost getting a GTX 980, unless you think its a good idea that I get the 980 instead?


6GB is 50% more than 4GB, and the 1060 is a better buy than the GTX 980 at this point.


----------



## raisethe3

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 6GB is 50% more than 4GB. 1060 is better buy than GTX 980 at this point.


Should I think about upgrading the monitor just for the sake of getting the 1070 then? Nah? Or just stick with what I have?


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> Should I think about upgrading the monitor just for the sake of getting the 1070 then? Nah? Or just stick with what I have?


Get a 1070 first, then save up for a new monitor.

What I did was go from a GTX 580 and a 1440x900 monitor to a GTX 1070, then upgrade to 1080p, and then get a GTX 1080.


----------



## DarthBaggins

Hell, I went from an HD 7870 with my ASUS VS238H to a GTX 970, and recently upgraded to the 1080 SC. Now I'm looking into a 30-34" 1440p monitor (looking at 75-144Hz), but I'm not going G-Sync due to budget. I plan on going G-Sync once 4K is over 100Hz, and I'll upgrade GPUs again in the next 1.5-2.5 years (that seems to be my cycle).


----------



## raisethe3

Quote:


> Originally Posted by *philhalo66*
> 
> get a 1070 first then save up for a new monitor.
> 
> What i did was went from a GTX 580 and a 1440x900 monitor to a GTX 1070 then upgraded to 1080P and then got a GTX 1080.


So card first, then monitor?


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> Hell I went from a HD7870 w/ my ASUS VS238H to a GTX 970 then I recently upgraded to the 1080 SC, but now I'm looking into a 30-34" 1440p monitor (looking at 75-144hz) but not going G-Sync due to budget. I plan on going GSync once 4k is over 100hz etc and will again upgrade GPU's in the next 1.5-2.5 years (seems to be my cycle)


Man, I've got a very long history of cards, lol.

ATi Rage 128 Pro 16MB, Radeon 9200 Pro 128MB, some random Creative 3D card (pretty sure it was 32MB), an NVIDIA Riva TNT2 8MB (given to me), GeForce 6200 256MB, Radeon 9600 Pro, GeForce 8600 GT, Radeon 3870, GeForce 9800 GTX+, GeForce 8800GT, GeForce 9800GX2, another 8800GT after a dude on here scammed me and gave me a dead card, Radeon 4870, GeForce GTX 285, GeForce GTX 580, a GTX 1070, and finally my EVGA GTX 1080 FTW2.
Quote:


> Originally Posted by *raisethe3*
> 
> So card first, then monitor?


That's what I did, and I'm happy with the way it worked out.


----------



## d1sappeared

I just received my Gigabyte GTX 1080 G1 GPU and tested it in Battlefield 1 at 1440p.

I think there is something wrong with it: the average FPS was 75 at 1440p, and 80-95 at 1080p, with video settings set to ultra.

Any ideas?


----------



## philhalo66

Quote:


> Originally Posted by *d1sappeared*
> 
> i just received my gigabyte gtx 1080 g1 GPU
> 
> and i tested it in battlefield 1 with 1440p resolution
> 
> i think there is some thing wrong with it , the average FPS were 75
> 
> and 80-95 with 1080p
> 
> video settings set to ultra
> 
> any idea ??


Might be a CPU bottleneck at 1080P. What processor do you have?


----------



## d1sappeared

Impossible.

These are my specs:

https://pcpartpicker.com/list/yzmzRG


----------



## philhalo66

Quote:


> Originally Posted by *d1sappeared*
> 
> impossible
> 
> this is my spec :
> 
> https://pcpartpicker.com/list/yzmzRG


It's NOT impossible, just unlikely. I'm not sure what's going on.


----------



## d1sappeared

Quote:


> Originally Posted by *philhalo66*
> 
> it's NOT impossible just unlikely. im not sure what's going on.


I reinstalled the driver and the FPS became 90-130.

Is that good?


----------



## artemis2307

Quote:


> Originally Posted by *d1sappeared*
> 
> i reinstalled the driver and the fps become 90-130
> 
> is it good ??
> i reinstalled the driver and the fps become 90-130
> 
> is it good ??


SP or MP?
If MP, that's very good.
If SP, it's OK; my OC'd 290 got 90-105 in SP.


----------



## d1sappeared

My FPS was in multiplayer.

But I have a question: would you guys advise me to get a 1440p monitor, or will it be laggy?


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday the 17th to Wednesday the 19th, starting 12 noon EST.
Would you consider putting all that power toward a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

April 2017 Foldathon

BTW, make sure you sign up!









To get started:

1. Get a passkey (allows for the speed bonus; needs a valid email address):
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number: 37726

later
lanofsong


----------



## derpa

Folding away, though for some reason I'm having an issue signing up for the Foldathon, so I sent an email to see what's up. Probably something I did wrong, per the norm.


----------



## raisethe3

Quote:


> Originally Posted by *philhalo66*
> 
> Man i got a very long history of cards lol.
> 
> ATi Rage 128 Pro 16MB, Radeon 9200 Pro 128MB, some random creative 3d card pretty sure it was 32MB, Nvidia Riva TNT 2 8MB (was given to me), Geforce 6200 256MB, Radeon 9600 Pro, Geforce 8600 GT, Radeon 3870, Geforce 9800 GTX+, Geforce 8800GT, Geforce 9800GX2, another 8800GT after the dude on here scammed me and gave me a dead card, Radeon 4870, Geforce GTX 285, Geforce GTX 580, GTX 1070 and finally my EVGA GTX 1080 FTW2
> That's what i did and im happy with the way it worked out.


Sorry to ask, but GTX 1070 or GTX 1080? You said card first, then monitor.


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> Sorry to ask, but GTX 1070 or GTX 1080? You said card first, then monitor.


I originally bought a GTX 1070, then a few weeks later bought a 1080p monitor, and then, because I wasn't happy with how a few games were running, bought a GTX 1080. So my vote is to go with the 1070 first, then get a monitor.


----------



## DarthBaggins

I would say GTX 1080/Ti then monitor (if you're already at 1080p at least)


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> I would say GTX 1080/Ti then monitor (if you're already at 1080p at least)


IIRC he said he was running 1680x1050.


----------



## DarthBaggins

Also, if you're still running an 1155/1156 CPU, you'll see a hindrance on anything above a 9xx series GPU.


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> Also if you're still running a 1155/6 CPU you'll see a hindrance on anything above a 9** series GPU. .


It depends on what clocks he's running. My ASRock board gets here tomorrow, so I've been using an i5 3570K with my 1080 for about a week, and I'm getting almost no bottleneck; but I'm also running it at 5GHz.


----------



## DarthBaggins

Running 5GHz would be the reason. I can say that with the 4790K on a good 12-phase board you should see 5.0-5.2GHz no problem. I got mine to 5.2GHz at 1.285V on the Z97X-SOC Force under water (well, the second one, since the first one died after 6 months at 4.8-4.9GHz and 1.32V running Folding@home 24/7, CPU only).


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> Running 5GHz would be the reason, *now I can say running the 4790k on a good 12phase board you should see 5.0-5.2 GHz no problem*. I got mine to 5.2GHz 1.285 on the z97x-SOC Force under water (well the second one since the first one died after 6mo of 4.8-4.9 1.32 running Folding@home 24/7 CPU only)


0.o Seriously? I was only going to try for 4.8GHz.


----------



## raisethe3

Guys, my CPU is clocked at 4.4GHz; I haven't really tried pushing it far since I'm on air cooling. So... what do I do now?


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> Guys, my CPU is clocked at 4.4Ghz, so I haven't really tried pushing it far since I am on air cooling. So....what to do now?


Get the 1070. Everyone told me not to, but I ignored them and never regretted it.


----------



## raisethe3

^^^Your sig says GTX 1080??!


----------



## DarthBaggins

He just got the 1080, much like I got mine a little over a month ago (but jumped from a 970 to the 1080)


----------



## philhalo66

Quote:


> Originally Posted by *raisethe3*
> 
> ^^^Your sig says GTX 1080??!


I got my 1070 in November last year. It wasn't till last week that I got the 1080.


----------



## Jorginto

Guys, is my score ok?


----------



## DJ_OXyGeNe_8

Hi guys, you can add me too: EVGA GTX 1080 FTW2.

But something strange: I can't see the ASIC quality.

https://www.techpowerup.com/gpuz/details/3rcwm


----------



## Beagle Box

Quote:


> Originally Posted by *DJ_OXyGeNe_8*
> 
> ...
> But something strange, can't see ASIC quality;


Yeah, that's normal; that info isn't provided for this chip.


----------



## raisethe3

Quote:


> Originally Posted by *DarthBaggins*
> 
> He just got the 1080, much like I got mine a little over a month ago (but jumped from a 970 to the 1080)


I used to have a 970, but I sold it to a friend.

Since y'all say that my 1680x1050 monitor needs a serious upgrade to accommodate either the 1070 or 1080, what should I be looking at? Philhalo?


----------



## philhalo66

Quote:


> Originally Posted by *DJ_OXyGeNe_8*
> 
> Hi guys you can add me too; Evga GTX 1080 FTW 2
> 
> 
> 
> But something strange, can't see ASIC quality;
> 
> 
> 
> https://www.techpowerup.com/gpuz/details/3rcwm


Nice! We got the same card. It weighs a ton, though, doesn't it?
Quote:


> Originally Posted by *raisethe3*
> 
> I used to have a 970, but I sold that to my friend.
> 
> Since y'all say that my 1680x1050 monitor needs serious upgrade to accomodate either the 1070 or 1080, what should I be looking at? Philhalo?


I would say 1440p; 1080p minimum.


----------



## raisethe3

^^^Gotcha thanks!


----------



## Vitaminx

I'm currently thinking about selling my EVGA GeForce GTX 770 SC and replacing it with an EVGA GeForce GTX 1070 SC, and I was wondering how the GeForce 10 series cards handle heat. I've read some articles saying the VRMs on these cards have overheated when stressed hard enough and actually caused small fires. What exactly is EVGA's solution to this problem, and is EVGA still a recommended manufacturer after this issue was brought to light last year? I love EVGA graphics cards and have never had an issue with them; I just wanted to get others' opinions before I jump right in.


----------



## DarthBaggins

I've had zero issues with my 1080 SC, and I run it full tilt at 2050/5000 on Folding@home with a custom fan profile. I also have a waterblock to throw on it once I get the chance.


----------



## Beagle Box

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, is my score ok?


Sure.

You should submit it and compare it to the other entries in the Superposition Benchmark thread.


----------



## Tdbeisn554

Quote:


> Originally Posted by *raisethe3*
> 
> I used to have a 970, but I sold that to my friend.
> 
> Since y'all say that my 1680x1050 monitor needs serious upgrade to accomodate either the 1070 or 1080, what should I be looking at? Philhalo?


A 3440x1440 ultrawide is amazing, and I also have a GTX 1080 to drive it. Best purchase since going with a desktop, for me!


----------



## derpa

UW here as well, and loving it! I upgraded from an almost-ancient 24" 19x12 Samsung, and WOW!!! My 980 Ti wasn't QUITE able to push games fully at the upgraded resolution, hence the 1080.


----------



## DarthBaggins

I would think it should be a higher score; here's mine at 4K.


Quote:


> Originally Posted by *Jorginto*
> 
> Guys, is my score ok?


----------



## derpa

I noticed the same thing on my machine: 1080p Extreme is MUCH more taxing than 4K Optimized. The same holds true over in the Superposition Leaderboard thread.


----------



## mbm

Is there another tool for OCing the GTX 1080 besides Afterburner?
My problem with Afterburner is that it can't maintain my preset curve over time; it either raises or lowers the curve.

The preset that works best is a flat curve at 2088MHz at 0.975V. This settles at 2075MHz when gaming (if only I could put this boost in the BIOS).
But after some days the preset curve will either rise above 2100MHz or drop to 2036MHz, so I need to apply a new curve at 2088MHz again.


----------



## Beagle Box

Quote:


> Originally Posted by *mbm*
> 
> is there another tool for OC the 1080 gtx than afterburner?
> My problem with afterburner is that it cant maintain my preset curve over time. It either raise or lower the curve over time..
> 
> My preset that works the best is a straight curve at 2088 mhz at 0.975V... This will settle at 2075 mhz when gaming (If I just could put this boost in the BIOS)
> But after some days this preset curve will either raise to +2100 mhz or lower at 2036 mhz, so I need to apply a new curve at 2088 mhz again..


You are not alone.

The curve seems to:
- Distort when you enter it by hitting the [Apply] button (checkmark).
- Distort when you save it to a memory preset number (or when you retrieve it from a memory preset button; how do you tell?).
- Distort when you shut down/start the computer.

--Rant On--
I got tired of trying custom curves because of this. At first I thought my curves were degrading toward my card's base power curve, but it really looks like they're degrading away from the .012 steps toward the least-wanted .013 step. It's reading my mind and doing the opposite!

It acts like poorly implemented AI. I really don't like software telling me it knows better, _especially when it's dead wrong_.
--Rant Off--

I've noticed that the voltage steps available in the curve don't exactly match the voltage steps reported in other parts of the program, but that's probably due to the Curve Editor using industry-standard voltage step values while the reports use simple rounding.

To answer your question: yes. EVGA and ASUS have similar software for their cards, but the Afterburner Curve Editor is a great concept and, though frustrating, works best for my MSI card at this time.


----------



## mbm

Yes, the curve is great, mainly because of the undervolting. I really like that I can run my boost at only 0.975V = 2075MHz.
If I just use +100MHz it settles at 2075MHz, but at 1.050V or higher.
I really loved my old GTX 970, where I modded the boost into the BIOS.

Sadly, we are not going to see that option with Pascal.
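The flat-curve undervolt being described can be pictured as a simple transform on the voltage/frequency table: keep the stock curve below a cap voltage, and pin every point at or above the cap to the target clock, so GPU Boost never has a reason to request more voltage. A minimal sketch; the stock curve points below are invented for illustration:

```python
# Build a flat ("undervolt") V/F curve: stock behaviour below the cap voltage,
# a fixed target clock at and above it. Points are (voltage_v, clock_mhz);
# the stock values here are invented, not read from any real card.
stock_curve = [(0.800, 1700), (0.900, 1900), (0.975, 2000), (1.050, 2100)]

def flatten(curve, cap_v, target_mhz):
    """Pin every point at or above cap_v to target_mhz; leave the rest alone."""
    return [(v, mhz if v < cap_v else target_mhz) for v, mhz in curve]

flat = flatten(stock_curve, cap_v=0.975, target_mhz=2088)
print(flat)
# Every point from 0.975 V upward now requests 2088 MHz, so boosting past
# the cap voltage gains nothing and the card stays at the undervolted point.
```

This is exactly what dragging the Afterburner curve flat does by hand; the drift complaints above are about the editor failing to preserve that shape.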


----------



## DarthBaggins

So far I've stuck with Precision X to OC my card and it's been rock solid.


----------



## derpa

Quote:


> Originally Posted by *DarthBaggins*
> 
> So far I've stuck with Precision X to OC my card and it's been rock solid.


Same here. On my 980Ti, I used AB to OC with good success, but once I got the 1080, I switched back to Precision, and have had better luck with it. I haven't messed with the curves at all yet; just the sliders and fan profile.


----------



## wholeeo

Can someone let me know what bios is best for a watercooled FE? Thread is too large to look through.


----------



## philhalo66

Here is some interesting news: EVGA released a BIOS update for the 1080 SC2 and FTW2 that overclocks the card's memory to match their new faster-memory cards. I tried it and actually gained a solid 5% in the new Superposition benchmark.

Here is the link to the official post from EVGA: https://forums.evga.com/EVGA-GeForce-GTX-1080-FTW2-and-SC2-OPTIONAL-11GHz-BIOS-Update-Available-Now-m2652350.aspx


----------



## Bal3Wolf

I just put the 6686 BIOS on my 1080 Hydro Copper and it seems to work fine; my card would already do 5600MHz on the memory, though. The crazy part is that my core is overclocking better on this BIOS than it ever did on stock. I got my highest 3DMark score ever.


----------



## mbm

Quote:


> Here is the link to the official post from EVGA https://forums.evga.com/EVGA-GeForce-GTX-1080-FTW2-and-SC2-OPTIONAL-11GHz-BIOS-Update-Available-Now-m2652350.aspx


I see it is for the ICX cards.
So not for my older ACX card?


----------



## wholeeo

Quote:


> Originally Posted by *Bal3Wolf*
> 
> i just put the 6686 on my 1080 hydrocopper and seems to work fine my card would already do 5600mhz on the memory tho, crazy part my core is overclocking better on this bios then it ever did on the stock got my highest 3dmark score ever.


How were you able to extract the bios from the updater? It wouldn't let me install on my FE because of the ID mismatch I'm assuming.


----------



## derpa

Welp....it was awesome while it lasted, but alas, I'm headed out, lol


----------



## Bal3Wolf

Quote:


> Originally Posted by *wholeeo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bal3Wolf*
> 
> i just put the 6686 on my 1080 hydrocopper and seems to work fine my card would already do 5600mhz on the memory tho, crazy part my core is overclocking better on this bios then it ever did on the stock got my highest 3dmark score ever.
> 
> 
> 
> How were you able to extract the bios from the updater? It wouldn't let me install on my FE because of the ID mismatch I'm assuming.
Click to expand...

It flashed for me no problem, maybe because I have an EVGA card.


----------



## wholeeo

Quote:


> Originally Posted by *Bal3Wolf*
> 
> it flashed for me no problem maybe cause i have a EVGA card.


Yeah, that's most likely it. Do you mind backing up the bios and sharing it via PM?


----------



## Bal3Wolf

not a problem just backed it up with gpuz.


----------



## Spartoi

I'm trying to undervolt my GPU via Afterburner to 1.063V @ 2153MHz, but when temps exceed 60C my GPU downclocks to the 196MHz @ 1.025V range. Here is my frequency curve.



What am I doing wrong? I have the power and temp limits at 120 and 92 respectively (un-linked).

Also, I've tried to OC to 2200MHz using the curve with 1.1V, and instead my GPU locks to 2000MHz. Why is this?


----------



## philhalo66

Quote:


> Originally Posted by *Spartoi*
> 
> I'm trying to undervolt my GPU via Afterburner to 1.063V @ 2153Mhz but when temps exceed 60C, my GPU downclocks to 196Mhz @ 1.025V range. Here is my frequency curve.
> 
> 
> 
> What am I doing wrong? I have power and temp limit at 120 and 92 respectively (un-linked).
> 
> Also, I've tried to OC to 2200Mhz using the curve but with 1.1V and instead my GPU locks to 2000Mhz. Why is this?


That's normal; that would be GPU Boost 3.0 throttling you. Mine does it too unless I crank up the fans to stay under 63C.


----------



## Spartoi

Quote:


> Originally Posted by *philhalo66*
> 
> thats normal, That would be GPU Boost 3.0 throttling you. mine does it too unless i crank up the fans to stay under 63C


Why does it throttle at 63C? That's a lot of headroom from the 92C temp limit I set. Is there any way to change the throttle range of GPU Boost 3.0? If not, is there another way to undervolt and OC without the downclocking and downvolting?


----------



## Beagle Box

Quote:


> Originally Posted by *Spartoi*
> 
> I'm trying to undervolt my GPU via Afterburner to 1.063V @ 2153Mhz but when temps exceed 60C, my GPU downclocks to 196Mhz @ 1.025V range. Here is my frequency curve.
> 
> 
> 
> What am I doing wrong? I have power and temp limit at 120 and 92 respectively (un-linked).
> 
> Also, I've tried to OC to 2200Mhz using the curve but with 1.1V and instead my GPU locks to 2000Mhz. Why is this?


Boost 3.0 assumes the power curve in the graph is relatively close to the max voltage/speed your card can actually achieve within a standard temperature range. When you move your values too far off your realistic performance path, Boost 3.0 takes you at your word and, believing it's taking prudent action, downclocks your card at (Temp+X) points across the curve. It's going to do this even if the card could easily achieve that speed on the stock curve. These are soft points, and they _act_ as though they are based not on temperature, but on temperature _change_.

In my experience, if you make your jump from normal curve to max curve too great or too small, your performance really takes a hit.

Have you ever run a benchmark like Heaven in an open window with the Afterburner power curve graph and the hardware monitor enabled at the same time? You can watch the algorithm at work as Boost attempts to find an acceptable voltage/temp/MHz point at which to run. You can modify your curve in real time if you're careful and, like changing a golf swing, the result is often the exact opposite of what you intended. I think you will find experimenting in that environment most instructive.

Remember: Some cards cannot do high speeds. And you can reach a point where higher speed does not = higher fps.
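The stair-step behavior described above can be modeled as the clock shedding one boost bin each time the temperature crosses a threshold. The thresholds and step size below are illustrative placeholders, not NVIDIA's actual table, which varies per card and BIOS:

```python
# Toy model of GPU Boost 3.0 temperature binning: the effective clock steps
# down as the core temperature crosses successive thresholds. All numbers
# here are hypothetical, chosen only to show the shape of the behavior.

THRESHOLDS_C = [38, 46, 54, 63, 72, 80]  # hypothetical temperature bins
STEP_MHZ = 13                            # hypothetical drop per bin crossed

def effective_clock(requested_mhz, temp_c):
    bins_crossed = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return requested_mhz - STEP_MHZ * bins_crossed

print(effective_clock(2100, 35))  # 2100: a cool card holds the requested clock
print(effective_clock(2100, 65))  # 2048: four bins crossed, 4 * 13 MHz shed
```

This matches the observations earlier in the thread that cards hold their set clock only while kept under roughly 63C.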


----------



## philhalo66

Quote:


> Originally Posted by *Spartoi*
> 
> Why does it throttle at 63C? That's a lot of headroom from the 92C temp limit I set. Anyway to change the throttle range of GPU Boost 3.0? If not, is there another to undervolt and OC while keeping downclocking and downvolting?


I'm not sure but i experienced this on my 1070 and my 1080.


----------



## ALVARIX

Hi guys, does someone have the BIOS of the MSI GTX 1080 Gaming X (OC mode by default)? I already tried searching and can't find it anywhere.

The bios in question is this: https://www.techpowerup.com/223571/techpowerup-impact-msi-issues-oc-mode-by-default-bioses

Thanks in advance


----------



## shilka

Spoiler: Finally had time to install my new EVGA GTX 1080 FTW2 that i got yesterday











And yes, I know the machine is a bit dusty, so you don't need to tell me!


Really hope this card actually works unlike my Gigabyte GTX 1070 Xtreme Gaming which was broken.


----------



## derpa

Nice!


----------



## shilka

Have not had time to really test it yet, but the few tests I already ran show numbers about 30% higher than my broken GTX 1070 at 1440p.
Going to run all the benchmarks I have once I get a bigger SSD.


----------



## Beagle Box

Quote:


> Originally Posted by *ALVARIX*
> 
> Hi guys, Someone has the bios of MSI GTX 1080 gaming x (OC mode by default), already tried to search and can not find anywhere.
> 
> The bios in question is this: https://www.techpowerup.com/223571/techpowerup-impact-msi-issues-oc-mode-by-default-bioses
> 
> Thanks in advance


I have an MSI GTX 1080 Gaming X and wanted to do the same thing. I was never able to find that BIOS, though I suppose it exists in my card (and yours) as the unobtainable OC BIOS. If your stock Gaming X BIOS shows a Default Clock of 1683 in GPU-Z, you're running the original gaming BIOS mentioned in the article. If it's 1709, your card is already upgraded and the article is irrelevant. There are a number of BIOSes available with Default Clocks above that, which means they're more aggressive than either of those:

Tech Power Up MSI GTX 1080 Gaming X Bios List

If you want to upgrade your BIOS, research those in that list for your desired properties. Default clock, memory speeds, max power, etc. I eventually upgraded mine to a Gaming Z OC BIOS.









*Use GPU-Z to back your BIOS up and store it in a safe place before doing any flashing!*

Good luck.
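The Default Clock check described above boils down to a tiny lookup. The two clock values come straight from the post; the helper name is mine and anything outside those two values is just treated as "something else":

```python
# Classify an MSI GTX 1080 Gaming X BIOS from the Default Clock GPU-Z reports,
# per the check in the post above. 1683 MHz = original Gaming BIOS,
# 1709 MHz = the OC-mode-by-default update; anything else is some other BIOS.

def classify_gaming_x_bios(default_clock_mhz):
    if default_clock_mhz == 1683:
        return "original Gaming BIOS"
    if default_clock_mhz == 1709:
        return "OC-mode-by-default BIOS"
    return "other BIOS (check the TechPowerUp list)"

print(classify_gaming_x_bios(1709))  # OC-mode-by-default BIOS
```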


----------



## DarthBaggins

Finally got my SC under water last night, and it's cruising beautifully at 2100/4568, crushing projects on Folding@Home.


----------



## philhalo66

Quote:


> Originally Posted by *DarthBaggins*
> 
> Finally got my SC under water last night, and it's cruising beautifully at 2100/4568, crushing projects on Folding@Home.


You couldn't keep it at 2100 without water? Maybe I got lucky; if I set mine to 2100 it stays at 2100 no matter the temp or load.


----------



## DarthBaggins

I would get to 2088 on air, I didn't really mess with the voltage since I didn't want to kill it (never got the thermal pad update on this one - ACX version SC not ICX SC2).


----------



## nrpeyton

You won't kill a card staying within the voltage slider's range in any of the major overclocking apps (MSI Afterburner / EVGA Precision X).

The limit is 1.093V (with the voltage slider at 100%) and 1.043V (with the slider at 0%, i.e. default).

An extra ~50 millivolts is quite safe, even for long-term use.
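The headroom between those two caps, in millivolts, is a one-line subtraction:

```python
# Headroom between the default voltage cap and the 100%-slider cap quoted
# above, expressed in millivolts.

default_cap_v = 1.043  # slider at 0% (default)
max_cap_v = 1.093      # slider at 100%

headroom_mv = round((max_cap_v - default_cap_v) * 1000)
print(headroom_mv)  # 50
```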


----------



## DarthBaggins

In the case of the card series I have, there was a greater chance of an issue happening due to heat, etc. Guess people forgot about the SCs & FTWs catching on fire.


----------



## nrpeyton

That issue wasn't related to the thermal mod issue (two different issues). The press made an awful mess of that and merged two separate issues into one.

The catching fire was the usual ~1% manufacturing flaw (which you could have with any manufacturer).

The thermal mod issue was also blown totally out of proportion because of it, which wasn't fair on EVGA at all _(or on some of its worried customers)_.


----------



## philhalo66

These are my settings; temps seem to hover around 66-67C and I'm happy with it. Plus, if something happens, EVGA has the best customer service I've ever dealt with, and I got the 7-year extended warranty, so I'm good for 10 years.


----------



## nrpeyton

Quote:


> Originally Posted by *philhalo66*
> 
> EVGA has the best customer service i ever dealt with and i got the 7 year extended warranty so im good for 10 years.


Agreed


----------



## ALVARIX

Quote:


> Originally Posted by *Beagle Box*
> 
> I have a MSI GTX 1080 Gaming X and wanted to do the same thing. I was never able to find that BIOS, though I suppose it exists in my card (and yours) as the unobtainable OC BIOS. If your stock Gaming X BIOS shows a Default Clock of 1683 in GPU-Z, you're running the original gaming BIOS mentioned in the article. If it's 1709, your card is already upgraded and the article is irrelevant. There are a number of BIOSs available with Default clocks above that which means they're more aggressive than both those BIOSs:
> 
> Tech Power Up MSI GTX 1080 Gaming X Bios List
> 
> If you want to upgrade your BIOS, research those in that list for your desired properties. Default clock, memory speeds, max power, etc. I eventually upgraded mine to a Gaming Z OC BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Use GPU-Z to back you BIOS up and store it in a safe place before doing any flashing!*
> 
> Good luck.


Hi, thanks for your answer. My BIOS is this: https://www.techpowerup.com/vgabios/183841/msi-gtx1080-8192-160527

However, there is a newer version; is it safe to flash this BIOS?
Version 1709: https://www.techpowerup.com/vgabios/184799/msi-gtx1080-8192-160606

I do not want to flash the Gaming Z BIOS because my first graphics card burned after 3 days of use. It could have been a defect in that card, but I do not want to risk it again.


----------



## Beagle Box

Quote:


> Originally Posted by *ALVARIX*
> 
> Hi thanks for your answer, my bios is this: https://www.techpowerup.com/vgabios/183841/msi-gtx1080-8192-160527
> 
> But however is there a newer version, is it safe to flash this bios?
> Version 1709: https://www.techpowerup.com/vgabios/184799/msi-gtx1080-8192-160606
> 
> I do not want to flash the gaming z bios because my first graphics card burned after 3 days of use , it could be a graphic card defect but I do not want to risk it again.


Looks like that will work just fine. Not telling you to do it, mind you. Even flashing the world's most perfect BIOS can go utterly wrong. You, like everyone, flash at your own risk.









If your last card burned? Whichever BIOS you choose, remember that when pushed these cards generate huge amounts of heat. Always create a more aggressive custom fan curve, and if you plan to overclock at all, add more airflow, both card and case fans, especially while gaming or benching. On the MSI, the fans are still pretty quiet up to 85%.

I've run the Gaming Z BIOS since I got the card. Never a problem.


----------



## ALVARIX

Quote:


> Originally Posted by *Beagle Box*
> 
> Looks like that will work just fine. Not telling you to do it, mind you. Even flashing the world's most perfect BIOS can go utterly wrong. You, like everyone, flash at your own risk.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If your last card burned? Whichever BIOS you choose, remember that when pushed these cards generate huge amounts of heat. Always create a more aggressive custom curve and if you plan to overclock it at all, add more fan - card and case, especially while gaming or benching. On the MSI, they'll still be pretty quiet up to 85%.
> 
> I've run the Gaming Z BIOS since I got the card. Never a problem.


On the other graphics card I had flashed the BIOS that is on page 477. At the time it crashed I was just browsing Google; suddenly I had a bluescreen and the card no longer started. I asked for a replacement; the graphics card probably had a manufacturing defect.

I will try the 1709 version and report back whether it was stable. Now, which version of nvflash do you recommend for flashing the BIOS?

Thanks again for your help.


----------



## Beagle Box

Quote:


> Originally Posted by *ALVARIX*
> 
> On the other graphics card I had flashed the bios that is on page 477, and by the time it crashed I was just browsing Google and suddenly had a bluescreen and no longer started. After I asked for a replacement of the graphics card, the graphics card probably had a manufacturing defect.
> 
> I will try the 1709 version and then say if it was stable, now what is the version of nvflash that you recommend to flash the bios?
> 
> Thanks again for your availability.


Actually, I'm pretty certain I used the version found in the zip file on _page_ 477. I run Windows 10 64-bit. It worked.


----------



## Roy360

Q: I have a Founders Edition 1080 card that's running stock, and while gaming MSI Afterburner reports 74C temps.

My computer is watercooled, so there's little to no airflow aside from the single 120mm fan on the side.

Is this anything to be concerned about?

What's a cheap waterblock i could install on this card?

TIA


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roy360*
> 
> Q: I have a founder editions 1080 card that's running stock, and while gaming MSI Afterburner reports 74 Celsius temps.
> 
> My computer is watercooled, so there's little to no airflow aside from the single 120mm fan on the side.
> 
> Is this anything to be concerned about?
> 
> What's a cheap waterblock i could install on this card?
> 
> TIA


Completely fine. 74C is very safe. The FE does not need much airflow.


----------



## Beagle Box

Quote:


> Originally Posted by *Roy360*
> 
> Q: I have a founder editions 1080 card that's running stock, and while gaming MSI Afterburner reports 74 Celsius temps.
> 
> My computer is watercooled, so there's little to no airflow aside from the single 120mm fan on the side.
> 
> Is this anything to be concerned about?
> 
> What's a cheap waterblock i could install on this card?
> 
> TIA


No cheap water block as far as I know. $100-150. 74C isn't hot, but it's warm and it's hurting your performance. Good performance is below 68C. Truly great performance below 62C. If your side fan can blow directly into the card's intake, you can see some improvement. Laugh if you want, but I do it on my MSI GTX 1080 Gaming X and it's good for 4C.


----------



## philhalo66

Quote:


> Originally Posted by *Roy360*
> 
> Q: I have a founder editions 1080 card that's running stock, and while gaming MSI Afterburner reports 74 Celsius temps.
> 
> My computer is watercooled, so there's little to no airflow aside from the single 120mm fan on the side.
> 
> Is this anything to be concerned about?
> 
> What's a cheap waterblock i could install on this card?
> 
> TIA


You're perfectly safe; 74C isn't hot at all. Most FE cards load in the mid 80s. I don't know of any cheap waterblock, but EK has a couple for around $140. Tempted to grab one myself.


----------



## raisethe3

Quote:


> Originally Posted by *shilka*
> 
> 
> 
> Spoiler: Finally had time to install my new EVGA GTX 1080 FTW2 that i got yesterday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes i know the the machine is a bit dusty so you dont need to tell me i know!
> 
> 
> Really hope this card actually works unlike my Gigabyte GTX 1070 Xtreme Gaming which was broken.


Congrats on your new card. Hope everything works out for you this time!


----------



## koven

Just got a BNIB Seahawk EK X off eBay for under $500, cheaper than buying an EK block + card, so I went for it.

Love how quiet it is, idling mid 20s and mid 30s under load. Using it w/ the PG348Q. Will probably do a conservative OC since 3440x1440 is so demanding.


----------



## ALVARIX

It looks like the BIOS here ( https://www.techpowerup.com/vgabios/184799/msi-gtx1080-8192-160606 ) is exactly the one I was looking for from this article ( https://www.techpowerup.com/223571/techpowerup-impact-msi-issues-oc-mode-by-default-bioses ).

I have already flashed without any problems.


Thanks to @Beagle Box for your help.


----------



## Beagle Box

Quote:


> Originally Posted by *ALVARIX*
> 
> It looks like the bios that is here ( https://www.techpowerup.com/vgabios/184799/msi-gtx1080-8192-160606 ) is exactly what I was looking for in this article ( https://www.techpowerup.com/223571/techpowerup-impact-msi-issues-oc-mode-by-default-bioses).
> 
> I have already flashed without any problems.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Thanks to @Beagle Box for your help.











You're welcome.


----------



## broodro0ster

I waited for the price drop of the GTX 1080 and bought one two weeks ago (Asus Strix non-OC version), but I have a question about the memory clocks.

I OC'ed mine to +225MHz on the core and +525MHz on the memory. When I benchmark in Unigine Heaven I'm getting 11080MHz on the memory, but when playing GTA V or Titanfall 2 it downclocks the memory to 10080MHz. When I unload my overclock, the memory goes down to 9GHz when gaming.
Is this because not all the bandwidth is used? I thought only the core downclocked when hitting temperature/power limits.

My GPU core nicely starts out at 2140MHz and stays between 2105-2025MHz when warmed up (67-69°C during full load, slightly lower when gaming with the fans on auto), so I think that's a nice OC, but I'm not sure about the memory.
When benching in Unigine Heaven, my score increases until I go past +525MHz on the memory. At +550MHz I'm getting the same score as at +525MHz, and at +600MHz my score starts dropping, so I'm guessing +525MHz is the optimal point.

I went from 2600-2700 points to 3104 in Unigine Heaven after my OC (1920x1080, ultra, 8x AA with a stock i5 4960), so that's a pretty nice card imo. But I'm not sure why my memory downclocks by 1GHz while gaming.
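That plateau-then-drop pattern on the memory offset is the usual way people find the sweet spot: bench each step and keep the lowest offset that still produced the best score (past it, error-correction retries eat the gains). A small sketch of that selection, with hypothetical scores:

```python
# Pick the optimal memory offset from a series of benchmark runs: the lowest
# offset achieving the highest score. The (offset, score) pairs below are
# hypothetical, shaped like the plateau-then-drop described in the post above.

def best_offset(results):
    """results: list of (offset_mhz, score) in increasing offset order."""
    best = results[0]
    for offset, score in results[1:]:
        if score > best[1]:  # strict '>' keeps the lowest offset on a tie
            best = (offset, score)
    return best[0]

runs = [(450, 3060), (500, 3090), (525, 3104), (550, 3104), (600, 3070)]
print(best_offset(runs))  # 525
```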


----------



## Roy360

Quote:


> Originally Posted by *philhalo66*
> 
> your perfectly safe 74C isn't hot at all, most FE cards load in the mid 80's i dont know of any cheap waterblock but EK has a couple for around $140. Tempted to grab one myself.


After doing a bit of testing, it actually goes to 88 degrees.

I wonder if this would work: http://m.ebay.ca/itm/Water-Block-High-Pressure-Pump-For-GTX1080-1070-VGA-Card-GPU-For-Nvidia-ATI-/172568273486

Waterblock + pump for $50


----------



## steeludder

Hello fellas, it's been a while.

I've been away from benching for a while... but the urge is catching up!

So since there isn't any bios editor for Pascal, I figured my best bet would be to try to flash the Asus Strix T4 bios onto my FE card.

I power modded the card with liquid metal. It's water chilled (Hailea HC-500A), so I can get water temps under 10C and bench around 2200-2215 fairly easily. But that 1.093V limit is annoying me...

Have there been reports of bricked FEs after flashing to the T4 BIOS? I haven't seen any; it seems to have a good success rate... (famous last words)









Any last minute warnings?


----------



## x7007

Why are there 2 models of the GTX 1080 Ti? Which one would I want to get?

Notice the ROPs/TMUs.


----------



## philhalo66

Quote:


> Originally Posted by *Roy360*
> 
> After doing a bit of testing, it actually goes to 88 degrees.
> 
> I wonder if this would work: http://m.ebay.ca/itm/Water-Block-High-Pressure-Pump-For-GTX1080-1070-VGA-Card-GPU-For-Nvidia-ATI-/172568273486
> 
> Waterblock + pump for 50$


Honestly, I wouldn't risk it. Think about it this way: your card was probably close to 500 dollars, give or take, right? Do you really feel safe cheaping out on something that, if it turned out to be junk, would kill your card? I wouldn't even touch that; it has nothing to keep the memory or power delivery cool except a fan, and that's not nearly enough. IMO go for a full-cover block or don't bother. 88C is perfectly fine for a blower-style card; I still have a GTX 580 that ran at 95C+ daily for 6 years.


----------



## Roy360

Quote:


> Originally Posted by *philhalo66*
> 
> Honestly, i wouldn't risk it, think about it this way your card was probably close to 500 dollars give or take right? Do you really feel safe cheaping out on something that if turned out to be junk would kill your card? I wouldnt even touch that, it has nothing to keep the memory or power delivery cool except a fan and that's not nearly enough. IMO go for full cover block or don't bother.


I wouldn't actually use that









I was considering either
1) hybrid setup, where I replace the heatsink with a 45mm waterblock

2) leave as is

3) remove side panel and replace with box fan (this way I keep mobo mosfets cool)


----------



## philhalo66

Quote:


> Originally Posted by *Roy360*
> 
> How do you feel about hybrid setups? There are a few people in this thread that posted pictures of them replacing the heatsink with a 45mm waterblock and keeping the stock VRAM fan.
> 
> At this point I'm either going to go that route, or place a large box fan beside my computer (with the side door open)


I'm very wary of those. I have seen first-hand too many times what can happen if you do not cool VRMs properly, especially on reference cards. I personally would never give them a chance; it's not worth the time or money if it died from VRM failure. With that said, I am very, very cautious about power delivery cooling; the primary driving force for me getting the ICX 1080 was so I could keep an eye on the VRM and memory temps, so take this with a grain of salt. Why not just crank up the GPU fans using MSI Afterburner if you're really worried about the heat?

Quote:


> Originally Posted by *x7007*
> 
> Why there are 2 models of 1080GTX TI ? which one would I want to get ?
> 
> notice the ROPs/TMUs


Holy crap, the texture fill rate on the top one is nearly double what my 1080 does o.0


----------



## TK421

Does anyone know how to get that program to display the TDP of the vBIOS?

http://www.expreview.com/53224-all.html


----------



## GRABibus

Hi,
I wanted to share my best overclock "Games + benchmarks stable" on my Gigabyte GTX 1080 Xtreme Gaming WATERFORCE 8G :

*Core => 2228MHz
Memory => 11088MHz
Vddc = 1.181V*

Bios => ASUS Strix OC T4 (Version 86.04.17.00.76) => No more power limit, voltage adjustable up to 1.2V
+240MHz on core clock (Base clock at 1607MHz) and tweaking of the curve to have 2228MHz at 1.181V.
+595MHz on memory => 11088MHz memory frequency.
Fan speed of the AIO WC of the card set to 70%

Results :
- The card never downclocks, neither in games nor in benchmarks => Stable core frequency at 2228MHz and stable memory frequency at 11088MHz
- Maximum core temperature around 53°C at 20°C ambient (Depending on the game)

http://www.casimages.com/img.php?i=17042601192417369815000488.png

Maximum bench core frequency 2253MHz at 1.2V !
Graphics score => 8629pts

http://www.3dmark.com/3dm/19462559


----------



## Beagle Box

Quote:


> Originally Posted by *GRABibus*
> 
> Hi,
> I wanted to share my best overclock "Games + benchmarks stable" on my Gigabyte GTX 1080 Xtreme Gaming WATERFORCE 8G :
> 
> *Core => 2228MHz
> Memory => 11088MHz
> Vddc = 1.181V*
> 
> Bios => ASUS Strix OC t4 ( Version 86.04.17.00.76) => No more power limit, voltage manageable until 1,2V
> +240MHz on core clock (Base clock at 1607MHz) and tweaking of the curve to have 2228MHz at 1.181V.
> +595MHz on memory => 11088MHz memory frequency.
> Fan speed of the AIO WC of the card set to 70%
> 
> Results :
> - The card never downclocks, neither in games nor in benchmarks => Stable core frequency at 2228MHz and stable memory frequency at 11088MHz
> - Maximum core temperature around 53°C at 20°C ambient (Depending on the game)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.casimages.com/img.php?i=17042601192417369815000488.png
> 
> 
> 
> Maximum bench core frequency 2253MHz at 1.2V !
> Graphics score => 8629pts
> 
> http://www.3dmark.com/3dm/19462559


Wow!








You've really got that card tuned, my friend! Running 2228 solid is just nuts!
I give up. You win.









Where do I go to surrender?


----------



## jprovido

just got my nvidia sli bridge. looks dope


----------



## Motley01

I just got the Asus Strix 1080, for my new Ryzen build.

Can't get it a lick past 2040MHz on the core, and 1321 on the memory. Even when I try just 10MHz faster it crashes.

But I got a decent 3dmark score: 7918 http://www.3dmark.com/spy/1638278


----------



## jprovido

Quote:


> Originally Posted by *Motley01*
> 
> I just got the Asus Strix 1080, for my new Ryzen build.
> 
> Can't get it a lick past 2040mhz on the core, and memory 1321. Even when I try just 10mhz faster it crashes.
> 
> But I got a decent 3dmark score: 7918 http://www.3dmark.com/spy/1638278


I've been very unlucky with GTX 1080s; I've had 4 so far (Strix, two FEs, then a Gigabyte with the blower-style cooler). What's funny is that the cheapest card, without a backplate and with a crappy cooler, the Gigabyte Turbo GTX 1080 in my Ryzen build, is the best overclocker. It gets 2120MHz at stock voltage, smh. My other cards barely hit 2050MHz with max voltage.


----------



## steeludder

Quote:


> Originally Posted by *steeludder*
> 
> Hello fellas, it's been a while.
> 
> I've been away from benching for a while... but the urge is catching up!
> 
> So since there isn't any bios editor for Pascal, I figured my best bet would be to try to flash the Asus Strix T4 bios onto my FE card.
> 
> I power modded the card with liquid metal. It's water chilled (Hailea HC-500A), so I can get water temps under 10C and bench around 2200-2215 fairly easily. But that 1.093V limit is annoying me...
> 
> Has there been reports of bricked FE's after flashing to the T4 bios? I haven't seen any, it seems to have a good success rate... (famous last words)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any last minute warnings?


Well, thanks for all the warnings!









Finally flashed the Asus Strix T4 bios onto my FE card. It opens up max core voltage to 1.2V (vs 1.093 previously).

So without any further ado, I shall retake the crown of the absolute fastest 5960x + GTX1080 Time Spy Score on Futuremark with this:

*=== 8 9 2 1 ===*

5960x @ 4600MHz
GTX 1080 @ 2303MHz Core / 5556MHz Mem
Water chilled by Hailea HC-500A (Water temps between 7.3-7.9C during benchmark run)

Line up here for autographs.

EDIT: and just for the lulz, here's the global HWBOT rankings page for all Time Spy scores with the 5960x and GTX 1080: Looky Looky! (yeah, that dragonmike dude, that's me)


----------



## Beagle Box

Quote:


> Originally Posted by *GRABibus*
> 
> Hi,
> I wanted to share my best overclock "Games + benchmarks stable" on my Gigabyte GTX 1080 Xtreme Gaming WATERFORCE 8G :
> 
> *Core => 2228MHz
> Memory => 11088MHz
> Vddc = 1.181V*
> 
> Bios => ASUS Strix OC t4 ( Version 86.04.17.00.76) => No more power limit, voltage manageable until 1,2V
> +240MHz on core clock (Base clock at 1607MHz) and tweaking of the curve to have 2228MHz at 1.181V.
> +595MHz on memory => 11088MHz memory frequency.
> Fan speed of the AIO WC of the card set to 70%
> 
> Results :
> - The card never downclocks, neither in games nor in benchmarks => Stable core frequency at 2228MHz and stable memory frequency at 11088MHz
> - Maximum core temperature around 53°C at 20°C ambient (Depending on the game)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.casimages.com/img.php?i=17042601192417369815000488.png
> 
> 
> 
> Maximum bench core frequency 2253MHz at 1.2V !
> Graphics score => 8629pts
> 
> http://www.3dmark.com/3dm/19462559


What's the speed at 1.075V on your power curve?
Do you change it for different benchmarks? My best curve differs per benchmark.


----------



## GRABibus

Quote:


> Originally Posted by *Beagle Box*
> 
> What's the speed at 1.075V on your power curve?
> Do you change it for different benchmarks? My best curve differs per benchmark.


No, I don't change it.
I apply +240MHz in MSI AB on the core frequency.
Then I tweak the curve, as you can see, for 2228MHz at 1.181V.

If I set +245MHz, I crash or artifact in Time Spy.

+240MHz on the core is the absolute maximum I can add without problems.

Concerning your question about 1.075V, I think it is 2138MHz or 2151MHz.
Sorry, I am not at home and can't check precisely.

Your curve is much better, if I remember.


----------



## noles1983

nm


----------



## Beagle Box

Quote:


> Originally Posted by *GRABibus*
> 
> No, i don't change.
> I apply 240MHz in MSI AB on core frequency.
> Then, i tweak the curve as you can see for 2228MHZ at 1.181V.
> 
> If I set +245MHz, I crahs or artefact in timee Spy.
> 
> +240MHz on core is the absolute maximum I can add without problems.
> 
> concerning your question for 1.075V, I think it is 2138MHz or 2151MHz.
> Sorry, I am not at home and can't checj precisely.
> 
> Your curve is much more better if I remember.


My curve is based on the standard power curve. I raise the core curve unaltered until 1.075V = 2151MHz [+77MHz in my BIOS]. If 1.075V is also 2151MHz on your curve, they are practically identical until I spike mine up to 2227MHz. My card can't hold 2227 for long, but it's the only way it will run @2214MHz.

I can't run a spiked curve in Time Spy. I have to run a rounded curve with a max of 2176. I don't know why.


----------



## GRABibus

Quote:


> Originally Posted by *Beagle Box*
> 
> My curve is based on the standard power curve. I raise the core curve unaltered until 1.075V = 2151MHz [+77MHz in my BIOS]. If 1.075V is also 2151MHz on your curve, they are practically identical until I spike mine up to 2227MHz. My card can't hold 2227 for long, but it's the only way it will run @2214MHz.
> 
> I can't run a spiked curve in Time Spy. I have to run a rounded curve with a max of 2176. I don't know why.


I don't know which curve I posted above.
It is different from the one I checked some minutes ago









Here is the good one:

http://www.casimages.com/img.php?i=17042609421217369815002598.png

The flat curve starts at 2038MHz at 1.043V until 1.175V


----------



## GRABibus

Quote:


> Originally Posted by *Beagle Box*
> 
> My curve is based on the standard power curve. I raise the core curve unaltered until 1.075V = 2151MHz [+77MHz in my BIOS]. If 1.075V is also 2151MHz on your curve, they are practically identical until I spike mine up to 2227MHz. My card can't hold 2227 for long, but it's the only way it will run @2214MHz.
> 
> I can't run a spiked curve in Time Spy. I have to run a rounded curve with a max of 2176. I don't know why.


I wonder if I should try another GTX 1080 Ti, or get a second Gigabyte GTX 1080 Xtreme Gaming WATERFORCE to make an SLI setup.
With this OC Strix T4 BIOS, and as the card is on water, it helps raise frequency and reach nice overclocks even in SLI.
The silicon lottery with the 1080 Ti seems so boring....


----------



## Beagle Box

Quote:


> Originally Posted by *GRABibus*
> 
> I wonder if I should try another GTX 1080 Ti, or get a second Gigabyte GTX 1080 Xtreme Gaming WATERFORCE to make an SLI setup.
> With this OC Strix T4 BIOS, and as the card is on water, it helps raise frequency and reach nice overclocks even in SLI.
> The silicon lottery with the 1080 Ti seems so boring....


That's a good question.








Two 1080s will benchmark better and will be more overclockable, but I'm not certain you'll get better real-world performance from two 1080s than from one good Ti. What had you planned to do with your current 1080 when you bought the Seahawk? If you planned to keep it for another PC or sell it, you should probably try another 1080 Ti. If you had no set plans for it, a second, identical 1080 would be fun to experiment with.

It's not a bad '_problem_' to have...


----------



## GRABibus

Quote:


> Originally Posted by *Beagle Box*
> 
> That's a good question.
> 
> 
> 
> 
> 
> 
> 
> 
> Two 1080s will benchmark better and will be more overclockable, but I'm not certain you'll get better real-world performance from two 1080s than from one good Ti. What had you planned to do with your current 1080 when you bought the Seahawk? If you planned to keep it for another PC or sell it, you should probably try another 1080 Ti. If you had no set plans for it, a second, identical 1080 would be fun to experiment with.
> 
> It's not a bad '_problem_' to have...


Yeah, I had planned to resell my 1080.
Now the question is whether I should get a 1080 Ti FE or not.
My case is small, so I can't mount graphics cards longer than 29cm.


----------



## GRABibus

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> just got my nvidia sli bridge. looks dope


Nice!
I have a Gigabyte SLI HB bridge, so I should be ready for SLI.

Have you tested some games? Can you give feedback on gaming performance between a single GTX 1080 and SLI?


----------



## GRABibus

Quote:


> Originally Posted by *Motley01*
> 
> I just got the Asus Strix 1080, for my new Ryzen build.
> 
> Can't get it a lick past 2040mhz on the core, and memory 1321. Even when I try just 10mhz faster it crashes.
> 
> But I got a decent 3dmark score: 7918 http://www.3dmark.com/spy/1638278


Don't you plan to flash the BIOS with the Strix OC BIOS t4?
This will unlock the power limit.
Of course, at your own risk...


----------



## Bal3Wolf

Anyone else here with an EVGA 1080 Hydro Copper? I'm curious about your temps. I've been trying to get mine to run cooler; I even took the block off and redid the paste, and now it hits 40-44C in games and 46C folding. Before, it was about 4-6C higher (ambient around 19-22C). My loop consists of an SR2 360 in the top, an EK CoolStream 420 in the front, and an XSPC RX240 in the bottom, with an EK S-bay reservoir running dual DDC pumps at 3000rpm. I have six 140mm fans in the front, six 120mm Vardars in the top, and two on the bottom, cooling a 6800K and the GTX 1080.


----------



## Motley01

Quote:


> Originally Posted by *GRABibus*
> 
> Don't you plan to flash the BIOS with the Strix OC BIOS t4?
> This will unlock the power limit.
> Of course, at your own risk...


Wait I didn't even know I could do that. I will look into flashing the bios.


----------



## broodro0ster

Is anyone else experiencing a memory downclock on their GTX? Mine drops to 10GHz in games such as GTA V and CS:GO, while it stays nicely at 11GHz during benching (+500MHz overclock on the memory).


----------



## DJ_OXyGeNe_8

Quote:


> Originally Posted by *shilka*
> 
> 
> 
> Spoiler: Finally had time to install my new EVGA GTX 1080 FTW2 that i got yesterday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes i know the the machine is a bit dusty so you dont need to tell me i know!
> 
> 
> Really hope this card actually works unlike my Gigabyte GTX 1070 Xtreme Gaming which was broken.


Hi, I have one too. Did you see the new BIOS?

EVGA GeForce GTX 1080 FTW2 and SC2 - OPTIONAL 11GHz BIOS Update Available Now

https://forums.evga.com/EVGA-GeForce-GTX-1080-FTW2-and-SC2-OPTIONAL-11GHz-BIOS-Update-Available-Now-m2652350.aspx


----------



## marik123

My new MSI Gaming Armor 1080 OC arrived in the mail today. I booted it up, and it can only hit 2075MHz on the core and 11000MHz on the memory. Is there anything I can do to boost it to 2100MHz? I tried pushing the voltage from 1.0675V (stock) to 1.093V, and it still crashes.


----------



## Beagle Box

Quote:


> Originally Posted by *marik123*
> 
> My new MSI Gaming Armor 1080 OC arrived in the mail today. I booted it up, and it can only hit 2075MHz on the core and 11000MHz on the memory. Is there anything I can do to boost it to 2100MHz? I tried pushing the voltage from 1.0675V (stock) to 1.093V, and it still crashes.


Dunno. What have you done so far?


----------



## marik123

Quote:


> Originally Posted by *Beagle Box*
> 
> Dunno. What have you done so far?


I tried maxing the fan speed to 100% and maxing the voltage slider (limited to 1.093V under load).


----------



## Beagle Box

Quote:


> Originally Posted by *marik123*
> 
> I tried maxing the fan speed to 100% and maxing the voltage slider (limited to 1.093V under load).


Well, you've actually done nothing, yet.

I assume you're using MSI Afterburner. Do you have access to the Power Limit, Temp Limit, and Core Clock sliders and the power curve? If you can't see them, you need to choose a skin in Afterburner Properties that gives you that access. I suggest you use the MSI Gaming Afterburner skin.

What you do next depends on which sliders are locked and which are unlocked.


----------



## marik123

Quote:


> Originally Posted by *Beagle Box*
> 
> Well, you've actually done nothing, yet.
> 
> I assume you're using MSI Afterburner. Do you have access to the Power Limit, Temp Limit, and Core Clock sliders and the power curve? If you can't see them, you need to choose a skin in Afterburner Properties that gives you that access. I suggest you use the MSI Gaming Afterburner skin.
> 
> What you do next depends on which sliders are locked and which are unlocked.


I already have the power limit and temperature limit set to max, the voltage maxed, and the fan speed maxed (at max fan speed it peaks at 65C under Heaven 4.0 Extreme, 1080p).


----------



## Bal3Wolf

Try curve overclocking; it might get you to 2100+.


----------



## Beagle Box

Access the Power Curve utility. You'll see the curve is based on the stock power curve, meaning it peaks @ 1.050V and runs flat from there to the right. By sliding the Voltage slider to max, you've opened up the points on the curve beyond 1.050V, out to 1.093V.

There are different philosophies on the best way to find your card's optimal curve.

You can start by leaving the curve at its stock setting and raising each point to the right by 12 or 13 points in the Curve Editor. (Boost will try to move each point to a position 12 or 13 points above the one below it, because these are the standard voltage and speed steps, so you'll just have to work with it.) Do this by using your mouse to drag the 1.063V point up 13 points and pressing the checkmark button. That point should be set to the new value, and all points to the right will move up to that level as well. By doing this one point at a time, you'll get a feel for how it works. Once you have a viable curve out to 1.093V, use the Core Clock slider to move the whole curve up until the card artifacts and crashes, then back it down to a stable level.

An alternate method is to use the Core Clock slider first and then adjust the right side of the curve.

Once the basic curve is set, work with your memory to find the fastest stable setting. Faster does not always mean more fps; past a certain point, faster memory speeds mean poorer performance.

Lastly, tweak your curve. If you look in the benchmark sections, you'll find where some of us have posted our curves. Don't try our curves first: they are card-specific, and it will be a waste of time trying to get them to run on your card. You'll also find that different curves run best for different software, so some of us have multiple curves. For instance, a maxed-out curve will benchmark well but will most likely be poor for gaming.

I suggest you consult the primer on how to use Afterburner's Curve Editor. It does more than I've explained here.

Good luck.
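The point-by-point procedure above amounts to adding a fixed offset to every curve point at or above a chosen voltage. A minimal Python sketch of that idea; the sample points are made up for illustration, and the ~13MHz step size is the commonly quoted Boost 3.0 granularity, not data pulled from Afterburner:

```python
# Illustrative voltage (V) -> frequency (MHz) curve, loosely shaped
# like a stock Pascal boost curve. All values are made up.
stock_curve = {
    1.000: 1987, 1.025: 2012, 1.050: 2025,
    1.063: 2025, 1.075: 2025, 1.093: 2025,  # flat past the stock peak
}

BOOST_STEP = 13  # MHz; Boost 3.0 snaps clocks to roughly 13MHz steps

def raise_right_side(curve, start_v, steps):
    """Raise every point at or above start_v by steps * BOOST_STEP,
    mimicking dragging one point up and letting the rest follow."""
    new_curve = dict(curve)
    for v in new_curve:
        if v >= start_v:
            new_curve[v] += steps * BOOST_STEP
    return new_curve

# Drag the 1.063V point up one step; points to its right follow.
tweaked = raise_right_side(stock_curve, 1.063, 1)
print(tweaked[1.063], tweaked[1.093])  # both raised to 2038
```

Repeating the call with a higher `steps` value models sliding the whole right side up until the card artifacts, then backing off.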


----------



## marik123

Quote:


> Originally Posted by *Beagle Box*
> 
> Access the Power Curve utility. You'll see the curve is based on the stock power curve, meaning it peaks @ 1.050V and runs flat from there to the right. By sliding the Voltage slider to max, you've opened up the points on the curve beyond 1.050V, out to 1.093V.
>
> There are different philosophies on the best way to find your card's optimal curve.
>
> You can start by leaving the curve at its stock setting and raising each point to the right by 12 or 13 points in the Curve Editor. (Boost will try to move each point to a position 12 or 13 points above the one below it, because these are the standard voltage and speed steps, so you'll just have to work with it.) Do this by using your mouse to drag the 1.063V point up 13 points and pressing the checkmark button. That point should be set to the new value, and all points to the right will move up to that level as well. By doing this one point at a time, you'll get a feel for how it works. Once you have a viable curve out to 1.093V, use the Core Clock slider to move the whole curve up until the card artifacts and crashes, then back it down to a stable level.
>
> An alternate method is to use the Core Clock slider first and then adjust the right side of the curve.
>
> Once the basic curve is set, work with your memory to find the fastest stable setting. Faster does not always mean more fps; past a certain point, faster memory speeds mean poorer performance.
>
> Lastly, tweak your curve. If you look in the benchmark sections, you'll find where some of us have posted our curves. Don't try our curves first: they are card-specific, and it will be a waste of time trying to get them to run on your card. You'll also find that different curves run best for different software, so some of us have multiple curves. For instance, a maxed-out curve will benchmark well but will most likely be poor for gaming.
>
> I suggest you consult the primer on how to use Afterburner's Curve Editor. It does more than I've explained here.
>
> Good luck.


I tried the curve editor like you described, and I still hit a wall at 2075MHz. Anything higher results in a black screen in games.


----------



## mbm

Quote:


> Originally Posted by *marik123*
> 
> I tried the curve editor like you described, and I still hit a wall at 2075MHz. Anything higher results in a black screen in games.


I can't go any higher long-term either, regardless of voltage.
The only advantage of using the curve is that I can lower the voltage to 0.975V and still do 2075MHz.


----------



## dayveegravy

There's only a few fps difference between 2070 and 2150.


----------



## ondoy




----------



## Rhadamanthis

Hi, does anyone have a BIOS for the 1080 Strix 11Gbps?


----------



## IronAge

That is an EVGA FTW2 11Gbps VBIOS:

https://www.techpowerup.com/vgabios/191353/191353

Be aware that the GTX 1080 Strix 11Gbps actually has different VRAM... it has the Ti's Micron chips.

I have just sold my Palit JetStream, which does 2215/5594 with just 1.1V via the AB curve with the StrixXOC_t4 BIOS.

Benchmark result:

http://www.3dmark.com/3dm/19179203

I have another Palit/GW-design card... a Phoenix, which does 2228/5584 with 1.125V.


----------



## bl4ckdot

Hey guys.
I'm currently looking to upgrade my second PC (i7 3770K at 4.6GHz, 16GB of RAM, and two GTX 680s in SLI).
I could get a Gainward Phoenix 1080 for 485€ (not the GS or GLH). Should I go for it, or wait a while for Vega / lower prices? I'm not sure I'm that patient









Thanks


----------



## IronAge

Just get it... the price is fine. My Phoenix is the lowest bin too, and it's still the best overclocker of the more than a dozen GTX 1080s I have tested so far.

It has a dual BIOS, so you can flash the GLH BIOS to BIOS 2 and still have the original BIOS on BIOS 1 for warranty purposes.
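For reference, the dual-BIOS flash described above is usually done with NVIDIA's nvflash command-line tool. A rough sketch of the procedure; the filenames are placeholders, flag behavior can differ between nvflash builds, and this is an outline rather than a recipe (flashing is entirely at your own risk):

```shell
# Flip the card's hardware BIOS switch to position 2 first, so the
# original BIOS 1 is never touched. Run from an elevated prompt.

# 1. Back up the currently active BIOS before anything else.
nvflash --save original_bios2.rom

# 2. Flash the downloaded GLH image to the active (BIOS 2) slot.
#    -6 overrides the PCI subsystem ID mismatch between board vendors.
nvflash -6 GLH.rom

# 3. Reboot, then verify the new clocks and power limit with GPU-Z.
```

If anything goes wrong, flipping the switch back to position 1 boots the untouched original BIOS.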


----------



## V5-aps

Anyone flashed the MSI Gaming X plus bios to their MSI or any other card ?


----------



## nrpeyton

Is anyone interested?

I have a waterblock I don't need anymore, fully compatible with GTX 780 Classy, GTX 780 Ti Classy, GTX 780 Ti Kingpin, 980 Ti Classy, and 1080 Classified cards.

PM me


----------



## Randomocity

Speaking of which, I have a 1080 FTW EK block and backplate if anyone is interested.


----------



## Beagle Box

Quote:


> Originally Posted by *V5-aps*
> 
> Anyone flashed the MSI Gaming X plus bios to their MSI or any other card ?


Why would that BIOS, with a power max of 250W, be better than the 86.04.17.00.83 Gaming X BIOS with a 295W limit?

My MSI GTX 1080 Gaming X runs a Gaming Z BIOS which also has a 295W power limit and _it is still power-limited_. I can't imagine that particular BIOS would be an upgrade, even with adjusted memory timings.

What's the allure?
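For context on why the limit matters: the GTX 1080's reference TDP is 180W, so the two BIOS limits give very different sustained-power headroom. A quick illustration (the 180W figure is the stock reference spec; the two limits are the ones quoted in this exchange):

```python
# Compare the headroom different BIOS power limits give over the
# GTX 1080's 180W reference TDP.
REFERENCE_TDP_W = 180

def headroom_pct(limit_w, reference_w=REFERENCE_TDP_W):
    """Power-limit headroom as a percentage over the reference TDP."""
    return round(100 * (limit_w - reference_w) / reference_w, 1)

for name, limit in [("Gaming X Plus", 250), ("Gaming X 86.04.17.00.83", 295)]:
    print(f"{name}: {limit}W -> +{headroom_pct(limit)}% over reference")
# Gaming X Plus: 250W -> +38.9% over reference
# Gaming X 86.04.17.00.83: 295W -> +63.9% over reference
```

The 295W BIOS leaves roughly 25 percentage points more headroom before Boost starts throttling for power, which is the whole argument against the 250W flash.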


----------



## V5-aps

Quote:


> Originally Posted by *Beagle Box*
> 
> Why would that BIOS, with a power max of 250W, be better than the 86.04.17.00.83 Gaming X BIOS with a 295W limit?
> 
> My MSI GTX 1080 Gaming X runs a Gaming Z BIOS which also has a 295W power limit and _it is still power-limited_. I can't imagine that particular BIOS would be an upgrade, even with adjusted memory timings.
> 
> What's the allure?


Just curious to see whether the memory clocks any better with this BIOS, and whether the flash is even possible


----------



## ZhopkaPopka

Hi guys.

I recently bought a new PC: 7700K + GTX 1080 Asus Strix Advanced + monitor: Acer XB271HU.

Sometimes I get an artifact in the middle of the screen after the motherboard BIOS logo, and then nothing happens: just a black screen and "No signal". Only rebooting the PC or the monitor helps.

The problem could be in:

1) The monitor
2) The DP cable
3) The GPU

I bought all these PC components recently, so I need to find out ASAP what the source of the problem is, so that I can RMA it without trouble (14 days since purchase).

I googled it, and it's quite a common problem: some say it's a Windows 10 Fast Boot issue, others say it's the integrated GPU's fault, and others blame the monitor or cable. Too many opinions.

I already sent the monitor back to Amazon because it had dead pixels and the box had been opened before me, so tomorrow I'm going to try with a new monitor. But what if the problem is the GPU? Should I hurry with a GPU RMA?

Has anyone here had the same problem, and how did you fix it?


----------



## bl4ckdot

Got the Gainward Phoenix for a really good price and flashed the GLH BIOS. In games it never went beyond 65°C, overclocked at 2126MHz. I could push it more, but I'm satisfied. Very happy


----------



## L4TINO

The MSI Gaming X Plus uses the same PCB as the non-Plus, but the memory is changed from the D9TXS, rated at 10000MHz (Gaming X), to the D9VRL, rated at 11000MHz (Gaming X Plus).

I was also looking into flashing the Plus BIOS, but I did my research, so I'm definitely not going there.
I might try the Gaming Z BIOS.

I forgot to add that the chip is also different:
Gaming X Plus: GP104-410
Gaming X: GP104-400


----------



## L4TINO

Techpowerup doesn't even have a Gaming Z bios :S:S


----------



## x7007

Quote:


> Originally Posted by *ZhopkaPopka*
> 
> Hi guys.
>
> I recently bought a new PC: 7700K + GTX 1080 Asus Strix Advanced + monitor: Acer XB271HU.
>
> Sometimes I get an artifact in the middle of the screen after the motherboard BIOS logo, and then nothing happens: just a black screen and "No signal". Only rebooting the PC or the monitor helps.
>
> The problem could be in:
>
> 1) The monitor
> 2) The DP cable
> 3) The GPU
>
> I bought all these PC components recently, so I need to find out ASAP what the source of the problem is, so that I can RMA it without trouble (14 days since purchase).
>
> I googled it, and it's quite a common problem: some say it's a Windows 10 Fast Boot issue, others say it's the integrated GPU's fault, and others blame the monitor or cable. Too many opinions.
>
> I already sent the monitor back to Amazon because it had dead pixels and the box had been opened before me, so tomorrow I'm going to try with a new monitor. But what if the problem is the GPU? Should I hurry with a GPU RMA?
>
> Has anyone here had the same problem, and how did you fix it?


I think it's a known issue with the drivers. Check the Guru3D forums; they talked about it in one of the threads. So don't worry much.


----------



## Beagle Box

Quote:


> Originally Posted by *L4TINO*
> 
> Techpowerup doesn't even have a Gaming Z bios :S:S


This worked well for me.

Post 4761 in this thread contains the MSI GTX Gaming Z OC and Gaming BIOS installer app.








Use it at your own risk.


----------



## lanofsong

Hey there GTX 1080 owners,

Would you consider signing up with Team OCN for the 2017 Pentathlon (*May 5th through May 19th*)? There is still plenty of time left, and we really could use your help.

This event is truly a GLOBAL battle, with your team OCN going up against many teams from across the world, and while we put in a good showing at last year's event by finishing 6th, we could do with a lot more CPU/GPU compute power. All you need to do is sign up and crunch on any available hardware you can spare.

The cool thing about this event is that it is spread over 5 disciplines of *varying lengths of time* (different projects), so there is a lot of *strategy/tactics* involved.

We look forward to having you and your hardware on our team. Again, this event lasts for two weeks and takes place May 5th through the 19th.


Download the software here.

https://boinc.berkeley.edu/download.php

Presently we really would like some help with the following project:

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you fold on, you will be asked whether you want to join a team - type in overclock.net, press Enter, then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful









8th BOINC Pentathlon thread

To find your Cross Project ID# - sign into your account and it will be located under Computing and Credit


Please check out the GUIDE - How to add BOINC Projects page for more information about running different projects.

This really is an exciting and fun event. I look forward to it every year, and I hope you will join us and participate.









BTW - There is an awesome BOINC Pentathlon badge for those who participate









lanofsong

OCN - FTW


----------



## emsj86

Is there a BIOS editor yet, like the 780s had, where I can change the voltage and power limits and then flash?


----------



## Beagle Box

Quote:


> Originally Posted by *emsj86*
> 
> Is there a BIOS editor yet, like the 780s had, where I can change the voltage and power limits and then flash?


Nope. Not even rumors. Your best bet is to find an existing BIOS that better suits your needs.


----------



## mariojuniorjp

Guys, why do Pascal cards lose performance with undervolting?


----------



## TK421

Locking the voltage in MSI Afterburner only applies to the 2nd card; the 1st card still uses the default voltage and ignores the curve.

If I set one and switch to the other, it shows up as default (+0 core) instead of (curve).

Thoughts?


----------



## Menthol

You need to uncheck sync cards and overclock each one individually


----------



## BlivAK

Hi all, I'm a newbie here and hope I'm permitted to make this post.

I've always thought that liquid-cooled GPUs use less power than air-cooled cards; perhaps not significantly less, but less. So I've created a form with only 5 questions.

I appreciate that this means going a little out of your way to run some test software or a benchmark, but your participation would really mean a lot in putting my suspicions to rest.

To keep it somewhat standardized, I kindly ask that you overclock the GPU core to 2000MHz and the memory to 5500MHz (I use MSI AB).

Below is how sections without a radio button should be answered.

GPU/MEM OVERCLOCK - +100/+500mHZ
Peak power draw - 200W
Test software - 3D Firestrike

That's it.
















Thank you.

Google Form - https://goo.gl/forms/o6ihtPW3c0sSsAnl2


----------



## lanofsong

Hey there GTX 1080 owners,

We could truly use your help here. Presently we are #1, just ahead of two of the great TITAN teams when it comes to distributed computing, and to stay there we could use help from you and your boss GPUs. Only 4 days left.




Download the software here.

https://boinc.berkeley.edu/download.php

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you fold on, you will be asked whether you want to join a team - type in overclock.net, press Enter, then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful









8th BOINC Pentathlon thread

Thanks in advance.

lanofsong

OCN - FTW


----------



## sirleeofroy

Quote:


> Originally Posted by *lanofsong*
> 
> Hey there GTX 1080 owners,
> 
> We could truly use your help here. Presently we are #1, just ahead of two of the great TITAN teams when it comes to distributed computing, and to stay there we could use help from you and your boss GPUs. Only 4 days left.
> 
> 
> 
> 
> 
> Download the software here.
> 
> 
> 
> https://boinc.berkeley.edu/download.php
> 
> Add the following *GPU* project - *Einsteinathome.org*
> 
> 
> 
> Note: For every project you fold on, you will be asked whether you want to join a team - type in overclock.net, press Enter, then JOIN team.
> 
> 
> Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful
> 
> 
> 
> 
> 8th BOINC Pentathlon thread
> 
> Thanks in advance.
> 
> lanofsong
> 
> OCN - FTW


I'll get involved.....

I just moved, so the machine had been boxed up, but I'm all set up now. I'll start when I get home from work tonight.


----------



## TK421

Quote:


> Originally Posted by *Menthol*
> 
> You need to uncheck sync cards and overclock each one individually


Why do we need to uncheck sync cards?

Isn't the purpose of syncing to apply the same settings to both cards?


----------



## lanofsong

Quote:


> Originally Posted by *sirleeofroy*
> 
> I'll get involved.....
> 
> I just moved, so the machine had been boxed up, but I'm all set up now. I'll start when I get home from work tonight.


Thank you for joining the battle


----------



## Vellinious

Quote:


> Originally Posted by *TK421*
> 
> Why do we need to uncheck sync cards?
>
> Isn't the purpose of syncing to apply the same settings to both cards?


Because the curve for each card is going to be slightly different (or wildly different), they don't allow the curve to sync to the other GPU.


----------



## spddmn24

Anyone have experience with the MSI 1080 Duke? I snagged one for $435, assuming the rebates go through, and justified it by potentially selling my 1070 Quick Silver and its waterblock. But if the cooling is mediocre, my block will fit it. I can't find any reviews on it anywhere.


----------



## lever2stacks

Looks like I just joined the club.



Lever


----------



## Danja

I'm interested in joining the owners club... Noticed that the gigabyte cards are dipping into the high $400s range while other brands are still over $500. Are there any drawbacks to the gigabytes?


----------



## Alwrath

Quote:


> Originally Posted by *spddmn24*
> 
> Anyone have experience with the MSI 1080 Duke? I snagged one for $435, assuming the rebates go through, and justified it by potentially selling my 1070 Quick Silver and its waterblock. But if the cooling is mediocre, my block will fit it. I can't find any reviews on it anywhere.


Always put it on water if you can, better OC stability.


----------



## spddmn24

Quote:


> Originally Posted by *Alwrath*
> 
> Always put it on water if you can, better OC stability.


I only gained 30-50MHz on my 1070 under water. I'll see how the stock cooler is; if it keeps the card in the 60s, I'll probably just pocket the $$ from selling the block versus 1-2% gains. I'm not sure how long I'm going to keep this anyway; if Vega is good, that would let me finally get a 1440p monitor for the center one of my triples. Unless NVIDIA decides to let you mix resolutions in Surround, but I'm not holding my breath.


----------



## TK421

Quote:


> Originally Posted by *Vellinious*
> 
> Because the curve for each card is going to be slightly different (or wildly different), they don't allow the curve to sync to the other GPU.


ahh...


----------



## Yukss

Mine comes tomorrow... silly jump from a GTX 1070


----------



## SuperZan

Quote:


> Originally Posted by *Yukss*
> 
> Mine comes tomorrow... silly jump from a GTX 1070


Hey, more performance is more performance.







I 'justified' my diagonal-grade by giving my 1070 to my better half. I've been very pleased with the performance on 1440p/144hz.


----------



## spddmn24

Do 1080's run this hot or is the cooler just that bad on the msi duke? 296 watts







This was firestrike ultra stress test so gaming should pull quite a bit less power.


----------



## philhalo66

Quote:


> Originally Posted by *spddmn24*
> 
> Do 1080's run this hot or is the cooler just that bad on the msi duke? 296 watts
> 
> 
> 
> 
> 
> 
> 
> This was firestrike ultra stress test so gaming should pull quite a bit less power.


That's a lot warmer than my EVGA card. Do you leave the fans on auto, or do you have a custom fan profile? I leave mine on auto, and it peaks at 74C.


----------



## mbm

I have an EVGA SC Gaming.
It ran hot and loud.
I have now mounted an Arctic Accelero Xtreme IV.
Absolutely silent now. Max temp is 60C. It will run even cooler with a higher fan speed, but I like it silent.


----------



## spddmn24

Quote:


> Originally Posted by *philhalo66*
> 
> That's a lot warmer than my EVGA card. Do you leave the fans on auto, or do you have a custom fan profile? I leave mine on auto, and it peaks at 74C.


Manual. I just tried the game that ran my 1070 the hottest, and it hit 75C peak, so low-to-mid 70s looks like what it will run at while gaming. I think the shroud just chokes off the airflow; the sides are pretty much sealed, so there is nowhere for the air to go. My waterblock will fit it, but I was hoping to sell it plus my 1070 to recoup most of the cost of the upgrade.


----------



## Yukss

Quote:


> Originally Posted by *SuperZan*
> 
> Hey, more performance is more performance.
> 
> 
> 
> 
> 
> 
> 
> I 'justified' my diagonal-grade by giving my 1070 to my better half. I've been very pleased with the performance on 1440p/144hz.


That's what I thought. I got the 1080 FE with an EK waterblock for $450 locally


----------



## Koniakki

Soon...!









https://postimg.org/image/5na733pob/

https://postimg.org/image/c3xcbp3tx/

https://postimg.org/image/nhjvmwecl/

https://postimg.org/image/4qhycqjs5/


----------



## L4TINO

Quote:


> Originally Posted by *spddmn24*
> 
> Do 1080's run this hot or is the cooler just that bad on the msi duke? 296 watts
> 
> 
> 
> 
> 
> 
> 
> This was firestrike ultra stress test so gaming should pull quite a bit less power.


Those temps are fine. I run an MSI 1080 Gaming X and have it overclocked with the voltage and power limit set to max. I get around 83C on stock fan speeds with a game called Black Desert Online, which runs the GPU usage at 100% and the power usage constantly around the 100% mark in high-end graphics mode.


----------



## OccamRazor

Quote:


> Originally Posted by *Koniakki*
> 
> Soon...!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://postimg.org/image/5na733pob/
> 
> https://postimg.org/image/c3xcbp3tx/
> 
> https://postimg.org/image/nhjvmwecl/
> 
> https://postimg.org/image/4qhycqjs5/


Looking great Bro! Keep those pics coming when its finished!

Cheers

Occamrazor


----------



## Koniakki

Quote:


> Originally Posted by *OccamRazor*
> 
> Looking great Bro! Keep those pics coming when its finished!
> 
> Cheers
> 
> Occamrazor


Occam bro!! I hope you are all doing well. I know I'm not that active anymore (ANL (Analog Life) issues), but I try my best to check in when possible.









Btw the coolant has changed to EK-CryoFuel Navy Blue.

I think it looks marvelous and I wanted to test it anyway.









And the whole build is a complete transfer from my existing, lovely but really troublesome Core P5.

Stay digital!


----------



## Yukss

new toy. gtx 1080 and ek WB


----------



## SuperZan

Looking good, that's a nifty build.


----------



## Yukss

Quote:


> Originally Posted by *SuperZan*
> 
> Looking good, that's a nifty build.


thanks


----------



## Yukss

Quote:


> Originally Posted by *x7007*
> 
> Why are there 2 models of the GTX 1080 Ti? Which one would I want to get?
> 
> Notice the ROPs/TMUs


this is very interesting..


----------



## Dasboogieman

Quote:


> Originally Posted by *x7007*
> 
> Why are there 2 models of the GTX 1080 Ti? Which one would I want to get?
> 
> Notice the ROPs/TMUs


That looks like a GPU-Z reading error. I used to get something similar with my old MSI AMD 290, which would register as having 2816 cores.
I don't think it's possible on GP102 to disable a memory controller without also disabling the attached L2 and ROP block.
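That coupling can be made concrete. Assuming the commonly cited Pascal partition layout, where each 32-bit memory controller carries 8 ROPs and a 256KB L2 slice (figures from public GP102 specs, not an official datasheet), disabling a controller necessarily shrinks the ROP count and L2 along with the bus:

```python
# Sketch of GP102's memory-partition coupling: fusing off a 32-bit
# memory controller also removes its attached ROPs and L2 slice.
# Per-partition figures are assumptions based on public Pascal specs.
ROPS_PER_PARTITION = 8      # ROPs tied to each 32-bit controller
L2_KB_PER_PARTITION = 256   # L2 slice tied to each controller
BUS_BITS_PER_PARTITION = 32

def gp102_config(enabled_partitions: int) -> dict:
    """Derive bus width, ROP count, and L2 size from enabled partitions."""
    return {
        "bus_bits": enabled_partitions * BUS_BITS_PER_PARTITION,
        "rops": enabled_partitions * ROPS_PER_PARTITION,
        "l2_kb": enabled_partitions * L2_KB_PER_PARTITION,
    }

# Full GP102 (Titan X Pascal): 12 partitions -> 384-bit, 96 ROPs
print(gp102_config(12))
# GTX 1080 Ti: one partition fused off -> 352-bit, 88 ROPs
print(gp102_config(11))
```

A GPU-Z reading that shows the full ROP/TMU count alongside a cut-down bus would therefore be internally inconsistent, which supports the reading-error explanation.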


----------



## Vellinious

Selling my two best overclockers, if anyone knows someone that's looking. In the classifieds here.


----------



## Yukss

Not bad results, I guess... can I push it any further?


----------



## Agueybana_II

Does anyone know if there is a full-cover water block that will fit the Gigabyte GV-N1080D5X-8GD? It's currently on sale for about 420; I want to pick it up, but I'm not sure I can get it under water. A quick search didn't yield any definite answer, so I was wondering if someone had experience with the card.

Thanks in advance


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> Selling my two best overclockers, if anyone knows someone that's looking. In the classifieds here.


Damn. Just got my 2nd FTW Hybrid last week. Had I known you were selling yours I would have jumped on them as I am intending to go Custom Loop in the future anyway.

Good luck with the house and the sale.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Damn. Just got my 2nd FTW Hybrid last week. Had I known you were selling yours I would have jumped on them as I am intending to go Custom Loop in the future anyway.
> 
> Good luck with the house and the sale.


Thanks, man


----------



## GRABibus

Quote:


> Originally Posted by *Yukss*
> 
> Not bad results, I guess... can I push it any further?


Yes, flash with the ASUS Strix OC t4 BIOS (at your own risk).
It unlocks the power limit, and you can set the voltage up to 1.2V (no more 1.093V limitation).
I run my SLI of Gigabyte Xtreme Gaming WATERFORCE cards at 2215/5544 @ 1.15V.


----------



## Yukss

Quote:


> Originally Posted by *GRABibus*
> 
> Yes, flash with the ASUS Strix OC t4 BIOS (at your own risk).
> It unlocks the power limit, and you can set the voltage up to 1.2V (no more 1.093V limitation).
> I run my SLI of Gigabyte Xtreme Gaming WATERFORCE cards at 2215/5544 @ 1.15V.


Mine is a Founders Edition; does that BIOS work on this one too?

I pushed a little further (+0.75v).


----------



## GRABibus

Quote:


> Originally Posted by *Yukss*
> 
> Mine is a Founders Edition; does that BIOS work on this one too?


I don't know.
But water or an improved air cooler is better for this BIOS.
In BF1, for example, I see a maximum of 52°C at 22°C ambient, running 2215MHz/5544MHz at 1.15V.

Here is my Time Spy score:

http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1100#post_26092077

Roughly a 17000-point graphics score, which is not too bad









Maybe some owners who have tried this BIOS on FE cards can give their feedback...


----------



## Vellinious

Quote:


> Originally Posted by *GRABibus*
> 
> I don't know.
> But water or an improved air cooler is better for this BIOS.
> In BF1, for example, I see a maximum of 52°C at 22°C ambient, running 2215MHz/5544MHz at 1.15V.
> 
> Here is my time Spy score :
> 
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1100#post_26092077
> 
> Roughly a 17000-point graphics score, which is not too bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe some owners who tried this BIOS on FE cards can give their feedback...


Nice score. Just a tiny bit higher than the graphics score my two run. I lowered temps, though, instead of raising voltage.


----------



## spinejam

Count me in this club, ...finally!


----------



## spddmn24

Put my 1080 Duke under water. I was planning on selling the EKWB block from my 1070, but the Duke cooler being laughably bad, plus the block being on sale (which killed its resale value), made it a no-brainer. I think the fan shroud sealing off the sides just kills airflow through the heatsink. Temps in the Fire Strike Ultra stress test went from 79°C at 95% fan to 44°C max.

Not sure how this stacks up to other cards; 2139/11050 seems stable.

http://www.3dmark.com/fs/12657830


----------



## Yukss

Quote:


> Originally Posted by *GRABibus*
> 
> I don't know.
> But water or an improved air cooler is better for this BIOS.
> In BF1, for example, I see a maximum of 52°C at 22°C ambient, at 2215MHz/5544MHz and 1.15V.
> 
> Here is my time Spy score :
> 
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1100#post_26092077
> 
> Roughly a 17000-point graphics score, which is not too bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe some owners who tried this BIOS on FE cards can give their feedback...


I'd better keep it this way.


----------



## Yukss

Is this card not worth it anymore?


----------



## Dasboogieman

Quote:


> Originally Posted by *Yukss*
> 
> Is this card not worth it anymore?


It depends on your budget and power envelope.


----------



## Derek1

Finally got the SLI setup running.







Just my standard OC for now and will be pushing them to see what I can get this weekend.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> Finally got the SLI setup running.
> 
> 
> 
> 
> 
> 
> 
> Just my standard OC for now and will be pushing them to see what I can get this weekend.


Nice graphics score on Timespy. My highest was 16897.


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Vellinious*





Nice graphics score on Timespy. My highest was 16897.

Thanks
Was looking at yours and a few others ahead of me and noticed your reported clock speed was like 1875 or something. Is that a glitch?
I'm never gonna catch the Tis or the Titan XPs, so I'm resigned to that and happy with the moderate OC scores I'm getting. (Fired up Witcher 3, all ultra at 2160p, and I'm getting 85fps @ 45°C.)
I also need to convince Mr TooShort to sell me his Xeon 1680 v2 to make up ground on the 6950s, lol. (Your CPU score of 12k is twice mine.)


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Thanks
> Was looking at yours and a few others ahead of me and noticed your reported clock speed was like 1875 or something. Is that a glitch?
> I'm never gonna catch the Tis or the Titan XPs, so I'm resigned to that and happy with the moderate OC scores I'm getting. (Fired up Witcher 3, all ultra at 2160p, and I'm getting 85fps @ 45°C.)
> I also need to convince Mr TooShort to sell me his Xeon 1680 v2 to make up ground on the 6950s, lol. (Your CPU score of 12k is twice mine.)


If I recall correctly, my best scores came with the core somewhere between 2214 and 2240 in SLI (depending on coolant temps). The memory on those cards didn't seem to overclock as well as most, though. Early version of the cards vs later versions? Dunno. It always held my scores back quite a bit.


----------



## fat4l

Guys,
I know there's a guy who made a "guide" on how to set the curve properly for increased performance. Any link, please? Thanks.
He was adding +100MHz on the core, then fiddling with the curve...
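For anyone else hunting for that guide, the usual Afterburner curve trick goes: apply a flat core offset, then drag every point above your chosen voltage down so the curve plateaus there, which caps the voltage the card will request. A rough sketch of the idea (illustration only, with made-up numbers; this is not Afterburner's actual API):

```python
# Sketch of the Afterburner "curve flattening" OC method.
# The V/F curve is modeled as a list of (voltage_mV, freq_MHz) points.
# Step 1: add a flat offset to every point (e.g. +100 MHz).
# Step 2: pick a target voltage and clamp every higher-voltage point
#         to the target point's frequency, so the curve plateaus and
#         the card never boosts into higher voltage bins.

def flatten_curve(curve, offset_mhz, target_mv):
    shifted = [(mv, mhz + offset_mhz) for mv, mhz in curve]
    # frequency at the chosen voltage point (highest point at/below it)
    target_mhz = max(mhz for mv, mhz in shifted if mv <= target_mv)
    return [(mv, min(mhz, target_mhz)) for mv, mhz in shifted]

stock = [(800, 1700), (900, 1850), (1000, 1950), (1093, 2025)]
flat = flatten_curve(stock, offset_mhz=100, target_mv=1000)
print(flat)  # points above 1.0 V are held at the 1.0 V frequency
```

The payoff is that you get the +100MHz everywhere on the curve, but the card stops asking for voltage (and heat) beyond the point you picked.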


----------



## GRABibus

Quote:


> Originally Posted by *Vellinious*
> 
> Nice graphics score on Timespy. My highest was 16897.


I have reached a 16989 graphics score









http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1100#post_26101399

I am sure I can go over 17000


----------



## Derek1

Quote:


> Originally Posted by *GRABibus*
> 
> I have reached a 16989 graphics score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1100#post_26101399
> 
> I am sure I can go over 17000


Best I can do on these two, I think: 16817 graphics @ 2152/5800.

http://www.3dmark.com/3dm/20056070

Though I am not using the T4 BIOS nor the AB curve method, just Precision XOC.


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday the 22nd to Wednesday the 24th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those two days? If so, come sign up and fold with us - see the attached link.

May 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - requires a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Configure the client: enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726

later
lanofsong
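If you'd rather skip the GUI, the same three settings can live in FAHClient's config file. A minimal sketch, assuming the standard FAHClient `config.xml` format; the user name and passkey below are placeholders, and 37726 is the OCN team number from the post above:

```xml
<config>
  <!-- Folding@home identity: replace name and passkey with your own -->
  <user value="YourOCNName"/>
  <passkey value="your32characterpasskeygoeshere00"/>
  <!-- Team OCN -->
  <team value="37726"/>
</config>
```

Restart the client after editing so it picks up the new identity.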


----------



## donkidonki

Hello.

I just subscribed to this thread as I'm awaiting delivery of a pair of MSI 1080 Seahawk X EKs to replace my trusty old EVGA 780 Tis with EK water blocks. It was a tough decision, as I could have gotten a brand new MSI 1080 Ti Seahawk EK for the same price, which was tempting, but I really like the look of the SLI cards with my EK plexi bridge, so that forced the decision.

I'll post some pics and benchmarks once up and running.


----------



## GRABibus

I finally managed to pass a 17000 graphics score in Time Spy with my SLI.









http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1110#post_26110493

It's not stable (some artefacts), but the score is valid.


----------



## Jackl2

Is it possible to adjust the default (BIOS) fan curve on a GTX 1080 Hybrid FTW card? It starts at 56%, and at that RPM the fan makes a weird loud noise; anything above or below it is fine. Also, are there any how-tos on updating or modifying the BIOS on these cards?


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Jackl2*
> 
> Is it possible to adjust the default (BIOS) fan curve on a GTX 1080 Hybrid FTW card? It starts at 56% and at that RPM the fan makes a weird loud noise, anything above or below it its fine. Also, are there any How-To on updating or modifying the BIOS on these cards?


Simple answer: No.

We can't edit or make changes to the BIOS of these cards.

What you can do to solve your problem; use a program like MSI Afterburner and create a custom fan curve.


----------



## Jackl2

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Simple answer: No.
> 
> We can't edit or make changes to the BIOS of these cards.
> 
> What you can do to solve your problem; use a program like MSI Afterburner and create a custom fan curve.


I did a custom fan curve, but it only works as long as I'm running Windows. Sometimes I use Linux or a bootable CD, and that's where it becomes really annoying.


----------



## ZhopkaPopka

Has anyone bothered with *undervolting*? I got my ASUS Strix Advanced 1080 to run at 0.9V at 1949MHz. Stock voltage was 1.063V.

Lowering the voltage dropped my temps by 10°C (it's now 63°C at peak; it was 73°C) and power draw by 12%.

If you've undervolted, please post your results.

I have two questions:
1) Why does my Strix Advanced run 1949MHz at stock, when the Advanced version should only boost to 1835MHz? It's not the OC version.
2) I set the curve in MSI Afterburner to 1960MHz at 0.9V, but it holds 1960 only for the first few seconds after a test begins, then sits at 1949MHz the whole time. If I increase the voltage, nothing changes; still 1949.
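As a sanity check on those numbers: to first order, dynamic (switching) power in CMOS scales with f·V², so holding the same clock while dropping from 1.063V to 0.9V should cut *dynamic* power by roughly 28%. Seeing only about -12% at the card is still plausible, since fans, memory, and leakage don't scale with core voltage the same way. A back-of-the-envelope sketch of the model (not measured data):

```python
# First-order CMOS dynamic power model: P_dyn is proportional to f * V^2.
# The clock is unchanged in this undervolt, so only the voltage term matters.
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    return (f_new / f_old) * (v_new / v_old) ** 2

ratio = dynamic_power_ratio(0.900, 1.063)  # 1949 MHz in both cases
print(f"dynamic power at 0.9 V: {ratio:.1%} of stock")  # about 71.7%
```

The gap between the ~28% predicted dynamic saving and the measured 12% total is roughly the share of board power that isn't core switching power.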


----------



## h2spartan

Anyone know if the Strix BIOS will work on the EVGA Hybrid FTW? Has anyone tried it?

Should I try it, lol?


----------



## ZhopkaPopka

Quote:


> Originally Posted by *h2spartan*
> 
> Anyone know if the Strix bios will work on EVGA hybrid ftw? Anyone try it?
> 
> Should I try it lol?


Why do you want to try it?


----------



## h2spartan

Quote:


> Originally Posted by *ZhopkaPopka*
> 
> Why do you want to try it?


I've read that you can unlock voltage to 1.2V with that BIOS.


----------



## h2spartan

Isn't the Strix OC 1080 a non-reference design? If people are having success flashing reference 1080s with its BIOS, is there a good chance it will work on my EVGA 1080 FTW Hybrid? Thanks for any help.

And honestly, if not, I'm happy with my card. It's pulling 2100 stable at 1.08V.


----------



## hertz9753

So why do you want to change the bios?


----------



## h2spartan

This is a comment from the EVGA forums... not sure how accurate it is, but he seems to believe the "T4 BIOS" is compatible with the FTW. Honestly, I hope it is; I would like to get more out of my card.
Quote:


> T4 BIOS is NOT compatible with Classified. It IS however compatible with FTW.
> 
> To correct this >>>> A new Classified 1080 voltage tool was released a few days ago.
> 
> Nothing dramatic, you're still hard-limited to 1.25v (so quite safe).
> 
> However before this release, we were BIOS limited to 1.093v. Most classified owners should at least now get 2200 on water. Some may need "chilled" water depending on Silicon Lottery.
> 
> This is currently the ONLY software option for Classified 1080 voltage (Classified is NOT compatible with the ASUS T4 BIOS). So this release has corrected a terrible disadvantage that the Classified had this round when compared against the FTW.
> 
> FTW owners could flash the ASUS T4 BIOS. Seems the voltage controllers on STRIX/FTW are similar enough for the ASUS BIOS to "talk to" FTW cards. Be careful though.


----------



## hertz9753

I joined the EVGA Forum in 2003 but I'm still called new because I don't post very often. Do you have a link?


----------



## h2spartan

Quote:


> Originally Posted by *hertz9753*
> 
> I joined the EVGA Forum in 2003 but I'm still called new because I don't post very often. Do you have a link?


Oh yah. Here ya go:

https://forums.evga.com/m/tm.aspx?m=2548379&p=6


----------



## hertz9753

@nrpeyton is also a member here. You can send him a PM and talk about the bios.


----------



## h2spartan

Quote:


> Originally Posted by *hertz9753*
> 
> @nrpeyton is also a member here. You can send him a PM and talk about the bios.


Thank you so much. I'm definitely going to do that before I attempt it and end up bricking my card.

+1 rep to you sir!


----------



## donkidonki

Well, today I swapped out my old EVGA GTX 780 Tis (with EK water blocks) for a pair of MSI GTX 1080 Seahawk X EKs.

All went well, but Jesus, those MSIs are huge! Half again the size of the old cards; they only just fit in my case. Was a bit of a surprise is all.


----------



## Jackl2

Quote:


> Originally Posted by *h2spartan*
> 
> Isnt the strix oc 1080 a non reference design? If people are having success flashing reference 1080s with its bios, is there a good chance it will work on my evga 1080 ftw hybrid? Thanks for any help.
> 
> And honestly, if not, im happy with my card. Its pulling 2100 stable at 1.08v


Quote:


> Originally Posted by *h2spartan*
> 
> Thank you so much. I'm definitely going to do that before I attempt it and end up bricking my card.
> 
> +1 rep to you sir!


I have the exact same card as you, and I have the same goal as you as well, so I'm very interested!

Please do keep this forum posted with your results.

Question for you: does the fan on the card itself (not the radiator) make a humming noise for you at idle speeds (around 56%, I believe)? Mine does; the only way to stop it is to either run the fan faster (through XOC) or slower (slowing it down manually for testing).


----------



## nick779

So, I think I'm going to buy a 1080 to replace my 780 Ti. Question is, which one? I've had decent luck with EVGA, but my ACX v1 cooler is pretty loud and the fans almost rattle.

Is the iCX cooler on the SC2 a good match for performance and quiet, or should I go for another brand? (I have little to no intention of overclocking; I just want a nice factory OC card.)

What would you guys suggest? Trying to keep it around $550, and I don't really want a hybrid.


----------



## h2spartan

Quote:


> Originally Posted by *Jackl2*
> 
> I have the exact same card as you, and I am very interested as the same goal as you as well!
> 
> please do keep this forum posted with your results.
> 
> Question for you: does the fan on the card itself (not the radiator) make a humming noise for you at idle speeds (around 56% I believe)? Mine does, only way to stop it is to either run the fan faster (through XOC) or slower (slowing it down manually for testing).


Okay, will do!

As far as the humming noise, I'm not getting anything like that. Might be a fan issue with yours. You might consider an RMA unless it doesn't bother you that much.


----------



## bl4ckdot

Quote:


> Originally Posted by *nick779*
> 
> So, I think im going to buy a 1080 to replace my 780ti. Question is, which one? Ive had decent luck with EVGA, but my ACX v1 cooler is pretty loud and the fans almost like rattle.
> 
> Is the iCX cooler on the SC2 a pretty good matchup for performance and quiet? or should I go for another brand? (I have little to no intentions of overclocking, just want a nice factory OC card)
> 
> What would you guys suggest? Trying to keep it around $550, and I dont really want a hybrid.


I would recommend the Gainward Phoenix


----------



## Jackl2

Quote:


> Originally Posted by *h2spartan*
> 
> Okay, will do!
> 
> As far as the humming noise, I'm not getting anything like that. Might be a fan issue with yours. You might consider an RMA unless it doesn't bother you that much.


It drives me crazy, lol.

A replacement is on the way. What's ironic is that buying a full water block is more expensive than getting the Hybrid FTW version. Plus, a separate radiator for the GPU won't pump hot coolant from the CPU to the GPU or vice versa... kept separate, they run much cooler and independently of each other.


----------



## TUFinside

Quote:


> Originally Posted by *ZhopkaPopka*
> 
> Has anyone bothered with *undervolting*? I got my ASUS Strix Advanced 1080 to run at 0.9V at 1949MHz. Stock voltage was 1.063V.
> 
> Lowering the voltage dropped my temps by 10°C (it's now 63°C at peak; it was 73°C) and power draw by 12%.
> 
> If you've undervolted, please post your results.
> 
> I have two questions:
> 1) Why does my Strix Advanced run 1949MHz at stock, when the Advanced version should only boost to 1835MHz? It's not the OC version.
> 2) I set the curve in MSI Afterburner to 1960MHz at 0.9V, but it holds 1960 only for the first few seconds after a test begins, then sits at 1949MHz the whole time. If I increase the voltage, nothing changes; still 1949.


How do you do that? I'm confused by the curve.


----------



## SpykeZ

So I did a search in this thread, and all the BIOS flashing etc. for voltage unlocking is in reference to the ASUS cards.

I've got a Gigabyte G1 Gaming 1080; is there any way to unlock the voltage on this one?


----------



## Beagle Box

Quote:


> Originally Posted by *nick779*
> 
> So, I think im going to buy a 1080 to replace my 780ti. Question is, which one? Ive had decent luck with EVGA, but my ACX v1 cooler is pretty loud and the fans almost like rattle.
> 
> Is the iCX cooler on the SC2 a pretty good matchup for performance and quiet? or should I go for another brand? (I have little to no intentions of overclocking, just want a nice factory OC card)
> 
> What would you guys suggest? Trying to keep it around $550, and I dont really want a hybrid.


Don't know the current market price, but my MSI Gaming X runs very well, and it's one of the quietest you'll find. The fans sit idle until 60°C, and when spinning they're almost silent below 80%.


----------



## zzztopzzz

Finally got around to subscribing to this very good thread. My build has been complete since early last year, and I really haven't had the time/gumption to tweak the thing beyond bumping the CPU to 4600. Here are the results of my first Time Spy run. Where do I need to go from here?


----------



## GRABibus

deleted


----------



## GRABibus

Quote:


> Originally Posted by *SpykeZ*
> 
> So I did a search in this thread and all the bios flashing etc for voltage unlocking is in reference to the asus.
> 
> I got a Gigabyte G1 Gaming 1080, any way to unlock the voltage on this one?


The only way to unlock power and voltage (up to 1.2V) is to flash the ASUS Strix OC T4 BIOS, at your own risk...


----------



## pmachado

Quote:


> Originally Posted by *h2spartan*
> 
> This is a comment from evga forums...not sure how accurate it is but, he seems to believe the "T4 bios" is compatible with FTW. Honestly i hope it is, i would like to get more out of it.


I can confirm the T4 BIOS works on the FTW. I'm using it on mine in the secondary/slave BIOS position without any issues. Hope that helps.


----------



## schoolofmonkey

I got a question for you guys.

I'm going to be looking at a Ryzen build next month; currently I'm using a GTX 1080 Strix (2020MHz) and a [email protected] Benq.
Now, to overcome that little 1080p Ryzen quirk, I was going to get a Benq [email protected]

Do you think the GTX 1080 will be OK for that until Volta drops, or should I grab a GTX 1080 Ti at the same time as the upgrade?


----------



## 6u4rdi4n

Try it and see for yourself.

The GTX 1080 is a very capable card, but we all want more and more performance, eh? I'm currently using a GTX 1080 with an Asus [email protected] Most games run very well at max settings, but some, like Battlefield 1, need a few settings lowered if I want minimum FPS higher than 80-90.

But like I said, try it yourself. How much time do you "lose" by doing that? Keep the GTX 1080 with the new build, try it out and see how it performs. If it's not satisfactory, buy a GTX 1080 Ti.


----------



## Jackl2

Quote:


> Originally Posted by *pmachado*
> 
> I can confirm the T4 bios works on the FTW. Im using it on mine on the secondary/slave bios position without any issues. Hope that helps.


Can you write a step-by-step how-to, if it's not too time consuming for you?


----------



## pmachado

Quote:


> Originally Posted by *Jackl2*
> 
> Can you write a step-by-step how-to, if it's not too time consuming for you?


I followed the instructions from this thread:

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti

Before flashing, switch to the 2nd BIOS position, and as always, back up your current BIOS before flashing.
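For reference, the procedure in guides like that one usually boils down to two nvflash commands. This is a sketch from memory of common nvflash usage, not a transcript of that exact guide; `t4.rom` stands in for whatever you named the downloaded Strix T4 image, and on some driver versions you may also need to disable/re-enable the card in Device Manager. Flash entirely at your own risk:

```shell
# Run from an elevated command prompt in the folder containing nvflash.

# 1. Back up the card's current BIOS first; keep this file safe.
nvflash64 --save backup.rom

# 2. Flash the T4 image. The -6 flag overrides the PCI subsystem ID
#    mismatch check, which trips because the BIOS is from a different
#    board (Strix vs FTW).
nvflash64 -6 t4.rom
```

With a dual-BIOS card like the FTW, doing this on the secondary position (as pmachado describes) leaves the stock BIOS intact as a fallback.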


----------



## Rhadamanthis

This is the BIOS from the Strix 08G 11Gbps; has anyone tried it?


----------



## AllGamer

*Help!*

So I was installing an EK water block and backplate on my GTX 1080 (Founders Edition), and I accidentally shattered this ceramic resistor, or whatever it was supposed to be, and that is the problem.

I don't know what that little piece was.

Does anyone with a GTX 1080 (Founders Edition) know which part is broken, and hopefully where to find a spare piece for sale?

Circled in red:

I believe it's part C503?

Anyone know where I can order one?

The card will not work with it cracked; I tried to power it up already and got nothing.


----------



## virpz

Quote:


> Originally Posted by *AllGamer*
> 
> *Help!*
> 
> so I was installing an EK water block and Backplate for my GTX 1080 (Founders Edition) and I accidentally shattered this ceramic resistor or whatever it was supposed to be, and that is the problem.
> 
> I don't know what was that little piece.
> 
> Does anyone have a GTX 1080 (Founders Edition) and let me know which part is broken, and hopefully find a spare piece for sale somewhere.
> 
> circled in red
> 
> 
> 
> 
> 
> 
> 
> I believe it's part C503 ?
> 
> Anyone know where I can order one?
> 
> The card will not work with that cracked, I tried to power it up already and nothing.
> 
> .


Capacitor it is. Now you need to find its value.


----------



## AllGamer

Quote:


> Originally Posted by *virpz*
> 
> Capacitor it is. Now you need to find it's value.


Does anyone know the value of that "brown"(?) capacitor? (It seems brown, but it might be another shade, like olive, maybe?)


----------



## hertz9753

I wouldn't try that replacement, and I don't think you should. Why don't you send the card in for RMA? I understand that you broke the card, but you are not going to be able to fix it.


----------



## AllGamer

Quote:


> Originally Posted by *hertz9753*
> 
> I wouldn't try that replacement and I don't think you should. Why don't you send the card in for RMA? I understand that you broke the card but you are not going to able to fix it.


To fix it is not that hard; just desolder the broken part and solder the new one back on.

The hardest problem is finding the correct value of the broken part.

Back in the day it was a simple thing to walk into Radio Shack and walk back out with the parts you needed; now it's ridiculously difficult to find any hobby shop with electronic parts for repairing easy-to-fix stuff.

Even a broken TV nowadays just gets tossed so people can buy a new one.

BTW, I used to repair TVs and radios, until the profession became obsolete with the low prices of mass-manufactured products.

So much electronics gear that could easily be fixed or recycled is being tossed away.

I could send it in for RMA; I guess I'll give that a try.

Finding that little piece out in the wild these days is proving to be quite a challenge.

Even regular electronics shops don't carry much of this small stuff anymore, because... no market.

I used to have so much fun back in the day building my own DIY radios/TVs/RC cars/etc.

This is going to be some seriously lost knowledge, lost tech, in the future, when the world is reduced to just factories and consumers.

Just like how people in the city don't know how to farm anymore; ask kids where food comes from and they'll say the supermarket.


----------



## virpz

Quote:


> Originally Posted by *AllGamer*
> 
> to fix it, it's not that hard, just unsolder and solder the new part back on.
> 
> the hardest problem is to find the correct value of the broken part.
> 
> back in the days it was a simple thing to walk into Radio Shack and back out with the parts you need, now, it's ridiculously difficult to find any hobby shop with electronics parts to replace easy to fix stuff.
> 
> even a broken TV now in days just toss it away and buys a new one.
> 
> BTW, I used to repair TV and radios in the past, until the profession became obsolete with the low prices of mass manufactured products.
> 
> so much electronics stuff are being tossed away, that could easily be fixed / recycled.
> 
> I could send it in for RMA, I guess I'll give that a try,
> 
> finding that little piece out in the wild now in days it's proving to be quite a challenge.
> 
> Even regular Electric shops they don't carry as much of these little stuff anymore because... no market.
> 
> I used to have so much fun back in the days building my own DIY radios/TVs/RC cars/etc.
> 
> This is are going to be some serious lost knowledge, lost tech in the future when the world is reduced to just factory and consumers.
> 
> Just like how people in the city doesn't know how to farm anymore, you ask kids where do the food comes from? they'll say the supermarket.


Go for RMA.
You are clearly not versed in electronics; you couldn't tell a cap from a resistor.


----------



## Dasboogieman

Quote:


> Originally Posted by *AllGamer*
> 
> to fix it, it's not that hard, just unsolder and solder the new part back on.
> 
> the hardest problem is to find the correct value of the broken part.
> 
> back in the days it was a simple thing to walk into Radio Shack and back out with the parts you need, now, it's ridiculously difficult to find any hobby shop with electronics parts to replace easy to fix stuff.
> 
> even a broken TV now in days just toss it away and buys a new one.
> 
> BTW, I used to repair TV and radios in the past, until the profession became obsolete with the low prices of mass manufactured products.
> 
> so much electronics stuff are being tossed away, that could easily be fixed / recycled.
> 
> I could send it in for RMA, I guess I'll give that a try,
> 
> finding that little piece out in the wild now in days it's proving to be quite a challenge.
> 
> Even regular Electric shops they don't carry as much of these little stuff anymore because... no market.
> 
> I used to have so much fun back in the days building my own DIY radios/TVs/RC cars/etc.
> 
> This is are going to be some serious lost knowledge, lost tech in the future when the world is reduced to just factory and consumers.
> 
> Just like how people in the city doesn't know how to farm anymore, you ask kids where do the food comes from? they'll say the supermarket.


That's a ceramic capacitor. To my knowledge, there's no real harm if the replacement capacitor has a higher voltage rating; the problem is making sure the new one isn't rated lower. Additionally, I don't think these ceramic caps are polarized, so soldering should be a little simpler.

According to this: http://www.wikihow.com/Read-a-Capacitor
C503 would mean ceramic, 0.05 uF.

Check Mouser Electronics or Digikey.
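One caveat worth adding: "C503" printed on the PCB silkscreen is normally just the board's reference designator (capacitor number 503), not a value marking, so the 0.05 µF figure is a guess rather than a certainty. The three-digit value code itself, when a part actually carries one, decodes like this (a small illustration of the convention from that wikiHow page):

```python
# Standard 3-digit capacitor marking: the first two digits are the
# significand and the third digit is a power-of-ten multiplier, in pF.
def cap_code_to_pf(code: str) -> int:
    return int(code[:2]) * 10 ** int(code[2])

print(cap_code_to_pf("503"))  # 50000 pF = 50 nF = 0.05 uF
print(cap_code_to_pf("104"))  # 100000 pF = 0.1 uF
```

If the card goes back for RMA, none of this matters, but it shows where the 0.05 µF reading comes from, and why it may not apply to a bare designator.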


----------



## AllGamer

Quote:


> Originally Posted by *Dasboogieman*
> 
> Thats a ceramic capacitor, to my knowledge, theres no real harm if the replacement capacitor has a higher rating, the problem is making sure the new one isn't rated lower. Additionally, I don't think these ceramic caps are polar so soldering should be a little simpler.
> 
> According to this: http://www.wikihow.com/Read-a-Capacitor
> C503 means Ceramic, 0.05 uF rating.
> 
> Check Mouser Electronics or Digikey.


Digikey seems to have it, but they will not sell a single one; it needs a larger quantity order. (Expected.)

Mouser Electronics is even crazier: a minimum order of 2000 and up...

I don't own an electronics factory, otherwise I might consider it. LOL

This is the sad thing about Radio Shack getting out of the electronics hobby business: it's so hard to find small quantities of spare parts.

Well, I guess I'll open an RMA ticket and send the thing back in for repair. I doubt I'll be able to find that replacement piece as a single unit.

Thanks for the links.

Thanks for the links.


----------



## nrpeyton

Last night I made a really stupid mistake.

One I thought was absolutely impossible to make.

Well, hell knows, but I did it.

I forgot to screw the plugs into the unused ports on my new GPU water block. (It was 5am; I'd been up all night and was tired.)

Switched the system on (not just the pump), and you get the picture.

Water shot out of the port, soaking the VRM and CPU area on the motherboard. (Luckily the CPU is covered by an EK Supremacy block.)

I dived for the off switch the moment I realised!

Dried everything I could see with paper towels.

There were some bits I could see but couldn't reach, which are dry now (13 hours later). I've left the central heating on in the house (the rest of the night and all day today). It's 6pm now.

There doesn't appear to be any visible damage. I never heard any pops or strange noises.

Surprisingly, the GPU itself looked dry (except the PCI-E slot, which is now dry also).

I read that distilled water isn't conductive. (Nor is the coolant.)

Hopefully no damage.

Think it's safe to try it? (Everything APPEARS to have been dry for some time now.)

I certainly won't ever make this mistake again! But it sure proves it can happen to anyone!

When I had my old card I was always taking the block off (to perfect the pad arrangement, re-paste, run extreme cooling sessions, etc.). I did it so many times, and the plugs never had to be touched except the first time!
Last night I must have been on auto-pilot!

Anyway, does anyone have an experience to share? I'm praying to God I haven't destroyed £1500 worth of computer equipment.


----------



## Yukss

Quote:


> Originally Posted by *nrpeyton*
> 
> Last night I made a really stupid mistake.
> 
> One I thought it is absolutely impossible to make.
> 
> Well. Hell knows. But I done it.
> 
> Forgot to screw the plugs into unused ports on new GPU waterblock. (It was 5am--i'd been up all night & was tired).
> 
> Switched system on (not just the pump). and u get the picture.
> 
> Water shooted out the port, getting wet the VRM and CPU area on the motherboard. (Luckily cpu is covered by an Ek Supremacy block).
> 
> I dived for the 'off switch' the moment i realised!
> 
> Dried everything I could see with paper towels.
> 
> There were some bits I could see but couldn't reach which are dry now. (13hrs later) I left the central heating on in the house (rest of night & all day today). Its 6pm now.
> 
> There doesn't appear to be any visible damage. Never heard any pops or strange noises.
> 
> Surprisingly the GPU its self looked dry (except PCI-E lane which is now dry also)
> 
> I read that distilled water isn't conductive. (Nor is the coolant).
> 
> Hopefully no damage.
> 
> Think its safe to try it? (Everything APPEARS to have been dry for some time now).


Please confirm: were you using distilled water? If so, and after all this time everything looks dry, you can also use a hair dryer on medium heat, blow everything out for approximately 20 minutes, and power on your system. You should also check your battery: take it out and check for moisture and residual water in there.


----------



## wholeeo

Quote:


> Originally Posted by *nrpeyton*
> 
> Last night I made a really stupid mistake.
> 
> One I thought it is absolutely impossible to make.
> 
> Well. Hell knows. But I done it.
> 
> Forgot to screw the plugs into unused ports on new GPU waterblock. (It was 5am--i'd been up all night & was tired).
> 
> Switched system on (not just the pump). and u get the picture.
> 
> Water shooted out the port, getting wet the VRM and CPU area on the motherboard. (Luckily cpu is covered by an Ek Supremacy block).
> 
> I dived for the 'off switch' the moment i realised!
> 
> Dried everything I could see with paper towels.
> 
> There were some bits I could see but couldn't reach which are dry now. (13hrs later) I left the central heating on in the house (rest of night & all day today). Its 6pm now.
> 
> There doesn't appear to be any visible damage. Never heard any pops or strange noises.
> 
> Surprisingly the GPU its self looked dry (except PCI-E lane which is now dry also)
> 
> I read that distilled water isn't conductive. (Nor is the coolant).
> 
> Hopefully no damage.
> 
> Think its safe to try it? (Everything APPEARS to have been dry for some time now).
> 
> I certainly wont ever make this mistake again! But it sure proves it can happen to anyone!.
> 
> When i had my old card i was always taking block off to perfect pad arrangement/re-paste/extreme cooling session) etc etc. I done it so many times. And the plugs never had to be touched except the first time!
> Last night i must have been on auto-pilot!.
> 
> Anyway--anyone have an experience share? I'm praying to god I haven't destroyed £1500 worth of computer equipment.


This reminds me of a video I was watching the other day... lol




@ 6:20


----------



## nrpeyton

Thanks for reply.

Yes, I can confirm it was distilled water & EK coolant only.


----------



## Derek1

Quote:


> Originally Posted by *nrpeyton*
> 
> Thanks for reply.
> 
> Yes. I can confirm it was distilled water & EK coolant. Only.


Damn Nick, alllllll that work you put into your Classy, and now I see you jumped to a Ti?
I would have thought you would have at least waited for the KPE.


----------



## Yukss

Quote:


> Originally Posted by *nrpeyton*
> 
> Thanks for reply.
> 
> Yes. I can confirm it was distilled water & EK coolant. Only.


So then you will not have any problems; just power on the system. Just remember to check the battery housing first.


----------



## nrpeyton

Quote:


> Originally Posted by *Derek1*
> 
> Damn Nick, alllllll that work you put into your Classy and now I see you jumped to a Ti?
> I would have thought you would have at least waitied for the KPE.


Lol I got the Ti as a free step up from the Classy.

How u doing mate.

Quote:


> Originally Posted by *Yukss*
> 
> so then you will not have any problems. just power on the system. just remember to check the battery housing first .


I took the CPU out. It didn't appear to be wet itself, but the Asus CPU tool it sits in was wet, and a bit of the block.
And this is 15 hrs after it happened.

So that's not good.

I've just put my CPU and mobo into the oven at 50°C.

Don't want to take any chances.

The 1080 Ti never got very wet at all, but as a precaution I've got it on the radiator. I've left the block on it; don't want to damage the thermal pads.


----------



## Yukss

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol I got the Ti as a free step up from the Classy.
> 
> How u doing mate.
> I took the CPU out. It didnt appear to be wet its self. But the Asus CPU tool it sits in was wet. And a bit of the block.
> And this is 15hrs since it happened.
> 
> So thats not good.
> 
> Av just put my CPU and mobo into the oven at 50 c.
> 
> Don't want to take any chances.
> 
> The 1080Ti never got very wet at all. But as a precaution I've got it on the radiator. I've left the block on it. Don't want to damage the thermal pads.


a hair dryer will be more than enough


----------



## nrpeyton

Quote:


> Originally Posted by *Yukss*
> 
> a hair dryer will be more than enough


If I had one I'd use it. 

Thought I'd try the oven first before having to call my sister. Lol

It's only at 50°C.

And it's fan assisted.

The VRM heatsinks got very hot in there, but I was still able to touch them for a few seconds. And VRMs can do 125°C, so it should be fine.

Am hoping this will let any residual water that is stuck evaporate (any I can't see).

What should have been a first-time shunt mod & then a simple block install has turned into a two-day nightmare.

I'm in the process of rebuilding now.

I'm going to have to test with the iGPU first before trying with the GPU in the system, as the GPU is still shunt modded from last night, and the accident happened while (before) I was preparing for the first power-on test with the new shunt mod.


----------



## philhalo66

Quote:


> Originally Posted by *nrpeyton*
> 
> If i had one i'd use it.
> 
> Thought I'd try the oven first before having to call my sister. Lol
> 
> Its only at 50c.
> 
> And its fan assisted.
> 
> The VRM heatsinks got very hot in there. But i was still able to touch them for a few seconds. And VRM can do 125c. So it should be fine.
> 
> Am hoping this will let any residual water that is stuck evaporate. (Any i can't see).
> 
> What should have been a 1st time shunt mod & then simple block install has turned into a 2 day nigntmare.
> 
> I'm in the process of rebuilding now.
> 
> I'm going to to have to test with the iGPU first before trying with the GPU in system. As the GPU is still shunt modded from last night. And the accident happened while (before) i was preparing for first power-on test with new shunt mod.


When a friend dropped his phone in a lake we put it in a bag of rice for 3 days; it was bone dry after and worked fine. Try that; rice is cheap, so get a big bag and put the card in the middle of it.


----------



## nrpeyton

Quote:


> Originally Posted by *philhalo66*
> 
> when a friend dropped his phone in a lake we put it in a bag of rice for 3 days and it was bone dry after and worked fine, try that rice is cheap so get a big bag and put the card in the middle of the bag.


Card appears to be working perfectly 

So is my new shunt mod too 

Only 'reporting' 194 watts in The Witcher 3 at 4K.

1080 Ti at 1.093V, 2100MHz.

In the same scene before the shunt mod, I was at 0.9V / 1833MHz and still at 300 watts (300 is the limit on a 1080 Ti FE).

Goodbye, power throttling
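For anyone wondering why the card only 'reports' less after a shunt mod: the monitoring controller infers current from the voltage drop across a known shunt resistor, so lowering the effective shunt resistance makes it under-read current (and therefore power). A quick sketch of the arithmetic (the resistance values are illustrative, not the 1080 Ti's actual shunts):

```python
def reported_power(actual_power_w, r_stock_mohm, r_modded_mohm):
    """Power the controller reports after a shunt mod.

    The monitoring IC still assumes the stock shunt resistance, so when
    the effective resistance is lowered it under-reads current (and
    power) by the ratio r_modded / r_stock.
    """
    return actual_power_w * (r_modded_mohm / r_stock_mohm)

# Stacking an equal-value shunt halves the effective resistance, so a
# real 388 W draw would read as 194 W -- which is how a modded card can
# show "194 W" while sitting well past the 300 W FE limit.
print(reported_power(388, r_stock_mohm=5.0, r_modded_mohm=2.5))  # 194.0
```

The flip side, of course, is that every power readout (GPU-Z, Afterburner) is off by the same ratio afterwards, so actual draw has to be measured at the wall.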


----------



## xOverClocked

Hi guys, kinda new to GPU overclocking, but atm I can't seem to get past 0.95V while overclocking. TDP and temp sit at around 65~75% and 50°C while running Fire Strike (GPU-Z logs). I'm running a Gigabyte Xtreme Gaming card, and with EVGA Precision X I can raise the clock all the way to 2.6GHz, but the voltage always stays the same at 0.95V (using the one-bar method in manual overclock); if I try to go over 0.95V it just defaults to factory OC speeds. I'm totally confused as to what is happening, as the numbers don't make sense.

http://www.3dmark.com/fs/12795726 <~ 2582 MHz, but the graphics score doesn't match up with it.
Would the shunt mod enable me to go over 0.95V? I haven't fully looked into what it actually does, but I'd rather not void my warranty if it won't help me.
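The mismatched numbers are typical of GPU Boost 3.0 behaviour: the one-bar offset shifts the whole factory voltage/frequency curve, but the operating point is still capped at the highest voltage bin the BIOS allows, so a huge slider value past that cap changes nothing real. A toy model (the curve points and cap are invented for illustration, not the Xtreme Gaming's actual table):

```python
# Toy model of GPU Boost 3.0 offset behaviour (curve values are invented).
# The driver shifts the voltage/frequency curve up by the offset, but the
# card still runs at the highest voltage bin the BIOS permits -- so the
# slider can read "2.6 GHz" while the real clock sits far lower.
VF_CURVE = {0.800: 1600, 0.850: 1750, 0.900: 1850, 0.950: 1950}  # V -> MHz
V_CAP = 0.950  # highest voltage bin this BIOS will use

def effective_clock(offset_mhz):
    allowed = {v: f + offset_mhz for v, f in VF_CURVE.items() if v <= V_CAP}
    v = max(allowed)               # Boost runs the highest permitted bin
    return v, allowed[v]

print(effective_clock(100))  # (0.95, 2050): the offset lands at the 0.95 V cap
```

So a shunt mod raises the *power* ceiling, not the voltage cap; the 0.95 V limit here looks like a BIOS/driver voltage bin, which a shunt mod wouldn't touch.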


----------



## Yukss

Quote:


> Originally Posted by *xOverClocked*
> 
> Hi guys kinda new to GPU overclocking but atm, I cant seem to get pass 0.95v while overclocking. TDP and temp sits at around 65~75% and 50c while running firestrike (gpu-z logs). I'm running a Gigabtye xtreme gaming card and with evga precision x, I can rise the MHz all the way to 2.6MHz but the voltage will always stay the same at 0.95v (using one bar method in manual overclock), if I try to go over 0.95v the just defaults to factory OC speeds. I'm totally confused as to what is happening as the numbers don't make sense.
> 
> http://www.3dmark.com/fs/12795726 <~ 2582 MHz but graphic score doesn't match up with it.
> Would the shunt mod enable me to go over 0.95v? I haven't fully looked into what it actually does but would rather not void my warranty if it won't help me.


Hello, welcome to OCN. Here you will find people who can help you, but this is the wrong section; I suggest you open a new thread in the NVIDIA section. Good luck


----------



## Yukss

Quote:


> Originally Posted by *nrpeyton*
> 
> Card appears to be working perfectly
> 
> So is my new shunt mod too
> 
> Only 'reporting' 194 watts in The Witcher 3 at 4k.
> 
> 1080 Ti at 1.093v 2100 Mhz
> 
> In the same scene before the shunt mod, I was at 0.9v 1833 Mhz and still at 300 watts (300 is the limit on a 1080Ti FE)
> 
> Good bye power throttling


yaiii..


----------



## FlyingSolo

What's the best driver for the GTX 1080 now? Just ordered one; should have it in a couple of hours.


----------



## philhalo66

Anyone else notice considerable desktop lag after closing a game? I will drag a file across my primary screen and it will look like it's dropping frames or something. I also found killing Desktop Window Manager fixes it till I close out of a game again.


----------



## zzztopzzz

I have a pair of Nvidia 1080's and simply use their auto update. Both work flawlessly and I've never had a problem. Good luck with your new card!


----------



## sinholueiro

Hi! I will get my 1080 Armor tomorrow. I got it really cheap, so even if it were the FE, I would have got it anyway. I'm a bit concerned about the cooler. Under sustained load, what temps should I expect (at stock and OC'd)?


----------



## Dasboogieman

Quote:


> Originally Posted by *sinholueiro*
> 
> Hi! I will get my 1080 Armor tomorrow. I get it really cheap, so even if it was the FE, I would get it anyways. I'm a bit concerned about the cooler. Under sustained load, what temps should I expect (at stock and at OC)?


Terrible, marginally better than the FE minus the sexy and with more irritating sound harmonics.


----------



## sinholueiro

Quote:


> Originally Posted by *Dasboogieman*
> 
> Terrible, marginally better than the FE minus the sexy and with more irritating sound harmonics.


Well, the temps and noise of the FE are really, really bad. From what I saw, the 1070 Armor is pretty decent in terms of temps and noise. Is the 1080 so much worse? In theory, the 1080 shouldn't consume much more power, so the amount of heat to dissipate should be only a little higher. How can it be so much worse?


----------



## Dasboogieman

Quote:


> Originally Posted by *sinholueiro*
> 
> Well, the temps and noise of the FE are really really bad. From what I saw, the 1070 Armor is pretty decent in terms of temps and noise. Is the 1080 so much worse? In theory, the 1080 shouldn't consume much more power, so the amount of heat to disipate has to be only a little higher. How can be so much worse?


The 1080s do consume a fair bit more power than the 1070s; they have 25% more active chip, after all. Additionally, the 1070 has a strong heat dissipation advantage due to more of the die being dead silicon. From what I've observed, it's not so much the raw noise of the Armor cooler that's irritating but the harmonics. It has this annoying high-pitched screech, at least on the 1080 Ti model.


----------



## PCGuy 5960

Quote:


> Originally Posted by *sinholueiro*
> 
> Hi! I will get my 1080 Armor tomorrow. I get it really cheap, so even if it was the FE, I would get it anyways. I'm a bit concerned about the cooler. Under sustained load, what temps should I expect (at stock and at OC)?


Get a GTX 1080 G1 Gaming instead, it runs extremely cool (68-70C under load) and it also has RGB LEDs


----------



## ZealotKi11er

Finally, after 3 months of owning a GTX 1080 FE, I decided to do some gaming. I thought my +225/+500 OC was stable, but Battlefront keeps crashing and dropping to desktop. Tried +175/+0 and it still happens. Now I am thinking of getting a custom 1080 Ti, but man, I hate overclocking these Nvidia GPUs. Different clocks, different voltage, power target: way too many variables.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Finally after 3 months of owning GTX 1080 FE I decided to do some gaming. I though my +225/+500 OC was stable but I Battlefront keeps crashing and dropping to desktop. Tried +175/+0 and still happens. Now I am thinking of getting a custom 1080 Ti but man I hate overclocking these Nvidia GPUs. Different MHz different Voltage, Power Target, way too many variables.


Wait till you find out Pascal has a HW monitoring unit inside the die that trims performance dynamically, independent of the BIOS or any external PCB factors. For example, 1.062V / 2100MHz at 85°C yields lower scores than 1.062V / 2100MHz at 30°C. Same clock speed, same voltage, different results that can only be observed by comparing benchmark scores.
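To put a number on that, here's a toy model of the thermal trimming (the ~1% per 10°C figure is an illustrative guess, not an NVIDIA spec):

```python
def effective_perf(clock_mhz, temp_c, trim_per_10c=0.01, ref_temp_c=30):
    """Toy model of Pascal's temperature-dependent performance trimming.

    Assumes the hardware sheds ~1% of delivered throughput per 10 C above
    a reference temperature, even at a fixed clock and voltage. These
    numbers are illustrative guesses, not measured values.
    """
    trim = max(0.0, (temp_c - ref_temp_c) / 10 * trim_per_10c)
    return clock_mhz * (1 - trim)

print(effective_perf(2100, 30))  # 2100.0 -- full throughput when cool
print(effective_perf(2100, 85))  # 1984.5 -- same clock, less delivered perf
```

Which is exactly why keeping Pascal cold pays off in benchmarks even when the monitoring tools show identical clocks.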


----------



## sinholueiro

Quote:


> Originally Posted by *PCGuy 5960*
> 
> Get a GTX 1080 G1 Gaming instead, it runs extremely cool (68-70C under load) and it also has RGB LEDs


I'm upgrading from the RX480 for 115€ (not that I'm selling the 480 for much; rather, I got the 1080 very cheap). I would have done that even for the FE!








I think I will see how the temps are under load and decide then what to do. If I buy a GPU bracket to mount a simple AIO (the cheapest 120mm one), what temps should I expect?


----------



## ZealotKi11er

Quote:


> Originally Posted by *sinholueiro*
> 
> I'm upgrading from the RX480 for 115€ (not that I sell the 480 for much, but instead I got the 1080 very cheap). I would have done that even for the FE!
> 
> 
> 
> 
> 
> 
> 
> 
> I think that I will see how the temps are in load ad decide then what to do. If I buy a GPU bracket to put a simple AIO (the cheapest 120mm one), what temps should I expect?


Low 50s.


----------



## MK-Professor

Generally, what is the average OC I should expect from a GTX 1080, and more specifically from a card like the MSI Armor?


----------



## ZealotKi11er

Quote:


> Originally Posted by *MK-Professor*
> 
> Generally what is the average OC I should expect from GTX1080 and more specifically from a gpu like the MSI Armor?


2050+ MHz, depending on the cooler.


----------



## PCGuy 5960

Quote:


> Originally Posted by *sinholueiro*
> 
> If I buy a GPU bracket to put a simple AIO (the cheapest 120mm one), what temps should I expect?


This should help


----------



## philhalo66

Quote:


> Originally Posted by *PCGuy 5960*
> 
> This should help


Removing those plates like he did and running the card bare-PCB will void any warranty, even from EVGA. That's just asking for trouble.


----------



## ZealotKi11er

Quote:


> Originally Posted by *philhalo66*
> 
> removing those plates like he did and running the card bare PCB will void any warranty even from EVGA. that's just asking for trouble.


I do not know what he removed, but you can remove the cooler on an EVGA card. I just contacted them and they are fine with it; they just want to know if the card has been physically opened.


----------



## hertz9753

EVGA uses a mid-plate under the cooler, and I never pulled one off when adding an AIO because it's there for a reason. Many people have pulled the mid-plate, added tiny heatsinks, and killed the card when they tried to pull them off before selling.


----------



## EDK-TheONE

Is this graphics score real?


----------



## andydabeast

The last Nvidia card I owned was the GTX 460 with 768MB of VRAM! Vega is too late getting here, so tomorrow I will be back on team green with a Gigabyte Windforce GTX 1080! Yayyy


----------



## pmachado

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Is this graphics score real?


Well, it says invalid score for that run, but the graphics score looks about right. I get around the same with my GTX 1080.


----------



## EDK-TheONE

Quote:


> Originally Posted by *pmachado*
> 
> Well it says invalid score for that run but the graphics score looks about right. I get around the same with my GTX 1080.


Could you post your result?


----------



## Elboy0001

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Is this graphics score real?


I get 25900 in Fire Strike and 6000 in Fire Strike Ultra with drivers 382.33.

With 382.53 I get 25200, and 5950 in Ultra,
and 8450 (graphics) in Time Spy (2114/5800 and TDP unlock).


----------



## Vellinious

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Is this graphics score real?


Looks about right for a mildly overclocked 1080.


----------



## panosxidis

Hello guys, I have one question: the other day I sold my Gigabyte Xtreme Gaming 980 Ti SLI setup and moved to 1080 SLI Xtreme Gaming. How much faster is it than the 980 Tis, and what about the watts? Thanks, and sorry for my English.


----------



## pmachado

Quote:


> Originally Posted by *EDK-TheONE*
> 
> could you post your result?


Here's mine.


----------



## Aeonyx7

My Zotac 1080 AMP gets here on Thursday









Anyone have any experience with this card?


----------



## Dante007

Quote:


> Originally Posted by *panosxidis*
> 
> Hello guys, I have one question: the other day I sold my Gigabyte Xtreme Gaming 980 Ti SLI setup and moved to 1080 SLI Xtreme Gaming. How much faster is it than the 980 Tis, and what about the watts? Thanks, and sorry for my English.


I have 2 GTX 980 Ti Xtremes overclocked @ 1530/2000 and they're equal to 2 GTX 1080s in SLI; even against 2 overclocked GTX 1080s the difference will be like 10-15%.
In watts, my 980 Ti SLI totals between 550W and 700W, depending on the game and V-sync on/off.
The GTX 1080s, from what I've tested, will be around 440W stock up to 640W overclocked.


----------



## Yukss

Quote:


> Originally Posted by *Dante007*
> 
> I have 2 GTX 980 Ti Xtremes overclocked @ 1530/2000 and they're equal to 2 GTX 1080s in SLI; even against 2 overclocked GTX 1080s the difference will be like 10-15%.
> In watts, my 980 Ti SLI totals between 550W and 700W, depending on the game and V-sync on/off.
> The GTX 1080s, from what I've tested, will be around 440W stock up to 640W overclocked.


You said it's "equal", then you said it's "10-15%", which is in fact more like 20-25%.

980 Tis OCed to death are just two GTX 1070s; they can't match two overclocked GTX 1080s.


----------



## Vellinious

I've seen some pretty highly overclocked 980 Tis put up scores that are pretty close to what I can pull on my overclocked 1080s. Granted, those are really rare 980 Tis pulling ~1630 core clocks.


----------



## Chopxsticks

Interested in the EVGA 1080 FTW, would anyone not recommend Amazon Prime for a GPU?


----------



## Vellinious

Quote:


> Originally Posted by *Chopxsticks*
> 
> Interested in the EVGA 1080 FTW, would anyone not recommend Amazon Prime for a GPU?


Nope, they should be fine to purchase from. I buy motherboards and GPUs from Amazon on occasion, when the shipping times are more beneficial than ordering from NewEgg.


----------



## tbob22

Here's my validation:
https://www.techpowerup.com/gpuz/ew649

Last nVidia card I had was a GTX 470. Decided to take advantage of the mining craze, get rid of my 390, and grab a 1080. Very happy with it so far.

A quick Fire Strike result, haven't messed with voltages or anything yet:
http://www.3dmark.com/fs/12886054


----------



## hertz9753

I didn't even know a new mining craze had happened, because the last one was bad enough. Is it still "we will pay you to process transactions, but you will never know what you are doing"?


----------



## Dasboogieman

Quote:


> Originally Posted by *hertz9753*
> 
> I didn't even know that a new mining craze happened because the last one was bad enough. Is it still, we will pay you to transactions but you will never know what you are doing?


The last craze killed the market for AMD 290s. The gamers were fighting tooth and nail with the LiteCoin homies. The only real winners were the builders on a budget who came along after the craze died, picking up cards for like $200.


----------



## zzztopzzz

Quote:


> Originally Posted by *Chopxsticks*
> 
> Interested in the EVGA 1080 FTW, would anyone not recommend Amazon Prime for a GPU?


I have an Asus GTX-1080 Ultra that is available. Original packaging, etc. Presently part of my SLI setup, and I'm considering doing something different. Warranty still in effect, pristine condition.


----------



## Dante007

Quote:


> Originally Posted by *Yukss*
> 
> you said it's "equal" then you said is "10-15%" which in fact are more like 20-25%
> 
> 980 ti oced to death are just 2 gtx 1070s , cant match two overclocked gtx 1080s


I know a lot of people don't believe it at first, but check this score.

There's no way a GTX 1070 can match a GTX 980 Ti @ 1520+.


----------



## tbob22

Quote:


> Originally Posted by *Dante007*
> 
> i know alot don't believe it at first but check this score
> 
> 
> GTX 1070 no way can get GTX980Ti @1520+


That's pretty impressive; those 980 Tis clocked really well.

My result:


Still getting used to all this Boost 3.0 stuff; it's a bit frustrating with voltages jumping all over the place. Clocks were sitting around 2100-2150MHz or so.

Improved my firestrike a bit
http://www.3dmark.com/fs/12896031

Gotta bump my chip up to 5ghz for a few benching sessions.


----------



## spddmn24

Can you SLI a 10Gbps and an 11Gbps 1080 if the RAM is running at the same speed?


----------



## coreykill99

OK, so there's a pretty slim chance anyone would know this offhand.
I have an MSI Gaming (no prefix) 1080 currently running @ 2088MHz under water. In GPU-Z I keep getting a PerfCap reason of VRel, meaning it's hitting the voltage limit on the card; it's showing 1.08V, and I thought the voltage limit on these was 1.096 volts. I can only imagine the BIOS is the limiting factor here, as with the power limit. It's not temps, as under sustained load I'm seeing around 54°C.
I happen to have the MSI Gaming Z BIOS lying around, but I haven't flashed it; before I put it under water I didn't see the need, as Afterburner was plenty. Now I'm wondering if anyone knows the chances of a different BIOS increasing the available voltage to the card, or if there's a resource I can look through that shows the differences between BIOSes (TechPowerUp seems rather ambiguous).
Or if you all think it's a waste of time and not to bother.

Or am I thinking about this wrong, due to higher-clocked cards being binned better and needing lower voltage?


----------



## chubalz

Is a Seasonic M12II-620 EVO 620W 80Plus Bronze fully modular PSU enough to power a Gigabyte GTX 1080 Ti Aorus 11GB?


----------



## lanofsong

Hello GTX 1080 owners,

We are having our monthly Foldathon from Monday the 19th to Wednesday the 21st, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

June 2017 Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726

later
lanofsong


----------



## TheJack

Just got my 1080, so I figured I'd join the club. I have the Gigabyte version with 3 fans and a backplate. It seemed like a good deal at $500 with Destiny 2 included, especially when my current/old card, a NITRO 380X, is at a similar price now (twice what I paid for it over a year ago).


----------



## hertz9753

Quote:


> Originally Posted by *TheJack*
> 
> Just got my 1080, so I figured I'd join the club. I have the Gigabyte version with 3 fans and backplate. It seemed like a good deal at $500 with Destiny 2 included, especially when my current/old card NITRO 380X is at a similar price now. (Twice what I paid for it over a year ago)


Congrats on the new GPU.







Either your real name is Jack or you like AC/DC; of course, it could be neither, or both.


----------



## Pillendreher

Hey guys,

I'm gonna sell my 390 while people still are willing to pay outrageous sums for it, and will replace it with a 1080. Now I'm wondering whether you guys can recommend a specific model. I was thinking of getting the Gainward Phoenix since it's supposed to be both cool and silent.

Any recommendations?


----------



## KingAlkaiser

Is there a general consensus on the best version of the GTX 1080 (brand and version, for aftermarket coolers)?

I was going to buy an RX 580 a while back due to the cheap price, but with the mining craze you surprisingly can't buy anything else.


----------



## sirleeofroy

Quote:


> Originally Posted by *Pillendreher*
> 
> Hey guys,
> 
> I'm gonna sell my 390 while people still are willing to pay outrageous sums for it, and will replace it with a 1080. Now I'm wondering whether you guys can recommend a specific model. I was thinking of getting the Gainward Phoenix since it's supposed to be both cool and silent.
> 
> Any recommendations?


Quote:


> Originally Posted by *KingAlkaiser*
> 
> is there a general consencus on the best version of the gtx 1080? ( brand and version for aftermarket coolers )?
> 
> was going to buy a rx580 due to cheap price back before but due to mining craze you can't buy anything else surprisingly


It would depend on whether you want to watercool or not; if watercooling is your thing, then anything based on the FE board design (usually blower-type coolers) would suit and can be had cheaply (relatively speaking!).

As for aftermarket cards, I can only speak for the Gainward Phoenix GLH, which I believe is one of the best out there for stock cooling and overall performance. Out of the box, mine boosted to 2054MHz on the stock fan profile. With some tweaks in AF and a slightly more aggressive fan curve, I'm game-stable at 2.1GHz and 11.5GHz on the memory.

The only issue for some might be the size of the card; it's pretty big!

I hope that helps


----------



## Pillendreher

Thanks for your answer man. I'm cooling with air, so any water cooling related matters are irrelevant to me.

The size of the Gainward 1080 Phoenix doesn't matter to me either; there's plenty of space in my Fractal Design Define R4.


----------



## Harrywang

I just made a purchase of a 1080, and I kind of have buyer's remorse already. This seems like one of the worst times to buy a video card, with all the mining craze. I bought my R9 280X 4 years ago for a relatively cheap price.

I wanted to get a 1070, as I'm only going to be playing at 1080p and the 1070 seems perfect for that. However, with the mining craze, the prices of 1070s are getting too high for my liking. I'm forced to buy a 1080, as the game I want to play runs total **** on my R9 280X. Anyway, I got the EVGA Hybrid FTW at a really good price. It just got back in stock, so I had to get it asap before the prices went up again. A 1070 was about $100 less. I just can't justify a 1080 over a 1070 at 1080p right now, but if anyone can make me feel better, please do. I can afford it, but I hate spending on something I don't need!

Anyway, I'm going to try to overclock this like crazy; however, I heard Pascal isn't that great at OCing compared to other cards. How much of a performance boost can I expect at 1080p when OCing? Which overclocks better, a 1070 or a 1080?
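For rough expectations: frame rates rarely scale 1:1 with core clock (memory bandwidth and CPU limits eat into it), so the back-of-envelope arithmetic looks like this (the stock boost, OC target, and scaling factor below are illustrative guesses, not measurements):

```python
def oc_gain_pct(stock_boost_mhz, oc_mhz, scaling=0.85):
    """Estimate the fps gain from a core overclock.

    scaling < 1 reflects that games rarely scale 1:1 with core clock;
    0.85 is an illustrative guess, not a measured value.
    """
    return (oc_mhz / stock_boost_mhz - 1) * scaling * 100

# Typical 1080s boost to ~1850 MHz out of the box; a ~2050 MHz OC is
# therefore a high-single-digit gain at best.
print(round(oc_gain_pct(1850, 2050), 1))  # 9.2
```

In other words, expect high single digits from a Pascal core OC, not the 20%+ that older generations could sometimes deliver.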


----------



## hertz9753

If you only have a 1080p monitor, why would you need to OC that beast? You also forgot about the GTX 980 Ti, probably because it's older.


----------



## Harrywang

Quote:


> Originally Posted by *hertz9753*
> 
> If you only have 1080P monitor why would you need to OC that beast? You also forgot about a GTX 980 Ti, probably because it's older.


Because I want every last frame possible, since I'm running 144Hz. Also, why would I be on www.overclock.net and not overclock? That would just be disrespectful. 980 Tis aren't in stock at all. I'm also very picky about the type of card I want, and the new hybrid cards coming out now seem perfect. I need low noise and low temperatures.

And again, 1070s are going for CAD$600+ right now (none in stock). I could wait and get the Zotac AMP (non-Extreme) for CAD$530, but who knows when that will be in stock again and at that price. This EVGA FTW Hybrid was heavily discounted to CAD$749.


----------



## hertz9753

You didn't mention a refresh rate or say that you live in Canada, so I assumed 60Hz, because that is what most 1080p monitors run at. If you don't provide the proper information, it's hard for people to help.


----------



## Harrywang

Quote:


> Originally Posted by *hertz9753*
> 
> You didn't mention a refresh rate or say that you live in Canada so I thought 60Mz because that is what most 1080P monitors run at.. If you don't provide the proper information it's hard for people to help.


True. I definitely should have been clear on that =)


----------



## Pillendreher

Quote:


> Originally Posted by *Harrywang*
> 
> I just made a purchase for 1080 and I kind of have such buyers remorse already. Seems like this is one of the worst times to buy a video card with all the mining craze. I bought my r9 280x 4 years ago for relatively cheap price.
> 
> I wanted to get a 1070 as I'm only going to be playing on 1080p and 1070 seems perfect for that. However with the mining craze the prices of 1070 are getting to overpriced for my liking. I'm forced to buy a 1080 as the game I want to play runs total **** on my r9 280x. Anyways I got the evga hybrid FTW at a really good price. It just got in stock again so I had to get it asap before the prices go up again. A 1070 was about 100$ less. I just can't justify a 1080 over a 1070 at 1080p right now but if anyone can make me feel better please do. I can afford it but I hate spending for something I don't need!
> 
> Anyways I'm going to try to overclock this like crazy however I heard pascal isn't that great at OCing compared to the other cards. How much of a performance boost can I be expecting at 1080p when ocing? What overclocks better a 1070 or a 1080?


Yeah I kinda have the same problem. Upgrading to a 1070 is just not worth it.


----------



## gordesky1

So I went to the green side, which I wasn't happy about, lol, since I've been using ATI since 2002. But I couldn't pass up a local Nvidia 1080 Founders for $400.

I just always preferred ATI's drivers, which I know are also hit and miss for others lol

But is there any way to check what temp the VRMs are running at? On my 390, HWiNFO and GPU-Z always show what they are, but I can't find anything about the VRM for the 1080.

Also sad there isn't a fan option in the control panel, so I've been using MSI Afterburner, which I usually hate using.

But the performance is great!








Quote:


> Originally Posted by *Harrywang*
> 
> I just made a purchase for 1080 and I kind of have such buyers remorse already. Seems like this is one of the worst times to buy a video card with all the mining craze. I bought my r9 280x 4 years ago for relatively cheap price.
> 
> I wanted to get a 1070 as I'm only going to be playing on 1080p and 1070 seems perfect for that. However with the mining craze the prices of 1070 are getting to overpriced for my liking. I'm forced to buy a 1080 as the game I want to play runs total **** on my r9 280x. Anyways I got the evga hybrid FTW at a really good price. It just got in stock again so I had to get it asap before the prices go up again. A 1070 was about 100$ less. I just can't justify a 1080 over a 1070 at 1080p right now but if anyone can make me feel better please do. I can afford it but I hate spending for something I don't need!
> 
> Anyways I'm going to try to overclock this like crazy however I heard pascal isn't that great at OCing compared to the other cards. How much of a performance boost can I be expecting at 1080p when ocing? What overclocks better a 1070 or a 1080?


Did you try running DSR? I run a 1080p TV, but I've always run VSR on AMD and it made games look really good, and you can then flex the muscles of the 1080 too lol


----------



## 6u4rdi4n

I think the only GTX 1080 cards with VRM temp sensors are the EVGA iCX models.

I could be wrong, but I know at least most don't.


----------



## gordesky1

Quote:


> Originally Posted by *6u4rdi4n*
> 
> I think the only GTX 1080 cards with VRM temp sensors are the EVGA iCX models.
> 
> I could be wrong, but I know at least most doesn't.


Hmm damn.. It just worries me, because I usually like to see what they're running at, ever since my uncle's GTX 570's VRMs went up in smoke on me one night when I was using it lol

But then again, those cards were known to have crap VRMs.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *gordesky1*
> 
> Hmm dam.. It just worries me cause i usely like too see what they go too ever sense my uncles gtx 570 vrms went up in smoke on me one night when i was using it lol
> 
> But than again those cards are known too have crap vrms.


Indeed. Lots of dead GTX 570s. I had one myself, but had no problem at all getting it replaced.

On to the GTX 1080s: I believe the VRMs can handle up to 120°C and still be within spec, so I suspect you could run the VRMs naked and pretty hard and still not be in any danger. Just my thoughts on it.


----------



## Rhadamanthis

Anyone tried flashing an Asus 10Gbps card with the 11Gbps BIOS?


----------



## TheJack

You're correct, haha. I randomly picked an AC/DC song from the back of a CD when coming up with a character name over a decade ago, and I've used it quite a bit since then. (Not my real name.)


----------



## gordesky1

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Indeed. Lots of dead GTX 570s. I had one myself, but had no problem at all getting it replaced.
> 
> On to the GTX 1080s, I believe the VRMs can handle up to 120°C and still be within specc, so I suspect you could run the VRMs naked and pretty hard and still not be in any danger. Just my thoughts on it.


Yeah lol, 4 years ago, late one night, I only had a mild OC going on the GTX 570 at stock voltage; core temps were fine, under 70°C at max fan. Couldn't tell what the VRMs were at because there's no sensor. I was playing Guild Wars 2, then I heard a pop and the screen went black..

Thought it was my PC, but nope, it was the 570.. Gigabyte replaced it with another one. Then a month later the VRMs popped on my Gigabyte AM2 motherboard playing the same game lol..... Bad luck that month..

This 1080 FE does run cold though; even at load, with the fan anywhere over 50%, it's in the 50s and low 60s, so I'm sure the VRMs aren't hot either. And I heard it will throttle down and not let the temps pass 83°C, so it should be fine I guess lol... It's also in a big case (Core X9), so there's a lot of airflow.

I do like the blower style though; my whole system runs a lot cooler than with cards whose fans blow all the hot GPU air around inside the case.


----------



## gordesky1

Another question: how durable is the FE blower fan? I ask because I mostly keep it at 80 to 100%, since it runs so cold at that speed, and I also mine with it as well as game.

I had an AMD 5870 blower card, and while it lasted a good while, three to four years I think, one day it started making a clunking noise. It would still spin, but it seemed like the bearings had gone out.

I was able to replace it with the fan from another card, a 4890 I had.

I looked on eBay but can't find any replacement fans for this one. Maybe it's too new for them to show up.

I have a Core X9 case, so the fan sits sideways; I would think that takes some wear off it too.


----------



## Dasboogieman

Quote:


> Originally Posted by *gordesky1*
> 
> Another question: how durable is the FE blower fan? I ask because I mostly keep it at 80 to 100%, since it runs so cold at that speed, and I also mine with it as well as game.
> 
> I had an AMD 5870 blower card, and while it lasted a good while, three to four years I think, one day it started making a clunking noise. It would still spin, but it seemed like the bearings had gone out.
> 
> I was able to replace it with the fan from another card, a 4890 I had.
> 
> I looked on eBay but can't find any replacement fans for this one. Maybe it's too new for them to show up.
> 
> I have a Core X9 case, so the fan sits sideways; I would think that takes some wear off it too.


The OEM motor that NVIDIA uses for its FE fans is extremely top tier. It's so good that I actually have not seen it deployed by any other manufacturer.


----------



## gordesky1

Quote:


> Originally Posted by *Dasboogieman*
> 
> The OEM motor that NVIDIA uses for its FE fans is extremely top tier. It's so good that I actually have not seen it deployed by any other manufacturer.


Good to know. Yeah, I also notice that even at 100% it's a lot quieter than other blowers, and smoother too.


----------



## Dasboogieman

Quote:


> Originally Posted by *gordesky1*
> 
> Good to know. Yeah, I also notice that even at 100% it's a lot quieter than other blowers, and smoother too.


Yup. IIRC, when they designed the very first FE blower, the NVIDIA CEO personally oversaw every part of the design and delayed the whole thing considerably just perfecting the fan alone. They tested something like hundreds of motors before they found the current one, because apparently the main cause of noise in blower designs is motor noise, not so much airflow noise as in open designs.

You can clearly see the difference when you do a side-by-side comparison with the ASUS Turbo series or MSI single blowers of the same GPU generation. Those "look" a lot like FEs, but their motor is different, so they have the usual AMD 290X-style roaring noise.


----------



## gordesky1

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yup, IIRC when they designed the very first FE blower, the NVIDIA CEO personally oversaw every part of the design and he delayed the whole thing a lot just perfecting the fan alone. They tested something like hundreds of motors before they found the current one. Because apparently the main cause of noise on the blower designs is the motor noise, not so much airflow noise like the open designs.
> 
> You can clearly see the difference when you do a side-by-side comparison with the ASUS Turbo series or MSI single blowers of the same GPU generation. Those "look" a lot like FEs but their motor is different and thus have the usual AMD 290X - style roaring noise.


Yeah, I remember my 5870 blower; that thing sounded like a jet engine at 100% lol.... I was worried about the same thing happening when I got this card, because I bought it used and couldn't return it.

But I'm happy with it, and also happy because everything runs cooler now compared to the aftermarket-cooled cards that kept the GPU cool but spewed all the heat around the case lol..


----------



## tbob22

Quote:


> Originally Posted by *KingAlkaiser*
> 
> is there a general consensus on the best version of the GTX 1080 (brand and version for aftermarket coolers)?
> 
> I was going to buy an RX 580 due to the cheap price a while back, but due to the mining craze you surprisingly can't buy anything else.


I'm happy with my Windforce 1080. It's similar to the G1 but with a slightly lower stock boost and no LEDs. I'm sure it's not the best, but it is one of the least expensive models out there and it has a 3-year warranty. I picked it up for $386 after eBay Bucks a few weeks back.

It stays nice and cool. I'm able to clock it to about 2150/11000MHz stable, and it'll drop down to 2000-2050MHz under heavy load like FurMark. Temps usually sit around 60-65°C in demanding games while staying very quiet; FurMark hits the low 70s, but then it does get fairly loud.


----------



## TUFinside

Could you please add the Zotac GTX 1080 Mini to the list?


----------



## bagrata

Hello.
I have one GTX 1080 Windforce OC and one GTX 1080 Aorus.
I use them for mining. When I mine with the Windforce version and it's loaded 100%, TDP also goes up to 100% and it gives me better mining results. But when I mine with the Aorus version, its TDP doesn't go above 75% and it gives worse results than the Windforce OC edition. I tried MSI Afterburner but nothing changed; the power limit slider only works below 75%, not above, unlike on the Windforce. I think there is some kind of limiter on this card? How can I disable it?
Sorry for my English.


----------



## Harrywang

Got my EVGA Hybrid 1080 two days ago and have been testing it. I'm able to get an OC of 2164 core and 5507 memory at the max voltage of 1.093V, though it downclocks to 2154. I haven't tuned it much, but this seems to be the max; anything past 2164 and I crash pretty fast. My max temp is 55°C with an ambient of around 28°C right now.

I'm not sure whether I got lucky or this is a normal OC for watercooled cards. Should I keep it here or try to fine-tune it at lower volts? The temps stay about the same no matter what voltage I'm using.

This is my first time using an AIO cooler and it's pretty good so far; however, I'm really disappointed that I can't reduce the pump noise from the card. It makes a buzzing sound, but I've heard I can fix this by plugging the GPU fan into my motherboard. Does anyone else have an EVGA Hybrid with pump noise?


----------



## Derek1

Quote:


> Originally Posted by *Harrywang*
> 
> Got my evga hybrid 1080 2 days ago and have been testing it around. I'm able to get an OC of 2164 clock and 5507 memory at the max volt of 1.093. It downclocks to 2154 though. I haven't been tuning it much but this seems to be the max. Anything past 2164 and I crash pretty fast. My max temp is 55c with a ambient temp of 28c or so right now.
> 
> I'm not sure what to think if I got lucky or this is normal oc for watercooled cards? Should I keep it at this or try to fine tune it with lower volts? The temps stay the same usually no matter what volts I'm using.
> 
> First time using an AIO cooler and it's pretty good so far however i'm really dissapointed that I can't reduce the pump noise from the card. It makes a buzzing sound but I've heard that I can fix this by plugging the GPU fan into my motherboard. Does anyone else have a evga hybrid that has pump noise as well?


Your OC is above average for the Hybrid. I have two that will do that. I can push them to 2177, but as you say, as soon as the temps hit ~45°C they come down to 2164, and if they go over 50°C they come down again to 2139. I run mine at 2152, so the only downclock I get is to 2139, and they never go over 47-48°C @ ~1.08V.

I did a Hybrid conversion on my original FTW and then bought a factory Hybrid. Neither has the pump noise I have seen you and others complain about. Before installing them, though, I made sure to give them a little shake upside down so that all the coolant went to the pump and the air to the rad. This may be what keeps the pump from buzzing, as you may have an air bubble in there, which would also explain your higher temps.

ETA: You can try to push them further using the Afterburner curve method, which some have success with. You could also flash the T4 BIOS to the card, which will give you 1.2V. While you may break 2200, the fps gain will be insignificant.


----------



## Vellinious

Running colder works a LOT better than adding voltage. Raising voltage without first lowering temps is really pretty worthless.


----------



## Harrywang

Quote:


> Originally Posted by *Derek1*
> 
> Your OC is above average for the Hybrid. I have 2 that will do that. I can push them to 2177 but as you say as soon as the temps hit ~45C they will come down to 2164 and if they go over 50C will come down again to 2139. I run mine at 2152 and so the only downclock I get is to 2139 and they never go over 47-48C @ ~1.08v.
> 
> I did a Hybrid Conversion on my original FTW and then bought a Factory Hybrid. Neither have the pump noise I have seen you and other complain about. Before installing them though I made sure to give them a little shake upside down so that all the coolant went to the pump and air to the rad, this may be what keeps the pump from buzzing as you may have an air bubble in there, which would also explain your higher temps.
> 
> ETA; You can try and push them further using Afterburner Curve Method which some have success with. You may also flash the T4 bios to the card which will give you 1.2 volts. While you may break 2200 the fps gain will be insignificant.


Are you sure your Hybrids make NO noise? It's not really a buzzing sound, just pump noise. I've read that ALL AIOs have this noise and it's completely normal. My fans are dead silent, so the only thing you can hear is the noise from the graphics card.

Are my temps considered high? It's summer over here and my room can get pretty hot, as I don't have AC.


----------



## MK-Professor

I have an MSI GTX 1080 Gaming X. Out of the box (without any OC) the boost clock stays around 1911MHz.

I OC'd to 2050MHz (didn't change any voltage) and it was stable; however, I saw some weird performance gains versus stock clocks.

Valley benchmark:
1911/5000 stock
2050/5000 (+2% over stock)
2050/5300 (+5% over stock)
2012/5300 (+5% over stock)
2050/5400 (+0% over stock)

OK, from this it's clear that 5300MHz is the VRAM limit before performance starts to drop (due to error correction). But I should still see more performance gains from increasing the core frequency, right?

It looks like I can only get 5% more performance over stock, which is pretty poor.
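A hedged sketch of the tuning loop this implies: since GDDR5X error correction silently retries on unstable clocks, you score each memory offset with a benchmark rather than trusting "it didn't crash". `run_valley` here is a hypothetical hook for whatever benchmark you use, and the fake score curve is only for illustration.

```python
# Pick the memory offset with the best *benchmark score*, not the highest
# offset that merely survives: past the stable limit, GDDR5X error correction
# costs performance even though the card keeps running.

def best_mem_offset(run_valley, offsets=range(0, 601, 50)):
    """Run the benchmark at each offset and return the best one plus all scores."""
    scores = {off: run_valley(off) for off in offsets}
    best = max(scores, key=scores.get)
    return best, scores

def fake_score(off):
    # Toy curve: score rises with offset until error correction kicks in past +300.
    return 100 + off / 100 - 0.15 * max(0, off - 300)

best, _ = best_mem_offset(fake_score)
print(best)  # -> 300, matching the "+300 is the sweet spot" result above
```

Wire `run_valley` to your own tooling (launch the benchmark, read back its score); the loop itself is the whole idea.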


----------



## TUFinside

I had the GTX 1070 Hybrid from EVGA; horrible pump and fan noise, and I had to send it back. I prefer the noise of my noisy Zotac GTX 1080 Mini.


----------



## Derek1

Quote:


> Originally Posted by *Harrywang*
> 
> Are you sure your hybrids make NO noise? It's not really a buzzing sound but pump noise. I've read that ALL AIO's have this noise and its completely normal. My fans are dead silent so the only thing you can hear is the noise from the graphics card.
> 
> Are my temps considered high? It is summer over here and my room can get pretty hot as I don't ahve AC.


No noise whatsoever? I'm sure if you put a stethoscope to them you'd hear something. But I do have 10 fans in my case. Both Hybrids are in push/pull, and at the lowest idle setting I put the fans, approx 800rpm, and with the H110i having 4 fans at 800rpm, there is no way in hell I can hear the pump. I hear the street noise more than my fans at idle, so make of that what you will.
My case ambient temp is 22-25°C in summer so far here.
Your temps are a tad high to me, and to others here they would be very high for water, not necessarily Hybrids though; some even run their cards on chillers and their temps never break 30°C.
What was posted earlier is correct: you want to get colder before adding volts. The cooler your chip is, the better it OCs, right?


----------



## Harrywang

Quote:


> Originally Posted by *Derek1*
> 
> No noise whatsoever? I am sure if you put a stethoscope to them or something then sure. But I do have 10 fans in my case. Both Hybrids are in push/pull and at the lowest idle setting I put the fans, approx 800rpm and with the H110i having 4 fans at 800rpm there is no way in hell I can hear the pump. I hear the street noise more than my fans when idle so make of that what you will.
> My case ambient temp is 22-25C in summer so far here.
> Your temps are a tad high to me but to others here they are very high being under water, not necessarily Hybrids though and some even run their cards on chillers and their temps never break 30C.
> What was posted earlier is correct, you want to get colder before volts. The cooler your chip is the better it OCs right.


Your case ambient is 22-25°C? Mine is probably 30°C, so my temps seem pretty normal TBH. I'm running 3 intake fans as well and have a Noctua NH-D15, so not completely watercooled. My GPU idles 4-5°C higher than my CPU, and neither goes past 50°C when gaming.

Yes, when my fans are at their lowest speeds you can hear SOME pump noise. It's not a big deal, but when it's dead quiet you can hear it. I'll definitely be buying the cable to slow the pump down so it's completely silent at idle.

I'm pretty rock stable at 1.093V at 2164 dropping to 2152. Should I fine-tune it at lower volts or just leave it the way it is?


----------



## Derek1

Quote:


> Originally Posted by *Harrywang*
> 
> Your case ambient is 22-25c? Mine is prob 30c so it seems pretty normal TBH. I'm running 3 intake fans as well and have a noctua dh-15. So not completely watercooled. My gpu idle is 4-5c higher then my CPU at idle temps. They both never go past 50c when gaming though.
> 
> Yes when my fans are at the lowest speeds you can hear SOME pump noise. It's not a big deal but when its dead quiet you can hear it. I'll definitely be buying the wire to lower the pump so I can have it completely silent at idle.
> 
> I'm pretty rock stable at 1.093 at 2164 to 2152. Should I fine tune it at lower volts or something or just leave it the way it is?


You can try lowering the voltage to 1.06-1.08V if you like, but you will always be over 40°C, so your clocks will not change. You might gain 1-2 fps.
Up to you.


----------



## Beagle Box

Quote:


> Originally Posted by *MK-Professor*
> 
> I have an MSI GTX 1080 Gaming X. Out of the box (without any OC) the boost clock stays around 1911MHz.
> 
> I OC'd to 2050MHz (didn't change any voltage) and it was stable; however, I saw some weird performance gains versus stock clocks.
> 
> Valley benchmark:
> 1911/5000 stock
> 2050/5000 (+2% over stock)
> 2050/5300 (+5% over stock)
> 2012/5300 (+5% over stock)
> 2050/5400 (+0% over stock)
> 
> OK, from this it's clear that 5300MHz is the VRAM limit before performance starts to drop (due to error correction). But I should still see more performance gains from increasing the core frequency, right?
> 
> It looks like I can only get 5% more performance over stock, which is pretty poor.


If you experiment enough, you'll find multiple VRAM 'sweet spots' that will increase your performance slightly. These will differ with processor speed and test performed.

From my experience with the MSI Gaming X, the greatest increases in performance are usually attained by lowering temps and raising the power limit. It's possible to increase the freq to over 2200MHz for some tests, but if you're power limited or your cooling system can't compensate for the extra heat, the card won't stay there and you'll see little or no performance gain.


----------



## MK-Professor

Quote:


> Originally Posted by *Beagle Box*
> 
> If you experiment enough, you'll find multiple VRAM 'sweet spots' that will increase your performance slightly. These will differ with processor speed and test performed.
> 
> From my experience with the MSI Gaming X, the greatest increases in performance are usually attained by lowering temps and raising the power limit. It's possible to increase the freq to over 2200MHz for some tests, but if you're power limited or your cooling system can't compensate for the extra heat, the card won't stay there and you'll see little or no performance gain.
> .


I think up to +300 on the VRAM is the sweet spot for me, because at +350 I lose some performance, and at +400 I lose even more.

But my concern is that the VRAM is holding the card back; I mean, even at 2200MHz the performance doesn't increase compared to 2012MHz.


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> I think up to +300 on the VRAM is the sweet spot for me, because at +350 I lose some performance, and at +400 I lose even more.
> 
> But my concern is that the VRAM is holding the card back; I mean, even at 2200MHz the performance doesn't increase compared to 2012MHz.


If you're not gaining any performance by running 2200 on the core, vs 2012 on the core, it's because it's running too warm. Get it colder, and it will. With Pascal, just because you can run a clock, doesn't mean you should run a clock......


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> If you're not gaining any performance by running 2200 on the core, vs 2012 on the core, it's because it's running too warm. Get it colder, and it will. With Pascal, just because you can run a clock, doesn't mean you should run a clock......


I set the fan speed to 100%, and the max temp I saw under load was 57°C.

I adjusted the boost clock to get 2076MHz under load.

The results:

2076MHz vs 2012MHz = only a 1% performance increase (keep in mind that 2012 to 2076 is 3% more frequency).
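For reference, the scaling math behind those numbers, as a quick sanity check (the 1% performance figure is taken from the Valley result above):

```python
# How much of the extra core clock actually showed up as performance?
def pct_gain(new, old):
    return (new - old) / old * 100

freq_gain = pct_gain(2076, 2012)    # extra core clock, in percent
perf_gain = 1.0                     # measured Valley gain, in percent
efficiency = perf_gain / freq_gain  # fraction of the clock gain realized

print(f"{freq_gain:.1f}% clock -> {perf_gain:.1f}% fps "
      f"({efficiency:.0%} scaling efficiency)")
# prints "3.2% clock -> 1.0% fps (31% scaling efficiency)"
```

Anything well under 100% efficiency suggests the core isn't the bottleneck at that point (memory, power limit, or boost downclocking is).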


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> I set the fan speed to 100%, and the max temp I saw under load was 57°C.
> 
> I adjusted the boost clock to get 2076MHz under load.
> 
> The results:
> 
> 2076MHz vs 2012MHz = only a 1% performance increase (keep in mind that 2012 to 2076 is 3% more frequency).


Yup, too warm


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> Yup, too warm


If 57°C is considered too warm, then there isn't much point in going above 2GHz without water cooling.

With the fan speed on auto, which is extremely quiet (and with the boost clock adjusted to get 2012MHz), I get a max temp of 70°C, which is way warmer than 57°C, and the performance is similar at both 57°C and 70°C.

From 2012MHz (70°C, fan auto) to 2076MHz (57°C, fan 100%) = 1% performance increase.


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> If 57°C is considered too warm, then there isn't much point in going above 2GHz without water cooling.
> 
> With the fan speed on auto, which is extremely quiet (and with the boost clock adjusted to get 2012MHz), I get a max temp of 70°C, which is way warmer than 57°C, and the performance is similar at both 57°C and 70°C.
> 
> From 2012MHz (70°C, fan auto) to 2076MHz (57°C, fan 100%) = 1% performance increase.


All of the 1080s I've tested could do 2100ish on air pretty easily, and with a VERY aggressive fan curve and 20°C ambient temps, they'd run between 2100 and 2150 pretty decently, with a custom frequency/voltage curve set to keep clocks from dropping.

Ya just gotta remember... the colder it is, the higher it'll boost and the better it'll run there.

For instance, just throwing some numbers out here: at 70°C under load, you might need 1.093V to keep 2120 stable. But at 40°C under load, you might only need 1.04V to keep 2120 stable, AND it'd perform better there.

With Pascal, it's very much about keeping the voltage as low as possible for your desired clock, in order to keep temps down, so that Boost 3.0 doesn't step in and either raise the voltage to hold the clock constant OR drop the core clock.

Find the happy place for your GPU and the temps it operates at, and run with it.
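To picture what Boost 3.0 is doing, here's a toy model of the temperature-stepped downclocking described above. The ~13MHz bin size matches what owners in this thread report (2177 → 2164 → 2139, etc.), but the thresholds below are made up for illustration; the real tables are NVIDIA's and not public.

```python
# Toy model: GPU Boost 3.0 sheds the core clock in ~13 MHz bins as load
# temperature crosses thresholds. Bin size and thresholds are illustrative.

BIN_MHZ = 13  # approximate Pascal boost-bin granularity

def boosted_clock(base_boost_mhz: int, temp_c: float) -> int:
    """Estimate the sustained clock at a given load temperature."""
    thresholds = [38, 46, 54, 63, 71, 79]  # hypothetical step points
    bins_lost = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - bins_lost * BIN_MHZ

# A card that would hold 2164 MHz cold sheds bins as it warms:
print(boosted_clock(2164, 35))  # -> 2164 (cold: full boost)
print(boosted_clock(2164, 50))  # -> 2138 (two thresholds crossed)
print(boosted_clock(2164, 70))  # -> 2112 (four thresholds crossed)
```

The practical takeaway is the same as the post: lowering temperature moves you back up the bin ladder at the same voltage, which is why cooling beats raw voltage on Pascal.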


----------



## warpuck

PRICE drop?

https://www.amazon.com/dp/B01ITIY01O/?&tag=tomshardware_onesignal-20


----------



## gordesky1

Quote:


> Originally Posted by *warpuck*
> 
> PRICE drop?
> 
> https://www.amazon.com/dp/B01ITIY01O/?&tag=tomshardware_onesignal-20


What was the price? It's gone now... either the Bitcoin people snagged them again, or gamers, lol.


----------



## bagrata

I have a stock 1080 with standard 10Gbps memory, but MSI Afterburner reports only 4500MHz, so 9Gbps, right?
I also have the Aorus version with 11Gbps memory, but MSI Afterburner still reports 4500MHz instead of 5500MHz. Both cards are under 100% load with the voltage and power limit sliders at 100%.
How can I fix this?


----------



## gordesky1

Guys, what's the lowest you've undervolted your cards to, at stock and overclocked? I ask because I'm still trying to find the lowest I can go at around 1900MHz; so far the lowest I've tried is 850mV, and it's been stable for days.

This time around I figured I'd skip heavy overclocking on this card. For one, I'm pretty sure my warranty is gone because I bought it locally :/ But I couldn't pass up the price, $400. So I'm aiming for a cold, long-lasting card. Two, I really see no need to overclock a 1080, lol. I might see how well it overclocks in the future, though.

Here's the info in Afterburner. This is when it's running Bitcoin at 100% load.

It's lower in games.

Yeah, I know I said I want it to last longer and here I am running Bitcoin on it, lol.. Though I'm pretty sure as long as it runs cold it should be fine?


----------



## sinholueiro

Quote:


> Originally Posted by *gordesky1*
> 
> Guys whats the lowest you guys under volted you cards too at stock and overclocked? Why i ask im still trying to find the lowest i can go at around 1900mhz which so far the lowest i tried is 850mv and its been stable for days
> 
> This time around i figure i leave heavy overclocking out for this card cause for one im pretty sure my warranty is gone cause i bought it local:/ But couldn't pass up the price 400$. So aiming for a cold long lasting card. 2 I really see no need to overclock a 1080 lol. I might see how well it overclocks in the future tho.
> 
> Here's the info in afterburner. This is when its running bitcoin 100% load.
> 
> Its lower in games
> 
> Yea i know i said i want it to last longer and here i am running bitcoin on it lol.. Tho im pretty sure aslong as it runs cold it should be fine?


I think ~1650 at 0.67-0.7V is my lowest, but I have ~1850 at 0.76-0.8 as my daily driver.
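A back-of-the-envelope way to see why undervolting like this pays off: dynamic power scales roughly with frequency times voltage squared. The sketch below compares the ~1850MHz @ ~0.78V daily driver above against a stock-ish operating point; the stock figures (1911MHz @ 1.043V) are assumptions for illustration, not measured values.

```python
# Rough CMOS dynamic-power scaling: P is proportional to f * V^2.
# Reference point (1911 MHz @ 1.043 V) is an assumed stock-ish setting.

def rel_power(f_mhz: float, v: float, f0: float = 1911, v0: float = 1.043) -> float:
    """Estimated power relative to the reference operating point."""
    return (f_mhz / f0) * (v / v0) ** 2

print(f"{rel_power(1850, 0.80):.0%} of stock power")  # prints "57% of stock power"
```

So giving up a few percent of clock can roughly cut power (and therefore heat) by 40%+, which is why these undervolted cards run so cold.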


----------



## gordesky1

Quote:


> Originally Posted by *sinholueiro*
> 
> I think ~1650 at 0.67-0.7V is my lowest, but I have ~1850 at 0.76-0.8 as my daily driver.


Nice.

Hmm, is there a way to show the lower voltages? The lowest the curve editor shows is 800mV..


----------



## MK-Professor

I am trying to find my VRAM sweet spot. Up to +300 I gain some performance, but at +350 and +400 I lose some; however, if I go to +500 I gain performance, and at +550 I gain even more. What's wrong with +350 and +400?


----------



## bagrata

Quote:


> Originally Posted by *bagrata*
> 
> I have stock 1080 with standard 10gbps memory but msi afterburner reports only 4500mhz so 9gbps right?
> I also have aorus version with 11gbps memory but msi afterburner still reports 4500 instead of 5500mhz, both cards under 100% load, with voltage and power limit slider at 100%.
> how can I fix this?


No one?

For example, Guru3D reviewed my card, the GTX 1080 AORUS with 11Gbps memory; this is their overclocked screenshot.

And this is my card with the same settings and 100% GPU load.

There is a whole 1000MHz difference between them!
How can I fix that?


----------



## sinholueiro

Quote:


> Originally Posted by *gordesky1*
> 
> nice
> 
> Hmm is there a way to show the lower voltages? the lowest it shows in fan curve is 800..


Just use the power limit. I have +175 in the core and 50% and 70% power limit for the settings that I said earlier.


----------



## spddmn24

Quote:


> Originally Posted by *bagrata*
> 
> No one?
> 
> For example, Guru3D reviewed my card, the GTX 1080 AORUS with 11Gbps memory; this is their overclocked screenshot.
> 
> And this is my card with the same settings and 100% GPU load.
> 
> There is a whole 1000MHz difference between them!
> How can I fix that?


Is that a gaming load?


----------



## bagrata

Quote:


> Originally Posted by *spddmn24*
> 
> Is that a gaming load?


mining.


----------



## spddmn24

Quote:


> Originally Posted by *bagrata*
> 
> mining.


Probably entering the P2 state for compute tasks, which slows down the VRAM.


----------



## ucode

The P2 memory clock can be adjusted independently of P0, such that it can be the same for both P0 and P2, but for whatever reason the author of MSI AB showed no interest in adding that feature; I don't know if that's changed since. Same with adding the missing lower and upper voltages.

FWIW, the first Pascal driver didn't have a P2 state. I don't know why it was introduced afterward to run memory below its rated clock for computational tasks / CUDA.
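One way to confirm the P2 downclock from the OS side is `nvidia-smi -q -d CLOCK`, which reports current and max clocks. Below is a small sketch that pulls the memory clocks out of such a report; note the sample text is illustrative only, and the exact field layout varies by driver version.

```python
import re

# Illustrative excerpt of `nvidia-smi -q -d CLOCK` output (layout varies by driver).
SAMPLE = """\
    Clocks
        Memory                            : 4513 MHz
    Max Clocks
        Memory                            : 5005 MHz
"""

def memory_clocks(text: str) -> list[int]:
    """Return all 'Memory : N MHz' values found in the report, in order."""
    return [int(m) for m in re.findall(r"Memory\s+:\s+(\d+) MHz", text)]

current, rated = memory_clocks(SAMPLE)
if current < rated:
    print(f"Memory running at {current} MHz, below its {rated} MHz max (P2 state?)")
```

In practice you'd pipe the real command's output in (e.g. via `subprocess.run`) instead of the sample string.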


----------



## KingAlkaiser

What is the best GTX 1080 company to purchase from, and how do you guys choose which one is better? I have never owned an NVIDIA card, due to price-to-performance, for many years (I always bought within a $200 price point), and considering I might buy a 1440p monitor and no AMD cards are ever in stock, I'm curious to try an NVIDIA card. (The GTX 1060 is too weak for 1440p, and the GTX 1070 is out of stock as well due to Ethereum.)

There are too many variants of cards per company in the NVIDIA line; it's a bit of a pain in the ass to be honest, lol. Example:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487318

I've heard good things about EVGA, but they have 11 different versions of the same card; this is insane, lol.

What card has the best cooling solution and the least coil whine / noise? These are always the most important things I look for when buying a GPU, besides brand reliability.

There are some for 500 bucks, which sounds like the perfect point I'd like to spend, but how reliable are Gigabyte or MSI cards? I love Gigabyte mobos but have never owned their GPUs.

Thank you.


----------



## sinholueiro

Get the cheapest one and watercool it, either with a Kraken or a custom loop.


----------



## Vellinious

Quote:


> Originally Posted by *KingAlkaiser*
> 
> what is the best gtx 1080 company to purchase from and how do you guys choose which one is better? I have never owned a nvidia card due to price to performance for many years ( i always bought within 200 dollars price point) and concidering i might buy a 1440p monitor and no amd cards are ever in stock i was curious to try out a nvidia card. ( gtx 1060 too weak for 1440p and gtx 1070 out of stock as well due to etherium ).
> 
> there is too many variances of cards per company for nvidia line its a bit of a pain in the ass to be honest lol. example:
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487318
> 
> i heard good things about evga but this one has 11 different versions of the same card this is insane lol.
> 
> what card has the best cooling solution and the least amount of coil whine / sound. These are always my most important things i look for when buying gpu besides brand reliability.
> 
> there is some for 500 bucks which sounds perfect point id like to spend but how reliable are gigabyte or msi cards? i love gigabyte mobos but never owned gpu variants of them.
> 
> thank you.


If you want to overclock... REALLY overclock, you'll want to make sure you find one with higher power limits. The FE (reference) cards are limited in that way. Usually, anything with a custom PCB is going to have increased power limits.

If you're not going to really push the overclocks and lower the ambient temps, it probably won't matter.

In before "MUH SHUNT MOD, BRAH". Shut up... it voids the warranty, and he's asking how to choose his next video card... nuff said.

EVGA has the best warranty and customer service in the business. I highly recommend them.


----------



## Beagle Box

Quote:


> Originally Posted by *KingAlkaiser*
> 
> what is the best gtx 1080 company to purchase from and how do you guys choose which one is better? I have never owned a nvidia card due to price to performance for many years ( i always bought within 200 dollars price point) and concidering i might buy a 1440p monitor and no amd cards are ever in stock i was curious to try out a nvidia card. ( gtx 1060 too weak for 1440p and gtx 1070 out of stock as well due to etherium ).
> 
> there is too many variances of cards per company for nvidia line its a bit of a pain in the ass to be honest lol. example:
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487318
> 
> i heard good things about evga but this one has 11 different versions of the same card this is insane lol.
> 
> what card has the best cooling solution and the least amount of coil whine / sound. These are always my most important things i look for when buying gpu besides brand reliability.
> 
> there is some for 500 bucks which sounds perfect point id like to spend but how reliable are gigabyte or msi cards? i love gigabyte mobos but never owned gpu variants of them.
> 
> thank you.


I really like my MSI Gaming X. It's got one of the best coolers made and the best fans.

Cool, quiet, and there's nothing on air that's consistently faster.

Research reviews for fans, cooling, and sound, and look at my benchmarks on this site for performance.

$600 with the newer RAM at Microcenter. A used one can be had for less than $500 from those looking to upgrade to the Ti and beyond.


----------



## KingAlkaiser

Thank you very much for the information.

I found it weird to see a GTX 1080 for 500 bucks on Newegg; maybe they have some sort of fault or something.

By the way, how well does the 1080 handle 1440p gaming? I want to purchase a new monitor and want something that can max everything at 60+ fps at 1440p.

I use a Core i5 4690K for the CPU and still use DDR3, but there's no need to upgrade for a while. Will the RAM + CPU bottleneck the 1080 in any way?

Dammit, the MSI version is out of stock. How good is it anyway? The triple fans seem decent on it.

https://www.newegg.com/Product/Product.aspx?Item=N82E16814137084


----------



## Beagle Box

Quote:


> Originally Posted by *KingAlkaiser*
> 
> thank you very much for information.
> 
> I found it weird to see gtx 1080 for 500 bucks on newegg maybe they have some sort of fault or something.
> 
> by the way how well does the 1080 handle 1440p gaming? i want to purchase a new monitor and want something that can handle and max everything with 60+ fps at 1440p.
> 
> I use a core I5 4690k for cpu and i still use ddr3 but no need to upgrade for a while, will the ram + cpu bottleneck the 1080 in any way?
> 
> dammit MSI version is out of stock how good is it anyways? tripple fans seem decent on it.
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814137084


Despite its 3 fans, that cooler is not as good (noise and cooling) as the Gaming X or Gaming Z. It's considered a better card overall than the Aero and Armor, but not as good as the Gaming X or Gaming Z.

If you're going to remove the heatsink and fans to add a water block, get the Aero, Armor, or Duke, whichever is least expensive.


----------



## croikie

Quote:


> Originally Posted by *KingAlkaiser*
> 
> thank you very much for information.
> 
> I found it weird to see gtx 1080 for 500 bucks on newegg maybe they have some sort of fault or something.
> 
> by the way how well does the 1080 handle 1440p gaming? i want to purchase a new monitor and want something that can handle and max everything with 60+ fps at 1440p.


You ask how it runs 1440p games?
I have the Aorus GTX 1080 Xtreme, and it runs 1440p just fine with a few exceptions. It's $559.99 from Microcenter right now; it may be worth your attention.

I run Mass Effect: Andromeda on all custom Ultra settings and it looks soooo fresh. 2560x1440 @ 144Hz.
I haven't tested other games yet.
The video card hasn't gone over 71°C yet, and I haven't played with overclocking yet...
But when I have my second screen going and am running other apps, it does impede performance; so far my rig can only run 1440p at full power with the game as the only app running.
I have an i7-6700K with 16GB RAM, SSD RAID 0 for the OS, and a 7200rpm drive for data.


----------



## spddmn24

Quote:


> Originally Posted by *KingAlkaiser*
> 
> what is the best gtx 1080 company to purchase from and how do you guys choose which one is better? I have never owned a nvidia card due to price to performance for many years ( i always bought within 200 dollars price point) and concidering i might buy a 1440p monitor and no amd cards are ever in stock i was curious to try out a nvidia card. ( gtx 1060 too weak for 1440p and gtx 1070 out of stock as well due to etherium ).
> 
> there is too many variances of cards per company for nvidia line its a bit of a pain in the ass to be honest lol. example:
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487318
> 
> i heard good things about evga but this one has 11 different versions of the same card this is insane lol.
> 
> what card has the best cooling solution and the least amount of coil whine / sound. These are always my most important things i look for when buying gpu besides brand reliability.
> 
> there is some for 500 bucks which sounds perfect point id like to spend but how reliable are gigabyte or msi cards? i love gigabyte mobos but never owned gpu variants of them.
> 
> thank you.


Pretty happy with my 11Gbps Strix. They gave it the same cooler as the Strix 1080 Ti, so it runs cool and pretty much silent.


----------



## fat4l

Guys,
I haven't been around for a while, hence this question.

Are we able to flash our "old" 1080 FE with 10000MHz memory to an 11000MHz one, like the Asus 1080 Strix 11G version? Did they change the timings to achieve 11GHz, or is it better-binned memory, and would we potentially gain something from flashing the 11G stock BIOS?
We know the Strix T4 BIOS works on the FE, so...

Also, is there any better BIOS than last year's Strix T4v2? Thanks


----------



## Dasboogieman

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I wasnt here for some time so thats why this question.
> 
> Are we able to flash our "old" 1080 FE with 10000MHz mems, to 11000MHz mems one ? Like Asus 1080 Strix 11G version ?
> Did they changed the timings to achieve 11GHz or is it somehow better/binned memory and would we potentially get something from it if we flash to 11G stock ?
> We know that T4 Strix bios works with FE so ....
> 
> Also is there any better bios that last years Strix T4v2 bios ? Thanks


Better binned; the new 11Gbps VRAM is a completely different SKU. That said, most 10Gbps VRAM can overclock to 11Gbps well enough, so I see little benefit in flashing a BIOS just to achieve that.


----------



## Chicken Patty

What are you guys using to overclock your 1080s? I'm using EVGA Precision, but I'm wondering if there's better software out there for the job.


----------



## Vellinious

Quote:


> Originally Posted by *Chicken Patty*
> 
> What are you guys using to overclock the 1080's. I'm using EVGA Precision, but just wondering if there is any better software out there for the job.


I use MSI AB. The voltage / frequency curve is easier to use with more defined labels for the various points.


----------



## Chicken Patty

Quote:


> Originally Posted by *Vellinious*
> 
> I use MSI AB. The voltage / frequency curve is easier to use with more defined labels for the various points.


I have used it before but never on the 1080. I'll give it a crack and see.


----------



## GRABibus

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I wasnt here for some time so thats why this question.
> 
> Are we able to flash our "old" 1080 FE with 10000MHz mems, to 11000MHz mems one ? Like Asus 1080 Strix 11G version ?
> Did they changed the timings to achieve 11GHz or is it somehow better/binned memory and would we potentially get something from it if we flash to 11G stock ?
> We know that T4 Strix bios works with FE so ....
> 
> Also is there any better bios that last years Strix T4v2 bios ? Thanks


I use the ASUS Strix T4 BIOS on my GTX 1080 SLI setup.
You mean the T4v2 version?

What is its BIOS version? (The T4 I use is version 86.04.17.00.76) => https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803


----------



## Vellinious

Better off lowering temps before adding voltage...


----------



## MK-Professor

I have two questions regarding OC with my MSI GTX 1080 Gaming X.

From some testing I've done with the Valley benchmark, I found that core frequency makes very little impact on performance, yet VRAM frequency has quite a drastic impact. Why is that?
1911/5004: 112 FPS - stock - temp 68°C
1734/5556: 114 FPS - underclocked core but OC'd VRAM - temp 58°C
2012/5556: 121 FPS - OC'd core and VRAM - temp 69°C

Also, with the VRAM up to +300MHz I gain some performance, but at +350MHz and +400MHz I lose performance; yet if I go +500MHz I gain performance, and at +550MHz I gain even more (+550 gives the best performance). What is wrong with +350MHz and +400MHz?


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> I have 2 questions regarding OC with my MSI GTX1080 gaming x
> 
> From some testing that I have done with valley benchmark I found that core frequency make very litle impact on peormacne, however vram frequency have quite a drastic impact on performance, why is that?
> 1911/5004 fps 112 - stock - temp 68C
> 1734/5556 fps 114 - underclocked core but OC vram - temp 58C
> 2012/5556 fps 121 - OC core and vram - temp 69C
> 
> Also with the vram up to +300Mhz I gain some performance but at +350Mhz and +400Mhz I lose performance, however if I go +500Mhz I gain performance and at +550Mhz I gain even more performance(+550 it gives the best performance). what is wrong with +350Mhz and +400Mhz?


Valley is horribly CPU limited. Run a different benchmark: Firestrike, Time Spy, Superposition...anything but Valley.

The memory on these cards acts.....oddly. You'll see those kinds of hills and valleys all the way through the offsets. Find a spot that runs well consistently, and stick with it.
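One way to follow that advice systematically is to benchmark a sweep of offsets and keep the best-scoring stable point. A minimal sketch of the idea; the offset/FPS numbers below are hypothetical stand-ins for real benchmark runs, not measurements:

```python
# Pick the best memory offset from a sweep of benchmark results.
# The FPS figures are hypothetical; on these cards raw throughput can
# dip mid-sweep (the "hills and valleys"), then recover at higher offsets.
results = {
    0: 112.0,
    300: 114.5,
    350: 113.1,   # dip despite the higher clock
    400: 112.8,
    500: 115.2,
    550: 116.0,
}

# Keep the offset with the highest score.
best_offset = max(results, key=results.get)
print(best_offset, results[best_offset])  # 550 116.0
```

The point is simply that you can't assume performance rises monotonically with the offset; you have to test each step and keep the winner.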


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> Valley is horribly CPU limited. Run a different benchmark. Firestrike, Timespy, Superposition....anything but Valley.
> 
> The memory on these cards acts.....oddly. You'll see those kinds of hills and valleys all the way through the offsets. Find a place that runs good consistently, and stick with it.


I've found similar behavior in other benchmarks and games (like Superposition and Dishonored 2; I may post some numbers later): core frequency makes very little impact on performance, but if I OC the VRAM from 5004 to 5556 I get a nice 8% FPS bump pretty much everywhere.

For me, +550MHz on the memory gives the best performance, so I'll stick with that.


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> I have found similar behavior on other benchmarks and games(like Superposition and dishonored 2, I may post some numbers later) core frequency makes very litle impact on performance. but if I OC the vram from 5004 to 5556 I get a nice 8% fps bump in pretty much anywhere.
> 
> for me +550Mhz on the memory gives the best performance so I will stick with that


If increasing core frequency isn't affecting performance, then the GPU is running too warm.


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> If increasing core frequency isn't effecting performance, then the GPU is running too warm.


It's not too warm: 69°C with the core at 2012MHz and 58°C with the core at 1734MHz.
Besides, it's not that my GTX 1080 is underperforming; I've checked performance against others with a GTX 1080 (2000+/5550) and it's exactly the same. I just found it weird that core frequency makes very little impact.


----------



## Chicken Patty

In the PC world, all components have their "sweet spot". I've sometimes overclocked something and found it runs better at a slightly lower clock. Just find what works and stick with it.


----------



## Dasboogieman

Quote:


> Originally Posted by *MK-Professor*
> 
> it is not too warm, 69C(with core at 2012mhz) and 58C(with core that 1734mhz)
> bisides it is not that my GTX1080 is underperforming, I have check performance with others with GTX1080(2000+/ 5550) and the performance is exactly the same, I just found it weird that core frequency makes very little impact.


You are internally power throttling. All Pascal cards have a hardware power management unit built in that can dynamically trim GPU performance, without changing the clock speeds, to stay within TDP. A VRAM OC increases performance much more because raising memory clocks costs far less TDP headroom than raising the core clock.
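A rough way to see why a core OC eats so much more of the power budget than a memory OC is the classic dynamic-power approximation P ≈ C·V²·f: core clock increases usually need extra voltage, so power climbs superlinearly, while a memory offset doesn't touch the core voltage rail. A toy calculation; the voltage and clock figures are illustrative, not measured:

```python
def rel_power(v, f, v0=1.043, f0=2012):
    """Dynamic power relative to a baseline operating point, using P ~ V^2 * f.
    A crude model: ignores leakage and temperature, but shows the trend."""
    return (v / v0) ** 2 * (f / f0)

# Baseline: 2012 MHz at 1.043 V.
print(rel_power(1.043, 2012))              # 1.0
# A ~2% core bump that also needs ~3% more voltage costs ~8% more power:
print(round(rel_power(1.075, 2050), 3))    # 1.082
```

That 8% extra draw is what runs the card into its power limit, which is why the VRAM offset (whose power cost sits mostly on the memory rail) looks "free" by comparison.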


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> it is not too warm, 69C(with core at 2012mhz) and 58C(with core that 1734mhz)
> bisides it is not that my GTX1080 is underperforming, I have check performance with others with GTX1080(2000+/ 5550) and the performance is exactly the same, I just found it weird that core frequency makes very little impact.


The warmer they are, the worse they run. The colder they are, the better they run.

I see performance gains that scale with clocks all the way up to 2189MHz, because temps stay below 34°C.


----------



## Dasboogieman

Quote:


> Originally Posted by *Vellinious*
> 
> The warmer they are, the worse they run. The colder they are, the better they run.
> 
> I see performance gains that scale with clocks all the way up to 2189 because temps stay below 34c.


Yup, this is the internal power management unit at work.


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> The warmer they are, the worse they run. The colder they are, the better they run.
> 
> I see performance gains that scale with clocks all the way up to 2189 because temps stay below 34c.


It doesn't in my case, anyway.
With the core at 2012MHz it gets up to 69°C (fan auto), and it has exactly the same performance as the core at 2038MHz and 57°C (fan 100%).


----------



## Beagle Box

Quote:


> Originally Posted by *MK-Professor*
> 
> it doesn't in my case anyway
> with core at 2012mhz it get up to 69C(fan auto) and it have exactly the same performance with core at 2038mhz and 57C (fan 100%)


How are you setting and determining your CPU speeds while running the benchmarks?


----------



## MK-Professor

Quote:


> Originally Posted by *Beagle Box*
> 
> How are you setting and determining your CPU speeds while running the benchmarks?


cpu speed is always at 4.7GHz(6700K)


----------



## Beagle Box

Quote:


> Originally Posted by *MK-Professor*
> 
> cpu speed is always at 4.7GHz(6700K)


Heh. Typo.









How are you setting and determining your *GPU speeds* while running the benchmarks?


----------



## Vellinious

Quote:


> Originally Posted by *Beagle Box*
> 
> Heh. Typo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How are you setting and determining your *GPU speeds* while running the benchmarks?


I'd be curious to see the GPUz sensors tab during these benchmark runs as well. Something is off if there's 0 increase between those clocks.

Those temps are higher than I'd allow my rig to run, but.....aircooling. /shivers


----------



## MK-Professor

Quote:


> Originally Posted by *Beagle Box*
> 
> Heh. Typo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How are you setting and determining your *GPU speeds* while running the benchmarks?


I do +120MHz on the core, power limit 104% (the max it can go), temp limit 92°C, and fan speed on auto (temps I get are 69°C). After a few seconds the frequency stabilizes around 2012MHz (it starts somewhere around 2050-2060).


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> I'd be curious to see the GPUz sensors tab during these benchmark runs as well. Something is off if there's 0 increase between those clocks.
> 
> Those temps are higher than I'd allow my rig to run, but.....aircooling. /shivers



while running superposition benchmark


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> 
> while running superposition benchmark


There's green in there. You're hitting the power limit at 2012 @ 1.043v. 69c isn't doin ya any favors either. Which Superposition bench was that? 4k? 1080 Extreme? 1080 medium?

Do you have the power limit slider all the way up?


----------



## Beagle Box

Quote:


> Originally Posted by *Vellinious*
> 
> There's green in there. You're hitting the power limit at 2012 @ 1.043v. 69c isn't doin ya any favors either. Which Superposition bench was that? 4k? 1080 Extreme? 1080 medium?
> 
> Do you have the power limit slider all the way up?


That's a pretty low performance level to be hitting the power limit. I run the same card, but different BIOS. Mine goes much higher.

I think a more aggressive fan curve and customized voltage/speed curve are needed to see what it can really do.


----------



## Vellinious

Quote:


> Originally Posted by *Beagle Box*
> 
> That's a pretty low performance level to be hitting the power limit. I run the same card, but different BIOS. Mine goes much higher.
> 
> I think a more aggressive fan curve and customized voltage/speed curve are needed to see what it can really do.


Yup. It should be able to maintain 2012 with 1.0v...probably less, IF he used a really aggressive fan curve.
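A common Pascal-era trick for holding a clock at lower voltage is to flatten the Afterburner voltage/frequency curve: raise the point at your target voltage to the target clock, and pin every higher-voltage point to that same clock so the card never requests more voltage than the target. A minimal sketch of the idea; the curve points below are hypothetical stand-ins for what the curve editor shows:

```python
def flatten_curve(curve, v_target, f_target):
    """Flatten a (voltage, MHz) curve: points below v_target keep their
    stock frequency; the point at v_target and everything above it are
    pinned to f_target, so the GPU never boosts past f_target or
    requests more than v_target."""
    return [(v, f_target if v >= v_target else f) for v, f in curve]

# Hypothetical stock-ish boost curve: (voltage, boost MHz)
stock = [(0.800, 1733), (0.900, 1886), (1.000, 1999),
         (1.043, 2050), (1.093, 2088)]

flat = flatten_curve(stock, 1.000, 2012)
print(flat)
```

With the curve capped like this, the card sits at 2012MHz on ~1.000V instead of chasing 1.043V+, which cuts heat and keeps it further from the power limit.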


----------



## Chicken Patty

I'm able to maintain 2012 perfectly fine, but I do have a pretty aggressive fan curve. It's quiet when not gaming, but the moment it sees the slightest rise in temperature, it goes up in a jiffy!


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> There's green in there. You're hitting the power limit at 2012 @ 1.043v. 69c isn't doin ya any favors either. Which Superposition bench was that? 4k? 1080 Extreme? 1080 medium?
> 
> Do you have the power limit slider all the way up?


I use the settings most people use so it's comparable, and from what I can see the performance is right on the spot compared with other people at similar clocks.


Like I said before, I can lower my temp from 69°C to 57°C with the fan at 100%, but the performance remains the same. I also can't go any lower than 57°C because it's summer and the ambient temperature is 27°C. So auto fan speed gives me the same performance as 100% fan speed, the system is quiet when I game, and it's silent in less demanding games (the fans start spinning if the temp goes above 60°C).

The power limit slider is all the way up at 104% and can't go any higher (probably because it's a custom card and the power limit is already above the reference card's).

Also, I can push the core further without any instability, up to 2050MHz, but because the performance difference is 0% I keep a conservative 2012MHz for peace of mind.


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> I use the settings that most people use so it can be comparable and from what I can see the performance is right on spot with other people with similar clocks
> 
> 
> like I said before I can lower my temp from 69C to 57C with fan speed 100% but the performance remains the same, also I can't go any lower than 57C because it is summer and the ambient temperature is 27C. So auto fan speed is giving me the same performance (with fan speed 100%) and the system is quiet when I game, and silent when I play less demanding games(fans start spinning if temp goes above 60C)
> 
> the power limit slider is all the way up to 104% can't go any higher than that (probably because it is a custom card and the power limit was already more than the reference cards)
> 
> also I can push the core further with without any instability up to 2050MHz but because the performance difference is 0% I keep a conservative 2012MHz for a peace of mind.


Your GPU is hitting the power limit at 2012, so yeah, you're not going to see any improvement at 2050 because of it. At this point, the higher you clock it, the more voltage it's going to take, and create an even bigger hit from the power limit. There are ways to bring that under control a little bit more, but that'd mean switching to watercooling.

If you're happy with it, that's all that matters.


----------



## MK-Professor

Quote:


> Originally Posted by *Vellinious*
> 
> Your GPU is hitting the power limit at 2012, so yeah, you're not going to see any improvement at 2050 because of it. At this point, the higher you clock it, the more voltage it's going to take, and create an even bigger hit from the power limit. There are ways to bring that under control a little bit more, but that'd mean switching to watercooling.
> 
> If you're happy with it, that's all that matters.


I ran a Superposition benchmark with the power limit at 95% instead of 104%, everything else the same. The result was the core frequency fluctuating between 1974 and 2012MHz (before, it was a steady 2012MHz). The performance loss was 0.5%, too close to draw a clear conclusion, but it does look like the GPU is power limited.


----------



## Vellinious

Quote:


> Originally Posted by *MK-Professor*
> 
> I run an superposition benchmark with power limit 95% instead of 104% everything else was the same, the result was a core frequency fluctuation 1974-2012(before was a steady 2012mhz), performance loss was 0.5% too close to draw a clear conclusion but it is looks like that the gpu is power limited.


Can always do the shunt mod.


----------



## fat4l

Quote:


> Originally Posted by *GRABibus*
> 
> i use ASUS Strix Bios t4 on my SLI GTX1080.
> You speak about t4v2 version ?
> 
> What is the Bios version ? (For the one t4 I use, BIOS version is 86.04.17.00.76) => https://www.techpowerup.com/vgabios/185156/asus-gtx1080-8192-160803


V2 is probably what you're using anyway.

http://forum.hwbot.org/showthread.php?p=455871#post455871


----------



## GRABibus

Quote:


> Originally Posted by *fat4l*
> 
> V2 is prolly what u are using anyway.
> 
> http://forum.hwbot.org/showthread.php?p=455871#post455871


OK, thanks.


----------



## thesebastian

Hi all,

I have a Gigabyte G1 Gaming with the latest stock BIOS (F2). (Currently at +130MHz, I hit the 108% power limit pretty often, and even more if I raise the clock further.)
*Current max TDP (at 108%) is 216W*

1) Do you think it's possible to flash the G1 with the Xtreme Gaming BIOS (F2_BETA)? Is this procedure "safe", or an instant brick?
- Please consider that the Xtreme Gaming card has more than one 8-pin connector.

2) How do I do this? (I have nvflash 5.370 and Windows 10 x64, but I've never used nvflash before.)

More info:

*Current G1 stock F2 BIOS:*

Release for MICRON Memory
NVIDIA Source BIOS Version: 86.04.17.40.38
BIOS frequency: Base/Boost: 1695/1835 MHz; Memory: 5005 MHz
Memory: 10010MHz
Improve the Fan Performance
Release for F1 BIOS

*Xtreme Gaming F2_BETA BIOS (the BIOS I'd like to flash, since its power limit is much higher):*

Release for MICRON Memory
NVIDIA Source BIOS Version: 86.04.17.00.A3
BIOS frequency: Base/Boost: 1759/1898 MHz
Memory: 10206MHz
Increase stability under OC mode
For F1 BIOS flash

So far I've tried updating by running Gigabyte's "N1080X8DP.F2_Beta.exe" but got "Board is incompatible with firmware version 86.04.17.40.38".


----------



## xzamples

EVGA GeForce GTX 1080 FTW2 GAMING, 08G-P4-6686-KR, 8GB GDDR5X, iCX - 9 Thermal Sensors & RGB LED G/P/M - https://www.evga.com/products/product.aspx?pn=08G-P4-6686-KR

Does anybody know the best Afterburner OC settings for it? It runs really cool and quiet at stock.


----------



## arc1880

Hey guys. I just got an EVGA GTX 1080 FTW Hybrid this weekend. When I launch a game, is it normal to hear a slight buzzing sound? I've never had a closed-loop CPU or GPU cooler before. I'm thinking it's the pump working? I don't think it's coil whine because it doesn't sound like whining, although I don't know for sure.


----------



## Vellinious

Quote:


> Originally Posted by *arc1880*
> 
> Hey guys. I just recently got an eVGA GTX 1080 FTW Hybrid card this weekend. When I launch a game, is it normal to hear a slight buzzing sound? I've never had a closed loop CPU or GPU before. I am thinking it is the pump that's working? I don't think it's coil whine because it doesn't sound like whining, although I don't know for sure.


It's most likely coil whine, and probably nothing to worry about. A recording of what it sounds like might be helpful, though; I'd still give it a 99% chance it's coil whine.


----------



## Harrywang

Isn't it just pump noise? Normal for all hybrid cards, I believe.


----------



## Vellinious

Quote:


> Originally Posted by *Harrywang*
> 
> Isn't it just pump noise? normal for all hybrid cards I believe.


I would think if it's the pump, it'd be doing it all the time....


----------



## Harrywang

Hm true. I have the evga hybrid as well and it makes a small buzzing noise all the time, even when idle. Just the pump noise I believe. But if it's just only when you are on load then it might be coil whine.


----------



## pez

Pump noise should be constant. IIRC that cable is a 2-pin cable adapted to a 4-pin connector (i.e. ground and power). You'll usually notice the pump noise less when the fans are running, but that depends on how noisy the pump is. Coil whine should be fairly distinct and quite a bit more noticeable depending on fan speed/noise.


----------



## thesebastian

Quote:


> Originally Posted by *thesebastian*
> 
> Hi all,
> 
> I have a Gigabyte G1 Gaming with the latest stock bios (F2). (Currently with +130Mhz, I hit the 108% PWR usage pretty much often and much more if raise the clock a bit more).
> *Current max TDP (with 108%) is 216W*
> 
> 1) Do you think it's possible to flash the G1 Xtreme Gaming bios? (F2_BETA). Is this procedure "safe" or instant brick?
> - Please consider that the Xtreme Gaming bios has more than 8-pin.
> 
> 2) How do I do this? (I have nvflash 5.370 and Windows 10 x64, but I've never used nvflash before).
> 
> More info:
> 
> *Current G1 stock F2 bios:*
> 
> Release for MICRON Memory
> NVIDIA Source BIOS Version: 86.04.17.40.38
> BIOS frequency: Base/Boost: 1695/1835 MHz; Memory:5005 MHz
> Memory: 10010MHz
> Improve the Fan Performance
> Release for F1 BIOS
> 
> *Xtreme Gaming F2_beta bios (The bios I'd like to flash, since the power limit is much higher):*
> 
> Release for MICRON Memory
> .NVIDIA Source BIOS Version: 86.04.17.00.A3
> BIOS frequency: Base/Boost: 1759/1898 MHz
> Memory: 10206MHz
> Increase stability under OC mode
> For F1 BIOS flash
> 
> So far I've tried to update running gigabyte "N1080X8DP.F2_Beta.exe" but got a "Board is incompatible with firmware version 86.04.17.40.38.


On second thought, it's nearly useless to increase my power target...
Because I have one 8-pin, rated at 150W, plus 75W from the slot. That totals 225W.
So my current limit could only be raised from 216W to 225W if I changed the BIOS, right? If it's just to gain 9W, it isn't worth it...

And to get past that, I'd need a destructive mod like this, right?
http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus/0_100

Thanks
Sebastian


----------



## Vellinious

Quote:


> Originally Posted by *thesebastian*
> 
> On a second thought, it is nearly usefull to increase my Power Target......
> Because I have 8 pins, and these have 150W. Plus 75W from the board. That totals 225W.
> So my current limit can only be raised from 216W to 225W If I changed the bios, right? If this is just to gain 9W this doesn't worth......
> 
> And If I apply a destructive mod like this, I'd be in the situation, right?
> http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus/0_100
> 
> Thanks
> Sebastian


An 8 pin can pull a lot more than just 150 watts. That's usually limited in the bios...the shunt mod bypasses those limitations.
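For reference, the nominal spec budget versus the BIOS limit can be written out. A sketch using the PCIe spec ratings (75 W from the slot, 150 W per 8-pin; as noted, these are ratings the BIOS enforces, not hard electrical limits). The 200 W base TDP below is inferred from the 216 W / 108% figures earlier in the thread:

```python
# Nominal PCIe power budget for a single-8-pin card (spec ratings only;
# the connector itself can deliver well beyond 150 W).
SLOT_W = 75        # PCIe x16 slot
EIGHT_PIN_W = 150  # one 8-pin PEG connector

spec_budget = SLOT_W + EIGHT_PIN_W       # 225 W within spec
bios_limit = round(200 * 1.08)           # 200 W base TDP at a 108% slider
headroom = spec_budget - bios_limit      # what a BIOS flash could add, in spec
print(spec_budget, bios_limit, headroom)  # 225 216 9
```

Which is exactly why the in-spec gain from a BIOS flash is only ~9 W here, while a shunt mod (which fools the card's power measurement rather than respecting the ratings) can go much further.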


----------



## arc1880

Quote:


> Originally Posted by *Vellinious*
> 
> I would think if it's the pump, it'd be doing it all the time....


Quote:


> Originally Posted by *Harrywang*
> 
> Hm true. I have the evga hybrid as well and it makes a small buzzing noise all the time, even when idle. Just the pump noise I believe. But if it's just only when you are on load then it might be coil whine.


Quote:


> Originally Posted by *pez*
> 
> Pump noise should be constant noise. IIRC that cable is a 2-pin cable adapted to a 4-pin connector (i.e. ground and power). You'll notice the pump noise less usually when fans are running, but that all depends on how noisy the pump is. Coil whine should be fairly distinct and quite a bit more noticeable depending on fan speed/noise.


It's both pump noise and coil whine I believe now. I changed some bios settings last night to turn my sys fans on to PWM and I heard the pump with a constant little buzz. I don't get bothered by that. But when playing a game there is coil whine. The noise changes pitch when in the loading screen vs playing the game although when playing with friends I don't hear it since I'm talking to them on the headset. I definitely notice it when playing solo. I hope I can get used to the noise since I bought the card second hand. My fans are pretty quiet too. I never noticed these noises while using a 1050 ti.


----------



## Vellinious

Quote:


> Originally Posted by *arc1880*
> 
> It's both pump noise and coil whine I believe now. I changed some bios settings last night to turn my sys fans on to PWM and I heard the pump with a constant little buzz. I don't get bothered by that. But when playing a game there is coil whine. The noise changes pitch when in the loading screen vs playing the game although when playing with friends I don't hear it since I'm talking to them on the headset. I definitely notice it when playing solo. I hope I can get used to the noise since I bought the card second hand. My fans are pretty quiet too. I never noticed these noises while using a 1050 ti.


Usually, the higher the frame rates, the more coil whine there will be. At least, in my personal experience, that's what I've noticed.


----------



## thesebastian

Quote:


> Originally Posted by *Vellinious*
> 
> An 8 pin can pull a lot more than just 150 watts. That's usually limited in the bios...the shunt mod bypasses those limitations.


Amazing, I didn't know that. So, do you know if I can flash another Gigabyte BIOS using that nvflash app?
I'd like to flash the 1080 Xtreme Gaming BIOS (a card with more power connectors than mine) onto my 1080 G1 Gaming (with one 8-pin). Someone on YouTube did this, but I don't know how, or whether it's a trustworthy source.

But that BIOS has a much higher power limit than mine (216W):


----------



## Vellinious

Quote:


> Originally Posted by *thesebastian*
> 
> Amazing didn't know this. So do you know if I can flash another Gigabyte BIOS using that nvflash app?
> I'd like to flash 1080 Xtreme Gaming Bios (a card with more pins than mine) to my 1080 G1 Gaming (with 8 pins). Someone in YouTube did this but I don't know how, and if it's trusty source.
> 
> But that BIOS have much more power limit than mine (216W):


I wouldn't flash a BIOS that's not made for your specific GPU. If your GPU has a dual BIOS, that pretty much removes the danger involved, but on a single-BIOS card? I wouldn't tempt the reaper.


----------



## pez

Quote:


> Originally Posted by *arc1880*
> 
> It's both pump noise and coil whine I believe now. I changed some bios settings last night to turn my sys fans on to PWM and I heard the pump with a constant little buzz. I don't get bothered by that. But when playing a game there is coil whine. The noise changes pitch when in the loading screen vs playing the game although when playing with friends I don't hear it since I'm talking to them on the headset. I definitely notice it when playing solo. I hope I can get used to the noise since I bought the card second hand. My fans are pretty quiet too. I never noticed these noises while using a 1050 ti.


Quote:


> Originally Posted by *Vellinious*
> 
> Usually, the higher the frame rates, the more coil whine there will be. At least, in my personal experience, that's what I've noticed.


Exactly. Don't even feel remotely bad about coil whine.

My TXP, TXp, 1070, 1080, and about 3 different 1080 Tis all had coil whine. The Titans and Tis were inherently worse than the 1080 and 1070, but it still exists. I use my TXP with a hybrid, and while the hybrid AIO on the GPU doesn't have any buzz, my H75 does. The buzzing gets drowned out by airflow to the point where it blends in. The coil whine can blend too, but like you both said, it intensifies in scenes with high FPS. I want to say the worst I've heard from my card is in GoW4: I get some ridiculous frame rate (999 or something) on the first few loading screens and my card screams.

Essentially, the VRMs on the higher-end Pascal cards run at such a high frequency that coil whine is barely avoidable. Most of the time it's still there, but people don't notice it because their cases are super silent, their fan noise overpowers it, or they don't care/use a headset/have speakers blasting.


----------



## Scotty99

Hey guys, what's the quietest/coolest air-cooled 1080 out there? I was debating a hybrid model from EVGA, but I'm also going for a silent PC, and pump noise might bother me.


----------



## mbm

I attached the Arctic Accelero Xtreme IV, and with a custom fan curve it is silent and cool: it runs at a max of 25% fan speed and hits 60°C (core 2100/mem 5300).


----------



## arc1880

Has anyone with an EVGA Hybrid card modified it so the pump is controlled by the motherboard?


----------



## tangelo

Hi.

I just bought an MSI 1080 X+ and I'm wondering why I'm unable to go higher than 104% on the power limit in MSI Afterburner.
I've seen reviews of _*this same card*_ where people are able to go to 114%-120%.

I have checked the voltage control options in settings and tried the standard, extended, and 3rd-party modes, different settings in the MSI Gaming App, and different versions of Afterburner. Nothing helps. I'm stuck with a maximum power limit of 104%.

Any ideas?


----------



## ycodryn

Quote:


> Originally Posted by *tangelo*
> 
> Hi.
> 
> I just bought a MSI 1080 X+ and I'm wondering why I'm unable to go higher than 104% on the Power Limit on MSI Afterburner.
> I've seen reviews of this same card where people are able to go to 114%-120%.
> 
> I have checked the voltage control stuff on settings. Tried the standard, extended, 3rd party options etc. Tried different settings on the MSI Gaming app. Different versions of Afterburner. Nothing helps. I'm stuck with a maximum of 104% Power Limit.
> 
> Any ideas?


Every manufacturer sets its own power target. If you ever edited a BIOS on the older, editable cards, you could see that if you increase the base power number, the target percentage decreases. For example (these are just made-up numbers): a base of 100000 with a 120000 power target gives a 120% slider, but a base of 116000 with the same 120000 target gives only about a 104% slider. In the end it's the same ceiling. I hope that's a clear example. You should run some benchmarks, look for hardware similar to yours, and compare the scores and clocks with your own.
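That relationship can be checked with a quick calculation: the slider maximum shown in tools like Afterburner is just target power divided by base power, so a higher factory base TDP shrinks the percentage even when the absolute ceiling is identical. Using the made-up numbers from the post:

```python
def max_slider_pct(base_mw, target_mw):
    """Power-limit slider maximum as a percentage: target / base.
    The absolute ceiling (target) can be identical across two BIOSes
    that show very different slider maximums."""
    return round(target_mw / base_mw * 100, 1)

# Same 120000 mW ceiling, two different factory base TDPs:
print(max_slider_pct(100_000, 120_000))  # 120.0
print(max_slider_pct(116_000, 120_000))  # 103.4 -- shows up as a "104%" slider
```

So a card "stuck" at 104% may simply ship with a higher base TDP baked in, not a lower absolute limit.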


----------



## tangelo

Quote:


> Every manufacturer has its own power target.


I understand what you are saying, but my question was why my card has a lower Power Limit than other cards of the same manufacturer, model, and SKU. I'm not talking about comparing Asus cards to MSI etc. I'm talking about identical cards (GTX 1080 Gaming X*+*) from the same manufacturer (MSI). The clocks are the same.

Why do the people who reviewed this card have a 114% Power Target when my card only has 104%?

Has MSI updated/modified the bios after the reviews and, if so, why?
And if this is indeed the case, should I keep this bios or flash the older one if I want to try for a better OC?


----------



## sirleeofroy

Quote:


> Originally Posted by *tangelo*
> 
> I understand what you are saying, but my question was why my card has a lower Power Limit than other cards of the same manufacturer, model, and SKU. I'm not talking about comparing Asus cards to MSI etc. I'm talking about identical cards (GTX 1080 Gaming X*+*) from the same manufacturer (MSI). The clocks are the same.
> 
> Why do the people who reviewed this card have a 114% Power Target when my card only has 104%?
> 
> Has MSI updated/modified the bios after the reviews and, if so, why?
> And if this is indeed the case, should I keep this bios or flash the older one if I want to try for a better OC?


It's possibly an inefficient chip that has had to use some of that headroom to guarantee the rated clocks, leaving you less to play with.

Might be worth looking at the power draw under load; it may be using more power than "identical" cards.

Then again, I could be spouting absolute rubbish.


----------



## Beagle Box

Quote:


> Originally Posted by *tangelo*
> 
> I understand what you are saying, but my question was why my card has a lower Power Limit than other cards of the same manufacturer, model, and SKU. I'm not talking about comparing Asus cards to MSI etc. I'm talking about identical cards (GTX 1080 Gaming X*+*) from the same manufacturer (MSI). The clocks are the same.
> 
> Why do the people who reviewed this card have a 114% Power Target when my card only has 104%?
> 
> Has MSI updated/modified the bios after the reviews and, if so, why?
> And if this is indeed the case, should I keep this bios or flash the older one if I want to try for a better OC?


Different BIOSs have different stock power settings. The MSI original Gaming X cards came with 2 BIOSs, 'Gaming' and 'OC'.

If your card has a higher power limit at stock settings, you won't have as much headroom for an increase, right? So your stock power setting may actually be higher than that of a card whose slider tops out at 120%.

FWIW, I have an early GTX 1080 Gaming X and replaced my BIOS with the Gaming Z overclock BIOS. It pulls insane amounts of power, but my AB power slider tops out at only 107%.

You should definitely be comparing total power draw and benchmark performance amongst similar cards before flashing a BIOS.


----------



## Vellinious

120% of 200 watts is 240 watts.
110% of 300 watts is 330 watts.

Give me the 110% all day long. The % is completely arbitrary.

Also, if you're not hitting the power limit perf cap (GPUz sensors tab), having more power limit isn't going to do anything for you.


----------



## tangelo

After digging around the internet I found out that the majority of people with the MSI GTX 1080 Gaming X*+* have bios version 86.04.66.00.2c, which gives them a max of 114%. My card came with 86.04.66.00.52 and only 104%. That puts the default power at 210W and the max at 220W.

So it is indeed a bios thing.

But I will do what you have suggested and run some benchmarks and compare the results. It's not a thing I'm actually worried about per se; I just found it strange and got curious about the reason for the change.


----------



## Vellinious

Quote:


> Originally Posted by *tangelo*
> 
> After digging around the internet I found out that the majority of people with the MSI GTX 1080 Gaming X*+* have bios version 86.04.66.00.2c, which gives them a max of 114%. My card came with 86.04.66.00.52 and only 104%. That puts the default power at 210W and the max at 220W.
> 
> So it is indeed a bios thing.
> 
> But I will do what you have suggested and run some benchmarks and compare the results. It's not a thing I'm actually worried about per se; I just found it strange and got curious about the reason for the change.


Get GPU-Z open to the sensors tab and watch the perf cap reason line. If you're not hitting the power limit, you don't have anything to worry about anyway. It'll show up green in the graph and display PWR.

You might see something like this... the green is bad.


----------



## buellersdayoff

Quote:


> Originally Posted by *tangelo*
> 
> After digging around the internet I found out that the majority of people with the MSI GTX 1080 Gaming X*+* have bios version 86.04.66.00.2c, which gives them a max of 114%. My card came with 86.04.66.00.52 and only 104%. That puts the default power at 210W and the max at 220W.
> 
> So it is indeed a bios thing.
> 
> But I will do what you have suggested and run some benchmarks and compare the results. It's not a thing I'm actually worried about per se; I just found it strange and got curious about the reason for the change.


You have the + version with higher wattage and GDDR5X; don't flash a bios from the other version.
https://www.techpowerup.com/reviews/MSI/GTX_1080_Gaming_X_Plus_11_Gbps/28.html


----------



## tangelo

Quote:


> Originally Posted by *buellersdayoff*
> 
> You have the + version with higher wattage and GDDR5X; don't flash a bios from the other version.
> https://www.techpowerup.com/reviews/MSI/GTX_1080_Gaming_X_Plus_11_Gbps/28.html


Yes, I know. And there are people with the + version who have higher power limits in their bioses; I've been trying to underline this point. It even says so in the same review you just linked. Look at the chart on https://www.techpowerup.com/reviews/MSI/GTX_1080_Gaming_X_Plus_11_Gbps/33.html

It clearly says "+14%", not the +4% my card's bios has...

This was the whole point of my messages: trying to figure out why "identical" boards have different bioses.


----------



## buellersdayoff

Quote:


> Originally Posted by *tangelo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> You have the + version with higher wattage and gddr5x don't flash a bios from the other version
> https://www.techpowerup.com/reviews/MSI/GTX_1080_Gaming_X_Plus_11_Gbps/28.html
> 
> 
> 
> Yes I know. And there are people with the + version that have higher power limits on their bioses. I've tried to underline this point. It even says so on the same review you just linked. Look at the chart on https://www.techpowerup.com/reviews/MSI/GTX_1080_Gaming_X_Plus_11_Gbps/33.html
> 
> It clearly says "+14%" not +4% that my cards bios has...
> 
> This was the whole point of my messages, trying to figure out why "identical" boards have different bioses.

Does your card pull 250W at full load? Use HWiNFO64 to get a reading.


----------



## tangelo

Quote:


> Originally Posted by *buellersdayoff*
> 
> Does your card pull 250w @ full load? Use hwinfo64 to get a reading


No. It pulls 224.335W at full load with core voltage and power limit maxed at +100 / +104%.

And when I look at the bios information with gpuz it says:

Power Limit

Minimum 90.0W
Default 210.0W
Maximum 220.0W
Adjustment Range -57% to +5%
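Those GPU-Z numbers are internally consistent: the adjustment range is just the min/max caps expressed relative to the default (base) power. A quick sanity check, using the wattages from the readout above:

```python
# Verify that GPU-Z's "-57% to +5%" range follows from the wattage caps.
default_w, min_w, max_w = 210.0, 90.0, 220.0

low_pct = (min_w / default_w - 1.0) * 100.0    # ~ -57.1
high_pct = (max_w / default_w - 1.0) * 100.0   # ~ +4.8, displayed as +5%

print(f"Adjustment range: {low_pct:.0f}% to +{high_pct:.0f}%")
# -> Adjustment range: -57% to +5%
```

So the 104% Afterburner ceiling is just 220W ÷ 210W; the bios with a 114% ceiling simply has a higher max cap relative to its base.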


----------



## fa5terba11

Hey guys, I'm new to GTX 1080 ownership - I just got the MSI Sea Hawk 1080. Does anyone know of a good guide for overclocking these cards specifically, or GTX 1080s in general?


----------



## Scotty99

Quote:


> Originally Posted by *fa5terba11*
> 
> Hey guys I'm new to the ownership of the gtx 1080 - I just got the msi sea hawk 1080. Does anyone know of a good guide out there for overclocking these cards specifically or gtx 1080s in general?


How are you liking that card? How are the noise levels/temps?


----------



## fa5terba11

It comes stock with an EK watercooling block and it looks amazing. I wish I had a PCI-E riser so I could actually see the block side of the card. It does come with a nice backplate as well, so the back side doesn't look bad either. It's super quiet - no fans - and it runs cool without an overclock. I just played Dark Souls for an hour and it didn't get above 40 degrees. I'm anxious to overclock this and see what else I can squeeze out of it.


----------



## Scotty99

Quote:


> Originally Posted by *fa5terba11*
> 
> It comes stock with an EK watercooling block and it looks amazing. I wish I had a pci-e riser so I could actually see the block side of the card. It does come with a nice backplate as well so the back side doesn't look that bad either. It's super quiet - no fans and it runs cool without an overclock. I just played Dark Souls for an hour and didn't get above 40 degrees. I'm anxious to overclock this and see what else I can squeeze out of it.


Oh right, I forgot there were two versions; there is also an MSI Sea Hawk with a Corsair AIO attached to it lol.


----------



## papalol1

I'm about to buy a new 1080 and I saw the Gigabyte G1, but I heard it makes a lot of noise, so maybe I should go for another brand.

Opinions? The pro is the price, since it's about €100 cheaper than the others here.


----------



## pez

Quote:


> Originally Posted by *papalol1*
> 
> I'm about to buy a new 1080 and I saw the Gigabyte G1, but I heard it makes a lot of noise, so maybe I should go for another brand.
> 
> Opinions? The pro is the price, since it's about €100 cheaper than the others here.


Been a while since I had mine, but it wasn't the worst noise in the world. It can be reasonably quiet with the proper fan curve and case airflow.

However, there are quieter AIB coolers you could opt for if you don't have space or budget constraints.


----------



## Kriant

Del. Wrong thread.


----------



## coreykill99

Here's a question.
I have an R7 1700X @ 3.8GHz and an MSI GTX 1080 with an EK block, all in a custom loop. I play on a 2560x1080 144Hz monitor.

Now, I was on Microcenter's site the other day and saw an MSI 1080 Sea Hawk EK edition for $399. No way, I thought, but I clicked the button to reserve it behind the counter (until end of day Saturday) and it let me. It was a really impulsive decision, based solely on the value.

The question is: should I get it? What's Crossfire/SLI like these days? I ran it about 7 years ago with a pair of HD 6850s for a few years. Would I even see any benefit from picking up another card? I bought my 1080 the week the 1080 Ti came out; knowing I'd pair it with Ryzen, I figured I wouldn't need the extra performance of the Ti model just running 2560x1080. But $399 sounds really good right now.

Anyone running two cards who can comment on the multi-card experience nowadays?

Or should I just cancel my hold and go buy the 3D printer I've been looking at, plus a 4TB HDD, and still save some money? lol


----------



## Scotty99

For $399 you would be crazy not to buy it if you still can. That is 100% a pricing error; it happens at Microcenter sometimes.

If you don't use it, turn around and sell it for $550 minimum.


----------



## confed

I do not have experience with SLI, only random comments/posts/articles I have read where most people complain about it. Obviously, complaints are more readily available, so I can't say whether or not it's worth it. Maybe check the games you play and see how SLI performance works for them?

Which Microcenter did you find it at for so cheap? I mean, if I found one at that price, I would probably sell my current 1080 and finally expand my loop to include the GPU.


----------



## coreykill99

Mayfield Hts., Cleveland, Ohio.
I'll put a post up here if I don't pick it up Saturday night; maybe Sunday morning someone else could grab it.

As far as SLI profiles go, it seems sketchy at best. Like you said, most of the time when I look for results I get someone complaining.
I play BF1, Civ V, Doom, Kerbal Space Program, Metro LL, Watch Dogs, and Ghost Recon Wildlands, plus a handful of much older games. I have Destiny 2 on order and will get Fallout 4 on PC.
When you look up these games, there's not a lot going on with SLI for them (that I can find), but I also don't want to kick myself forever for passing it up. If in reality it's not going to benefit me, then the money is much better spent elsewhere.


----------



## OZrevhead

Quote:


> Originally Posted by *coreykill99*
> 
> Heres a question.
> I have a R7 1700x @ 3.8
> msi gtx 1080 with ek block.
> all in a custom loop.
> I play @ 2560x1080 144hz monitor.
> 
> now I was on microcenters site the other day and seen a msi 1080 seahawk ek edition for $399. no way I thought. but I clicked the button to reserve it behind the counter. (until end of day Saturday) and it let me. it was a really impulsive decision. based solely on just the value.
> 
> the question is. should I get it? whats crossfire/sli like these days? I ran it idk 7 years ago or so with a pair of HD6850's for a few years
> would I even see any benefit of picking up another card. I picked up my 1080 the week the 1080 ti came out knowing id pair it with ryzen I figured I wouldn't need the extra performance of the ti model just running my 2560x1080. but the price of 399 sounds really good right now.
> 
> anyone running 2 cards that can input on the multi card experience nowadays


I will take it for that price ...

I tested a pair of TXp in BF1 and the second GPU made zero difference; lucky I didn't buy it for that.


----------



## OZrevhead

Does anyone have the correct tool for adjusting voltages on a Galax 1080 Ti HOF? I have a tool, but it doesn't seem to work (voltages don't change).


----------



## coreykill99

Quote:


> Originally Posted by *OZrevhead*
> 
> I will take it for that price ...
> 
> I tested a pair of TXp in BF1 and the second GPU made zero difference; lucky I didn't buy it for that.


Wow, that's not helping much. I think I play BF1 the most lately; a second 1080 would be of no use whatsoever.
yay.....
Yes, I know it's a good price; that's the only reason I'm considering it. But I thought I would still see at least some gains from it.
Quote:


> Originally Posted by *OZrevhead*
> 
> Does anyone have the correct tool for adjusting voltages on a Galax 1080Ti HOF? I have a tool but it doesn't seem to work (voltages don't change).


this is the 1080 thread, not the Ti thread.
Not saying it in a negative way, just mentioning it in case you posted here by accident.


----------



## OZrevhead

Oops I will post over there


----------



## Beagle Box

Quote:


> Originally Posted by *tangelo*
> 
> No. It pulls 224.335W at full load with core voltage and power limit maxed at +100 / +104%
> 
> And when I look at the bios information with gpuz it says:
> 
> Power Limit
> 
> Minimum 90.0W
> Default 210.0W
> Maximum 220.0W
> Adjustment Range -57% to +5%


Have you flashed your card with the higher power max BIOS? I wonder if heat-related performance drops with the higher power BIOS prompted the change...

If you can provide proper case cooling, I'd definitely chance the flash and test performance again.

Please give some detail on the gains/losses if you do.

I've recently added a water block to my standard Gaming X and am once again beginning the search for the best BIOS for my setup. I can't imagine I'll do better than what I've got, but one doesn't really know until the testing is done or a fire breaks out.


----------



## fa5terba11

Do you guys know if there is a soft voltage mod that can be done on a GTX 1080 like you could do with the GTX 780? It was a software-side mod in Afterburner that let you increase the max voltage. In lieu of that, is there a bios that lets you increase max voltage? Where do I find such things?


----------



## Vellinious

No to software. Yes, there's the ASUS T4 bios that allows higher voltages on a good many of the 1080s. It doesn't do much, though. If you want higher clocks, you need lower temps. With Pascal, mashing it with extra voltage like on previous gens isn't the answer.

Adding extra voltage with core temps above 0°C is useless.

Example of "temps are everything on Pascal":

This GPU barely touches 2202 with an ambient temp of 20°C and loaded core temps at about 36°C (note: 2193 runs better). BUT... lower the coolant temp to -6°C and the loaded core temp to 10°C or lower, and... bam. Higher clocks: 2278 at stock voltage (note: 2240 runs better).


----------



## ucode

@fa5terba11 You should be able to increase the extra voltage from 0% to 100%. Not sure what is going on with Pascal and percentages, but that means an increase of up to a whopping 30mV or so. Be careful how much you use.

Alternatively, if you have a HOF or Classified, there are options for up to 1.3V IIRC.


----------



## tangelo

Quote:


> Originally Posted by *Beagle Box*
> 
> Have you flashed your card with the higher power max BIOS? I wonder if heat-related performance drops with the higher power BIOS prompted the change...
> 
> If you can provide proper case cooling, I'd definitely chance the flash and test performance again.
> 
> Please give some detail on the gains/losses if you do.
> 
> I've recently added a water block to my standard Gaming X and am once again beginning the search for the best BIOS for my setup. I can't imagine I'll do better than what I've got, but one doesn't really know until the testing is done or a fire breaks out.


I haven't flashed the bios yet and dunno if I ever will. I was just curious, as almost every review of this card had a higher power limit.


----------



## coreykill99

Quote:


> Originally Posted by *coreykill99*
> 
> Mayfield Hts., Cleveland, Ohio.
> I'll put a post up here if I don't pick it up Saturday night; maybe Sunday morning someone else could grab it.
> 
> As far as SLI profiles go, it seems sketchy at best. Like you said, most of the time when I look for results I get someone complaining.
> I play BF1, Civ V, Doom, Kerbal Space Program, Metro LL, Watch Dogs, and Ghost Recon Wildlands, plus a handful of much older games. I have Destiny 2 on order and will get Fallout 4 on PC.
> When you look up these games, there's not a lot going on with SLI for them (that I can find), but I also don't want to kick myself forever for passing it up. If in reality it's not going to benefit me, then the money is much better spent elsewhere.


Just leaving this here.


----------



## fa5terba11

That is such a crazy price. Especially since it comes with a nice waterblock. I have this card now and I love it.


----------



## fa5terba11

So I flashed my card with the ASUS Strix bios that is supposed to have the volts unlocked, and it did remove the power limit, but my volts maxed at 1.025 instead of the expected 1.2. Any thoughts? Just pushing to see the best OC I can get on this card.


----------



## Vellinious

Quote:


> Originally Posted by *fa5terba11*
> 
> So I flashed my card with the ASUS Strix bios that is supposed to have the volts unlocked, and it did remove the power limit, but my volts maxed at 1.025 instead of the expected 1.2. Any thoughts? Just pushing to see the best OC I can get on this card.


The best OC comes with lower temps, not higher voltage.


----------



## GRABibus

Quote:


> Originally Posted by *fa5terba11*
> 
> So I flashed my card with the ASUS Strix bios that is supposed to have the volts unlocked, and it did remove the power limit, but my volts maxed at 1.025 instead of the expected 1.2. Any thoughts? Just pushing to see the best OC I can get on this card.


You have to play with V/F curve to fix voltage higher than 1.093V.


----------



## Pillendreher

Any 1080 SC2 owners on here? I'm currently looking at 1080s since AMD is currently trying to trick me into overpaying for Vega and stumbled onto the 1080 SC2, which has the faster VRAM and the newer cooling technology.

The couple of reviews I found mentioned that while the cooler isn't as loud as the FE, it's not as silent as some other cards either.

Also: Does it matter that the SC2 only has 5 phases compared to 10 on the FTW2?

I'm gonna check some 1080 Ti reviews (provided that EVGA is using the same cooler).


----------



## Vellinious

Quote:


> Originally Posted by *Pillendreher*
> 
> Any 1080 SC2 owners on here? I'm currently looking at 1080s since AMD is currently trying to trick me into overpaying for Vega and stumbled onto the 1080 SC2, which has the faster VRAM and the newer cooling technology.
> 
> The couple of reviews I found mentioned that while the cooler isn't as loud as the FE, it's not as silent as some other cards either.
> 
> Also: Does it matter that the SC2 only has 5 phases compared to 10 on the FTW2?
> 
> I'm gonna check some 1080 Ti Reviews (provided that EVGA is using the same cooler)


Power phases don't mean anything on Pascal. What matters is cooling and power limit. The FTW should cool better because it has bigger fans and a larger heatsink, and it has the added advantage of not being held back by a lower power limit. If you're looking to overclock without needing to hardmod or flash a bios, the FTW is the better option. If you're just looking for gaming performance, buy the cheapest one and create a custom fan curve to keep it cool.


----------



## Pillendreher

Quote:


> Originally Posted by *Vellinious*
> 
> Power phases don't mean anything on Pascal. What matters is cooling and power limit. The FTW should cool better because it has bigger fans and a larger heatsink, and it has the added advantage of not being held back by a lower power limit. If you're looking to overclock without needing to hardmod or flash a bios, the FTW is the better option. If you're just looking for gaming performance, buy the cheapest one and create a custom fan curve to keep it cool.


I'm looking for a cool and quiet card. I think I can help with that by undervolting it.

The SC2 is currently €550 in Germany; the FTW2 is €600. That's why I was wondering about the cooling, and whether anybody has undervolted the card to get better cooling performance.

PS: At the same price (€550), I could get a Gainward Phoenix, which should have no problem whatsoever staying both cool and silent.


----------



## Scotty99

Get a three-fan card if you've got room for it. Gigabyte Extreme, Asus Strix, and FTW3 are all good cards.


----------



## Pillendreher

Quote:


> Originally Posted by *Scotty99*
> 
> Get a three fan card if you got room for it. Gigabyte extreme, asus strix, ftw3 all good cards.


I have plenty of room in my R4


----------



## fa5terba11

Quote:


> Originally Posted by *GRABibus*
> 
> You have to play with V/F curve to fix voltage higher than 1.093V.


What do you mean "play with the v/f curve"?


----------



## Vellinious

Voltage/frequency curve. It's the only way to overclock Pascal and really explore decent overclocks.


----------



## fa5terba11

Can you recommend a good guide on how best to use the curve? I'm used to the old way of overclocking.


----------



## fa5terba11

Hey, good news! I flashed my bios with the unlocked Strix bios, messed with the curve, and got stable at 2215MHz. The voltage went all the way to 1.2V. Awesome stuff!


----------



## Beagle Box

Quote:


> Originally Posted by *fa5terba11*
> 
> Hey good news! I flashed my bios with the unlocked strix bios, I messed with the curve, and got stable at 2215mhz. The voltage did go all the way to 1.2. Awesome stuff!


Did you lose any ports?


----------



## fa5terba11

Maybe... I haven't checked. I didn't lose the one my monitor is plugged into.


----------



## Ayahuasca

Recently got an EVGA 1080 FTW (original). Has anyone flashed the 11Gbps memory BIOS onto this card and had no issues? Just wondering how it's done and whether it's worthwhile.


----------



## ycodryn

I have tried it on my 1080 FTW Hybrid and it's OK. I ran some benchmarks but didn't see any improvement over using +500 on the memory clock in MSI Afterburner/EVGA Precision X.


----------



## Vellinious

Quote:


> Originally Posted by *fa5terba11*
> 
> Hey good news! I flashed my bios with the unlocked strix bios, I messed with the curve, and got stable at 2215mhz. The voltage did go all the way to 1.2. Awesome stuff!


And it got you how much performance? Before and after Firestrike runs, maybe? What temps was the core hitting at 2215?


----------



## GRABibus

Quote:


> Originally Posted by *fa5terba11*
> 
> Hey good news! I flashed my bios with the unlocked strix bios, I messed with the curve, and got stable at 2215mhz. The voltage did go all the way to 1.2. Awesome stuff!


Here is an example of a curve I set on my former SLI GTX 1080 Gigabyte:

http://www.casimages.com/img.php?i=17052212161217369815053901.png

As you can see, it is set at 2202MHz only for the point of the curve at 1.2V.
That was for benching.
BUT the curve is also tweaked below 1.2V, as you can see, with a large offset above stock.

Otherwise, if I had only set the 1.2V point of the curve to 2202MHz and left the rest of the curve at stock, I would see no gain in benchmarks.
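To make the point concrete, here's a small sketch of the difference between offsetting the whole curve and pinning only the 1.2V point (all voltages and clocks below are made-up illustrative values, not read from any actual card): under normal load the card sits below 1.2V, so only a whole-curve offset changes the clocks it actually runs.

```python
# Hypothetical V/F curve: voltage (V) -> core clock (MHz).
stock_curve = {0.800: 1607, 0.900: 1797, 1.000: 1949, 1.093: 2050, 1.200: 2088}

def tweak_curve(curve, offset_mhz, top_v=1.200, top_mhz=2202):
    """Offset every point, then pin the top-voltage point to a fixed clock."""
    tweaked = {v: mhz + offset_mhz for v, mhz in curve.items()}
    tweaked[top_v] = top_mhz
    return tweaked

# Pinning only the 1.2 V point leaves the rest of the curve at stock...
pinned_only = tweak_curve(stock_curve, 0)
# ...while a whole-curve offset raises the clocks the card actually uses.
whole_curve = tweak_curve(stock_curve, 100)

print(pinned_only[1.093], whole_curve[1.093])  # -> 2050 2150
```

With the pinned-only curve, a card boosting at 1.093V still runs the stock 2050MHz, which is why the 2202MHz point alone gains nothing in benchmarks.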


----------



## xartic1

Quote:


> Originally Posted by *GRABibus*
> 
> Here is an example of a curve I set on my former SLI GTX 1080 Gigabyte:
> 
> As you can see, it is set at 2202MHz only for the point of the curve at 1.2V.
> That was for benching.
> BUT the curve is also tweaked below 1.2V, as you can see, with a large offset above stock.
> 
> Otherwise, if I had only set the 1.2V point of the curve to 2202MHz and left the rest of the curve at stock, I would see no gain in benchmarks.


I'm curious to know the score of a single card running at those frequencies.


----------



## GRABibus

Quote:


> Originally Posted by *xartic1*
> 
> I'm curious to know the score of a single card running at those frequencies.


I am not at home, just on my iPhone.
I think you just have to check the Timespy benchmark thread results.


----------



## fa5terba11

I will post pics and scores tonight.


----------



## fa5terba11

The best clock I was able to get without flashing the bios and without manipulating the curve was about 2120MHz.


----------



## TK421

When setting two separate curves with an undervolt lock (GPUs decoupled) and saving them as a profile in MSI Afterburner, the next restart wipes one of the GPUs back to +0 core and +0 memory.

Why is this? Why is my second GPU's profile being lost after a restart?


----------



## Vellinious

The profile saves the overclock settings only for the GPU you currently have selected; it never saves them for the other. I was never able to get it to work, not even using the exact same curve on both GPUs. I always had to save two separate profiles for the different curves and apply them individually.


----------



## fa5terba11

Sorry guys I wanted to post my clocks and results but I got waylaid by a hard drive that went bad. I'm working on getting something posted.


----------



## fa5terba11

Ok so here's my Heaven score with no overclock.


----------



## fa5terba11

Here's my Heaven score with the best overclock I can get stable with my card's original bios.


----------



## fa5terba11

Here's the best overclock I can get stable with the unlocked ASUS Strix bios.


----------



## Beagle Box

Quote:


> Originally Posted by *fa5terba11*
> 
> 
> 
> Here's the best overclock I can get stable with the unlocked ASUS Strix bios.


Can't read your screenshots. They're too small.
If you want to see how well you're performing on the Heaven Benchmark, this is the way.


----------



## Vellinious

Use Superposition... Heaven has a tendency to get kinda CPU-bound in places with the 1080. Or Firestrike/Timespy.


----------



## fa5terba11

Ok timespy and firestrike scores are next


----------



## fa5terba11

Timespy score with Strix Bios


----------



## Vellinious

Quote:


> Originally Posted by *fa5terba11*
> 
> 
> 
> Timespy score with Strix Bios


Identical graphics scores that I was getting with cold ambient and stock voltage.


----------



## drunkonpiss

Using a GTX 1080 G1 Gaming running at stock. I ran Heaven and got the result below. Is there a bottleneck in my setup? Also, I noticed that my temps are actually high at stock, at 74 degrees.
Appreciate any tips you can give!


----------



## lever2stacks

Quote:


> Originally Posted by *drunkonpiss*
> 
> Using a GTX 1080 G1 Gaming running at stock. I ran Heaven and got the result below. Is there a bottleneck in my setup? Also, I noticed that my temps are actually high at stock, at 74 degrees.
> Appreciate any tips you can give!


No, that seems about right running one card at stock at that resolution. I just ran my rig with SLI off and got 1399, with the card running at 2126/5702. I'm running a 7700K, but I also have a 4790K and don't see much difference in performance between the two, except that I can hit 5GHz on the 7700K.

Temps don't seem that high for air. What are your ambient temps like, and what kind of airflow do you have in your case?

Lever


----------



## 7thNemesis

.


----------



## Pillendreher

Just ordered a Zotac 1080 Extreme for 560 bucks (530 once I sell the Destiny 2 code). So I guess: Hi, Team Green. My first Nvidia GPU since the 8800 GT.

Anything I should know?


----------



## drunkonpiss

Quote:


> Originally Posted by *lever2stacks*
> 
> No that seems about right running one card stock at that resolution. I just ran my rig with sli off and I got 1399 that's with the card running at 2126/5702. I'm running a 7700k but I also have a 4790k and don't see very much of a difference in performance between the 2 except I can hit 5ghz on my 7700k.
> 
> Temps don't seem that high on air, what are your ambient temps like and what kind of air flow do you have in your case?
> 
> Lever


Appreciate the response!

Are you running stock? That's quite a performance from a single 1080!
My ambient temps are 20-22 degrees. My setup pushes cold air in from the front and exhausts hot air through the rear and top fans. Just to give you an idea, this is how my setup looks (minus the Palit GPU I was previously using): my radiator is sandwiched by fans in a push-pull arrangement, exhausting hot air through the rear and top fans. Prior to getting the 1080 I was using a 1070, but I didn't run DDU or reinstall the drivers. Not really sure if that matters, but I haven't encountered any issues thus far, so I left it that way. Should I reinstall the driver?


----------



## lever2stacks

Nope, I'm not running stock; my core is at 2126 and memory at 5702. The cards are watercooled.

Like I said, your temps and scores are typical for that card on air under load. I don't think you have a bottleneck or a temp problem.

Where is your radiator located? It's hard to tell from the pic.

Yes, definitely run DDU and reinstall the drivers. That should be the very first thing you do.

Lever


----------



## EDK-TheONE

Is it possible to flash an 11G Gaming X to a 10G Gaming X?


----------



## drunkonpiss

Quote:


> Originally Posted by *lever2stacks*
> 
> Nope I'm not running stock my core is running @2126 and memory is @5702, The cards are watercooled.
> 
> Like I said your temps and scores are typical for that card on air underload. I don't think you have a bottleneck or a temp problem.
> 
> Where is your radiator located? Its hard to tell from the pic.
> 
> Yes definitely run ddu and reinstall drivers. That should be the very first thing you do.
> 
> Lever


Performed a clean driver reinstall via DDU and, for some reason, my performance improved, as well as my temps by about 3 degrees. Not bad. I'll probably undervolt it to get even better thermal performance. Appreciate the help!


----------



## TristanL

sneak in...


----------



## ronskie66

In your opinion, is it worth flashing this Z BIOS onto an MSI Gaming X Plus card? I see that the core and boost clocks will rise since they're the same as the Gaming X model, but will the TDP be higher, or is it not worth it for the Gaming X Plus? Thanks


----------



## Vellinious

Quote:


> Originally Posted by *ronskie66*
> 
> In your opinion is it worth it to flash this z bios onto a msi gaming x plus card? I see that the core and boost clocks will rise as they are the same as the gaming x model but will the TDP be more or is not worth it for the gaming x plus? Thanks


Is this GPU on air? If it is, don't bother. You won't see any significant changes. Wanna see higher boost 3.0 clocks? Run the fans higher.


----------



## ronskie66

It's on air, thanks for the reply. I will stick with the Gaming X Plus BIOS. I haven't used Boost 3.0 before; I had two 780 Tis in SLI until getting a 1080. I haven't actually got the card yet, it's coming in 4 days or so. I'm planning on doing a manual overclock and have watched some videos on overclocking Pascal, as it's a bit different from overclocking the 780 Tis. Might just end up using the boost, not sure yet; depends on what clocks I end up with.


----------



## drunkonpiss

Hey guys, can anyone share an undervolt curve from their Gigabyte Windforce 1080, or any 1080? I just want to use it as a reference to undervolt my own card. Just looking at a mild undervolt; basically trying to lower temps.
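Since curve screenshots vary card to card anyway, here's a rough Python sketch of what a mild Afterburner-style curve undervolt actually does (the voltage/frequency points below are made up for illustration, not from any specific Windforce): you pick a voltage point, set its frequency to your target clock, and flatten everything above it so the card never requests more voltage.

```python
# Illustrative sketch of a "flat curve" undervolt (all numbers are made
# up, not from any specific card): in Afterburner's curve editor you pick
# a voltage point, set its frequency to your target clock, and flatten
# everything above it so the card never requests more voltage.

def undervolt_curve(curve, max_mv, target_mhz):
    """Cap every point at or above max_mv to target_mhz, so boost
    stops raising voltage (and heat) past that point."""
    return {mv: (min(mhz, target_mhz) if mv >= max_mv else mhz)
            for mv, mhz in curve.items()}

# Hypothetical stock voltage->frequency points (mV -> MHz)
stock = {800: 1734, 850: 1810, 900: 1885, 950: 1949, 1000: 2012, 1050: 2050}

flat = undervolt_curve(stock, max_mv=900, target_mhz=1885)
print(flat)  # every point from 900 mV up is capped at 1885 MHz
```

The lower-voltage points are left alone, so the card still idles and downclocks normally; only the top of the curve is flattened.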


----------



## Derek1

Quote:


> Originally Posted by *ronskie66*
> 
> It's on air, thanks for reply. I will stick with the gaming x plus bios. I haven't used the boost 3 before, had two 780 ti's in sli until getting a 1080 card. Haven't actually got the card yet, it's coming in 4 days time or so. I am planning on doing a manual overclock, watched some videos on overclocking pascal as it's a bit different from overclocking the 780 ti's. Might just end up using the boost, not sure yet, depends on what clocks I end up with.


You don't actually have a choice to use or not use Boost 3.0; it's part of the chip's architecture (if that's the right way to say it).
It's there, and you have to adjust clocks and temps to manage it to get the best performance out of your card.

ETA: Unless you were just referring to an auto-OC function with a boost button, like K-Boost in EVGA Precision. Not sure what you meant, looking back at it.


----------



## stephenn82

Mmmmm, I have to do the joining at home. Just picked up a 1080 FTW Hybrid yesterday from my friend. Spent most of the evening mounting the rad and downloading drivers and software. What a nice improvement over the 390!


----------



## stephenn82

Quote:


> Originally Posted by *Pillendreher*
> 
> Just ordered a Zotac 1080 Extreme for 560 bucks (530 once I sell the Destiny 2 code). So I'll guess Hi Team Green. My first Nvidia GPU since the 8800 GT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anything I should know?


Dude, I'm in the SAME BOAT. The last Nvidia card I owned was an 8800 GT; got it to replace a dead 7900 GT.

I picked up a 1080 FTW Hybrid from my friend yesterday for 400 bucks. I don't think he registered it with EVGA... it gave me Elite membership and all the digital goods.


----------



## outofmyheadyo

Recently bought the MSI Sea Hawk EK X for 550€ brand new from Amazon; seemed like an OK deal, and it came with Destiny 2 as well.
Thought about the 1080 Ti, but since I don't really play anything at all, 900€ seemed like too much for it.


----------



## Beagle Box

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Recently bought the msi sea hawk ek x for 550€ brand new from amazon seemed like an ok deal, came with destiny 2 aswell.
> Thought about the 1080ti, but since I dont really play anything at all, 900€ seemed like too much for it.


Do you use Afterburner? What's the Voltage max of that BIOS?


----------



## Frosty288

Guys, I have Gigabyte 1080Ti Gaming OC (N108TGAMINGOC-11GD) What BIOS should I flash for high clocks?


----------



## tangelo

Quote:


> Originally Posted by *ronskie66*
> 
> In your opinion is it worth it to flash this z bios onto a msi gaming x plus card? I see that the core and boost clocks will rise as they are the same as the gaming x model but will the TDP be more or is not worth it for the gaming x plus? Thanks


The X+ has different memory than Z. Are you sure it would work?


----------



## Krzych04650

Strange things are going on with my two MSI Gaming 1080s in terms of temperatures (both running 2025 @ 0.975V). If I put card A at the top and card B at the bottom, temps are 77 and 58 respectively. If I put card B at the top and card A at the bottom, temps are 63 and 65, top card actually cooler than the bottom one. It happened before but I thought that this is due to mobo change and different PCI-E slot placements but turns out that it is not, while rebuilding the PC I confused the cards and put card A at the top again and temps ended up like this again. Replaced them and everything fine. Those are two identical models. How can there be such a huge variance in temps? If card A is at the bottom it is 7 degrees hotter than card B is on this position and if card A is at the top, it is 14 C hotter than card B on this position, this is crazy... Even if card B was running on thermal paste and card A was running on ketchup, it still shouldn't make so much difference...


----------



## EDK-TheONE

Any one got 26K 3DM FS on air?


----------



## Vellinious

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Any one got 26K 3DM FS on air?


26k is a tad high for air, unless the memory is doing its trickiness at +800 or something stupid.


----------



## buellersdayoff

Quote:


> Originally Posted by *Krzych04650*
> 
> Strange things are going on with my two MSI Gaming 1080s in terms of temperatures. If I put card A at the top and card B at the bottom, temps are 77 and 58 respectively. If I put card B at the top and card A at the bottom, temps are 63 and 65, top card actually cooler than the bottom one. It happened before but I thought that this is due to mobo change and different PCI-E slot placements but turns out that it is not, while rebuilding the PC I confused the cards and put card A at the top again and temps ended up like this again. Replaced them and everything fine. Those are two identical models. How can there be such a huge variance in temps? If card A is at the bottom it is 7 degrees hotter than card B is on this position and if card A is at the top, it is 14 C hotter than card B on this position, this is crazy... Even if card B was running on thermal paste and card A was running on ketchup, it still shouldn't make so much difference...


One card probably uses less voltage to reach the same performance; ASIC quality, silicon lottery, bla bla. Go with the 63-65 configuration. You can test one card at a time to see what voltage each needs.


----------



## Krzych04650

Quote:


> Originally Posted by *buellersdayoff*
> 
> One card probably uses less voltage to reach the same performance, asic quality silicone lottery bla bla. Go with the 63-65 configuration. You can test with one card at a time to see what voltage it needs


I forgot to mention, both are set to 2025 MHz @ 0.975V. That's the problem: these are two exactly identical cards with exactly the same clocks and voltages.


----------



## Beagle Box

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Any one got 26K 3DM FS on air?


Closest I got to 26k on air just before I added the EK waterblock.


----------



## cory1234

Debating on upgrading from a 980 ti hybrid to a 1080. I’m currently running 1450 MHz on the core of my 980 ti. Is it worth the $150 difference to sell my current card and buy a used 1080?


----------



## Radox-0

Quote:


> Originally Posted by *cory1234*
> 
> Debating on upgrading from a 980 ti hybrid to a 1080. I'm currently running 1450 MHz on the core of my 980 ti. Is it worth the $150 difference to sell my current card and buy a used 1080?


I benched my Titan X Maxwell (for all intents and purposes a GTX 980 Ti) against a GTX 1080 in around 30 games a while back; the performance delta was 13% stock vs stock and 14% overclocked vs overclocked. The overclocked Titan X Maxwell was about 3% behind the 1080 at stock. My clocks:

GTX 1080 Stock - 1860 MHz Core / 10,016 MHz Memory
GTX 1080 Overclocked - 2075 MHz Core / 11,000 MHz Memory

GTX Titan X Stock (EVGA SC) - 1316 MHz Core / 7012 MHz Memory
GTX Titan X Overclocked (EVGA SC) - 1474 MHz Core / 8020 MHz Memory

So I'd say around a 15% performance jump when you also OC the 1080 is about what you'd get.


----------



## andydabeast

Hey guys, I'm quite new to overclocking Nvidia cards, but I understand how the offsets work. I have the Gigabyte Windforce card with a single 8-pin and triple-fan cooler. I had an overclock I thought was stable at +150 core and +400 mem with no voltage or power limit adjustment. Heaven ran fine, but I got a black-screen-max-fan system crash while playing Fractured Space. So I gave it +10% power limit and removed the memory offset. Heaven, Firestrike, Superposition, Time Spy, and hours in Titanfall 2, Black Ops 3, and Fractured Space were fine. Then I got another black-screen-max-fan system crash in Fractured Space. Nothing else triggers it. No artifacts, no temps over 70C, no warning.

Any ideas besides "don't play that game"? I have yet to see if it happens while at stock.


----------



## Vellinious

Quote:


> Originally Posted by *andydabeast*
> 
> Hey guys, I am quite new to overclocking Nvidia cards, but I understand the how the offsets work. I have the Gigabyte Windforce card that uses a single 8-pin and triple fan cooler. I had an overclock that I thought was stable with +150 core and +400 mem with no voltage or power limit adjustment. Heaven ran fine but I got a black-screen-max-fan system crash when playing Fractured Space. So I gave it +10% power limit and removed the memory clock. Heaven, Firestrike, Superposition, Time Spy, and hours in Titanfall 2, Black Ops 3, and Fractured Space were fine. Then I got a black-screen-max-fan system crash again while in Fractured Space. Nothing else triggers it. No artifacts, no temps over 70C, no warning.
> 
> Any ideas besides "don't play that game"? I have yet to see if it happens while at stock.


Raise the power limit and voltage to max. As Boost 3.0 adjusts clocks and voltage due to temp changes, it'll look for additional voltage to maintain the clock, or it'll lower core clocks if it's already at the maximum available voltage.

Creating a custom fan curve to keep the GPU a little cooler would help as well.

As for the "one game" crashing: all games are just slightly different. What works for one, and is completely stable, could be completely different for others. I have games I can run for hours at 2189 on the core, and others that will crash almost immediately if the core goes past 2100. Just the nature of the beast.


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> As for the "one game" crashing. All games are just slightly different. What works for one, and is completely stable, could be completely different for others. I have games I can run for hours at 2189 on the core, and others that will crash almost immediately if the core goes past 2100. Just the nature of the beast.


Just like stability testing programs...an OC may run great on 5 programs, but run like trash on OCCT, x264, etc.

So, 11GHz is achievable on the older 1080 models... just +250 the offset and call it a day?

I recently snagged a 1080 FTW Hybrid and it runs very, very cool. 130% power and 92C temp limit and it runs well. I haven't changed anything yet. I watched JayzTwoCents' video on how to use the XOC and learned some stuff. Just haven't tinkered with it yet.

I noticed that after sliding it up, it was running 2136/10010 and hit 46C after gaming for a few hours. Not bad, eh?


----------



## andydabeast

Quote:


> Originally Posted by *Vellinious*
> 
> Raise the power limit and voltage to max. As boost 3.0 adjusts clocks and voltage, due to temp changes, it'll look for additional voltage to maintain clock, or, it'll lower core clocks if already at maximum available voltage.
> 
> Creating a custom fan curve, to keep the GPU a little cooler would help as well.
> 
> As for the "one game" crashing. All games are just slightly different. What works for one, and is completely stable, could be completely different for others. I have games I can run for hours at 2189 on the core, and others that will crash almost immediately if the core goes past 2100. Just the nature of the beast.


First I will play some Fractured Space with it at stock to be sure the OC is what is doing it. Then I will re-apply the OC and jack up the voltage and try.

Does the GPU getting to 70C really contribute to instability? Isn't the max like 90C?

Maybe I just turn off the OC for that game and call it a day...


----------



## Vellinious

Quote:


> Originally Posted by *andydabeast*
> 
> First I will play some Fractured Space with it at stock to be sure the OC is what is doing it. Then I will re-apply the OC and jack up the voltage and try.
> 
> Does the GPU getting to 70C really contribute to instability? Isn't the max like 90C?
> 
> Maybe I just turn off the OC for that game and call it a day...


Yes, it certainly can. The cooler, the better.

Boost 3.0 will make voltage/frequency adjustments all along the operating temp spectrum. At that temp, it could just be trying to make an adjustment, and it becomes unstable as it makes its adjustments.


----------



## Dasboogieman

Quote:


> Originally Posted by *andydabeast*
> 
> First I will play some Fractured Space with it at stock to be sure the OC is what is doing it. Then I will re-apply the OC and jack up the voltage and try.
> 
> Does the GPU getting to 70C really contribute to instability? Isn't the max like 90C?
> 
> Maybe I just turn off the OC for that game and call it a day...


Yes, Pascal scales roughly 1 bin of OC for every 5 degrees from 80-60C, then it starts scaling 1 bin for every 10 degrees from 60-40C. After that, it's 1 bin from 40-30C; I don't know what it's like below that because my cooling can't go there.
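To put rough numbers on that rule of thumb, here's a tiny sketch; the per-range rates are just the estimates from this post, and the 13 MHz bin size is the commonly cited Pascal boost step, so treat none of it as an official spec:

```python
# Ballparking the temperature-bin rule of thumb above. The per-range
# rates come from the post's estimates, and BIN_MHZ is the commonly
# cited Pascal Boost 3.0 step size -- neither is an official spec.

BIN_MHZ = 13  # approximate size of one boost bin on Pascal

def bins_gained(temp_from, temp_to):
    """Estimate boost bins gained by cooling from temp_from down to
    temp_to (deg C): 1 bin/5C over 80-60C, 1 bin/10C over 60-40C,
    and 1 bin across the whole 40-30C range."""
    rates = [(60, 80, 1 / 5), (40, 60, 1 / 10), (30, 40, 1 / 10)]
    bins = 0.0
    for lo, hi, per_deg in rates:
        # degrees of the cooling span that fall inside this range
        overlap = max(0.0, min(temp_from, hi) - max(temp_to, lo))
        bins += overlap * per_deg
    return bins

# Going from ~80C on air to ~40C on water under this rule of thumb:
gain = bins_gained(80, 40)
print(gain, gain * BIN_MHZ)  # ~6 bins, roughly 78 MHz of extra boost
```

Which matches what people see in practice: dropping from air temps to water temps is worth a handful of bins, not hundreds of MHz.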


----------



## Vellinious

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes, Pascal scales roughly 1 bin of OC for every 5 degrees from 80-60C, then it starts scaling 1 bin for every 10 degrees from 60-40C. After that, its 1 bin from 40-30C, I don't know what its like below this because my cooling can't go there.


I've seen boost 3.0 make adjustments to clock / voltage as low as 12c.


----------



## andydabeast

Quote:


> Originally Posted by *Vellinious*
> 
> Yes, it certainly can. The cooler, the better.
> 
> Boost 3.0 will make voltage / frequency adjustments all along the operating temp spectrum. At that temp, it could just be trying to make an adjustment, and becomes unstable as it makes it's adjustments.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes, Pascal scales roughly 1 bin of OC for every 5 degrees from 80-60C, then it starts scaling 1 bin for every 10 degrees from 60-40C. After that, its 1 bin from 40-30C, I don't know what its like below this because my cooling can't go there.


Wow interesting about the thermals. I'll make my fan curve more aggressive and see what raising voltage does. I actually have a 120mm CPU AIO ready for when I want to replace the cooler on the 1080, but I need more memory heatsinks first.


----------



## Scotty99

Do 3 fan cards like strix stay under 70c at all times or do you need a hybrid card to achieve this?


----------



## andydabeast

Quote:


> Originally Posted by *Scotty99*
> 
> Do 3 fan cards like strix stay under 70c at all times or do you need a hybrid card to achieve this?


My triple fan Windforce card stays under 70C with my custom fan curve never getting over 65%. That is with no extra voltage though.
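For anyone setting one up, a custom fan curve is just a set of temp/speed points with linear interpolation between them. A minimal sketch (the points here are examples, not my actual Afterburner settings):

```python
# A custom fan curve is just temp/speed points with linear interpolation
# between them. The points below are examples only, not anyone's actual
# Afterburner settings.

CURVE = [(30, 35), (50, 50), (65, 65), (75, 85)]  # (temp C, fan %)

def fan_percent(temp_c):
    """Interpolate fan % from CURVE, clamped at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(57.5))  # halfway between the 50C and 65C points -> 57.5
```

Making the curve "more aggressive" just means raising the fan % at the mid-range temps so the card spins up before it gets hot.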


----------



## mrgnex

Today I got myself a 1080 and I tested it before buying.
But now that I've installed it in my own PC, my mobo hangs at Q-code 62 for about ten seconds and then continues to boot.
Sadly I have no video output...


----------



## tbob22

Quote:


> Originally Posted by *mrgnex*
> 
> Today I got myself a 1080 and I tested it before buying.
> But now I inserted it in my own PC and my mobile hangs at q code 62 for like ten seconds and then continues to boot.
> Sadly I do not have video output..


Try another PCI-E slot?


----------



## mrgnex

Quote:


> Originally Posted by *tbob22*
> 
> Try another PCI-E slot?


Worked with my old gpu. I have watercooling so another slot might be hard.


----------



## tbob22

Quote:


> Originally Posted by *mrgnex*
> 
> Worked with my old gpu. I have watercooling so another slot might be hard.


How about onboard video?


----------



## mrgnex

Quote:


> Originally Posted by *tbob22*
> 
> How about onboard video?


That works fine if I unplug my gpu.

Gonna try a different gpu in the same slot tomorrow.. fingers crossed.

Edit: different gpu doesn't work either. Seems like my slot is toast. Probably happened when my last gpu exploded.

Edit2: nope it's dead.. tried a different PC and still no signal..

Edit3: Found it. It's a cap that fell off. Anyone know the capacitance of the marked cap?:


----------



## Dry Bonez

Hey everyone, it's me again, a victim of this deadly hurricane heading my way Sunday/Monday. I tore down my PC and have my GTX 1080 Ti in the trunk of my car; I'm also thinking of bringing it inside. I will try to protect my PC components the best I can since, if nothing else, I can sell them for money; I know they're valuable, to say the least. Also, to any other victims in here: please stay safe, as this one is serious and it's crazy over here in FL.


----------



## stephenn82

Y'all stay safe! I have a friend stationed in the Mayport/Jax area and he may be displaced, heading up to my house in VA. I hope you all get to safety if it's afforded to you.


----------



## Dasboogieman

Quote:


> Originally Posted by *Vellinious*
> 
> I've seen boost 3.0 make adjustments to clock / voltage as low as 12c.


Yeah, the curve gets really really funky at below 20C.

Also, there seems to be no hard and fast rule. Some people get clock transitions on 38C while I get it on 42C. There is some ASIC variance going on here as well.


----------



## Vellinious

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yeah, the curve gets really really funky at below 20C.
> 
> Also, there seems to be no hard and fast rule. Some people get clock transitions on 38C while I get it on 42C. There is some ASIC variance going on here as well.


You're right. There's not. One card will adjust at one point, and another at a different point. I had one that would come down off of 2202 @ 28c while running 1.093v, and another that wouldn't slide off of 2214 even up as high as 36c....but it was only using 1.081v for it.


----------



## HAL900

Does anyone have a Palit GTX 1080 OC Super JetStream 11Gbps and could upload the BIOS?


----------



## gupsterg

Been AMD for ~10yrs+; snagged an MSI GTX 1080 Sea Hawk EK X on a promo. It had a screw missing on the waterblock, and EKWB sent the screw via DHL ASAP.

Rig-defaults SuperPosition 4K Optimized result.

@Vellinious the card is supposed to go to 1847MHz boost, but the OSD of SuperPosition, Valley and Heaven showed something like 1975MHz. I suppose that's nVidia Boost 3.0 at 'work'? Gotta do graphed monitoring of GPU clocks.


----------



## coreykill99

Quote:


> Originally Posted by *gupsterg*
> 
> Been AMD for ~10yrs+, snagged a MSI GTX 1080 Sea Hawk EK X on a promo. Had a screw missing on WB block. EKWB sent screw via DHL ASAP.
> 
> Rig defaults SuperPosition 4K optimized result.
> 
> @Vellinious the card is supposed go to 1847MHz boost, OSD of SuperPosition, Valley and Heaven showed something like 1975MHz. I suppose nVidia Boost 3.0 at 'work'? gotta do graphed monitoring of GPU clocks.


did you get it from microcenter for 399?


----------



## gupsterg

Nope, I'm in the UK, etailer here.


----------



## stephenn82

Vellinious, from Tom's! What's up!

Gupsterg, not bad man! Here is a 1080 FTW Hybrid on a default run, 4K Optimized.



and bumped +100 mhz on core only

I would say you got a very good deal!!


----------



## coreykill99

Anyone have results offhand for a 1080p Extreme run? I think the best I got was 4601 or something along those lines. Is that about average, or is there still more room to push? Running an MSI 1080 Gaming with an EK block on it.


----------



## stephenn82

stand by...


----------



## stephenn82

1080p extreme gets me this...margin of error between 4370 and 4398

with 100+ on boost clock.


----------



## gupsterg

@stephenn82

Yes, it worked out well. As the etailer had no replacement stock (apparently a discontinued model), they gave me a further discount. Let's say the Sea Hawk EK X was quite a bit cheaper than a VEGA 64 with an EK WB.

Total noob on nVidia here, so lotta learning/reading to do. I just did the driver install, didn't even check what the default setup was, and did a run; also, the TR 1950X was at stock with RAM at 2133MHz C15 loose sub timings. Gonna tune the 'platform' first and do 3x runs, then aim for some GPU OC fun.

+rep for shares of runs.


----------



## stephenn82

Yeah man. I haven't used Nvidia since the 8800 GT was a thing. Got fed up with trash drivers and low performance compared to what it should do. Remember to completely remove drivers with DDU. Don't run it in admin/safe mode; it screws up the PC. Just run normal mode. Still works just fine.


----------



## Vellinious

Normal ambient temp runs, and iirc, with 2202 or so on the core. Can't recall.

1080 Extreme:



4k Optimized:


----------



## stephenn82

Huh. Mine runs 2138 and drops to 2126 at about 38C on the core. Ambient is 24. Max the GPU gets is 45-47 depending on usage.

I would like to push 2200 lol. Benches didn't run so well with sped-up VRAM; 150MHz more lowered the score 80 pts in 1080p Extreme. Maybe due to heat soak onto the cold plate from contacting the RAM as well as the GPU core.


----------



## mrgnex

Quote:


> Originally Posted by *gupsterg*
> 
> @stephenn82
> 
> Yes it worked out well. As the etailer had no replacement stock (apparently discontinued model) they gave me a further discount. Let's say the Sea Hawk EK X was quite a bit cheaper than VEGA 64 with EK WB.
> 
> Total noob on nVidia here, so lotta learning/reading to do. I just did driver install and didn't even check what is default setup and did a run, also the TR 1950X was at stock with RAM at 2133MHz C15 loose sub timings. Gonna tune 'platform' first and do 3x runs and then aim for some GPU OC fun.
> 
> +rep for shares of runs.


Dude, YOU are here? Awesome! You'll get custom BIOSes in no time, even though there's no way to make them XD


----------



## Sharchaster

I have the MSI Gaming X PLUS, and I can absolutely say the performance of the card is great, especially the cooler. Even when I crank it up to 100%, the cooler is still whisper quiet, and I haven't noticed any coil whine on it.

The thing I HATE MOST about this card is the power limit... *it's only 104%, which is LOW for something like a GTX 1080*. I notice crazy downclocking when I game at DSR 4K (even 2K), but it hasn't affected performance (FPS) so far.

Pros:
Great performance
AMAZINGLY quiet cooler
Temps don't exceed 80C (my ambient temperature was 28C or more)

Cons:
Power limit is only 104%, which is LOW


----------



## mksteez

Will be replacing my GTX 780 and am currently looking at the Asus Strix and EVGA FTW2.

Leaning towards the Strix, though. What do you guys think?


----------



## Gen Patton

Hello guys, I will have my EVGA 1080 Founders this Thursday; looking forward to joining. I had a 980 Ti SC, but the fans were not working and it was getting hot, so EVGA said to ship it to them. Now they're sending me the 1080 Founders. Great work, EVGA. I will never buy from anyone else; EVGA has me on graphics cards.


----------



## Gen Patton

If you buy graphics cards, get one from EVGA; they're great. It might cost a little more, but it's worth it. Go to their web page; you can buy direct from EVGA.


----------



## Malkorath

Considering switching back to the green team after several years of using AMD cards. I started out with the R9 390, and ended up replacing it because of a very unusual problem with it. I bought a Fury on Black Friday last year because that was all I could afford.

Since building a new rig back in late July I've decided that I wanted a true upgrade on the GPU front to accompany the new parts. Plus the disappointment with Vega has more or less turned me off to AMD GPUs as a whole.

I've been looking at the Asus Strix 1080 A8G model, the EVGA 1080 FTW2, or the FTW Hybrid. I wanted to take advantage of the Destiny 2 promotion but I didn't have the money ready in time.

Which card would be best and is there any chance of a new Destiny 2 bundle coming up soon? I've got a few friends who will be playing it, and missing the bundle really sucks as I wanted to join them.


----------



## Vellinious

Quote:


> Originally Posted by *Malkorath*
> 
> Considering switching back to the green team after several years of using AMD cards. I started out with the R9 390, and ended up replacing it because of a very unusual problem with it. I bought a Fury on Black Friday last year because that was all I could afford.
> 
> Since building a new rig back in late July I've decided that I wanted a true upgrade on the GPU front to accompany the new parts. Plus the disappointment with Vega has more or less turned me off to AMD GPUs as a whole.
> 
> I've been looking at the Asus Strix 1080 A8G model, the EVGA 1080 FTW2, or the FTW Hybrid. I wanted to take advantage of the Destiny 2 promotion but I didn't have the money ready in time.
> 
> Which card would be best and is there any chance of a new Destiny 2 bundle coming up soon? I've got a few friends who will be playing it, and missing the bundle really sucks as I wanted to join them.


I'm a fan of the EVGA products and warranty. Their customer service is second to none. I only buy ASUS boards, and the RMA process with them can be.....painful.


----------



## Malkorath

Quote:


> Originally Posted by *Vellinious*
> 
> I'm a fan of the EVGA products and warranty. Their customer service is second to none. I only buy ASUS boards, and the RMA process with them can be.....painful.


Yea I only buy ASUS boards usually, but I considered the Strix because it was well received and having unification with Aura (when it works) would be nice.

I'm leaning toward the FTW2 because of the 11gbps memory and the easier installation... But the Hybrid looks interesting and I've always wanted one.

Sent from my HTC6545LVW using Tapatalk


----------



## Scotty99

Truthfully, buying EVGA new is not where the benefit comes in; buying used is. The warranty is fully transferable, so you can buy a 1080 off a guy on Craigslist who bought it new and have 2 years left. EVGA is the only brand I would buy used because of that; there is no risk, only upside. Asus air-cooled cards are a bit better than EVGA's, so if you are buying new and it fits your theme, go for it.


----------



## Dasboogieman

Quote:


> Originally Posted by *Malkorath*
> 
> Yea I only buy ASUS boards usually, but I considered the Strix because it was well received and having unification with Aura (when it works) would be nice.
> 
> I'm leaning toward the FTW2 because of the 11gbps memory and the easier installation... But the Hybrid looks interesting and I've always wanted one.
> 
> Sent from my HTC6545LVW using Tapatalk


It would be safer to get the 11gbps one and slap your own AIO to it. That way, you can always fall back to the air cooler. Pre-made AIOs are a ticking time bomb and if they get issues later down the line, you have nothing to fall back to.


----------



## Malkorath

Quote:


> Originally Posted by *Dasboogieman*
> 
> It would be safer to get the 11gbps one and slap your own AIO to it. That way, you can always fall back to the air cooler. Pre-made AIOs are a ticking time bomb and if they get issues later down the line, you have nothing to fall back to.


Not really up for installing an AIO on a card myself tbh. I like to keep things simple with my system. Having a tough time trying to rationalize installing an AIO cooler for my CPU or not, let alone a GPU.

Btw is there any chance of a promotion coming up over the next couple weeks? Preferably an extension or reiteration of the Destiny 2 promotion.


----------



## coreykill99

Here is about the best I can do. Well, idk, it's the furthest I've pushed it so far, as I'm seeing decreases from pushing further. I've heard that's to be expected and there can be more gains if you push considerably past that point, but I haven't tried it yet.
2126 on the core
5556 on the mem


----------



## EDK-TheONE

1080 GamingX with GamingZ bios, [email protected] [email protected]@1.09


----------



## Sharchaster

MSI GTX 1080 Gaming X PLUS at stock clocks / boost, with a 12.6 GHz memory overclock....


----------



## Sharchaster

double post, I'm sorry............


----------



## HAL900

12.6 nice


----------



## Yukss

Hi guys, here are some of my results, if it helps.


----------



## EDK-TheONE

Nice result. Can you share your vBIOS?


----------



## Vellinious

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Nice result. can you share your vbios?


You realize that it's very likely your temps that are the limiting factor, and not the bios, yes?


----------



## Yukss

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Nice result. can you share your vbios?


me ?


----------



## Sharchaster

Quote:


> Originally Posted by *HAL900*
> 
> 12.6 nice


Yeah haha, this card rocks. It helps me get more stable min FPS while gaming and also increases the score in Superposition, without OCing the core....
It only crashes when I try to set the memory to max (+1000).


----------



## Sharchaster

Re-benched this morning at my house... and here are the results...



12.8 GHz on memory, and 2113 MHz on core....


----------



## Vellinious

Quote:


> Originally Posted by *Sharchaster*
> 
> rebench this morning on my house...and here's the results...
> 
> 
> 
> 12,8 GHz on Memory, and 2113 MHz on core....


That's a whole lot of power limit throttling....


----------



## Sharchaster

Quote:


> Originally Posted by *Vellinious*
> 
> That's a whole lot of power limit throttling....


The max is 106.6% when running; I took the screenshot as I was finishing the benchmark (the card was idle at that point).
Hell, it's only 104%, and it's the only downside of this card; other than that, the performance is great, even amazing I think.

*If I can get at least a 120% power limit, maybe I can try something like 2200 MHz on air, if temps allow.
The temp never exceeds 65C*


----------



## HAL900

2100 /12000 MHZ


----------



## coreykill99

Quote:


> Originally Posted by *Yukss*
> 
> hi guys, here are some of my results if helps
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


So you have 4650 and I have 4600; you have a slightly lower core, and the cooling looks similar. Is something this close margin of error? Variance in the silicon lottery...
Or just Pascal doing that thing where it only shows you half of the picture and clocks down because it wants to, but doesn't tell you.


----------



## HAL900

2139 MHZ - boost 3.0 / 12010 MHZ


----------



## coreykill99

That looks a bit closer to what I've seen as margin of error; I get around a 10-15 point variance between runs. 50 points just seemed like a lot.
Looks like I need to push my memory harder.
Ugh, now I've got to wait for my third motherboard to come back from RMA. Doubt I'm going to be doing any score chasing on an A320 lol.


----------



## Dasboogieman

Quote:


> Originally Posted by *coreykill99*
> 
> that looks a bit closer to what ive seen as margin of error. around 10-15 point variance I will get between runs. 50 points just seemed like a lot.
> looks like I need to push my memory harder.
> ugh, now ive got to wait for my third motherboard to come back from RMA, doubt Im going to be doing any score chasing on an A320 lol.


Superposition is unusually consistent for me. Around 80% of runs with identical settings arrive at the exact same score on the dot; the other 20% show only a 2-3 point variance.


----------



## Yukss

Quote:


> Originally Posted by *coreykill99*
> 
> so you have 4650 and I have 4600. you have a slightly lower core. the cooling looks similar. is something this close margin of error. variance in silicone....
> or just pascal doing that thing where it only shows you half of the picture and clocks down because it wants to but doesn't tell you.


Yes, I guess so. Nice results, everyone.

Mine is at 2100 core and 11100 on the mems. I can probably push it a bit further.









PS: mine is the FE card, so it doesn't have the double PCIe power connector; that's why it's so limited on power. Max temp under load in winter was 45C, and now in summer it's around 55C.


----------



## HAL900

Try this BIOS: https://www.dropbox.com/s/jgy94v4is1n57z7/gp104.rom?dl=0
230 W for 1x 8-pin cards

Or this one, with no limit: https://www.dropbox.com/s/nurkelc2zctl6a8/aaaa.rom?dl=0


----------



## Vellinious

Quote:


> Originally Posted by *Sharchaster*
> 
> the max is 106,6% when Running, I take the screenshot while I'm finishing the benchmark (the card was idle at that time),
> hell it's only 104% and it's only the downside of this card, other than that the performance was great even amazing I think.
> 
> *
> If I can get at least 120% power limit, maybe I can try to something like 2200 Mhz on air, as temps (maybe) allow.
> The temp never exceed 65C*


No, look at the PerfCapReason line. You're seeing a lot of green. That's PWR, the power limit, which I believe is NVIDIA's boost making micro-adjustments....voltage or clock, it's doing something.
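If you want to watch for power-limit throttling outside of GPU-Z, `nvidia-smi` exposes the same throttle flags. A minimal sketch that parses its CSV output; the query fields come from nvidia-smi's standard `--query-gpu` set, and the sample line here is hypothetical, not captured from this card:

```python
import csv
import io

# Hypothetical one-line sample of:
#   nvidia-smi --query-gpu=clocks.gr,power.draw,clocks_throttle_reasons.sw_power_cap \
#       --format=csv,noheader
SAMPLE = "2113 MHz, 196.45 W, Active"

def parse_throttle(line):
    """Split one CSV row into (core MHz, watts, power-capped?)."""
    row = next(csv.reader(io.StringIO(line)))
    core_mhz = int(row[0].split()[0])
    watts = float(row[1].split()[0])
    power_capped = row[2].strip() == "Active"
    return core_mhz, watts, power_capped

core, watts, capped = parse_throttle(SAMPLE)
if capped:
    print(f"PWR perfcap active at {core} MHz / {watts} W")
```

Logging this once a second during a bench run gives roughly the same picture as the green PWR band in GPU-Z's PerfCapReason graph.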


----------



## Yukss

Quote:


> Originally Posted by *HAL900*
> 
> try this bios https://www.dropbox.com/s/jgy94v4is1n57z7/gp104.rom?dl=0
> 230w in 1 x8 pin cards
> 
> https://www.dropbox.com/s/nurkelc2zctl6a8/aaaa.rom?dl=0
> or this .No limit


How do I flash my card? I've never done it before.


----------



## HAL900

https://www.dropbox.com/s/w56qrpuwxg55eqg/nvflash_5.319.0-win.zip%281%29.zip?dl=0

nvflash64 -6 gp104.rom


----------



## Yukss

Quote:


> Originally Posted by *HAL900*
> 
> https://www.dropbox.com/s/w56qrpuwxg55eqg/nvflash_5.319.0-win.zip%281%29.zip?dl=0
> 
> nvflash64 -6 gp104.rom


Thanks. I have to ask... will I kill my card?


----------



## HAL900

NO


----------



## Yukss

Quote:


> Originally Posted by *HAL900*
> 
> NO


ty, I just have to click on them and that's it?


----------



## HAL900

restart pc


----------



## Yukss

Quote:


> Originally Posted by *HAL900*
> 
> restart pc


after double clicking on them? Sorry, I'm a noob about it; I overclock stuff, but I've never flashed anything.


----------



## andydabeast

So I am still learning how to OC this card. I have the Windforce card. I am using Superposition 1080p Extreme to test.

1. With the core offset at +150 I experimented with the voltage offset. Anything over 50% gave me artifacts. Then, at +50% voltage, I stepped up the core offset until I had artifacts and landed at +175 core and +50 voltage. Why does more voltage give artifacts? Is that normal?

2. My power limit slider only goes up to 108% in MSI Afterburner from the 100% stock, which is weird. I guess that is just for throttling the power?

3. I have the memory at +500. At +600 my score went down. Still gotta tune more to find the sweet spot. But below he is at +800 memory and wrecks my score by over 100 points. Is there any logic to how the memory clocks work, or is it silicon lottery??
EDIT: after reading https://www.reddit.com/r/4mm5zt/for_those_overclocking_1080_memory_please_read/ I think I will just do a Superposition run at 10 or 25 MHz increments from stock to unstable and see all the numbers to find my best clock.
Quote:


> Originally Posted by *Sharchaster*
> 
> MSI GTX 1080 Gaming X PLUS stock clocks / boost, with 12,6 GHz memory overclock....


----------



## HAL900

Quote:


> Originally Posted by *Yukss*
> 
> after double click on them ? sorry i am a noob about t, i overclock stuff, but never flash anything


You approve twice with "y" and that's all,
then restart the PC.


----------



## AshBorer

I'm going to be delidding my 7700K soon and will be using liquid metal in place of thermal paste.

Is it worth replacing the paste on my 1080's GPU die with liquid metal too?


----------



## coreykill99

Are you using water cooling for the GPU? AFAIK it isn't really "recommended", but I did it anyway: I used Conductonaut by Thermal Grizzly on the GPU die and on my Ryzen chip. Old GPU idle temp was 31-32C; the new temp is around 25C, but YMMV. Much lower temps at idle, but maybe only a 2-3C drop under load.

If you're using air cooling, I have heard of it working; just keep in mind the material of your GPU cooler, making sure it's not aluminum.
And however much liquid metal you think you need, you need less. Much less. Probably about half of what you thought. You can always add a tiny drop, but removing the stuff is nigh impossible.


----------



## AshBorer

It's an EVGA SC from June 2016, right when Pascal came out.

https://www.evga.com/products/product.aspx?pn=08g-p4-6183-kr

I think it's an aluminum heatsink? Idk if the actual heatplate is coated in something else. IIRC aluminum doesn't do well with the liquid metal stuff from Coollaboratory.


----------



## coreykill99

Idk. Glancing at a few teardowns on YouTube, it looks like aluminum to me. Maybe if you're lucky it's nickel-plated something. But if you're just air cooling the card, I really wouldn't worry about it; get some Thermal Grizzly Kryonaut and put it on there. Or just whatever you have laying around, even.
I don't think it matters which company you got your card from; I'm sure anything you have laying around is probably better than the stock paste.


----------



## AshBorer

Yeah, I'll play it safe and just leave it alone. I've got Thermal Grizzly Hydronaut on it. Trying to get my hands on some Kryonaut at some point.


----------



## coreykill99

Quote:


> Originally Posted by *AshBorer*
> 
> Yeah I'll play it safe and just leave it alone. I've got thermal grizzly hydronaut on it. Trying to get my hands on some kryonaut at some point


Not to get too off-topic with thermal paste info, but I bought Conductonaut and Kryonaut, and was supplied with Hydronaut by EK with a block. Between the Hydronaut and the Kryonaut, I saw zero difference. So if you have enough of the one you already have, I'd suggest you don't waste your money chasing more pastes.
Just my 2 cents.


----------



## Gen Patton

I would go EVGA FTW2; go to their web page and look it up.


----------



## Gdourado

Is the GIGABYTE AORUS GeForce GTX 1080 11Gbps a good 1080?
How big of an upgrade is it compared to a 980 Ti G1 that does 1500 MHz core?

Cheers


----------



## mrgnex

Quote:


> Originally Posted by *Gdourado*
> 
> Is the GIGABYTE AORUS GEFORCE GTX 1080 11GBPS a good 1080?
> How big of an upgrade is compared to a 980ti G1 that does 1500mhz core?
> 
> Cheers


Every 1080 is a good 1080.. That cooler and PCB are indeed very good; it'll keep the card cool. Any 1080 is about 30% faster than a 980 Ti; beyond that, it depends on how well the particular 1080 OCs..


----------



## Sharchaster

Quote:


> Originally Posted by *andydabeast*
> 
> So I am still learning how to OC this card. I have the windforce card. I am using Superposition 1080p Extreme to test.
> 
> 1. With core offset at +150 I experimented with voltage offset. Anything over 50% gave me artifacts. *Then at +50% voltage I stepped up the core offset till I had artifacts and landed at +175 core and +50 voltage. Why does more volts give artifacts? is that normal?*
> 
> 2. My power limit slider only goes up to 108% in MSI Afterburner from the 100% stock which is weird. I guess that is just for throttling the power?
> 
> 3. I have the memory at +500. At +600 my score went down. Still gotta tune more to find the sweet spot. But below he is at +800 memory and wrecks my score by over 100 points. Is there any logic to how the memory clocks work or is it silicon lottery??
> EDIT: after reading
> 
> __
> https://www.reddit.com/r/4mm5zt/for_those_overclocking_1080_memory_please_read/
> I think I will just do a superposition run at 10 or 25 increments from stock to unstable and see all the numbers to find my best clock.


Bold: it happened to me too the first time I set it. I think it's a driver issue, not the card, since now I never see artifacts again unless I push the card beyond its limit (>2113 MHz)...

No. 3: I think it's silicon lottery, or the BIOS itself. I only crash the card when I crank it up to max (+1000) in Superposition. *Below +1000 has been fine for me so far.*

I will post my re-bench after work....*maybe 3-4 hours from the time I submit this post*....

Number 2: yes, it will throttle the power. My suggestion is to not set it to max; I got less throttling when I left the slider at 100% instead of setting it to the max.


----------



## Gdourado

Quote:


> Originally Posted by *mrgnex*
> 
> Every 1080 is a good 1080.. This cooler and PCB is indeed very good. It'll keep it cool. Any 1080 is about 30% faster than a 980ti. It depends on how good the 1080 OC's..


That particular Gigabyte is the same price as an air-cooled Vega 64.
How do they compare? Is the 1080 faster?


----------



## coreykill99

From what I've seen, a stock Vega 64 can run comparably to a stock 1080 in a few titles. In most it lags behind, but it can be OC'd to catch up, and in even fewer titles it surpasses the 1080. However, power consumption is insane on an OC'd Vega 64, so you'd need to keep that in mind.
I haven't seen many comparisons of an OC'd Vega 64 against an OC'd 1080, though. I have a good feeling the 1080 would curb stomp it, both on performance and power draw, but more research would be needed to find a review comparing both OC'd.

I was originally going to wait for Vega when I built my Ryzen system, but decided I wanted all my components now, so I bought the 1080 in March.
I don't regret it at all, and I'm glad I did after seeing the performance they brought with Vega 64.
From what I've seen, only Vega 56 is halfway decent, and that's only when you find it at a non-inflated price.

But all this is anecdotal and opinion, as I only have a 1080 and have only watched some reviews; I haven't personally played with a Vega card.


----------



## stephenn82

Quote:


> Originally Posted by *andydabeast*
> 
> So I am still learning how to OC this card. I have the windforce card. I am using Superposition 1080p Extreme to test.
> 
> 1. With core offset at +150 I experimented with voltage offset. Anything over 50% gave me artifacts. Then at +50% voltage I stepped up the core offset till I had artifacts and landed at +175 core and +50 voltage. Why does more volts give artifacts? is that normal?
> 
> 2. My power limit slider only goes up to 108% in MSI Afterburner from the 100% stock which is weird. I guess that is just for throttling the power?
> 
> 3. I have the memory at +500. At +600 my score went down. Still gotta tune more to find the sweet spot. But below he is at +800 memory and wrecks my score by over 100 points. Is there any logic to how the memory clocks work or is it silicon lottery??
> EDIT: after reading
> 
> __
> https://www.reddit.com/r/4mm5zt/for_those_overclocking_1080_memory_please_read/
> I think I will just do a superposition run at 10 or 25 increments from stock to unstable and see all the numbers to find my best clock.


How?

I have an EVGA FTW Hybrid, and when I touch the RAM speed, it tanks and crashes Superposition and Valley. I'm talking just putting +100 MHz on the memory. It runs cool; it never peaks over 42C during 1080p Extreme runs.

I left the voltage alone as well as 100% on the slider; no difference. My core clock is only +125, so nothing insane.

Ideas? Anyone? I need to find another video on this. Jayztwocents is good...maybe I should watch it again while tweaking the settings. Looking for personal experience here, not just a talking head with some fun commentary.


----------



## coreykill99

Quote:


> Originally Posted by *stephenn82*
> 
> How?
> 
> I have an EVGA FTW Hybrid, and when i touch the ram speed, it tanks out and crashes superposition and valley. Im talking just putting +100mhz on memory. It runs cool. Never peaks over 42c during 1080p Extreme runs.
> 
> I left voltage alone as well as 100% on slider, no difference. My core clock is only 125+ so not anything insane.
> 
> Ideas? Anyone? I need to find another video on this. Jayztwocents is good...maybe i should watch it again while tweaking the settings. Looking for personal experience here, and not just a talking head with some fun commentary.


Have you tried putting the core back to stock and just trying to OC the memory?
Maybe the core is only showing its instability when the mem is loaded up. Probably unlikely, but worth a shot.
I know the card only pulls the power it needs, but my card shows detrimental effects with the power slider all the way up. You could try adding 15% to the power slider at a time and trying the mem again.
Also, is this a new 1080 with the 11 Gb/s mem already on it? It might not like being touched that way if the mem is clocked that high to start.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> How?
> 
> I have an EVGA FTW Hybrid, and when i touch the ram speed, it tanks out and crashes superposition and valley. Im talking just putting +100mhz on memory. It runs cool. Never peaks over 42c during 1080p Extreme runs.
> 
> I left voltage alone as well as 100% on slider, no difference. My core clock is only 125+ so not anything insane.
> 
> Ideas? Anyone? I need to find another video on this. Jayztwocents is good...maybe i should watch it again while tweaking the settings. Looking for personal experience here, and not just a talking head with some fun commentary.


For better results, you should use the voltage / frequency curve.

When you set an offset overclock, you're allowing the software to create what roughly equates to a stock voltage / frequency curve by itself....it's essentially just raising every point across the curve by your offset amount. As you can see below, I set an offset of +150 on the slider, but looking at the curve, it set +143. That's because 150 was outside of the 12 MHz steps that Pascal uses. Disregard that for a second, and just look at the voltage. See how the curve it set has a voltage for that clock of 1043mv? That's going to allow the GPU to try to run your prescribed clock at that voltage before it bumps the voltage up. Micro-changes like that inherently cause instability.



Now we'll look at the voltage / frequency curve method. I've set an offset clock of +110 to get my voltage points close, then raised the frequency points for the voltages above 1043mv to higher clocks until I got to the voltage and frequency I was targeting. In this case, the exact same 2164 core clock (+143), except now it's not going to try to run 1043mv; it's going to go straight to the 1081mv I've prescribed for it, and won't go any lower unless it starts warming up, in which case it would drop a step and run 2154 @ 1075mv. This GPU is under water, so the likelihood of that happening is slim to none, but.....on air, it could.



I wrote this up quick to help a few others....really basic, and not too detailed. You can go pretty extreme into the voltage / frequency curve if you like. For instance....I charted that my GPU would run fine at 1081mv @ 2214 when the core stays under 34c. Higher than 34c would require 1093mv. Lower than 20c on the core, and it'd run at 1075mv.

Voltage point, frequency point, core temp, ambient temp.
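The step-snapping behavior described above (+150 requested, +143 applied) can be illustrated with a toy function. This is only a sketch assuming clean 12 MHz bins counted from zero offset; real Pascal boost tables use roughly 12-13 MHz bins anchored to the card's own clocks, which is why the actual applied offset was +143 rather than a neat multiple of 12:

```python
STEP_MHZ = 12  # assumed bin size; real Pascal bins are ~12-13 MHz

def snap_offset(requested, step=STEP_MHZ):
    """Largest offset <= requested that lands on the step grid."""
    return (requested // step) * step

def applied_clock(stock_boost, requested_offset):
    """Clock the driver would actually run for a given slider offset."""
    return stock_boost + snap_offset(requested_offset)

print(snap_offset(150))          # 144 on this idealized 12 MHz grid
print(applied_clock(2025, 150))  # 2169
```

The takeaway is that the slider is quantized: asking for an offset between two bins just gets you the bin below, which is why a curve point can read lower than what you typed.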


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> For better results, you should use the voltage / frequency curve.
> 
> When you set an offset overclock, you're allowing the software to create what roughly equates to a stock voltage / frequency curve by itself....it's essentially just raising every point across the curve up by your offset amount. As you can see below, I set an offset of +150 on the slider, but looking at the curve, it set +143. That's because 150 was outside of the 12mhz steps that Pascal uses. Disregard that for a second, and just look at the voltage. See how the curve that it's set, has a voltage for that clock at 1043mv? That's going to allow the GPU to try to run your prescribed clock at that voltage, before it bumps the voltage up. Micro-changes that like inherently cause instability.
> 
> 
> 
> Now we'll look at the voltage / frequency curve method. I've set an offset clock of +110, to get my voltage points close, then raised the frequency points for the voltages above 1043mv to higher clocks until I got to the voltage and frequency I was targeting. In this case, the exact same 2164 core clock (+143), except now, it's not going to try to run 1043mv, it's going to go straight to the 1081mv I've prescribed for it to run, and won't go any lower, unless it starts warming up, in which case, it would drop a step and run 2154 @ 1075mv. This GPU is under water, so the likelihood of that happening are slim and none, but.....on air, it could.
> 
> 
> 
> I wrote this up quick to help a few others....really basic, and not too detailed. You can go pretty extreme into the voltage / frequency curve if you like. For instance....I charted that my GPU would run fine at 1081mv @ 2214 when the core stays under 34c. Higher than 34c would require 1093mv. Lower than 20c on the core, and it'd run at 1075mv.
> 
> Voltage point, frequency point, core temp, ambient temp.


Thanks for all of that. It's relevant, but it doesn't mention memory overclocking. I use XOC, and run +125 on my linear setup, very similar to how you run yours. I don't use AB anymore. How do you go about changing the voltage along that line like you would/could for the clock?

I don't know if memory speeds greater than 5005 will have any effect on performance. The 390 I used to have did respond when pushing the memory up, but with almost no relative game performance gain; maybe a 2-3 fps increase.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> Thanks for all of that. It is relevant but doesnt mention memory overclocking. I use XOC, and run +125 on my linear setup, very similar to how you run yours. I dont use AB anymore. How do you go about changing the voltage along that line like you would/could for clock?
> 
> I dont know if memory speeds greater than 5005 will have any effect on performance. The 390 I used to have did wen pushing memory up. But almost no relative game performance, maybe 2-3fps increase.


Memory is pretty easy. There is no voltage control for memory...pick a clock that runs well and stays stable across multiple runs with slightly increasing temps. I tested my memory clocks using Timespy Graphics test 1, with the core clock at 2189 @ 1075mv. I started with +400 and moved up 50 each step until performance started falling off, then I backed up a step and went by 10....rinse and repeat.

Memory clocks really depend on the software. Some benches respond very well to a memory overclock; some respond better to core clock. Games are the same way. Some people are able to get a +800 offset and higher on their memory.....it doesn't scale linearly, though. +700 may run like crap, +770 will run like crazy, +775 falls back off again, just to find +810 gives the highest benchmark scores.

Personally, I have never run a bench over +595.
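The coarse-then-fine search described above is easy to script if you can run a benchmark per offset. A sketch where `bench_score` is a hypothetical stand-in; in practice you'd launch Superposition or Timespy at each offset and also bail out on a crash or artifacts:

```python
def find_best_offset(bench_score, start=400, coarse=50, fine=10, limit=1000):
    """Coarse pass: raise the memory offset in `coarse` MHz steps until
    the score drops, then refine around the last good point in `fine`
    MHz steps. `bench_score` maps an offset to a benchmark score."""
    best, best_score = start, bench_score(start)
    off = start + coarse
    while off <= limit:
        score = bench_score(off)
        if score < best_score:      # performance started falling off
            break
        best, best_score = off, score
        off += coarse
    # fine pass around the last good coarse step
    for off in range(best - coarse + fine, min(best + coarse, limit) + 1, fine):
        score = bench_score(off)
        if score > best_score:
            best, best_score = off, score
    return best, best_score

# Stand-in score curve that peaks around +610 and falls off past it
fake_bench = lambda off: 5000 - 0.01 * (off - 610) ** 2

print(find_best_offset(fake_bench))  # (610, 5000.0)
```

Because GDDR5X error correction makes real scores non-monotonic (as noted above, +770 can beat +775), averaging two or three runs per offset before comparing is a sensible refinement.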


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> Thanks for all of that. It is relevant but doesnt mention memory overclocking. I use XOC, and run +125 on my linear setup, very similar to how you run yours. I dont use AB anymore. How do you go about changing the voltage along that line like you would/could for clock?
> 
> I dont know if memory speeds greater than 5005 will have any effect on performance. The 390 I used to have did wen pushing memory up. But almost no relative game performance, maybe 2-3fps increase.


On most 1080s, there is no separate memory Voltage control.

As far as memory overclocks, it's trial and error. Every card is different. Get your card hot and experiment on a paused screen in Valley.

My Gaming X has 2 memory 'sweet spots', both between 5525-5600. YMMV


----------



## stephenn82

I apologize for my communication skills...I wasn't asking about memory voltage control at all.

1) How do you set a slider for voltage control like you do with clocks on the core? I see that you have two curves, one white and one grey, corresponding to core clock at a specific voltage? With XOC, my blue lines are voltage...and it tops out at 1062mv. I set a clock per that chart by clicking the box above it. So, at 1062mv, I can run +125 MHz across all voltages by setting it in the basic settings of XOC. It looks to have the same curve as setting the offset in the main window of XOC as well.

2) Setting memory is just trial and error. I got it. I added +100 on my card and Superposition didn't like it; it would crash out at scene 3 each time. Odd. Put the card at defaults and just added +100 to memory, thinking the core clock could have caused it; same...crash out in scene 3. Put memory back to 5005 and the offset to +125, and it was all good as expected.


----------



## Gdourado

I am seeing these two cards:
GIGABYTE AORUS GeForce GTX 1080 8G 11Gbps
KFA2 GeForce GTX 1080 EX OC Edition

The Gigabyte is 80 euros more expensive.
But is it worth it?
From what I see it has a better cooler, better PCB and 11gbps memory.
Is that worth 80 euros more?

Cheers!


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> I apologize for my communications skills...I wasnt asking about memory voltage control at all.
> 
> 1) How do you set a slider for voltage control like you do with clocks on core? I see that yo uhave two curves, one white and one grey, corresponding to core clock at specific voltage? With XOC, my blue lines are voltage...and it tops out at 1062mv. I set a clock per that chart by clicking the box above it. So, at 1062mv, I can run +125mhz across all voltages by setting it in the basic settings of XOC. It looks to have the same curve as setting the offset in the main window of XOC as well.
> 
> 2) Setting memory is just trial and error. I got it. I added 100+ on my card and superposition didnt like it. it would crash out at scene 3 each time. odd. put card at defaults and just add 100+ to memory, thinking that core clock could have caused it, same...crash out in scene 3. Put memory back to 5005 and offset to 125, it was all good as expected.


There aren't two curves.....the top screenshot is showing what happens when you set an offset overclock for the core. The bottom is what a proper voltage / frequency curve should look like. To change them, grab a point and drag it up to where you want it.

I stopped using EVGA's tool because it's pretty poorly done, imo.


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> There aren't two curves.....the top screenshot is showing what happens when you set an offset overclock for the core. The bottom is what a proper voltage / frequency curve should look like. To change them, grab a point and drag it up to where you want it.
> 
> I stopped using EVGA's tool because it's pretty poorly done, imo.


I know it has been polished quite a bit...but if it's still not the best, I may roll back to AB.

How do you go about bumping the voltage up, if needed? I have a voltage slider on the left that goes from 0-100%.


----------



## stephenn82

I found this video that runs through it somewhat.


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> I found this video that runs through it somewhat.


Here's how to use Afterburner on a 1080: Afterburner 4.3.0 Tutorial.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> I know it has been polished quite a bit...but if its still not the best, I may roll back to AB.
> 
> How do you go about bumping the voltage up, if needed? I have a voltage slider on the left, goes from 0-100%.


Put the slider up to 100%


----------



## mrgnex

Quote:


> Originally Posted by *Gdourado*
> 
> That particular Gigabyte is the same price as a Vega 64 air cooled.
> How do they compare. Is the 1080 faster.


The 1080 is faster by around 10% (http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-vs-AMD-RX-Vega-64/3603vs3933)
Also, the 1080 is vastly more power efficient. I don't see any reason to go with Vega.
Quote:


> Originally Posted by *Gdourado*
> 
> I am seeing these two cards:
> GIGABYTE AORUS GeForce GTX 1080 8G 11Gbps
> KFA2 GeForce GTX 1080 EX OC Edition
> 
> The Gigabyte is 80 euros more expensive.
> But is it worth it?
> From what I see it has a better cooler, better PCB and 11gbps memory.
> Is that worth 80 euros more?
> 
> Cheers!


I don't think that's a question we can answer for you. The silicon lottery plays a big part in overclocking. A PCB design might squeeze out the last few MHz, but the bulk of the potential is just sheer luck..
The Gigabyte might be quieter, a little faster, and nicer to look at. Ask yourself if that's worth 80 euros.
Or find one that sits between them..


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> Put the slider up to 100%


I tested this last night and it still crashed...not sure what I am doing wrong lol

I will just roll to AB and see what it gives me...tomorrow after shift...I hate being stuck at work for 24 hours. At least I have YouTube, Netflix, Prime, and most importantly, OCN to keep me busy between actual work, sleep, and school.


----------



## Gen Patton

Ok guys, I said once it gets here I will show. Here's Motoko:






More to follow


----------



## Gen Patton

I have not started to overclock yet, but temps are in the 40s, and my FX-8350 is a cool 34.


----------



## mrgnex

Quote:


> Originally Posted by *Gen Patton*
> 
> I have not started to Overclock yet, but temps are in the 40"s and my FX 8350 a cool 34


You got a 1080 with a 8350?


----------



## andydabeast

Quote:


> Originally Posted by *Gen Patton*
> 
> I have not started to Overclock yet, but temps are in the 40"s and my FX 8350 a cool 34


Quote:


> Originally Posted by *mrgnex*
> 
> You got a 1080 with a 8350?


I had that... till I went Ryzen ;-) I am still benching the change, but here are some numbers.

The first number is with the 8370 at 4.8 GHz; the second is with a stock 1600X. The GPU is the same for all tests, with a +140 core overclock and no memory OC.

Heaven 1440p ultra- 1,609 - 1,666

-Firestrike
Firestrike- 12,254 - 17,684
Graphics- 21,793 - 22,732
Physics- 8,818 - 17,149
Combo- 3,314 - 6,753

SuperPosition 1080p Extreme- 4,131 - 4,137

-Time Spy
CPU- 3,615 - 5,969
GPU- 7,299 - 7,606

Tomb Raider 2013- 1440p Max- 116 - 126

-Performance Test
Single thread- 1,748 - 1,913
CPU score- 10,501 - 14,446
GPU (3D)- 6,028 - 12,538??
Memory- 1,430 - 2,152

-Cinebench
multicore- 714 - 1,224
single core- 109 - 153
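For a before/after table like this, the uplift percentage is just `(after - before) / before`. A quick sketch using two of the numbers above:

```python
def uplift(before, after):
    """Percent change going from `before` to `after`."""
    return (after - before) / before * 100

# Firestrike Physics from the table above: 8,818 -> 17,149
print(f"{uplift(8818, 17149):.1f}%")  # 94.5%
# Superposition 1080p Extreme barely moved: 4,131 -> 4,137
print(f"{uplift(4131, 4137):.1f}%")   # 0.1%
```

Which lines up with expectations: the CPU-bound tests nearly doubled, while the GPU-bound Superposition run was basically unchanged.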


----------



## Sharchaster

1440p extreme, 2113 MHz on Core and +975 on Memory


----------



## HAL900

A 220 W power limit is too low.








https://www.dropbox.com/s/nurkelc2zctl6a8/aaaa.rom?dl=0
Try this


----------



## n8t1308

So, a quick question for the bright minds at OCN. I have an MSI GeForce GTX 1080 Armor OC 8GB for my GPU, and I am going to be doing an SLI configuration. The plan is to use the same card and then water cool it like the GPU already in my rig. My problem is when I will be able to buy it. I can't find it on MSI's website anymore, and the stock gets smaller every time I look at vendors online. I won't be able to buy it for another couple of weeks at least. When the time comes and I am finally able to buy one, what do I do if I can't find one anymore? Can I SLI with another MSI 1080, like the Gaming X version, no problem?
I know my rig can handle another GPU, so that isn't an issue at all. I'm literally just worried the card I want to SLI may be discontinued now.







Any and all help is greatly appreciated!


----------



## Beagle Box

Quote:


> Originally Posted by *n8t1308*
> 
> So quick question for the bright minds at OCN. I have an MSI Geforce GTX 1080 Armor OC 8gb for my GPU and I am going to be doing an SLI configuration. The plan is to use the same card and then water cool it like the other GPU already in my rig. My problem is when I will be able to buy it. I cant find it on MSI's website anymore and the stocks are getting smaller and smaller every time I look at vendors online. I wont be able to buy it for another couple weeks at least. When the time comes and I am finally able to buy it, what do I do if i cant find one anymore? Can I SLI with another MSI 1080 like the Gaming X version no problem?
> I know my rig can handle another GPU so that isn't an issue at all. Literally just worried the card I want to SLI may be discontinued now
> 
> 
> 
> 
> 
> 
> 
> Any and all help is greatly appreciated!


The Armor is just a cheaper Gaming X with a crappy cooler. The PCBs are _exactly_ the same.


----------



## stephenn82

Quote:


> Originally Posted by *stephenn82*
> 
> I did this testing it last night, it still crashed...not sure what I am doing wrong lol
> 
> I will just roll to AB and see what it gives me...tomorrow after shift...I hate being stuck at work for 24 hours. At least I have Youtube, Netflix, Prime, and most importantly, OCN to keep me busy between actual work, sleep, and shool.


A quick review of progress.

Bumped the slider to 100% voltage, and slid power to max and temp to max in AB. No more XOC. It still won't take a VRAM overclock without crashing, not even +100 MHz.


----------



## stephenn82

I also assume that in AB I should set voltage and clock control to third-party, since my card is an EVGA?
See below:



With these settings, the score went down in Superposition, by quite a bit. With AB:




With XOC


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> For better results, you should use the voltage / frequency curve.
> 
> When you set an offset overclock, you're allowing the software to create what roughly equates to a stock voltage / frequency curve by itself....it's essentially just raising every point across the curve up by your offset amount. As you can see below, I set an offset of +150 on the slider, but looking at the curve, it set +143. That's because 150 was outside of the 12mhz steps that Pascal uses. Disregard that for a second, and just look at the voltage. See how the curve that it's set, has a voltage for that clock at 1043mv? That's going to allow the GPU to try to run your prescribed clock at that voltage, before it bumps the voltage up. Micro-changes that like inherently cause instability.
> 
> 
> 
> Now we'll look at the voltage / frequency curve method. I've set an offset clock of +110, to get my voltage points close, then raised the frequency points for the voltages above 1043mv to higher clocks until I got to the voltage and frequency I was targeting. In this case, the exact same 2164 core clock (+143), except now, it's not going to try to run 1043mv, it's going to go straight to the 1081mv I've prescribed for it to run, and won't go any lower, unless it starts warming up, in which case, it would drop a step and run 2154 @ 1075mv. This GPU is under water, so the likelihood of that happening are slim and none, but.....on air, it could.
> 
> 
> 
> I wrote this up quick to help a few others....really basic, and not too detailed. You can go pretty extreme into the voltage / frequency curve if you like. For instance....I charted that my GPU would run fine at 1081mv @ 2214 when the core stays under 34c. Higher than 34c would require 1093mv. Lower than 20c on the core, and it'd run at 1075mv.
> 
> Voltage point, frequency point, core temp, ambient temp.


Here is my attempt at the curve method. It's not as simple as clicking a point and letting XOC do the work; it's per point along the scale. Matched your +156. Let's give her a go.
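For anyone curious, the bin-snapping Vellinious describes (ask for +150, get +143 applied) can be sketched in a few lines. The bin size, the base clock, and the floor-style rounding below are all illustrative assumptions, not measured values for any real card:

```python
# Rough sketch of how Boost snaps a requested offset to a discrete clock
# bin, so "+150" can come back as "+143". Bin size, base clock, and
# floor-style rounding are illustrative assumptions, not measured values.

def snap_offset(base_mhz: int, offset_mhz: int, step_mhz: int = 13) -> int:
    """Return the offset actually applied after snapping to a clock bin."""
    target = base_mhz + offset_mhz
    snapped = (target // step_mhz) * step_mhz   # round down to a bin
    return snapped - base_mhz

# With an assumed 2015MHz base: +150 isn't on a bin, +156 is.
print(snap_offset(2015, 150))   # -> 143
print(snap_offset(2015, 156))   # -> 156
```

This is why slider values that aren't on a bin boundary quietly shift: only clocks on the grid are achievable.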


----------



## stephenn82

FAILED at +156 core clock. Maybe I just have a dud card?


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> FAILED at +156 core clock. Maybe I just have a dud card?


Every card is a little bit different. Yours may need more voltage, or less voltage to run the same clock. Too much voltage can be just as unstable as too little.


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> FAILED at +156 core clock. Maybe I just have a dud card?


No.
Your card failed because it can't do 2176MHz @ 1.075V at whatever temp it's running on your PC.

Every point on the curve up to its max voltage point should be realistically doable by the card at average running temps. Some BIOSes are pretty good about recovering and actively seek a doable speed/voltage combination on or below the curve. Other BIOSes just drop out and run the stock curve until you reset Afterburner.

I suggest you start with a lower curve. Your BIOS begins its plateau @ 1.050V. Your curve should be no higher than 2100MHz at that point (lower if it can't manage that). You can then set to work maximizing the higher Voltage end of the curve.

If you scan some of the benchmarking threads, you'll see some folks provide curve information along with their submissions. If you look closely, you'll notice the same card will have a different curve for each benchmark. And then nobody games using these curves. Other, lower curves are better for gaming or casual use.

Get ideas from those curves, but don't copy them directly unless you enjoy frustration and failure. Every card is different, as is every PC they run in.

To make things easier, start thinking in terms of _'Speed@Voltage@Temperature'_. "+125" really doesn't mean anything.

Good Luck.


----------



## stephenn82

Quote:


> Originally Posted by *Beagle Box*
> 
> No.
> Your card failed because it can't do 2176MHz @ 1.075V at whatever temp it's running on your PC.
> 
> Every point on the curve up to its max Voltage point should be realistically doable by the card at average running temps. Some BIOSs are pretty good about recovering and actively seek a doable speed/Voltage combination on or below the curve. Other BIOSs just drop out and run the stock curve until you reset Afterburner.
> 
> I suggest you start with a lower curve. Your BIOS begins its plateau @ 1.050V. Your curve should be no higher than 2100MHz at that point (lower if it can't manage that). You can then set to work maximizing the higher Voltage end of the curve.
> 
> If you scan some of the benchmarking threads, you'll see some folks provide curve information along with their submissions. If you look closely, you'll notice the same card will have different a curve for each benchmark. And then nobody games using these curves. Other, lower curves are better for gaming or casual use.
> 
> Get ideas from those curves , but don't copy them directly unless you enjoy frustration and failure. Every card is different, as is every PC they run in.
> 
> To make things easier, start thinking in terms of _'Speed@Voltage@Temperature'_. "+125" really doesn't mean anything.
> 
> Good Luck.


Thanks guys, you and @Vellinious both. Just like a CPU, there's a sweet spot for voltages.

I think perhaps AMD cards have eaten my brain over the last 10 years. They didn't have a voltage-at-maximum-clock-speed curve. It was "put this voltage in the BIOS, and if the card needs it, it will adjust up until it hits this thermal limit."

My card runs at 42-47c. I notice when it hits about 38-40c it shifts from 2126 down to 2104. Around 43c it drops to 2002, and my card doesn't get over 47c.

I have to learn this thing.

I have no idea why my memory doesn't want to overclock at all. Not even 25MHz. It crashes out of benches and games.


----------



## stephenn82

OOOOH just found this...nice!

http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,2.html

GOLD MINE!


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> OOOOH just found this...nice!
> 
> http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,2.html
> 
> GOLD MINE!


Wha?!
I posted that for you 2 pages ago!


----------



## stephenn82

A third update...

After finding that I hit Ctrl+F to get the curve, then Ctrl+drag the line, I just started at 100 and saw what it can do. During Superposition, it held 2100 the whole time, peaked at 39c, and got the highest score of all my runs so far.

Voltage is at 100%, power is 130%, temp is the default 83, unlinked as well. Might go for more now, 25MHz at a time.


----------



## stephenn82

Quote:


> Originally Posted by *Beagle Box*
> 
> Wha?!
> I posted that for you 2 pages ago!


oh boy...lol

It's probably the fact that I was at work and couldn't test it...and was half asleep. Coffee doesn't wake me up any more.







Good looking out. At least your mentioning it again in a recent post led me to find it...so that still counts, right?


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> oh boy...lol
> 
> Its probably the fact that I was at work and couldnt test it...and half asleep. Coffee doesnt wake me up any more.
> 
> 
> 
> 
> 
> 
> 
> Good loooking out. at least your second wordings in recent post led me to find it...so that still counts, right?


S'all good.


----------



## stephenn82

OK, 125 on the curve netted me more points, still locked at 43c max, 2113 the whole time. Next!

Thanks all, for helping educate this AMD-infested brain. Finally, good to be back with Team Green!


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> ok, 125 in the curve netted me more points, at still locked at 43c max. 2113 whole time running. next!
> 
> Thanks all, for helping educate this AMD infested brain. Finally, good to be back with Team Green!


Glad you're making progress.

Have you researched your memory overclock issue? IIRC, EVGA experimented with different memory 'upgrades'. Some cards allow separate memory Voltage control and some have ECC memory or something that basically makes overclocking near impossible....?

I could be thinking of something else, or just imagining it, but you may want to investigate...


----------



## stephenn82

I don't have any options for that, even in XOC, so I presume I have the ECC memory...stuck at 10010MHz. Dang it!


----------



## stephenn82

Curve to 151: it didn't like it, points dropped. I think 125 is the sweet spot for this guy. Maybe I shouldn't drag the furthest curve point up, but somewhere else?


----------



## stephenn82

It holds 2125 on the core clock though.

With the 151 curve, stock memory speeds:


And the 125 curve for comparison, stock memory speeds:


----------



## stephenn82

I guess it's an OK card, and nothing scrubby just because it won't do +150 on the curve. With 100% voltage and 130% power, it hits 2125 core with Boost 3.0, and MAINTAINS it while at 42c. I think it's a win. Even if I turn it down a little and only get 2100 but more performance...it's a win, right?









Just a bummer that the mem is stuck at ONLY 10GHz...lol


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> curve to 151 didnt like it, points dropped. i htink 125 is the sweet spot for this guy. Maybe I shouldnt drag the furthest most curve point up, but somewhere else?
> 
> 
> Spoiler: Screenshot of Curve


Yeah, unless you've flashed your card with the ASUS T4 BIOS, doing anything to the curve beyond 1.093V is meaningless.

Logically, your curve should probably peak @ 1.081V or 1.093V.

Try running 2100MHz from 1.050V to 1.075V and then bumping 1.081V up to 2138MHz.

You give up way too easily to be an overclocker...


----------



## Djreversal

Hey guys,

So I got my new RAM and my two new 1080 Tis today in the mail, so I installed the water blocks and reassembled my system. I got regular FE cards and EK blocks. I downloaded the XOC BIOS and installed it, and everything is working well, but I think maybe I'm missing something. When I open up Afterburner all I have is a voltage bar; the power bar and such are greyed out, so there's nothing I can change there. And it seems I can only push these cards to 2050MHz; anything over that and I just crash consistently. My question is: is there anything I have to do to maybe release more voltage? I didn't do anything else besides the flash of the BIOS. I got +60MHz on the core, which brought me to 2050MHz, and then I did +600 on the memory, which I can't remember what that peaked at; I would have to check again. This is my 3DMark run for Fire Strike Ultra:

https://www.3dmark.com/fs/13635486


----------



## stephenn82

Quote:


> Originally Posted by *Beagle Box*
> 
> Yeah, unless you've flashed your card with the ASUS T4 BIOS, doing anything to the curve beyond 1.093V is meaningless.
> 
> Logically, your curve should probably peak @ 1.081V or 1.093V.
> 
> Try running 2100MHz from 1.050V to 1.075V and then bumping 1.081V up to 2138MHz.
> 
> You give up way too easily to be an overclocker...


Nah, I am the guy who sees things through until it's too late...I just don't feel like putting a ton of effort into it right now. Wife's birthday and all. I'll take a crack at it tomorrow though.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> it holds 2125 on the core clock though.
> 
> with 151 curve, stock memory speeds
> 
> 
> and 125 curve for comparison stock mem speeds


Temps.....are you on air?


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> Temps.....are you on air?


42c, in the screenshot. Nope, got an FTW Hybrid. It's not as good as an EK block...but better than a stock fan. Just wish I could control the fan. It doesn't need 56% at idle.

It idles at 25c.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> 42c. In the screenshot. Nope, got a ftw hybrid. Its not as good as an EK block...but better than a stock fan. Just wish i could control fan. Doesnt need 56% at idle.
> 
> It idles at 25c


Sounds like you may have just gotten really unlucky....if you're getting poorer performance at higher clocks, you've reached the point where the GPU is running too warm for that specific clock. You either need to raise voltage, or lower core temps. Temps are more important than voltage, really...so matching the voltage and frequency in the curve to the ambient temps and max core temp under load is very important.

For instance...this curve is what I use for normal ambient temps of 20c, idle core temps of 25c and load temps of 34c or lower.



This is what I use if temps are slightly higher than that, and loaded core temps go above 36c.



And finally...the curve I use when I drop the ambient temps down to around -7c or so, coolant temps at -3c, and load temps on the core at no greater than 12c.



Temps are everything. = )
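A crude way to picture what Boost 3.0 is doing with those curves: as the core crosses temperature thresholds, it walks down the voltage/frequency curve one point at a time. The curve points and thresholds below are made-up illustrations, not values from any real card:

```python
# Toy model of Boost 3.0 stepping down the V/F curve as the core warms up.
# Curve points and temperature thresholds are made-up illustrations,
# not measured values for any real card.

CURVE = [(1093, 2164), (1081, 2151), (1075, 2138), (1062, 2126)]  # (mV, MHz)
STEP_TEMPS = [37, 42, 47]  # assumed temps where Boost drops one point

def boost_point(core_temp_c: float):
    """Pick the active curve point: one step down per threshold crossed."""
    steps_down = sum(core_temp_c >= t for t in STEP_TEMPS)
    return CURVE[min(steps_down, len(CURVE) - 1)]

print(boost_point(30))   # cool core: top of the curve -> (1093, 2164)
print(boost_point(44))   # two thresholds crossed -> (1075, 2138)
```

This is why the same curve that's rock solid under water at 34c falls over on air at 50c: the card is simply running a different point than the one you tuned.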


----------



## stephenn82

So move the slider at 1093. Got it. Maybe it needs a repaste? I bought it from a good friend. He has two dogs, a mutt (lovely little beasty) and a Siberian husky. They both reside in the lower level of the house and kennel up near the PC. It may need some good cleaning inside the fins. I will put some GC Extreme on the die and see how that works as well. I'll report my findings afterwards. Off I go! Gotta finish my beer. Yes, it's 10am. Pirates drink before noon. I've spent more time at sea than 95% of people have logged in their entire Steam library. I earned that pirate title.

Thanks for the patience and for explaining how the curves work.


----------



## stephenn82

Pre tearing down to clean:

Power peaked at 83% use, slider set to 130%.
Temps held at 40c max.
Clock was set to 2164, held steady during the bench.
Voltage hit up to 1084mv.
Crashed out during scene 8. Dang it! No power limits, no thermal limits...hope this isn't a bum card from my friend. At least it's 55% more power/performance over that old R9 390.


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> 
> 
> Spoiler: Some words...
> 
> 
> 
> Pre tearing down to clean.
> Power peaked at 83% use, slider set to 130%
> Temps held highest at 40c
> clock was set to 2164, held steady during bench
> voltage hit up to 1084mv.
> Crashed out durning scene 8. Dang it! No power limits, thermal limits,
> 
> 
> 
> ...hope this isnt a bummed card from my friend.
> ...
> 
> 
> Spoiler: ...more words...
> 
> 
> 
> At least its 55% more power/performance over that old R9 390.











My God, man, what are you expecting from your card? That card is absolutely fine.
If you want more performance, you'll have to continue refining your power curves.
And if you can get it 2100MHz stable while gaming, you've won.


----------



## stephenn82

Maybe the ability to overclock the RAM at all? Yeah, 2100 on core is good. I think it's locked to 10010MHz on the RAM. I'm OK with that.


----------



## AshBorer

Posting on the 1080th page of the 1080 thread


----------



## stephenn82

oh yeah!!!


----------



## OdinValk

1080th page!


----------



## Sharchaster

I can't game at 2100 with my card due to the frustrating power limit. Hell, I can't even make it stable at 2050MHz in games...


----------



## OdinValk

Sometimes you just get unlucky in the silicon lottery. My MSI 1080 can't get above 2100MHz or else it starts crashing.


----------



## tangelo

Quote:


> Originally Posted by *Sharchaster*
> 
> The thing I HATE MOST from this card is the power limit..*.it's only 104% which is LOW for the thing like GTX 1080*. I noticed crazy down clock frequencies when I gaming at DSR 4K (even 2K) but won't affect the performance (FPS) so far.
> 
> Cons
> Power Limit is only 104% which is LOW


Preach it brother!


----------



## stephenn82

So, it looks like a run of the cards had Micron memory...a BIOS update should fix this inability to overclock the RAM. I will message EVGA later about getting a BIOS, as there are none posted on their product page. Not even for a regular 1080 FTW.

Some of you fine OCN'ers mentioned memory issues a couple pages back. Not quite ECC, but I will look into this further. Tomorrow. After shift.

__
https://www.reddit.com/r/5hvtkw/evga_1070_ftw_memory_overclocking_not_possible/


----------



## Dasboogieman

Quote:


> Originally Posted by *stephenn82*
> 
> So, It looks like a run of the cards had Micron memory...a BIOS update should fix this inability to overclock the ram. I will message EVGA later about getting a BIOS, as there are none posted on their product page. Not even for a regular 1080 FTW.
> 
> Mentioned by some of you fine OCN'ers a couple pages back about some memory issues. Not quite ECC, but will look into this further. Tomorrow. After shift.
> 
> __
> https://www.reddit.com/r/5hvtkw/evga_1070_ftw_memory_overclocking_not_possible/


I'm confused; Micron is the only manufacturer that makes GDDR5X.


----------



## stephenn82

Well, it looks like only the 1070 cards are affected.

Micron is the ONLY one who makes GDDR5X? Well, that IS odd...

Quite a few people are reporting issues with overclocking the memory on the 1080s. 10GHz is more than enough; I should stop trying to figure out the root cause of why it gets hung up. Even my old R9 390's stock RAM speed of 1525MHz could get punched up to 1725MHz at the stock voltage of 1000mv. I ran it at 950mv at 1575. 400GB/sec was enough to push any game/bench. The core, on the other hand, wouldn't go past 1175 without throwing a lot of power at it. It was on the stock cooler.

I still haven't thoroughly cleaned out the heatsinks and inspected for dust yet. Tomorrow, perhaps. Had to fix the Xbox controller for the kids. I played Forza with my son and my car was turning right on its own with the stick straight. A little TLC and 91% isopropyl cleaned it up all good.


----------



## Sharchaster

Quote:


> Originally Posted by *stephenn82*
> 
> So, It looks like a run of the cards had Micron memory...a BIOS update should fix this inability to overclock the ram. I will message EVGA later about getting a BIOS, as there are none posted on their product page. Not even for a regular 1080 FTW.
> 
> Mentioned by some of you fine OCN'ers a couple pages back about some memory issues. Not quite ECC, but will look into this further. Tomorrow. After shift.
> 
> __
> https://www.reddit.com/r/5hvtkw/evga_1070_ftw_memory_overclocking_not_possible/


My memory is Micron and I was able to OC it to +1000 max.


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> Maybe the ability to overclock ram at all? Yeah, 2100 on core is good. I think its locked to 10010 mhz on ram. IM ok with that.


You do realize that if your RAM is actually running @ 10010MHz, it's already overclocked faster than anyone else's, right?

Stock speed is ~5000MHz. I run mine just under 5600MHz.

So.... Have you tried medication?


----------



## stephenn82

Oh no!! 5MHz!! Ever so impressive.

Yes, I know the BIOS on that card is bumped a whopping 5MHz. Most vendors that bump it up do so in the BIOS, but it's usually not 5MHz. My old 390 was +60MHz core, +25 mem. But I could squeeze out an extra 115 on core and 175 on memory, no sweat.

Meds...yeah. Round, crispy, and have a little M on them.


----------



## Beagle Box

Quote:


> Originally Posted by *stephenn82*
> 
> Oh no!! 5 mhz!! Its ever impressive.
> 
> Yes. I know the bios on that cars is bumped a whopping 5mhz. Most vendors that bump it up do so in bios. But its usually not 5 mhz. My old 390 was 60mhz core 25 mem. But I could squeeze out 115 extra on core and 175 on memory no sweat.
> 
> Meds...yeah. Round, crispy, and have a little M on them


Uh...
I think we are not communicating.









I'm going by the number shown in Afterburner. You are doubling that, right?
For some reason I thought your number was a typo and you meant 6010MHz!









OK. You probably don't think it's funny...


----------



## ucode

Quote:


> Originally Posted by *Beagle Box*
> 
> You do realize that if your RAM is actually running @ 10010MHz, it's already overclocked faster than anyone else's, right?
> 
> Stock speed is ~5000MHz. I run mine just under 5600MHZ.


Actually, GDDR5X runs at quad data rate (QDR) at full speed, so it's really running at half that again: 2500MHz / 2800MHz.
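This is also why different tools disagree about the "memory clock": the same stock 1080 memory can be quoted as a real interface clock, a DDR-convention clock, or an effective transfer rate. A rough sketch of the arithmetic (the 256-bit bus is the GTX 1080's; treating the tool conventions as simple x2/x4 factors is a simplification):

```python
# Rough sketch of the three ways the same GDDR5X memory gets quoted.
# Treating the tool conventions as simple x2 / x4 factors is a
# simplification; each tool's exact convention can differ.

def gddr5x_views(real_clock_mhz: float, bus_width_bits: int = 256):
    """From the real interface clock, derive the commonly quoted numbers."""
    effective_mts = real_clock_mhz * 4            # QDR: 4 transfers per clock
    ddr_convention_mhz = effective_mts / 2        # what AB/GPU-Z style tools show
    bandwidth_gbs = effective_mts * bus_width_bits / 8 / 1000
    return effective_mts, ddr_convention_mhz, bandwidth_gbs

# A stock 1080 at ~2500MHz real clock: "10000MHz effective", shown as
# ~5000MHz in DDR-convention tools, ~320 GB/s over the 256-bit bus.
print(gddr5x_views(2500))   # -> (10000, 5000.0, 320.0)
```

So "10010MHz" and "~5005MHz" are the same overclock, just reported in different conventions.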


----------



## Beagle Box

Quote:


> Originally Posted by *ucode*
> 
> Actually GDDR5x runs at quad data rate (QDR) full speed so it's running at half that again, 2500MHz / 2800MHz


Yeah, I knew it was something like that. Every piece of software shows it differently.


----------



## Gen Patton

Yes, I have an FX-8350. This was before Ryzen was stable and I was ready to build, so I went with AMD. I didn't know when Ryzen would be stable, and the price was high.


----------



## stephenn82

Quote:


> Originally Posted by *Beagle Box*
> 
> Uh...
> I think we are not communicating.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm going by the shown number in Afterburner. You are doubling that, right?
> For some reason I thought your number was a mis-type and you meant 6010MHz!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK. You probably don't think it's funny...


Exactly. It was just at stock.

So after watching Jayz's video twice and watching the gauges like a hawk, I found my card will do +75 on core and +250 on memory; voltage is at 75%, power is 115%, and I'm NOT using the curve as of yet.

Got Heaven from 2928 to 3097 and Superposition from a max ever of 4398 to 4487. Woooo.
So don't just cowboy it up and dump the sliders to lalaland expecting epic numbers. I think I've got this down now.







Oh, and I got Valley to 4906. Didn't get a before at 1080p. Testing 1440p ultra now.

Temps top out at 42 still, but smoother runs. Next, to gaming!


----------



## stephenn82

1440p barely moved up, only by 4 points lol.

What's great is my board isn't memory-speed locked. And most of my benchmark scores at 1440p on this mildly touched 1080 spank the 1080p scores of my heavily modded 390!!!

I can dig it.

Ran The Division for a bit and it ran very well and smooth. Forgot I removed BF1. May redownload and play tomorrow.


----------



## sinholueiro

Hi everyone. I am thinking of putting my 1080 under water, probably with a Kraken G12. Is a single 120mm AIO enough to cool a heavily OCed 1080 at decent fan RPMs (~1000)?


----------



## Dasboogieman

Quote:


> Originally Posted by *sinholueiro*
> 
> Hi everyone. I am thinking in putting my 1080 under water. I was thinking in getting a Kraken G12. A single 120mm AIO is enough to cool a heavily OCed 1080 at decent fan RPMs (~1000)?


A 120mm rad with 1200-1450RPM fan speed gives you roughly the same performance as a triple slot Aorus style cooler with fans going at 100%.

A 240mm/360mm is more ideal if you want low noise.


----------



## andydabeast

Quote:


> Originally Posted by *Dasboogieman*
> 
> A 120mm rad with 1200-1450RPM fan speed gives you roughly the same performance as a triple slot Aorus style cooler with fans going at 100%.
> 
> A 240mm/360mm is more ideal if you want low noise.


Good to know. I have a 120mm AIO hanging around for when I feel like putting it on. The Windforce triple fan is good enough for now.

I have a 280mm AIO on my CPU (Ryzen, 95W). How do you think it would do if I someday modded it to include the 1080 (with a CPU block on just the core)?


----------



## Dasboogieman

Quote:


> Originally Posted by *andydabeast*
> 
> Good to know. I have a 120mm AIO hanging around for when I feel like putting it on. The Windforce triple fan is good enough for now.
> 
> I have a 280mm AIO on my CPU (ryzen 95w) how do you think it would do if I someday modded it to include the 1080 (with a cpu block on just the core)?


Just make sure they're similar metals and you will be OK. IIRC most AIOs use aluminium radiators, which are really bad news with copper blocks unless you use automotive-strength coolants.


----------



## stephenn82

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just make sure they're similar metals and you will be OK. IIRC most AIOs use Aluminium radiators which are really bad news with copper blocks unless you use Automotive strength coolants.


No, it's still bad.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> No it's still bad


Agreed. Potential mixed metals problems aside, those pumps aren't really made for "add ons".


----------



## sinholueiro

A copper block with an aluminium radiator is used in a lot of AIOs, but some of them, like Corsair, give 5 years of warranty. I will always recommend a proper custom loop, all copper or all aluminium but not mixed, but using AIOs is cost-effective for the performance. I was thinking of getting a custom loop, but comparing temps and price it is not worth it over using two AIOs to cool the CPU and GPU, so I've delayed that purchase.

Also, a 120mm radiator is used on the Fury X and the Vega 64, which consume A LOT. Are you guys sure a 120mm radiator is too limited to cool a 1080, which consumes much less?


----------



## Vellinious

Quote:


> Originally Posted by *sinholueiro*
> 
> Copper block and aluminium is used in a lot of AIOs, but some of them give 5 years of warranty, like Corsair. I always will recommend a proper custom loop, either copper or aluminium, but not mixed, but using AIOs is a performance-to-cost effective. I was thinking in getting a custom loop, but it is not worth comparing the temps and the price of using two AIOs to cool CPU and GPU. I delay that purchase.
> 
> Also, the 120mm radiator is used in the Fury X and the Vega 64, which consumes A LOT. Are you guys sure that a 120mm radiator is so limited to cool a 1080, that is consuming much less?


Can it do it? Sure. Will it keep it cool enough to boost / overclock very high? Probably not.

You're talking about two very different architectures. The Fury doesn't care what temps it runs at; it just chugs along until it's too hot, and then throttles. The Pascal architecture and Boost 3.0 make adjustments ALL the time. As temps rise, Boost 3.0 will lower either the clock or the voltage to keep the GPU performing as it should.

If you're not looking for high overclocks, and high benchmark scores, it should work just fine.


----------



## Dasboogieman

Quote:


> Originally Posted by *sinholueiro*
> 
> Copper block and aluminium is used in a lot of AIOs, but some of them give 5 years of warranty, like Corsair. I always will recommend a proper custom loop, either copper or aluminium, but not mixed, but using AIOs is a performance-to-cost effective. I was thinking in getting a custom loop, but it is not worth comparing the temps and the price of using two AIOs to cool CPU and GPU. I delay that purchase.
> 
> Also, the 120mm radiator is used in the Fury X and the Vega 64, which consumes A LOT. Are you guys sure that a 120mm radiator is so limited to cool a 1080, that is consuming much less?


Yes, it matters a lot, because most consumer watercooling parts are only made to run at 60C water temperature. The Fury AIO uses much more expensive temperature-resistant O-rings so it can sustain 60C water temperatures without damage.

So if you do end up expanding an AIO using readily available consumer fittings, rads, blocks, etc., it is a very good idea to be more generous with the cooling power.

Also, as I mentioned, the 120mm rad is equivalent to the monstrous Aorus triple-slot GTX 1080 Ti cooler with its fans going at 100% capacity. It is sufficient for a 1080, but you want to stay below that 60C operating temperature as much as possible.


----------



## pez

My TXP on a hybrid AIO (same pump as the 1080/1080Ti/TXP/TXp hybrid) maxes out somewhere around 65C in a subpar airflow situation. This is, however, with a much quieter 120mm fan than the one provided, and much quieter than an air cooler at 100%.


----------



## sinholueiro

A 240mm radiator it is, then. Let's hope I can get one cheap.


----------



## orvils

Got a really nice offer for my GTX 970 and decided to sell it and get a GTX 1080.
Went for the EVGA blower edition card.
Tried a Palit GTX 1080, but the open-air cooler just heated everything else so much that even the case was hot to the touch.
The card fits in the case with a few millimetres to spare, a really tight fit.
Changed the power target to 120% and increased the temp limit. It boosts to 1911MHz by itself and does not go over 84c. Haven't tried overclocking yet. Not that there is much need for it.


----------



## andydabeast

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes it matters a lot because most consumer watercooling parts are only made to run at 60C water temperature. The Fury AIO uses much more expensive Temperature resistant O-rings so it can sustain 60C water temperatures without damage.
> 
> So if you do end up expanding an AIO but using readily available consumer fittings, rads, blocks etc etc it is a very good idea to be more generous with the cooling power.
> 
> Also I did mention, the 120mm rad is equivalent to the monstrous Aorus Triple slot GTX 1080ti cooler with its fans going at 100% capacity. It is sufficient for a 1080 but you wanna stay below that 60C operating temperature as mauch as possible.


Quote:


> Originally Posted by *pez*
> 
> My TXP on a hybrid AIO (same pump as the 1080/1080Ti/TXP/TXp hybrid) maxes out somewhere around 65C in a subpar airflow situation. This is, however, with a much quieter 120mm fan than provided and much quieter than an air cooler at 100%.


good to know guys, thanks


----------



## ucode

Quote:


> Originally Posted by *Dasboogieman*
> 
> The Fury AIO uses much more expensive Temperature resistant O-rings so it can sustain 60C water temperatures without damage.


Just what cheap and crappy o-rings are being used that are rated at less than 100C? Much more likely to have problems if using acrylic or acetal materials.


----------



## Dasboogieman

Quote:


> Originally Posted by *ucode*
> 
> Just what cheap and crappy o-rings are being used that are rated at less than 100C? Much more likely to have problems if using acrylic or acetal materials.


Eh, that's what XSPC advised me, despite their Raystorm Pro being entirely metal. I was researching the safety of Indigo Extreme's re-flow procedure on block integrity. Something about the EPDM in their O-rings being at risk if it has to sustain 60c, though it can tolerate up to 100C for short periods of time.


----------



## ucode

Here's one of the older XSPC products.


Also, with my system I don't like the way there isn't any pressure relief when the water heats up, except for expansion of the tubing. Having said that, I haven't had any problems so far.


----------



## Vellinious

Quote:


> Originally Posted by *ucode*
> 
> Here's one of the older XSPC products
> 
> 
> Also with my system I don't like the way there isn't any pressure relief when the water heats up except for expansion of the tubing. Having said that haven't had any problems so far.


I use pressure release valves. They work beautifully and are relatively cheap.


----------



## Sycksyde

Sorry if this is a noob question, but can a regular 1080 overclock to 11Gbps on the RAM, and what clocks in Afterburner would I need to achieve it? Plus 500 on the RAM slider?


----------



## sinholueiro

Quote:


> Originally Posted by *Sycksyde*
> 
> Sorry if this is a noob question but can a regular 1080 overclock to 11gbps on the RAM and what clocks in afterburner would I need to achieve it? plus 500 on the RAM slider?


That's it


----------



## ucode

Quote:


> Originally Posted by *Sycksyde*
> 
> Sorry if this is a noob question but can a regular 1080 overclock to 11gbps on the RAM and what clocks in afterburner would I need to achieve it? plus 500 on the RAM slider?


Depends on whether you're running 3D (P0) or compute (P2). Last time I used Afterburner it had some shortcomings with P2.


----------



## THEROTHERHAMKID

Can someone point me in the right direction on how to overclock my 1080 G1?
Is the old way OK? Precision or Afterburner, using Heaven?
Or is it different for Pascal?
Yes, I'm a bit of a noob.
But I also keep seeing different ways?
Thanks


----------



## Beagle Box

Quote:


> Originally Posted by *THEROTHERHAMKID*
> 
> can someone point me in the right direction on how to overclock my 1080 g1 ?
> Is the old way ok ? Precision or afterburner using heaven ?
> Or is it different for pascal?
> Yes I'm a bit of a noob
> But I also keep seeing different ways?
> Thanks


Begin by thinking in terms of _'Speed@Voltage@Temperature'_.
Then read the following guide: How to Overclock a GTX 10xx with Afterburner.
Learn it.
Know it.
Live it.
Then come back here and read this entire thread.









Valley is good for memory speed optimization. You'll probably find other benchmarks better for other things.


----------



## Gen Patton

Just downloaded Heaven and ran a short test. Hmm, it will do for now: 3,715, 157 FPS. But my temps went up to 80, so I will try again later.


----------



## tangelo

Quote:


> Originally Posted by *OdinValk*
> 
> Sometimes you just get unlucky in the silicone lottery. My MSI 1080 cant get above 2100mhz or else it starts crashing.


A power limit capped at 104% max does not help either.


----------



## Sharchaster

Quote:


> Originally Posted by *tangelo*
> 
> Preach it brother!


Hahah, next time I will take a look at the power limit first before considering anything...this card would be so good if the power limit were like the NVIDIA BIOS (120%) or an old MSI BIOS (113%).


----------



## tangelo

Quote:


> Originally Posted by *Sharchaster*
> 
> Hahah next time, I will take a look at the power limit first before considering anything...this card is so good if the power limit is like NVIDIA BIOS (120%) or an OLD MSI BIOS (113%)


Yeah. It really wasn't mentioned *anywhere*. All the reviews and benchmarks I saw for this card had the limit way higher. It seems they nerfed it on the newer BIOSes.


----------



## ucode

It's not the percentage that counts but the actual watts.


----------



## KGB1st

How do you overclock it on Windows Server 2016?


----------



## tangelo

Quote:


> Originally Posted by *ucode*
> 
> It's not the percentage that counts but the actual Watts


Max 224.335 W, and GPU-Z shows green bars on PerfCap during gaming with the OC.


----------



## andydabeast

Quote:


> Originally Posted by *Beagle Box*
> 
> Begin by thinking in terms of _'[email protected]@Temperature'_.
> Then read the following guide: How to Overclock a GTX 10xx with afterburner.
> Learn it.
> know it.
> Live it.
> Then, come back here read this entire thread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Valley is good for memory speed optimization. You'll probably find other benchmarks better for other things.


I did a lot of testing of the memory speed using Superposition 1080p extreme (will post results soon) and as we all know higher speed does not equal better. I have a question for anyone who knows.

If I find my optimal memory offset using one benchmark, will it be completely different for another benchmark or game? I have yet to test this.


----------



## stephenn82

Quote:


> Originally Posted by *andydabeast*
> 
> I did a lot of testing of the memory speed using Superposition 1080p extreme (will post results soon) and as we all know higher speed does not equal better. I have a question for anyone who knows.
> 
> If I find my optimal memory offset using one benchmark, will it be completely different for another benchmark or game? I have yet to test this.


Yes. Valley LOVES memory speed. Superposition couldn't care less. Heaven is a slight change.

Stock superposition run


OC'd memory only, bumped it up 450MHz (raised the score by about 60-some-odd points)


then ran my curve OC and +450MHz on memory


Now I'm looking to push +500 on the VRAM and see if it works.


----------



## stephenn82

OK, points dropped a little, so I went back to the 2126/+450 profile. Oh, I figured out how to really work the curves in AB now. My card isn't broken and my VRAM clocks just fine.


----------



## sinholueiro

Does anyone know if the MSI 1080 custom PCB is compatible with the Kraken G12? I have the 1080 Armor, but I think it's the same PCB as the GAMING X and Z ones.


----------



## andydabeast

Quote:


> Originally Posted by *stephenn82*
> 
> yes. Valley LOVES memory speed. Superposition could care less. Heaven is a slight change.
> 
> Stock superposition run
> 
> 
> oc'd memory only, bumped it up 450mhz (raised score about 60 some odd points)
> 
> 
> then ran my curve of and 450mhz on memory
> 
> 
> Now, im looking to push 500 on vram. see if it works.


+575 is my optimal mem speed for Superposition. I imagine every card is different. I am making a graph to post my results. I have a lot of studying to do, and benching is easy to do while studying, so tomorrow I may run Valley 25 times at different speeds and compare it to Superposition. Superposition ran stable from +0 to +1000.


----------



## Vellinious

Quote:


> Originally Posted by *tangelo*
> 
> Max 224.335W and gpuz shows green bars on perfcap during gaming with OC .


What he means is that the % is relative to what's set in the VBIOS. A card with a 200 watt base power limit set in the BIOS, and a slider capable of reaching 110%, will allow for 220 watts of power draw before the GPU throttles due to power limits. Likewise, a GPU that has a 300 watt base power limit, with the slider at 100%, wouldn't power-limit throttle while pulling 224 watts. In fact, it'd have 76 watts to spare.

Most of the NVIDIA reference boards over the past couple of generations have had artificially low power limits, which limit their overclocking capability to some extent, while many of the custom boards have increased power limits, opening up their potential somewhat, without the need to flash to another bios in order to work around it.

So...really, the % that the slider says is arbitrary.
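To put rough numbers on that (the 200 W and 300 W bases are just the example figures from this post, not any specific card's VBIOS values), the arithmetic is simply:

```python
# Effective power cap in watts: the slider % is relative to the base
# power limit set in the VBIOS, so the same slider reading means
# different watts on different boards.
def power_cap_watts(base_limit_w: float, slider_percent: float) -> float:
    return base_limit_w * slider_percent / 100.0

print(power_cap_watts(200, 110))  # 220.0 -> a 224 W load would throttle
print(power_cap_watts(300, 100))  # 300.0 -> 76 W of headroom at 224 W
```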


----------



## Beagle Box

Quote:


> Originally Posted by *andydabeast*
> 
> I did a lot of testing of the memory speed using Superposition 1080p extreme (will post results soon) and as we all know higher speed does not equal better. I have a question for anyone who knows.
> 
> If I find my optimal memory offset using one benchmark, will it be completely different for another benchmark or game? I have yet to test this.


I used my most efficient memory speed in Valley for my best score in Superposition @ 1080pX, 4K and 8K.


----------



## tangelo

Quote:


> Originally Posted by *Vellinious*
> 
> Most of the NVIDIA reference boards over the past couple of generations have had artificially low power limits, which limit their overclocking capability to some extent, while many of the custom boards have increased power limits, opening up their potential somewhat, without the need to flash to another bios in order to work around it.
> 
> So...really, the % that the slider says is arbitrary.


I understand, and that makes sense. But the thing I've been asking, which no one has been able to explain, is why the exact same model from the same manufacturer has different BIOSes and different power limits. I don't understand why. Why did the guy who reviewed the card on Guru3D have 114% in his BIOS while people who have bought the same model MSI Gaming X+ card have 104%?

Every time I ask this, people reply that DIFFERENT cards have different power limits and then start talking about the reference card, board partners and DIFFERENT manufacturers. I am talking about cards from the same manufacturer, same model, with exactly the same SKU.


----------



## Dasboogieman

Quote:


> Originally Posted by *tangelo*
> 
> I understand and that makes sense. But the thing what I've been asking and no one has been able to explain is why does the exact same models from the same manufacturer have different BIOSes and different power limits? I don't understand why. Why does the guy who reviewed the card on GURU3D had 114% in his bios and people who have bought the same model MSI Gaming X+ card have 104%?
> 
> Every time I ask this people reply that DIFFERENT cards have different powerlimits and then start to talk about the reference card, boardpartners and DIFFERENT manufacturers. I am talking about cards from same manufacturers same model with exactly the same SKU.


Because there is variation in the parts that make up an SKU, before even taking the silicon lottery for the core into account. The NVIDIA power scheme takes the entire board's consumption into account: all the fans, LEDs, microcontrollers, capacitors, etc.

On large-TDP cards like the 1080 Ti, a variance of 10W or so is not a biggie in the overall scheme of things, but on lower-TDP cards like the 1080, with 200-250W-type TDPs, suddenly a variance of 10W matters a lot more.

I gained about 10W of headroom on my 1080ti when I watercooled (the cold temps gave me another 20W) just from disconnecting all my fans and LEDs.


----------



## tangelo

Quote:


> Originally Posted by *Dasboogieman*
> 
> Because there is a variation in the parts that comprise an SKU. This is before even taking silicon lottery in to account for the core. The NVIDIA Power scheme takes the entire board consumption in to account. That includes all Fans, LEDs, microcontrollers, Capacitors etc etc.


So basically, MSI tests every card at the factory and assigns different BIOSes on a card-by-card basis, even when they are all the same model?


----------



## Dasboogieman

Quote:


> Originally Posted by *tangelo*
> 
> So basicly, MSI tests every card on the factory and assign different bioses on card by card basis, even when they are all the same model?


Could be a revision within the SKU if the BIOS values are different, I was talking about variances within the same BIOS, same manufacturer, same SKU.


----------



## tangelo

Quote:


> Originally Posted by *Dasboogieman*
> 
> Could be a revision within the SKU if the BIOS values are different, I was talking about variances within the same BIOS, same manufacturer, same SKU.


OK. I was talking about cards with the same model / device ID but different BIOSes. Could there still be differences between devices sharing an ID?

My card's device ID gives this list of BIOSes, and they have different power limits, getting lower and lower the newer the BIOS is.

https://www.techpowerup.com/vgabios/?did=10DE-1B80-1462-3362


----------



## Dasboogieman

Quote:


> Originally Posted by *tangelo*
> 
> Ok. I was talking about cards with same model / Device ID but different bioses. Could there still be differences between devices sharing a ID?
> 
> My card's device id gives this list of bioses and they have different power limits, getting lower and lower depending on how new the bios is.
> 
> https://www.techpowerup.com/vgabios/?did=10DE-1B80-1462-3362


yeah those are all revisions.


----------



## tangelo

Quote:


> Originally Posted by *Dasboogieman*
> 
> yeah those are all revisions.


Thanks man. It all makes sense now. I was under the impression that they were all for the exact same cards.


----------



## Dasboogieman

Quote:


> Originally Posted by *tangelo*
> 
> Thanks man. It all makes sense now. I was under the impression that they were all for the exact same cards.


They probably changed a few components on the board since it has been out for so long. No idea why they would reduce the TDP though.


----------



## ucode

Quote:


> Originally Posted by *tangelo*
> 
> My card's device id gives this list of bioses and they have different power limits, getting lower and lower depending on how new the bios is.
> 
> https://www.techpowerup.com/vgabios/?did=10DE-1B80-1462-3362


All the X Plus BIOSes in that link have the same power limits: 220W default and 250W max.

Here's a review that actually uses Watts
Quote:


> Originally Posted by *https://videocardz.com/review/msi-geforce-gtx-1080-gaming-x-plus-review/2*
> 
> *Power limit*
> 
> This sample has 220W power limit with a maximum at 250W.
> 
> Power Draw : 16.37 W
> Power Limit : 248.60 W
> Default Power Limit : 220.00 W
> Enforced Power Limit : 248.60 W
> Min Power Limit : 90.00 W
> Max Power Limit : 250.00 W


The actual max percentage in this case would be 113.6%, and that review was probably using AB, so they were missing out on that extra 0.6% :/

What BIOS are you using? The one in the review is 86.04.66.00.2c.

Run nvidia-smi from the command line to read the power limits.

Read power limits: "nvidia-smi.exe -q -d power"
Set a 250W limit: "nvidia-smi.exe -pl 250"

Do the limits change depending which mode is used, silent mode, gaming mode or OC mode?


----------



## tangelo

Quote:


> Originally Posted by *ucode*
> 
> All the X plus BIOS in that link have the same power limits, 220W default and 250W max.


Look again. There are BIOSes with a 220W max and a 291W max.
Quote:


> What BIOS are you using? The one in the review is 86.04.66.00.2c.


My bios is 86.04.66.00.52
This one here: https://www.techpowerup.com/vgabios/193984/193984
Quote:


> Run the nvsmi command line to read power limits.
> 
> Do the limits change depending which mode is used, silent mode, gaming mode or OC mode?


Power Readings
Power Management : Supported
Power Draw : 114.86 W
Power Limit : 195.30 W
Default Power Limit : 210.00 W
Enforced Power Limit : 195.30 W
Min Power Limit : 90.00 W
Max Power Limit : 220.00 W

EDIT: The only power limit that changes when switching between OC/Gaming/Silent is the "Enforced Power Limit", which maxes out at 210W in OC and Gaming.

When trying to change the power limit with nvidia-smi I get:

"Provided power limit 250.00 W is not a valid power limit which should be between 90.00 W and 220.00 W for GPU 00000000:01:00.0"


----------



## Paztak

Yey!

Just got my 1080 and I have to say, after the quick test I managed to do yesterday, this card is a beast!
This card will hold up for quite a long time at 1440p 60Hz; there's no doubt about that.

Maybe this is a BF1-related issue, but I noticed that at ultra settings + DX12, when you are playing as a sniper and you switch your view into and out of the scope, the transition gives some lag and an fps drop. Anyone else noticed that?


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Paztak*
> 
> Yey!
> 
> Just got mine 1080 and I have to say, after quick test what I managed to do yesterday, that this card is a beast!
> This card will hold up quite long time at 1440p 60Hz, there's no doubt about that.
> 
> Is this BF1 related issue, but I noticed that at ultra settings + DX12 when you are playing as an sniper and you change you view to scope and out that transition will give some lag and fps drop. Anyone else noticed that?


I've played a lot of BF1 and can't say I have. What are the rest of your specs?


----------



## ucode

This is what I see with your link.



Only 3 X-plus boards and all with the same power limits.

You could contact MSI and ask them why yours is set lower, and whether it would be okay to flash an older VBIOS. Note also that your default is 210W, so your maximum of 104.76% is equivalent to only 100% on those other X Plus boards. Maybe those others were "special" VBIOSes for reviewers.
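For what it's worth, those percentages fall straight out of the watt figures quoted above (a quick sketch; the inputs are just the default/max limits mentioned in this thread):

```python
# The maximum slider percentage is max power limit / default power
# limit, which is why the same watt cap reads as a different % on
# boards with different defaults.
def max_slider_percent(default_w: float, max_w: float) -> float:
    return max_w / default_w * 100.0

print(round(max_slider_percent(220, 250), 1))   # 113.6 -> the review board
print(round(max_slider_percent(210, 220), 2))   # 104.76 -> the 210 W-default card
```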


----------



## Sharchaster

Quote:


> Originally Posted by *Paztak*
> 
> Yey!
> 
> Just got mine 1080 and I have to say, after quick test what I managed to do yesterday, that this card is a beast!
> This card will hold up quite long time at 1440p 60Hz, there's no doubt about that.
> 
> Is this BF1 related issue, but I noticed that at ultra settings + DX12 when you are playing as an sniper and you change you view to scope and out that transition will give some lag and fps drop. Anyone else noticed that?


1440p 60 fps? That's overkill for your 1080. Try 3K resolution (in BF1 you can push the resolution scale slider to maybe 133%), more or less... because 150% gives you 4K resolution.
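The slider math behind that, roughly (BF1's resolution scale is applied per axis; the 2560x1440 base is assumed from the post above):

```python
# BF1's resolution scale multiplies each axis, so 150% of 2560x1440
# renders 3840x2160 (4K) internally, and 133% lands around "3K".
def scaled_resolution(width: int, height: int, scale_percent: float):
    s = scale_percent / 100.0
    return round(width * s), round(height * s)

print(scaled_resolution(2560, 1440, 150))  # (3840, 2160)
print(scaled_resolution(2560, 1440, 133))  # (3405, 1915)
```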


----------



## Paztak

Quote:


> Originally Posted by *6u4rdi4n*
> 
> I've played a lot of BF1 and can't say I have. What are the rest of your specs?


i5 4690k @ 4,400 MHz
16gb DDR3 2400MHz
Kingston SSDNow V300 480 GB SSD

It might be the CPU causing it, but I've been limiting my FPS to 65.9975, which should help with weird FPS drops, and I didn't have this issue with my GTX 970. It doesn't happen all the time, but it's something I noticed in my quick test.

Quote:


> Originally Posted by *Sharchaster*
> 
> 1440p 60 fps? that's overkill for your 1080, try 3k resolution (on BF1 you can up resolution scale slider to maybe 133%) more or less....because 150% is giving you 4K Resolution


Yes, I was thinking I'd use a higher resolution in games the GTX 1080 handles easily, but then again, I don't mind if GPU usage isn't 99% all the time.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *Paztak*
> 
> i5 4690k @ 4,400 MHz
> 16gb DDR3 2400MHz
> Kingston SSDNow V300 480 GB SSD
> 
> It might be that CPU is causing it, but I've been limiting my FPS to 65.9975 so that should help with weird FPS drops and I haven't had this issue with my GTX 970. It's not that it happens all the time, but something what I noticed with my quick test.
> Yes, I was thinking that I use higher resolution with games which GTX 1080 handles easily, but then again, I don't mind if GPU usage is not 99% all the time.


Hmm. Could be the CPU; BF1 really likes CPU power. But on the other hand, I don't think so. I think you've got to do more than just a quick test: play a couple of hours and see how often it happens. Have Afterburner or something running in the background so you can check whether anything looks suspect when/if it happens.

Did you use Display Driver Uninstaller or something when you changed graphics cards?


----------



## Gdourado

Hello,
I have a small request.
I am trying to decide if it is worth upgrading my overclocked 980 Ti to a Gigabyte 1080 G1 Gaming.
I know the G1 clocks around 2000 to 2050.
So my request: if someone has a 1080 clocked around 2000, could you please run some benchmarks for me to compare against my Ti?
These are my Ti benches and the settings.
My CPU is a 3770K at 4.5.

















Thanks!
Cheers!


----------



## Paztak

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Hmm. Could be CPU. BF1 really likes CPU power. But on the other hand, I don't think so. Think you gotta do more than just a quick test. Play a couple of hours, see how much it happens. Have afterburner or something run in the background so you can check to see if anything looks a bit suspect when/if it happens.
> 
> Did you use display driver uninstaller or something when you changed graphics card?


Yep, I need to monitor this more, test another server, etc. There was CPU load and GPU load on my screen, but I didn't notice any strange behavior there at the time; I didn't check the AB logs afterwards though. But it's good to know that this isn't a known issue, so I can troubleshoot it more.

I did a clean installation via the NVIDIA driver installer, but maybe I need to do it again and use a driver uninstaller. Is there a specific uninstaller that's recommended?


----------



## Sharchaster

Quote:


> Originally Posted by *Paztak*
> 
> Yep, I need to monitor this more and test another server etc. There was CPU load and GPU load on my screen, but didn't notice anything strange behavior there at the very moment, didn't check AB logs afterwards though. But it's good to know that this isn't "known issue", so I can troubleshoot it more.
> 
> I did clean installation via nvidia driver installation, but maybe I need to do it again and use driver uninstaller. *Is there some specific uninstaller what is recommend to use?*


I just use IObit Uninstaller... it's very good IMO. It removes files deep in your system (NVIDIA registry entries, etc.).


----------



## AngryGoldfish

Anyone know what the differences between the Aorus 1080 models are? There's so many...

AORUS GTX 1080 8G 11Gbps (rev. 1.0)
AORUS GeForce® GTX 1080 8G (rev. 2.0)
AORUS GTX 1080 Xtreme Edition 8G 11Gbps
AORUS GTX 1080 Xtreme Edition 8G
GTX 1080 Xtreme Gaming 8G (rev. 2.0)

I know the Xtreme Gaming and Xtreme Edition models are 3 slots while the non-Xtreme models are 2.5 slots. Is that correct?

Aria PC in the UK has this one in stock that I'm considering buying:

Gigabyte AORUS NVIDIA GeForce GTX 1080 8GB 11Gbps

Which according to their product code and Gigabyte's website is this one:

AORUS GTX 1080 8G 11Gbps (rev. 1.0)

What's the difference between revision 1 and revision 2? Revision 2 suggests revision 1 needed tweaking to improve.


----------



## tangelo

Quote:


> Originally Posted by *ucode*
> 
> This is what I see with your link.
> Only 3 X-plus boards and all with the same power limits.
> 
> You could contact MSI and ask them why yours is set lower and if it would be okay to flash an older VBIOS. Note also your default is 210W so your maximum 104.76% is equivalent to only 100% on those other X-plus boards. Maybe those others were "special" VBIOS for reviewers.


You are correct, that was a mistake on my part. Still strange that all the other BIOSes have a 250W limit and my card has 220W.


----------



## ucode

@tangelo Don't know if you are aware of earlier MSI endeavors

https://www.eteknix.com/msi-asus-use-oc-bios-default-reviewers/

https://www.bit-tech.net/news/tech/graphics/msi-defends-bios-hack/1/

It might be worth your trouble to contact them about it or make some "noises"


----------



## stephenn82

Quote:


> Originally Posted by *ucode*
> 
> Actually GDDR5x runs at quad data rate (QDR) full speed so it's running at half that again, 2500MHz / 2800MHz


It's actually 8 times...

My memory clock is 1364, which works out to 10912.

I finally got my card to OC the memory. I may have said that already, but a few pages have gone by and I'm kinda slow today, no coffee
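In other words (the ×8 is the GDDR5X effective-rate multiplier relative to the clock GPU-Z reports, versus ×4 for plain GDDR5; the 1251 MHz figure is assumed as the usual GPU-Z reading for a stock 10 Gbps 1080):

```python
# Effective memory transfer rate from the reported memory clock.
def effective_rate(reported_clock_mhz: float, multiplier: int = 8) -> float:
    """Return the effective rate in MT/s (often quoted as 'MHz')."""
    return reported_clock_mhz * multiplier

print(effective_rate(1364))  # 10912.0 -> the 10912 figure above
print(effective_rate(1251))  # 10008.0 -> roughly stock 10 Gbps GDDR5X
```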


----------



## Paztak

Man, I hate this new boost clock system!! Why the bucket does the card need to boost to the limits all the freaking time? Just stay at some clocks and don't try to boost just to hit the power or voltage limit.









The power limit, voltage limit and temp limit are cranked up, and still the core clock changes all the time due to the power or voltage limits. Just pick your clock and stay there!! Is there any way to stop this madness?


----------



## Vellinious

Quote:


> Originally Posted by *Paztak*
> 
> Man I hate this new boost clock system!! Why the bucket the card needs to boost to the limits all the freaking time. Just stay some clocks and don't try to boost just to hit power or voltage limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power limit, voltage limit and temp limit is cranked up and still core clock is changing all the time due the power or voltage limits. Just pick you clock and stay there!! Is there any way to stop this madness?


Use a voltage / frequency curve. Pick a good clock and voltage you want it to run at, and flatten the rest of the curve. Then run a custom fan curve to ensure it stays cool enough, and you should be good to go.


----------



## demitrisln

Hey, was wondering if anybody has done this or knows how: taking the EK water block off an MSI 1080 Sea Hawk EK and putting an air-cooled stock cooler on.

I had the 1080 Sea Hawk for about a year and am getting rid of water cooling. Looking to switch it over to air cooling if possible.

Let me know; I would really appreciate it.


----------



## stephenn82

Quote:


> Originally Posted by *Paztak*
> 
> Man I hate this new boost clock system!! Why the bucket the card needs to boost to the limits all the freaking time. Just stay some clocks and don't try to boost just to hit power or voltage limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power limit, voltage limit and temp limit is cranked up and still core clock is changing all the time due the power or voltage limits. Just pick you clock and stay there!! Is there any way to stop this madness?


Yes... the curve really helps, a lot. Just make sure whatever your 1093, 1081, 1063 and so on settings are, they're set where you want, then flatten out the rest of the curve to the right to match the max you want. Then it won't hunt everywhere, and it will stop giving you the limit caps.


----------



## Paztak

Thanks!

Tweaking that curve helped; I didn't realize how it operated. Now sitting at a constant 1860MHz and I can play in peace.


----------



## Gen Patton

Guys, how do you take a screenshot in Heaven?


----------



## ucode

IIRC the default is F12, and screenshots are saved under C:\users\YourUserName\Heaven\screenshots, where YourUserName is whatever your Windows username is.


----------



## thesebastian

Is there still no way to increase the power limit on the Gigabyte G1 1080? Some magic BIOS or something like that? I'm not pushing the card too far, because if I do I get the Power PerfCap all the time at higher clocks.


----------



## HAL900

Undervolt, or try this:
https://www.dropbox.com/s/tswi8sppijtwy24/230w.rom?dl=0
230W


----------



## Sharchaster

It seems like my MSI Gaming X 1080+ hated the BIOS flash (I always got an error message in cmd)....


----------



## HAL900

Is this the 11Gbps VRAM version?


----------



## Sharchaster

yep....


----------



## HAL900

try this https://www.dropbox.com/s/39sf6pdgtl9y2vt/max%20power%20limit.rom?dl=0


----------



## Dasboogieman

Quote:


> Originally Posted by *thesebastian*
> 
> Is there still no way to increase the power limit in the Gigabyte G1 1080? Some magic bios or something like that? I'm not pushing the card too far because If I do I get Power perfcap all the time at higher clocks


Disconnect your LED connector; no joke, that opens up about 5W of TDP headroom.


----------



## outofmyheadyo

Quote:


> Originally Posted by *demitrisln*
> 
> Hey was wondering if anybody has or knows how to switch a MSI 1080 Sea Hawk EK water block and put on a air cooled stock cooler?
> 
> I had the 1080 Sea Hawk for about a year and getting rid of water cooling. Looking to switch it over to air cooling if possible.
> 
> Let me know I would really appreciate it.


Don't quote me on that, but the PCB should be the same as the one they use for the Gaming X line, so the 1080 Gaming X cooler should fit. You can also check out the Prolimatech MK-26 and Raijintek Morpheus II; these should fit too.


----------



## outofmyheadyo

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> I have a small request.
> I am trying to decide if it is worth it to upgrade my Overclocked 980ti to a Gigabyte 1080 G1 Gaming.
> I know the G1 clocks around 2000 to 2050.
> So I have a request if someone that has a 1080 clocked around 2000 can please run some benchmarks for me to compare to my Ti.
> These are my TI benches and the settings.
> My CPU is a 3770k at 4.5.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!
> Cheers!


Don't have Valley/Heaven or Metro installed right now, but I have 3DMark and Superposition.

Ryzen [email protected]
2*8GB 3200 CL14
Stock GTX 1080

https://www.3dmark.com/3dm/22491138
https://www.3dmark.com/spy/2482004
https://www.3dmark.com/fs/13777638
https://www.3dmark.com/fs/13777618




Doesn't really warrant upgrading, I guess; if you want a real increase in performance you've got to go for the 1080 Ti.
Here are some results I got on my 1080 Ti; it was overclocked though, while the 1080 results were stock.



https://www.3dmark.com/fs/12361554
https://www.3dmark.com/spy/1872117
https://www.3dmark.com/fs/12361521
https://www.3dmark.com/fs/12361378

Hope this helps


----------



## Gdourado

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Dont have valley/heaven or metro installed right now but have 3dmark and superposition.
> 
> Ryzen [email protected]
> 2*8GB 3200 CL14
> Stock GTX 1080
> 
> https://www.3dmark.com/3dm/22491138
> https://www.3dmark.com/spy/2482004
> https://www.3dmark.com/fs/13777638
> https://www.3dmark.com/fs/13777618
> 
> 
> 
> 
> Doesnt really warrant upgrading I quess, if you want real increase in performance you got to go for the 1080ti
> Here are some results I got on my 1080ti, it was overclocked tho, and 1080 results were stock.
> 
> 
> 
> https://www.3dmark.com/fs/12361554
> https://www.3dmark.com/spy/1872117
> https://www.3dmark.com/fs/12361521
> https://www.3dmark.com/fs/12361378
> 
> Hope this helps


Thanks for letting me know.
Anyway, I went ahead and did the upgrade.
Managed to sell my 980TI for a great price and got the 1080 at also a great price, so the upgrade was really cheap.

Anyway, benchmark scores didn't go up that much, but in gaming I am really happy!
For example, Far Cry Primal was 88 fps average on the 980TI and now it's 106.
Metro Last Light Redux was 76 fps average and now is 94.

I am happy. Newer architecture, warranty was reset and performance increased in gaming!

Cheers!


----------



## Gdourado

Also, my new card.
I am now in the club


----------



## outofmyheadyo

That's nice, enjoy. I know one reason I got rid of my 980 Ti was that it ran really hot; I believe the 1080 consumes a lot less power.


----------



## Gen Patton

Thanks guys


----------



## Gdourado

Quote:


> Originally Posted by *outofmyheadyo*
> 
> That`s nice enjoy, I know a reason I got rid of my 980ti was because it was really hot, I belive 1080 consumes alot less power.


My 980 Ti was also a Gigabyte G1 model.
It had dual 8-pin power connectors.
The 1080 only has one.

But the 980 Ti had a great cooler.
The temps under benchmarking are the same on both cards.
They both top out at 75°C with the stock fan curves.
Guess the 980 had a more aggressive fan curve out of the box.

Old one vs new one:




Cheers!


----------



## andydabeast

My testing is complete for now! 20+ runs of Superposition, Valley, and Tomb Raider 2013 EACH. Maybe I'll do Heaven next.

http://www.overclock.net/t/1639421/gtx-1080-complete-memory-testing/0_100


----------



## Pafrarca

Hi guys, one question: is it possible to flash an MSI Gaming X+ BIOS onto a normal Gaming (GTX 1080)? Thanks....


----------



## AngryGoldfish

New 1080 owner here.

Out of the box the card boosted to 1976MHz and then dropped down to 1963MHz and stayed there. I didn't touch anything in Afterburner. Temperatures never went above 60°C; they usually hovered around 58°C, all with noise levels below that of my CPU cooler, something I didn't experience even with my GTX 970. My previous card (ASUS Fury Strix) stayed around 70°C while being noticeably louder. It wasn't too loud, but I definitely notice how much quieter the 1080 is. I'm glad I went with the 1080 over Vega 64: for a silent air-cooled build, Vega just isn't cut out for it.


----------



## moRReus

Another new owner joining the club!

And I unfortunately started that ownership off not in the best of ways.... Got a good deal on an FE card, ran it as-is for a while to check it out, proceeded to swap the stock cooler out for an XSPC block, and managed to "skillfully" knock off one of the small capacitors on the back of the PCB for a memory module.







Wasn't all too thrilled as you can imagine. I think I just stared intensely at it for a minute lol. In fact, I probably should have kept staring, I'm pretty sure it would have eventually reflowed the solder.







Fortunately, a good friend of mine is going to resolder it for me this weekend. I've never damaged a component before, but I guess there's a first for everything. Hopefully things go a little smoother for the install!


----------



## stephenn82

Welcome to the club fellas. Enjoy the clocks and temps. And the quiet


----------



## Hequaqua

New owner here! Got the card yesterday.

Gigabyte GTX 1080 G1 Gaming

Loving it so far.

I have a habit of keeping base clock scores on different drivers. I did my first set today. This will be the 4th card I've done this with. I've done the GTX970/GTX1060/RX480 and this one.

Here is the first set. I think the spreadsheet has the settings that were used. Left is stock/Right is OC.

*https://docs.google.com/spreadsheets/d/1FWeyU64skaTzXlKRavTimbXQFRjPWOzve8Wc2O3EbHM/edit?usp=sharing*

I will try and go back and add older drivers when I get time. Those will be stock only. I do them that way to try to see if the drivers show any improvement.

The OC settings are just a base. I haven't tried any suicide runs to really see what more I can get. I did both sets of benchmarks without a single crash.

Validation Link: https://www.techpowerup.com/gpuz/details/nbyuu


----------



## Gdourado

I have started to mess with mine.
My memory can go to 5500 without crashes or artifacts, but from a few benchmarks it seems I get better results with the memory at 5300 vs 5500.
Am I doing something wrong here?

Cheers


----------



## bajer29

I'm on the fence about selling my i5 4670K and SLI 980s to buy an EVGA GTX 1080 FTW2. I've been relatively happy with how my 980s have performed in most games since I upgraded to a 4790K (no stutter, average FPS in GPU-intensive games between 60 and 120).

Would going single 1080 FTW2 smooth out my gaming in BF1 at 1440P ~144Hz?

Currently my setup gets me about 60-100 FPS in BF1 at 1440P with DX12 and settings on High on the most demanding MP maps.

Opinions?


----------



## coreykill99

Quote:


> Originally Posted by *bajer29*
> 
> I'm on the fence of selling my i5 4670K and SLI 980s to buy a EVGA GTX 1080 FTW2. I've been relatively happy with how my 980s perform in most games since I upgraded to a 4790K (no stutter, average FPS in GPU intensive games between 60FPS and 120FPS).
> 
> Would going single 1080 FTW2 smooth out my gaming in BF1 at 1440P ~144Hz?
> 
> Currently my setup gets me about 60-100 FPS in BF1 at 1440P with DX12 and settings on High on the most demanding MP maps.
> 
> Opinions?


Not quite apples to apples, but my 1080 gets me between 90-144 FPS @ 2560x1080 on DX11 highest detail settings, which is 75% of your pixel count. Taking 25% off my numbers gives 68-108 FPS. I'm sure my quick and dirty math isn't a clean conversion, but it looks to me like it wouldn't net you much benefit. If someone with your actual resolution and a 1080 chimes in, obviously take their opinion over mine.

As far as smoothing goes, I'm sure there might be a difference, as one card should perform more consistently than two just about every time. But is smoothing alone enough to justify the price?
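
The quick math above can be sketched in a few lines. This assumes FPS scales inversely with pixel count, which is only a rough GPU-bound approximation; real games rarely scale perfectly linearly.

```python
# Rough FPS scaling by pixel count: assumes a GPU-bound game whose frame
# rate scales inversely with resolution (a simplification).

def scale_fps(fps, src_res, dst_res):
    """Estimate FPS at dst_res from a measurement taken at src_res."""
    src_pixels = src_res[0] * src_res[1]
    dst_pixels = dst_res[0] * dst_res[1]
    return fps * src_pixels / dst_pixels

# 2560x1080 has 75% of the pixels of 2560x1440, so the estimate drops by 25%:
low = scale_fps(90, (2560, 1080), (2560, 1440))
high = scale_fps(144, (2560, 1080), (2560, 1440))
print(round(low), round(high))  # 68 108
```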


----------



## Hequaqua

I can agree with the above.

Another thing to take into consideration is power usage. Maxwell cards aren't as efficient as Pascal, and they also run a bit warmer than most AIB 1080s. One card is usually better than two, especially if you play a lot of different games (no SLI profiles, micro-stuttering, etc.).

I would hang on and see what the 1070 Ti offers. Perhaps prices will go down on the 1080s as well. It may even outperform the 1080. Who knows?

I just got my 1080 and haven't really played many games yet. I can say, personally, DX12 is horrible. It does OK in games that are optimized for it (which I believe BF1 is). In others it performs worse than DX11; it depends on the developers, and honestly, some are slow or lazy about optimizing.

Just my two cents.


----------



## bajer29

Quote:


> Originally Posted by *coreykill99*
> 
> Not quite apples to apples, but my 1080 gets me between 90-144 FPS @ 2560x1080 on DX11 highest detail settings, which is 75% of your pixel count. Taking 25% off my numbers gives 68-108 FPS. I'm sure my quick and dirty math isn't a clean conversion, but it looks to me like it wouldn't net you much benefit. If someone with your actual resolution and a 1080 chimes in, obviously take their opinion over mine.
> 
> As far as smoothing goes, I'm sure there might be a difference, as one card should perform more consistently than two just about every time. But is smoothing alone enough to justify the price?


Quote:


> Originally Posted by *Hequaqua*
> 
> I can agree with the above.
> 
> Another thing to take into consideration is power usage. Maxwell cards aren't as efficient as Pascal, and they also run a bit warmer than most AIB 1080s. One card is usually better than two, especially if you play a lot of different games (no SLI profiles, micro-stuttering, etc.).
> 
> I would hang on and see what the 1070 Ti offers. Perhaps prices will go down on the 1080s as well. It may even outperform the 1080. Who knows?
> 
> I just got my 1080 and haven't really played many games yet. I can say, personally, DX12 is horrible. It does OK in games that are optimized for it (which I believe BF1 is). In others it performs worse than DX11; it depends on the developers, and honestly, some are slow or lazy about optimizing.
> 
> Just my two cents.


Thanks, my dudes. I was planning on waiting, but didn't want to wait so long for the 980s/ 4670K to lose their value with all of the new GPU and CPU news.

I'll wait until the end of October/ beginning of November to make my final decision. A 1070ti is definitely an option that is on the table for me.


----------



## Hequaqua

It's supposed to release around the 20th or so I think.

I went ahead with the 1080 thinking the 1070 Ti will probably be hard to come by, and prices might be higher due to demand.

I had just sold a RX480, so that offset the cost for my 1080.









The 4670k should be fine for quite a while really. There aren't that many games that use more cores yet. BF1 will take advantage of them though. I noticed that going from my i7-4770k to the Ryzen 5 1600. It was loading up my i7 pretty well most of the time(75-80% maybe). I haven't been gaming much, but if my memory is correct, with the R5, it may hit 50-60% on a few cores from time to time.

I7-4770 via Youtube:




----------



## bajer29

Yeah I currently have a 4790K. It gets loaded around 50% on most cores, while loading the map it will jump up a bit to around 70%.

I believe the 1070 Ti is actually being released on Oct. 26th.


----------



## tangelo

Quote:


> Originally Posted by *Pafrarca*
> 
> Hi guys, one question: is it possible to flash an MSI Gaming X+ BIOS onto a normal Gaming (GTX 1080)? Thanks.


X+ uses different memory than non-X versions, so I would not try it.


----------



## stephenn82

Just clock it to X+ speeds: add +500 on the memory in AB and see if it can do it, or at least +450.


----------



## stephenn82

A simple question for you guys. I recently started getting into folding on this setup and noticed something insane. I know it depends on what work unit you are offered, whether you are keyed or not, etc.

I was unkeyed and was getting 125k PPD on the GPU. I got a passkey and restarted after putting in the info, and now I am rolling at 875k PPD on the GPU alone, about 950k for the total system. Is this insane?

Updated to reflect the points I am seeing now.
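
For anyone wondering why the passkey matters that much: it enables Folding@home's quick-return bonus, which multiplies base points by a factor that grows the faster you return the work unit. Here's a sketch of the commonly cited formula; the constant k and the deadline are per-project values, and the numbers below are purely illustrative.

```python
import math

# Folding@home quick-return bonus (QRB) sketch. base_points, k, and the
# deadline come from each work unit's project settings; the values used
# below are illustrative, not real project data.

def qrb_points(base_points, k, deadline_days, elapsed_days):
    bonus = math.sqrt(k * deadline_days / elapsed_days)
    return base_points * max(1.0, bonus)  # bonus never drops below 1x

# Returning a 3-day-deadline unit in 1 day with k=0.75 multiplies points 1.5x:
print(qrb_points(10000, 0.75, 3.0, 1.0))  # 15000.0
```

Because the bonus scales with base points and return speed, a fast GPU with a passkey can see PPD jump by several multiples, which lines up with the 125k to 875k swing above.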


----------



## pez

Quote:


> Originally Posted by *bajer29*
> 
> I'm on the fence of selling my i5 4670K and SLI 980s to buy a EVGA GTX 1080 FTW2. I've been relatively happy with how my 980s perform in most games since I upgraded to a 4790K (no stutter, average FPS in GPU intensive games between 60FPS and 120FPS).
> 
> Would going single 1080 FTW2 smooth out my gaming in BF1 at 1440P ~144Hz?
> 
> Currently my setup gets me about 60-100 FPS in BF1 at 1440P with DX12 and settings on High on the most demanding MP maps.
> 
> Opinions?


Glad to see that CPU is treating you well.

My GF's rig is an i5 and a 1070 and does really decent at 1440p, though she doesn't play BF1. I think the 1080 is a great card for 16:9 1440p, if not one of the best for that res. Obviously the TXP/TXp/1080Ti are as well, but you wouldn't be dissatisfied.

However, going from 2x980s that are working well for you with no issues? You'll probably see a slight performance hit in some games with a single 1080. I think your ultimate upgrade path would be 2x1080 or the eventual Ti successor....I think your 980s would hold up well until that happens, however.


----------



## Sharchaster

Guys, one question: on BF1, which do you prefer to overclock? Core or memory?


----------



## Dasboogieman

Quote:


> Originally Posted by *Sharchaster*
> 
> Guys, one question: on BF1, which do you prefer to overclock? Core or memory?


Why not both?


----------



## Pafrarca

Quote:


> Originally Posted by *tangelo*
> 
> X+ uses different memory than non-X versions, so I would not try it.


Ok, thanks man


----------



## Sharchaster

Quote:


> Originally Posted by *Dasboogieman*
> 
> Why not both?


Both are overclocked on mine, but sometimes I think I'm limited by the power limit: no matter how much offset I set in MSI AB, my core clock downclocks to the same frequencies as when I don't overclock the core at all. When nothing much is happening I hold the offset, but in MP and single-player Hard mode there are a lot of explosions from tanks, people, etc.

So it bothers me.....I'm thinking of selling this card and getting the HOF version instead. At least I'd get the higher power limit, since for some reason I can't flash mine to raise it.


----------



## coreykill99

Quote:


> Originally Posted by *Sharchaster*
> 
> Both are overclocked on mine, but sometimes I think I'm limited by the power limit: no matter how much offset I set in MSI AB, my core clock downclocks to the same frequencies as when I don't overclock the core at all. When nothing much is happening I hold the offset, but in MP and single-player Hard mode there are a lot of explosions from tanks, people, etc.
> 
> So it bothers me.....I'm thinking of selling this card and getting the HOF version instead. At least I'd get the higher power limit, since for some reason I can't flash mine to raise it.


Are you consistently hitting the power limit? I might be wrong, but from what I've seen on my card, pushing the power limit slider all the way to max isn't always beneficial.
Mine is set @ +45%, core +210, mem +550, but if I increase the power limit slider beyond that, and I mean at all, I start seeing negative impacts on my scores and performance.
You could lower your power limit back down and ramp up your fans some more. Isn't Pascal really into cool temps? It doesn't seem to care about voltage; it's all about cooling, and pushing extra power into it when it doesn't need it can hurt its ability to cool itself. It looks like it's throttling itself to stay cool. What are your temps when this happens?
I just find it odd. I have an MSI Gaming card and even before I put it on water it didn't jump clocks around like you say yours does.


----------



## stephenn82

Guys,

If you are hitting the power limit with your slider all the way up and your overclock seems to be stuck at that level, what are your temps? What max voltage is displayed in RTSS or GPU-Z?

I have noticed that when I use the slider for offset overclocking, it wants to push a higher clock above 1093mV than the card's limits allow, even when the card is sitting at 2126 at 1093mV and temps are good. That's because the slider offset on the core clock wants to shoot super high on the right side of the curve.

Open Afterburner. There should be little cell bars just left of the clock setting; click them. If they're not there, hit Ctrl+F.

See your curve? The maximum my board will theoretically do out of the box is 1093mV; my FTW does, anyway. Maybe yours doesn't, but check.

Does your clock curve still want to go higher and higher on the right, past the 1093mV point? If so, you will ALWAYS see that limit warning.

Flatten out the dots on your curve so they are all the same as your highest-voltage setting (for instance, at 1093mV mine is at +105, and every other voltage dot to the right is also at +104, or as close as I can get it).

Do that, and report back whether the limit is still there or gone.
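
Afterburner has no scripting interface, so purely as a model of the editing rule described above: clamp every dot at or past your peak-voltage point to that point's clock. The curve values here are made-up examples, not real card data.

```python
# Model of "flattening" an Afterburner voltage/frequency curve (Ctrl+F view).
# points is a list of (millivolts, clock_mhz) sorted by voltage; every dot
# at or past cap_mv is clamped to the clock of the first dot at/above cap_mv,
# so the card stops requesting clocks it can never sustain at max voltage.

def flatten_curve(points, cap_mv):
    cap_clock = next(clk for mv, clk in points if mv >= cap_mv)
    return [(mv, min(clk, cap_clock)) if mv >= cap_mv else (mv, clk)
            for mv, clk in points]

curve = [(950, 1987), (1000, 2050), (1050, 2088), (1093, 2126), (1150, 2164)]
print(flatten_curve(curve, 1093)[-1])  # (1150, 2126): the top end no longer climbs
```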


----------



## Vellinious

I'm not sure why anyone would even bother using the offset method anymore.....by doing so, they're just allowing the software to create the voltage / frequency curve for them, which is usually terrible.


----------



## stephenn82

Quote:


> Originally Posted by *Vellinious*
> 
> I'm not sure why anyone would even bother using the offset method anymore.....by doing so, they're just allowing the software to create the voltage / frequency curve for them, which is usually terrible.


Agreed, good sir! I was just trying to share a bit of info learned from your posts and findings in the threads you linked, to help the above user with his limit woes. I hope it helped him.

To that user: thank Vellinious for all of his work and explanations on the curves and how to edit them. V helped me better understand how it all works, over what I'd say was a page and a half of back-and-forth posts throughout this thread.


----------



## Hequaqua

^^^Agreed


----------



## Sharchaster

Quote:


> Originally Posted by *coreykill99*
> 
> *Are you consistently hitting the power limit?* I might be wrong, but from what I've seen on my card, pushing the power limit slider all the way to max isn't always beneficial.
> Mine is set @ +45%, core +210, mem +550, but if I increase the power limit slider beyond that, and I mean at all, I start seeing negative impacts on my scores and performance.
> You could lower your power limit back down and ramp up your fans some more. Isn't Pascal really into cool temps? It doesn't seem to care about voltage; it's all about cooling, and pushing extra power into it when it doesn't need it can hurt its ability to cool itself. It looks like it's throttling itself to stay cool. What are your temps when this happens?
> I just find it odd. I have an MSI Gaming card and even before I put it on water it didn't jump clocks around like you say yours does.


Yes bro, I did...
I just want to run my core past 2.0GHz CONSTANT....because I HATE having a card like the GTX 1080 run at stock clocks (except for some reason like the game engine not liking it, etc.).
Sometimes it will run 2050MHz, but when there are explosions from tanks, people, etc., the core throttles to something like 2000MHz or less...
My temps MAX out at 73C, since I live with a 28C ambient temperature.

Quote:


> Originally Posted by *stephenn82*
> 
> Guys,
> 
> If you are hitting the power limit with your slider all the way up and your overclock seems to be stuck at that level, what are your temps? What max voltage is displayed in RTSS or GPU-Z?
> 
> I have noticed that when I use the slider for offset overclocking, it wants to push a higher clock above 1093mV than the card's limits allow, even when the card is sitting at 2126 at 1093mV and temps are good. That's because the slider offset on the core clock wants to shoot super high on the right side of the curve.
> 
> *Open Afterburner. There should be little cell bars just left of the clock setting; click them. If they're not there, hit Ctrl+F.
> 
> See your curve? The maximum my board will theoretically do out of the box is 1093mV; my FTW does, anyway. Maybe yours doesn't, but check.*
> 
> Does your clock curve still want to go higher and higher on the right, past the 1093mV point? If so, you will ALWAYS see that limit warning.
> 
> Flatten out the dots on your curve so they are all the same as your highest-voltage setting (for instance, at 1093mV mine is at +105, and every other voltage dot to the right is also at +104, or as close as I can get it).
> 
> Do that, and report back whether the limit is still there or gone.


Sorry, I don't quite understand your suggestion. Can you please explain it a little?
My max temp is about 73C, but that's rare...usually it's about 63C-68C...

Here's mine...


Edit: Never mind, I understand the statement now....
My max voltage is only 1.025V, which I believe is locked by MSI (I have a newer BIOS). So far my max is 2050MHz, which is quite low compared to the others. I was hoping to get 2100MHz in games like BF1.


----------



## stephenn82

Quote:


> Originally Posted by *Sharchaster*
> 
> Yes bro, I did...
> I just want to run my core past 2.0GHz CONSTANT....because I HATE having a card like the GTX 1080 run at stock clocks (except for some reason like the game engine not liking it, etc.).
> Sometimes it will run 2050MHz, but when there are explosions from tanks, people, etc., the core throttles to something like 2000MHz or less...
> My temps MAX out at 73C, since I live with a 28C ambient temperature.
> Sorry, I don't quite understand your suggestion. Can you please explain it a little?
> My max temp is about 73C, but that's rare...usually it's about 63C-68C...
> 
> Here's mine...
> 
> 
> Edit: Never mind, I understand the statement now....
> My max voltage is only 1.025V, which I believe is locked by MSI (I have a newer BIOS). So far my max is 2050MHz, which is quite low compared to the others. I was hoping to get 2100MHz in games like BF1.


It will be hard hitting that at 1.025V. I don't think it's enough voltage to push that clock.

It's hard-locked to 1.025? That is sorta low.


----------



## Vellinious

Get the core temps lower, and it'll do it just fine. Lower temps mean less voltage is needed to run the target clock.


----------



## stephenn82

that is pretty crazy to know these cards run that well. Man, Nvidia has been busy since I owned the 8800GT.


----------



## Sharchaster

Quote:


> Originally Posted by *stephenn82*
> 
> It will be hard hitting that at 1.025V. I don't think it's enough voltage to push that clock.
> 
> It's hard-locked to 1.025? That is sorta low.


Yeah, it's low....sometimes the voltage drops to 0.993V, which is VERY LOW for running 2050MHz, because it's hitting the power limit.

EDIT:

It's really strange on mine: when I move the slider up (not using the above method) I can easily hit beyond 1.025V (it sometimes drops when the power draw goes past the limit set in AB).

That doesn't happen with the previous method; it stays steady at 0.975 no matter what I set.


----------



## bajer29

Quote:


> Originally Posted by *pez*
> 
> Glad to see that CPU is treating you well.
> 
> My GF's rig is an i5 and a 1070 and does really decent at 1440p, though she doesn't play BF1. I think the 1080 is a great card for 16:9 1440p, if not one of the best for that res. Obviously the TXP/TXp/1080Ti are as well, but you wouldn't be dissatisfied.
> 
> However, going from 2x980s that are working well for you with no issues? You'll probably see a slight performance hit in some games with a single 1080. I think your ultimate upgrade path would be 2x1080 or the eventual Ti successor....I think your 980s would hold up well until that happens, however.


Yeah, it's working really great. I had an issue with the 4790K getting pretty hot, but found the H100i wasn't keeping up anymore and had to ditch it for a CRYORIG R1 Ultimate.

That being said, I think I'll probably hold off on the GPU upgrade for a while, until the 1080's successor. The goal is to find a good single-card solution for my next upgrade that will run 1440P 144Hz with games on high settings. We'll see.


----------



## Vellinious

Quote:


> Originally Posted by *Sharchaster*
> 
> Yeah, it's low....sometimes the voltage drops to 0.993V, which is VERY LOW for running 2050MHz, because it's hitting the power limit.
> 
> EDIT:
> 
> It's really strange on mine: when I move the slider up (not using the above method) I can easily hit beyond 1.025V (it sometimes drops when the power draw goes past the limit set in AB).
> 
> That doesn't happen with the previous method; it stays steady at 0.975 no matter what I set.


Show us a screenshot of your GPUz. Sounds to me like even when you up the voltage, the curve and boost 3.0 are lowering the voltage automatically. You could easily fix this by correcting the voltage / frequency curve.


----------



## pez

Quote:


> Originally Posted by *bajer29*
> 
> Yeah, it's working really great. I had an issue with the 4790K getting pretty hot, but found the H100i wasn't keeping up anymore and had to ditch it for a CRYORIG R1 Ultimate.
> 
> That being said, I think I'll probably hold off on the GPU upgrade for a while, until the 1080's successor. The goal is to find a good single-card solution for my next upgrade that will run 1440P 144Hz with games on high settings. We'll see.


Sweet! A gorgeous and beefy cooler, that one, too. And yeah, I think the 1080Ti (and its Titan brethren) do great for 1440p/144Hz, but of course it depends on what you play. The 1070 and 1080 are great for it, too. Honestly, I would have stuck with a 1080 had it run 3440x1440 the way I wanted it to, or had I kept the 1440p monitor I had previously.


----------



## Hequaqua

Well....having a few issues that I can't seem to pinpoint.

First one is this:



The issue goes away with a restart. As you can see there was no load on the card. The driver was installed, I was reinstalling it. It seems to happen when I exit ME: Shadow of War. Which brings me to my second issue.

Here is the log that game keeps for benchmarks:



As you can see....I'm getting around 45fps. Those were run at different quality levels; the last one was at 4K.

Something isn't right. The benchmark never loads the GPU core; it normally sits around 18%-40%. Voltage is always really low, as is the power draw. The game itself seems to be OK: it loads the core to around 98% and voltages/power look normal. This is with the latest driver...the one that is "optimized" for the game.

Any ideas?

EDIT: GPU-Z running the benchmark:


----------



## Sharchaster

1. Just roll back to the previous driver that worked on your card.
2. I'd start thinking about RMAing your card (since it happens at stock).


----------



## Hequaqua

I did a clean install using DDU, rolled back to 385.69. No more black squares(so far), but the in-game benchmark scores are still horrible. My son is going to install it on his rig, and I will run the benchmark on it and see what he gets.

Really odd to me.....I can exit the game, run Firestrike/Time Spy/The Division/etc...card acts fine.

Of course this game is from WB (remember all the Batman: Arkham Knight issues at release?)....lol

EDIT: I have set up the game in the NVCP as well. Max performance-power setting. Adaptive-Global.

EDIT II: I'm leaning toward the game for the crappy performance. I ran my benchmarks yesterday on the new driver without a single issue. These only started happening last night when the game went live.

https://docs.google.com/spreadsheets/d/1y6bRLEpKu4QYrnA10A0ZUPDVlB_6N7OX157-l-vjsI0/edit?usp=sharing


----------



## stephenn82

Hey, anyone have a 1070/1080 and updated to 1703 (Creators Update) and had issues? I have been putting it off because of the people complaining about frame drops in the past. It looks like if you turn Game Mode off, you don't have to worry. Should I just update to 1703 vice 1607?


----------



## Hequaqua

I haven't really had any issues with the update. Game Mode is really meant to help lower-end hardware, from my understanding. I thought I had a spreadsheet showing performance after the update, but I can't seem to find it.

EDIT: Here ya go...this is with the GTX1060. Superposition was new, but all the older ones are there. It's labeled at the top of the spreadsheet.

https://docs.google.com/spreadsheets/d/1ZQUgpBE69A0FYsYWyo3A9pXdadCZVVx7ng6qUjPBugQ/edit?usp=sharing


----------



## Hequaqua

Update on my issue with ME:Shadow of War.

I'm not getting those crazy artifacts now....still having an issue with performance though.

Here is a quick video. You can see core load, and voltages aren't right. Again, I can exit the game, play something else without issue.


----------



## stephenn82

Quote:


> Originally Posted by *Hequaqua*
> 
> Update on my issue with ME:Shadow of War.
> 
> I'm not getting those crazy artifacts now....still having a issue with performance though.
> 
> Here is a quick video. You can see core load, and voltages aren't right. Again, I can exit the game, play something else without issue.


Nice! I may just update it then. It looks like some later updates fixed a lot of issues; the first rollout had problems, and that was with the 1080 and Ti.


----------



## Hequaqua

I found the issue with ME:Shadow of War.....WINDOWS 10!

I installed the latest Insiders Build....BAM!

Stock:









OC +125/+550:




----------



## stephenn82

which build do you have?

I finally update to 15063.674 but havent gamed on it yet.


----------



## Hequaqua

1709(16299.15)

I believe this is going to be the Fall Creator's Update Build. I got it through the Insider's Program. It was actually a different version I upgraded to, but after installing that and running Windows update, that is what it shows.

The other build had a "evaluation build" in the bottom right corner of my desktop. That is now gone. I believe that version was 16296. I'm not 100% sure, I didn't check it.

EDIT: Just logged into the Insider's Program...it was 16296.


----------



## stephenn82

Does that vastly improve your gaming? DX11 or 12? I might just skip to that instead...do we have to be signed in with a Microsoft account, though? I would rather TRY the goods without them linking me to my computer, if you know what I mean.


----------



## Hequaqua

I'm not sure if you can still join the Insider's program. I know the Fall update is supposed to roll out on the 17th. If you wait until then, you can install it, and roll back if you just do an upgrade. It keeps the old version on your computer for 30 days, then deletes it, IIRC.

I haven't really messed with it.....I just ran that benchmark and went back to mining...lmao

EDIT: I don't care for DX12. If Microsoft didn't have so much $$ into it, I think a lot of developers would go with the Vulkan API. That's just my opinion.


----------



## stephenn82

I am already part of the Insiders program; I beta tested Win 7 and Win 10 before either released. I just don't want my computer to be logged in with a Microsoft account.


----------



## Hequaqua

I hear ya....

It doesn't bother me really....most of the big corps know so much about us all anyway....lmao


----------



## stephenn82

It's true...they do. From your phone to your TV...

So, my max power draw while running AVX loads and Superposition at 1080p Extreme topped the scales at 340W. I'm at just under 40% load on my PSU...I guess it's too big for this modern equipment now. Maybe I need another 1080 to help me eat the power up? I bet the wife would LOVE that...lol


----------



## Hequaqua

Quote:


> Originally Posted by *stephenn82*
> 
> its true...they do. from your phone to your tv...
> 
> So, my max power draw while running AVX loads and superposition at 1080p extreme topped the scales at 340W. I am just under 40% load/efficiency of my PSU...I guess its too big now for these modern equipments. Maybe I need another 1080 to help me eat the power up? I bet the wife would LOVE that...lol


Mine just say's, "You're going to do what you want...why ask me?"....lmao


----------



## trivium nate

I finally just ordered an EVGA GTX 1080 for myself.

I'm pumped!!


----------



## stephenn82

Wooo hoooooo!! Which model? SC, SSC, Classified, FTW? They're all good cards.


----------



## trivium nate

the 08G-P4-6284-KR


----------



## Hequaqua

That's the FTW ACX...nice.

I almost went EVGA, but got a pretty good deal on the Gigabyte.


----------



## Gen Patton

Question: does the 1080 play Mordor OK? I was thinking about buying that series.


----------



## Sharchaster

Quote:


> Originally Posted by *trivium nate*
> 
> I finally just ordered an EVGA GTX 1080 for myself.
> 
> I'm pumped!!


welcome to the club, and enjoy your card....


----------



## Ascii Aficionado

Received my 1080 this week.

Anyone here interested in Shadow of War for half off ?

I'd accept PUBG in a trade on Steam.

Not sure if I'm actually allowed to ask this here, although I do have some rep and positive trader feedback.


----------



## Mayclore

I got my 1080 at 5 PM yesterday, but didn't get to install it until 8.

Considering the fact that I replaced a GTX 650 with it, I have experienced a _slight_ uplift in performance.


----------



## lanofsong

Hey GTX 1080 owners,

We are having our monthly Foldathon from Monday the 16th to Wednesday the 18th, starting at 12 noon EST.
Would you consider putting all that awesome GPU power to a good cause for those 2 days? If so, come *sign up* and fold with us - see the attached link.

October 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), enter your passkey, and enter the Team OCN number - 37726

later
lanofsong


----------



## stephenn82

Quote:


> Originally Posted by *Mayclore*
> 
> I got my 1080 at 5 PM yesterday, but didn't get to install it until 8.
> 
> Considering the fact that I replaced a GTX 650 with it, I have experienced a _slight_ uplift in performance.


That's very unlikely. What are you running, a single-core Celeron?


----------



## Hequaqua

Quote:


> Originally Posted by *stephenn82*
> 
> That's very unlikely. What are you running, a single-core Celeron?


I took it as sarcasm....lol

EDIT: Just figured I would add this here....I'm having a strange issue when installing drivers. It seems that once I unpack them and run the installer, it won't let me run it again. Say I just installed the driver and PhysX, and I want to go back and reinstall the driver/PhysX/GFE; I always get this notice. It doesn't matter where I unpack them to, or which drive I have them on.









The only workaround I have found is to unpack them again. This isn't new with the 1080...it was happening before with my GTX1060, and on my other rig when I moved my cards around. Before, I could unpack them, go to the folder, and run them again.

Any ideas? Anyone else have this issue?

Thx


----------



## outofmyheadyo

How is the Gigabyte G1 Gaming card, any good?


----------



## Yetyhunter

I just ordered this beast http://www.palit.com/palit/vgapro.php?id=2619 along with an entire Coffee Lake system and a 240Hz monitor. What overclocking potential and cooling performance should I expect from this card?


----------



## KGB1st

Is it possible to change the fan speed in a BIOS editor for the Poseidon 1080 Ti?


----------



## Gen Patton

Great card, EVGA does well. I have the Founders card.


----------



## Hequaqua

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How is the Gigabyte G1 Gaming card, any good?


I'm liking it so far. I'm mining and gaming on it.









I've been working on some driver benchmarks; I have three sets done and am about to start another later today. Of course, it's power limited, and Pascal's voltages are all over the place unless you set the frequency/voltage table yourself.

Here is what it's doing mining:









It's getting a little over 25.5 Mh/s (Ethereum)....and it's only using about 10-15W more than my GTX1060 does in my other rig. Average wattage over the last 9hrs is 138.822.







GDDR5X is not the best for mining, though. It doesn't load up the memory controller: the 1060's is maxed, while, as you can see, the 1080's sits around 79%. Of course, that keeps the TDP down to around 70%.

Overall, pretty happy with it.
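
From the numbers in that post, the efficiency works out to roughly 0.18 MH/s per watt; a trivial check:

```python
# Mining efficiency from the figures above: hashrate per watt of draw.
# 25.5 MH/s and the 138.822W 9-hour average are the values from the post;
# that's card power, not whole-system draw.

def efficiency_mh_per_w(hashrate_mh, avg_watts):
    return hashrate_mh / avg_watts

print(round(efficiency_mh_per_w(25.5, 138.822), 3))  # 0.184
```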


----------



## Gdourado

Any advice on the Aorus 1080 11gps vs the 1080 amp extreme?
With both costing the same, which is the better card?
From what I can tell, almost all 1080s cap out at 2050 core give or take, so it might be a matter of which has the better cooling solution with regards to temperature and noise, but please correct me if I am wrong.

Thank you.
Cheers


----------



## Sharchaster

Quote:


> Originally Posted by *Gdourado*
> 
> Any advice on the Aorus 1080 11gps vs the 1080 amp extreme?
> With both costing the same, which is the better card?
> From what I can tell, almost all 1080s cap out at 2050 core give or take, so it might be a matter of which has the better cooling solution with regards to temperature and noise, but please correct me if I am wrong.
> 
> Thank you.
> Cheers


I honestly prefer Gigabyte over Zotac in many ways, because of the warranty. But it depends on where you live.


----------



## Hequaqua

I looked at the AMP...and thought, why did they put that yellow all over the card? It's bad enough we get the red from MSI, and orange from Gigabyte. lol At least on the G1, most of the orange is underneath and not splattered all over the backplate...lol


----------



## sinholueiro

I've installed a Kraken G12 and a Thermaltake Water 3.0 Extreme S on my 1080 Armor. Recommended so far: 47 degrees at load and a stable 2114MHz. Much better than the previous configuration of ~1600MHz at 65 degrees that I had to run to keep the temperature under control.


----------



## outofmyheadyo

Whoever thought of using the 1070 cooler on the 1080 Ti over at MSI should be spanked.


----------



## Gdourado

Quote:


> Originally Posted by *Sharchaster*
> 
> I honestly prefer Gigabyte over Zotac in many ways, because of the warranty. But it depends on where you live.


Quote:


> Originally Posted by *Hequaqua*
> 
> I looked at the AMP...and thought, why did they put that yellow all over the card? It's bad enough we get the red from MSI, and orange from Gigabyte. lol At least on the G1, most of the orange is underneath and not splattered all over the backplate...lol


Well, I was looking into this, and it seems the AMP, despite featuring a massive cooler, has somewhat of a design flaw: the VRM cooling is not effective.
So I guess the Aorus is the better choice.

Cheers


----------



## BeeDeeEff

Are there any decent aftermarket air coolers that would work with my GTX 1080 EVGA Founder's Edition? I want to overclock it, but even at stock I'm getting into the high 70s, sometimes hitting 80°C. It would also be nice for it to be quieter under load. I'm not finding a lot of data to compare options.

Currently looking at the ARCTIC Accelero Xtreme III.

I would just buy the EVGA hybrid kit, but there is no room in my case to fit the radiator+fan. I have tons of room below my graphics card for a taller cooler that takes up more PCI slots, but not enough room in any other direction. My NH-D15 cooler sits too close to the top of my case to mount a rad+fan there, and the same goes for the rear exhaust fan mount. The hard drive and optical disk bays are both riveted to the front and occupied, with not enough space between them to rig the rad.

Are there any other coolers I should have my eye on?


----------



## xartic1

Quote:


> Originally Posted by *BeeDeeEff*
> 
> Are there any decent aftermarket air coolers that would work with my GTX 1080 EVGA Founder's Edition, I want to overclock it but even at stock I'm getting in the high 70s sometimes hitting 80°C. Also would be nice for it to be quieter under load. Not finding a lot of data to compare options.
> 
> Currently looking at the ARCTIC Accelero Xtreme III
> 
> I would just buy the EVGA hybrid kit, but there is no room in my case to fit the radiator+fan. I have tons of room below my graphics card to get a taller cooler that takes up more pci slots, but not enough room in any other direction. NH-D15 cooler too close to the top of my case to mount a rad+fan, same for the rear exhaust fan mounting. Hard Drive bays and optical disk bays are both riveted to the front and occupied with not enough space between them to rig the rad.
> 
> Are there any other coolers I should have my eye on?


Yes, there are better ones, such as the Prolimatech MK-26. I used it on my 1080 FE when I owned one, and it's currently on my 1080 HOF. It's a very large cooler that eats up about 4 slots with two 140mm fans, but it will keep your card in the 50s. Depending on your choice of fans, it can run even cooler.


----------



## BeeDeeEff

Quote:


> Originally Posted by *xartic1*
> 
> Yes there are better ones, such as the prolimatech mk26. I used it on my 1080fe when I owned one and currently on my 1080hof. It's very large cooler that eats up about 4 slots with 2 140mm fans but it will keep your card in the 50s. Depending on your choice of fans, it can even perform cooler.


Thanks for the lead! Time to actually take some measurements of my case room when I get home.


----------



## stephenn82

Quote:


> Originally Posted by *BeeDeeEff*
> 
> Are there any decent aftermarket air coolers that would work with my GTX 1080 EVGA Founder's Edition, I want to overclock it but even at stock I'm getting in the high 70s sometimes hitting 80°C. Also would be nice for it to be quieter under load. Not finding a lot of data to compare options.
> 
> Currently looking at the ARCTIC Accelero Xtreme III
> 
> I would just buy the EVGA hybrid kit, but there is no room in my case to fit the radiator+fan. I have tons of room below my graphics card to get a taller cooler that takes up more pci slots, but not enough room in any other direction. NH-D15 cooler too close to the top of my case to mount a rad+fan, same for the rear exhaust fan mounting. Hard Drive bays and optical disk bays are both riveted to the front and occupied with not enough space between them to rig the rad.
> 
> Are there any other coolers I should have my eye on?


Why not EVGA's own AIO hybrid cooler? It's quiet and drops temps a lot.


----------



## BeeDeeEff

Quote:


> Originally Posted by *stephenn82*
> 
> why not EVGA's own AIO Hybrid cooler? Its quiet and drops temps a lot.


No room for the radiator (my case is small and my CPU heatsink is the NH-D15).

Just put in an order for the Raijintek Morpheus II after reading about and comparing several similar products. Also kind of glad not to add an AIO with a pump back in after I just replaced one. There's really not much hard data for this kind of purchase; not many people would go through the trouble of putting on an aftermarket GPU heatsink without having already gone full custom liquid.


----------



## stephenn82

What case?


----------



## BeeDeeEff

Quote:


> Originally Posted by *stephenn82*
> 
> What case?


Corsair Carbide 300R, and here's a pic of my current internals.

I have my old single-fan 120mm AIO radiator that I used to use on my CPU; it measures the same as the one on the EVGA all-in-one, and I can't fit it (with a fan) on my rear exhaust or above my NH-D15. I can expand my GPU thickness up to 5 slots, nothing but room in that direction, but I can't mount a 120mm radiator.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *BeeDeeEff*
> 
> Corsair Carbide 300R and here's a pic of my current inside.
> 
> I have my old single-fan AIO 120mm radiator that I used to use on my cpu, measures the same as the one on the EVGA all-in-one and I can't fit it (with a fan) on my rear exhaust nor above my NH-D15. I can expand my GPU thickness up to 5 slots, nothing but room in that direction, but I can't mount a 120mm radiator.


Maybe you could get it in between the 5.25" bay and the 3.5" bay?


----------



## stephenn82

Yeah man. Either custom loop or get super innovative with a hybrid.


----------



## BeeDeeEff

Quote:


> Originally Posted by *6u4rdi4n*
> 
> Maybe you could get it in between the 5.25" bay and the 3.5" bay?


Tried it, too tall, and I'd have to do some kind of ******* engineering to mount it, as there are no holes for anything there. In retrospect I'm rather looking forward to not having any pump in my case, with the only moving parts being Noctua fans; it's super quiet.

My next case will def be larger, but don't want to change from my current one until I upgrade my mobo/cpu/ram.
Quote:


> Originally Posted by *stephenn82*
> 
> Yeah man. Either custom loop or get super innovative with a hybrid.


I'm just getting a b.a. aftermarket air cooler and gonna slap two 120mm Noctua fans on it. Already ordered a Raijintek Morpheus II and the fans I'll need.

Will update with the results, and thanks for the ideas.


----------



## Sharchaster

Thinking of putting a Corsair H80-series AIO on my GTX 1080.


----------



## AngryGoldfish

Is water cooling really necessary for a 1080 though, especially from an H80 AIO? My 1080 barely reaches 60°C on air inside a small case while being remarkably quiet. It's not silent, but neither is an H80. The performance uplift will be small since the 1080 is not temperature limited.


----------



## PhatMuffinMan

Howdy all -

I wanted to say hey. I am a new 1080 owner.

I was able to pick up a MSI 1080 Duke for a good deal. Coming from a GTX 970 it is a pretty nice jump for gaming at 1080p.

Does anyone happen to know or can point me in the direction for OC a MSI Duke, specifically?

Hoping to make some new posts on the 1080. It's nice to be here!


----------



## Hequaqua

Quote:


> Originally Posted by *PhatMuffinMan*
> 
> Howdy all -
> 
> I wanted to say hey. I am a new 1080 owner.
> 
> I was able to pick up a MSI 1080 Duke for a good deal. Coming from a GTX 970 it is a pretty nice jump for gaming at 1080p.
> 
> Does anyone happen to know or can point me in the direction for OC a MSI Duke, specifically?
> 
> Hoping to make some new posts on the 1080. It's nice to be here!


All the cards OC about the same.....

Set the power limit to max.
Adjust clocks until it crashes, then back it down a notch.
Look for PerfCaps in GPU-Z.
Keep an eye on temps/thermal throttling (doubtful if you keep a nice fan profile).
GDDR5X will OC pretty high, but performance falls off at some point (depends on what you are doing). Example: past +550 I will crash under a 3D load; mining, it will run +1000 all day long.

Boost 3.0 is a bit wacky on the 10 series....voltage is also a bit wacky...lol

You can use the frequency/voltage table in MSI Afterburner....after opening it, hit Ctrl-F and it will open a pane to adjust clocks/voltage.

For my card (single 8-pin), the power limit is holding me back, at least in synthetic benchmarks. Games don't seem to be as bad. My G1 is pretty stable at +125 core, +500/+550 memory. If I go above +550, the performance starts to fall off.

EDIT: Here is a spreadsheet that should give you some baseline numbers. These are all at stock. Four sets of drivers.

https://docs.google.com/spreadsheets/d/1y6bRLEpKu4QYrnA10A0ZUPDVlB_6N7OX157-l-vjsI0/edit?usp=sharing
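For what it's worth, the "raise it until it crashes, then back it down a notch" loop is just a linear search. A minimal Python sketch of that procedure, where `is_stable` is a stand-in for whatever stress test you actually run (a benchmark pass, an hour of gaming), not a real Afterburner or driver API:

```python
def find_stable_offset(is_stable, step=25, limit=300):
    """Step the core-clock offset up by `step` MHz until the stress
    test reports a failure, then settle one notch below that point.

    is_stable: callable taking an offset in MHz, returning True if the
    card survives the stress test at that offset (hypothetical hook).
    """
    offset = 0
    # Keep raising the offset while the next notch still passes.
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    # The last offset that passed is the one to keep.
    return offset
```

In practice each `is_stable` call costs real test time, so a coarse step first and a finer step around the failure point saves a lot of it.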


----------



## PhatMuffinMan

Quote:


> Originally Posted by *Hequaqua*
> 
> All the cards OC about the same.....
> 
> Set the Power Limit to max
> Adjust Clocks until it crashes, back it down a notch
> Look for Perfcaps in GPU-Z
> Keep a eye on temps/thermal throttling(doubtful if you keep a nice fan profile)
> GDDR5X will OC pretty high, but performance falls off at some point(depends on what you are doing) Example: after +550 I will crash with a 3D load, mining it will run +1000 all day long.
> 
> Boost 3.0 is a bit wacky on the 10 series....Voltage is also a bit wacky...lol
> 
> You can use the Frequency/Voltage table in MSI Afterburner....after opening it, hit ctrl-F, and it will open a pane to adjust clocks/voltage.
> 
> For my card(single 8-pin), power limit is holding me back, at least in synthetic benchmarks. Games don't seem to be as bad. My G1 is pretty stable at +125 Core, +500/+550 Memory. If I go above the +550, the performance starts to fall off.
> 
> EDIT: Here is a spreadsheet that should give you some baseline numbers. These are all at stock. Four set of drivers.
> 
> https://docs.google.com/spreadsheets/d/1y6bRLEpKu4QYrnA10A0ZUPDVlB_6N7OX157-l-vjsI0/edit?usp=sharing


Okay. Sounds good. I only do gaming. Thank you for the help!


----------



## Hequaqua

Quote:


> Originally Posted by *PhatMuffinMan*
> 
> Okay. Sounds good. I only do gaming. Thank you for the help!


No problem.

You will enjoy the gains over the 970. I had those in SLI....very nice cards. I just wish we had the tools for Pascal that we had for Maxwell. A lot of fun modding the BIOS on those....


----------



## PhatMuffinMan

Quote:


> Originally Posted by *Hequaqua*
> 
> No problem.
> 
> You will enjoy the gains over the 970. I had those in SLI....very nice cards. I just wish we had the tools for Pascal that we had for Maxwell. A lot of fun modding the bios on those....


I still have my 970. Would I be able to SLi a 970 with my 1080 or is that just stupid? I mean the 1080 is nice, but we can always have more POWER!! lol


----------



## Hequaqua

Quote:


> Originally Posted by *PhatMuffinMan*
> 
> I still have my 970. Would I be able to SLi a 970 with my 1080 or is that just stupid? I mean the 1080 is nice, but we can always have more POWER!! lol


I'm not sure you would want to SLI, really. I think it would only use 4GB of VRAM on both cards, and I'm also not sure about the SLI bridge. You could use the 970 to handle PhysX, though; that might give a little improvement. To be honest, the 970 is power hungry compared to the 1080. The 970 will use almost as much as a 1080 maxed and fully loaded, pretty close anyway, depending on the TDP of both cards. IIRC, my 970 would easily use over 200W under load. The 1080 might peak at 220W, but it never stays there for the duration.

EDIT: Full disclosure, I was running a modded bios. 305w TDP, 1550*1506*/[email protected]


----------



## PhatMuffinMan

Quote:


> Originally Posted by *Hequaqua*
> 
> I'm not sure you would want to SLI really. I think it would only use 4gb of VRAM on both cards. I'm also not sure about the SLI bridge. You can use the 970 to handle the PhysX though. That might give a little improvement. To be honest though, the 970 is power hungry compared to the 1080's. The 970 will use almost as much as a 1080 maxed and fully loaded. Pretty close anyway, depends on the TDP for both cards. IIRC, my 970 would use over 200w under load easily. The 1080 might peak out at 220w, but it never stays there for the duration.
> 
> EDIT: Full disclosure, I was running a modded bios. 305w TDP, 1550*1506*/[email protected]


Wow, thanks for the input. It looks like someone is going to buy it from me actually!


----------



## Hequaqua

Quote:


> Originally Posted by *PhatMuffinMan*
> 
> Wow, thanks for the input. It looks like someone is going to buy it from me actually!


Great!

I sold mine off about a year ago. I picked up a 1060 first, then an RX 480. Just sold the 480 a few weeks ago and picked up the 1080. I still have the 1060 in my other rig mining away...

I've done pretty well mining....about $275.00 in a little over 2.5 months. That's with 3 cards, one mining 24/7. I use my son's RX 470 part of the time, and the 1080 when I'm not benching or gaming. I'm addicted to benchmarking and GPUs.


----------



## pez

Quote:


> Originally Posted by *AngryGoldfish*
> 
> Is water cooling really necessary for a 1080 though, especially from a H80 AIO? My 1080 barely goes 60°C on air inside a small case while being remarkably quiet. It's not silent, but neither is a H80. The performance uplift will be so small since the 1080 is not temperature limited.


You can buy a fan to make the H80 or any AIO silent, though it will most likely run slightly slower than the 2k+ RPM garbage that Corsair includes with its AIOs. I only run mine on an AIO because of a space restriction. I have a Phanteks F120MP on my AIO, and at its maximum fan speed it's still heaps quieter than a GPU cooler at full speed.

That being said, if you're already at a sound level you can stand and temps you're happy with, I'd say it's not worth the trouble.


----------



## AngryGoldfish

Quote:


> Originally Posted by *pez*
> 
> You can buy a fan to make the H80 or an AIO silent, but will most likely run slightly slower than the 2k+ RPM garbage that Corsair includes with the AIOs. I only run mine on a AIO b/c of a space restriction. I have a Phanteks F120MP on my AIO and at it's maximum fan speed it's still heaps quieter than a GPU cooler at full speed.
> 
> That being said, if you're already at a sound level you can stand and temps you're happy with, I would think it's not worth the trouble.


What about pump noise? I haven't tested the newest iterations to see if pump noise has improved.


----------



## pez

Quote:


> Originally Posted by *AngryGoldfish*
> 
> What about pump noise? I haven't tested the newest iterations to see if pump noise has improved.


My H60 has some pump noise to it, but it somehow seems to have subsided over time. Thankfully temps are consistent, so it's working correctly. My EVGA CLC, however, has had no pump noise. The most noticeable thing from my system is actually coil whine, now that it's reasonably quiet.


----------



## AngryGoldfish

Quote:


> Originally Posted by *pez*
> 
> My H60 has some pump noise to it, but somehow it seems to have subdued after some time. Thankfully temps are consistent so it's working correctly. My EVGA CLC however has not had pump noise. The most noticeable thing from my system is actually the coil whine now that it's reasonably quiet.


The pump noise from the Cooler Master AIO I had for a few years came and went periodically. That's not to say the noise went away completely, but it went from being unbearably annoying to bearable.


----------



## pez

Quote:


> Originally Posted by *AngryGoldfish*
> 
> The pump noise from the Cooler Master AIO I had for a few years came and went periodically. That's not to say the noise went away completely, but it went from being unbearably annoying to bearable.


Indeed. At least now the fans are just loud enough to drown out any pump noise. The system is dead silent to the point I keep my air purifier on the middle setting to create some ambient noise.


----------



## Myzc

I have 2 FEs.
Gigabyte and EVGA.


----------



## kmac20

Hey guys, first of all: I apologize in advance that this is a long post. But I have been having some problems recently that are making me "afraid" to run certain games I just spent decent money on, on a build that is less than 2 months old and has cost me a lot of money (at least a lot to me). So I apologize in advance that this is long, but I'm looking for help anywhere I can get it. Thanks to anyone with any ideas, input, or things to try.

I recently did a new build. Ryzen 1700, top-of-the-line ASRock Taichi X370, RAM running at 3066, etc. Originally I went with a 1060, but eventually sold that to a friend when I did a build for his girlfriend and bought myself an EVGA GTX 1080 FTW2.

I have been having some weird issues that I was discussing in my motherboard thread, thinking at first it was possibly the board because it was spitting out all sorts of random debug codes (ranging from memory to GPU to CPU), but it has since stopped doing that after a new install and a reseat of everything. I've run HCI memtest over 1000% on all my memory, so I'm confident it's not that. I've also run tons of Prime95 on this chip in the past at both its current and OC'd speeds, so I'm confident it's not that either.

Here's my problem:

When I played GTA 5, it ran fine for a while. In fact, the first time I played it for at least a week before any issue. Then the game crashed, hard. It required me to manually restart the system. Not long after that, it crashed hard again (not even in game this time) and eventually got into blue-screen-of-death looping. I had to reinstall Windows.

I reinstalled it, and everything was fine at first. This time I got a different, regular CTD in GTA 5 that I looked up, and it seems pretty common. Ok, fine; one way to resolve it seems to be playing GTA 5 without an overclock. No problem, I have a 1080 and I'm only at 1440p, so I can max everything out except MSAA anyway. It lasted a bit longer this time, then BAM, CRASHED HARD AGAIN, triggering blue-screen looping and requiring a fresh install.

Now here are some weird things:
1) It has only happened so far in GTA 5, and once in Civ 5. But the Civ 5 crash was just a regular CTD, and if I remember correctly the error message was similar to the regular CTD in GTA 5 (maybe my GPU or PC hates the number 5 like Valve hates the number 3)
2) WHEN I WAS RUNNING IN SAFE MODE WHILE GETTING THE BLUE SCREEN LOOPING, TRYING TO FIX IT, I HAD NO. ISSUES. The blue screens stopped. Obviously I couldn't run the games I wanted, but whereas I literally could not boot into Windows AT ALL (it would blue screen IMMEDIATELY), I could leave it in safe mode forever without issues.
3) No more debug code errors on my motherboard.
4) No other games seem to trigger this so far. Not sure if this is a DX11-related issue or if these games simply put more stress on the entire system and are triggering it. I would like to borrow the 1060 I sold to my friend, as I could intentionally try to trigger this; if it didn't happen I'd know 100% it's the GPU, but I don't know if that's possible.

Now, I played through ALL OF DOOM (2016) on the 1060 without a SINGLE ISSUE. Literally played through the ENTIRE CAMPAIGN without one problem. Unfortunately I had to reinstall Windows after the first hard crash from GTA 5 and haven't had a chance to play it since, but maybe I'll reinstall it this weekend and check. If I start getting similar crashes, I probably have my answer. If not, then I'm still a bit lost.

I also know that my GPU has a dual-BIOS feature. I've only ever had motherboards with dual BIOS, and there it has always been just in case you corrupt one: your motherboard isn't bricked, as you can reflash the corrupted one and repair it. I know NOTHING about dual BIOS on GPUs. I remember hearing one time that one was better for overclocking? Are the BIOS versions the same? If they're different, perhaps I'll flip the switch to the second BIOS and see what's up. If a different BIOS version resolves it, perhaps flashing a different BIOS would fix it as well. If the BIOS versions are identical, there's no point in trying this.

SYSTEM SPECS ARE BELOW. Everything is the same EXCEPT I replaced the 212 EVO with a Cooler Master liquid cooler (small 120mm rad with 2 fans), and the EVGA GTX 1060 is now an EVGA GTX 1080 FTW2. And I can say pretty confidently it's not the power supply, as it's an 850W Seasonic Focus, and while my Ryzen IS overclocked, its TDP is only 65W; so even if it's over 100W, and even assuming the GPU is over 200W, I doubt I'm even hitting half of that 850W after all is said and done.

I'm pretty lost as of now. I know a LOT about GPUs, but I have never seen problems like this before. I also admittedly don't know a lot about GPU BIOS versions, as the last time I had to worry about that was like....13 years ago, when you could flash the BIOS on something like a 6800 to get some good performance out of it.

ANY ADVICE OR INPUT IS GREATLY APPRECIATED IN ADVANCE. I'm really at an annoying point right here, as I REALLY DON'T WANT to have to RMA this card: it's brand new, and that would leave me without a system for a while (a Ryzen build NEEDS a GPU), and I have no spare except a SUPER OLD X1900 XTX (lol, no DisplayPort or 1440p support) or a half-broken other PC. Meaning I would need to PAY TO SHIP THIS BACK TO EVGA, AND PAY TO CROSS-SHIP IT.

EVGA has always been good to me; that's why they're one of the few brands I'm loyal to (I'll swap back and forth between NVIDIA and AMD no problem, but any time I get an EVGA card they always honor warranties and help out with stuff, so maybe they'll compensate me a bit as a result of these issues). But it'll still push the cost of this card up another, what, $50+, especially if I want to cross-ship the card so it gets here sooner. Perhaps they'll reimburse me or something, IDK; I have bought about a grand's worth of graphics cards from them in the past 2 months or so, so I'm sure they would.

BUT I would like to avoid having to RMA it if I can diagnose this problem and fix it.

Sorry this is a long post, but basically I'm trying to explain my situation and the oddities that have surrounded it. All other games are fine, but they are wayyyyy less demanding than GTA 5 or even Civ 5 in DX11. I just bought The Witcher 3 on sale; maybe I'll give that a shot. I would imagine the very demanding aspects of DX11 are stressing the system and GPU enough to trigger these problems. But it seems almost totally related to software. I would wager that the super-hard crashes in GTA 5 crash the system so hard they corrupt something and thus trigger the looping BSODs, but I have no idea.

Again, sorry it's a long post, but I needed to explain everything here. Once again, THANK YOU IN ADVANCE TO ANYONE WITH ANY ADVICE AT ALL.


----------



## TUFinside

I would test the stability of RAM and CPU (are you sure the OC is stable? Test with no OC at all), and maybe do a clean install of drivers to make sure there is no conflict. Check your parameters in the BIOS.


----------



## kmac20

I have run HCI memtest for over 1000% all day while I was at work. Memory is rock solid at this speed and in general; not a single error. I also ran memtest86 at stock: no errors. So I have done hours and hours of memory testing.

I have run P95 for hours and hours, both at stock and at the current speed. No issues whatsoever. All parameters in the BIOS are fine; I have triple- and quintuple-checked.

Can't remember if I mentioned it in my original post, but I have tested all of these things thoroughly.

Benchmarks, even run back to back to back, have not triggered a crash or a problem, either at stock or at its current overclock (which I obviously used to determine a stable overclock). Right now the card is at factory defaults.

I have done a clean install of drivers, as this is a fresh install of Windows. So basically it was a fresh install of everything, with the newest GPU drivers (even newer than the ones GTA 5 had been crashing on).

I would like to try running GTA 5 again to see if I can trigger a crash once more, and see if maybe it was the old drivers, but I really don't want any downtime, nor to have to do a clean install of Windows for a third time in a week. Ideally I'd borrow the GTX 1060 I sold to my friend, but IDK if that's possible. I might also reinstall DOOM and see if I can trigger an issue there: I beat the entire DOOM campaign on the 1060 with no crashes, no errors, nothing, so if I start getting crashes in DOOM it would be another big clue that it's probably the GPU. But that's a ~50GB download that I'd then lose if it does trigger a crash and require a fresh Windows install.

Seems to be pretty unique, and I think it's related in some way to software, but it might also be hardware. Now that I think about it, I ran DOOM in Vulkan, so maybe it is specifically related to DX11: the only two games to trigger crashes were Civ 5 in DX10/11 mode and GTA 5 in DX11 mode. That makes me think it's a software or software+hardware issue, especially combined with the fact that when I was getting the BSOD loops, safe mode worked 100%.

It's obviously a combination of software and hardware, though. Perhaps some sort of error in the GPU triggers the BSODs when it crashes super hard and corrupts certain files as a result, which is why safe mode would work. I'm not what I'd call an 'expert' in these fields (I studied ECE my first year at uni but switched; I know quite a bit about hardware, but I really don't know software very well), so this is the only thing I can think of, and it's a very layman description because I don't know how else to describe it. After the BSOD loop was triggered, it would throw maybe 3 different kinds of blue screens: thread_exception errors, graphics driver errors, other stuff, etc., though after a while it would settle on just the thread_exceptions. But again, when I'd reboot into safe mode, not a single problem. It ran fine.

Thank you for your input though, I do appreciate any and all.


----------



## SavantStrike

Quote:


> Originally Posted by *PhatMuffinMan*
> 
> I still have my 970. Would I be able to SLi a 970 with my 1080 or is that just stupid? I mean the 1080 is nice, but we can always have more POWER!! lol


You can only SLI with a card from the same series and number, so 970 with 970, 1080 with 1080, etc. The most you can get away with is two different manufacturers but the same card.


----------



## kmac20

@PhatMuffinMan
Quote:


> Originally Posted by *SavantStrike*
> 
> You can only SLI with a card from the same series and number, so 970 with 970, 1080 with 1080, etc. The most you can get away with is two different manufacturers but the same card.


Just a quick reply unrelated to my own questions, but some DX12 games will (and a couple already do) allow heterogeneous multi-GPU, meaning you can use two GPUs that are not identical (DX12's explicit multi-adapter). This isn't really big yet, but if devs start coding games to allow it, it will be possible. I believe Vulkan also has this ability.

Ashes of the Singularity, I believe, allows this feature, so in that game you would be able to use your two cards together. While there are maybe only a handful of games that currently allow it, it will HOPEFULLY become possible in more and more games eventually. Hopefully being the key word.

Here's an article about it from PCWorld from about a year ago: https://www.pcworld.com/article/3036760/hardware/the-impossible-has-happened-radeon-and-geforce-come-together-in-directx-12.html There's more info out there if you search.

But otherwise this guy is accurate: SLI itself does not allow it. This is a feature of the APIs that depends on the game's coding. But then again, SLI depends on SLI profiles, so we'll see how popular it gets.


----------



## Sharchaster

Quote:


> Originally Posted by *kmac20*
> 
> I have ran HCI memtest for over 1000% all day while I was at work. Memory is rock solid at this speed and in general. Not a single error. I also ran memtest86 at stock, no errors. So I have done hours and hours of memory testing.
> 
> I have ran p95 for hours and hours both at stock and at the current speed. No issues whatsoever. All parameters in BIOS are fine. Have triple and quintuple checked.
> 
> Can't remember if I posted that or not in my post, but I have tested all of these things thoroughly.
> 
> Benchmarks even if I run them back to back to back have not triggered a crash or a problem. Either at stock or at its current overclock (which I obviously used to determine a stable overclock). Right now the card is at factory defaults.
> 
> I have done a clean install of drivers as this is a fresh install of windows. So basically it was a fresh install of everything. With the newest GPU drivers (even newer than the ones that GTA 5 had been crashing on).
> 
> I would like to try running GTA 5 again to see if I can trigger a crash once more, and see if maybe it was the old drivers, but I really don't want any downtime nor to have to do a clean install of Windows for a third time in a week. Ideally if I can borrow the GTX 1060 I sold to my friend I will, but IDK if thats possible. I might also reinstall DOOM and see if I can trigger an issue in there. I beat the entire DOOM campaign while I still had the 1060 and had no crashes no errors nothing. So if I start getting crashes in DOOM it would give me another big clue its probably the GPU. But thats like 50GB of a download that I'm gonna have to do, just to then lose if it does trigger a crash and require a fresh install.
> 
> Seems to be pretty unique and I think related in some way to software but might also be hardware. As again, beat the ENTIRE DOOM CAMPAIGN on the 1060 not a single crash. And thats......well now that I'm thinking about it, I ran that in Vulkan, so maybe it is specifically related to DX11....The only 2 games to trigger crashes were Civ 5 in DX10/11 mode and GTA 5 in DX11 mode. Makes me think its software or a software+hardware issue.
> 
> I should reinstall DOOM and run it in Vulkan and OpenGL for awhile see if theres any problems. But thats like a 50GB dl that if it does trigger a BSOD loop would just be wiped again when I need to do a fresh Windows install. Thinking more and more about it it seems DX11 is the only type of games that have triggered this. Which makes me really really think its software somehow. Which when combined with the fact that when I was getting the BSOD loops safe mode worked 100%.
> 
> But its obviously a combination of software and hardware. Perhaps some sort of error in the GPU is triggering the BSOD errors when it crashes super hard and corrupts certain files as a result. Which is why safe mode would work. I'm not what I'd call an 'expert' in these fields (I studied ECE originally first year at Uni but switched and while I know quite a bit about hardware, I really don't know software very well) so this is the only thing I can think of and its a very layman description because I dont know how else to describe it. But considering after the BSOD loop would be triggered, these BSOsD would be tossing out maybe 3 different kinds, of all sorts of things thread_exception errors, graphics driver errors, other stuff etc but after awhile it would just continue the thread_exceptions at the end. But again when I'd reboot into safemode not a single problem. Ran fine.
> 
> Thank you for your input though, I do appreciate any and all.


Your Realtek driver may be conflicting with the NVIDIA HD Audio driver; try reinstalling the driver without the "HD Audio Driver" component from NVIDIA.
Also, I wonder if your PSU is causing you all this trouble. Maybe it's starting to go faulty and needs an RMA.

And why only test with GTA 5? Do you have other games to test with, like BF1, BF4, Crysis 3, or maybe Shadow of War / Mordor, The Witcher 3, etc.?


----------



## IronAge

Any of you guys got a Strix 11GBPS and tried flashing the Strix XOC t4 Bios to it ?

I have tried quite a few NVflash versions and i have not been able to flash that Bios due to a board ID mismatch.

Also its not possible to increase the VDDC

I tried MSI AB 4.3.0/ 4.4.0 B19 / Asus GPUTweak and Settings just do not get applied - neither by Offset nor by a Curve with MSI AB.


----------



## stephenn82

I have a
Quote:


> Originally Posted by *Sharchaster*
> 
> Your Realtek driver may be conflicting with the NVIDIA HD Audio driver; try reinstalling the driver without the "HD Audio Driver" component from NVIDIA.
> Also, I wonder if your PSU is causing you all this trouble. Maybe it's starting to go faulty and needs an RMA.
> 
> And why only test with GTA 5? Do you have other games to test with, like BF1, BF4, Crysis 3, or maybe Shadow of War / Mordor, The Witcher 3, etc.?


1080 FTW Hybrid and a Z170 Realtek driver. I have never installed the HD Audio driver; try that. And don't use XOC, try using AB. Flip your BIOS switch towards the front of the case while the PC is off: it gives better fan curves and slightly better overclock settings. I run my card at 2126/5454 rock solid at 43C max. I have the AIO on it, but it never hits any limits. Make sure to flatten everything past the 1093mV point on the curve to the same setting, i.e. the same +100MHz at all voltage levels at 1063mV, 1085mV, 1093mV, and so on.
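For anyone unsure what "flattening the curve" means in practice, here is a minimal sketch in Python. The curve points below are hypothetical examples, not real readings; the actual values come from your own card's Afterburner voltage/frequency curve.

```python
def flatten_curve(points, pivot_mv=1093):
    """Clamp every point at or above pivot_mv to the frequency at pivot_mv.

    points: list of (millivolts, mhz) pairs, sorted by voltage,
    like the nodes on MSI Afterburner's voltage/frequency curve editor.
    """
    # Frequency the curve reaches at the pivot voltage (curve is monotonic).
    pivot_mhz = max(mhz for mv, mhz in points if mv <= pivot_mv)
    return [(mv, min(mhz, pivot_mhz)) for mv, mhz in points]

# Hypothetical curve: without flattening, GPU Boost may try the >1093mV
# points, hit the voltage limit, and soft-throttle down a bin or two.
curve = [(1000, 2000), (1050, 2050), (1093, 2126), (1100, 2139), (1131, 2151)]
print(flatten_curve(curve))
# → [(1000, 2000), (1050, 2050), (1093, 2126), (1100, 2126), (1131, 2126)]
```

With the curve flattened, the card never requests a higher bin than the one at the pivot voltage, so it can't step down from a bin it was never going to sustain.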


----------



## kmac20

It is a brand new Seasonic Focus Gold 850W that I won in the contest on this site. It has no issues whatsoever, and I doubt very much I got a bad Seasonic from the reps themselves. I am 99% sure it's not this brand new Seasonic, since it is displaying no signs of a bad PSU (I've had that happen before, and even had a PSU surge and fry my PCIe slots).

That's a really good idea about the NVIDIA audio and Realtek drivers. I have the NVIDIA audio driver installed just because I have speakers on my monitor, but I've been using an (old) surround sound setup. That's a really good idea, thank you.

I have some other games to test it with, but none of the ones you've listed. I just mention GTA 5 because it seems to be the fastest/easiest way to trigger the problem, and since I would like to play the game I spent $35 on and haven't beaten yet, it's the one I've focused on. Within a few hours after the last install I was able to make the problem pop up again. With Civ 5 it took almost all day.

I don't have a lot of newer games. I will say Shadow of MORDOR (not War) has not had a single issue either, and that's a pretty stressful game. Does anyone know if that's DX11? I have played a lot of that game on both the 1060 and the newer 1080 and never had a single crash.

I'm gonna uninstall the NVIDIA audio component right now. That seems like a very good idea. Thank you.

I use XOC because it's an iCX card, meaning it has certain features that are only available in XOC. This all happens with everything at stock and XOC not even open, so I'm not worried about the overclocking program itself; it's definitely something else causing this, as I've had it happen with the program closed and at defaults. I'm not 100% sure, but I believe I even triggered the problem after the first Windows reinstall without XOC or AB or anything installed. But I'm forced to use XOC anyway if I want certain features. Once again, everything at stock, everything left alone.

I just removed the NVIDIA audio component. We'll see how it goes from here. Thank you both; this seems like a very likely candidate, although I'm not a software guy, so who knows except the future.


----------



## sinholueiro

Quote:


> Originally Posted by *stephenn82*
> 
> I have a
> 1080 FTW Hybrid and a Z170 Realtek driver. I have never installed the HD Audio driver; try that. And don't use XOC, try using AB. Flip your BIOS switch towards the front of the case while the PC is off: it gives better fan curves and slightly better overclock settings. I run my card at 2126/5454 rock solid at 43C max. I have the AIO on it, but it never hits any limits. Make sure to flatten everything past the 1093mV point on the curve to the same setting, i.e. the same +100MHz at all voltage levels at 1063mV, 1085mV, 1093mV, and so on.


Why is it important to make sure that 1.093V and above are always at the same speed? Also, I think 1.093V is the maximum my 1080 is using.


----------



## Hequaqua

Have you tried an older driver, and used DDU to completely remove the old one?

You can install just the bare minimum NVIDIA components.

After unpacking the driver, delete everything else in the folder except these:


----------



## kmac20

I originally did use an older driver. The same driver that the 1060 used. It crashed on that. So then there was a new driver released the very day that I did the 2nd installation of Windows and I've been using that since. Have not run GTA 5 yet as I didn't want any downtime.

I use DDU in safe mode when I need to completely remove everything.

I'm gonna try it without the Audio component later and see what happens. I appreciate all the tips and help. I'll get back to you guys when I give it a whirl later today or tomorrow.

Thank you so much everyone and any more input is still greatly appreciated.


----------



## stephenn82

Quote:


> Originally Posted by *sinholueiro*
> 
> Why is it important to make sure that 1.093V and above are always at the same speed? Also, I think 1.093V is the maximum my 1080 is using.


1093mV is the max any 1080 will go. You give 100% on the voltage slider and set your clocks.

Do you ever see the voltage limit or temp limit flags when gaming? The voltage limit is what you will see 90% of the time, and it drops clocks. If you flatten your curve to match at 1093mV and above, you won't hit a limit. It's all over this thread.

Also, I wouldn't use a 1060 driver on a 1080.


----------



## Sharchaster

I'm starting to get confused about my card. It never stays steady at 1.093V when playing a game; it just fluctuates like crazy, sometimes dropping to 0.993V, LOL...

I just want to run this card at 2100MHz at a decent voltage. Is that so hard? Because I know my card can do this if I get the right curve/voltage.

Maybe the power limit is limiting me. Arrggh MSI, why why why? LOL...


----------



## SavantStrike

Quote:


> Originally Posted by *Sharchaster*
> 
> I'm starting to get confused about my card. It never stays steady at 1.093V when playing a game; it just fluctuates like crazy, sometimes dropping to 0.993V, LOL...
> 
> I just want to run this card at 2100MHz at a decent voltage. Is that so hard? Because I know my card can do this if I get the right curve/voltage.
> 
> Maybe the power limit is limiting me. Arrggh MSI, why why why? LOL...


The more voltage you give it the harder it will throttle.

The power limits for most of the 1080 cards aren't very generous.

Sent from my ZTE Axon 7 Resurrection Remix.


----------



## BeeDeeEff

Quote:


> Originally Posted by *BeeDeeEff*
> 
> I'm just getting a b.a. aftermarket air cooler, and gonna slap two 120mm Noctua fans on it. Already ordered a Raijintek Morpheus II and the fans I'll need.
> 
> Will update with the results, and thanks for the ideas.


Got my cooler, took off the reference cooler, applied the mini-heatsinks to the board as required, and then found it didn't come with an adapter to plug my fans into the mini 4-pin fan header on the board. Now I have to wait till Tuesday for one to be shipped so I can plug my 120mm Noctua PWM fans into it.

Really bummed something like that didn't come with the aftermarket VGA cooler.


----------



## xartic1

Quote:


> Originally Posted by *BeeDeeEff*
> 
> Got my cooler, took off reference cooler, applied mini-heatsinks to the board as required, and it didn't come with an adapter to plug my fans into the mini-4-pin fan header on the board. Now I have to wait till tuesday for one to be shipped so I can plug my 120mm noctua pwm fans into it.
> 
> Really bummed something like that didn't come with the aftermarket VGA cooler.


Not all cards are the same or require those heatsinks. My MK26 did come with heatsinks, but I didn't need them because the HOF series uses a separate heatsink to cool the VRMs and VRAM.


----------



## BeeDeeEff

Quote:


> Originally Posted by *xartic1*
> 
> Not all cards are the same or require those heatsinks. My MK26 did come with heatsinks, but I didn't need them because the HOF series uses a separate heatsink to cool the VRMs and VRAM.


Yup, that's the price I paid for getting a reference card at launch. Founder's Edition.

Edit:
Got the Raijintek Morpheus II cooler installed and running on my reference Founder's Edition GTX 1080.

Controlling the fan speed for now from my secondary CPU PWM fan header with manual software control. Under 100% load (mining), with my two 120mm Noctua PWM fans running at half speed, it holds at 55C (22C ambient) with no thermal throttling of the clocks.

HUGE IMPROVEMENT, and very quiet. With the fans off it stays around 35C at idle.

Time to look into overclocking this thing.


----------



## sinholueiro

Quote:


> Originally Posted by *stephenn82*
> 
> 1093mV is the max any 1080 will go. You give 100% on the voltage slider and set your clocks.
> 
> Do you ever see the voltage limit or temp limit flags when gaming? The voltage limit is what you will see 90% of the time, and it drops clocks. If you flatten your curve to match at 1093mV and above, you won't hit a limit. It's all over this thread.
> 
> Also, I wouldn't use a 1060 driver on a 1080.


Precisely: if you'll never reach that frequency because you'll never use that voltage (higher than 1.093V), why do you have to flatten? It doesn't matter what value you have there, does it?


----------



## thomjak

So I made the jump a week ago from my old AMD 280X to a Gigabyte GeForce GTX 1080 Windforce OC. I had a discount coupon so I got the card for a good price, and I have it stable at +175 core / +500 mem.

I was waiting for the 2080 but decided to just get the 1080 and upgrade if the 2080 kicks ass. A bit late to the 1080 upgrade game, but I haven't played heavy AAA games in 2 years. Man, what an upgrade.


----------



## stephenn82

Quote:


> Originally Posted by *sinholueiro*
> 
> Precisely, if you won't use that frecuency because you won't use that voltage (higher than 1.093v), why you have to flatten? It doesn't matter what value you have there, does it?


You will hit the limit. My card would walk down a stepping and run 2112 instead of 2124 or whatever. Sometimes it would step down two. Flattening the curve prevents a soft throttle like that.


----------



## Sharchaster

Quote:


> Originally Posted by *thomjak*
> 
> So I made the jump a week ago from my old AMD 280X to a Gigabyte GeForce GTX 1080 Windforce OC. I had a discount coupon so I got the card for a good price, and I have it stable at +175 core / +500 mem.
> 
> I was waiting for the 2080 but decided to just get the 1080 and upgrade if the 2080 kicks ass. A bit late to the 1080 upgrade game, but I haven't played heavy AAA games in 2 years. Man, what an upgrade.


Welcome to the club, and enjoy the card. Same as me: a little late on the upgrade, but I've enjoyed the card very much, except for the power limit.


----------



## Gdourado

Just got my 1080 11gbps. What a beast of a card!

Now to see what it can do.


----------



## AngryGoldfish

Quote:


> Originally Posted by *Gdourado*
> 
> Just got my 1080 11gbps. What a beast of a card!
> 
> Now to see what it can do.


Very nice! I've been able to overclock my 11Gbps memory to 12,000MHz without any issues. I could get it slightly higher, but I'm happy at 12Gbps.

Quote:


> Originally Posted by *thomjak*
> 
> So i made the jump a week ago from my old amd 280x to Gigabyte GeForce GTX 1080 Windforce OC. I had a discount coupon so got the card for a good price, i have it stable at 175core/500mem.
> 
> I was waiting for 2080 but decided to just get the 1080 and upgrade if 2080 kicks ass. Abit late to the 1080 upgrade game but i have not played heavy AAA games in 2 years. But man what an upgrade.


Same. I'm upgrading late to the 1080 party because I was waiting for Vega. When Vega took too long and only offered a very hot GPU, I jumped back to Nvidia. The card is very cool and quiet even in a small case. It's excellent. I'll upgrade next Summer when the 2080 comes out, or if it's not that amazing I'll wait for Navi and go back to AMD again.


----------



## sirleeofroy

Hey Guys, just a quick one.....

Does anyone have any first hand experience with the following GPU cooler - Alphacool Eiswolf 120 GPX Pro Nvidia Geforce GTX 1080

I'm looking at getting one of these for my Gainward GTX 1080 GLH as I run 4K most of the time which the card does a decent job at but to keep my OC as effective as possible, I have to run the fans at full tilt which is far noisier than I would like.

I also have a few Alphacool parts already, as I have a semi-custom loop built out of an Alphacool AIO and additional radiators. So this could either add to my loop for a total of a 280mm rad, a 120mm rad, and 2 AIO pumps (CPU + GPU), or... separate cooling for the CPU using the 280mm rad and separate cooling for the GPU using the new AIO.

Edit: The 280mm rad is 60mm thick and the 120mm is 40mm thick if that helps.....

What are your thoughts guys?

Thanks


----------



## Roy360

Purchased this card recently: https://www.amazon.ca/gp/aw/d/B01GCAVSIO

I'm wondering if there will be any problems adding it to my loop.

I don't see any issues, but would any problems arise from using different tubing material? (I'm using EK ZMT.)

I think metal-wise I'm fine, as I'm using an EK Supremacy block.


----------



## jon666

I'm not sure it will matter so long as you flush everything out beforehand. Not that I am very experienced; I'm only on my second watercooling build. Picked up the MSI 1080 with the preinstalled EK waterblock and have been loving it. Shame my 390 died... it had some grunt to it, but the 1080 has yet to disappoint at 1080p while streaming on occasion. Being watercooled, all I had to do was increase the voltage and power limits just a tad to easily stay above the 2GHz mark constantly. I have yet to truly benchmark it since I have a few LANs to attend before abusing my PC, and I've been trying not to mess with RAM clocks for that reason. There is nothing worse than almost-stable clocks to ruin marathon gaming.


----------



## Imprezzion

Just bought myself a super cheap secondhand MSI 1080 FE with an Accelero Twin Turbo III. Upgraded it with VRM cooling in the form of a Gelid 1070/1080 Enhancement Kit and a whole bunch of Arctic heatsinks for the RAM.

Stays under 50C load at all times.

Problem is... the terrible, terrible FE power limit. It boosts to about 1911MHz, but I can't clock it any higher since it will just bounce off the limiter like a madman in, say, 3DMark Fire Strike or Superposition, so I can't even stress test it properly since it fluctuates wildly.

I have seen the shunt mod, and I do have some Liquid Ultra left over that I can use, but... I want a more permanent, maintenance-free mod. Can't I just solder a piece of wire across the shunt, or drop a ball of solder on the resistor, instead of using Liquid Ultra?

I have been experimenting with a whole bunch of random BIOSes with higher limits as well, and they do help, but not to the point of solving all throttling. I tried, for example, the EVGA FTW BIOS, MSI Armor (240W), and MSI Gaming X (270W), but they all still seem to hit 110-115% in MSI AB power monitoring, even though the limit is almost 80W higher in the BIOS.

So, is it a good idea to shunt mod it? I don't really intend to run higher voltages per se; I just want a stable card with no fluctuations in clocks under load, so I can set a solid offset and not have to worry about GPU Boost messing with it. Like, 1.0625V to 1.09V would be plenty, I think.
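For context on why the shunt mod fools the power limit: the controller infers current from the voltage drop across a shunt of known resistance, so lowering the effective resistance (liquid metal, solder, or a parallel resistor) makes it under-read power proportionally. Here is a rough sketch; the resistance values are hypothetical (Pascal boards are commonly cited as using 5 mΩ shunts, but check your own card):

```python
def reported_power(actual_watts, r_original_mohm, r_effective_mohm):
    """Power the controller thinks the card draws after a shunt mod.

    The controller computes I = V_drop / R_assumed using the original
    resistance, so its reading scales with R_effective / R_original.
    """
    return actual_watts * (r_effective_mohm / r_original_mohm)

# Halving the shunt resistance halves the reading: a card actually
# pulling 230W reports only 115W, so it stops bouncing off the limiter.
print(reported_power(230, 5.0, 2.5))  # → 115.0
```

This is also why a solid wire or solder blob across the shunt is risky: it drives the sensed drop toward zero, effectively removing the power limit entirely rather than raising it by a controlled amount.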


----------



## stephenn82

This may be a little off topic, but still in line with the 1080 series.

What is the difference between the EK GTX 1080 FTW and FTW2 blocks?

The original one (I can get this at Micro Center in open-box form for 63 bucks right now):
https://www.ekwb.com/shop/ek-fc1080-gtx-ftw-acetal-nickel

and the new FTW2 for the iCX-cooled cards:
https://www.ekwb.com/shop/ek-fc1080-gtx-ftw2-acetal-nickel

Any real differences between them? I found an article on Tom's about EK thinking they could make the block better, but is there any reason to spend twice as much on the new one?
http://www.tomshardware.com/news/ekwb-evga-ftw-water-block,32631.html

I do plan on taking the Hybrid cooler off my FTW and doing a straight custom loop. Too many fans, hoses, cables, etc. flying around my case looking like spaghetti lately. It takes a lot of time to tuck everything in so it's not visible to the eye.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> This may be a little off topic, but still in line with the 1080 series.
> 
> What is the difference between the EK GTX 1080 FTW and FTW2 blocks?
> 
> The original one (I can get this at Micro Center in open-box form for 63 bucks right now):
> https://www.ekwb.com/shop/ek-fc1080-gtx-ftw-acetal-nickel
> 
> and the new FTW2 for the iCX-cooled cards:
> https://www.ekwb.com/shop/ek-fc1080-gtx-ftw2-acetal-nickel
> 
> Any real differences between them? I found an article on Tom's about EK thinking they could make the block better, but is there any reason to spend twice as much on the new one?
> http://www.tomshardware.com/news/ekwb-evga-ftw-water-block,32631.html
> 
> I do plan on taking the Hybrid cooler off my FTW and doing a straight custom loop. Too many fans, hoses, cables, etc. flying around my case looking like spaghetti lately. It takes a lot of time to tuck everything in so it's not visible to the eye.


The original block doesn't have the section cut out for the LED leads. Other than that, they're exactly the same. But because of it, you can't use the original block on the FTW2 GPU.

If I recall correctly.....it's been a while since I was looking into this.


----------



## Gdourado

I am curious.
I have tuned my 1080 to 2100 on the core, and the memory is at the stock 11Gbps.
How much faster is a 1080 Ti?
What levels can a similar Aorus 1080 Ti with a similar cooler usually overclock to?

Cheers


----------



## SavantStrike

Quote:


> Originally Posted by *Gdourado*
> 
> I am curious.
> I have tuned my 1080 to 2100 core and memory is at the stock 11gbps.
> How much faster is a 1080 Ti?
> A similar aorus 1080ti with a similar cooler, usually is able to overclock to what levels?
> 
> Cheers


Less than 2100 unless you're under water; the Ti is a much hotter-running die. The Aorus cards can do 2025-2050 without too much trouble on air, though.

As for performance, another 20 to 30 percent, and a bit more at ultra-high resolutions.
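The memory side of that gap is easy to put numbers on. Going by the public specs (GTX 1080: 256-bit bus at 10 or 11Gbps GDDR5X; 1080 Ti: 352-bit bus at 11Gbps), peak bandwidth is just the per-pin rate times the bus width:

```python
def bandwidth_gb_s(rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width."""
    return rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(10, 256))  # GTX 1080         → 320.0 GB/s
print(bandwidth_gb_s(11, 256))  # GTX 1080 11Gbps  → 352.0 GB/s
print(bandwidth_gb_s(11, 352))  # GTX 1080 Ti      → 484.0 GB/s
```

The wider bus is why the Ti pulls further ahead at high resolutions, where bandwidth matters more than core clock.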


----------



## yuppicide

I bought the 1080 close to when it came out.. I have the founders edition. This is the first "top of the line" video card I've ever owned. I usually get something good, but not great.. or I buy my friends old card for $100 when he upgrades, but unfortunately he no longer does that.


----------



## ucode

@yuppicide Now your turn to upgrade and sell your friend your GTX1080 for $100


----------



## BeeDeeEff

Quote:


> Originally Posted by *BeeDeeEff*
> 
> Got my cooler, took off reference cooler, applied mini-heatsinks to the board as required, and it didn't come with an adapter to plug my fans into the mini-4-pin fan header on the board. Now I have to wait till tuesday for one to be shipped so I can plug my 120mm noctua pwm fans into it.
> 
> Really bummed something like that didn't come with the aftermarket VGA cooler.


Now with pics:


----------



## Imprezzion

Seems like my FE can make about 2050MHz core and about 5400MHz VRAM without throttling. Any higher on the memory and it will randomly crash in The Division. Any higher on the core and it's hard to test 100% stability, since it starts to throttle slightly to lower clocks. I'm hitting about 115-120% power limit now in BF1 and The Division, for example.

I'm running an Arctic Accelero Twin Turbo III with a Gelid GTX 1070/1080 Enhancement Kit for the VRMs, a whole bunch of Arctic heatsinks for the VRAM, and Prolimatech PK-3 on the core.
The card stays unbelievably cool. I haven't seen it hit 50C yet under ANY load; it usually hovers at 47-48C (18C ambient), idles at 21C at all times, and runs at 24C under hardware-acceleration load in Chrome / MPC-HC.

Problem is, I'm still running 1.050V (+0mV), since running +100mV (for 1.09V) only results in hard throttling back down to 1.050-1.062V, so it won't run any faster. I don't really want to power mod it with the shunt mod, since this rig goes to LANs a lot in the back of my car and I don't want it to run off the resistor and cause havoc. I could just straight up solder over the resistor, but I doubt the card can handle full power at 1.092V with an unlimited power limit on a single 8-pin.

I am looking for a Founders Edition cooler though, since I got this card secondhand without the Founders cooler and I really want to have it to mod it.
Does anyone in the EU have a Founders cooler, or does anyone outside the EU have a decent quote for sending me one?


----------



## nolive721

Hello. I am about to pull the trigger on my first GTX 1080 card to run my 3x 1080p monitor setup.

I was keen to try the ZOTAC Mini variant, but bumped into this baby on Amazon Japan:

https://www.amazon.co.jp/gp/product/B071NHWS7V/ref=od_aui_detailpages00?ie=UTF8&psc=1

I can't seem to find any review of what seems to be a higher-clocked-memory SKU of the AMP Extreme released last year. Is there any owner of the card here? Or could someone point me to an actual review, to see the benefit of this extra memory performance in real-life terms?

The price gap to the Mini is not that big here in Japan (just 40USD more), and I can use Amazon's refund system if it doesn't deliver, so the risk isn't big. I just needed some confirmation that it's the right move to make now. (I decided to leave the 1070 Ti alone because US prices don't give me a good feeling for what it would cost here in Japan.)

Thanks in advance

Olivier


----------



## nolive721

and for the non japanese speakers who have opened the link...

vd6345, Japan product guarantee 1 year, ZTGTX1080-8GD5XAMPEX+/ZT-P10800I-10P, authorized distributor
Equipped with the NVIDIA GeForce GTX 1080
Equipped with ZOTAC's exclusive "IceStorm" cooler
Two-ply fan blade construction on the "EKO fan"
Base clock: 1,771MHz
Boost clock: 1,911MHz
Memory clock: 11,000MHz

and yes, you have read it right, PC stuff are warranted for just 1 year in this lovely country.......


----------



## steveTA1983

I have a G701VI laptop with the 1080, and I lost the ability to set a custom fan curve a few driver updates ago. I had it so that the fan would go full blast at 80C. Now when gaming it goes up to 91C (never above), but it starts to kill frame rates due to throttling. Is there anything I can do to regain control using Afterburner or PX? Rolling back drivers won't work, since some games require a newer driver. Did NVIDIA pull fan control support on purpose?


----------



## Gen Patton

Heat causes it to crash. Make sure you have a lot of cooling going to the GPU.


----------



## nolive721

Quote:


> Originally Posted by *nolive721*
> 
> and for the non japanese speakers who have opened the link...
> 
> vd6345, Japan product guarantee 1 year, ZTGTX1080-8GD5XAMPEX+/ZT-P10800I-10P, authorized distributor
> Equipped with the NVIDIA GeForce GTX 1080
> Equipped with ZOTAC's exclusive "IceStorm" cooler
> Two-ply fan blade construction on the "EKO fan"
> Base clock: 1,771MHz
> Boost clock: 1,911MHz
> Memory clock: 11,000MHz
> 
> and yes, you have read it right, PC stuff are warranted for just 1 year in this lovely country.......


Not much traction here, but I guess that's because this card SKU is not sold in the US?

Anyway, I decided to take the plunge and order it. It's coming tomorrow, so I will use the Japanese holiday weekend to install and run it.

I'm a bit worried about sag, considering how heavy the card is and that my PCIe slots are not reinforced on my PRO4 mobo, so I might do some DIY as well to support it better.


----------



## SavantStrike

Quote:


> Originally Posted by *nolive721*
> 
> Not much traction here, but I guess that's because this card SKU is not sold in the US?
> 
> Anyway, I decided to take the plunge and order it. It's coming tomorrow, so I will use the Japanese holiday weekend to install and run it.
> 
> I'm a bit worried about sag, considering how heavy the card is and that my PCIe slots are not reinforced on my PRO4 mobo, so I might do some DIY as well to support it better.


Oh wait, I've seen that model here in the US. It's miles beyond the Mini. The Mini needs water to keep from throttling, so you made a good choice for GPU core speed alone.

The 11Gbps memory will get you another 5-8 percent performance over a normal 1080; this review should have the data you're looking for. I've also gotten to play with the Aorus 11Gbps 1080 and can confirm there's a benefit from the faster RAM.

I would say the card you picked is a good one. Easily top three among GTX 1080 cards, since I'm not a Strix fan and only a few vendors did 11Gbps models.


----------



## Jorginto

Guys, could you check out my Firestrike score:

https://www.3dmark.com/3dm/23042774?

Could you compare your GPU score at similar clocks? I'm just wondering if it's worth it to push it so far...


----------



## thomjak

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, could you check out my Firestrike score:
> 
> https://www.3dmark.com/3dm/23042774?
> 
> Could you compare your GPU score at similar clocks? I'm just wondering if it's worth it to push it so far...


Here is mine https://www.3dmark.com/3dm/23052556?

Nvidia driver 388.00
6700K @ 4.7GHz
DDR4 3000MHz Memory
Gigabyte GTX 1080 175core/500mem.

Looks like it makes a good difference running it at 2215MHz


----------






## nolive721

Quote:


> Originally Posted by *SavantStrike*
> 
> Oh wait, I've seen that model here in the US. It's miles beyond the Mini. The Mini needs water to keep from throttling, so you made a good choice for GPU core speed alone.
> 
> The 11Gbps memory will get you another 5-8 percent performance over a normal 1080; this review should have the data you're looking for. I've also gotten to play with the Aorus 11Gbps 1080 and can confirm there's a benefit from the faster RAM.
> 
> I would say the card you picked is a good one. Easily top three among GTX 1080 cards, since I'm not a Strix fan and only a few vendors did 11Gbps models.


Thanks.

I've been running a few of my favourite games, and I'm happy to reach above 60fps with the card at stock core/memory, with all settings maxed out, in Crysis 3, Project CARS 1 and 2, and Assetto Corsa, in a whisper-quiet environment.

Now, the Heaven benchmark at 1080p was giving me just 110fps and 2500pts until I raised the power and temp limits and applied 100% core voltage.

The card would then boost stable at around 2060MHz, and with the memory OCed to 12,000MHz it would give me 121fps average and 3050pts in the same benchmark.

I've been looking briefly through this thread, and it seems these kinds of scores are pretty much the norm for most OCed cards, right?

So I'm happy with my purchase, but not over the moon. Maybe my expectations were too high, or I haven't played enough with the voltage, as it looks like that's the factor making the card throttle.


----------



## Derek1

Quote:


> Originally Posted by *nolive721*
> 
> Thanks.
> 
> I've been running a few of my favourite games, and I'm happy to reach above 60fps with the card at stock core/memory, with all settings maxed out, in Crysis 3, Project CARS 1 and 2, and Assetto Corsa, in a whisper-quiet environment.
> 
> Now, the Heaven benchmark at 1080p was giving me just 110fps and 2500pts until I raised the power and temp limits and applied 100% core voltage.
> 
> The card would then boost stable at around 2060MHz, and with the memory OCed to 12,000MHz it would give me 121fps average and 3050pts in the same benchmark.
> 
> I've been looking briefly through this thread, and it seems these kinds of scores are pretty much the norm for most OCed cards, right?
> 
> So I'm happy with my purchase, but not over the moon. Maybe my expectations were too high, or I haven't played enough with the voltage, as it looks like that's the factor making the card throttle.


Here is you and me compared when I was running a single 1080.

https://www.3dmark.com/compare/fs/11614301/fs/14023617#

Graphics score is within 30 points. My core was 2139 with a slightly higher memory OC.
If you are gonna OC to 2200, then you need to make sure your temps are as low as possible to get the max performance. Actually, it doesn't matter what the OC is; temps are the primary thing to control for. You need to keep the card as cool as possible.


----------



## Vellinious

Looks like it's about right where it should be
Quote:


> Originally Posted by *Derek1*
> 
> Here is you and me compared when I was running a single 1080.
> 
> https://www.3dmark.com/compare/fs/11614301/fs/14023617#
> 
> Graphics score is within 30 points. My core was 2139 with a slightly higher mem oc.
> If you are gonna oc to 2200 then you need to make sure your Temps are as low as possible to get the max performance. Actually it doesn't matter what the oc is , Temps are the primary thing to control for. You need to keep it as cool as possible.


I completely agree. I've found that 2200+ works best when core temps are in the single digits... I shoot for 7C or lower in order to get good runs at 2252.


----------



## nolive721

Quote:


> Originally Posted by *Derek1*
> 
> Here is you and me compared when I was running a single 1080.
> 
> https://www.3dmark.com/compare/fs/11614301/fs/14023617#
> 
> Graphics score is within 30 points. My core was 2139 with a slightly higher mem oc.
> If you are gonna oc to 2200 then you need to make sure your Temps are as low as possible to get the max performance. Actually it doesn't matter what the oc is , Temps are the primary thing to control for. You need to keep it as cool as possible.


Nope, wasn't me, but this is where I sit, right up there with my 1500X. Not bad at all.

https://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/2229/1085/500000?minScore=0&cpuName=AMD Ryzen 5 1500X&gpuName=NVIDIA GeForce GTX 1080


----------



## Derek1

Quote:


> Originally Posted by *nolive721*
> 
> Nope, wasn't me, but this is where I sit, right up there with my 1500X. Not bad at all.
> 
> https://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/2229/1085/500000?minScore=0&cpuName=AMD Ryzen 5 1500X&gpuName=NVIDIA GeForce GTX 1080


Oops, sorry about that. I was supposed to quote Jorginto.


----------



## Derek1

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, could you chek out my furestrike score:
> 
> https://www.3dmark.com/3dm/23042774?
> 
> Could you compare your GPU score at similiar clocks? I'm just wondering if it's worth it to push it so far...


Here is you and me compared when I was running a single 1080.

https://www.3dmark.com/compare/fs/11614301/fs/14023617#

Graphics score is within 30 points. My core was 2139 with a slightly higher memory OC.
If you are gonna OC to 2200, then you need to make sure your temps are as low as possible to get the max performance. Actually, it doesn't matter what the OC is; temps are the primary thing to control for. You need to keep the card as cool as possible.


----------



## Derek1

Quote:


> Originally Posted by *nolive721*
> 
> Nope, wasn't me, but this is where I sit, right up there with my 1500X. Not bad at all.
> 
> https://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/2229/1085/500000?minScore=0&cpuName=AMD Ryzen 5 1500X&gpuName=NVIDIA GeForce GTX 1080


Here's you and me on single card....

https://www.3dmark.com/compare/fs/14035105/fs/11715414

and in SLi...

https://www.3dmark.com/compare/fs/13949492/fs/12681791#


----------



## nolive721

No worries

Trying to break the 2100MHz stable boost barrier, but no joy so far.


----------



## nolive721

And your i7 is a beast against my 1500X, but our graphics scores are relatively close indeed.

I haven't done this in the past, but I heard that setting a clock vs. voltage curve in MSI AB helps maintain a high boost.

I will look at some tutorials and give it a try this weekend.


----------



## Vellinious

My best run in SLI. This run was done with the coolant at -8c.

https://www.3dmark.com/fs/11379496


----------



## nolive721

I've been playing with the MSI AB voltage curve and managed to get boost stable above 2100MHz in Firestrike and Heaven.

But my scores are not improving at all. Does that make sense? The card doesn't seem to throttle, from what I checked in the monitoring tool.


----------



## Derek1

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *nolive721*
> 
> I've been playing with the MSI AB voltage curve and managed to get boost stable above 2100MHz in Firestrike and Heaven.
> 
> But my scores are not improving at all. Does that make sense? The card doesn't seem to throttle, from what I checked in the monitoring tool.






I'm just speculating here, but I think temps are the issue.
You haven't said what you are using for cooling.
I know from my own trials with a single card that it took some fine tuning to find its sweet spot on the OC, both core and memory, before I could finally break 25000 on the graphics score.
I have an EVGA FTW 1080 Hybrid, and my temps during runs, if memory serves, were around 40-43C.
It may also be the silicon. You may just have maxed out.

BUT, if you look at Vellinious' SLI run in the post before yours, his graphics score is 6000 points higher than mine. My temps were again in the low 40s; his were -8C.
I'm not sure what his clocks were, as the page sometimes has a bug and doesn't report them accurately, but I believe he was around 2200. He can confirm that for you. My runs were at 2152. So the ~50MHz core difference between us is not why he scored so much higher; it was the fact that he was sub-zero. TEMPS are the main thing governing performance with Pascal. A higher clock does not guarantee higher performance, however you measure it (graphics score, fps, whatever). Vellinious will also confirm that his best performance came at a lower clock than he was capable of, but with a better temps/voltage/clock combination using AB's curve method of OCing.

I would say, though, that if you are getting close to 25k on your graphics score with a stable OC of 2100-2139, you have an above-average card and should be pleased.
Keep fine tuning it though; find its sweet spot.
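
To illustrate the temps-govern-clocks point, here's a toy sketch of how GPU Boost 3.0 sheds clock bins as the core warms up. The bin size, step, and breakpoints below are illustrative guesses, not NVIDIA's actual tables, which vary per card:

```python
def boost_clock_mhz(max_clock, temp_c,
                    first_step_c=37.0, step_c=5.0, bin_mhz=13.0):
    """Toy model of Pascal thermal throttling: Boost 3.0 drops the clock
    one ~13 MHz bin at a time as core temperature rises. Every breakpoint
    here is an illustrative guess, not a value from NVIDIA."""
    if temp_c < first_step_c:
        return float(max_clock)          # a cold card holds its full boost
    bins_dropped = int((temp_c - first_step_c) // step_c) + 1
    return max_clock - bins_dropped * bin_mhz

print(boost_clock_mhz(2152, 7))    # sub-ambient run: full 2152.0 held
print(boost_clock_mhz(2152, 43))   # low-40s: two bins lost -> 2126.0
```

The shape, not the exact numbers, is the point: past the first breakpoint, every few degrees costs another bin, which is why sub-zero coolant buys score that a small extra core offset can't.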


----------



## nolive721

Thanks for the detailed comments. I am using the card's triple-fan cooler; as mentioned earlier in this thread, it's the Zotac Extreme AMP+ variant.

My previous results were with fans at 60%, and I was hitting 61-62C during the Firestrike run.

With fans pushed to 80%, temps drop to 53-54C, and indeed my graphics score increased accordingly, getting close to 25000 points. But oh boy, that thing was loud.

I might give it another whirl just for fun by pushing the clocks further with the fans at 100%, but yes, the card is a good one.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> 
> I'm just speculating here, but I think temps are the issue.
> You haven't said what you are using for cooling.
> I know from my own trials with a single card that it took some fine tuning to find its sweet spot on the OC, both core and memory, before I could finally break 25000 on the graphics score.
> I have an EVGA FTW 1080 Hybrid, and my temps during runs, if memory serves, were around 40-43C.
> It may also be the silicon. You may just have maxed out.
> 
> BUT, if you look at Vellinious' SLI run in the post before yours, his graphics score is 6000 points higher than mine. My temps were again in the low 40s; his were -8C.
> I'm not sure what his clocks were, as the page sometimes has a bug and doesn't report them accurately, but I believe he was around 2200. He can confirm that for you. My runs were at 2152. So the ~50MHz core difference between us is not why he scored so much higher; it was the fact that he was sub-zero. TEMPS are the main thing governing performance with Pascal. A higher clock does not guarantee higher performance, however you measure it (graphics score, fps, whatever). Vellinious will also confirm that his best performance came at a lower clock than he was capable of, but with a better temps/voltage/clock combination using AB's curve method of OCing.
> 
> I would say, though, that if you are getting close to 25k on your graphics score with a stable OC of 2100-2139, you have an above-average card and should be pleased.
> Keep fine tuning it though; find its sweet spot.


Agreed....gotta lower the temps.


----------



## nolive721

Decent score, I presume?


----------



## nolive721

Who said Pascal GPU OCing wasn't fun?

I have kept pushing the memory clock, now running at 12200MHz, with further improvements in Heaven and Superposition and no crashes in games.

Just happy; the card is a keeper.


----------



## Vellinious

Boost 3.0 made it more difficult for overclockers to get clocks without lowering temps to below ambient. It's still fun, ya just have to work within the framework that NVIDIA outlined. Memory overclocking on a GPU, however, doesn't impress me at all. I barely even register that as "overclocking".


----------



## nolive721

The core boosts stable at 2138MHz with an MSI AB set curve, but I felt it was nice to share how far the memory clock can be pushed on this particular SKU.

I am air cooling the card and have no plans to watercool; it will stay like that for as long as I keep the card.


----------



## 6u4rdi4n

My GTX 1080 suddenly died.

But I'm getting a warranty replacement card, probably this week. It has already been shipped from Germany, but I'm not sure how long the transit will be with UPS from Germany to Norway. The old card wasn't that fun; it wouldn't go higher than ~2070 sub-40°C. I'm not completely sure what to do about the new card. The tinkerer in me wants to put the water block on it and see what it's good for, but I've also been thinking about selling it as an unused, unopened card and upgrading to a 1080 Ti.


----------



## Imprezzion

It's a shame my 1080 is such a meh card. Mine's an FE with an EVGA Hybrid cooler slapped onto it, which means it doesn't even reach 50C load on the super-silent fan profile. I could probably keep it at around 40-45C if I wanted to.

Problem #1 is power. Since it's an FE with a single 8-pin, it has a 180/217W power limit. Flashing the BIOS doesn't really seem to help at all. Even on just an OC with +120% power and +0mV it will run into the 115% range in games, and Fire Strike or Superposition will just bounce off the limiter hard.

Problem #2 is the random crashes to desktop / DX11 freezes I'm getting at clocks above 2012MHz core. Event Viewer shows a driver timeout in all of those cases. Is that just the max core clock for my card, or does it need more power limit / voltage? It never gets more than 1.062V now, and usually runs at 1.050V because of the power limiter. 2012 is really, really terrible.

Memory is another weird one. Benches and some games run fine at +800 (like COD WW2 and Fire Strike / Superposition).
Some artifact and need +400 (like BF1).
And then there's The Division: even +200 will crash it, and it will ONLY run on stock memory.

Now, my temps are low and the VRM and VRAM are cooled well. Is there a point for my setup in doing a "hard mod", like the shunt mod, so I can run +100mV / 1.09V stable?
Or maybe even a hard mod for the voltage and power? I know my way around a soldering iron. The problem is, can an FE VRM and a single 8-pin deal with a shunt-modded 1.09V card?


----------



## Vellinious

Quote:


> Originally Posted by *Imprezzion*
> 
> It's a shame my 1080 is such a meh card. Mine's an FE with an EVGA Hybrid cooler slapped onto it, which means it doesn't even reach 50C load on the super-silent fan profile. I could probably keep it at around 40-45C if I wanted to.
> 
> Problem #1 is power. Since it's an FE with a single 8-pin, it has a 180/217W power limit. Flashing the BIOS doesn't really seem to help at all. Even on just an OC with +120% power and +0mV it will run into the 115% range in games, and Fire Strike or Superposition will just bounce off the limiter hard.
> 
> Problem #2 is the random crashes to desktop / DX11 freezes I'm getting at clocks above 2012MHz core. Event Viewer shows a driver timeout in all of those cases. Is that just the max core clock for my card, or does it need more power limit / voltage? It never gets more than 1.062V now, and usually runs at 1.050V because of the power limiter. 2012 is really, really terrible.
> 
> Memory is another weird one. Benches and some games run fine at +800 (like COD WW2 and Fire Strike / Superposition).
> Some artifact and need +400 (like BF1).
> And then there's The Division: even +200 will crash it, and it will ONLY run on stock memory.
> 
> Now, my temps are low and the VRM and VRAM are cooled well. Is there a point for my setup in doing a "hard mod", like the shunt mod, so I can run +100mV / 1.09V stable?
> Or maybe even a hard mod for the voltage and power? I know my way around a soldering iron. The problem is, can an FE VRM and a single 8-pin deal with a shunt-modded 1.09V card?


The shunt mod doesn't allow for higher voltages....it just helps with power limit.


----------



## Imprezzion

I know, and thus indirectly it helps with higher voltages, since my card can only run 1.050-1.062V without throttling. 1.09V will just bounce off the limiter and not allow any higher clocks.

I just want to know two things:

1. Can an FE with a single 8-pin reliably run the shunt mod without blowing up or melting the cables?

2. Can I make the shunt mod a bit more "permanent" by not using CLU but just soldering over it, for example? Or soldering a wire over it to bridge the resistor, or whatever.

I've got like five dead cards hanging on the wall I can "borrow" a resistor from. For example, a 780's shunt resistor soldered over this one?
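
For anyone weighing the mod, the arithmetic behind it is simple: the card computes power from the voltage drop across the shunt, so lowering the sensed resistance makes it under-read. A quick sketch; the 5 mΩ values below are hypothetical, not measured from an FE:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def actual_power(reported_w, r_stock, r_added):
    """Real board draw when a second resistor is bridged over the shunt.

    The controller still converts the sensed voltage using the stock
    shunt value, so it under-reads by the ratio r_eff / r_stock.
    """
    r_eff = parallel(r_stock, r_added)
    return reported_w * r_stock / r_eff

# Hypothetical 5 mOhm stock shunt with an identical 5 mOhm part bridged
# on top: sensed resistance halves, so a "180 W" reading is ~360 W real.
print(actual_power(180.0, 0.005, 0.005))
```

That doubling is exactly why the single 8-pin question matters: the connector and VRM see the real watts, not the reported ones.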


----------



## AngryGoldfish

I've seen a lot of FE-style 1080s top out at around 2012MHz.


----------



## HAL900

Has anyone flashed a 10Gbps GTX 1080 with an 11Gbps BIOS?


----------



## SavantStrike

Quote:


> Originally Posted by *Imprezzion*
> 
> I know. And thus indirectly it helps with higher voltages. As my card can only run 1.050-1.062v without throttling. 1.09v will just bounce off the limiter and not allow for any higher clocks.
> 
> I just want to know 2 things.
> 
> 1. Can a FE with a single 8 pin reliably run the shunt mod without blowing up or melting the cables?
> 
> 2. Can I make the shunt mod a bit more "permanent" by not using CLU but just soldering over it for example. Or soldering a wire over it to bridge the resistor.. Or whatever..
> 
> I got like, 5 dead cards hanging on the wall I can "lend" a resistor off of. For example, a 780's shunt resistor soldered over this one?


You'll want to cool the crap out of the VRMs to avoid creating another dead donor card to hang on the wall. The VRMs on the vanilla 1080 weren't super generous, IIRC. The single 8-pin, while out of spec, should be okay with a decent PSU.
Quote:


> Originally Posted by *HAL900*
> 
> Has anyone flashed a 10Gbps GTX 1080 with an 11Gbps BIOS?


EVGA did this with the FTW2 model. Not all cards were stable at the higher speed. While it's theoretically possible, increased errors from the high memory clock can offset any potential performance gains. The 11Gbps models (except EVGA's) used 11Gbps-rated RAM (sometimes capable of 12Gbps if one is lucky).
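
For context on what the 10 vs. 11 Gbps grades are worth, peak bandwidth is just the per-pin rate times the bus width; a quick back-of-the-envelope check (standard GDDR arithmetic, not pulled from any particular spec sheet):

```python
def mem_bandwidth_gbs(rate_gbps, bus_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8."""
    return rate_gbps * bus_bits / 8

# The GTX 1080's 256-bit GDDR5X bus at the two speed grades discussed:
print(mem_bandwidth_gbs(10, 256))  # 320.0 GB/s stock
print(mem_bandwidth_gbs(11, 256))  # 352.0 GB/s for the 11 Gbps SKUs
```

A 10% paper gain, which is why error-rate regressions from an out-of-spec flash can eat the whole benefit.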


----------



## Imprezzion

Quote:


> Originally Posted by *SavantStrike*
> 
> You'll want to cool the crap out of the VRMs to avoid creating another dead donor card to hang on the wall
> 
> 
> 
> 
> 
> 
> 
> . The VRMs on the vanilla 1080 weren't super generous IIRC. The single 8 pin while out of spec should be okay with a decent PSU.


Luckily most of those dead cards are memory failures or something like that. The only blown-VRM one is the Sapphire 7970 that ran 1.35V @ 110C VRM.

I really don't know how effective the EVGA Hybrid's VRM plate is. I'm now running a Gelid 1070/1080 Enhancement Kit VRM block, but it isn't very large either, although it is finned and such.

* Off-topic question: are the 1080 Ti Founders cooler and/or backplate compatible with 1080 Founders cards? My buddy has a cooler left over since he watercools his FE 1080 Ti (EK block + backplate).

The PSU is an oldie, like 4 years old, but it's an XFX ProSeries 750W XXX Edition, which is a Seasonic unit with mediocre efficiency but incredible regulation and ripple. So yeah, that'll be fine.


----------



## Gdourado

Does anyone here have a 1080 with an EVGA Hybrid Kit installed?
If so, what do you think of the kit?
Is the build quality nice? Is it good at cooling the memory and VRM?
Is it quiet?

Thanks.


----------



## Imprezzion

Quote:


> Originally Posted by *Gdourado*
> 
> Does anyone here have a 1080 with an EVGA Hybrid Kit installed?
> If so, what do you think of the kit?
> Is the build quality nice? Is it good at cooling the memory and VRM?
> Is it quiet?
> 
> Thanks.


Just installed mine today. It's a tedious installation process, but the build quality is very, very good, as expected from EVGA.

VRM and VRAM cooling is on par with a Founders, as the cooling plate is basically a copy of the Founders plate with a different cut-out to allow room for the water block.

I read complaints about pump noise, but mine's dead quiet.

I don't like the stock fan EVGA puts on the rad, though, as it's a cheap sleeve-bearing thing with only voltage control and no RPM readout.
I'm using a set of Noiseblocker XL-P PWM fans running off the motherboard at 900-1000RPM.

Temps @ 2025MHz core, +0mV, 120% power and 5200MHz VRAM are around 44-45C load with no increase in fan speed. The radial fan (for the VRM and VRAM) is set to 30% idle and 50% under load and is dead quiet as well. Above 50% it gets loud, but that isn't necessary at all.

Overall I'm very happy with it, as long as you don't use the stock fan on the rad.


----------



## SavantStrike

Quote:


> Originally Posted by *Imprezzion*
> 
> Luckily most of those dead cards are memory failures or something like that. The only blown-VRM one is the Sapphire 7970 that ran 1.35V @ 110C VRM.
> 
> I really don't know how effective the EVGA Hybrid's VRM plate is. I'm now running a Gelid 1070/1080 Enhancement Kit VRM block, but it isn't very large either, although it is finned and such.
> 
> * Off-topic question: are the 1080 Ti Founders cooler and/or backplate compatible with 1080 Founders cards? My buddy has a cooler left over since he watercools his FE 1080 Ti (EK block + backplate).
> 
> The PSU is an oldie, like 4 years old, but it's an XFX ProSeries 750W XXX Edition, which is a Seasonic unit with mediocre efficiency but incredible regulation and ripple. So yeah, that'll be fine.


The 1080 Ti Founders cooler is identical enough that I used parts from a 1080 cooler on a 1080 Ti Franken-hybrid card. I'll see if I can't dig up some side-by-side pictures for the GPU side, but the backplates are identical, as is assembly/disassembly.
Quote:


> Originally Posted by *Gdourado*
> 
> Does anyone here have a 1080 with an EVGA Hybrid Kit installed?
> If so, what do you think of the kit?
> Is the build quality nice? Is it good at cooling the memory and VRM?
> Is it quiet?
> 
> Thanks.


The EVGA Hybrid is the nicest aftermarket AIO solution, but it doesn't cool the memory or VRMs appreciably. Most of that heat is going to be exhausted out the back of the case by the stock blower. It works pretty well all things considered, but you'll want some air on the backplate of the card if you're folding or mining.


----------



## Gdourado

Quote:


> Originally Posted by *Imprezzion*
> 
> Just installed mine today. It's a tedious installation process but the build quality is very very good as expected from EVGA.
> 
> VRM and VRAM cooling is on par with a Founders as the cooling plate is basically a copy of the Founders plate with a different cut-out to allow room for the waterblock.
> 
> I read complaints about pump noise but mines dead quiet.
> 
> I don't like the stock fan EVGA has on the rad though as it's a cheap sleeve bearing thing with only voltage control and no RPM readout.
> I'm using a set of Noiseblocker XL-P PWM fans running off the motherboard on 900-1000RPM.
> 
> Temps @ 2025Mhz core , +0mV, 120% power and 5200Mhz VRAM are around 44-45c load with no increase in fanspeed. The radial fan (for the VRM and VRAM) is set to 30% idle and 50% under load and is dead quiet as well. Above 50% it gets loud but this isn't necessary at all.
> 
> Overall I'm very happy with it if you don't use the stock fan on the rad.


Thanks for the reply.
I am asking because somebody wishes to trade a Founders Edition with an EVGA Hybrid Kit installed for my Aorus 1080 11Gbps.
I am wondering if the trade would be worth it or not.
My Aorus has two 8-pin connectors, so I guess it is not power-limited.
But I don't know if an FE is power-starved or not.

Cheers!


----------



## Imprezzion

The FE is horribly power-starved, and the only way to circumvent it is by shunt modding.

It only has a single 8-pin and a 180/217W power limit (100/120%).

It will hit this in benches at stock voltage around 2000MHz; +100mV is hopeless.

2025MHz core will run at about 110-115% without any throttling in games like Battlefield 1 and The Division (the heaviest games I play).
3DMark Fire Strike and Superposition just bounce off the limiter and downclock to about 1920MHz.


----------



## SavantStrike

Quote:


> Originally Posted by *Gdourado*
> 
> Thanks for the reply.
> I am asking because somebody wishes to trade a Founders Edition with an EVGA Hybrid Kit installed for my Aorus 1080 11Gbps.
> I am wondering if the trade would be worth it or not.
> My Aorus has two 8-pin connectors, so I guess it is not power-limited.
> But I don't know if an FE is power-starved or not.
> 
> Cheers!


That's a bad trade. The Aorus is much better in the power delivery department and should clock as high or higher than the hybrid FE, plus it's got faster RAM.

The Aorus is a fantastic card.


----------



## 6u4rdi4n

Got my replacement 1080 today. I haven't put it under water yet because I wanted to test the card first and see that it's working properly. I've been playing games for a couple of hours, and so far it actually seems stable at 2088MHz with just a quick and dirty OC. This is with 60-65°C load temps. Seems like a pretty decent card, and I'm looking forward to seeing how it performs under water.


----------



## StenioMoreira

I think there's a chance I may have the top-scoring GTX 1080 overclocked under normal conditions: normal fan curve and temps, no nitrogen, no radiator hanging outside the window like some people run. 8475 Time Spy score: https://www.3dmark.com/3dm/23186254 . My clocks were 2101-2088MHz with +1500 on the VRAM, 6700K @ 4.5GHz.


----------



## Sharchaster

Quote:


> Originally Posted by *Imprezzion*
> 
> The FE is horribly power-starved, and the only way to circumvent it is by shunt modding.
> 
> It only has a single 8-pin and a 180/217W power limit (100/120%).
> 
> It will hit this in benches at stock voltage around 2000MHz; +100mV is hopeless.
> 
> 2025MHz core will run at about 110-115% without any throttling in games like Battlefield 1 and The Division (the heaviest games I play).
> 3DMark Fire Strike and Superposition just bounce off the limiter and downclock to about 1920MHz.


Well, you have a 120% power limit, don't you? Mine only has 104%, so your card is better than mine on the power limit front.


----------



## Imprezzion

Nope. 120% of 180W is about 217W. Yours has a base power limit that is higher than my 120% limit.

It's a percentage, and you have a Gaming X, which has a base limit of (depending on the card and BIOS version) 210W, 240W, or even 271W. And AFAIK the Gaming X CAN be flashed to a higher limit, as it is not a Founders PCB and has a much higher power and shunt-resistor limit.

Go to the TechPowerUp VGA BIOS collection for the MSI GTX 1080,

look up the BIOS that matches your version (check GPU-Z), and flash one of the versions with the 270W base limit and 291W max limit.

Then check whether the power limit under load is lower and the card no longer throttles.
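
Since the slider is a percentage of the BIOS base limit, the watt figures being compared here fall out of one line of arithmetic (the Gaming X numbers are the ones quoted in this thread, not checked against MSI):

```python
def limit_watts(base_w, slider_pct):
    """Board power limit in watts for a given power-slider percentage."""
    return base_w * slider_pct / 100

# FE 1080: 180 W base with the slider maxed at 120%
print(limit_watts(180, 120))   # 216.0 W, the ~217 W ceiling discussed
# A Gaming X BIOS with a 270 W base would reach ~291 W at a ~108% cap
print(limit_watts(270, 108))   # 291.6 W
```

So a card sitting at "104%" of a 270 W base is already past an FE's absolute maximum, which is the whole point of the flash.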


----------



## Derek1

Quote:


> Originally Posted by *StenioMoreira*
> 
> I think there's a chance I may have the top-scoring GTX 1080 overclocked under normal conditions: normal fan curve and temps, no nitrogen, no radiator hanging outside the window like some people run. 8475 Time Spy score: https://www.3dmark.com/3dm/23186254 . My clocks were 2101-2088MHz with +1500 on the VRAM, 6700K @ 4.5GHz.


If you look at the first page of the TS thread, you will see the rankings.
I spotted two ahead of you: one at 8600 and one at 8700.
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30


----------



## Vellinious

Quote:


> Originally Posted by *StenioMoreira*
> 
> I think there's a chance I may have the top-scoring GTX 1080 overclocked under normal conditions: normal fan curve and temps, no nitrogen, no radiator hanging outside the window like some people run. 8475 Time Spy score: https://www.3dmark.com/3dm/23186254 . My clocks were 2101-2088MHz with +1500 on the VRAM, 6700K @ 4.5GHz.


Nope

https://www.3dmark.com/compare/spy/2697091/spy/1422487


----------



## Beagle Box

Quote:


> Originally Posted by *StenioMoreira*
> 
> I think there's a chance I may have the top-scoring GTX 1080 overclocked under normal conditions: normal fan curve and temps, no nitrogen, no radiator hanging outside the window like some people run. 8475 Time Spy score: https://www.3dmark.com/3dm/23186254 . My clocks were 2101-2088MHz with +1500 on the VRAM, 6700K @ 4.5GHz.


Don't confuse the graphics score with the overall score.
i7-6700K water-cooled + MSI GTX 1080 Gaming X Air-cooled
https://www.3dmark.com/spy/2235134


----------



## ucode

Quote:


> Originally Posted by *StenioMoreira*
> 
> I think there's a chance I may have the top-scoring GTX 1080 overclocked under normal conditions: normal fan curve and temps, no nitrogen, no radiator hanging outside the window like some people run. 8475 Time Spy score: https://www.3dmark.com/3dm/23186254 . My clocks were 2101-2088MHz with +1500 on the VRAM, 6700K @ 4.5GHz.


What, lol. Seems you've pushed a few buttons, though.

FWIW, here's mine with an 8804 graphics score.

It's actually one of the first round of Galax 1080 FEs, but due to being X-flashed with the Strix T4 BIOS it's reported as an Asus. Nothing exotic about the cooling: just the air cooler replaced with water cooling, and no, the radiator isn't stuck outside the window. Temperatures here are in the 30s (Celsius) during the day most of the year and the mid-to-high 20s at night. IIRC that particular run had the GPU at 60C.


----------



## Imprezzion

Quote:


> Originally Posted by *ucode*
> 
> What, lol. Seems you've pushed a few buttons, though.
> 
> FWIW, here's mine with an 8804 graphics score.
> 
> It's actually one of the first round of Galax 1080 FEs, but due to being X-flashed with the Strix T4 BIOS it's reported as an Asus. Nothing exotic about the cooling: just the air cooler replaced with water cooling, and no, the radiator isn't stuck outside the window. Temperatures here are in the 30s (Celsius) during the day most of the year and the mid-to-high 20s at night. IIRC that particular run had the GPU at 60C.


And you weren't scared of dropping a VRM or getting issues with the single 8-pin?

Does that BIOS ignore the shunt, then? I wanna try what my MSI FE can do under an EVGA Hybrid, but I'm scared to hell and back that it'll blow a VRM if I really push it with 1.10-1.15V.


----------



## Vellinious

Quote:


> Originally Posted by *Imprezzion*
> 
> And you weren't scared of dropping a VRM or getting issues with the single 8-pin?
> 
> Does that BIOS ignore the shunt, then? I wanna try what my MSI FE can do under an EVGA Hybrid, but I'm scared to hell and back that it'll blow a VRM if I really push it with 1.10-1.15V.


I'd guess it was done with more memory clock than a super high core clock... but we'd have to ask ucode. With the GPU running 60C, I can't see the core running 2200+ very well at all.


----------



## Derek1

Quote:


> Originally Posted by *Vellinious*
> 
> I'd guess it was done with more memory clock than a super high core clock... but we'd have to ask ucode. With the GPU running 60C, I can't see the core running 2200+ very well at all.


If you check the link, it says he was at 2265 on the core and 1396 on the memory.
Stellar silicon at 60C.
I wonder what the voltage was, as he was using the T4.


----------



## StenioMoreira

All of these Time Spy benchmarks have CPUs running at enthusiast speeds, plus other extreme tweaks and cooling. My setup is as basic as you can get, and every single GTX 1080 that scored higher than mine seems to have a better CPU, clocks, and RAM. I don't know, I still think my graphics score is the best for a basic setup.


----------



## Vellinious

Quote:


> Originally Posted by *Derek1*
> 
> If you check the link, it says he was at 2265 on the core and 1396 on the memory.
> Stellar silicon at 60C.
> I wonder what the voltage was, as he was using the T4.


At 60c.....high


----------



## StenioMoreira

Quote:


> Originally Posted by *ucode*
> 
> What, lol. However seems you've pushed a few buttons
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FWIW Here's mine with 8804 graphics score.
> 
> It's actually one of the first round of Galax 1080 FE's, but due to being X-Flashed with Strix T4 is reported as Asus. Nothing exotic with the cooling just the air cooling replaced with water cooling and no, the radiator isn't stuck outside the window. Temperatures here are in the 30's (Centigrade) during the day most of the year and mid to high 20's at night. IIRC that particular run had the GPU at 60C.


That's a sexy score, but your setup is still like God's compared to mine; there's a definite correlation... even if it's just the graphics score. I tested RAM speed and CPU clocks; they impact that score too.


----------



## Vellinious

Quote:


> Originally Posted by *StenioMoreira*
> 
> That's a sexy score, but your setup is still like God's compared to mine; there's a definite correlation... even if it's just the graphics score. I tested RAM speed and CPU clocks; they impact that score too.


Not the graphics score, they don't.


----------



## StenioMoreira

Quote:


> Originally Posted by *Vellinious*
> 
> Not the graphics score, they don't.


Yes, it does: bringing my own CPU's clock down by 500 decreases it beyond the margin of error, over and over, and so does my RAM, and so do different CPUs, as countless benchmark tests show. Why do Vega cards perform better on Ryzen in so many titles, etc., etc.? We could go all day, but I don't think I need to explain more.

Guys, look what I came across. It reminded me of how Nvidia cards at times have issues rendering textures in time compared to AMD, and it seems this wasn't an issue before Maxwell and Pascal. Sometimes the late texture loading is extreme, like in Doom.

It happens in other titles too; it seems to be a thing people out there wonder about. I think it's because of how Nvidia's Maxwell and Pascal render in tiles, which seems to be the biggest reason for their power-efficiency gains when Maxwell came out. https://www.extremetech.com/gaming/232771-targeted-testing-reveals-secrets-of-nvidia-maxwell-pascal-power-efficiency


----------



## Derek1

Quote:


> Originally Posted by *StenioMoreira*
> 
> Yes, it does: bringing my own CPU's clock down by 500 decreases it beyond the margin of error, over and over, and so does my RAM, and so do different CPUs, as countless benchmark tests show. Why do Vega cards perform better on Ryzen in so many titles, etc., etc.? We could go all day, but I don't think I need to explain more


Here is a comparison between me (at #42 on the TS Extreme list) and four other systems ahead of me, including first place. None of those systems are on exotic cooling. Mine is an EVGA Hybrid: better than air, but still not doing anything sub-40C on benches.

https://www.3dmark.com/compare/spy/2548892/spy/2601728/spy/2641542/spy/2596554/spy/2654206#

Notice the CPUs they are using and compare their graphics scores to mine. I also did a comparison between me and the top 5, with the same result. Only the first-place guy had a higher graphics score, and yet they all had better(?), newer CPUs and systems than me. My X79 is almost 6 years old now.
The graphics score is an accurate comparison between graphics card setups. The overall score is influenced by the stuff you mention.

I'm not saying your card is not good. Your graphics score is about 200 points higher than mine was when I ran a single card. But my overall score is higher because I am running a 6-core and yours is only a 4-core.


----------



## Vellinious

Quote:


> Originally Posted by *StenioMoreira*
> 
> Yes, it does: bringing my own CPU's clock down by 500 decreases it beyond the margin of error, over and over, and so does my RAM, and so do different CPUs, as countless benchmark tests show. Why do Vega cards perform better on Ryzen in so many titles, etc., etc.? We could go all day, but I don't think I need to explain more.
> 
> Guys, look what I came across. It reminded me of how Nvidia cards at times have issues rendering textures in time compared to AMD, and it seems this wasn't an issue before Maxwell and Pascal. Sometimes the late texture loading is extreme, like in Doom.
> 
> It happens in other titles too; it seems to be a thing people out there wonder about. I think it's because of how Nvidia's Maxwell and Pascal render in tiles, which seems to be the biggest reason for their power-efficiency gains when Maxwell came out. https://www.extremetech.com/gaming/232771-targeted-testing-reveals-secrets-of-nvidia-maxwell-pascal-power-efficiency


If you're running an FX processor, maybe...

I've been benchmarking for a very long time, and all my test runs on the GPU are done with the CPU and memory at stock speeds. The only time I push the entire system hard is when I'm making a scoring run. There is NO difference in the graphics score during those runs beyond what I would consider a reasonable margin of error: ±0.2 fps.


----------



## ucode

Quote:


> Originally Posted by *Imprezzion*
> 
> And you weren't scared of dropping a VRM or getting issues with the single 8-pin?
> 
> Does that BIOS ignore the shunt, then? I wanna try what my MSI FE can do under an EVGA Hybrid, but I'm scared to hell and back that it'll blow a VRM if I really push it with 1.10-1.15V.


It's just for benching more than anything else, to see what might be possible, not for 24/7. For a short bench like TS it doesn't seem to be a problem, but I wouldn't want to run FurMark uninhibited. There's no power limit with the T4. The extra voltage may give you some extra clocks, or it may make things worse. It depends on your chip, but even chips that respond well pay a high price for a relatively small gain.

Quote:


> Originally Posted by *Derek1*
> 
> If you check the link it says he was at 2265 on the Core and 1396 on the Memory.
> Stellar silicon at 60C.
> Wonder what the voltage was at as he was using the T4.


1.2V, but that clock is only at the start and drops as temperature increases. I can't seem to find a snapshot for it; it was a while back now. Here's a similar one from around the same time, I think. Max 59C and minimum 44C.


Quote:


> Originally Posted by *StenioMoreira*
> 
> That's sexy
> 
> score, but your setup still is like God compared to mine; there's a definite correlation... still, even if it's just the graphics score. I tested RAM speed and CPU clocks, they impact that score too


CPU is a used Xeon that at the time was selling for less than the 6700K. Top CPU multiplier is 30x. Xeon 26xx v3 only supports up to 1066MHz RAM (2133MT/s), and the board is about the same price as yours; where I live it's actually about $1 less. BCLK was run at 105MHz, so all in all maximum CPU speed was 3.15GHz and RAM speed 1120MHz, or 2240MT/s. So my clock speed was probably 1GHz+ less than yours and RAM probably 1000MT/s+ less than yours. Any advantage on my side would be quad channel and core count. The graphics settings are not my 24/7 settings, so you would have me beat there.

Also that 105MHz BCLK was causing PCIe WHEA corrections that I was not aware of at the time so don't know if that had an effect on the score or not, probably not so much.
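If anyone wants to sanity-check that arithmetic, here's a quick Python sketch. The multiplier, BCLK, and DDR4-2133 base speed are taken straight from the post above; nothing here is measured:

```python
# Sanity-checking the clock arithmetic from the post above.
# All inputs come from the post itself (Xeon 26xx v3, DDR4-2133, BCLK 105).
bclk_mhz = 105            # overclocked base clock
cpu_mult = 30             # top CPU multiplier
ram_base_mhz = 1066.67    # DDR4-2133 memory clock at 100MHz BCLK

cpu_ghz = bclk_mhz * cpu_mult / 1000          # 105 x 30 = 3150 MHz
ram_mhz = ram_base_mhz * (bclk_mhz / 100)     # RAM clock scales with BCLK
ram_mts = ram_mhz * 2                         # DDR = two transfers per clock

print(f"CPU: {cpu_ghz:.2f} GHz")                       # 3.15 GHz
print(f"RAM: {ram_mhz:.0f} MHz / {ram_mts:.0f} MT/s")  # 1120 MHz / 2240 MT/s
```

Matches the 3.15GHz / 2240MT/s figures in the post.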


----------



## [email protected]

Hello guys,

MSI SEA HAWK 1080 owner here. I bought it 2 days ago, but in the Heaven benchmark the GPU temp hit 80C, as you can see in the picture. So, how can I be sure the water pump is working, and could the card be defective? I think I have the non-X model according to GPU-Z.

Thank you.


----------



## Beagle Box

Quote:


> Originally Posted by *StenioMoreira*
> 
> All of these TimeSpy benchmarks have CPUs running at enthusiast speeds, with other extreme tweaks and cooling. My setup is as basic as you can get, and every single GTX 1080 that
> got a higher score than mine seems to have a better CPU, clocks, and RAM. Idk, I still think my graphics score is the best for a basic setup


You've set the standards by which you choose to measure.
Running your GPU memory @ 1440, yet limiting the GPU clock to 2114?

Good score, no doubt, but you've really narrowed the brackets in order to score the win in your 'class'.


----------



## Dasboogieman

Quote:


> Originally Posted by *[email protected]*
> 
> Hello guys,
> 
> MSI SEA HAWK 1080 owner here. I bought it 2 days ago, but in the Heaven benchmark the GPU temp hit 80C, as you can see in the picture. So, how can I be sure the water pump is working, and could the card be defective? I think I have the non-X model according to GPU-Z.
> 
> Thank you.


That looks defective as hell. The card should be no higher than 50C even with terribad ambients. 80C implies DOA pump or really really bad mounting.


----------



## Imprezzion

Quote:


> Originally Posted by *Dasboogieman*
> 
> That looks defective as hell. The card should be no higher than 50C even with terribad ambients. 80C implies DOA pump or really really bad mounting.


Agreed. It should be comparable to my EVGA Hybrid in terms of temps, and that runs at about 44c load with 20c ambient on a 2025 core. So yeah, it's either defective or, as you said, a bad mount.


----------



## pez

The worst I see on my Titan X is 65C with a Hybrid cooler and in a SFF case. Additionally, my fan runs at a lower RPM than stock, so this should probably be an example of the worst-case scenario for a Pascal GPU with a Hybrid kit on it.


----------



## kmac20

Holy crap +500 mem? Jesus


----------



## Imprezzion

I now see the whole truth in the talk about lower temps allowing more clocks.
I didn't really believe it, but at the 55c my card used to run, it would only do 2000 or maybe 2012Mhz stable. Higher would give DX crashes and freezes.

Now I'm running an EVGA Hybrid that keeps it around 42-43c, and it now seems to run at 2038Mhz with no real issues. Weird.

Still, my memory is annoying. It will run as high as +800 without artifacts or anything, but will also randomly freeze. The highest I've gotten it without random freezes is +200 so far, lol, even though I can play BF1 for an hour straight at +800.


----------



## PlugFour

I have a Zotac FE and I just tried the power limit hard mod described here:

It states to short RS3.

Unfortunately the result is worse than before:

Before: max GPU clock 1930Mhz, temp 68°C, Vcore 0.9V (power limit 120%)
After: max GPU clock 1835Mhz, temp 65°C, Vcore 0.85V (power limit 120%)



Looking at the comments under the video, I can read this :
Quote:


> This mod doesn't work unless you put liquid metal on all 3 shunt resistors. It actually disabled boost on my card when I did it to only 1 resistor and wouldn't let the card boost over 1607mhz. I have friends who have also experienced this issue when applying liquid metal to only 1 resistor, possibly a newer revision or bios.﻿


This matches with this warning on this page :
Quote:


> Note that earlier version of this guide incorrectly mentioned need to short RS1, RS2, RS3. This is wrong, and will cause card clock to lock at 135MHz. Do not short shunt resistors themselves, but add resistors like shown on photo below. Sorry for confusion.


So I wonder why it worked for some people. But anyway, here is my question:

*Is there a bios mod to unlock the power limit for an aircooled 1080FE?*

EDIT: I just made the shunt mod on all three resistors, RS1, RS2, and RS3, and I get a mixed result with 0.88v as Vcore.
EDIT: Well, in a game like GTA5 it goes to 2076Mhz and 1.09v rock stable, while in Furmark everything goes wrong. I think I made a mistake by testing my mods using Furmark.


----------



## stephenn82

Quote:


> Originally Posted by *Gdourado*
> 
> Anyone here has a 1080 with an EVGA Hybrid Kit install?
> If so, what do you think of the kit?
> Is it nice build quality? Good to cool the memory and VRM?
> Is it quiet?
> 
> Thanks.


My FTW Hybrid sits daily at 1.093v, 2126 core and memory at 5454, and nets a max load of 43-45c as well.

I hear an occasional swooshing of water, but it's hard to tell if it's from the Corsair H115i. It happens very, very rarely.

I am considering getting a block and doing a custom loop. Interested in a hybrid cooler if it will fit?


----------



## stephenn82

Alright guys and gals. I have a question that needs confirmation.

The local Micro Center has an EKWB GTX 1080 FTW kit, open box, for 62 bucks. Should I snag it? A new one is 91 or so. I think we're allowed to look over open-box items there and see what's wrong with them.

I eventually want to do petg and fancy stuff. I think a 1080 ftw will be plenty of card for at least 3 more years at 1440p and maybe even 4K with lowered settings.

So...would you get the block??


----------



## SavantStrike

Quote:


> Originally Posted by *stephenn82*
> 
> Alright guys and gals. I have a question that needs confirmation.
> 
> The local Micro Center has an EKWB GTX 1080 FTW kit, open box, for 62 bucks. Should I snag it? A new one is 91 or so. I think we're allowed to look over open-box items there and see what's wrong with them.
> 
> I eventually want to do petg and fancy stuff. I think a 1080 ftw will be plenty of card for at least 3 more years at 1440p and maybe even 4K with lowered settings.
> 
> So...would you get the block??


MC will let you look at it and also gives you a 15-day return window.

If you want to go full cover, 60 bucks is the cheapest you'll ever get one for.


----------



## kmac20

I have a FTW2 (the ICX version) and I can get my core into the +125/+150 range (so it hits about 2025), but my memory I've got at +150; it's stable there and I don't even want to push it farther.

I've also noticed no difference between upping the voltage to 100% and leaving it at stock. Seems like it gives me no gains whatsoever.

I had a 1060 that was a lottery loser. It barely did +40/+50, and with that card as well, voltage didn't matter. At least in my experience, only the power allowance matters, not the voltage.

I'd love to be proven wrong and be able to do +500 on my memory. I'm happy with the core boost; 2025 on air is pretty decent, and unless I intentionally SLAM it with something like FurMark, it never goes above 70. This is with fans set to auto, so they barely rev. If I crank up the fans manually, the temps drop significantly. If I slam it hardcore with FurMark AND slam the CPU with P95 in large FFT (max heat), the card will hit above 80-90 on memory and power, and be just under 83 on core (the ICX has three lights on the side that change depending on the temp range, so I can tell just from looking at it).

Dang. Maybe I should try pushing my memory farther if you guys are getting multiple hundreds instead of like +150. Dunno, thought that was my limit. Maybe if I dial the core back just a couple, like 5, I could crank the memory up more easily.


----------



## nolive721

I played again this weekend with my ZOTAC AMP EXTREME+ (the 11Gbps memory model).

I love what the card can pull on my 3-monitor setup, don't get me wrong, but I don't feel it's a significant performance increase over standard AIB 1080 cards, going by the (rare, I confess) equivalent 3-monitor benchmarks.

What's leaving me just a bit underwhelmed is also the fact that I don't see fps gains in synthetic benchmarks like Heaven or in-game benchmarks (ROTR, Assetto Corsa) with a 2088Mhz boost core vs 2138Mhz, for example.

I can reach 12Gbps from the standard 11Gbps memory as well, stable, but it doesn't show much gain either.

And that is all with peak temps below 60degC and no power throttling observed.

Am I demanding too much from my OCing expectations?
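For anyone curious about what an 11 → 12 Gbps memory overclock means in raw numbers, here's a quick Python sketch. The 256-bit bus width is the stock GTX 1080 spec; note this is raw bandwidth only, and real-world fps gains are usually well below the raw delta:

```python
# Raw memory bandwidth before/after an 11 -> 12 Gbps memory overclock.
# 256-bit bus is the stock GTX 1080 figure; the gain shown is raw
# bandwidth only, not expected fps.
BUS_WIDTH_BITS = 256

def bandwidth_gbs(rate_gbps: float) -> float:
    """GB/s from the per-pin data rate in Gb/s."""
    return rate_gbps * BUS_WIDTH_BITS / 8

stock, oc = 11.0, 12.0
print(f"stock: {bandwidth_gbs(stock):.0f} GB/s")  # 352 GB/s
print(f"oc:    {bandwidth_gbs(oc):.0f} GB/s")     # 384 GB/s
print(f"gain:  {(oc / stock - 1) * 100:.1f}%")    # 9.1%
```

So ~9% more raw bandwidth at best, which is why benchmark gains in the low single digits aren't surprising.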


----------



## kmac20

I was gonna say that if it was a memory overclock, it might be because the memory sometimes starts error checking (at least with certain cards; I'm not sure about the 1080, actually), which actually reduces performance despite the clock speed being higher, since it has to devote resources to that.

But since it's the core, it's a bit weird you're not seeing even minute gains in benchmarks like 3DMark Fire Strike/Time Spy/etc., since those are normally impacted somewhat by core clock. Although it might be such a small difference that it doesn't matter? And as I'm sure you know, you can run the same benchmark 5 times on a fresh session of Windows at identical settings and get 5 different scores due to tiny variables.
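To illustrate that error-checking effect, here's a toy Python model. The retry-rate curve is completely made up; the only point is the shape: past some offset, retransmits grow faster than the clock, so effective throughput falls even as the clock keeps climbing:

```python
# Toy model of memory error detection/retry eating into throughput.
# The retry-rate curve is invented for illustration only.
BASE_MHZ = 5005.0   # roughly the stock GTX 1080 memory clock

def effective_rate(offset_mhz: float) -> float:
    """Effective memory speed after subtracting retransmit overhead."""
    clock = BASE_MHZ + offset_mhz
    # pretend retries are negligible until ~+400, then ramp quadratically
    retry_rate = max(0.0, (offset_mhz - 400) / 1000) ** 2
    return clock * (1 - min(retry_rate, 1.0))

for off in (0, 200, 400, 600, 800):
    print(f"+{off}: {effective_rate(off):.0f} effective MHz")
```

In this invented curve, +800 ends up slower than +400, which is the same pattern people describe when a "stable" high memory offset scores worse.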


----------



## Vellinious

Pascal just doesn't perform well at even "warm" temps. If you're running at higher than 40c on the core and hoping for decent clocks, you're most likely going to be disappointed.


----------



## microchidism

For clarification's sake, my understanding of Pascal is that lower temperature not only allows for higher clock speeds but also gives higher performance at a given clock speed?

This really changes the overclocking game, because it means that saying your card runs at 2200 means jack if you're getting the performance of a card running 2000.


----------



## moustang

Quote:


> Originally Posted by *stephenn82*
> 
> I hear an occasional swooshing of water but it's hard to tell if it's from the Corsair h115i. It happens very very rarely


In my experience with Corsair AIOs, that is entirely normal. I heard it from my old H110 and I hear it in my current H115i. It seems entirely random and doesn't seem to hurt anything (the H110 is still working fine after 4 years), but it's a little weird hearing the occasional sound of rushing water coming from the AIO.

I never hear it from either of my Kraken AIOs though.


----------



## SavantStrike

Quote:


> Originally Posted by *nolive721*
> 
> I have played again this week-end with my ZOTAC AMP EXTREME+ (the 11Gbs memory model)
> 
> I love what the card can pull on my 3 monitor set-up dont get me wrong but I dont feel its a significant performance increase over std AIB 1080 cards from some of, rare I confess, equivalent 3 monitor benchmarks
> 
> what is leaving me , just a bit then, underwhelmed is also the fact that I dont see fps gains in synthetic benchmarks like HEAVEN and In Games benchmarks (ROTR, Assetto corsa) with 2088Mhz boost core vs 2138Mhz for example
> 
> I can reach 12Gbs from the 11Gbs std memory as well, stable, but it doesnt show much gains either
> 
> and that is all with peak temps below 60degC and no Power throttling observed
> 
> Am I too much demanding with my running in circles OCing expectations?


The faster memory only nets 5-6 percent more speed. It shouldn't show up as earth shatteringly different but it's a nice bump.
Quote:


> Originally Posted by *microchidism*
> 
> For clarification sake, my understanding of Pascal is that not only does lower temperature allow for higher clock speed, but higher performance at a given clock speed?
> 
> This really changes the overclocking game because it means that saying your card runs at 2200 means jack if you are getting performance of a card running 2000.


The performance is still tied to the core clock; however, when the card is hot, it throttles on temperature. That means you can have a card with an average clock of 2100mhz over the course of a benchmark run, but all the times it throttles make it slower than a card at a lower speed that ran cooler and never throttled. Pascal is BIOS locked, so any really high core clock (2.2+) is usually on an XOC BIOS that's not as optimized as the factory one, so the same card at the same speeds can be slower once the XOC is flashed.
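That throttling point can be sketched with a toy model. Every number below is invented purely for illustration; the only takeaway is that a higher *average* clock with throttle dips can still render fewer frames than a steady lower clock:

```python
# Toy illustration: a hot card can average a higher clock yet deliver
# fewer frames, because every throttle step also stalls the pipeline.
# All constants are invented for illustration.
def frames_rendered(clock_trace, stall_per_dip_ms=100.0, fps_per_mhz=0.05):
    """Total frames over a clock trace sampled once per second."""
    frames, prev = 0.0, clock_trace[0]
    for clk in clock_trace:
        frames += clk * fps_per_mhz            # throughput tracks clock
        if clk < prev:                         # a downward step = throttle event
            frames -= stall_per_dip_ms / 1000 * clk * fps_per_mhz
        prev = clk
    return frames

hot  = [2164, 2100, 2164, 2050, 2164, 2088] * 10   # averages ~2122, keeps dipping
cool = [2100] * 60                                  # steady 2100, never throttles

print(sum(hot) / len(hot) > sum(cool) / len(cool))          # True: higher average
print(frames_rendered(hot) < frames_rendered(cool))         # True: fewer frames
```

The hot trace averages ~2122MHz but loses more to the 30 throttle stalls than it gains over the steady 2100MHz run.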


----------



## VeauX

Hello Folks,

I'm a new owner of a GTX 1080 FE (I'm coming from a 1070 G1 Gaming that I traded up to the 1080 FE for a nominal amount). The card works pretty well with my 2500K @ 4.6GHz, but man, it is very loud.

Could you give me some tips to restrain the noise a bit, or is it a lost cause?

Thanks in advance!


----------



## Vellinious

Quote:


> Originally Posted by *microchidism*
> 
> For clarification sake, my understanding of Pascal is that not only does lower temperature allow for higher clock speed, but higher performance at a given clock speed?
> 
> This really changes the overclocking game because it means that saying your card runs at 2200 means jack if you are getting performance of a card running 2000.


Exactly. An example: My 1080 FTW will do 2214 @ 1.093v with the core temps hitting 36c under load, but it runs better at 2189 @ 1.083v with the same core temps. Lower the core temps to 7c, and now the 2214 @ 1.083v runs much better than both previous clock settings. In fact, with the core temps down around 7c under load, now it'll run 2252 @ 1.093v and still gain performance.....the performance at those temps doesn't drop off until the core is running 2278+.

Temps are everything with Pascal.


----------



## Imprezzion

I can keep mine at 43-45c load in the heaviest games on 2050-2062 core (with +20% power and +50mV) and it just barely stays under the power limit.

I do have my rad fans really quiet at that setting. Should I go for more noise and lower temps? I can probably get it down to the mid-to-high 30's.


----------



## kmac20

I'm content with my clocks and scores; I just thought that +500 mem was an insanely high number.

I'll keep the fans blasting a lot higher now to see how much more I can get out of it. The ICX version of the card keeps it a decent bit cooler than the ACX (not a lot, but it does have some benefits), so we'll see how much more I can squeeze out of it. I was already able to boost my memory overclock to over 250 from it being at like 125.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> I have a FTW2 (the icx version) and I can get my core to like +125/+150 range (so it hits about 2025) but my memory I've got at +150 and its stable at that and I dont even want to push it farther.
> 
> I also have noticed no difference when it comes to upping the voltage to 100% or leaving it at stock. Seems like it gives me no gains whatsoever.
> 
> I had a 1060 that thing was a lottery loser. Barely did +40/+50. Again with that card as well voltage didn't matter. Seems at least in my experience only the power allowance matters, not the voltage.
> 
> I'd love to be proven wrong and be able to do +500 on my memory. I'm happy with the core boost, 2025 on air is pretty decent, and unless I intentionally SLAM it like with FurMark or something it never goes above 70. this is with fans set to auto so they barely rev. If i crank up fans manually the temps drop significantly. If I slam it hardcore with furmark AND slam the cpu with p95 in large FTT (max heat) the card will hit above 80-90 on memory, power, and be just under 83 on core (the icx has the three lights on the side that change depending on the temp range so I can tell just from looking at it).
> 
> Dang. Maybe I should try pushing my memory farther if you guys are getting like +100s instead of like +150. Dunno, thought that was my limit. Maybe if I dial the core back just a couple like 5 I could crank the memory up more easily.


question for you...you using X-OC?

If yes...dont

Do yourself a solid, uninstall it...wipe all of it away.

Go get MSI AB and tune with the curve. There are some things in this thread on how to do it, and also vids for it. Very, very good stuff. I do my tune of +125 and my 1080 FTW runs 2126 all the time at 44c. It is under the Hybrid cooler from the factory, though. And my VRAM is almost 11Gbps.

I say this because I was in the same boat you are now, using X-OC for the memory overclock. 125 was OK; at 150, artifacts and crashing.

Got AB running, and all of it went away... ALL OF IT.


----------



## sirleeofroy

Quote:


> Originally Posted by *sirleeofroy*
> 
> Hey Guys, just a quick one.....
> 
> Does anyone have any first hand experience with the following GPU cooler - Alphacool Eiswolf 120 GPX Pro Nvidia Geforce GTX 1080
> 
> I'm looking at getting one of these for my Gainward GTX 1080 GLH as I run 4K most of the time which the card does a decent job at but to keep my OC as effective as possible, I have to run the fans at full tilt which is far noisier than I would like.
> 
> I also have a few Alphacool parts already as I have a semi-custom loop built out an Alphacool AIO and additional radiators, so this could either add to my loop for a total of a 280mm rad, a 120mm rad and 2 AIO pumps (CPU+GPU) or.... Separate cooling for the CPU using the 280mm rad and separate cooling for the GPU using the new AIO.
> 
> Edit: The 280mm rad is 60mm thick and the 120mm is 40mm thick if that helps.....
> 
> What are your thoughts guys?
> 
> Thanks


Literally, nothing.....Anyone?!


----------



## Yetyhunter

Hey guys, what is the best way to test stability for the card? I have a Super Jetstream edition from Palit, and I seem to have hit a wall at 2050 on the core and 10.4Gbps on the memory. Even the slightest increase will crash in 3DMark Fire Strike's second graphics test. I played with voltages and the power limit from 0 to max and still can't get more. Temps seem normal, never over 65°C. Is this the max I can get from the card?


----------



## nolive721

Quote:


> Originally Posted by *Vellinious*
> 
> Exactly. An example: My 1080 FTW will do 2214 @ 1.093v with the core temps hitting 36c under load, but it runs better at 2189 @ 1.083v with the same core temps. Lower the core temps to 7c, and now the 2214 @ 1.083v runs much better than both previous clock settings. In fact, with the core temps down around 7c under load, now it'll run 2252 @ 1.093v and still gain performance.....the performance at those temps doesn't drop off until the core is running 2278+.
> 
> Temps are everything with Pascal.


Quote:


> Originally Posted by *SavantStrike*
> 
> The faster memory only nets 5-6 percent more speed. It shouldn't show up as earth shatteringly different but it's a nice bump.
> The performance is still tied to the core clock, however when the card is hot, it throttles at temperature. What that means is you can have a card that's got an average clock of 2100mhz over the course of a benchmark run, but all of the times it throttles make it slower than a card at a lower speed that ran cooler and never throttled. Pascal is BIOS locked so any really high core clock (2.2+) is usually on a XOC BIOS that's not as optimized as the factory one, so the same card at the same speeds can be slower once the XOC is flashed.


OK, OK, I get it! I should be happy with what I have, and if I wanted more I should have bought an LC card... I'm just coming from an AMD RX 480, which ran hot and power-hungry compared to this ZOTAC, so I thought she could give more.


----------



## SOCOM_HERO

Looking for advice on getting a decent clock on air. I have the Gigabyte G1 Gaming, and the card doesn't seem to want to go over 2GHz when I use the manual curve method in X-OC. Should I try Afterburner instead?


----------



## Vellinious

I would. The curve is much easier in AB.


----------



## SOCOM_HERO

Quote:


> Originally Posted by *Vellinious*
> 
> I would. The curve is much easier in AB.


I'll give it a shot. X-OC worked with the sliders, but I could only get +151 core and +500 mem; +152 led to crashes in Superposition. Either way, the highest clock max I got was 2114, but at +152 it crashed at that speed.

I like the idea of the curve method, as it should help with higher clocks at max voltage and more moderate clocks/temps at lower voltage. The curve in X-OC seemed harder to work with. I do like the OSD they have, though; AB seems a bit dated there.
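For anyone new to the curve method, this Python sketch is roughly what such a curve encodes: one clock target per voltage point, so the card holds a high clock at max voltage and more moderate clocks when the voltage drops. The voltage/clock pairs below are hypothetical, not a recommended tune:

```python
# Rough sketch of what an Afterburner-style custom V/F curve encodes.
# These voltage/clock pairs are hypothetical, not a recommended tune.
curve = {  # volts -> MHz
    0.800: 1823,
    0.900: 1936,
    1.000: 2050,
    1.062: 2114,   # flatten the top so max voltage holds the peak clock
    1.093: 2114,
}

def clock_for_voltage(v: float) -> int:
    """Clock at the highest curve point not exceeding the current voltage."""
    usable = [volts for volts in curve if volts <= v]
    return curve[max(usable)] if usable else min(curve.values())

print(clock_for_voltage(1.093))  # 2114 at max voltage
print(clock_for_voltage(0.950))  # 1936 when the card drops voltage
```

Flattening the top of the curve is the usual trick: it stops Boost from requesting a clock that's only stable a few degrees cooler.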


----------



## Hequaqua

Steam has 3DMark on sale....$4.49.

Some pretty good prices on bundles too(3DMark/VRMark/PCMark10)

http://store.steampowered.com/app/223850/3DMark/

EDIT: Once you redeem the key, you can install the standalone btw.


----------



## SavantStrike

Quote:


> Originally Posted by *Hequaqua*
> 
> Steam has 3DMark on sale....$4.49.
> 
> Some pretty good prices on bundles too(3DMark/VRMark/PCMark10)
> 
> http://store.steampowered.com/app/223850/3DMark/
> 
> EDIT: Once you redeem the key, you can install the standalone btw.


That's a good deal.

You should post it in the deals section for more visibility.


----------



## kmac20

Quote:


> Originally Posted by *stephenn82*
> 
> question for you...you using X-OC?
> 
> If yes...dont
> 
> Do yourself a solid, uninstall it...wipe all of it away.
> 
> Go get MSI AB and tune with the curve. There are some things in this thread on how to do it, and also vids for it. Very, very good stuff. I do my tune of +125 and my 1080 FTW runs 2126 all the time at 44c..It is under the Hybrid cooler though from factory. And my VRAM is almost 11gbps.
> 
> I say this becuase I was in the same boat you are now with using X-OC on memory overclock. 125 was ok, went to 150, artifacts and crashing.
> 
> Got the AB running, and all of it went away...ALL OF IT.


The problem is I can't get the ICX lights to work without X-OC. And considering I paid extra for that very feature....

I've gotten it to 152/310 with X-OC, which put me in the top 15 on Fire Strike for the 1700/1080 combo. I can up the memory to almost 400 with the temps staying below 50, but my score seems to drop after I push it past about 310-350.


----------



## Vellinious

Quote:


> Originally Posted by *kmac20*
> 
> The problem is I can't get the ICX lights to work without X-OC. And considering I paid extra for that very feature....
> 
> I've gotten it to 152/310 with X-OC, which put me in the top 15 on Fire Strike for the 1700/1080 combo. I can up the memory to almost 400 with the temps staying below 50, but my score seems to drop after I push it past about 310-350.


If you're happy with that performance and the pretty lights, stick with it.


----------



## azcrazy

Hello

I've been having a problem with my 1080 dropping frames after an hour or so of playing TF2. Has anybody had that problem?

Temps never go above 55 (that's what Open Hardware Monitor says), and I play on a 2560x1440, 60Hz monitor.

Thanks for the help.


----------



## hotrod717

Back to a 1080. The EVGA SC ICX2 advertises an 1847 boost. Can someone remind me what a bone-stock 1080 averages on boost clock? What is the average boost I should be seeing? I had a 1080 Strix and for the life of me can't remember what it boosted to stock out of the box. Got this solely for gaming, but the OC'er in me is taking over, lol.


----------



## Hequaqua

I would say around 1946+, depending on temps/application.

EDIT: My G1 is 1881... it boosts to around 1961 or so. Hope that helps.


----------



## hotrod717

Thanks, yes it does. Looking for some more feedback across several samples. This card is boosting to 2012, which I believe is on the high side; looking for confirmation. Voltage on this one seems a bit modest at 1.05v. My previous experience with Pascal is that it's hit or miss, and the AIB doesn't really mean J and/or S; it's luck of the draw on the chip, regardless.


----------



## Vellinious

The cooler it is, the higher it'll boost. Anywhere between 1950 and 2050 is what I'd expect to see from boost 3.0.


----------



## Hequaqua

Don't take this too seriously....

I did a hard mod on my GTX1080....trying to unlock the power limit. Do you think this will help?



Spoiler: Warning: Spoiler!


----------



## Vellinious

MMMmmmm, tots.......


----------



## Hequaqua

A friend of mine posted that on FB...I thought it was hilarious....had to share....lol


----------



## SOCOM_HERO

Managed a pretty good OC on my 1080. Used the curve, roughly maxing at 2125/+500 mem. Stable in everything I could throw at it; it never goes above 73C on air. I don't have any plans to watercool it. I have not yet run it with my 5GHz CPU OC, as I'm still dialing that in.

https://www.techpowerup.com/gpuz/details/d6rma


----------



## awdrifter

I'm doing a new build and I'm thinking about getting either a GTX 1080 or a 1080 Ti. From what I understand, there's no way to modify the BIOS to bypass the voltage or power limit on Pascal cards. However, do the cards take into account the overall board power draw, or just the GPU? I remember for the GTX 970, people were saying even the fan and LED power draw was included in the limit. Is this still the case for the GTX 1080? If so, I might get the one with the flashiest LEDs and fans, then disable the LEDs and move the fans to external power; that way all the allotted power can go to the GPU and VRAM. Has anyone done something similar? What's your highest OC on air? Thanks.


----------



## SavantStrike

Quote:


> Originally Posted by *awdrifter*
> 
> I'm doing a new build and I'm thinking about getting either a GTX 1080 or a 1080 Ti. From what I understand, there's no way to modify the BIOS to bypass the voltage or power limit on Pascal cards. However, do the cards take into account the overall board power draw, or just the GPU? I remember for the GTX 970, people were saying even the fan and LED power draw was included in the limit. Is this still the case for the GTX 1080? If so, I might get the one with the flashiest LEDs and fans, then disable the LEDs and move the fans to external power; that way all the allotted power can go to the GPU and VRAM. Has anyone done something similar? What's your highest OC on air? Thanks.


The XOC BIOS bypasses power limits. If you don't use the XOC BIOS, then either the Aorus or the ArcticStorm offers the highest TDP out of the box. I've got an Aorus and haven't hit the 375W limit yet.


----------



## kmac20

REALLY? I did not know that. Is that what the second BIOS on my EVGA card is for?

I've never had a card with dual BIOS before, nor have I flashed a card BIOS since back when I had a 6800 or something, to unlock more pipelines.


----------



## stephenn82

Quote:


> Originally Posted by *SOCOM_HERO*
> 
> Managed a pretty good OC on my 1080. Used the curve roughly max at 2125/+500mem. Stable in everything I could throw at it. Never goes above 73C on air. I don't have any plans to watercool it. I have not yet run it with my 5GHz CPU OC as I'm still dialing it in.
> 
> https://www.techpowerup.com/gpuz/details/d6rma


Good stuff, huh? That's almost my same setup, but I'm under the Hybrid cooler from EVGA.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> REALLY? I did not know that. Is that what the second bios on my evga card is for?
> 
> I've never had a card with dual bios before nor have I flashed a card BIOS since like, back when I had a 6800 or something to unlock more pipelines.


The EVGA BIOS switch gives slightly better voltage curves and more aggressive fan speeds.


----------



## Vellinious

Pretty sure only the FTW and Classified have the dual BIOS on the 1080. Not sure which card you have, but if it's an SC or something, I'm about 100% sure it's not there.


----------



## SOCOM_HERO

Quote:


> Originally Posted by *stephenn82*
> 
> good stuff huh? That's almost my same setup. But I'm under the hybrid cooler from evga


I was actually pretty impressed with how finely you can tune the curve in Afterburner. X-OC and other programs weren't as granular, and it showed in certain AAA games like Rise of the Tomb Raider and Metro 2033 or Last Light; those really hammer a GPU. You should make a new rig in your sig, I can't see your specs!


----------



## stephenn82

Quote:


> Originally Posted by *SOCOM_HERO*
> 
> I was actually pretty impressed with how finely you can tune the curve in Afterburner. X-OC and other programs weren't as granular, and it showed in certain AAA games like Rise of the Tomb Raider and Metro 2033 or Last Light; those really hammer a GPU. You should make a new rig in your sig, I can't see your specs!


Really? Cuz it's in there...

Also agreed on X-OC sucking compared to AB. I couldn't overclock the memory at all with X-OC, not even 10MHz; with AB I'm at 11Gbps.


----------



## awdrifter

Quote:


> Originally Posted by *SavantStrike*
> 
> The XOC BIOS bypasses power limits. If you don't use the XOC BIOS, then either the Aorus or the ArcticStorm offers the highest TDP out of the box. I've got an Aorus and haven't hit the 375W limit yet.


Thanks for the info. I looked into this more, and it seems like most cards can flash the XOC BIOS and it'll let the card hold a 2.1GHz boost clock (more or less). I think that's good enough for me; I don't want to do a crazy volt mod on a $500 card.


----------



## Gen Patton

I am playing Ghost Recon Wildlands, a very demanding game, but my Founders is hanging in there. Final Assault (GitS) plays well too, and even F1 2016, which is hard on a GPU, works fine on mine. I play for hours on my off days. For a free graphics card, it's fine for me. And Motoko loves it (see, she is smiling).


----------



## kmac20

Yeah, I have an EVGA FTW2. I've never used a dual BIOS on a card. Should I flip that switch? Will I get better voltage out of it? The rig in my sig is what I'm working with: 850W, 1 HDD, 1 SSD, and 1 M.2 SSD, a 1700 overclocked, a couple of fans, and a small AIO. Not even close to my 850W, so yeah, if I can get more out of it by flipping that switch to the other side, I will.

Anything I need to know before I do that? Again, I've never had a card with 2 BIOSes, and the last card I flashed was a 6800 back in like 2005, to get more pipelines or shaders unlocked or something; I don't really remember, it was a long time ago.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> Yeah I have an EVGA FTW2. I've never used a dual bios on a card. Should I flip that switch? Will I get better voltage out of it? The rig in my sig is what I'm working with. 850w. 1 HDD 1SSD and 1 m.2SSD. 1700 overclocked and a couple of fans and a small AIO. Not even close to my 850w so yeah, if I can get more out of it by flippin that switch to the other side I will.
> 
> Anything I need to know before/if I do that? Again, never had a card with 2 BIOS and the last card I flashed with a new bios was like, a 6800 back in like 2005 to get more pipelines or shaders unlocked or something I dont really remember it was a long time ago.


Yes, flip that to the 'slave' or secondary BIOS; on my card the switch is towards the front. It allows more potential for the card. You are running separate power cables to your PCI-e pins, right? That makes a difference too.


----------



## kmac20

It's actually all off one cable, both plugs, because the cable management is a nightmare otherwise. But it's a high quality Seasonic so I'm not super worried about the power delivery.

I need to install a new SSD i got on sale (samsung 850 evo 500gb good deal on it, yes its not a pro but whatever SSD is SSD to me I dont need nvme yet unless I start doing tons of HQ video rendering). So when I wire that one up if I can i'll try to do a separate wire.

Thanks for the advice.


----------



## perern

I got a new 1080 Strix A8G yesterday. I think I'm running out of power when overclocking. I got a Corsair CX750M, when I go over 2060MHz I get a lot of 16 / Util perfcap.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> Its actually all off of one line both the pins because the cable management is a nightmare without it. But its a high quality Seasonic so I'm not super worried about the power delivery.
> 
> I need to install a new SSD i got on sale (samsung 850 evo 500gb good deal on it, yes its not a pro but whatever SSD is SSD to me I dont need nvme yet unless I start doing tons of HQ video rendering). So when I wire that one up if I can i'll try to do a separate wire.
> 
> Thanks for the advice.


jayztwocents did a video showing the gain, almost 200 pts in synthetic benchmarks, from two individual cables. He also had one of those Seasonic-based platinum edition PSUs. But up to you.


----------



## stephenn82

Quote:


> Originally Posted by *perern*
> 
> I got a new 1080 Strix A8G yesterday. I think I'm running out of power when overclocking. I got a Corsair CX750M, when I go over 2060MHz I get a lot of 16 / Util perfcap.


You use AB and change your voltage/freq curve? Just flinging that offset and voltage up doesn't work out so well.


----------



## kmac20

No no I greatly appreciate the info. I have watched a lot of his vids but don't remember that. I'll swap the cables as soon as feasible. Really thank you for the info


----------



## perern

Haven't tried editing the curve, just going up 12MHz at a time. Started getting perfcap 4, increased voltage 20% or so. Got further but ended with lots of perfcap 16. I used three different calculators and all of them said I needed at least 800W at that speed
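Those online calculators are essentially doing a sum like this: add up nominal component draws and pad with headroom. A minimal sketch; every wattage and name below is an illustrative assumption, not a measurement of any specific build.

```python
# Illustrative PSU sizing sketch, not a substitute for a real calculator.
# All component wattages are assumed example values.

NOMINAL_DRAW_W = {
    "gpu_1080_overclocked": 250,   # assumed board power with the OC applied
    "cpu": 120,
    "motherboard_and_ram": 50,
    "drives_and_fans": 30,
}

def required_psu_watts(draws, headroom=0.30):
    """Total draw plus headroom, since PSUs shouldn't live at their limit
    and transient spikes exceed the average draw."""
    return sum(draws.values()) * (1 + headroom)

needed = required_psu_watts(NOMINAL_DRAW_W)   # ~585 W with these numbers
```

A CX750M clears that on paper; the 800 W recommendations likely bake in bigger margins for transient spikes and capacitor aging.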


----------



## Vellinious

Your GPU is only going to pull around 300 watts, no matter what speed it runs at, because it has a power limit built into the bios. Most of them have a power limit of 280 watts or less.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> No no I greatly appreciate the info. I have watched a lot of his vids but don't remember that. I'll swap the cables as soon as feasible. Really thank you for the info


usually I tag it for people in my posts...but here you go.






*update*
It may not be 200 points...but it IS an improvement.


----------



## stephenn82

If you are worried about cabling and tidiness, I found a guy who makes custom cables or extensions. I got an extension set from him, super high quality, for 35 bucks shipped. I may need to order two more from him for another GPU when prices come way, way, way down and when I get enough dough to support my 4k HDR monitor...If you are interested in his work, let me know.


----------



## jon666

What monitors are you all running with your GPU's? I'm thinking 1080p ain't cutting it anymore since I picked up my 1080.


----------



## coreykill99

running 2560 x 1080 144Hz. Keep wanting to dip my feet into 4k or something, but I just bought this a year ago. Can't upgrade just yet.
Although I have found great happiness in turning up the render scale in games that allow it. Around 130% is equivalent to 1440p in pixels


----------



## kmac20

1440p 75hz


----------



## Spiriva

Quote:


> Originally Posted by *stephenn82*
> 
> Yes, flip that to the 'slave' or secondary bios. My card is towards the front. It allows more potential for the card. You are running separate power cables to your pci-e pins right? That makes a difference too.


I got the Asus 34" IPS G-Sync pg348q, using 1080 in sli tho. But gonna upgrade to Nvidia "Volta" as soon as they release the Titan version of volta, and then just run a single card instead of sli.


----------



## cole2109

My new Asus Strix. On air at the moment. Voltage stock...
2100-2126MHz

http://shrani.si/?1l/Oh/2MLMep44/1.png


----------



## stephenn82

Not bad there, @cole2109 and thats stock volts? what is that exactly?

I get 2126 hold steady at 1.093v and its fun!

Enjoy the gaming experiences!!


----------



## perern

Quote:


> Originally Posted by *Vellinious*
> 
> Your GPU is only going to pull around 300 watts, no matter what speed it runs at, because it has a power limit built into the bios. Most of them have a power limit of 280 watts or less.


I checked the wattage now. It pulls a maximum of 360W with 3DMark Timespy, what can I do to OC higher?


----------



## Vellinious

Lower temps so you can run lower voltage, or do the shunt mod.
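For context on why the shunt mod mentioned above works: the card infers current from the voltage drop across a tiny shunt resistor, so lowering that resistance makes the same real draw read smaller. A sketch of the arithmetic; the 5 milliohm figure is a commonly cited value for reference 1080s, assumed here rather than verified, and this is an explanation, not a how-to.

```python
# Why the shunt mod raises the effective power limit: I = V_drop / R_shunt,
# and the controller's R value is fixed in firmware. Stacking a second
# resistor in parallel lowers the real R, so readings scale down.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_watts(real_watts, r_stock, r_modded):
    """The controller still assumes r_stock, so its power reading scales
    by r_modded / r_stock."""
    return real_watts * (r_modded / r_stock)

r_stock = 0.005                    # assumed 5 mOhm stock shunt
r_mod = parallel(r_stock, 0.005)   # equal resistor stacked on top -> 2.5 mOhm
seen = reported_watts(360, r_stock, r_mod)   # a real 360 W reads as ~180 W
```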


----------



## perern

Quote:


> Originally Posted by *Vellinious*
> 
> Lower temps so you can run lower voltage, or do the shunt mod.


I ran the fan at 75% the whole time, keeping it under 65C


----------



## SavantStrike

Quote:


> Originally Posted by *Vellinious*
> 
> Lower temps so you can run lower voltage, or do the shunt mod.


Also only do the shunt mod or XOC BIOS if you have low temps.


----------



## cole2109

Quote:


> Originally Posted by *stephenn82*
> 
> Not bad there, @cole2109 and thats stock volts? what is that exactly?
> 
> I get 2126 hold steady at 1.093v and its fun!
> 
> Enjoy the gaming experiences!!


Stock voltage







We'll see results after installing EK FC.


----------



## Vellinious

Quote:


> Originally Posted by *perern*
> 
> I ran the fan at 75% all the time, keeping it under 65C all the time


Yeah, that's not what I mean by "run it cooler". More along the lines of 20c or colder core temp while under full load....the colder the better.


----------



## stephenn82

Ewwww xoc! Go to AB


----------



## Andrew LB

Just picked up an EVGA GTX 1080 Gaming the other day (reference) to replace my recently dead GTX 780 ti, and this card is amazing. Haven't had much time to overclock it much, but I just ran Superposition and got an ok score. I'm gonna ramp up my CPU and see if that improves things. I'd rather hold off on pushing the GPU any further until my Aquacomputer Kryographics full cover water block and the Aquacomputer XCS Active backplate arrive. From what i've read, the active backplate makes a huge difference on the VRM temps.

CPU was running @ 4ghz and the GTX 1080 peaked at 2012mhz and held just below 2ghz throughout the test.


----------



## stephenn82

Quote:


> Originally Posted by *Andrew LB*
> 
> Just picked up an EVGA GTX 1080 Gaming the other day (reference) to replace my recently dead GTX 780 ti, and this card is amazing. Haven't had much time to overclock it much, but I just ran Superposition and got an ok score. I'm gonna ramp up my CPU and see if that improves things. I'd rather hold off on pushing the GPU any further until my Aquacomputer Kryographics full cover water block and the Aquacomputer XCS Active backplate arrive. From what i've read, the active backplate makes a huge difference on the VRM temps.
> 
> CPU was running @ 4ghz and the GTX 1080 peaked at 2012mhz and held just below 2ghz throughout the test.


nice! time to tweak your curve and get that score up!

Speaking of that, I recently reinstalled Windows, went through drivers and got my 1080 rocking in games. I forget which setting in AB allows my card to get to 1.093v. The card maxes out at 1.062v and won't hit the previous clocks that I had before, or 1.093v. Right now, it's set to third party for voltage unlock type. Does it need to be reference? It's been a few months since I found this answer. Time to comb through the many, many pages of this thread.


----------



## kmac20

Ok so tonight/tomorrow I'm going to both run the separate PCIE cable for the 2nd 8pin for better power as well as try that second bios. EVGA FTW2 here

Is there anything i need to do to utilize the second bios aside from flip the switch? Like I said I've never had a card with two bios on it before so just looking to know if there's anything else that has to be done aside from switching the switch to slave.

Thanks everyone for your input. I'm actually getting really high Firestrike Extreme scores currently but not so great regular Firestrike. Anything seem off about that?


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> Ok so tonight/tomorrow I'm going to both run the separate PCIE cable for the 2nd 8pin for better power as well as try that second bios. EVGA FTW2 here
> 
> Is there anything i need to do to utilize the second bios aside from flip the switch? Like I said I've never had a card with two bios on it before so just looking to know if there's anything else that has to be done aside from switching the switch to slave.
> 
> Thanks everyone for your input. I'm actually getting really high Firestrike Extreme scores currently but not so great regular Firestrike. Anything seem off about that?


nope...just power off the PC, flip that switch towards the front of case, and power on and enjoy!


----------



## stephenn82

back at it again...I think this is the sweetest spot for my card:

100% voltage
120% power
Curve with 4 dots before 1093 flat at +110.
+500mhz on vram
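The "curve with 4 dots flat" settings above can be pictured as data: apply an offset up to a chosen voltage point, then hold every higher-voltage point at that same frequency so the card stops requesting more voltage. The numbers below are made-up examples; in practice this is done by dragging dots in Afterburner's Ctrl+F curve editor, not in code.

```python
# Sketch of the "flat curve" trick: raise points up to a target voltage,
# then clamp everything past it so voltage never climbs further.

def flatten_curve(stock_curve, target_mv, offset_mhz):
    """stock_curve: list of (millivolts, mhz) pairs, ascending by voltage."""
    flat = []
    cap = None
    for mv, mhz in stock_curve:
        if mv <= target_mv:
            cap = mhz + offset_mhz     # offset applied up to the target dot
        flat.append((mv, cap))         # dots past the target stay clamped
    return flat

# illustrative stock curve, not any real card's table
example = [(1000, 1911), (1031, 1949), (1062, 1987), (1093, 2025), (1100, 2038)]
flattened = flatten_curve(example, target_mv=1093, offset_mhz=110)
# the 1093 mV dot becomes 2135 MHz and the 1100 mV dot is clamped to match
```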


----------



## kmac20

I'm getting 4388 in Superposition Extreme, but that's with a video playing on my 2nd monitor and without any overclocks (I think, or at least not my usual max for benchmarking). Does that sound right with a video and several tabs open on a 2nd 1280x1024 monitor? I know it's not orthodox to benchmark that way, I just wanted a frame of reference, and thank you @stephenn82 for providing one.

And again, this is before I put a 2nd line to my 2nd 8-pin, without switching BIOS, and while using XOC and such for overclocking. But I think there's only a mild overclock or none at all currently. Does 4388 sound normal-ish for this circumstance, with me watching Avengers and about 4 other tabs open (YouTube included on several)?

I cant wait till tomorrow when I change all this stuff around to see what i get. Might even use afterburner to overclock with the 2nd bios.


----------



## hotrod717

Easiest, cheapest way to get low, low temps: A/C v1.7. After multiple ways of ducting to the case, this is the most simple and effective. Insulation of some sort would be the most efficient means of getting it there, but it does the job. Down to 8°C. $150 A/C FTW.


----------



## coreykill99

started playing with the curve for the first time. Going... alright, I guess. Having some more issues: my card doesn't seem to like stepping, it only goes in jumps.
Stable at 2138MHz from 1.043v all the way to 1.081v; between those voltages it doesn't seem to want to make the next jump to 2151. 2151 isn't happy until 1.093v.
But this is what I have. Any tips on breaking the 4700 mark? 2151 is also my max it seems. 2163, I think it was, let me run the test but scores fell through the floor to ~4500


----------



## nolive721

Can you post a pic of your voltage curve? Just curious cause your clocks are high but I scored above 4800points with similar ones
You can see earlier in this thread

My card is Zotac amp extreme+ with beefier memory though


----------



## stephenn82

Well, this is strange...

I recently reinstalled windows, the latest Nvidia driver, MSI AB (set to third party voltage control) and have all of the same settings as previous install. My machine wants to run 2139 at 1.062v and gives me the 'Voltage Limit' warning. I know its ok for this, but it NEVER gave me issues at all. Ever.
What am i doing wrong here? Is it something with the 1709 build of windows? Something with 388.XX new driver? My card wont go to 1.093v at all. It used to ramp to 1.093 and hold 2126 and NEVER give me a voltage limit. Not when folding or gaming. Now, it gives me the VRel cap just minutes after gaming.

I have GTX 1080 FTW Hybrid.


----------



## Vellinious

Quote:


> Originally Posted by *coreykill99*
> 
> started playing with the curve for the first time. Going... alright, I guess. Having some more issues: my card doesn't seem to like stepping, it only goes in jumps.
> Stable at 2138MHz from 1.043v all the way to 1.081v; between those voltages it doesn't seem to want to make the next jump to 2151. 2151 isn't happy until 1.093v.
> But this is what I have. Any tips on breaking the 4700 mark? 2151 is also my max it seems. 2163, I think it was, let me run the test but scores fell through the floor to ~4500


Run colder, try for a higher memory clock.


----------



## Sev501

I just upgraded from gtx 970 g1 to an Aorus gtx 1080 rev 2. Is the Aorus graphics engine good enough to oc? Or should I just stick with After burner?

I just want to bump ram to match the new models with 11gbps. Any guidance will be accepted. Thanks.


----------



## stephenn82

Quote:


> Originally Posted by *stephenn82*
> 
> Well, this is strange...
> 
> I recently reinstalled windows, the latest Nvidia driver, MSI AB (set to third party voltage control) and have all of the same settings as previous install. My machine wants to run 2139 at 1.062v and gives me the 'Voltage Limit' warning. I know its ok for this, but it NEVER gave me issues at all. Ever.
> What am i doing wrong here? Is it something with the 1709 build of windows? Something with 388.XX new driver? My card wont go to 1.093v at all. It used to ramp to 1.093 and hold 2126 and NEVER give me a voltage limit. Not when folding or gaming. Now, it gives me the VRel cap just minutes after gaming.
> 
> I have GTX 1080 FTW Hybrid.


Definitely...brand new driver, even with HIGHER settings (and keeping my card at 41c max too!), LOWERED my score. I was netting like high 4600s. I scored a measly mid 4400s. My card does 4400 stock with just upping the power limit..lol

new driver and +120 core +450 memory 100% volts 120% power (card only hit 85% power total)


Old setup with +100 on core, +450 memory 100% volts 120% power....card usually was a little higher on power.


I also noticed that the voltage on GPU wants to stay lower as well. My card ran 2153 at 1.071v...the old setup was rock solid 2126 at 1.093. Huh...going to find the driver from I think October? Testing this again.


----------



## stephenn82

Quote:


> Originally Posted by *Sev501*
> 
> I just upgraded from gtx 970 g1 to an Aorus gtx 1080 rev 2. Is the Aorus graphics engine good enough to oc? Or should I just stick with After burner?
> 
> I just want to bump ram to match the new models with 11gbps. Any guidance will be accepted. Thanks.


I tried my hand at EVGA X-OC...supposedly epic...but it has its quirks.

You can try Giga's stuff...but AB just works. Remember to unlock all the settings when you install it, and set your voltage unlock to "third party" and you will be good to go.


----------



## stephenn82

Ok...

I am back at 2126mhz and +450 on memory with 120% power allowed (funny, card has barely even hit 90% in anything as of yet) and on LESS voltage slider. I ran 80% and core would top out at 1093mv. Now, I am at 40% and it holds at 1062mv at same clocks.


----------



## kmac20

Its not uncommon for cards nowadays to do well with less voltage. Especially certain generations of cards. I'm pretty sure pascal cares more about being cold than it does getting slightly more voltage.


----------



## Vellinious

Quote:


> Originally Posted by *kmac20*
> 
> Its not uncommon for cards nowadays to do well with less voltage. Especially certain generations of cards. I'm pretty sure pascal cares more about being cold than it does getting slightly more voltage.


That would be 100% correct. You can add extra voltage, but it's a lot less effective without first lowering the clock temps significantly.
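The "cold beats voltage" behavior can be modeled roughly: Pascal's GPU Boost sheds clock in ~13 MHz bins as the core heats up. The 5°C step and ~37°C starting point below are common community estimates rather than an NVIDIA spec, so treat this strictly as an illustration.

```python
# Back-of-the-envelope model of Pascal boost-bin loss with temperature.
# BIN_MHZ / STEP_C / START_C are assumed community figures, not a spec.

BIN_MHZ = 13     # size of one boost bin
STEP_C = 5       # assumed temperature step per lost bin
START_C = 37     # assumed temperature where bins start dropping

def boost_clock(cold_peak_mhz, core_temp_c):
    """Estimate the sustained clock from the cold peak clock and core temp."""
    if core_temp_c <= START_C:
        return cold_peak_mhz
    bins_lost = (core_temp_c - START_C) // STEP_C
    return cold_peak_mhz - bins_lost * BIN_MHZ

# e.g. a card that peaks at 2139 MHz cold might hold ~2113 MHz at 47 C,
# which is why sub-ambient cooling pays off more than extra voltage
```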


----------



## stephenn82

Yeah, I know. Mine sits at 43c so thats pretty good. I need a full loop, i know. I am really considering buying an open box EK block for my card..its 62 bucks at Micro Center. The brand new one is 72...as its the first gen model and being phased out. Back plates are also 4.99. Is an EK backplate better than EVGA one? It looks solid...no air vents, etc.


----------



## stephenn82

oh, and with the lowered voltage I get the 'voltage limit' all the time now. So the card drops to 2114 vice 2126 and temps are like 41-43c. Must....Make....COLDER!! I guess fans manually at 70%? Maybe 80%? I don't really hear them at 80%


----------



## stephenn82

Quote:


> Originally Posted by *hotrod717*
> 
> Easiest, cheapest way to get low, low temps: A/C v1.7. After multiple ways of ducting to the case, this is the most simple and effective. Insulation of some sort would be the most efficient means of getting it there, but it does the job. Down to 8°C. $150 A/C FTW.


Nice work!

For some reason, it made me think of this:


----------



## Gen Patton

Try a 1440 monitor you will see a lot of improvement.


----------



## w1n1x

Hello guys ! I'm experiencing some stutters in games like GTA V with my EVGA ACX 3.0 GTX 1080 once the temperature hits 83 C even though I set the temperature limit to 90 C . My case is the Zalman Z9 Neo using the original case fans that were provided with it. My fps drops down from 80-90 to suddenly 49-55 and I experience heavy stutters because of it. Is it throttling by any chance? Is there any way to turn off the throttling or should I try increasing the fan speeds on the card? I already set it to be at 80% when reaching 80 C and it's already pretty loud.








Maybe a new paste is needed instead of the original?


----------



## pez

Airflow in that case is not too ideal for an aircooled card. What is your current fan setup like in the case? I think I was running my 1080Ti ACX (non-iCX sensors one) at around 70-80% fan and hitting around a max of 75 I think (in a much tinier case).


----------



## w1n1x

Quote:


> Originally Posted by *pez*
> 
> Airflow in that case is not too ideal for an aircooled card. What is your current fan setup like in the case? I think I was running my 1080Ti ACX (non-iCX sensors one) at around 70-80% fan and hitting around a max of 75 I think (in a much tinier case).


I see. Which kind of cases do you recommend with good airflow under $100? I thought about NZXT S340 Elite. Is that good enough?
My fan setup:


----------



## stephenn82

Perhaps Superposition IS suspect...IDK why though. Same clocks, runs fine in TimeSpy

https://www.3dmark.com/compare/spy/2931734/spy/2931782/spy/2931835


----------



## kmac20

For reference this was my last score in timespy. I dont remember if its my highest but its definitely up there, above 8200

https://www.3dmark.com/spy/2757337

Exact score is 8253, with a graphics score of 8231 and a CPU score of 8384.

GPU speeds listed are 2063/1453. CPU speed is 3800.


----------



## Gen Patton

Rosewill has a good case, I have the Stryker. It's ok and it's around $50.00


----------



## nolive721

Quote:


> Originally Posted by *w1n1x*
> 
> I see. Which kind of cases do you recommend with good airflow under $100? I thought about NZXT S340 Elite. Is that good enough?
> My fan setup:


https://www.aerocool.com.tw/en/project7/case-p7/p7-c1

I've had that one since August. It's a beauty and cost me the equivalent of only 60USD over here in Japan; it was a lightning offer at Amazon.

It's been a breeze, literally, thanks to the open front mesh, side and top vents, and fan options, letting my ZOTAC 1080 Amp Extreme Plus and my Ryzen 1500X CPU run cool.

highly recommended


----------



## nolive721

oh I forgot to mention, I have the tempered Glass edition, a beauty really.....

other topic: can ZOTAC AMP Extreme owners here confirm how you cope with the card sag? Because oh boy, that thing is heavy!! I was meaning to show a picture of what I did, but the solution I came up with is actually a bit embarrassing, so you guys first....


----------



## w1n1x

Quote:


> Originally Posted by *nolive721*
> 
> https://www.aerocool.com.tw/en/project7/case-p7/p7-c1
> 
> I've had that one since August. It's a beauty and cost me the equivalent of only 60USD over here in Japan; it was a lightning offer at Amazon.
> 
> It's been a breeze, literally, thanks to the open front mesh, side and top vents, and fan options, letting my ZOTAC 1080 Amp Extreme Plus and my Ryzen 1500X CPU run cool.
> 
> highly recommended


Thank you for the tip. I forgot to remove the cover on top of my case, which didn't let the warm air out. My bad, lol. That was a rookie mistake on my part and I'm ashamed of it








Now my temps went down to 76°C. I've never seen it go above that with 75% fan speed. I guess now that the air can get out with ease, the temps went down by a good 8-10°C, since I saw it go above 84°C before.
Btw I will definitely swap my case for a better one. The one you mentioned looks dope. I might as well get that next year.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> For reference this was my last score in timespy. I dont remember if its my highest but its definitely up there, above 8200
> 
> https://www.3dmark.com/spy/2757337
> 
> Exact score is 8253, with a graphics score of 8231 and a CPU score of 8384.
> 
> GPU speeds listed are 2063/1453. CPU speed is 3800.


how on earth did you get that ram up to 11.6gbps? Is this the newer model with 11gbps ram stock?


----------



## kmac20

Quote:


> Originally Posted by *stephenn82*
> 
> how on earth did you get that ram up to 11.6gbps? Is this the newer model with 11gbps ram stock?


Yes it is. Sorry if that wasn't clear. Maybe this is why I'm always wondering why other people seem to get higher + on their memory than myself.

I still don't know why my Firestrike Extreme and Timespy scores are this good but regular Firestrike seems incredibly low comparatively.
Quote:


> Originally Posted by *w1n1x*
> 
> Thank you for the tip. I forgot to remove the cover on top of my case, which didn't let the warm air out. My bad, lol. That was a rookie mistake on my part and I'm ashamed of it
> 
> 
> 
> 
> 
> 
> 
> 
> Now my temps went down to 76°C. I've never seen it go above that with 75% fan speed. I guess now that the air can get out with ease, the temps went down by a good 8-10°C, since I saw it go above 84°C before.
> Btw I will definitely swap my case for a better one. The one you mentioned looks dope. I might as well get that next year.


Just FYI I have an s340 elite and love it and it gets pretty good airflow. With my fans up to like 70-80 it NEVER goes above 50 and rarely goes above 45. Unless I'm slamming it with like p95 and furmark at the same time.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> Yes it is. Sorry if that wasn't clear. Maybe this is why I'm always wondering why other people seem to get higher + on their memory than myself.


I guess FTW2 explains it...hahahah

Others higher than yours? I'm rocking 5454 on my ram speed...10908MHz. You have me beat, sir. lol

according to this, its only 10010mhz stock like mine.
https://www.evga.com/products/product.aspx?pn=08G-P4-6686-KR

so I question again, how did you get your memory up that high?

I guess we should have all looked at the Classified version with beefier power phases at 14+3 vice 10+2...c'est la vie!


----------



## kmac20

When I say more than mine I mean, in retrospect, now that you pointed it out to me, that they were overclocking higher than their stock. Since my stock is higher, I now know why others are able to get like +500, which seemed ridiculous to me when +300-350 was my range.

I actually could go higher than what I did on that run, I did not have my memory pushed as far as I could. But it seems as if my scores start to go lower after a certain point even if it doesn't crash, which I would assume has to do with error correction or some other specific I'm not aware of.

As for how I did it, silicon lottery maybe? I use XOC, not Afterburner; maybe that's impacting it differently since it's made for EVGA cards and I have an EVGA card?

The FTW2 has really good power delivery. 10+2 is good, especially compared to 5+1 (like wow, that blows). Not the best but really good. It's iCX too so maybe that has some impact.

But what's the number you multiply the 1860 by, is it 5 or 6? I forget these things sometimes when I'm out and about, not sitting at my PC.

If it's 6 like I think isn't 1860*6=11160?

I can't remember all the multiples with the new cards. My last card before this was a 7850 and before that an 88GT


----------



## stephenn82

I tried using XOC for my card. I couldnt overclock my ram at all. not even 20mhz without crashing my system. AB allows me to push it up to 525 before issues.

and your power delivery, if anything like mine, is actually 5+2...they just double the 5 to call it 10. It's more stable and runs way cooler...but it's a doubled 5.
https://www.gamersnexus.net/hwreviews/2799-evga-gtx-1080-ftw2-sc2-icx-in-depth-thermal-analysis?showall=1

The RAM speed is 8 times the base clock. So 11600 nets you 1450; times 4 gives you what the software shows: 5800. Mine is 5454, or a ~1363 single rate speed. 11600 vs mine at just 10908.
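The three ways GDDR5X speed gets quoted in this back-and-forth can be put in one place. The x4/x8 relationship is the standard GDDR5X convention; the function and key names below are mine, not from GPU-Z's or Afterburner's APIs.

```python
# Converter between the three GDDR5X speed conventions traded in this thread.

def gddr5x_speeds(base_mhz):
    """From the base memory clock (what GPU-Z's sensor tab shows), derive
    the figure Afterburner reports (x4) and the 'effective' marketing
    rate (x8, i.e. the Gbps number times 1000)."""
    return {
        "base_mhz": base_mhz,
        "afterburner_mhz": base_mhz * 4,
        "effective_mhz": base_mhz * 8,
    }

stock = gddr5x_speeds(1251)       # stock 10 Gbps card -> effective ~10008 MHz
oc = gddr5x_speeds(11600 / 8)     # an 11.6 Gbps readout -> AB shows 5800
```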


----------



## kmac20

Well they must be doing something right with the power delivery because for a few weeks before I got this 1080 I had a evga 1060 which was only 5+1 (listed as 5+1) and that thing did not overclock AT ALL. I'm talking +25/+40 on XOC.

Silicon lottery I guess? And GPU-Z will list it at like 1753, but isn't it listed on EVGA as 1860? Am I confusing some numbers here? Seems to be a lotta different ways today to list these GPU memory speeds lol

And yeah in xoc I get 5843 or something like that. Maybe 5863 I can't remember off the top of my head. But definitely above 5800


----------



## w1n1x

Quote:


> Originally Posted by *kmac20*
> 
> Just FYI I have an s340 elite and love it and it gets pretty good airflow. With my fans up to like 70-80 it NEVER goes above 50 and rarely goes above 45. Unless I'm slamming it with like p95 and furmark at the same time.


Do the fans that came with that case suffice or did you replace them with more efficient ones? How did they perform?
It must be pretty noisy with 70-80% as I know that first-hand with my own card,lol.


----------



## kmac20

I replaced them. I still use one fan that came with the case and it's fine. To be honest I don't notice the noise at all.

The noise I've been noticing lately is that one of my case fans somewhere got a bit loose and has been rattling oh so slightly. I don't mind when the fans are spinning, it's not bad, but those slight variations bother me more as they come and go instead of being constant.


----------



## stephenn82

Quote:


> Originally Posted by *kmac20*
> 
> Well they must be doing something right with the power delivery because for a few weeks before I got this 1080 I had a evga 1060 which was only 5+1 (listed as 5+1) and that thing did not overclock AT ALL. I'm talking +25/+40 on XOC.
> 
> Silicon lottery I guess? And GPUZ will list it at like 1753, but isn't it listed on evga as 1860? Am I confusing some numbers here? Seem to be a lotta different ways today to list these GPUmemory speeds lol
> 
> And yeah in xoc I get 5843 or something like that. Maybe 5863 I can't remember off the top of my head. But definitely above 5800


try it again on that 1060...but use AB.


----------



## kmac20

I sold the 1060 and I also had used afterburner on it hoping to get more and I didn't. Just did not OC. Lottery loser that card was.


----------



## AlbertoM

Has anyone had better performance with the Strix XOC BIOS, or any other BIOS?

I soon will be putting my FE 1080 on water and wondering if it's worth a Bios flash for higher overclock for gaming.

Tks


----------



## Vellinious

Quote:


> Originally Posted by *AlbertoM*
> 
> Has anyone had better performance with the Strix XOC BIOS, or any other BIOS?
> 
> I soon will be putting my FE 1080 on water and wondering if it's worth a Bios flash for higher overclock for gaming.
> 
> Tks


You don't want to do that for just gaming. Set a custom voltage / frequency curve with the watercooling, and be happy.


----------



## AlbertoM

Quote:


> Originally Posted by *Vellinious*
> 
> You don't want to do that for just gaming. Set a custom voltage / frequency curve with the watercooling, and be happy.


Thanks for the tip.

I finally got to make the voltage go to 1.093v using the curve editor, and I think it will be more than enough for overclocking on water.


----------



## stephenn82

Quote:


> Originally Posted by *AlbertoM*
> 
> Thanks for the tip.
> 
> I finally got to make the voltage go to 1.093v using the curve editor, and I think it will be more than enough for overclocking on water.


Keeping your card cool, it may not need to go to 1.093. My ambient temps have dropped in my area (it's been 16-20°F lately) and my basement temps have lowered to maybe 65°F. My 1080 FTW Hybrid might max out at 1.086 volts and it has lowered to a max of 41c compared to 46c before. Keep it cool and volts will stay low.


----------



## Vellinious

Quote:


> Originally Posted by *stephenn82*
> 
> Keeping your card cool, it may not need to go to 1.093. My ambient temps have dropped in my area (its been 16-20f lately) and my basement temps have lowered to maybe 65. My 1080 ftw hybrid might max out at 1.086 volts and it has lowered to max of 41c compared to 46c as before. Keep it cool and volts will stay low.


This...with lower temps, you'll be able to maintain higher clocks at lower voltages.


----------



## stephenn82

I learned that from V, much long ago in this very thread


----------



## AlbertoM

Nice to know guys... With low temperatures on water, what frequency and voltage can I expect to set on my 1080?

Also, with the Hybrid kit can I push memory OC further?

On air I set 200+ core and 450+ memory on my FE, that gives around 2100 MHz, voltages default to 1.063v max, mostly 1.050v.


----------



## kmac20

Having finally gotten around to using a 2nd dedicated PCIE power cable to my 1080, I was able to get the memory to an INSANELY HIGH AMOUNT.

Core I couldn't really push THAT much farther. Maybe an extra +10 or so in Precision. But memory I got to like, crazy high.

Over 11100 firestrike extreme, #13 out of all those using a 1700/1080 combo

So 1475*8 would be 11800 on the memory if I'm remembering correctly

Still have yet to try the 2nd BIOS.


----------



## Dikonou

Just ordered an EVGA 1080 FTW2....and came upon this on their official site.

https://www.evga.com/articles/01104/evga-geforce-gtx-1080-ftw2-sc2-11gbps/

Has anyone done that bios update?


----------



## SavantStrike

Quote:


> Originally Posted by *Dikonou*
> 
> Just ordered an EVGA 1080 FTW2....and came upon this on their official site.
> 
> https://www.evga.com/articles/01104/evga-geforce-gtx-1080-ftw2-sc2-11gbps/
> 
> Has anyone done that bios update?


Not me personally but I've heard some cards are unstable on it. The update just adds a healthy oc to 10gbps RAM.


----------



## Dikonou

Ok...... thanks for the answer!!


----------



## ThrashZone

Hi,
The 1080 Classified vBIOS update, from what I read, only changes the default fan curve, so I never applied it.
I've always used a custom fan curve, or full blast when pushing it hard.


----------



## kmac20

Quote:


> Originally Posted by *Dikonou*
> 
> Just ordered an EVGA 1080 FTW2....and came upon this on their official site.
> 
> https://www.evga.com/articles/01104/evga-geforce-gtx-1080-ftw2-sc2-11gbps/
> 
> Has anyone done that bios update?


Wait, what BIOS update? Wasn't it already done? I'm confused; I see the link and I'm checking it out right now.

I have an FTW2. I just posted a 3DMark score, so you can see my results and speeds. My memory speed is insanely high. If this would push the memory even further, that would be absolutely nuts.


----------



## stephenn82

Quote:


> Originally Posted by *SavantStrike*
> 
> Not me personally but I've heard some cards are unstable on it. The update just adds a healthy oc to 10gbps RAM.


Will the FTW2 bios work on the regular FTW model? I have a Hybrid version...and soon will go full EK block...when I piece together my loop


----------



## kmac20

Oh bro, that update seems to be from APRIL. If you just bought the card I'm going to assume it already has it. I'll check right now, since I bought my FTW2 about 2-3 months ago, well after April. Let's see; I'll edit this in a minute after I check GPU-Z.

Edit: I wish they listed the BIOS versions inside the zip files so I could actually compare. The BIOS on my card is:

86.04.66.00.80

But I have no idea whether that is the original or a newer one.

If some people are saying it's unstable, I'd maybe avoid it? Because as you saw, I was able to get my memory well above 11000. I don't know; I wish they were more specific.

Is the original BIOS before the / and the newer one after? Why does the GPU-Z listed BIOS have zero letters in it? If this is the original it should be one of those, not both. It might also be the later one, since it's .66 in the third field. I don't know, this seems a bit weird. I wish they were more specific.

Maybe I'll give it a shot. The card has dual BIOS, so if I save the current one via GPU-Z and the flash causes issues, I can always reflash back without much worry. If this gives the usual boost, I wonder if I'll be able to push it the same amount PAST the new speed and hit something like 12000 on memory, which would be INSANE.

DOUBLE EDIT: in the Advanced tab in GPU-Z, under NVIDIA BIOS, it lists the build date as 2017-04-05, which would imply mine is from RIGHT BEFORE this new one. Wow, that's frustrating.


----------



## Dikonou

Sorry for the upset, guys! Yes, you are right, and I'm reading every article I can find. From what I've read so far on the EVGA forum, they said on 22 May that all new shipments would have the update applied.
I saw the article but didn't do my research properly!

Sorry again!

If you have in CPU-Z

GPU clock 1721, *memory 1376*, boost 1860

then you already have the update!


----------



## kmac20

Reread my last post, because I made a few edits. At this point I just don't know which version I have. They don't make the BIOS revisions easy to date: they seem to like adding letters that previously didn't exist, randomly changing numbers, etc., instead of just making newer BIOS = higher number.

Again, my BIOS is listed as the April 5th version, and this was posted on the EVGA forum around April 17th or 20th. But the BIOS could have been OUT BEFORE THEN on the FTW2s and just not posted on the forum. And once again the version history isn't clear because of the way they name their BIOSes.

I made a post in that thread on the EVGA forums; we'll see what answers I get.

Regardless, I've already got my card a bit past 11 on the memory. So my question here is this: I've got my memory running at 11800 for benching. Would this update allow me to push it FURTHER than that? Or would it just mean the default memory is 11 instead of 10, and I'd still only be able to push it to 11800? If that's the case I have no desire to update the BIOS unless it makes things more stable.

Not sure yet. Very confused by how they name BIOS revisions. Here's a letter, here's a random different letter, here's a number where the letter was, and here's NO LISTING of the BIOS numbers in the downloads!


----------



## stephenn82

I bought my card second hand from a coworker. All he did was adjust the fan curve and change the LED colors via XOC; he didn't overclock it at all. He said the stock OC was good enough (it would hit 2026 or 2114 depending on temps) and just played a LOT of R6:S, hahaha. He bought it back in summer 2016, when it released for 749.99, and ended up getting a Ti SC Hybrid.

So...update it? Or leave it?


----------



## kmac20

I think you mean GPUZ.

And right now it says, for DEFAULT: 1607/1376, and 1734 for boost.

My current overclock (NOT my max one with the memory super high as before; just one I know is stable in 99% of the games I run): 1752/1451, boost 1879.

Guess it's updated already! I really wish EVGA made the BIOS revisions easier to put in sequence, BECAUSE GPU-Z also lists this BIOS as released 4/5/2017, and the post on the EVGA forum is about two weeks after that, something like 4/17/17. So I guess it was released so other cards could update?

Very confusing.

Guess I'm good though! So pushing the memory to 11800 is about as good as I could get it. With this listed clock it's about 11600.


----------



## SavantStrike

Quote:


> Originally Posted by *stephenn82*
> 
> Will the FTW2 bios work on the regular FTW model? I have a Hybrid version...and soon will go full EK block...when I piece together my loop


I would make sure your RAM is stable with a 1 GHz overclock first, and then yes, it should work. I don't think they changed anything else on the card.


----------



## Dikonou

Quote:


> Originally Posted by *kmac20*
> 
> I think you mean GPUZ.
> 
> And right now it says for DEFAULT: 1607/1376 and 1734 for boost.
> 
> My current overclock (NOT MY MAX ONE WITH THE MEMORY SUPER HIGH AS BEFORE, this is just a one i know is stable for 99% of games i run): 1752/1451/ boost 1879
> 
> Guess its updated already! Really wish EVGA made the bios revisions more....easily understandable in terms of sequence. BECAUSE GPUZ also lists this BIOS as being released 4/5/2017 and the post on the EVGA forum is about 2 weeks after that, something like 4/17/17, so yeah. I guess it was released for other cards to update?
> 
> Very confusing.
> 
> Gues im good though! So pushin that memory to 11800 is about as good as I could get it. With this listed clock its about 11600


Yeap seems it already has the update installed!!!!


----------



## stephenn82

Quote:


> Originally Posted by *SavantStrike*
> 
> I would make sure your RAM is stable with a 1GHZ over clock first, and then yes it should work. I don't think they changed anything on the card itself otherwise.


If they changed nothing on the card, the RAM should clock up, right? I can try it out; just push +500 to it? I run mine all the time at +450 with no issues. No sense in taking more heat/power for an extra 50 MHz. We will see.

SO... it turns out my running at +450 is actually only boosting the base clock by 56 MHz; it's 8-timed, lol. I see 1307 as my actual RAM speed, and stock is an actual 1251. I've got to bump this baby up by at least 992.
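The observation above (a +450 Afterburner offset only moved GPU-Z's base memory clock from 1251 to 1307) implies the offset gets divided by the same ×8 factor before it shows up on the base-clock readout. A sketch of that arithmetic using stephenn82's numbers; the /8 relation is the one described in the post, not an official spec:

```python
# How an Afterburner memory offset maps onto GPU-Z's base-clock
# readout, per stephenn82's numbers: stock base 1251 MHz, +450
# offset observed as 1307 MHz. Assumes the /8 relation above.
def base_clock_after_offset(stock_base_mhz: float, offset_mhz: float,
                            factor: int = 8) -> float:
    return stock_base_mhz + offset_mhz / factor

print(round(base_clock_after_offset(1251, 450)))  # -> 1307
```

This also explains the "at least 992" figure: to raise the base clock from 1251 to the 1375 MHz that 11 Gbps needs, you'd want an offset of (1375 - 1251) × 8 = 992.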


----------



## SavantStrike

Quote:


> Originally Posted by *stephenn82*
> 
> if they changed nothing on the card, the ram should clock up, right? i can try it out. just push the +500 to it? I run mine all the time at +450 no issues. No sense of getting more heat/power use for an extra 50mhz. We will see.
> 
> SO...it turns out my running at +450 is actually only boosting it by 56mhz..its 8 timed..lol I see 1307 in my actual ram speed. stock is an actual 1251. I gotta bump this baby up by at least 992.


Are you running in a low power state? That messes up RAM clocks. +1000 is a crap ton if you haven't gotten there yet, so take it slowly.

My guess is the bios loosens timings at the same time it bumps clock speed and memory voltage. Not every 10gbps module can run at 11.


----------



## stephenn82

Quote:


> Originally Posted by *SavantStrike*
> 
> Are you running in a low power state? That messes up RAM clocks. +1000 is a crap ton if you haven't gotten there yet, so take it slowly.
> 
> My guess is the bios loosens timings at the same time it bumps clock speed and memory voltage. Not every 10gbps module can run at 11.


Good to know... maybe I won't pursue this. I am happy with 10500, lol.


----------



## kmac20

For some reason that isn't totally accurate in GPU-Z; it doesn't show my actual core boost. When I have +162, the actual boost is 2067. I don't know why GPU-Z doesn't show that accurately.

SO with my MAX overclock i've used for benching my speeds are

2067 core and 5900 memory


----------



## Xrc6

Does anyone know if an i7-4770 will bottleneck a GTX 1080?


----------



## stephenn82

Quote:


> Originally Posted by *Xrc6*
> 
> Does anyone know if an i7-4770 will bottleneck a GTX 1080?


I don't believe so.

A G4560 didn't bottleneck a 1080 Ti.


----------



## kmac20

Doubtful except in very few games


----------



## stephenn82

I think it will bottleneck it if you are running at say, 720p? hahahah


----------



## akromatic

Does anyone have experience mounting an EK Thermosphere on an EVGA 1080 SC?

I saw some pics where they mounted one while keeping the front plate installed, but I'm struggling to do the same.

The 4 nubs on the front plate interfere with the G80 bracket, preventing the copper base of the GPU block from contacting the GPU die.

ref:http://www.overclock.net/t/1601288/official-nvidia-gtx-1080-owners-club/940#post_25249350


----------



## microchidism

Just eyeballing it, it looks like you would have to either cut or use a shim.

Edit: if you scroll further down on the page you linked, they had to cut to get it to work.


----------



## chiknnwatrmln

Hi, I've been out of the loop for a while. Did anyone figure out a way to safely increase voltage past the stock BIOS limits on reference cards? Thanks


----------



## stephenn82

Anyone on 1607 (14393.2007) rolled to 390.65? How is performance? I'm still on 388.31 and considering switching to the new one.


----------



## nolive721

If there are any ZOTAC card owners here:

I'm having a weird problem with my AMP Extreme+. When Windows starts, the card's fans stay stopped, as per the card's zero-fan feature, and don't kick in until the GPU reaches 60 C, say when I'm doing some heavy gaming or benchmarking.

Then if I stop gaming or benchmarking, the card cools down and the fans stop when the temp drops below 60 C.

So far, so good.

But if Windows enters sleep mode and I wake it again, the fans spin even though the GPU temp is far below the 60 C threshold. Granted, at very low speed, but why is this happening?

The workaround I found is to set the fans to manual, give them a spin for even just a few seconds, and then after going back to auto they stop spinning as they should, since the GPU temp is below 60 C.

Has anybody experienced this problem and found a more straightforward fix?


----------



## nolive721

Ignore my post.

I reinstalled Firestorm and the problem seems to have gone.

I noticed ZOTAC released a new version of the software, which is a serious regression from the 2.0 version:
- No voltage control (which works in 2.0 and slightly increases the stable boost frequencies)
- No function to download/update the vBIOS
- No control of SPECTRA (the RGB LEDs on Zotac cards)
- No UI transparency or always-on-top function (which can be useful in some games)
- Completely missing any Help

I do not recommend it at all.


----------



## Sweetcheeba

For anyone interested or unsure - the updated FTW2 bios with 11GHz mem works fine on my FTW (first generation with ACX cooler)

I have kept my master bios as is and have only updated the slave bios with the +130% PT 11GHz.

Use the latest version of nvflash and this .rom: https://www.techpowerup.com/vgabios/191610/191610 (it says it's for the DT model, but that's just a typo and can be ignored).
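For anyone who hasn't flashed before, the procedure being described usually looks something like the sketch below. This assumes a recent nvflash build; the ROM filename is a placeholder for whatever you downloaded, and the `-6` switch (which overrides the subsystem-ID mismatch prompt a cross-model flash triggers) may not be required on every version. Back up first, close all overclocking tools, and make sure the card's dual-BIOS switch is set to the slave position you intend to overwrite:

```shell
# Save the currently selected BIOS before touching anything.
nvflash --save ftw_slave_backup.rom

# Disable flash write protection, then flash the downloaded ROM.
# -6 overrides the board/subsystem ID mismatch prompt that a
# cross-model flash typically triggers.
nvflash --protectoff
nvflash -6 ftw2_11gbps.rom
```

If anything goes wrong, flip the BIOS switch back to the untouched master, boot, and reflash the backup.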


----------



## outofmyheadyo

How many degrees on the core do you watercooled guys get on a 1080? I have a feeling something's wrong with mine. It's running stock, and I haven't opened it up since buying it. Even my 1080 Ti didn't get that hot while power-modded and overclocked. My water is 30 degrees and the core hits 52. I have a feeling those MSI idiots didn't remove the plastic from the block, like we saw in one thread over here. It's the MSI Seahawk EK X; guess it's time to take the loop apart again.


----------



## stephenn82

Sweetcheeba said:


> For anyone interested or unsure - the updated FTW2 bios with 11GHz mem works fine on my FTW (first generation with ACX cooler)
> 
> I have kept my master bios as is and have only updated the slave bios with the +130% PT 11GHz.
> 
> Use the latest version of nvflash and this .rom https://www.techpowerup.com/vgabios/191610/191610 (It says it's for the DT model but is just a typo and should be ignored)


So you don't have to OC your VRAM to 11 GHz for it to work? I heard you had to get the clocks up first and then flash. If that's the case I'll do it, since the RAM will hit that speed; I might have to slide it down -50 afterwards though. Plus, if the RAM defaults to that speed, mining would be nice! Right now it runs at 4954 instead of the default 5005.



outofmyheadyo said:


> How many degrees on the core do you watercooled guys get on the 1080? I have a feeling somethings wrong with mine it's runnin stock, havent opened it up after buying it, even my 1080ti didnt get that hot while powermodded and overclocked, my water is 30 degrees and core hits 52, i have a feeling those msi idiots didnt remove plastic from the block like we saw on one thread over here. It's the MSI seahawk ek x quess its time to take the loop apart again.


That sounds odd. It's a regular hybrid 1080, right, the MSI one? Supposedly my EVGA FTW Hybrid has the "worst" cooler of the two, but my card idles at 24 C on the core and maxes at 43 C gaming, hitting 51 C mining Bulwark and 48 C on Garlicoin. I think I'll stop once I hit about 50 GRLC and only do a little Bulwark, but only after my full loop is done, to keep those temps down.


----------



## Sweetcheeba

stephenn82 said:


> So you dont have to OC your VRAM to 11ghz for it to work? I heard that you had to get the clocks up and then flash. If this is the case, I will do it as the RAM will hit that speed. I might have to slide it down -50 afterwards though. Plus, if ram defaults to that speed, mining would be nice! Right now it runs at 4954 instead of the default 5005.


I think someone may have suggested overclocking your VRAM to 11 GHz manually, just to make sure the card can do it and is stable before you flash the updated BIOS.

Going by what I’ve read so far, you’d need to be pretty unlucky to have a 1080 that won’t do 11 GHz. Quite a few people have flashed the updated SC2 BIOS (same 11 GHz VRAM clock) to reference cards without any issues.

What card do you have?


----------



## Sweetcheeba

outofmyheadyo said:


> How many degrees on the core do you watercooled guys get on the 1080? I have a feeling somethings wrong with mine it's runnin stock, havent opened it up after buying it, even my 1080ti didnt get that hot while powermodded and overclocked, my water is 30 degrees and core hits 52, i have a feeling those msi idiots didnt remove plastic from the block like we saw on one thread over here. It's the MSI seahawk ek x quess its time to take the loop apart again.


40 C seems to be about the norm for a full-cover EKWB block; 52 C does seem a little high. Is your pump running OK? Maybe a repaste would be worth a try?


----------



## stephenn82

Sweetcheeba said:


> I think maybe someone has suggested overclocking your VRAM to 11ghz manually, just to make sure the card card can do it and is stable before you flash the updated bios.
> 
> Going by what I’ve read so far, you’d need to be pretty unlucky to have a 1080 that won’t do 11ghz. Quite a few have flashed the updated SC2 bios (same 11ghz VRAM clock) to reference cards without any issues.
> 
> What card do you have?


I have an EVGA 1080 FTW Hybrid. I had it up to 11 Gbps, actually a touch over. It benches with some success; gaming is a whole different story. The Division didn't like it, but BF1 is more tolerant. I normally keep it at +450 so it runs at 10910 MHz, close to 11000 MHz.



Sweetcheeba said:


> 40c seems to be about the norm for a full cover EKWB. 52c does seem a little high. Is your pump running ok? Maybe a repaste would be worth a try?


Is there any way to get LOWER than 40 C with a full block? I know throwing more rads into the mix can cause issues, but I was considering a 360 rad with a 240/280, or a case with a pair of 360s.


----------



## SavantStrike

stephenn82 said:


> I have a EVGA 1080 FTW Hybrid. I had it up to 11gbps, actually a touch over...it benches some with success...gaming is a whole different story. The Division didnt like it...but BF1 is more tolerable. I normally keep it at +450 so it runs at 10910 mhz, close to the 11000mhz.
> 
> 
> is there any way to get that LOWER than 40c with a full block? I know throwing more rads into the mix can cause issues...but I was considering a 360 rad with a 240/280. Or a case with a pair of 360's.


Yes, but you'll need a lot of rad space. The smaller you try to make the approach (the delta between water and ambient), the more surface area and airflow you need, and it quickly gets ridiculous.


----------



## Sweetcheeba

stephenn82 said:


> I have a EVGA 1080 FTW Hybrid. I had it up to 11gbps, actually a touch over...it benches some with success...gaming is a whole different story. The Division didnt like it...but BF1 is more tolerable. I normally keep it at +450 so it runs at 10910 mhz, close to the 11000mhz.


I suppose there’s always a chance you have a card that can’t do it, but if you have a core offset, try leaving the core at stock and focusing on memory.

There’s some speculation on the EVGA forums as to whether EVGA changed the timings as well as the memory clock. If you’re on a Hybrid, give it a try; it could be the original BIOS holding you back. It’s easy enough to flash back again if you back up first.


----------



## Sweetcheeba

Oops double post


----------



## Bal3Wolf

I have an EVGA 1080 FTW Hydro Copper I haven't been super happy with temp-wise, so I recently snagged an EK FTW block, and temps are worse, lol: gaming I'm seeing 45-47 C, mining 47-55 C. I have remounted the block at least 10 times using different pastes. My current loop order is: EK-SBAY Dual DDC 3.2 >>> RX240 >>> EK 1080 >>> Black Ice 360 >>> Koolance 380i (6800K) >>> EK CE 420 >>> back to the EK-SBAY Dual DDC 3.2, with 3 Vardar fans as intake on the 360 rad, 6 140s as intake on the 420 rad, 1 140 in the back for exhaust, and 2 120s in the bottom for exhaust; the door is off right now. The thing that catches my eye is how fast temps shoot up: they don't climb gradually, just load and boom, almost max temps. Jura has been helping me, but I think we've run out of things to try, so I'm looking to see if anyone else has an idea. The block has been taken apart to check the inside, and it's clean; I even tried using the MOSFET/VRM pads off my EVGA block, with no luck getting better temps.

https://www.ekwb.com/shop/ek-fc1080-gtx-ftw-acetal-nickel


----------



## Kashinoda

NZXT G12 with stock fan, H55 with Noctua F12.

From 85+ at load to 58 on my MSI Armor 1080.

G12 fan and Pump are running 100%/12v with Noctua using PWM from card. So quiet.

Wish it had a VRM sensor but there you go.

Lovely.


----------



## Cannonkill

Would it be better to keep a 1070 Ti that overclocks well, or a 1080 that I haven't tried overclocking? Both are MSI Gaming X cards. The rest of the system is a stock 7700K, an MSI Gaming Carbon mobo, and 16 GB of RAM.


----------



## SavantStrike

Cannonkill said:


> Would it be a good idea to keep a 1070ti that overclock well or keep a 1080 that I haven't tried overclocking? These are both msi gaming x cards. Rest of the system is a stock 7700k msi carbon gaming mobo, and 16Gb of ram.


I'd keep the 1080, personally. No matter how well the 1070 Ti overclocks, it will never beat a 1080.


----------



## greg1184

For starters, I want to say EVGA's support rocks. I was having issues with my 1080 ACX 3.0, so I did an advanced RMA and they ended up sending me the Superclocked version.

I was working on overclocking the card and managed +1000 on the memory clock in Afterburner without seeing artifacts in Heaven or FurMark. Score-wise, the Heaven benchmark kept steadily increasing every time I raised the memory clock. It surprised me because on my last card I didn't get past +500. I don't know if I'm just lucky or got an outstanding card.

Heaven at 3440x1440: 
Baseline: 1193
+100 Core, +1000 memory (2100/6005): 1348

I am going to do 3dmark next.


----------



## greg1184

Double post.


----------



## coreykill99

So I have been playing around with the curve a lot lately, and a few days ago I decided to go in the other direction with it.
I'm pretty sure I have to do it via the curve, as I can't find a way to set a negative voltage in Afterburner.
ATM I'm sitting at 1999 MHz @ 0.968 V, I think it is, running through Superposition and then playing some games for a few hours. But I occasionally experience "issues" that aren't enough to put a finger on; things just feel off for a second sometimes. When that happens I can never be sure if it's the card or just the game engine. BF1 especially does this, with tiny microstutters, but those have been there ever since I've played it, so I'm left wondering whether it's the card or just the game, as they seem more prevalent now. I don't really notice anything in Destiny 2.

So the real questions here are: what happens when you start working backwards and starving the card for voltage? Will it freeze and black-screen like when an OC fails? Or does Pascal/GPU Boost 3.0 do some behind-the-scenes work, clock things down to match your voltage, and just not tell you? How many levels of unstable are there when doing this? Freezing and missing textures are the big ones; are there more subtle things to look out for?


----------



## kmac20

outofmyheadyo said:


> How many degrees on the core do you watercooled guys get on the 1080? I have a feeling somethings wrong with mine it's runnin stock, havent opened it up after buying it, even my 1080ti didnt get that hot while powermodded and overclocked, my water is 30 degrees and core hits 52, i have a feeling those msi idiots didnt remove plastic from the block like we saw on one thread over here. It's the MSI seahawk ek x quess its time to take the loop apart again.


This seems kinda high. When I leave my fans on auto (I have an FTW2, so it's ICX with the 2 separate fans; kinda gimmicky, but I use it when I want auto and less noise) my temps hit, what, 63 or 73? I can't remember which, but that's with the fans on AUTO, staying at 30% or whatever.

When I crank my fans up to, say, 70-80%, the core/mem sit around 45 C. I know this from HWiNFO as well as my ICX color-coded temp LEDs: the lights for the core/mem/VRMs usually stay white, which means they don't get hotter than 45 (that's what I have white set to for up to 45, then green, then red for above 75 or 80 or something). So yeah, with JUST AIR COOLING my card doesn't get hotter than 45 C on ANY component, and if I crank the fan to 80ish it sits around 43 C on all of them. This is with a pretty high OC of just under 12000 on memory and a core of about +150 in XOC (I can't remember what that actually translates to off the top of my head, but I have a few recent benchmarks posted in the last page or two you can check), with MAX power draw and voltage at stock, 50%, or 100% max (again in XOC). This is on the main BIOS; I still have yet to try the slave BIOS, which is supposed to give even more leeway. Also, in my experience voltage matters very little with this card. As I've written, and one or two others have agreed, PASCAL PREFERS TO BE COOL rather than have higher voltage. I can get basically the same overclock at stock voltage as with the slider maxed, because the card would rather run cooler than take the small extra voltage, generate more heat, and end up at roughly the same clock. Again, just my experience (though a few other members agreed).

So, to your query: your temps seem very high for water, since I can get below 50 C on air alone. Anything's possible, but yours does seem high for a card with a block.



greg1184 said:


> For starters, I want to say EVGA's support rocks. I was having issues with my 1080 ACX 3.0, so I did an advanced RMA and they ended up sending me the Superclocked version.
> 
> I was working on overclocking the video card and I managed to get +1000 on memory clock using Afterburner without seeing artifacts on Heaven or FurMark. In terms of scores, the heaven benchmark kept steadily increasing every time I increased the memory clock. It surprised me because my last card I didn't get past +500 I don't know if I am just lucky or got an outstanding card.
> 
> Heaven at 3440x1440:
> Baseline: 1193
> +100 Core, +1000 memory (2100/6005): 1348
> 
> I am going to do 3dmark next.


EVGA has the BEST SUPPORT of any computer component company I have ever bought from.

This is not fanboying; this is my experience. I have an 8800 GT with a lifetime warranty that I've sent in twice now (about to be three times, because my old mobo killed it :X) and they have always honored it. When I've called with problems or questions they have always: 1) had a REAL PERSON at the other end, not automated robot crap or an outsourced rep who doesn't know what I'm talking about, but someone at their actual headquarters or another branch; 2) ALWAYS honored warranties, including the LIFETIME one on that 8800 GT; 3) always listened to me, taken feedback, and IMO probably actually used it in some shape or form; and 4) just been a pleasure to deal with, in contrast to 99% of companies.

I think brand loyalty is stupid. I have owned an equal number of NVIDIA and AMD cards (actually, with my recent 1080 purchase, one more NVIDIA than AMD, but still), I always used Intel until Ryzen came out because Ryzen was the better value, and I buy the best RAM for the money. The only brands I stick to are WD (because I have NEVER gotten a bad drive from them, though for 1-2 people I've done builds for I have, so I won't pretend they're perfect; this is just my experience and why I've stuck with them) and EVGA. EVGA has the best support of anyone I've dealt with, EVER, and that's why I stick with them for NVIDIA. If I go NVIDIA, I go EVGA, no questions asked. More expensive or 1-2 FPS less? I don't care. Support matters more to me than 1-2 FPS or $5-10, and I will go EVGA every time until they disappoint me. Brand loyalty is stupid, but a company that offers great support, actually listens to feedback, and honors what they say? THAT is a company worth sticking with. And I will go with EVGA until they let me down.


----------



## kmac20

Delete, double post. There was a delete button when the switch to vBulletin happened, but I can't find it anymore. Disregard this; mods, delete it if you can.


----------



## outofmyheadyo

There used to be a lot of good about the forum and the layout; now it's terrible, plus it lags like crazy...


----------



## ColdHardCash

Hey guys, I'm about to pull the trigger on a 1080, however I'm stuck between the MSI Gaming X and the EVGA FTW2.


----------



## coreykill99

ColdHardCash said:


> hey guys im about the full the trigger on a 1080 however im stuck between msi gaming X and evga FTW2.


Can't speak for EVGA as I haven't had one; however, my GTX 1080 is the MSI Gaming version and the cooler on that card is crazy. It seems oversized and I could never hear it; even turned all the way up once, I could barely make it out.

Edit: also to note, I'm not sure about longevity, as I only had it on air for a month or two before putting a water block on it.


----------



## Beagle Box

ColdHardCash said:


> hey guys im about the full the trigger on a 1080 however im stuck between msi gaming X and evga FTW2.


Love my MSI Gaming X. Easiest to overclock in my opinion. Watercooled now for even more speed. Never a problem.


----------



## hotrod717

Luck of the draw on silicon. Your best bet is to go with the best cooling available on air; if on water, it could be any card. AIBs offer some better components, but it's really all about the silicon. Some FEs blow the doors off any Strix, FTW3, etc., on water.


----------



## microchidism

What is considered a good OC on 1080s these days, is 2100 constant on the core still pretty much "wow you got lucky" territory, or is it a bit higher?


----------



## sirleeofroy

microchidism said:


> What is considered a good OC on 1080s these days, is 2100 constant on the core still pretty much "wow you got lucky" territory, or is it a bit higher?


Based purely on my personal experience, I would say a sustained 2100 on air is above average, but 2100 on water is about average.


----------



## nrpeyton

A 1080 Ti with a properly installed water block shouldn't run more than about 9-10 C hotter than water temp on the core when drawing up to about 250 W. When drawing up to about 300-325 W, 13 C above water temp is okay. If you're drawing up to 450 W, it may go as high as 16 C above water temp.

If you're getting 20 C above water temp just playing normal games, with no shunt mod and no XOC BIOS, then something is definitely wrong.

Right now I'm drawing an average of 237 W and the core is only 8-9 C above my water temp. Remember that unless overclocked, modded, or BIOS-flashed, most 1080 Tis start to power throttle at 250 W. (In fact, 250 W is the TDP of the Founders Edition and the TDP Nvidia intended for the GPU.)
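The rule of thumb above can be summarized as a tiny lookup. The wattage breakpoints and deltas are the figures quoted in this post, not an official spec:

```python
# Rough ceiling for (core temp - water temp) on a properly mounted
# full-cover block, using the power-draw breakpoints quoted above.
def max_expected_delta_c(power_draw_w: float) -> float:
    if power_draw_w <= 250:
        return 10.0   # stock-ish TDP territory
    if power_draw_w <= 325:
        return 13.0
    return 16.0       # shunt-modded / XOC BIOS draw

print(max_expected_delta_c(237))  # -> 10.0; an observed 8-9 C delta fits
```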




My new HDMI cable (just ordered it now): if this doesn't work at 4K 60 Hz then I think I'll jump off a bridge, lol.

[image removed]

*Something definitely went wrong, lol!*

[image removed]

What on god's earth could cause that?


P.S. Does anyone know how to remove these annoying thumbnails from the bottom of any posts I make with pictures? (I don't need to see the picture twice!)


----------



## kmac20

microchidism said:


> What is considered a good OC on 1080s these days, is 2100 constant on the core still pretty much "wow you got lucky" territory, or is it a bit higher?


I would say 2100 is pretty high on air, or even in general. IDK, just my 2 cents; I don't use water. 2100 is pretty high IMO, especially compared to stock or even a factory OC.


----------



## microchidism

kmac20 said:


> I would say 2100 is pretty high on air or even in general. IDK just my 2 cents, I dont use water. 2100 is pretty high IMO especially compared to stock or even factory OC.





sirleeofroy said:


> Based purely on my personal experience I would say a sustained 2100 on air is above average but 2100 on water is about average
> .


Thanks for the replies. I've got a couple of 1080s and wanted to see how well they stack up today. I might end up keeping just one for the sake of not dealing with SLI, but we'll see!

Memory seems a bit harder to gauge: after a certain point performance tapers off, and if you go high enough it seems to decrease, all before visual defects show up. I'm sure most people still OC VRAM until visual defects appear rather than for peak performance, leading to inflated values.


----------



## kmac20

I have my memory insanely high, about 11800 after all the multiplications are done. So that's just shy of 2000 in GPU-Z, or just shy of 6000 in XOC. I hate how many different ways there are of describing memory speed now, so I'm just listing them all!
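Since the same VRAM speed gets quoted several different ways, here's a rough converter using the usual GDDR5X conventions (assumptions: GPU-Z reports the data rate divided by 8, and Afterburner/Precision report it divided by 2; the function name is mine):

```python
# The same VRAM speed gets quoted three ways depending on the tool.
# For GDDR5X, the effective data rate is conventionally 8x the clock
# GPU-Z reports and 2x the DDR-style figure Afterburner/Precision report
# (stock 1080: 10008 MT/s -> 1251 MHz in GPU-Z, 5004 MHz in Precision).

def gddr5x_views(effective_mts: float) -> dict:
    """Express one effective data rate in the three common conventions."""
    return {
        "effective (MT/s)": effective_mts,
        "gpu-z clock (MHz)": effective_mts / 8,
        "precision/ab (MHz)": effective_mts / 2,
    }

print(gddr5x_views(11800))
# {'effective (MT/s)': 11800, 'gpu-z clock (MHz)': 1475.0, 'precision/ab (MHz)': 5900.0}
```

By these divisors, an 11800 MT/s overclock shows as 1475 MHz in GPU-Z and 5900 MHz in Precision, which is why the numbers in different screenshots never seem to agree.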


----------



## coreykill99

microchidism said:


> Thanks for the replies, got a couple 1080s wanted to see how well they stack up today, might end up keeping just one for the sake of not dealing with SLI but we'll see!
> 
> Memory seems a bit harder to gauge as after a certain point performance tapers off and if you go high enough it seems to decrease... all of this before visual defects show up. I'm sure there most still OC vram till visual defects rather than peak performance leading to inflated values.


I think mine topped out around 2119 or so on water; beyond that I would get intermittent crashing and just plain instability. That was also just punching numbers in. I need to go back and play with the curve to see where it goes, if it goes any further.

And good for you getting a few 1080s. Let us know how that's going if you decide to test and stick with the SLI setup. I'm hoping the mining dies down a little so I can get a cheap 1080 for an SLI setup; my case looks awfully empty, lol. Unless I still don't have one by the time Nvidia announces their "Ampere" lineup in April? We'll see what happens.


----------



## kmac20

Yeah, as soon as the GPU crash of 2018 happens I'll be grabbing a second 1080. Hopefully I can get another FTW2 edition, although I won't hold my breath. With enough luck, though, I'll get one that I can use the FTW2 BIOS on, which defaults the memory to 11000.


----------



## ZealotKi11er

microchidism said:


> What is considered a good OC on 1080s these days, is 2100 constant on the core still pretty much "wow you got lucky" territory, or is it a bit higher?


Hard to say; it depends on many factors: power, temp, and load. If you can control the first two and stay at 2100 under any load, then it's a golden card. Sure, people can get these cards to 2100+ if they run them in the low 30s. I would say 2100 MHz at 55-60°C is a golden card.


----------



## Cerberus

ZealotKi11er said:


> Hard to say. It depends on many factors. Its power, temp and load. If you can control the first 2 and stay 2100 with any load than its a golden card. Sure people can get these cards 2100+ if they run them at low 30s. I would say 2100MHz at 55-60C is a golden card.


So I'm golden?


----------



## ZealotKi11er

Cerberus said:


> So im golden?


For sure. What kind of load did you put your card under?


----------



## nolive721

Cerberus said:


> So im golden?


You seem to have an OK card indeed, but your screenshot shows only 87% GPU usage and very low VRAM usage too.

With that kind of moderate load, and the related low power and temps, my ZOTAC AMP-EXTREME+ could sustain 2164 MHz easily, but would drop to 2113 MHz under serious gaming.


----------



## Cerberus

That screenshot was taken during [email protected]; I'm mining ZEC right now and it's still the same clocks @ 100% usage, 53°C.


----------



## hotrod717

Cerberus said:


> So im golden?


Without additional info, I can't say. GPU-Z open while running 3DMark 11 and showing a sustained clock of 2113 would be. You're only showing a snapshot, with no info on what the GPU is actually running or how it's loaded. For Pascal in general, the 1080s seemed to clock the best.


----------



## ZealotKi11er

Superposition 4K+ is what hits the card the hardest, in my opinion. When I play games I'm at 90-100% power, but synthetic benchmarks hit 120% peaks. I never really tested my 1080 much since it had the FE cooler. I think the Ti is more power limited in general.


----------



## jura11

Hi there

Whether you're golden is hard to say. For comparison, my Manli GTX 1080 Founders Edition with an EKWB waterblock will do 2164 MHz easily and holds those clocks in gaming and in any benchmark; temperatures are 36-38°C under heavy load, and only in mining do they reach 40-42°C.

My other 1080 is an EVGA GTX 1080, which is a poor OC'er: that card will do at most 2100 MHz at 1.093 V with an extra 150 MHz on the VRAM. The Manli, on the other hand, will do 2164 MHz at 1.07 V with an extra 450 MHz on the VRAM.

Hope this helps 

Thanks, Jura


----------



## stephenn82

I have the FTW Hybrid; the FTW2 is a slight improvement with ICX and its thermal sensors. Don't sleep on those EVGA cards. My FTW is designed to ramp up to 246 W (a default 1080 is around 180 W, maybe 195 W).

If you find an EVGA Hybrid cooler, pop it on there and go to town. I hit 2126 MHz core, 5500 mem, 1.08 V and peak at 43°C whilst gaming at 1440p. Beast mode!

Whatever you pick, use Afterburner to tune it... EVGA Precision X kinda sucks.

Once I get my custom loop built with my EK full-cover block on there, I'll let you know how that rolls.


----------



## kmac20

All of my overclocks have been done with precision and I've pushed it decently far. I would say to use afterburner unless you're like me and have specific features (ICX) that you cant get out of another program.


----------



## stephenn82

kmac20 said:


> All of my overclocks have been done with precision and I've pushed it decently far. I would say to use afterburner unless you're like me and have specific features (ICX) that you cant get out of another program.


Maybe my card just hates it. I couldn't OC my VRAM even 25 MHz with XOC... with AB I can push it to 11100.


----------



## nolive721

What is the minimum core frequency a GTX 1080 can be set at?

I'm doing some experiments with my card with extreme undervolting to see how low I can go in power consumption, in regard to what I posted on the Ryzen thread (see below). The recent 2400G benchmarks show anything between 100 and 130 W combined CPU/iGPU power consumption while gaming.

With my 1500X at stock and 1265 MHz core / 0.8 V on my Zotac, I'm running at around 150-160 W in the ROTR benchmark, combining CPU and dGPU power usage.




Considering my rig upgrade a bit more.

Currently running a 1500X on a B350 Pro4 mobo with Corsair RAM at 3066 MHz, cooled by the stock Spire cooler: stable in gaming and stress testing at 3.8 GHz / 1.2875 V, and stable-ish in gaming at 3.9 GHz / 1.35 V in non-CPU-intensive games (Assetto Corsa, COD, and previous-gen BF titles). Not stable at that frequency in GTA V, Project CARS, or Witcher 3, or in stress testing.

Regarding power consumption: at 3.8 GHz, around 105 W CPU+SoC, and 130 W at 3.9 GHz, so clearly the Spire cooler can't keep up with the stated 95 W TDP (I have great airflow in my Aerocool P7-C1 case, though).

I'm gaming on a triple-1080p rig driven by a heavily OCed Zotac GTX 1080.

Considering two options to improve my CPU OC frequencies further:

1) keep the 1500X and drop in a 240mm AIO liquid cooler
2) ditch the 1500X and replace it with the noticeably higher base-clocked 2400G APU

Option 1: not sure I can really reach 4 GHz even with the help of a 240mm AIO if my CPU silicon isn't great.
Option 2 is nice since the Vega onboard chip would let me use FreeSync on my center monitor if I want to play on a single screen, and there's potential for OCing it. But it comes with the lower-TDP Stealth cooler, so I'm not sure the chip will like it if I push both CPU and iGPU clocks.


----------



## Vici0us

kmac20 said:


> I would say 2100 is pretty high on air or even in general. IDK just my 2 cents, I dont use water. 2100 is pretty high IMO especially compared to stock or even factory OC.


That's pretty interesting... I thought 2100 MHz was a good OC but nothing special. My Zotac AMP Extreme 1080 goes up to 2164 MHz, but I keep it @ 2114 MHz / 11400 on memory (on air, stock voltage). It already hits 2050 MHz out of the box, so I guess I got lucky with my card. I was wondering what kind of OC these cards (Strix, Aorus, Gaming X, FTW2) are able to achieve on air with their stock coolers. Any replies would be appreciated.


----------



## UNOE

My two 1080 Strix cards are on blocks. I can't get past the 120% setting in Afterburner, and I'm not able to get much past 2000 MHz. Any modded BIOSes worth trying?


----------



## andydabeast

Hi guys, Gigabyte Windforce card here, under water, with a 1600X at 4.0 and a Corsair AX860 PSU.
I have a TDP throttling issue that only happens in some games and started happening more once I put the GPU block on. The image attached is from the first part of a Heaven run: clock speed is OK, memory is OK, GPU load is OK, but TDP won't go beyond 38%. I fire up COD WWII multiplayer and play for hours at over 100 FPS; switch to the single player, and it's 10 frames and TDP throttled. If I come off a fresh computer restart it seems to work.

I tried the stock OC and some other combinations with no change. Only a fresh restart fixes it.

I googled around a lot and found lots of people whose clock speed was stuck, but none with TDP.

MSI Afterburner is up to date, drivers are 391.01, Windows 10 is up to date, and I'm on an SSD, but having to restart any time I want to play certain games is annoying.

Anyone know what it could be?


----------



## mrgnex

Ignore



Spoiler



Quick question I don't wanna make a new thread for and clutter things up.
I got a replacement PSU from Cooler Master, a V750.
Installed everything, but it wouldn't boot: A0 error with the orange boot-device LED lighting up.
Tried looking around and swapping the cable and the plug for the GPU power, but no good.
Took out the GPU and put the display cable in the motherboard. Boots up fine and I have display.
Plugged the GPU in again and now it boots fine, but still no display...
Edit: Switched the cable from DisplayPort to HDMI and it works fine.
Edit 2: Plugged the DisplayPort cable in again and it works fine. It probably got unplugged a little.


----------



## white owl

Hi
Just bought a 1080 SC and was wondering if anyone has flashed one, and with what BIOS?
I see the BIOS editor still isn't here, so I guess I'll have to figure this out.
Flashing my 980 FTW with a Strix BIOS made it run like hell, and I see people doing the same with 1080s that have the same PCIe pin count.


----------



## andydabeast

andydabeast said:


> Hi guys, Gigabyte windforce card here under water with a 1600x at 4.0 and corsair AX860 PSU.
> I have a TDP throttling issue that only happens in some games and started happening more once I put the GPU block on. The image attached is during the first part of a Heaven run. clock speed is ok, memory is ok, GPU load is ok, but TDP won't go beyond 38%. I fire up COD WWII multiplayer, play for hours over 100 fps. Switch to the single player, 10 frames and TDP throttled. If I come off a fresh computer restart it seems to work.
> 
> I tried stock OC and some other combinations with no change. Only a fresh restart fixes.
> 
> I googled around a lot and found lots of people whose clock speed was stuck but none with TDP.
> 
> MSI Afterburner is up to date, Drivers are 391.01, windows 10 is up to date, I am on an SSD but having to restart any time I want to play certain games is annoying.
> 
> Anyone know what it could be?


Removing the overclock on my monitor seemed to fix it for COD. BF2 still had a hard time, though.


----------



## juniordnz

Guys, how's SLI working nowadays? Every review I found is from 2016.

Just received a new FTW2 from EVGA and I'm considering pairing it with the FTW DT I'm currently running. 

Is it worth it? I'd have to get a new PSU since mine is a RM650X.

Thanks in advance.


----------



## kmac20

Did the newest 391 drivers cause problems for anyone else too?


----------



## white owl

In some games, yeah, I had problems: green craziness on the screen.
Had to go into NCP and set sharpening (and the other option) to Nvidia's setting. All fixed.
It's the page with only two sliders.


----------



## kmac20

I had to roll back to 390 because it was giving me INSANE stuttering in something as basic as HBO Now, and my FPS in Dota has TANKED.

Same Firestrike scores, same Timespy scores. All other games seem fine. No idea what's going on.


----------



## SavantStrike

juniordnz said:


> Guys, how's sli working nowadays? Every review I foumd is from 2016.
> 
> Just received a new FTW2 from EVGA and I'm considering pairing it with the FTW DT I'm currently running.
> 
> Is it worth it? I'd have to get a new PSU since mine is a RM650X.
> 
> Thanks in advance.


Sell both cards and buy a 1080 Ti. SLI is only worth it when you can't go any further with a single card.


----------



## AlbertoM

Hello.

Just reporting.

I got the Hybrid kit for my FE 1080, and seeing temperatures never go above 55°C overclocked, limited by TDP at 220 W while playing Doom at 1440p, I took the plunge and flashed the Asus Strix XOC T4 BIOS, OC (1936 MHz boost version).

It works, and well. But the voltages are too high and were giving instability, so I set a curve with 1.075 V max, and it's rock solid: max temps of 60°C, and now while I play the TDP goes from 200 W to 280 W, averaging 240 W, with spikes of 320 W seen while playing at 1440p.

Gaming at 1440p is now as smooth as 1080p was before. And at 1080p, DOOM maxed out with v-sync off now gives a constant 200 FPS at ALL times. At 1440p it also runs from 200 FPS, slowing to 150 FPS in heavy combat. Before, it slowed from 150 FPS to 60-80 FPS, pretty sure because of TDP.

My OC is set to 2088 MHz, mem +450 MHz, fans maxed on the board (to cool the VRMs and RAM) and on the radiator (the EVGA fan isn't that loud).

I checked the PCI-E 8-pin cable that's feeding this beast now, and it gets a little warm but nothing to worry about; I have a single-rail PCIE cable from the PSU, a Corsair 850 W.

Temperatures on the board are hot in the VRM region, but nothing crazy, burn-your-fingers hot. Without the GPU's heat there, I think there's sufficient cooling for the VRMs (with the fan profile on max or high).

Waiting on my EK Furious Vardar FF5-120 3000 rpm fan to do push-pull on the radiator and drop the temps so I can OC more.

Incredible what this card can do cooled properly with that BIOS, with just the one 8-pin power connector.


----------



## AshBorer

Well, my 1080 SC that I bought in June of 2016 is almost two years old now... and so far it hasn't exploded. I'm pretty confident I dodged the batch with the bad capacitors. Then again, I go large parts of the year hardly using it (September through April), so it doesn't necessarily have a ton of hours on it.


----------



## AlbertoM

Update:

Found that 1.063 V is more stable for my OC.

Just a pic to show TDP reaching almost 340 W.

The average is low because of game intervals, but while gaming it's about 240 W, not that much above the designed TDP.

Just the few more watts that the card now has the freedom to pull (on average) make a whole lot of difference.

And those spikes mean the card has the power when it needs it most.


----------



## juniordnz

Isn't the FTW DT supposed to have worse GPUs than the regular FTW? EVGA's rationale for the DT card is that these chips didn't pass the tests to sustain FTW clocks. I'll elaborate:

I had a FTW until two months ago. It died and I sent it to EVGA for replacement. Since I'm from Brazil, I bought a cheap used FTW DT to keep playing while my RMA was going on.

The thing is, my old, dead FTW could do no more than 2088 MHz / 1.062 V. The FTW DT can do 2151 MHz / 1.062 V.

Here is the DT's performance clocked at 2151 / 11008 / 1.062 V / 130%:
25,371 Firestrike https://www.3dmark.com/fs/15176514
8,382 Timespy https://www.3dmark.com/spy/3492244

Wait, what?!?!?

The DT can only go as far as +500 on the RAM, though; the FTW would go up to +600.


----------



## AlbertoM

Be careful with memory clocks.

I saw a test showing that 1080s perform best in the +450 MHz zone; above that, even without artifacts, you lose performance.

I tested it on mine and could verify it's true. It goes to +600 MHz, but FPS drops. +450 MHz is the sweet spot (talking, of course, about 10 Gbps RAM).


----------



## juniordnz

AlbertoM said:


> Be careful with memory clocks.
> 
> I saw a test that 1080's are best in the +450 MHz zone, and above, even without artifacts, you lose performance.
> 
> I test on mine and could verify that is true. It goes to +600 MHz but FPS drops. +450 MHz is the sweet spot, talking of course about 10GBPS ram.


Just tested it here, and dropping to +450 on mem dropped the score 50 points. I'll try +425 and +475 just to be 100% sure.

I tested various VRAM clock points on my old FTW, and at +625 it would start losing performance, but at +600 it was always positive.

===================================================================================

Just got a new best: 8410 points in TimeSpy at 2164 / 11008 / 1.062 V / 130%.

https://www.3dmark.com/3dm/25769843?

It drops to 2151 MHz with a max temp of 52°C (I have an H105 cooling the GPU, and it's currently 36°C here).


----------



## Dreamliner

I have two Asus 1080 Strix OC cards and have a couple of questions. Asus advertises the card as having an 1860 MHz boost clock in 'OC Mode'. From what I understand, that speed only applies when using the 'OC Mode' selector in Asus GPU Tweak II. Is this 100% identical to just increasing the clock in MSI Afterburner? Any benefit to using Asus GPU Tweak II instead? Could someone with a Founders card get the exact same performance just using Tweak II?

Also, I checked with GPU-Z and I've got the 86.04.60.00.BE BIOS. Is there any benefit to updating that? Where do I go? I remember loading a custom BIOS on my Gigabyte 970 for better fan control, but it has been a LONG time and I really don't know where to start.

My card: https://www.newegg.com/Product/Product.aspx?Item=N82E16814126201


----------



## white owl

You can flash it with the BIOS from a different card, but there is no editor, so you can't modify your own BIOS. Since there isn't much info on which BIOS works on which cards, there's actually a real risk of bricking the card. We know the power pin count must match.

I use Precision, but you can use whatever software you like. I'd try Afterburner first and just match the max boost clocks to start with; unless it's a turd, you should be able to get >1900 MHz on the core.


----------



## kmac20

I remember with my old 7850, once you pushed the memory too far it engaged error checking, which would drop performance below what you'd get at a lower memory clock where no error checking was needed. I feel a lot of cards today have a similar feature, although I could be wrong.
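That error-checking behavior is why tuning for peak benchmark score beats tuning for artifact onset. A toy sketch of the sweep; `run_benchmark` is a hypothetical stand-in for an actual Heaven/3DMark pass at each offset:

```python
# Sketch of tuning VRAM for peak throughput rather than artifact onset:
# benchmark each offset and keep the fastest, since GDDR error retries
# can silently cost performance well before artifacts ever appear.
# run_benchmark is a placeholder -- in practice you'd apply the offset,
# run Heaven/3DMark, and record the score.

def best_memory_offset(offsets, run_benchmark):
    """Return the offset with the highest benchmark score."""
    scores = {off: run_benchmark(off) for off in offsets}
    return max(scores, key=scores.get)

# Toy stand-in: score peaks at +450 then falls off (mimicking error retries).
fake_bench = lambda off: -abs(off - 450)
print(best_memory_offset(range(0, 651, 25), fake_bench))  # 450
```

The point of the sweep is that the best offset is wherever the score peaks, which can be well below the highest "stable" offset.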


----------



## outofmyheadyo

How much of an improvement have you guys gotten replacing the regular TIM with liquid metal on your 1080s? I'll replace mine tomorrow; it should net a couple of degrees at least! It's the Seahawk EK X; I'm hoping it won't reach the throttle temperature and will boost a bit higher!


----------



## white owl

outofmyheadyo said:


> How much of an improvement have u guys gotten replacing the regular tim with liquid metal on your 1080s? Will replace mine tomorrow, should net couple of degrees atleast! It`s the Seahawk EK X hoping it wont reach the temperature point and will boost abit higher!


1080 SC here; I used MX-4.
Went from stock running ~1850 at >70°C (little OC headroom) to 2000 MHz at 65°C with the help of a fan curve.
The curve I use is:
0% below 40°C, 20% at 40°C, 40% at 50°C, 100% at 70°C
I feel this is more proactive than the stock profile, meaning the fans try to keep the core cool while it's still cool, as opposed to doing nothing until the core is already warm.
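For reference, that curve, treated as straight-line interpolation between the listed points, works out like this (a sketch only; Afterburner/Precision do the actual interpolation internally):

```python
# Sketch of the fan curve quoted above as linear interpolation between
# the listed points (below 40C the fans are off; above 70C they pin at 100%).

CURVE = [(40, 20), (50, 40), (70, 100)]  # (temp C, fan %)

def fan_speed(temp_c: float, curve=CURVE) -> float:
    """Fan duty (%) at a given core temperature, per the curve above."""
    if temp_c < curve[0][0]:
        return 0.0
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])

print(fan_speed(35))  # 0.0  -- idle, fans off
print(fan_speed(60))  # 70.0 -- halfway up the 50C->70C segment
```

The steep 50°C-to-70°C segment is what makes it "proactive": the fans ramp hard before the core ever gets near its throttle point.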

With the MX-4 and 120% power I'm running >2000 MHz in all games/benchmarks at the same voltage and (subjectively) the same sound level as the stock card. IMO it's worth it to use paste.

CLU is a different thing altogether. You need the cooler to have a copper bottom, and you need to put liquid electrical tape or clear nail polish around the core, as there are several components surrounding it that can and will be shorted if left bare.

I assumed my 980 FTW had a nickel-plated cooler... it didn't. I ran it with CLU for over two years at 1500 MHz core / 2000 MHz mem with a modded BIOS; 65°C was my peak temp. All was good until I noticed one fan had died, went to replace it, and realized the aluminum had dissolved. Not only that, the aluminum and CLU made a huge mess and took forever to clean. I finally got it clean and pasted with MX-4, booted, and put a load on it. It ran the same temps as before with paste instead of CLU, then it crashed, never to fully boot into Windows again.

So while you can use CLU if the surface is copper or nickel, it's not going to be much (if any) better than good paste, and it's not worth the risk if there's no real difference. The main reason it's used under an IHS is because it lasts and because the IHS isn't flat; people have used paste on a bare CPU die and report about the same temps as with CLU. Since there is no IHS on a GPU, there's very little benefit to using CLU, but there is one in simply using better paste.


----------



## juniordnz

Yeah, tested +425, +450, and +475; all lower performance than +500. So I guess that's my sweet spot, since +525 starts causing rendering issues. 2177 MHz on the core is a no-go as well...

I guess I've found the sweet spot for this card: 2164 core / 11008 VRAM / 1.062 V / 130%.

Considering it's a DT, I'm very pleased with that!


----------



## AlbertoM

I changed my TIM to Thermal Grizzly Conductonaut liquid metal while I was on the default FE air cooler, and the difference was very noticeable.

I went from my overclock hitting 83°C with fans at max to an even higher overclock at 75°C with fans at max.

When I later installed my Hybrid kit I kept the liquid metal on the chip and got rid of the factory paste applied to the copper block (applied more LM). I can't compare before/after directly, but I'm maxing at 55°C now, without a TDP limit (Strix XOC T4 BIOS now).

The only important thing is to put electrical tape along the sides of the chip, covering those solder points, and you're good to go.

BTW, no damage at all to the stock FE cooler (nickel-plated copper).


----------



## AlbertoM

Has anyone tried the WaterForce BIOS on the FE?

Wonder if it works and if it is better than XOC Strix T4 Bios.

I can't go over 2 GHz playing at 1440p on mine with the Strix BIOS; it crashes no matter what voltage I set, even on water with temps at 55°C max.

With the stock FE BIOS I was able to do 2175 MHz, but was heavily limited by TDP at 1440p.

Performance is better at 1440p with the Strix XOC BIOS (though it runs 175 MHz slower, it pulls more watts because of the unlimited TDP and renders much faster). But if I could go higher than 2 GHz with another BIOS (like the WaterForce one) and still have a high TDP, I imagine it would be even faster.

Thank you.


----------



## white owl

AlbertoM said:


> Has anyone tried the WaterForce BIOS on the FE?
> 
> Wonder if it works and if it is better than XOC Strix T4 Bios.
> 
> I can't go over 2GHz playing at 1440p on mine with strix bios, it crashes no matter the voltage I set, even on water with temps at 55c max.
> 
> With stock FE BIOS I was able to 2175 MHz, but was heavily limited by TDP at 1440p.
> 
> Performance is better at 1440p with Strix XOC bios (though it runs slower 175MHz, pulls more watts because of unlimited TDP, rendering much faster), but if I could go higher than 2GHz with other bios (like waterforce one) and still have high TDP, I imagine that it would be even faster.
> 
> Thank you.


The cards need to at least have the same pin count.
So your benchmark scores go up with a lower clock speed under load?


----------



## AlbertoM

white owl said:


> The cards need to at least have the same pin count.
> So your benchmark scores go up with a lower clock speed under load?


Power pin count? The Strix XOC has more than the FE and it works.

I tested benchmarks at 1080p and the performance is similar, a few points less with the Strix BIOS.

But gaming at 1440p is a totally different story. With the TDP unlocked, the Strix BIOS makes the FE much faster, even at lower clocks.


----------



## white owl

That doesn't make any sense. TDP does nothing without the clock speed; sure, the card is allowed to use more power, but if it's not doing anything with it, it's all for nothing. We raise TDP to boost higher and longer, and if yours isn't doing that, then the performance isn't there.

If you use an 8-pin card with an 8+6-pin BIOS, you're lucky it didn't brick. The power tables are simply there, but the card can't do anything with them. If you look at the BIOS from a G1 980 vs. an SC 980 you'll see what I mean: there are tables in the BIOS that don't match the PCB layout, which usually causes crashes, or, in my case on a 980 FTW, performance that dropped like a stone.

Many people using the XOC BIOS reported higher clocks with worse performance, so they flashed back.

If your benchmark scores based on graphical performance have dropped, games won't run better or faster. This has been my experience with every card and every BIOS flash I've done: scores go up, and usually FPS will peak higher and have higher minimums. That's why everyone recommends benching before and after OCs and flashes.


----------



## AlbertoM

Nope.

With the stock FE BIOS, playing Doom at 1440p the card is limited to 220 W max.

With the Strix XOC T4 it pulls 250 W on average, and that results in much higher FPS.

I posted a picture of it reaching 338 W a page ago with this BIOS at ~2 GHz, while playing Doom maxed at 1440p.

Benchmarks are not the real world.


----------



## white owl

For CPUs, no, they aren't, but on GPUs a benchmark's rendering is similar to a game, just on rails. Superposition is my favorite.
Is that the only game you've tried? DOOM is a very easy game to run, with a 200 FPS cap, so it's not exactly ideal for noting improvements. GTA 5 is still a great game for benching.


----------



## AlbertoM

I only play Doom.

It's limited at 1080p, maxed, to 200 FPS.

But at 1440p it's a different story.

With the stock FE BIOS at 2150 MHz, stable at 55°C on water, it renders 100 to 150 FPS and stutters quite a bit.

With the Strix BIOS at 2 GHz it holds a steady 200 FPS, eventually dropping to 150 FPS in heavy scenes, totally smooth.

And that was with 1.063 V max, so no energy is wasted on high voltages.

That's for sure the TDP allowing the card to render all the bits without limits.


----------



## kmac20

Does anyone else here get a stutter if their PC is left on too long? I'm curious whether it's my GPU or mobo. My mobo has been giving me tons of POST codes the past month or two, so I'm pretty sure it's that (everything from a VGA issue, to a memory issue, to a chipset issue, to even, once, "microcode not loaded", da heck?).
I have a PSU checker; I'm going to test that later before I RMA anything (although I'm pretty confident it's not that either).

But I'm also wondering if maybe it's my card. Basically, if I leave this PC running too long, longer than, let's say, 6-12 hours (which I do somewhat frequently), I'll start getting a stutter in games. Every 2-3 seconds it drops. Taking CSGO as an example:

200-300 FPS down to like 60-80, then back up, then again, rinse and repeat.

It's pretty noticeable with my refresh rate at 75 when it drops from so high to below that. It's very distracting, and I have to restart the PC every time to clear it.

Curious if anyone else has had this issue?


----------



## white owl

@AlbertoM
That's a really weird result, because my SC card with the stock BIOS does great in DOOM @ 1440p/144Hz. I can run it maxed out and keep 144 FPS most of the time, but as with any game, I turn down the settings that make no visible difference but impact performance.
I thought I'd need to overclock it, but since I replaced the TIM my GPU will usually boost to over 2000 MHz anyway. I've since stopped using Precision because it sometimes crashes Fortnite, but I wish I could mod the BIOS so I could bake in my fan curve, memory, core, TDP, and voltage. It really sucks they took that away from us; with a modified BIOS I'm sure I could make this thing run 2100 MHz under any load.
@kmac20
I did have some issues before I got my 1080 SC with my PC getting hung during POST; it would hang on RAM, boot, and VGA.
Removing my GPU and using the iGPU fixed it once I put the 980 back in. You might try that.
You should also verify that the PSU works if you can. What PSU do you have?

Here's what I'd do:
Remove the GPU and use the iGPU. Update your BIOS and use mostly default settings: probably leave the CPU and cache at stock, then enable XMP or manually set your RAM's clock speed, timings, and voltage, the idea being to make everything run at its advertised speed. Once you're back in Windows, use DDU to remove your GPU driver in safe mode.
When you're booted back up, do the normal things you'd do when you were having issues, and definitely check POST codes each time it boots. You might run RealBench to see if the CPU performs as it should. You might also check the RAM for errors with MemTest64, and CPU/RAM stability with Prime95 blend (disable AVX so your voltage and heat don't get crazy, or use Prime95 26.6, which doesn't have AVX).

If you're satisfied with all that, or you only had issues in games, put the GPU back in and install its driver. Run whatever you usually run.

How much of the stress testing you do is up to you; that's just what I'd do to ensure stability while ruling things out.
If you have the same problems with no GPU installed, you obviously have a CPU or mobo problem, assuming the PSU is fine.
If the problems only exist with the GPU installed while under load, it could be the GPU or PSU; but if you have issues without any GPU load, the GPU could still be bad.

The point here is just to remove variables, starting with only the CPU and RAM at their advertised speeds.


----------



## kmac20

I have a Seasonic Focus Plus Gold 850 W, which is pretty high-end. I also got a PSU tester literally this past week, which I'm going to check with. As a backup I have an HX750, which I've tested already.

I don't have an iGPU; I have a Ryzen 1700.

I've run MemTest86+, and HCI MemTest to over 1000%. I've run P95 for a long time. I'm old school when it comes to all these tests. This is why I'm fairly confident it's a motherboard issue.

The weird part is that even once the stuttering starts, benchmarking is still fine with no sign of stutter. Real world only.


----------



## white owl

Well, you can still try removing the GPU, then booting up and shutting down. I know that sounds stupid, but I've helped a handful of people on this site (plus myself) by recommending it. It seems like stuff just gets confused sometimes and you need to change things to reset it.
Also, updating the BIOS, and especially using DDU before reinstalling the GPU driver, are still the fixes for about half the problems people encounter with their gaming rigs.

If you stutter in games only, wouldn't that indicate a mobo/CPU issue, since a GPU bench isn't CPU-bound? Or it could just as easily be a driver thing.


----------



## AlbertoM

@kmac20

Do you monitor your Windows task activity to see if something is using either the CPU or the disks when you have the stutters?

I would format my PC and restore all default settings in the BIOS to see if it runs normally.

In my experience (I have a computer science degree), recurring hardware defects don't need that much uptime to appear. If it were a hardware issue, the time to reproduce the problem would only be the time for the components to reach working temperature and the same load as when you see the issue.

I'm pretty sure it's a software issue.


----------



## AlbertoM

@white owl

I see what you mean in Doom. You turn some settings down to keep 1440p as smooth as 1080p.

That's because the card simply doesn't have the power budget to render everything, so it stutters.

The problem would show even more clearly if you ran Doom maxed out at 2160p.

With the Strix BIOS, 1440p maxed is as smooth as 1080p maxed. The FE BIOS can't do that; it needs more power headroom to push all those pixels. That's why with the FE BIOS you'll hit a TDP perfcap at 220W, while with the Strix BIOS the TDP goes to 250W at the same voltage, allowing spikes up to 340W from what I've seen, so no TDP perfcap at all.

The perfcap reason with the Strix BIOS is voltage (with the curve set to limit it around 1.063V in Afterburner, of course; with this BIOS you could do 1.2V), and if you bump it you get more heat and therefore more instability. Only dropping temps can help there, so I'm waiting for my EK Furious Vardar 120mm fan to arrive to do push/pull and drop temps even more, to see if I can get to the usual ~2.15GHz with this BIOS, maybe at around 1.1V (the default max of the Strix t4).

And the power connector count on these cards isn't a problem. I disassembled mine, and the PCB has the pads for another 8-pin connector, wired together with the existing 8-pin, so it's just another power input to comply with the regulatory standards for PCI-E cables and PSUs on cards that pull more than 220W. Since the FE card has a max TDP of 220W, they only fitted one 8-pin connector, because the other wouldn't be necessary. The connector count is completely invisible to the BIOS. I did a lot of research, and if you have a good PSU with good cables, you're fine pulling more than 300W through one 8-pin.

I was wondering about the Waterforce BIOS, because a user here on page 335 used it and reported better results than the Strix BIOS. I can imagine that's probably because that BIOS has a max TDP of 375W, while the Strix t4 having no ceiling at all could cause crashes/instability/performance drops when the card pulls more energy than it can handle.
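The regulatory point above reduces to simple spec arithmetic. A quick sketch (the 75W and 150W figures are the nominal PCI-E ratings for the slot and an 8-pin connector; the claim that one 8-pin can physically carry far more rests on cable/PSU quality, as the post says, and is not part of the spec):

```python
# Sketch: nominal PCI-E power budget vs. the connector counts discussed above.
# Spec ratings are conservative; a single 8-pin's three 12 V pairs can
# physically carry much more with good cables, per the post (not the spec).

SLOT_W = 75          # PCI-E x16 slot, nominal spec rating
EIGHT_PIN_W = 150    # one 8-pin PCI-E connector, nominal spec rating

def spec_budget(n_eight_pin):
    """Total nominal power budget for a card with n 8-pin connectors."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

print(spec_budget(1))  # -> 225  (FE layout: enough for a 220 W TDP)
print(spec_budget(2))  # -> 375  (why >220 W cards get a second plug)
```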


----------



## white owl

My card will run DOOM at 200fps with an average over 144fps; sometimes it dips to 120 or so. It's not frequent, but when it happens it's because the core is pegged. My CPU is a 4690K @ 4.7GHz with RAM @ 1800MHz CL9.
Tomorrow I'll do some testing and see what it does. I'm not flashing because I have no idea which higher-TDP BIOS will 100% work on my GPU.
What I can do is use KBOOST with stock clock speeds at, say, 80% TDP vs 120% TDP. I always assumed that when you maxed TDP the card would just downclock so it could downvolt.


----------



## kmac20

First of all, thank you all for your input. I'm reading it all and utilizing it accordingly. 



AlbertoM said:


> @kmac20
> 
> Do u monitor your windows tasks activities to see if there is something using either CPU or disks when u have the stutters?
> 
> I would format my PC and restore all default settings in BIOS to see if runs normally.
> 
> In my experience (I have a degree in computer science), recurring hardware defects don't take that long to reproduce. If it were a hardware issue, reproducing it would only take as long as bringing the components up to working temperature and the same load you had when the issue appeared.
> 
> I'm pretty sure that's a software issue.


Yes I have a second monitor that constantly has HWInfo running on it with graphs displaying CPU temp, usage, cpu+soc voltage, gpu temp, gpu voltage and gpu power.

I notice no spikes.

In theory this problem could be happening in other programs too, not just games, but gaming is the only place where it consistently drops below 75Hz. For example, I doubt it would even be noticeable while watching a YouTube video that might not even be running at 60fps.

But a drop from 200-300fps in CSGO down to 60 (below refresh rate), or even 70/80, is very noticeable to my eye, and to most people's (the human eye picks up variance more easily than a constant low framerate, for example). In theory it could also be happening in benchmarks, but so briefly that it wouldn't affect the final result. Again, this isn't just CSGO; it seems to be almost every game: Dota 2, CSGO, Dark Souls 3, Fortnite, etc. At first I thought maybe it was a Steam thing, but it happens in Fortnite as well.

And again, this is a stutter every 3-5 seconds or so. It happens more often when the "camera" is moving quickly, obviously.

I don't know if it's a driver thing, as it seemed to happen both before and after I updated drivers recently for the first time in a little while (2 months?). Maybe I'll try rolling back.
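A dip pattern like that (steady 200+ fps with a hitch every few seconds) can be pulled out of a frametime capture automatically instead of eyeballed. A sketch, assuming you have a per-frame millisecond log from whatever capture tool you use (the log layout here is made up, adjust to your tool):

```python
# Sketch: flag periodic frame-rate dips in a list of frametimes (ms).
# The capture format is hypothetical; the detection logic is the point.

def find_stutters(frametimes_ms, refresh_hz=75, min_gap_s=1.0):
    """Return (timestamp_s, fps) for frames slower than the refresh rate,
    keeping only one hit per min_gap_s so a single dip isn't counted twice."""
    hits = []
    t = 0.0
    last_hit = -min_gap_s
    threshold_ms = 1000.0 / refresh_hz  # 75 Hz -> ~13.3 ms budget per frame
    for ft in frametimes_ms:
        if ft > threshold_ms and (t - last_hit) >= min_gap_s:
            hits.append((round(t, 2), round(1000.0 / ft, 1)))
            last_hit = t
        t += ft / 1000.0
    return hits

# Synthetic example: steady 240 fps with a 25 ms hitch every ~4 seconds
frames = [25.0 if i % 960 == 959 else 1000.0 / 240 for i in range(2000)]
print(find_stutters(frames))  # -> [(4.0, 40.0), (8.02, 40.0)]
```

Regularly spaced hits like the example's (~4 s apart) would point at some periodic background task rather than thermal throttling.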


As someone else said, it could well be a mobo issue; I'm still leaning that way, as this board has been giving me a lot of problems with tons of POST codes ranging from VGA card not working, to chipset (most frequent code lately), to memory, to even, once, a code that doesn't exist in their documented list (and apparently means microcode failed to load, da heck?).

I guess it could also be the CPU. But I feel that would be throwing up a LOT more issues than I'm getting: a lot more crashes, and the crashes probably wouldn't just be the video freezing while the sound continues perfectly. These crashes also leave no information in the Event Viewer, which is really frustrating, as I can't determine via that route whether it's a software issue, and if so which software.

The crash didn't use to happen; it seems to have started since the driver update. But the stuttering was happening beforehand, as I've said. I even have these drivers rolled back slightly, although not all the way to what I was using before.


Thank you all again for your input. I'm reading it all, applying what I can or feel will be useful. If it was still the old site I'd rep you all for trying to help me. I greatly appreciate it. Hopefully I've posted some more informative stuff here to help you all help me.


----------



## AlbertoM

@kmac20

I would format my drive before assuming anything.

Hope u find the culprit.


----------



## white owl

"I dont know if its a driver thing as it seemed to happen both before and after I updated drivers recently for the first time in a little while (2 months?). Maybe I'll try rolling back."
Like I said you should try DDU in safe mode then you can clean install the driver. That auto update thing borks a lot of people's rigs.
Reseating the CPU and updating the BIOS couldn't hurt.


----------



## kmac20

I did all that already. Rolled back as well as used DDU in safe mode. I have never auto-updated in my life 😛

I haven't formatted lately, but that still doesn't explain all the random POST codes before it even gets to the drive.

Again: chipset, memory, VGA, CPU, and a nonexistent POST code which, according to someone on Google, could mean "microcode not found/loaded".


----------



## white owl

What happens on the drive has nothing to do with post codes and such. A BIOS reflash or reseating the CPU could fix it but then again it could just be broken.


----------



## AlbertoM

@uberwootage

Are u still using Waterforce Bios on your FE card? How is it going?

Can u provide me the link for the BIOS u used?

Tks.


----------



## kmac20

Yeah, that's what I was saying. That's why I doubt formatting will do anything. It's getting tons of POST codes before the drive is even active, which means one of:
bad board
bad cpu
bad vga
bad memory.

It gets codes for memory and chipset most frequently, sometimes CPU and VGA, so a lot of different ones. I doubt all these parts are bad since, again, I have run memtest86+ (over 100%) and HCI memtest (over 1000%). I've done tons of benchmarking, and when I did this overclock I ran P95 blend for quite a while, and small FFTs to see max heat/temps. Once I stabilized it I never had a crash from either.

I guess it COULD still be the CPU. But I feel like I'd be getting a lot more problems.

Since it gets all of those POST codes, the motherboard seems most likely. It could still be a bad card, but given the plethora of different POST code categories I feel the board is more likely; if it were the card or the CPU it would mainly be throwing VGA or CPU codes, respectively, in my opinion. Getting mostly chipset and memory codes leads me to believe it's the board, ESPECIALLY in conjunction with that crazy experience, where the only answer I could find anywhere was a post where the board turned out to be dead.

Guess we'll see. 


Thanks to everyone for all your input. Too bad we cant' rep currently  If you have more I'm all ears.


----------



## coreykill99

kmac20 said:


> Yeah thats what I was saying. That's why I doubt formatting will do anything. Its getting tons of post codes before drive is even active, which either means:
> bad board
> bad cpu
> bad vga
> bad memory.
> 
> It gets codes for memory and chipset most frequently, sometimes CPU and VGA, so a lot of different ones. I doubt all these parts are bad since, again, I have run memtest86+ (over 100%) and HCI memtest (over 1000%). I've done tons of benchmarking, and when I did this overclock I ran P95 blend for quite a while, and small FFTs to see max heat/temps. Once I stabilized it I never had a crash from either.
> 
> I guess it COULD still be the CPU. But I feel like I'd be getting a lot more problems.
> 
> Since it gets all of those POST codes, the motherboard seems most likely. It could still be a bad card, but given the plethora of different POST code categories I feel the board is more likely; if it were the card or the CPU it would mainly be throwing VGA or CPU codes, respectively, in my opinion. Getting mostly chipset and memory codes leads me to believe it's the board, ESPECIALLY in conjunction with that crazy experience, where the only answer I could find anywhere was a post where the board turned out to be dead.
> 
> Guess we'll see.
> 
> 
> Thanks to everyone for all your input. Too bad we cant' rep currently  If you have more I'm all ears.


I also vote for reseating the CPU; have you tried this? You're running the Taichi board still, right?
I know once I was messing with my Taichi and accidentally unplugged my D5 pump. The CPU powered on, overheated, and then auto shut down.
I got every POST error there was for 2 days before I tried reseating the CPU, and suddenly everything was "magically" fixed.
Worth a shot.


----------



## Asus11

I'm back in this club lol. I don't game much, but I found a cheap 1080 Strix I couldn't resist.

Does anyone know which kind of 1080 Strix I have? It says Advanced on the box, but just above the PCIe slot it says GTX1080-OC8G.

The BIOS shows AG, base clock 1671MHz, but I don't know if it might have been flashed with the AG BIOS for better power behavior when mining, as the guy I got it from got it from his friend, who was a miner.


----------



## Eze2kiel




----------



## andydabeast

Anyone not using a backplate? If so, do you have a line of power delivery chips on the back that are exposed?

Thanks


----------



## white owl

@Eze2kiel Great pix dude. I wish I had DSLR and a decent macro but I bought a GPU instead lol.


----------



## nolive721

Hello guys, seeking advice.

I sold my ZOTAC 1080 AMP Extreme+ for more than I bought it for. It was a good OCer (beyond 2100 core / 12,000MHz memory), but the fans were too loud and it didn't fit my new rig theme (white/black/red), so I replaced it with an MSI Gaming X.

I just started playing with its OCing potential last night, but I'm left really underwhelmed.

It doesn't boost beyond 2030MHz core, although I can OC the memory from 10,000MHz to 11,500, and it runs relatively hot in Heaven (close to 70degC with fans at 60%) and in demanding games (I'm running 3 1080p monitors).

It looks like the card is power throttling, even at stock clocks, and I can't get the slider beyond 104% in Afterburner.
Playing with the voltage/frequency curve hasn't helped much so far, I believe because of this power limitation.

Is this behavior normal for this SKU? Any chance of flashing, say, the Z model BIOS in a safe way to get more OC headroom?

thanks in advance

Olivier


----------



## white owl

Sounds normal. Some GPUs are designed more cheaply than others, and some are just better pieces of the wafer. Not sure why you can't take your TDP up to 120% though; my 8-pin SC gives me 120% but I can't even keep it over 2000MHz. Vrel, Pwr, Vrel, Pwr, Vrel, Vrel, Pwr...
Haha


----------



## nolive721

The power limit restriction is what puzzles me the most, indeed, considering the Gaming X has 8+6 pin.

Kind of almost regretting my ZOTAC now lol, but god bless AMAZON and its 30-day return policy if I lost the silicon lottery or MSI just locked the BIOS on that SKU for whatever reason.


----------



## white owl

I'd get a G1 if you can.


----------



## nolive721

Well, they cost the equivalent of 900USD over here in Japan, so thank you but no lol


----------



## EDK-TheONE

is t4 bios compatible with msi 1080 gaming x?


----------



## andydabeast

nolive721 said:


> hello guys.seeking for advice
> 
> I sold for more than what I bout it for, my ZOTAC 1080 AMPExtreme+ which was a good OCer (beyond 2100core/12,000Mhz memory) because of my new rig theme(White/black/red) as well as the fans being too loud, and got it replaced with a MSI Gaming X
> 
> Just started to play with the OCing potential last night but I am left really underwhelmed
> 
> It doesnt boost beyond 2030Mhz core, although I can OC the memory from 10,000Mhz to 11,500, and run relatively hot in Heaven (close to 70degC with fans at 60%) or in demanding games ( I am running 3 1080p monitors)
> 
> looks like the card is power throttling, even at stock clocks!, and I cant get the slider beyond 104% in Afterburner
> Playing with Voltage/Freq curve didnt help that much so far I believe because of thsi power limitation
> 
> Is that normal for such SKU to have this behavior, any chance to flash say the Z model BIOS in a safe way in order to get more OC headroom?
> 
> thanks in advance
> 
> Olivier





white owl said:


> I'd get a G1 if you can.


My windforce is a lesser binned G1. I have a single 8-pin power connector and my TDP slider only goes up to 108% At +50 on the voltage slider I have the core at +175 and mem at +475. Even back when I was on air I hit 2100mhz regularly with temps in the 70's. I had an aggressive fan curve. 
That is my experience.


----------



## nolive721

Managed to push the core to 2060, but only by increasing fan speed to 80%, so even though this cooler is well designed & quiet, that's not something I like doing.
The card is completely power throttling, and I've noticed the voltage drops are quite significant; it certainly doesn't hold 1.093V like my ZOTAC did.

Lurking at the Gaming Z BIOS right now and might take the plunge and flash it over the weekend.


----------



## outofmyheadyo

Does anyone know the thickness of the VRM and memory thermal pads on the Seahawk EK X 1080? I managed to lose mine and I'm not sure if it's 1mm or 1.5mm for the VRM, and likewise for the memory.
I also sent an email to EK asking about it, but no answer yet since it's the weekend.


----------



## coreykill99

outofmyheadyo said:


> Does anyone know the thickness of the VRM and memory thermal pads on the Seahawk EK X 1080 ? I managed to lose mine and not sure if it`s 1mm or 1.5mm for the VRM and for the memory.
> Also sent an email to EK asking about it but no answer yet since it is the weekend.


It's just a standard EK block, I would assume, just made to look a little fancier. I have their MSI block on my card (the Gaming model), which uses the same PCB layout as all their non-reference cards. The block is the EK-FC1080 GTX TF6, and according to the installation manual I just dug out, you need 0.5mm pads on the GDDR modules and 1.0mm pads on the VRM.

Hope that helps.


----------



## outofmyheadyo

Thanks for the reply. I added some pads I had lying around that I thought were 1mm for the VRM, but I still see the caps in there; it's a little strange.
The block looks exactly like this one https://www.ekwb.com/shop/ek-fc1080-gtx-tf6-acetal-nickel and the user manual does say 1mm; perhaps I should try the 1.5mm ones.


----------



## tangelo

nolive721 said:


> the power limit restriction is what puzzles me the most indeed considering the Gaming X has a 8+6pin
> 
> kind of almost regretting my ZOTAC now lol) but god bless AMAZON and its 30day return policy if I lost the silicon lottery or MSI just locked BIOS on that SKU for whatever reason


Do you have Gaming X or Gaming X+ ?

My Gaming X+ has the same "problem". The power limit is maxed at 104%. I'm unable to flash other BIOSes, as the X+ uses different memory than the normal Gaming X cards.
I've seen reviewers with BIOSes that have a higher PL, but those are nowhere to be found.

I've asked MSI about this multiple times and they dodge the question, just saying the BIOS in my card is the one it should use. No answer to "why do review samples have a different BIOS with higher power limits", even when the cards are the same SKU.


----------



## white owl

tangelo said:


> Do you have Gaming X or Gaming X+ ?
> 
> My Gaming X+ has the same "problem". Power limit is maxed at 104%. Unable to flash other BIOSes as X+ uses different memory than the normal Gaming X cards.
> I've seen reviewers having bioses with higher PL but those are nowhere to be seen
> 
> I've asked MSI about this multiple times and they dodge the question and just say that the bios in my card is the one it should use. No answer to "why review samples has different bios with higher power limits", even when the cards are the same SKU.


Can you show me what you mean? Do you remember the videos?
I believe you 100% but I want to see.


----------



## tangelo

white owl said:


> Can you show me what you mean? Do you remember the videos?
> I believe you 100% but I want to see.


Take this for example.
http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_gaming_x_plus_8g_review,34.html

113% power limit on its BIOS.

The BIOS most reviews seem to have is 86.04.66.00.2C (2017-03-30).
The BIOS my retail card has is 86.04.66.00.52 (2017-07-03).

And like I said, the PL only goes to 104% on my BIOS. The card maxes out at 220W.
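For what it's worth, the relationship between the slider percentage and the wattage cap can be read straight from the driver rather than inferred from Afterburner. A sketch using nvidia-smi's CSV query mode (the sample numbers below are hypothetical, not a real Gaming X capture; the slider percentage is just max limit over default limit):

```python
# Sketch: read the BIOS power limits from the driver instead of guessing
# from the Afterburner slider.  Uses nvidia-smi's CSV query mode; the
# sample string below is made-up example output, not a real capture.
import csv
import io
import subprocess

QUERY = ["nvidia-smi",
         "--query-gpu=power.default_limit,power.limit,power.max_limit",
         "--format=csv,noheader,nounits"]

def parse_limits(csv_text):
    """Return (default_w, current_w, max_w, slider_pct) for the first GPU."""
    row = next(csv.reader(io.StringIO(csv_text)))
    default_w, current_w, max_w = (float(v) for v in row)
    # The "power limit %" slider ceiling is just max/default
    return default_w, current_w, max_w, round(100 * max_w / default_w)

sample = "211.00, 211.00, 220.00\n"   # hypothetical numbers: 104% of 211 W
print(parse_limits(sample))           # -> (211.0, 211.0, 220.0, 104)

# Against real hardware you would run:
#   out = subprocess.check_output(QUERY, text=True)
#   print(parse_limits(out))
```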


----------



## hotrod717

nolive721 said:


> hello guys.seeking for advice
> 
> I sold for more than what I bout it for, my ZOTAC 1080 AMPExtreme+ which was a good OCer (beyond 2100core/12,000Mhz memory) because of my new rig theme(White/black/red) as well as the fans being too loud, and got it replaced with a MSI Gaming X
> 
> Just started to play with the OCing potential last night but I am left really underwhelmed
> 
> It doesnt boost beyond 2030Mhz core, although I can OC the memory from 10,000Mhz to 11,500, and run relatively hot in Heaven (close to 70degC with fans at 60%) or in demanding games ( I am running 3 1080p monitors)
> 
> looks like the card is power throttling, even at stock clocks!, and I cant get the slider beyond 104% in Afterburner
> Playing with Voltage/Freq curve didnt help that much so far I believe because of thsi power limitation
> 
> Is that normal for such SKU to have this behavior, any chance to flash say the Z model BIOS in a safe way in order to get more OC headroom?
> 
> thanks in advance
> 
> Olivier


Take a look at the settings in AB. Try choosing MSI extended limits, and the 2nd BIOS if it has one.
Don't be confused: the Gaming X is not a Lightning.


----------



## nolive721

I bought the Gaming X; the X+ and Z models are incredibly highly priced over here in Japan.
I had read good reviews about this card's cooler, and I expected the 10 phases would deliver the power needed and avoid the throttling the card is doing now.

The 8+6 pin should have given me a hint about the real power potential of this particular card compared to my ZOTAC AMP Extreme+.

I still have 3 weeks before the AMAZON free return window ends, so if I don't find a BIOS to push the power limit further, it's going back.

thanks all for your replies


----------



## white owl

nolive721 said:


> I bough the Gaming X.the X+ and the Z models are incredibly high prices over here in Japan.
> I had read good reviews about the cooler of that card and I expected the 10phases would deliver the power needed and avoid throttling like the card does now.
> 
> the 8+6pins should have given me a hint about the real power potential of that partiucular card compared to my ZOTAC AmpExtreme+
> 
> still have 3 weeks before the AMAZON free return ends so if I dont find the Bios to push the power limit further.its going back.
> 
> thanks all for your replies





tangelo said:


> Take this for example.
> http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_gaming_x_plus_8g_review,34.html
> 
> +113% powerlimit on it's bios.
> 
> The bios most reviews seem to have is 86.04.66.00.2C (2017-03-30)
> The bios my retail card have is 86.04.66.00.52 (2017-07-03)
> 
> And like I said, the PL goes only to 104% on my BIOS. Card maxes out at 220W


https://www.techpowerup.com/vgabios...=GTX+1080&interface=&memType=&memSize=&since=

Take your pick. My 1080 SC was sold with two BIOSes: one at 1709/1251 and one at 1709/1376, which is the highest memory clock of any EVGA 1080. I've wanted to flash it but I'm scared to. Why would this version of the SC have +70MHz on the memory? What's different about the hardware? I asked EVGA and (obviously) they were no help lol.


----------



## Wuest3nFuchs

Hey guys!

A mate just moved to an 8700K paired with a GTX 960 and wants to upgrade to a 1080. Atm I'm not in a position to tell him "yeah, take it, it's a good card", so what would you guys say about this card?

He showed me this video




Now, how is the Gainward 1080 Phoenix GLH in terms of OCing, can it get to nearly 1.8-2GHz on the core?

Does the card have any restrictions like a voltage lock?

cheers fox


----------



## Sn4k3

Hi guys. I recently spent a couple of hours playing with the voltage/frequency curve in Afterburner, going through every voltage point on the curve for my EVGA GTX 1080 SC. Everything was fine until I restarted and noticed that some of the points had changed themselves, jumping either up or down.
For example, I had my top frequency set to 2012MHz at 0.993V, which I tested to be completely stable; after restarting my computer it's at 2050MHz for that voltage... I just want it to stay wherever I set it.

Do you know of any workaround for this? It's driving me insane.


----------



## AlbertoM

@Sn4k3

The curve shifts with temperature. You have to get the card to the temperature it will actually run at under load, and then set the curve.
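This is GPU Boost 3.0's temperature compensation: the whole voltage/frequency curve slides down in small steps as the core warms up, so a point dialed in on a cold card shows a higher clock after a cool reboot. A rough sketch of the effect (the 13MHz-per-5C-above-40C numbers are a community approximation for Pascal, not an official spec):

```python
# Sketch of why a saved curve "moves": GPU Boost 3.0 shifts the whole
# voltage/frequency curve down in ~13 MHz steps as the core heats up
# (roughly every 5 C above ~40 C on Pascal; approximate, not official).

def effective_clock(curve_mhz_at_cold, temp_c,
                    step_mhz=13, step_c=5, start_c=40):
    """Estimate the clock you'll actually see at a given core temp
    for a curve point that was dialed in on a cold card."""
    if temp_c <= start_c:
        return curve_mhz_at_cold
    bins = (temp_c - start_c) // step_c
    return curve_mhz_at_cold - bins * step_mhz

# A 2050 MHz point set at 40 C lands around 1972 MHz once the core hits 70 C:
print(effective_clock(2050, 70))   # -> 1972
```

That matches the advice above: set the curve once the card is at its steady-state load temperature, and it will hold where you put it.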


----------



## white owl

Wuest3nFuchs said:


> Hey guys !
> 
> A mate who just moved to a 8700k paired with a GTX 960 and wants to update to a 1080...atm i'm not in the position to tell him ,yeah take it it's a good card...so what would you guys say to this card ???
> 
> He showed me this video
> 
> 
> 
> 
> Now is the Gainward 1080 Phoenix GLH ,that thing in terms of OCing to nearly 1.8-2ghz on the core ?
> 
> Does the card have any restrictions like a voltagelock ?
> 
> cheers fox


Most cards have a voltage limit and a TDP limit; the question is what the limit is. I don't actually know, but a good reviewer should mention it.
Almost any 1080 with a good cooler can reach 2GHz, assuming the case airflow is adequate.
With optimized games I'm able to maintain 144fps at 1440p with a stock SC using maxed-out textures and effects and lower AA. Running it at +180 core / +200 mem does seem to increase my minimums, but there is the occasional drop under 144fps.

Games with lots of eye candy obviously won't hold that, but I'm usually happy with 60-72fps in those.


----------



## Wuest3nFuchs

Thanks man, sounds good, green light for him


----------



## Heky

Wuest3nFuchs said:


> Thanks man ,sounds good,green lights for him


Could you please make a BIOS backup of his card when he gets it? And if you can, please use the latest nvflash to do it and post the BIOS here. I would really appreciate it. Thanks.


----------



## Wuest3nFuchs

Sure i will let u know as soon as he has the card in his hands !


----------



## white owl

Heky said:


> Could you please make a BIOS backup of his card when he gets it? And if you can, please use the latest nvflash to do it and post the BIOS here. I would really appreciate it. Thanks.


GPUz is what you pull the BIOS with.
The GLH BIOS can be found here, 3 versions:
https://www.techpowerup.com/vgabios...=GTX+1080&interface=&memType=&memSize=&since=


----------



## Heky

white owl said:


> GPUz is what you pull the BIOS with.
> The GLH BIOS can be found here, 3 versions:
> https://www.techpowerup.com/vgabios...=GTX+1080&interface=&memType=&memSize=&since=



Sorry to disappoint, but those don't work on the new revisions of those cards. I don't know if Gainward changed something or nvidia did, but I have a 1080 Phoenix GS with BIOS version 86.04.60.00.49, and flashing any of the BIOSes you linked just doesn't work.

Also, I read somewhere that GPU-Z has problems extracting the newer Pascal BIOSes (especially the UEFI part of the vBIOS); that's why I suggested dumping the BIOS with the latest nvflash.


----------



## white owl

Heky said:


> white owl said:
> 
> 
> 
> GPUz is what you pull the BIOS with.
> The GLH BIOS can be found here, 3 versions:
> https://www.techpowerup.com/vgabios...=GTX+1080&interface=&memType=&memSize=&since=
> 
> 
> 
> 
> Sorry to dissapoint, but those dont work on the new revisions of those cards. I dont know if Gainward changed something or if nvidia did, but i have a 1080 Phoenix GS with bios version 86.04.60.00.49, and flashing any of the Bioses you linked, just doesnt work.
> 
> Also i read somewhere that Gpu-z has problems with extracting the new pascal bioses(especialy with the UEFI part of the vbios), thats why i suggested to dump the bios with the latest nvflash.

Omg this bios stuff is getting ridiculous. Very well then lol.


----------



## nolive721

nolive721 said:


> managed to push the core to 2060 but with increasing fan speed to 80% so even if this cooler is well designed&quiet, that is not something I like very much to do
> the card is completely power throttling and I have noticed that Voltage drops are quite significant, certainly doesn't hold to the 1.093V like my ZOTAC was
> 
> lurking at the Gaming Z bios right now and might take the plunge and flash it over the week-end


I found out something really weird. I will post it in the Ryzen CPU thread as well, but thought I'd share my experience here first.

I didn't manage to push the card past 2060 core and 11,600MHz memory, and I was still seeing rather low benchmark scores in Heaven, Assetto Corsa and the ROTR game benchmark, which I used as references against my ZOTAC card. Like 114fps/2800pts in Heaven vs 3200 with the Zotac, or 103fps in Assetto Corsa vs 117 with the Zotac.

Then I decided to try something. When I sold the Zotac card, I also decided to buy and install a 2400G CPU to keep me gaming before buying the MSI card.
All the benchmarks above were run with the 2400G.

I was not overly impressed with that chip, because of high temperatures compared to what I was getting with my 1500X, and also the many bugs I was experiencing in combination with my ASRock motherboard.

So this weekend I decided to swap out the 2400G and reinstall the 1500X... and guess what, all benchmarks improved dramatically: Heaven at 122fps/3100pts, AC at 114fps, close enough to the ZOTAC numbers to leave me far fewer regrets about ditching the AMP Extreme.

In both cases the CPUs were clocked at 3.9GHz and the benchmarks ran with the same settings. I couldn't see the 2400G throttling at all, since the AIO I'm using kept the CPU die temp around 70degC.

I tried running with the Vega iGPU enabled and disabled, and it made no difference to the fps with the Gaming X.

Really curious to understand why, and also to hear from anybody here who has experienced something similar, although I believe I might be an isolated case.


----------



## carlskie86

Hi everyone, I just purchased a GTX 1080 FE from a friend and finished my PC.

Running an R5 1600 paired with the GPU in a mini-ITX case, an SG13 to be exact.

I was wondering how I can lower the temps; I get as high as 82C at full load and the clock dips into the 1700s with my custom fan curve. Should I overclock it or underclock it?

Sorry for the noob question, but I'm willing to learn.


----------



## white owl

Re-paste the card and improve case airflow. What's your cooling situation like and what case do you have? How much clock speed do you lose when it hits 80C?
Overclocking is like free performance and doesn't make the GPU hotter until you add voltage.

Since it's a reference card you'll never have great temps, but if you redo the paste and you have good case airflow, it should give you more boost and more room if you want to overclock. Underclocking shouldn't even be considered IMO.


----------



## buellersdayoff

carlskie86 said:


> Hi everyone i just purchased gtx1080fe from a friend and finished my pc..
> 
> Running r5 1600 paired with the gpu on mini itx sg13 to be exact
> 
> I was wondering if how can i lower the temp i get as high as 82c on full load and clock dips at around 17+ with my custom fan curve, should i overclock it or underclock?
> 
> Sorry for the noob question but im willing to learn





white owl said:


> Re-paste the card and improve case flow. What's your cooling situation like and what case do you have? How much clockspeed do you loose when it hits 80c?
> Overclocking is like free performance and doesn't make the GPU hotter until you add voltage.
> 
> Since it's a reference card you'll never have great temps but if you do the paste and you have good case flow it should give you more boost and give you more room if you want to overclock it. Underclocking shouldn't even be considered IMO.


Try undervolting with a curve; see how far you can clock it at, say, 1V. You might also gain a couple of degrees with some good paste.


----------



## white owl

It depends on a lot of things, like how good the cooler is and how bad the old paste was, but I usually gain 5 to 10C with non-reference coolers. My 1080 SC is still pretty young and I dropped over 5C IIRC. A friend just did his reference 980 Ti, which he said dropped his fan speed considerably; unless you run a constant fan speed before and after the repaste you'll never really know how much it changed.
The stock paste is crap on Nvidia and EVGA cards; not sure about the others.


----------



## carlskie86

^Doesn't re-pasting the GPU on a Zotac 1080 void the warranty?

Anyway, while playing Need for Speed Payback @ ultra settings: it starts at 50C with the GPU clock at 1863, and when I reach 76-79C the GPU clocks at 1810-1797-1785 @ 80-90fps.

Here's my case


----------



## white owl

Not sure about warranty. Look for the sticker or call them.
So your GPU intake is right up against the wall, right? Can the card be flipped to a normal orientation to get some air? Is it possible to use all the case fans as intakes so positive pressure pushes the heat out the back?

I don't really see any way to cool anything in that case. Without putting a thick 140mm rad on the GPU it's probably going to run hot, unless things can be reconfigured so air can get in there. What are your temps like with the side open?


----------



## carlskie86

With the side open I get 74-76C. I see, so it's a dead end then; I guess I've got to live with it.


----------



## white owl

Or force more air into the case. Or leave the side open. Or get a better case.


----------



## carlskie86

^I just assembled this one... anyway, thanks for the reply, appreciate it.


----------



## white owl

You could throw a CLC on it.


----------



## carlskie86

CPU liquid cooler?


----------



## white owl

Yes


----------



## carlskie86

white owl said:


> Yes


I'm already using an H75 AIO.

My CPU temp is tolerable; I get 40-50C gaming.


----------



## white owl

Use an AIO on the GPU. You probably can't fit a good air cooler on it. I still don't know what your intake/exhaust situation is like, but I'd try to fix that first; removing the side panel shouldn't drop the GPU 10C.
If I had to guess, I'd say the AIO is part of your main intake/exhaust, and that's a big part of the problem in these little cases.
Catch-22: you can't fit a good air cooler, so you use an AIO, which takes up one of your few intake/exhaust mounts, which turns the case into an oven.

What are your system specs?
If you can't fit two AIOs, I'd put the H75 on the GPU and get an air cooler that fits the CPU. The CPU will run hotter, but it shouldn't perform any differently, unless it's Ryzen, where you'll possibly lose boost, but I doubt it. Normally a budget air cooler runs cooler than an H75, but in your situation I doubt it would, because of your limited space (smaller cooler) and limited air.

The first thing you should do is maximize case airflow, unless you don't mind the GPU running hot. It'll lose boost, but it's safe.
What are your system specs, and how are your fans laid out? What fans are they?

Cooling the GPU will get you more performance than anything else, so I'd focus on that, even at the cost of CPU clock speed if necessary.


----------



## carlskie86

Using an AIO on the GPU might not be possible in this tiny case.
My intakes are the push-pull rad on the H75, the GPU intake, and the PSU as intake. My exhausts are the GPU blower and a Noctua 90mm fan on the side of the PSU.

Here are my specs:


----------



## white owl

Using the rad as an intake isn't ideal at all. If it's possible to use it as an exhaust and everything else as intakes, you might try that.

For the best performance you need more cooling on the GPU than the CPU, since the GPU is usually your limiting factor. I'd sacrifice whatever was needed to put the CLC on the GPU (as an exhaust) and to get more intake. It sounds like the rad is your only real intake, which is putting all the CPU heat into the case and through the GPU.

It's better to have more intake than exhaust in most situations, as positive pressure will push air through the exhausts anyway.


----------



## carlskie86

I see, I will try switching them all to intakes. Thanks!


----------



## Bride

Hi guys,
I just replaced a Gigabyte GTX 980 Ti GV-N98TG1 GAMING-6GD with an EVGA GeForce GTX 1080 FTW Gaming (08G-P4-6286-KR).
There aren't editors like the lovely Maxwell BIOS Tweaker, but I'd like to know if I can mod my BIOS with a hex editor or something similar.
I also have to unlock it: I actually modified the power limits with Mobile Pascal TDP Tweaker, but I can't flash the result because of the "BIOS Certificate 2.0 verification failed" error.
I tried the Asus T4 BIOS, but it stutters under load.

Thanks


----------



## white owl

Word around the school yard is you need a hardware flasher to do it.


----------



## Moutsatsos

I'm just wondering if anyone else is having issues with the GTX 1080 and Unity engine games.
I have an R7 [email protected] 3466 CL16, an MSI Gaming X 1080 @ 2050, AIO cooled, and I'm getting low fps in Frostpunk and Pillars of Eternity 2 at 1920x1080.
In Frostpunk I used to get 25-30 fps from the start, and the GPU was downclocking to 1683MHz. After the latest Windows update it somehow miraculously fixed itself, and the game ran at 120 fps, no issues.
Now I'm trying to play Pillars and I get the same issue: 50 fps, and when I stress the CPU (encoding) the GPU downclocks and I get 30 fps. I mean seriously, *** 30-50 fps?
Btw, if I set the GPU clock to a constant 2050 I still get the same performance.
Tried power plans, power management modes, lower resolution, higher resolution; all give the same result.
I'm just wondering if anyone else is having a similar issue and if anyone found a workaround.
On the latest Nvidia drivers (optimised... for Pillars).


----------



## white owl

How many driver updates have you done? If you've gone from one to another, it might be time for a clean install: boot into safe mode, use DDU, then install whatever driver you like.
Dropping 300MHz off your clock speed really stings, I know lol.


----------



## Moutsatsos

white owl said:


> How many driver updates have you done? If you've gone from one to another, it might be time for a clean install: boot into safe mode, use DDU, then install whatever driver you like.
> Dropping 300MHz off your clock speed really stings, I know lol.


That's the thing: I always uninstall drivers from safe mode with DDU and install new ones after restarting.


----------



## white owl

Oh my, that is a problem.
Is it possible your VRMs and RAM chips aren't cooled properly, making the GPU throttle? Or is it something that happens before the card is even warm?
Another thing that's helped me quite recently was removing a game and reinstalling it.


----------



## buellersdayoff

Moutsatsos said:


> That's the thing: I always uninstall drivers from safe mode with DDU and install new ones after restarting.


Check your GPU usage in Afterburner. I had an issue last week where mine would drop to 55% with FPS tanking. Haven't had time to resolve it yet as I'm away for work.


----------



## Moutsatsos

Second pic: I started encoding in the background, so you can see the drop in GPU usage and clock.


----------



## Heky

What are you using to encode? Seems to me your GPU is being utilized for the encoding too; that's why the clock drops and your game lags. Or it could be that your CPU is bottlenecking the GPU while the encode is running. Try restricting the encoder to all but 1-2 cores; it should improve instantly.


----------



## Moutsatsos

Heky said:


> What are you using to encode? Seems to me your gpu is being utilized for encoding too, that is why the clock drops and your game lags. Or it could be that your cpu is bottlenecking the gpu while encoding is running. Try setting the encoder to all but 1-2 cores, it should improve instantly.


StaxRip. It doesn't use GPU processing power at all; if it did, you'd see GPU usage go up, not down.
There is something wrong with the combination of the game engine, the GTX 1080 and Ryzen CPUs. It's been happening with other games too, on similar systems.
As you can see, the game doesn't fully utilize the GPU, hence the lower fps. Might be game optimization.
Someone on a Steam forum suggested it had to do with the Unity engine and G-Sync; he said disabling G-Sync on his monitor fixed the issue.
Problem is, my monitor doesn't have G-Sync.
Someone else reinstalled Windows a couple of times and managed to fix it. Far-fetched, but it means something is buggy in there.


----------



## nvidia bios

*GTX 1080 G1 Gaming*

Who has the original BIOS for the GTX 1080 G1 Gaming? I need to flash back to it. HELP ME!


----------



## white owl

There are 5 versions:
https://www.techpowerup.com/vgabios...=GTX+1080&interface=&memType=&memSize=&since=


----------



## jon666

Finally updated drivers. First thing in the release highlights was "Add memes..." Hopefully GTA stops crashing after a couple of hours.


----------



## microchidism

So, last time, when I had a lottery-winning GTX 970, a lot of people were disappointed that I didn't sell it to someone in the community who would appreciate it.

Before that happens again:

Is anyone interested in a Strix 1080 that does 2100+ on stock voltage?


----------



## stephenn82

microchidism said:


> So, last time, when I had a lottery-winning GTX 970, a lot of people were disappointed that I didn't sell it to someone in the community who would appreciate it.
> 
> Before that happens again:
> 
> Is anyone interested in a Strix 1080 that does 2100+ on stock voltage?


Good luck with sale! That is a heck of a card!

Oh, what do you define as stock voltage?
My FTW Hybrid tops out at 43c and will run 2114 at 1.063 to 1.083 volts.


----------



## TrueForm

Upgraded from an RX 580. I have the Gaming X model; very cool and quiet. Feels good coming back to Nvidia.


----------



## Yetyhunter

Hello everyone,

I need some advice on overclocking my Strix 1080. I installed Afterburner, set the voltage slider and power limit to max (120%), and the highest boost I get is 2038, with occasional drops to 2025. The core is at +70 and memory at +300.
Anything higher results in a crash in Firestrike's second test, EXACTLY in the same spot. I believe this is the definition of instability. Unigine runs fine with higher clocks though, no artifacts. Temperatures don't exceed 70°C.

Do you think Firestrike isn't a reliable stability tester? Should I try a different one? I'm a little disappointed by the OC ability of the card.

Strange thing I noticed: if I lower the power limit to 105% it's almost stable at +80, in that it manages to pass the test sometimes, but it results in an overall lower score. BTW, my graphics score is about 22300 at the highest. Is that normal?


----------



## nolive721

New FTW Hybrid 1080 owner here. OC'd to 2134MHz core and 12,000MHz memory (wow, just wow) and not going beyond 52-53°C, even with the Japanese summer starting and even under heavy benchmarks or gaming.

Scored above 3200/128fps in Heaven 4.0, and it's running my triple-1080p setup with AAA titles at Ultra like a champ.

My first experience of GPU water cooling is a pretty good one. Never going back to air!


----------



## white owl

Yetyhunter said:


> Hello everyone,
> 
> I need some advice on overclocking my Strix 1080. I installed afterburner set the voltage slider and power to max 120% and the highest boost I got is 2038 with occasional drops to 2025. Clock is at +70 and memory +300.
> Anything higher results in crashing in firestrike second test EXACTLY in the same spot. I believe this is the definition of instability. Uniengine runs fine though with higher clocks and no artifacts. Temperatures do not exceed 70°C.
> 
> Do you think Firestrike is not a reliable stability tester ? Should I try a different one ? I am a little bit disappointed by the OC ability of the card.
> 
> Strange thing I noticed; If I lower the power to 105 it's almost stable at +80 because it manages to pass the test sometimes but it results in an overall lower score. BTW my graphics score is about 22300, the highest I got. Is it normal?


Firestrike is not a stress test in any way; you can often be +50MHz or so past what's stable and Firestrike will run just fine. It's a benchmark, after all.

There isn't really a stress test for GPUs that works well, IMO; just run games with uncapped framerates and you'll eventually find your stable core clock.
As for memory, you can run games without any artifacts or crashes even while your VRAM is unstable.
Leave the memory at stock and only OC the core until you know it's stable. Most high-FPS games do a good job of stressing the GPU, in my experience.
Once you've found that, you can move on to the memory. Just bump it up 100MHz at a time and benchmark each time. When your score stops improving is where I stop: past that, performance gets worse, as the error correction is doing more work than it should.
Not sure how you can be disappointed in a Strix happily running 2000MHz; 2100MHz wouldn't even be a 5% boost.
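FWIW, the bump-and-bench routine above is basically this little loop. `apply_memory_offset` and `run_benchmark` are hypothetical stand-ins for whatever OC tool and benchmark you actually drive by hand:

```python
def find_memory_offset(apply_memory_offset, run_benchmark, step=100, max_offset=1500):
    """Raise the memory offset one step at a time until the score stops improving.

    Past the sweet spot, GDDR5X error correction starts retrying transfers,
    so the score falls even though nothing visibly artifacts or crashes.
    """
    best_offset, best_score = 0, run_benchmark()
    offset = step
    while offset <= max_offset:
        apply_memory_offset(offset)
        score = run_benchmark()
        if score <= best_score:
            break  # no improvement: the previous step was the keeper
        best_offset, best_score = offset, score
        offset += step
    apply_memory_offset(best_offset)  # settle on the best-scoring offset
    return best_offset, best_score
```

Same idea as doing it by hand: you stop at the last offset that still raised the score, not at the last one that merely survived.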


----------



## Yetyhunter

Thank you for the clarifications.

The card runs most games fine at higher clocks than the benchmark, because games don't load it at 100% constantly. It still runs everything more than fine, providing high frames for the G-Sync monitor.
I thought about applying liquid metal to the chip like I did with the CPU. Don't know if it's worth it though. Maybe a different cooling solution in my Meshify C?

What's your Firestrike graphics score, so I have a comparison?


----------



## white owl

Idk, I only play games, and not ones that are easily benched.

If the heatsink's cold plate is aluminum, liquid metal is a big no-no: it will react with the cooler, you run the risk of killing the card, and it will 100% eat up the cold plate. The temp difference between CLU and MX-4 on a GPU die was zero when I tested it on my 980 FTW, which is now dead from the two years it spent with CLU on it... I thought the shiny finish was nickel plating, but it wasn't.

You can run it at whatever clock speed you want, TBH, but you might get worse performance if the memory is pushed too far, or driver crashes if the core is. Since the performance difference between 2000MHz and 2050MHz is less than 3%, I'd prefer to know it's stable, but to each his own. The memory OC over stock would be worth even less in most cases.

Since Nvidia locked us out of modding the BIOS, I don't see much need in stressing over a few MHz when we once got a few hundred MHz over stock from BIOS OCs. I already play everything at 1440p, well over 144fps in most cases, so there's really no point. If you have G-Sync there's even less point, but like I said, to each his own.


----------



## Bluebell

nolive721 said:


> new FTW hybrid 1080 owner here. OCed to 2134mhz core and 12,000Mhz memory(wow just wow) and not going beyond 52-53degC, in Japan summer starting, even under heavy benchmark or gaming
> 
> scored above 3200/128fps in heaven 4.0 and running my triple 1080p set-up with AAA titles ULTRA like a champ
> 
> my 1st experience of GPU water cooling is a pretty good one, never going back to Air!



ASUS ROG Strix 1080 owner here. Just put mine under water and am very happy with the results. The snip is HWiNFO64 readings while playing a game of WoT, with the second-to-last column being the max values.


----------



## stephenn82

nolive721 said:


> new FTW hybrid 1080 owner here. OCed to 2134mhz core and 12,000Mhz memory(wow just wow) and not going beyond 52-53degC, in Japan summer starting, even under heavy benchmark or gaming
> 
> scored above 3200/128fps in heaven 4.0 and running my triple 1080p set-up with AAA titles ULTRA like a champ
> 
> my 1st experience of GPU water cooling is a pretty good one, never going back to Air!


Welcome to the club! Good results there!! I am going to build a custom loop, picked up blocks from a local micro center. Just need fittings, lines, time, and money 😎


----------



## nolive721

Bluebell said:


> ASUS Rog Strix 1080 owner here. Just put mine under water and very happy with the results. The snip is HWiNFO64 readings while playing a game of WoT with the second to last column being the max results.


Sweet, that custom loop, man! I have my CPU on an AIO; I'll post some pics later tonight when I get back home.

I went lazier than you, buying the FTW variant with the built-in Hybrid cooler rather than modding an air-cooled card.

With less demanding games the card won't go past 45-46°C, even now that we're hitting the Japanese summer, so as I said yesterday, it won't be any time soon before I go back to air, and that goes for both GPU and CPU.

People complaining about noise either weren't lucky with their pumps or just don't make the effort to work out fan curves properly.

I'm using SpeedFan myself and it works beautifully to achieve a proper noise/performance ratio.
@stephenn

Good luck with the custom loop.

Maybe I will take the plunge some day...


----------



## Bluebell

nolive721 said:


> People complaining about noise either weren't lucky with their pumps or just don't make the effort to work out fan curves properly.
> 
> I'm using SpeedFan myself and it works beautifully to achieve a proper noise/performance ratio.



SpeedFan won't recognise my fan headers, so I've had to stay with the MB utility (SIV). I did spend quite some time playing with profiles, so my system is very quiet at no load, with the fans and pump hitting max at a CPU temp of 70C.


----------



## nolive721

I see.
Never had great results with SIV, to be honest, on my Gigabyte Z97N board, so I started using SpeedFan a few years ago, with quite good results on a rig purely on air cooling (CPU and GPU).

So I stuck with it when I built my AMD rig last year with an ASRock board.

SpeedFan is great in the sense that it not only lets you set curve profiles, but can also use CPU, MB and GPU temps as triggers, not in isolation but in combination.

It might look a bit finicky at first, and your MB and GPU sensors must be compatible, but yes, a great tool.

The last upgrade (wishful thinking, maybe, lol) in my rig would be to connect the EVGA GPU radiator fan to a PWM fan hub I built into my Aerocool P7-C1 case, to have FULL control of all the fans in the case and be able to bring the noise level even lower.
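The "triggers in combination" trick is really just driving one fan off the worst-case of several sensors instead of a single one. A rough sketch of the logic (sensor names and curve points are made up for illustration, not SpeedFan's actual internals):

```python
def fan_pwm(temps_c, curve):
    """Return a PWM % for the hottest of several sensor readings.

    `curve` is a list of (temp_C, pwm_percent) points; between points
    we interpolate linearly, outside them we clamp.
    """
    t = max(temps_c.values())  # combined trigger: react to the worst sensor
    points = sorted(curve)
    if t <= points[0][0]:
        return points[0][1]
    if t >= points[-1][0]:
        return points[-1][1]
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            return p0 + (p1 - p0) * (t - t0) / (t1 - t0)

# Hypothetical profile: near-silent below 30C, ramping to 100% at 70C.
curve = [(30, 20), (50, 40), (70, 100)]
print(fan_pwm({"cpu": 45, "mb": 38, "gpu": 52}, curve))  # the GPU is hottest here
```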


----------



## Bluebell

nolive721 said:


> SpeedFan is great in the sense that it not only lets you set curve profiles, but can also use CPU, MB and GPU temps as triggers, not in isolation but in combination.



SIV and the latest Gigabyte boards allow you to link any MB temp sensor to any MB fan header.


----------



## nolive721

You mean combining several temp triggers? If so, WOW, that's quite an achievement indeed.


----------



## Bluebell

nolive721 said:


> You mean combining several temp triggers? If so, WOW, that's quite an achievement indeed.



No, any one fan header to any one temp sensor. For example, the System 1 fan header can be linked to one of the System 1, System 2, VRM MOS, CPU, PCIEX16 or PCH temp sensors. The only one that doesn't do this is the CPU fan header, which can only be linked to the CPU temp sensor. If you wanted to, you could link all your fan headers to the CPU temp sensor, which would make fan balancing for a water-cooled loop a bit simpler.


----------



## broodro0ster

Bluebell said:


> ASUS Rog Strix 1080 owner here. Just put mine under water and very happy with the results. The snip is HWiNFO64 readings while playing a game of WoT with the second to last column being the max results.


I just got the same block for my card and set it up this weekend.
I'm also seeing 45°C max temps. Ambient is 24°C, pump speed is 4200rpm (92%) on my D5, and water temp is 33°C.

Do you have an idea what your pump speed and water temperature are? I'm trying to see whether a 12°C core-over-water delta is a good result.


----------



## Bluebell

broodro0ster said:


> I just got the same block for my card and set it up this weekend.
> I'm also achieving 45°C as max temps. Ambients are 24°C, pump speed 4200rpm (92%) on my D5 and water temp is 33°C.
> 
> Do you have an idea what you pump speed is and water temperature? I'm trying to see if a 12°C core temp over water temp is a good result.



My D5 is PWM controlled and set to run at 100% when the CPU MB sensor reads 50C; usual temps during that game stabilise at about 48C, so my pump is probably running a little faster than yours. The water temp stabilises at about 40C, but I can drop it to 35C if I use a more aggressive fan profile, which makes the system too noisy for my liking. Fan and pump profiles attached. The case fans run off the CPU VRM temp, as the VRMs are passively cooled.


----------



## broodro0ster

Bluebell said:


> My D5 is pwm controlled and set to run at 100% when the CPU MB sensor reads 50C, usual temps during that game stabilise at about 48C so my pump is probably running a little faster than yours. The water temp stabilises at about 40C but I can drop this to 35C if I use a more aggressive fan profile which makes the system too noisy for my liking. Fan and pump profiles attached. The case fans are run off the CPU VRM temp as the VRMs are passive cooled.


Thanks for the detailed info!
So that means your GPU core temp is only 5°C above water temp. Mine sits at a constant 12°C above water, so I'll remount the block on the GPU. Let's hope that gives me better results.


----------



## mikochu

Hey folks,

I've got a GTX 1080 FE and I want to add an external temperature sensor so I can ramp up the zone of case fans near the GPU when needed. Right now I have the probe taped to the backplate, but I don't think it's getting hot enough. Does anyone know if putting the probe between the backplate and the PCB/core would be sufficient? Should I remove the shroud and tape the sensor to the fins nearest the GPU core? I'm trying to avoid removing the heatsink, but might do it to upgrade the thermal paste to NT-H1 or Kryonaut.

I also tried playing with SpeedFan, but I can't get my Crosshair VII Hero's sensors to show up.

Thanks!

Edit: reading some of the previous posts, should I just get a Kraken G12 + Corsair H55?


----------



## KingT

I have a GTX 1080 FE under an EKWB nickel short-top block + EKWB nickel backplate, with an EKWB 240mm SE rad, two Vardar fans, and an EK-XRES 100 SPC MX PWM.

Card is OC'd @ 2100MHz boost with the voltage slider at +50%, 120% PL, and mem @ 1375MHz.

My GPU temp maxes at 48C with the pump @ 1750rpm and the fans hitting a max of 1500rpm.

Since this is my first-ever water cooling, I hope that's decent performance.

CHEERS..


----------



## AlbertoM

A few months in with the EVGA Hybrid kit on my 1080 FE, I've totally found the sweet spot with the XOC T4 BIOS.

I'm using push-pull on the radiator with one EK Vardar and one Corsair ML120, and Grizzly Conductonaut liquid metal under the block.

It's running rock-solid stable at a fixed 2164MHz with KBOOST, memory at 1364MHz (+450MHz), 1.081V, temps max 50C with the radiator fans at medium speeds and the card's blower fan at 3000rpm to keep the VRM cool.

In DOOM at 1440p I get an average power draw of 240W, usually in a range from 220W to 260W, peaking at 275W; buttery smooth 150-200FPS on Nightmare graphics.

I did some benchmark tests against the stock BIOS: at 1080p the stock BIOS is faster, but at 1440p and 2160p the XOC BIOS is totally better because of the unlocked power limits.

I recommend it to everyone who plays at those resolutions and can keep the card cool enough.


----------



## Wuest3nFuchs

Heky said:


> Sorry to disappoint, but those don't work on the new revisions of those cards. I don't know if Gainward changed something or if Nvidia did, but I have a 1080 Phoenix GS with BIOS version 86.04.60.00.49, and flashing any of the BIOSes you linked just doesn't work.
> 
> Also, I read somewhere that GPU-Z has problems extracting the new Pascal BIOSes (especially the UEFI part of the vBIOS); that's why I suggested dumping the BIOS with the latest nvflash.



Here it is!


But why on earth do you want to flash from a GS to a GLH? Does that even work?


----------



## Phantomas 007

I'm thinking of getting the EVGA GTX 1080 SC at 530 euros to replace my old GTX 970. Do you think it's a sensible buy a few days before the announcement of the new generation? Maybe I should wait to see other 1080 offers?


----------



## akromatic

Does anyone know the size of thermal pads used on the 1080 founders card?


----------



## bp7178

Thickness from what I can tell is 1.5mm on all pads. 

This is thicker than what EK uses on their waterblocks.


----------



## gotteshand

Hey,

I recently bought a Gigabyte GTX 1080 Mini ITX 8G and put a Bykski 1070 Mini ITX block on it, which worked out great, granted I had to do this (see below), but it works and you can't see it inside the case.

My problem now: I can OC the GDDR5X +1000 and all is well, no problems or perf drop, so no error correction at work there from clocking it too high. But the card only allows a +5% power limit, which caps the core at a max of 0.85V / 1980MHz under load; any higher is a no-go, as it constantly hits the power limit despite the core being a cool 40°C. I really want to go higher, so my question is whether there is any BIOS I could flash that would work with this card. Otherwise I'll have to solder the shunts (2mΩ shunts on top of the existing ones, gaining 50% more power limit) or do a resistor mod on the voltage controller (10Ω resistors on top of the existing three around the controller for 3x the power limit, which is rather a bit much, or 5Ω for twice). Neither option is really what I want to do if there's a software alternative.
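For anyone curious how the shunt-stacking numbers work out: the power-limit controller infers current from the voltage drop across the shunt, so lowering the effective resistance makes it under-read, which raises the real power at which the limit trips. A quick sanity check of the arithmetic (the 1mΩ stock value is my assumption, back-calculated from the claimed +50% figure, not a measured spec):

```python
def power_limit_multiplier(r_stock_mohm, r_stacked_mohm):
    """Real-power multiplier from soldering a shunt on top of the stock one.

    The two resistors end up in parallel; the controller still assumes the
    stock value, so true power at the trip point scales by r_stock / r_eff.
    """
    r_eff = (r_stock_mohm * r_stacked_mohm) / (r_stock_mohm + r_stacked_mohm)
    return r_stock_mohm / r_eff

# 2 mOhm stacked on an assumed 1 mOhm stock shunt -> 1.5x, i.e. +50% power limit.
print(power_limit_multiplier(1.0, 2.0))
```

The same formula says a stacked shunt equal to the stock value would double the limit, which is the usual shunt-mod rule of thumb. Obviously measure your actual shunts before trusting any of this.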


----------



## black96ws6

Phantomas 007 said:


> I'm thinking to get the EVGA GTX1080 SC in 530Euros to replace my old GTX970. Do you think it will be a correct buy a few days before the announcement of the new generation ? Maybe if I wait to see other 1080 offers ?


Just hang in there, man. The new cards will be revealed in 10 days at Gamescom, and available for sale a couple of weeks later. Once they're out, you'll be kicking yourself if the performance is really great. At the very least, even if the performance is just "ok" for a new gen, it should drop the prices on the Pascal cards...


----------



## uihdff

delete


----------



## bucdan

A used Zotac GTX 1080 Mini for about $350, good deal?


----------



## LMka

Hello guys, can you please let me know if it's possible to SLI a 1080 + 1070 Ti? I thought the 1070 Ti was almost the same as the 1080 and based on the same chip, but I don't get the option to enable SLI for them. Could I somehow flash the 1070 Ti to a 1080 so I get the SLI option?


----------



## tangelo

Possible? Yes.
If it's worth the trouble or if it works without problems? I doubt that.


----------



## Heky

@Wuest3nFuchs

Thanks man! Sorry for the late reply, was away on vacation.

Since the cards are physically the same, the BIOS flash should work no problem, given that the core and the memory are capable of the GLH clocks.


----------



## Scotty99

Anyone in here own the EVGA Hybrid 1080 FTW? Found one on Craigslist for really cheap; curious if they get as hot as some review sites say they do.


----------



## Phantomas 007

Is it worth investing in a GTX 1080 for a 1440p monitor?


----------



## pez

Phantomas 007 said:


> Is it worth investing in a GTX 1080 for a 1440p monitor?


If you're looking for a monitor recommendation--and if you're on a budget, the Dell S2716DGR.


----------



## white owl

pez said:


> If you're looking for a monitor recommendation--and if you're on a budget, the Dell S2716DGR.


I think they're asking if the 1080 runs 1440p well enough; it does. I use a Nixeus EDG 27" and I haven't really come across any games I can't run at 144fps or 120fps. The AAA titles are usually less optimized than the games I play, but even those shouldn't have any issues running at 72 or 60fps.


----------



## pez

Yeah, the GF's rig is running a 1070 at stock, and for stuff that you want to run at 144Hz, it does most of it consistently. I think in OW we turned down shadows, and even Fortnite runs at 120+ FPS I think. A 1080 at 1440p/144Hz should be a dream.


----------



## Scotty99

Heck, even a 1060 can pull off 144Hz/1440p in popular games; it's just the new titles that make me want to upgrade. I had a 1080 Ti for a while but couldn't resist selling it for 1100 bucks a few months ago; for most games I was so far above my 165Hz refresh rate that it felt like a huge waste of money.


----------



## stephenn82

Scotty99 said:


> Anyone in here own the evga hybrid 1080 ftw? Found one on craigslist for really cheap curious if they get as hot as some review sites say they do.


I do. It can do some really heavy lifting in benchmarks and it hits 47c. That's at 2126 core, 5481 memory, 1.081V and 120% power. I don't think that's very hot at all. What temps are you seeing? If you buy it (which you should!), let us know.

Any game I play for 2 hrs or so only peaks it at 43c.

I do have a custom fan curve that takes the fans from 23% at 26c and below up to 56% (max speed for the unit) at 38c.
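That curve is just a straight ramp between the two endpoints, clamped outside them. With the numbers above (26C/23% to 38C/56%) it works out to roughly:

```python
def hybrid_fan_curve(temp_c, t_low=26.0, pwm_low=23.0, t_high=38.0, pwm_high=56.0):
    # Below the low point or above the high point the speed is clamped;
    # in between it's a straight linear ramp.
    if temp_c <= t_low:
        return pwm_low
    if temp_c >= t_high:
        return pwm_high
    frac = (temp_c - t_low) / (t_high - t_low)
    return pwm_low + frac * (pwm_high - pwm_low)

for t in (20, 26, 32, 38, 45):
    print(t, "C ->", hybrid_fan_curve(t), "% PWM")
```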


----------



## Scotty99

stephenn82 said:


> I do. It can do some really heavy lifting in benchmarks and it hits 47c. That's at 2126 core, 5481 memory, 1.081V and 120% power. I don't think that's very hot at all. What temps are you seeing? If you buy it (which you should!), let us know.
> 
> Any game I play for 2 hrs or so only peaks it at 43c.
> 
> I do have a custom fan curve that takes the fans from 23% at 26c and below up to 56% (max speed for the unit) at 38c.


Oh, that's nothing; Gamers Nexus was claiming mid-60s or something. I think I'll pick it up tomorrow if the guy still has it.


----------



## stephenn82

Scotty99 said:


> Oh thats nothing, gamers nexus was claiming like mid 60's or something. I think ill pick it up tomorrow if the guy still has it.


Please post that link, I haven't seen it. I watch his stuff pretty frequently... but not back in 2016 when the card came out. Is it the one where he compares it to the MSI Seahawk, or whatever it's called? I never hit temps that high.


----------



## Scotty99

stephenn82 said:


> Please, post that link. I havent seen it. I watch his stuff pretty frequently...but not in like 2016 when the card came out. Is it the one when he compares to MSI Seahawk or whatever its called? I never hit temps that high.


Yup that video:





TBH, I don't know how hot it got in their tests; these goofballs list it as a delta over ambient and then never gave the ambient temp lol.


----------



## stephenn82

Scotty99 said:


> stephenn82 said:
> 
> 
> 
> Please, post that link. I havent seen it. I watch his stuff pretty frequently...but not in like 2016 when the card came out. Is it the one when he compares to MSI Seahawk or whatever its called? I never hit temps that high.
> 
> 
> 
> Yup that video:
> 
> 
> 
> 
> 
> TBH i dont know how hot it got in their tests these goofballs list it in delta over ambient then never gave the ambient temp lol.

So it depends on your ambient. If it's 18c, it will be +29c at full load = a core temp of 47c. If your ambient is 21c, it will be 50c, and so on.
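In code form, for anyone skimming: converting a reviewer's delta-over-ambient number back to an absolute core temp is just addition; the 29C delta below is the figure quoted above for this card:

```python
def core_temp_c(ambient_c, delta_over_ambient_c=29):
    # Reviewers publish temperature rise over ambient so results from
    # different rooms are comparable; add your own ambient back to compare.
    return ambient_c + delta_over_ambient_c

for ambient in (18, 21, 24):
    print(f"{ambient}C room -> {core_temp_c(ambient)}C core at full load")
```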

The board is stellar. It's built to handle 400W on the core even though the reference spec is 180W. It has better power delivery, doubled at each VRM. It's technically the same cooler, but it does hit 100MHz higher in overclocked form.

EVGA also has a lifetime warranty... even for second-hand purchases.

Can't beat that with a stick. 

You know what to do 😎


----------



## Scotty99

Ya, it's definitely the best GTX 1080. My rule of thumb, though, is I usually don't upgrade unless a card is 2x as fast, and a 1080 isn't quite 2x a 1060 lol. Wish we had more info on the 2070; it would make this decision easier.


----------



## Imprezzion

Well, I just got an Inno3D iChill Black, which has an Asetek AIO on it from the factory, so it's basically the same principle as an EVGA Hybrid.

It also has quite a beefy cooler on the VRM and VRAM, but it's not a custom board. It's in fact so reference that the BIOS is branded Nvidia.

The card at 1.075V, 114% power and 2154 core with 5800 memory runs at about 50-51c, but that is with push-pull 2000rpm Arctic PWM fans.

It amazes me how high this thing will clock: 2063 boost out of the box, and it does 2154 easily, but now I'm hard power-limited. It will actually run as high as 2200+, but the clocks go all over the place in stress tests/benches because of the reference 114% power limit.

Is there a BIOS out there for reference cards that gives 1.09V and a higher power limit without degrading performance like the Gigabyte XOC and MSI Gaming BIOSes do?

Also, I do run +100mV (not that it does much to the voltage), but I notice that when the card downclocks in menus under low load, it drops to too low a voltage: it wants to run 1740-1940 core at 0.800-0.888V. That is far from stable and results in artifacts in game menus, or even DirectX crashes.

Any way around this? I know I can lock it to max frequency in the MSI AB voltage curve by pressing the "L" key, but that removes idle downclocking as well.


----------



## dayen666

I need to buy a new card. Can someone help me choose a 1080: MSI, FTW or Strix?


----------



## white owl

The one with the highest out of the box speed.


----------



## Rheinfels

Hello,


I've read about 200 pages, but it's too much to read everything. I bought a used Asus 1080 Strix non-OC which could do 1950MHz on the core out of the box. After my AiO CPU cooler failed, I decided to switch to water cooling for the CPU and GPU. As soon as the GPU was under water, I played first with the OC BIOS versions and finally with the Strix T4 BIOS. With this BIOS I'm able to do 2200MHz at 1.44V or 2215MHz at 1.5V using the MSI Afterburner curve. Attached are two screenshots showing the Afterburner settings and HWiNFO, and a Time Spy benchmark run (results vary slightly, of course).

I would be interested in the performance at 1.2V, as that's the max in the Afterburner tool. Temperature doesn't seem to be an issue: the fans are currently running at 50% (Pure Wings 140) or 7V (Nanoxia Deep Silence 120, 1300rpm version), so there's headroom to shed more heat (radiator size 1600). But I'm not sure about 1.15V, and I'm worried that 1.2V would be too much for the long term.

Is anybody running their 1080 at 1.2V for a longer time already? Which frequency could be reached?


----------



## Echoa

Looking to get a 1080 soon; anyone have any idea where I could find a Gelid VRM kit? I've been searching for days.


----------



## xciter327

Imprezzion said:


> Well I just got me a Inno3d iChill Black which has a Asetek AIO on it from factory so it is basically the same principle as a EVGA Hybrid.
> 
> Also has quite a beefy cooler on VRM and VRAM but it's not a build board. It's in fact so reference the BIOS is branded nvidia.
> 
> That card at 1.075v 114% power and 2154 core with 5800 memory runs at about 50-51c but that is with push-pull 2000rpm arctic pwm fans.
> 
> It amazes me how high this thing will clock. 2063 boost out of the box and does 2154 easily but now I'm power limited hard. It will actually run as high as 2200+ but clocks go all over the place in stress tests / benches because of the reference 114% power limit..
> 
> Is there a BIOS out there for reference cards to get 1.09v and a higher power limit without degrading performance like the Gigabyte XOC and MSI Gaming BIOS do?
> 
> Also, I do run +100mv, not that it does much to the voltage, but I notice when the card downclocks in menus with low load it drops to too low a voltage. It wants to run at around 1740-1940 core on 0.800-0.888v. This is far from stable and results in artefacts in game menus, or even DirectX crashes.
> 
> Any way around this? I know I can lock it to max freq with MSI AB voltage table by pressing the "L" key but that removes idle downclocks as well.


I too am interested in any information about this. Would a cross-flash with a bios from another vendor solve this?


----------



## AlbertoM

Hey guys!

After reading people reporting better results with the Palit BIOS than the Strix T4 on the 1080 Ti owner's forum, I decided to give the newest Palit GTX 1080 GameRock Premium BIOS a try on my 1080 FE, and WOW!

This BIOS totally blows anything out of the water. Higher scores on 3DMark on every resolution, higher FPS on games.

It's more stable because of the 276W power limit (the Strix BIOS could spike to anything and crash), same output connections as the FE, just perfect.

https://www.techpowerup.com/vgabios/186373/palit-gtx1080-8192-160921


----------



## oile

AlbertoM said:


> Hey guys!
> 
> After reading people reporting better results with Palit BIOS than the Strix t4 on 1080 TI owner's forum, I decided to give a try of a Palit GTX 1080 GameRock Premium newest bios on my 1080 FE, and WOW!
> 
> This BIOS totally blows anything out of the water. Higher scores on 3DMark on every resolution, higher FPS on games.
> 
> Its more stable because of a 276W power limit, (the strix bios could spike to anything and crash), same output connections as FE, just perfect.
> 
> https://www.techpowerup.com/vgabios/186373/palit-gtx1080-8192-160921


Where have you read about it? Any data?


----------



## xciter327

I tried the Palit BIOS, but that still resulted in hitting the power/voltage limits. I gave the XOC Asus T4 BIOS a try and I could push to 2115MHz / 5300MHz. The GPU is under water though.

P.S. - The artifacts that I used to get before are gone too.


----------



## AlbertoM

oile said:


> Where have you read about it? Any data?


https://www.overclock.net/forum/69-nvidia/1624521-official-nvidia-gtx-1080-ti-owner-s-club-1724.html

There is one downside to this BIOS that I forgot to mention: the fan speed maxes out at 2500RPM, but since mine is a Hybrid now, not a problem.

With that Palit BIOS I can do 2126 MHz at 1.075v, completely game stable. Yes, it hits the power limit very rarely, but it is much faster on average than the Strix T4 BIOS, at least on my FE.

With the Strix BIOS my card pulled about 240W on average playing Doom at 1440p, spiking to a max of 280W. So the Palit BIOS, with its 276W power limit, fits perfectly.

My 3DMark scores with Strix Bios:

Firestrike ULTRA 5 772 points

Firestrike Xtreme 10 598 points

Firestrike 19 388


Palit BIOS:

Firestrike ULTRA 6 002 points

Firestrike Xtreme 11 001 points

Firestrike 21 155


----------



## Imprezzion

AlbertoM said:


> oile said:
> 
> 
> 
> Where have you read about it? Any data?
> 
> 
> 
> https://www.overclock.net/forum/69-nvidia/1624521-official-nvidia-gtx-1080-ti-owner-s-club-1724.html
> 
> There is a downside for this BIOS that I forgot to mention, is fan speed max at 2500RPM, but since mine is hybrid now, not a problem.
> 
> With that Palit BIOS I can do 2126 MHz at 1.075v complete game stable. Yes it hits power limit very rarely, but it is very faster on average than Strix t4 bios, at least on my FE.
> 
> With strix BIOS my card pulled on average playing Doom at 1440p about 240W, spiking to max 280W. So Palit BIOS is totally on par with its 276W power limit.
> 
> My 3DMark scores with Strix Bios:
> 
> Firestrike ULTRA 5 772 points
> 
> Firestrike Xtreme 10 598 points
> 
> Firestrike 19 388
> 
> 
> Palit BIOS:
> 
> Firestrike ULTRA 6 002 points
> 
> Firestrike Xtreme 11 001 points
> 
> Firestrike 21 155
Click to expand...

I'll give it a shot on my Inno3D iChill Black hybrid. That is a reference card and BIOS, and it just barely hits the power limit at 2126/5800 1.062v. Stable as a rock tho on the stock BIOS.


----------



## AlbertoM

Imprezzion said:


> I'll give it a shot on my Inno3D iChill Black hybrid. That is a reference card and BIOS and it hits powerlimit just barely on 2126/5800 1.062v. Stable as a rock tho on stock BIOS.


Please report the results


----------



## Imprezzion

AlbertoM said:


> Imprezzion said:
> 
> 
> 
> I'll give it a shot on my Inno3D iChill Black hybrid. That is a reference card and BIOS and it hits powerlimit just barely on 2126/5800 1.062v. Stable as a rock tho on stock BIOS.
> 
> 
> 
> Please report the results /forum/images/smilies/wink.gif
Click to expand...

Can't run it. My card has Micron and the BIOS does not recognize the VRAM. Says Unknown in GPU-Z and RAM is locked to 405Mhz. Also PCI-E locks out on x16 1.1. 

I'll need to find another BIOS for testing lol..


----------



## AlbertoM

Imprezzion said:


> Can't run it. My card has Micron and the BIOS does not recognize the VRAM. Says Unknown in GPU-Z and RAM is locked to 405Mhz. Also PCI-E locks out on x16 1.1.
> 
> I'll need to find another BIOS for testing lol..


That's weird... If your card is reference like an FE it should have worked.

Did you flash the right BIOS? The one in the link below?

https://www.techpowerup.com/vgabios/186373/palit-gtx1080-8192-160921


----------



## 6u4rdi4n

So I just acquired a used Inno3D GTX 1080 HerculeZ Twin x2, and it's a bit of a "strange" one. Looks like there's two versions of this card! (rev.1 and rev.2). The original one (rev.1) seems to use the same PCB as the Founders Edition, but the rev.2 is slightly modified. It has 7 caps at the back end instead of 4, which have been shifted a bit around, and it has an additional 6 pin power connector. However, it seems like it still has the same power limit! (At least according to GPU-Z's new power consumption thingy). Anyone have any experience with this card?


----------



## AlbertoM

6u4rdi4n said:


> So I just acquired a used Inno3D GTX 1080 HerculeZ Twin x2, and it's a bit of a "strange" one. Looks like there's two versions of this card! (rev.1 and rev.2). The original one (rev.1) seems to use the same PCB as the Founders Edition, but the rev.2 is slightly modified. It has 7 caps at the back end instead of 4, which have been shifted a bit around, and it has an additional 6 pin power connector. However, it seems like it still has the same power limit! (At least according to GPU-Z's new power consumption thingy). Anyone have any experience with this card?


Yeah, I checked, and the BIOSes available from Inno3D for your card are limited to 217W. That totally sucks if your card has the additional 6-pin connector. I pull 276W with the Palit BIOS through the FE's single 8-pin (plus the PCIe slot), totally fine.

You could try the Palit BIOS or the Strix T4 BIOS and see what's best for you. I'm sure your card could do much better with one of these BIOSes and proper cooling.

I would try the Palit BIOS first if your card has stock cooling. The Strix T4 BIOS is power hungry and gives the best results on water/custom loops.

Anyway, my card is on water, and the Palit BIOS gives the best performance in every case against the Strix and FE BIOSes.
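
For reference, the nominal power budgets behind those numbers can be tallied in a couple of lines. These are the PCI-SIG spec figures, not measurements of any particular card:

```python
# Nominal PCI-SIG power budgets in watts. These are spec limits; real slots
# and connectors usually tolerate noticeably more, which is why a 276 W
# limit can still run fine on a Founders Edition with its single 8-pin.
BUDGET_W = {"pcie_slot": 75, "6pin": 75, "8pin": 150}

fe_spec = BUDGET_W["pcie_slot"] + BUDGET_W["8pin"]  # FE: slot + one 8-pin
print(fe_spec)        # 225
print(276 - fe_spec)  # the Palit limit overshoots spec by 51 W
```

So a 276W limit already exceeds the 225W the spec nominally allows an FE; the extra 6-pin on the rev.2 Inno3D would raise that budget to 300W, which is what makes its 217W BIOS limit so frustrating.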


----------



## 6u4rdi4n

AlbertoM said:


> Yeah I check and the BIOSs from Inno3D available for your card are limited at 217W. That totally sucks if your card has the additional 6-pin connector. I pull 276W with Palit BIOS with the 8-pin alone (plus PCIE bus) of the FE, totally fine.
> 
> You could try the Palit BIOS or the Strix t4 bios and see whats best for you. I'm sure your card could do much better with one of these BIOS and proper cooling.
> 
> I would try Palit BIOS first if your card has stock cooling. Strix t4 BIOS are power hungry and give best results on water/custom loops.
> 
> Anyway, my card is on water and Palit BIOS give best performance in every case against Strix and FE BIOS.


Yeah, seems like they improved the card and forgot about the BIOS. I've been browsing the web, and it looks like the PCB on the rev.2 is identical to the iChill x3 model. Maybe I'll give that one a try. Cooling wise, I have an EK-FC GeForce GTX FE block laying around, but due to the changed caps at the back end, it won't fit. I'm gonna try and drill some space for them, or I'll just cut off the back end of the plexi. I did manage to get it running at 2050 MHz+ with a quick and dirty overclock. I bet I could get it to run the same or higher by tweaking the V/F curve instead.


----------



## 6u4rdi4n

So, funny follow up on the Inno3D card. I sent them an email, simply stating that I had the rev 2 of the card and I asked why it had an extra 6 pin, but still the same power limit as the old 8 pin only card. This was the response:

"Dear Sir



Firstly, thank you for your kind support of our product,

The earlier version was a 8 pin but it has been discontinued.



Thank you

Best Regards

INNO3D Support"


I guess something must have gone wrong in translation or something?


----------



## 6u4rdi4n

I f***ed up....

I decided to install a waterblock I had laying around, but as I was prepping the card I managed to do some damage to one of the inductors. Don't ask me how... Would anyone be able to help me out with the specifications for it so I might get a new one and replace it?


----------



## Zfast4y0u

6u4rdi4n said:


> I f***ed up....
> 
> I decided to install a waterblock I had laying around, but as I was prepping the card I managed to do some damage to one of the inductors. Don't ask me how... Would anyone be able to help me out with the specifications for it so I might get a new one and replace it?


:S Seems like something heavy fell on it, or the whole card hit the floor.

You didn't write what brand your card is, hard to figure it out like this.


----------



## 6u4rdi4n

Not really sure how it happened. Maybe I managed to just give it a nudge with my screwdriver or something. 

I didn't think about stating which card it was since I pretty much have the last few posts in this thread now, lol. 

It's an Inno3D. Twin x2, HerculeZ Twin x2, x2. It seems to be named so many things. I do know it's rev2, as the rev1 of this card is 99-100% FE with the x2 cooler on it.


----------



## AlbertoM

You could ask in 1080 TI Owners Forum, there's a lot more activity there.

From the picture it looks to me like only the plastic cover was damaged... I don't know, I think it could still work if it hasn't come loose from the board. But of course you have to make sure before trying.

For your consolation and comfort, I fried my 6700k on a very disastrous delid attempt lol


----------



## 6u4rdi4n

The card appears to be working as intended, luckily. However, after sustaining the damage, it coil whines like a lil b. Probably because the "cover" is part of the inductor. It's not plastic, but ferrite or something... Whatever it is, it's magnetic


----------



## the1corrupted

I have a 1080 Mini from Gigabyte. AIDA64 is reporting clock speeds as high as 1833 MHz, and I have done NOTHING to this card. Is this a feature of GPU Boost 3.0? Gigabyte's own website doesn't even cite a speed this fast; they list boost at 1711 MHz.

Did I win the lottery?


----------



## 6u4rdi4n

GPU Boost 3.0 at work indeed. Not sure it counts as winning the lottery, but in my opinion anything above the advertised boost speed is a win. Then again, I've had 3 GTX 1080 cards all boosting 1900+ out of the box.


----------



## shredzy

My GTX 1080, which I got on release (May 2016), has been insane value for money... I haven't kept a card this long in a while. Just tossing up whether I want to get an RTX 2080; has anyone else here thought about the upgrade? I'm playing at 1440p 144Hz, mostly a ton of Black Ops 4 Blackout, and my average FPS is around 120-150 with everything set to low/very low... would love to have 144+ frames constantly!


----------



## the1corrupted

shredzy said:


> My GTX1080 which I got on release (May 2016) has been insane value for money...haven't kept a card for this long in awhile. Just tossing up if I want to get a RTX2080, has any else here thought about the upgrade? I'm playing on 1440p 144hz, playing a ton of black ops 4 blackout and average FPS for me is around 120-150~ with everything on set on low/very low...would love to have 144+ frames constantly!


A 2080 will have similar performance to a 1080 Ti. So like the card you have +20% to +30%.

I'm a conspiracy theorist in that I think nVidia pushed the 20-series launch to sell the crap out of their 10-series that was taking up shelf space.


----------



## 6u4rdi4n

the1corrupted said:


> A 2080 will have similar performance to a 1080 Ti. So like the card you have +20% to +30%.
> 
> I'm a conspiracy theorist in that I think nVidia pushed the 20-series launch to sell the crap out of their 10-series that was taking up shelf space.


I agree with that. With the higher prices, the new cards should also have brought increased performance across the line in traditional tasks, like rasterisation. The new tech is exciting, but if we look at the past, we would probably get at least one, if not two, new generations of cards before it becomes the norm.


----------



## xer0h0ur

LMAO. No. There is no conspiracy here. Nvidia's stock was due to take a hit due to declining profits which shouldn't come as any surprise at all...given that cryptocurrency mining took a nose dive and it had already been way too long since Pascal had launched. Anyone in the market for a high end card had been waiting for the next gen. Then the majority of those people pooped themselves when they saw the price point. Which of course also helped them move their Pascal stock. 

I particularly took great satisfaction in laughing my derriere off @ the mental midgets that called for people to "vote with your wallet" by not buying Turing...when those people were still just buying a 1080 Ti instead. LOL. I don't know what level of ******ation one must achieve to think you're punishing a corporation in not buying their shiny new toy by going out and buying their shiny older toy.


----------



## 6u4rdi4n

Never found out the exact specs for the broken inductor on my Inno3D GTX 1080 Twin X2, but I believe I found one that's as close as I can get: https://www.digikey.com/product-detail/en/bourns-inc/SRN8040-1R0Y/SRN8040-1R0YCT-ND/2756172 Will probably arrive on Tuesday or Wednesday along with new tips for my soldering iron. Guess we'll find out if it works soon enough, lol.


----------



## CrimsonKing87

Hi everyone, I recently upgraded from my old MSI GTX 970 OC to an MSI GTX 1080 Sea Hawk EK; the performance leap over my heavily OCed 970 is huge and I couldn't be happier.

The only question I have is about OC using Afterburner:

Core: I edited the curve and I can get it to 2100 MHz stable under load.
Memory: I got the slider to +350 MHz.
Power target: +121%
Voltage: +100.

Temp: max 40°C under load after some hours of gaming/benching.

If I try to push it +1 on the core it crashes in Firestrike (I'm still on Win7), so that seems to be the limit... the problem is GPU-Z shows 80% max TDP, voltage max 1.063, perfcap reason none, so there seems to be headroom for improvement since I never once hit the power or voltage limit. Am I wrong?

Thanks


----------



## AlbertoM

CrimsonKing87 said:


> Hi everyone, i recently upgraded from my old MSI GTX 970 OC to a MSI GTX 1080 Sea hawk EK, the performance leap over my heavily OCed 970 is huge and i couldn't be happier.
> 
> The only question i have is about OC using After Burner:
> 
> Core: i edited the curve and i can get it to 2100 mhz stable under lod.
> Memory: i got the slider to +350 mhz.
> Power target: + 121%
> Voltage: +100.
> 
> Temp: max 40° under load after some hours of gaming/benching.
> 
> If i try to push it +1 on the core it crashes in Firestrike (i'm still on Win7), so it seems to be the limit... the problem is GPU-Z shows a 80% max TDP, voltage max 1.063, perfcap reason none so there seems to be headroom for improvement as i never once hit power limit or voltage limit, am I wrong?
> 
> Thanks


Yah, you can get to 1.093v on the original BIOS. For that you must go into settings in MSI Afterburner and unlock Voltage Control/Monitoring, third-party mode.

Then you must set a custom curve to get voltage greater than 1.063v; it won't go there on its own.

My 1080 FE, which is now a Hybrid, doesn't much like that voltage though. I do better at 1.075v, sometimes 1.081v max.

The problem is the power limit with the original BIOS. With the Strix T4 you have unlimited power draw, but I found better performance with the Palit GameRock Premium 276W max TDP BIOS.

I'm happily stable at 2164 MHz 1.075v, game stable, temps maxing at 60C. I recommend doing the backplate thermal pad mod, search for it, it helped me get higher clocks.

Note: only with the Strix T4 can you get to 1.2v if you like. All other BIOSes are limited to 1.093v.

Note 2: I see your BIOS is limited to 291W, so you already have good power headroom over the original 220W max of the Founders Edition. You might try the Strix T4 since you're on a full block, or leave it alone and tune the best voltage for your BIOS, which is already good. With your temps you should be able to hit 2151, 2164, 2176, 2189 or even 2.2GHz; at what voltage, you'll have to find out.
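
Those specific clocks aren't arbitrary: GPU Boost 3.0 on Pascal moves the core in roughly 12-13 MHz bins, so stable overclocks always land on one of those steps. A rough sketch (the step size here is an approximation, not an NVIDIA-documented constant):

```python
# Pascal boost clocks move in ~12-13 MHz bins; modelling them as 12.5 MHz
# steps rounded to whole MHz reproduces the clocks listed above.
def boost_bins(base_mhz=2151, count=5, step=12.5):
    return [int(base_mhz + i * step + 0.5) for i in range(count)]

print(boost_bins())  # [2151, 2164, 2176, 2189, 2201]
```

This is why nudging the curve "+1" either does nothing or jumps a whole bin: the card can only run on these discrete steps.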


----------



## CrimsonKing87

AlbertoM said:


> Yah you can get to 1.093v on original bios. To have that you must go into settings in MSI Afterburner and unlock Voltage Control/Monitoring, third party mode.
> 
> Then you must set a custom curve to have voltage greater than 1.063v, it does not go alone by itself.
> 
> My 1080 FE that is now a Hybrid don't like much that voltage though. I'm better at 1.075v, sometimes 1.081v max.
> 
> The problem is power limit with original bios. With strix t4 you have unlimited power draw, but i found better perfomance with palit Gamerock Premium 276W max TDP bios.
> 
> I'm happily stable at 2164 MHz 1.075v game stable, temps max at 60C. I recommend doing the backplate thermal pad mod, search about that, it helped me getting higher clocks.
> 
> Obs: only with strix t4 you can get to 1.2v if you like. All other bios are limited to 1.093v.
> 
> Obs2: I see your BIOS is limited at 291W, so you already have a good power headroom over original 220W max of founders edition, you might try the strix t4 since your on a full block, or leave it alone and tune the best voltage for your BIOS that is already good. With your temps you must be able to hit 2151, 2164, 2176, 2189 or even 2.2GHz, at what voltage you have to find out.


Thanks! This morning I did a quick test.
This was the configuration I used to get 2100 at 1.063:
https://www.overclock.net/forum/attachment.php?attachmentid=240170&thumb=1

So today I tried this configuration:
https://www.overclock.net/forum/attachment.php?attachmentid=240172&thumb=1

I got it to 1.075 at 2138 MHz, but Firestrike crashed on the first graphics test, so I have to find a different configuration.


----------



## AlbertoM

CrimsonKing87 said:


> Thanks! This morning i did a quick test:
> This was the configuration i used to get 2100 at 1.063:
> https://www.overclock.net/forum/attachment.php?attachmentid=240170&thumb=1
> 
> So today i tried this configuration:
> https://www.overclock.net/forum/attachment.php?attachmentid=240172&thumb=1
> 
> i got it to 1.075 at 2138 mhz, but firestrike crashed on first graphic test so i have to find a different configuration.


You don't have to pick all the points on the curve... Just reset it, and if you want to use 1.075v for example, pick the point on that vertical line, drag it up to whatever +MHz you want, then apply. The whole curve will follow, and it will be a flat line from there, meaning it will stay at that voltage.

Try 1.081v and 1.093v too; your card can probably handle that since it has two power connectors, more power phases and a high TDP.

This is mine, 2164 MHz at 1.075v:
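
What Afterburner does when you raise one point and hit apply can be sketched like this. The curve values below are hypothetical, purely to illustrate the flattening behaviour:

```python
# Hypothetical V/F curve points, just to illustrate: when you raise one
# point and apply, Afterburner flattens everything at and above that
# voltage to the same clock, so the card never requests more voltage.
def flatten_curve(points, v_target, offset_mhz):
    # points: list of (voltage, mhz) pairs, ascending by voltage
    cap = next(mhz for v, mhz in points if v >= v_target) + offset_mhz
    return [(v, mhz if v < v_target else cap) for v, mhz in points]

stock = [(1.000, 1911), (1.050, 2025), (1.075, 2050), (1.093, 2088)]
print(flatten_curve(stock, 1.075, 114))
# -> [(1.0, 1911), (1.05, 2025), (1.075, 2164), (1.093, 2164)]
```

Everything below the chosen voltage keeps its stock clock, which is why idle downclocking still works with this method.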


----------



## Cyclops

I've been out of the loop for a while. Is there a BIOS editing tool for Pascal or a modified BIOS for the reference 1080 yet?


----------



## CrimsonKing87

AlbertoM said:


> You don't have to pick all the points on the curve... Just reset it, and if you like to use 1.075v for exemple, pick the point on that vertical line and just up to what +MHz you want, than apply it. All the curve will follow, and it will be a flat line from there meaning it will stay at that voltage.
> 
> Try 1.081v and 1.093v too, your card probably can handle that since has two power connectors, more power phases and high TDP.
> 
> This is mine 2164MHz at 1.075v:


No luck with the stock BIOS so I flashed the T4 XOC.
I managed to run Firestrike four times in a row at 2190 MHz on 1.2v; the fifth time it crashed. I tried lowering the core but I can't get it stable, it crashes randomly.
The really strange thing is that my Firestrike result is 1000 points lower even though I'm 90 MHz over stock... I guess the XOC BIOS is not really suited to my MSI, I'll probably flash stock back.


----------



## CrimsonKing87

Cyclops said:


> I've been out of the loop for a while. Is there a BIOS editing tool for Pascal or a modified BIOS for the reference 1080 yet?


You can try the XOC T4 BIOS, but it seems like you get higher clocks yet lower scores in benchmarks


----------



## AlbertoM

CrimsonKing87 said:


> No luck with stock bios so I flashed the t4 xoc.
> I managed to run 4 times in a row firestrike at 2190 mhz on 1.2 v, the fifth time crashed. I tried lowering core but I can't get it stable, it crashes randomly.
> The really strange thing is that I get 1000 points lower on firestrike result even if I'm 90mhz over stock... I guess xoc bios is not really suited for my MSI, I'll probably flash stock back.


Yeah, I found that lower score only on regular 1080p Firestrike... When I ran Firestrike Extreme or Ultra I was getting better scores with that BIOS, probably because my stock BIOS is very TDP-limited.

I don't remember where I read it, but they said the XOC T4 BIOS has looser memory timings etc. for better overclocking on LN2, which would explain the lower scores.

Anyway, I tried the Palit GameRock Premium (got the idea on the 1080 Ti forum) and it was a marriage made in heaven for my FE.

It gives better scores than XOC and stock in every benchmark, gaming is fantastic too, and I don't have to worry about melting my single PCIe 8-pin thanks to the sensible 276W max TDP.


----------



## white owl

AlbertoM said:


> Yeah i found that lower score only on Firestrike regular 1080p... When I run Firestrike Extreme or Ultra i was having better scores with that bios probably because my stock bios is very limited at TDP.
> 
> Don't remember where I read but they said that xoc t4 bios has more flexible timings on memory etc for better overclocking on LN2, explaining the lower scores.
> 
> Anyway, I tried the Palit Gamerock Premium (got the ideia on 1080ti forum) and it was a marriage made in heaven for my FE.
> 
> It gives better scores than xoc and stock in every benchmark, also gaming is fantastic, and I don't have to worry of melting my only one PCIE 8 pin because of perfect max TDP of 276w.


 If it works on an FE it should work on an EVGA SC too right? It has reference PCB with a single 8 pin as well.
Does the BIOS allow higher clocks for you at all, or is it just TDP? What was your card limited to with the FE BIOS?
Would you mind zipping the BIOS and posting it here so I get the correct one?
I miss the good ole days where we could mod the BIOS to ignore the boost table and just hold the clocks you set until it throttled. Oh and baking in a custom fan curve! RIP Maxwell.


I can't get more than around 1990Mhz with my card as is, anything would be better.


----------



## AlbertoM

white owl said:


> If it works on an FE it should work on an EVGA SC too right? It has reference PCB with a single 8 pin as well.
> Does the BIOS allow higher clocks for you at all, or is it just TDP? What was your card limited to with the FE BIOS?
> Would you mind zipping the BIOS and posting it here so I get the correct one?
> I miss the good ole days where we could mod the BIOS to ignore the boost table and just hold the clocks you set until it throttled. Oh and baking in a custom fan curve! RIP Maxwell.
> 
> 
> I can't get more than around 1990Mhz with my card as is, anything would be better.


Flash this one man:

https://www.techpowerup.com/vgabios/186373/palit-gtx1080-8192-160921

Yes, it allows higher clocks, the TDP is perfect if you play at 1440p, and you get better scores, FPS, overclocking... you'll see.

If yours is a reference PCB I don't see a reason not to try.

Of course mine is on the EVGA Hybrid cooler, so an AIO like mine or a custom water loop will love this BIOS. I don't know what it could do on air, because the max fan speed is 2500 RPM.


----------



## white owl

AlbertoM said:


> Flash this one man:
> 
> https://www.techpowerup.com/vgabios/186373/palit-gtx1080-8192-160921
> 
> Yes it allows higher clocks, TDP is perfect if you play at 1440p, and better scores, FPS, overclocking... you will see.
> 
> If yours is reference PCB i don't see a reason for not trying.
> 
> Of course mine is on the evga Hybrid cooler, so with a AIO like mine or custom water loop will love this Bios. Don't know what it could do on Air, because the max fan rotation is 2500rpm.


Thank you 
Is it still done like this:
https://www.overclock.net/forum/69-nvidia/1523391-easy-nvflash-guide-pictures-gtx-970-980-a.html
Or is there a new version or something I should know about?


Yeah I play 1440p/144hz. My ancient CPU is holding me back more than anything but that's hardly the point 
Also when you cross flash like this, you get that GPU's fan profile right? I think the EVGA profile is a little conservative but I really don't get great results with AB or PX, I use nvinspector.
Thanks again, I asked about this a while back when I first got the card but no one seemed to know...or care haha


----------



## AlbertoM

white owl said:


> Thank you
> Is it still done like this:
> https://www.overclock.net/forum/69-nvidia/1523391-easy-nvflash-guide-pictures-gtx-970-980-a.html
> Or is there a new version or something I should know about?
> 
> 
> Yeah I play 1440p/144hz. My ancient CPU is holding me back more than anything but that's hardly the point
> Also when you cross flash like this, you get that GPU's fan profile right? I think the EVGA profile is a little conservative but I really don't get great results with AB or PX, I use nvinspector.
> Thanks again, I asked about this a while back when I first got the card but no one seemed to know...or care haha


Man, I did it this way:

https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html

Also use the exact nvflash64 version they provide in the first post; I read that some newer versions don't work.

No worries... Just back up yours first and try that BIOS; if you don't like it, just flash back.

As for the GPU fan profile, it's entirely down to the BIOS you're flashing and the fans that were used on that particular card. I assume you'll be overclocking, so just make sure to set a steady fan RPM or curve for your needs in your profile.

Hope you like it, and Merry Christmas!
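
For anyone skimming, the guide above boils down to a backup-then-flash sequence along these lines. This is only a sketch: nvflash64 switch names vary between builds, and `palit_gamerock.rom` is a placeholder filename, so check the linked guide before flashing anything:

```python
# Sketch of the crossflash sequence from the guide above, driven from
# Python for illustration. Switch names vary between nvflash64 builds,
# and "palit_gamerock.rom" is a placeholder filename.
import shutil
import subprocess

def crossflash(rom="palit_gamerock.rom"):
    if shutil.which("nvflash64") is None:
        return "nvflash64 not found"
    subprocess.run(["nvflash64", "--save", "backup.rom"], check=True)  # back up the running BIOS first
    subprocess.run(["nvflash64", "--protectoff"], check=True)          # lift the EEPROM write protection
    subprocess.run(["nvflash64", "-6", rom], check=True)               # -6 overrides the subsystem ID mismatch check
    return "flashed"

print(crossflash())
```

The backup step is the important one: as long as `backup.rom` exists, a bad flash can be reverted the same way.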


----------



## stephenn82

AlbertoM said:


> Man i did this way:
> 
> https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html
> 
> Also use the exact nvflash64 version that they provide in the first post, i read about some newer not working.
> 
> There is no worries... Just backup your first and try that bios, if you don't like just flash it back.
> 
> About the gpu fan profile, is totally up to the Bios you are flashing and the fans that were used for that particular card. I assume you will be overclocking, just make sure to set a steady fan rpm or curve for your needs on your profile.
> 
> Hope you like and Marry Christmas!


You got anything for an EVGA 1080 FTW Hybrid?


----------



## AlbertoM

stephenn82 said:


> You got anything for an EVGA 1080 FTW Hybrid?


Don't know, man... You could try that 291W TDP MSI Sea Hawk BIOS from MSI... Or the Waterforce from Gigabyte, which is 375W TDP...

I tried that last one on mine and it didn't work well because mine is a reference PCB, so maybe that BIOS could be similar to yours.

BTW I think I'll try the MSI one now, lol

Just keep an eye on the number and type of output connectors; they could be different from your BIOS.


----------



## stephenn82

AlbertoM said:


> Don't know man... You could try that 291W TDP MSI seahawk bios from MSI...
> Or the Waterforce from Gigabyte, is 375W TDP...
> 
> I tried that last one on mine and didn't work good because mine is reference PCB, so maybe that bios could be similar to yours.
> 
> BTW i think I will try now the MSI one, lol
> 
> Just keep an eye for the number and type of output connectors, they could be different for your Bios.


Roger that.
I need to learn how to mod the Nvidia bios. I saved a copy off of my card. Over in the AMD card forum, one of the guys was teaching me how to mod it, and I had my r9 390 churning out past 390x territory. Teach a man to fish is my motto.


----------



## AlbertoM

stephenn82 said:


> Roger that.
> I need to learn how to mod the Nvidia bios. I saved a copy off of my card. Over in the AMD card forum, one of the guys was teaching me how to mod it, and I had my r9 390 churning out past 390x territory. Teach a man to fish is my motto.


Wow, you'd be the first one to mod the BIOS... Since 2016 no one has done it.

A guy here claimed to in the past but never shared the BIOS, so IMHO it's impossible.

And I'm not talking about the 1080 only: 1080 Ti and all newer models too. I've read all the forums; no one has done it.


----------



## stephenn82

***. really? That sucks. It may not be worth it then.
https://www.overclock.net/forum/71-...e/1621789-pascal-bios-editor-any-news-28.html


----------



## AlbertoM

stephenn82 said:


> ***. really? That sucks. It may not be worth it then.
> https://www.overclock.net/forum/71-...e/1621789-pascal-bios-editor-any-news-28.html



Wow, I updated my drivers to the latest 4xx (I was on the last 399 since a lot of people were complaining about the newer ones after the 2080s were released).

AND my overclock on the Palit BIOS is now stable at 2189 MHz at 1.081v, game stable! I just can't believe what this card is doing with just the Hybrid kit. This FE is incredible if tuned and cooled properly. And I'm running the memory at 1410.8 MHz, that's an effective 11.3 Gbps per pin.

I'm playing Doom Vulkan 1440p at 200fps, that's overclocked 1080 Ti and RTX 2080 territory.
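
For anyone checking the math: GPU-Z reports the GDDR5X command clock, and the chips transfer 8 bits per pin per reported clock, so the effective data rate and total bandwidth fall out directly:

```python
# GDDR5X moves 8 bits per pin per GPU-Z-reported clock, across the
# GTX 1080's 256-bit memory bus.
mem_clock_mhz = 1410.8
rate_mbps_per_pin = mem_clock_mhz * 8                # 11286.4 Mbps (~11.3 Gbps/pin)
bandwidth_gb_s = rate_mbps_per_pin * 256 / 8 / 1000  # ~361.2 GB/s total
print(rate_mbps_per_pin, round(bandwidth_gb_s, 1))
```

Stock 10 Gbps GDDR5X on the 1080 works out to 320 GB/s by the same formula, so this overclock is roughly a 13% bandwidth gain.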


----------



## stephenn82

That is legit! I have my 1080 FTW at 2126 game stable, and it seems to be running at lower voltages on this latest driver. It would always run at 45C and 1.093 volts no matter what; now it runs like a top at 1.063v at the same clock. Maybe I can coax some more MHz out of her.


----------



## AlbertoM

Yeah, you definitely should!

What I also realized is that if the card is running hotter, around 60C for me, it prefers a little more voltage, 1.081v in my case.

If it's colder, in the 55C region, I can drop to 1.075v at the same clocks just fine.


----------



## white owl

So the Palit BIOS worked for the most part. It's boosting higher, but I guess Palit used a lower-speed fan, so my fan is only running at around 70% of what it can and it's letting the card reach 85C; previously it was rare to even see 80C. Obviously the extra TDP isn't helping.
Is there another reference BIOS anyone could suggest?
I did gain some speed and over 200 points in Superposition, but I really don't need all the extra TDP this BIOS enables; without being limited the card is using around 110W under load. Never getting power limited, so that's good.


----------



## AlbertoM

white owl said:


> So the BIOS (Palit) worked for the most part. It's boosting higher, but I guess Palit used a lower-speed fan, so my fan is only running at around 70% of what it can, and it's letting me reach 85°C; previously it was rare to even see 80°C. Obviously the TDP isn't helping.
> Is there another reference BIOS anyone could suggest?
> I did gain some speed and over 200 points in Superposition, but I really don't need all the extra TDP that this BIOS enables; without being limited the card is using around 110 W under load. It's never getting power limited, so that's good.


Man, this BIOS's TDP readings in HWiNFO or GPU-Z are dummy values, like when you shunt mod. The card shows a max of 165 W even in Furmark. It's obviously pulling those 276 W to be doing that at those FPS.

Playing Doom at 1440p my card shows a max of 150 W, and it's rendering faster than the XOC BIOS, which showed 250 W (335 W in Furmark).
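To put rough numbers on that: if the sensing path under-reads by a fixed factor (which is exactly what a shunt mod does, and apparently what this BIOS's dummy tables do), you can back out the real draw by scaling the reported value. A tiny sketch; the 2.0 default is the classic equal-resistor shunt-mod factor and is an assumption, not a measurement:

```python
def true_power_estimate(reported_w, scale=2.0):
    """Estimate real board power when the sensing path under-reads.
    A classic shunt mod (equal-value resistor in parallel with each
    shunt) halves the sensed current, so real draw is roughly
    reported * 2. For a BIOS with dummy power tables the factor is
    whatever that BIOS mis-reports by, so treat 2.0 as a guess."""
    return reported_w * scale

# e.g. a card showing 165 W in Furmark with halved sensing
print(true_power_estimate(165))  # -> 330.0
```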


----------



## stephenn82

AlbertoM said:


> Yeah you definitely should!
> 
> What I realized too is that if you are running hotter, like for me on 60C the card prefers a little more voltage, for me 1.081v.
> 
> If its colder 55C region i can drop to 1.075v at the same clocks just fine.


Even with the most strenuous of games/benches, I hit a max of 44°C.


----------



## derx

I got my Gigabyte 1080 G1 Gaming flashed with the Asus T4 BIOS, up to about 2230 stable at 1.1 V in 3DMark; in gaming it crashes when going above 2199.
I just don't like the T4 BIOS. It forces me to use the Asus tool to get it to go above 19xx MHz, and I really don't like that. I'm also stuck at 1.1 V, but all the reports I see say that it should be able to do 1.2 V; I just haven't found out how yet. I have an EK full-cover block on the card, and haven't gotten it above 45°C during Furmark at [email protected]
Maybe I should try the Palit BIOS. If I can get to 1.1 V (or beyond) on that while using Afterburner, I'd be a happy camper.


----------



## Scotty99

Would anyone with a 1080 mind posting a firestrike score to compare to my 2060?

Just curious how they actually compare:
https://www.3dmark.com/3dm/32546443?

19,120 gpu score stock clocks.


----------



## derx

Scotty99 said:


> Would anyone with a 1080 mind posting a firestrike score to compare to my 2060?
> 
> Just curious how they actually compare:
> https://www.3dmark.com/3dm/32546443?
> 
> 19,120 gpu score stock clocks.


25,836 for my 1080: http://www.3dmark.com/fs/17864572


----------



## Scotty99

derx said:


> 25,836 for my 1080: http://www.3dmark.com/fs/17864572


Oh, that's overclocked; I was hoping someone had a reference 1080 to compare, lol.

Nice score tho.

Also, does anyone know how to update the rig info? I changed mine to say 2060 but it isn't updated; if you click my PC it is there tho.....weird.


----------



## Bluebell

Scotty99 said:


> Would anyone with a 1080 mind posting a firestrike score to compare to my 2060?
> 
> Just curious how they actually compare:
> https://www.3dmark.com/3dm/32546443?
> 
> 19,120 gpu score stock clocks.



Card set to default:
https://www.3dmark.com/fs/17950482


Light overclock:
https://www.3dmark.com/fs/17950726


Not sure what the error
"Level of Detail (LOD) settings modified by NVIDIA driver, result invalid. Check your video driver settings."
means. The only change I have made to the driver settings was to turn off G-Sync so that the benchmark would run without crashing to a black screen.


Regards


Bluebell


----------



## Scotty99

Bluebell said:


> Card set to default:
> https://www.3dmark.com/fs/17950482
> 
> 
> Light overclock:
> https://www.3dmark.com/fs/17950726
> 
> 
> Not sure what the error
> "Level of Detail (LOD) settings modified by NVIDIA driver, result invalid. Check your video driver settings."
> means. The only change I have made to the driver settings was to turn off G-Sync so that the benchmark would run without crashing to a black screen.
> 
> 
> Regards
> 
> 
> Bluebell


Cool, thx for testing; about 2k behind at stock. I just let EVGA Precision X1 do the OC scanner thing and it found +159 MHz, I'll take it lol


----------



## combatfun

Hi, I've got a Palit Super Jetstream GTX 1080 in a case with a lot of fans. The card has a 1x 8-pin + 1x 6-pin power supply. Currently I can achieve GPU 2063 MHz / VRAM 5500 MHz, 100% stable at ca. 62°C worst case. Benchmarks and a few hours of gaming are possible with higher clocks. I'm trying to reach 2200+ MHz on the GPU, 100% stable; 2300 MHz at 1.2 V would be awesome! As I flashed my GTX 780 several years ago, I was confident I could do this.

I extracted the original BIOS of my card with Palit ThunderMaster. Then I tried to flash it with the T4 mod BIOS, using nvflash. Sadly it doesn't work, no matter what I try. I've tried:
- Closing the NV driver / leaving it open
- Closing Palit ThunderMaster and MSI Afterburner and resetting the card to default settings / leaving those programs open
- Disabling the GTX 1080 display device / keeping it enabled
- Running nvflash as admin / running it normally
- Deleting the "64" out of the 64-bit version's name / not deleting it (the 32-bit version doesn't work)
- Two different versions of nvflash: the current version from Guru3D and an older version I found via Google. Sadly I can't find any older versions than that.

What happens is:
When I start nvflash, I get a list of possible commands, as visible in this screenshot:
https://abload.de/img/waedaxd0dk05.jpg

Sadly I can only press Enter to continue. Then I get more commands until the application closes at some point.
Or I can press Q to quit nvflash immediately. If I hit any other key, the same thing happens as if I pressed Enter: the command list just continues until the application closes. So I can't enter any command!

When I drag & drop the T4 mod BIOS file over nvflash (that's how I flashed my GTX 780), I get the error message "PCI subsystem ID mismatch", as visible in this screenshot:
https://abload.de/img/dasdwqaeurkmp.jpg

I'm using Win 10 Home 64-bit on an i7 3770 with 16 GB RAM. My power supply is 1000 W. Can anyone help me, or at least give me a link to download a very old nvflash version?


----------



## PlugFour

*T4 Bios not working as expected*

"PCI subsystem ID mismatch" is a normal message when flashing T4. You have to add this parameter : -6 (means "override SUB")


I'm here to get help too : I have a ZOTAC OEM 1080 which is identical to a FE, PCB and driver speaking. I just flashed the T4 bios. Everything behaves like described here : 

- No more power limit
- Unlocked voltage

The only "minor" flaw is that I can't go over 20 fps in any 3D application (Furmark, games, etc). All sensors react as if it were working at full speed, but there's no way to get past 15-20 fps.

Do you have an idea why I get this limitation? It's pretty sad, as everything looks OK otherwise. I have no FPS limiter, and nothing is throttling as far as I can see (my computer is currently in a room at 15°C).


----------



## combatfun

Thx man, that worked! But now the card's voltage is locked at 1.125 V. Increasing the voltage has no effect in the Asus tool, in Palit ThunderMaster, or even in MSI Afterburner.
Before, I had 1.093 V max. So I got a small increase, but it's far away from the desired 1.2 V.
Does anyone know how I can increase the GPU voltage with the T4 mod BIOS?

Regarding your 20 fps issue:
I had a similar issue with my GTX 780 a few years ago. The reason was that I made a mistake in the BIOS, and under 3D load the card jumped into the energy-saving performance state; I think it was called "P8", but I'm not sure anymore. Since you have issues and so do I, I think the T4 mod BIOS simply doesn't work on all cards.
Btw, make sure you have a 1x 8-pin + 1x 6-pin power connection on your graphics card. Otherwise it's too dangerous.


----------



## AlbertoM

Voltage is not locked on the T4 BIOS. The card increases voltage as temperature rises.

To override that you must set a custom voltage curve using MSI Afterburner. That's also how you go all the way up to 1.2 V.

For FE cards, I found that the best BIOS is the Palit GameRock Premium. T4 is too dangerous with only one 8-pin PCIe connector, and gives worse performance too.


----------



## combatfun

Ahhhh, I see. Many thx for that m8!
I will have a look at that and monitor how the card behaves with different settings. Then I will decide which setting to keep.
Again, thanks for the help!


----------



## kmeel

PlugFour said:


> "PCI subsystem ID mismatch" is a normal message when flashing T4. You have to add this parameter : -6 (means "override SUB")
> 
> 
> I'm here to get help too : I have a ZOTAC OEM 1080 which is identical to a FE, PCB and driver speaking. I just flashed the T4 bios. Everything behaves like described here :
> 
> - No more power limit
> - Unlocked voltage
> 
> The only "minor" flaw is that I can't go over 20fps in any 3D application (furmark, games, etc). All sensors react as if it was working at full speed, but no way to get past 15-20fps
>
> Do you have an idea why I get this limitation? It's pretty sad as everything looks OK otherwise. I have no FPS limiter, and nothing is throttling as far as I see (my computer is currently in a room @15°C).



I also experienced low fps (8-20 in the Heaven benchmark) after trying this BIOS on my NVIDIA 1080 FE today. Here's what I discovered.

1) GPU-Z states the memory type as Unknown.
2) There's a difference in the Device ID information (at the end of the parameters) in GPU-Z.
3) I saw in the MSI Afterburner monitor (which I always keep on a 2nd display) that my RAM is running at 405 MHz on this BIOS; stock, it should be running at 5000 MHz.

The low memory speed would explain the bad performance. I've put my backed-up BIOS back onto the card. Hope this info helps you.
Very curious if someone has an explanation or solution.


----------



## AlbertoM

Which BIOS are you flashing? This one?

https://www.techpowerup.com/vgabios/189498/asus-gtx1080-8192-160705


----------



## combatfun

No, the T4 BIOS from nrpeyton: https://www.overclock.net/forum/69-nvidia/1601288-official-nvidia-gtx-1080-owner-s-club-924.html

Just for reference:
The Palit Super Jetstream holds 2164 MHz @ 1.15 V.
Settings for that: +200 MHz base clock and a curve edit to 2200 MHz @ 1.15 V. The clock falls after a while to 2164 MHz, where it rests. Temps go up to 64°C, but the fans are not at 100%.

I can't go higher with 100% stability, not even with more voltage. That's only possible for a short time, to pass benchmarks.
To stay at 2200 MHz, I tried +200 base clock and a curve edit to 2235 MHz @ 1.181 V. After a while it falls and stays at 2202 MHz, before crashing at some point.

So I don't have the best card, but I'm glad I could gain ca. +100 MHz compared to the regular BIOS.
Many thanks once more!


----------



## kmeel

I've tried a Palit BIOS; this also makes the memory go to 405 MHz.

Also after a reboot.

I get this with pretty much all the 1080 BIOSes I tried, and I tried more than 10. The card is 9/10 months old. Could it be that this memory is different from the 'older' ones?
I can easily clock my memory to a max of +1000 MHz with the NZXT Kraken on it, which I think is unusual.

Guess I'll stay with my stock BIOS, as the results are decent. But I'd like to see how far I can push without the power limit.
With the original FE BIOS and Kraken:


----------



## PlugFour

+1000 MHz on memory seems huge! I didn't spend time trying to go over +500 MHz.
I'll flash the Palit BIOS, but I bet I'll get this strange memory frequency limitation too...

I have the Kraken adapter but haven't taken the time to install it so far. Can it achieve a better OC than the stock blower, given that in my setup the blower is not speed limited and it sits in a cool room?


----------



## kmeel

Better cooling = better performance, period.
I believe NVIDIA's Boost 3.0 downclocks really fast if the temps rise even a little, and this is not something we can edit. With the Kraken you will actively cool the RAM chips with the included fan. With stock cooling I couldn't get past +500 or +550 MHz on the memory. I think I was at +200 on the GPU, about the same as now; only, since the temps are way lower, it now stays fairly stable, and I never see it below 2000 MHz when gaming.
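The temperature behavior described above can be roughed out numerically: Pascal's GPU Boost 3.0 is commonly observed to shed one ~13 MHz bin every ~5°C once the core warms past the high 30s. A sketch of that model; all three constants are ballpark community observations, not official NVIDIA numbers:

```python
def estimated_boost(max_boost_mhz, core_temp_c,
                    bin_mhz=13.0, start_c=37.0, step_c=5.0):
    """Rough model of GPU Boost 3.0 thermal binning: above ~37C the
    card drops one ~13 MHz bin roughly every 5C. All constants are
    ballpark community observations, not spec values."""
    if core_temp_c <= start_c:
        return max_boost_mhz
    bins_lost = int((core_temp_c - start_c) // step_c)
    return max_boost_mhz - bins_lost * bin_mhz

print(estimated_boost(2126, 35))  # cool card -> full 2126
print(estimated_boost(2126, 62))  # warm card -> 2061.0
```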

Slap that sucker on 

When was your card purchased new?


----------



## PlugFour

I don't know, I'm not the first owner. It was bought mid-2017, I think.
About the freezing approach: on my Ryzen 2600X I currently have no room for OC. It is cooled by an AIO, the liquid is at 18°C (cold room), but even like that I can't set stable frequencies higher than the "automatic stock OC". I hit a sort of wall, probably amplified by the B350 Tomahawk.
This is why I was asking about my FE 1080. Considering I get +200/+500 with minor throttling in games, I'm not sure it is worth switching to the Kraken (atm the Kraken is on the Ryzen, and the G10 is on a shelf).

/mylife


----------



## AlbertoM

Well, I use nvflash version 5.370.0 and never had this memory issue.

Keep in mind that the Palit GameRock BIOS has the memory overclocked already: 1314 MHz vs the FE's 1251 MHz. That's 63 x 4 = +252 MHz that you would otherwise set in Afterburner.

So I already have +252 from the BIOS, and I found the max I can go while gaming without artifacting is +400 MHz, which would be +652 MHz in FE terms.

I'm on the EVGA Hybrid kit and temps never go beyond 60°C.
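For anyone following the arithmetic above, here it is as a throwaway Python helper (the x4 convention between the GPU-Z memory clock and the Afterburner slider is taken from the post itself; the function name is made up):

```python
def afterburner_mem_offset(bios_mem_mhz, fe_mem_mhz=1251.0):
    """Offset you'd dial into MSI Afterburner on a stock FE to match
    a BIOS's baked-in memory clock, using the 4-MHz-of-slider per
    1-MHz-of-GPU-Z-clock convention from the post above."""
    return (bios_mem_mhz - fe_mem_mhz) * 4

baked_in = afterburner_mem_offset(1314)  # Palit GameRock -> 252.0
print(baked_in)
print(baked_in + 400)  # the extra +400 on top -> 652.0 in "FE terms"
```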


----------



## kmeel

I'm on nvflash version 5.541.0.
Strange.. I could try an older version. Has there been strange behavior reported before that was fixed with older/other versions?


----------



## PlugFour

Tried the Palit, same memory problem.


----------



## kmeel

PlugFour said:


> Tried the Palit, same memory problem.


What nvflash version are you using?


----------



## PlugFour

The last one. I saw a guy with the same problem on another forum; he finally found a Strix BIOS that worked. I flashed this Strix BIOS, and everything works like my stock BIOS except the stock GPU frequency. After my custom OC I get the same result, so it's useless. I'm on my phone; I can tell you the Strix version later.


----------



## kmeel

I've also tried different BIOSes, including from Asus. Unfortunately the T4 is the one we actually want to work, since it has no power limit. I saw some BIOSes have a bigger power limit, e.g. 130% vs our 120%, but these also ran at 405 MHz on the memory on my card.
The only option I've got left is the shunt mod.


----------



## AlbertoM

kmeel said:


> I'm on nvflash Version 5.541.0.
> Strange.. I could try an older version. Has there been strange behavior reported before that has been fixed with older/other versions?


Yes, I remember seeing that somewhere. That's why I recommend that exact version, 5.370.0.

https://www.overclock.net/forum/attachment.php?attachmentid=50430&d=1509234517

It shouldn't be possible for the Palit GameRock BIOS to work on my FE and not on yours.


----------



## kmeel

Thanks Alberto!
I've tried the Asus T4 Bios with version 5.370.0, but with the same result as before.


----------



## kmeel

I'm having a hard time letting this go.
I flashed the BIOS again and noticed the "Bus Interface" is different once the BIOS is flashed. The render test within GPU-Z (the "?" icon next to the Bus Interface information) does not update this to the correct speed. (Source: http://www.tomshardware.co.uk/answers/id-1802985/pci-express-x16-running.html)

T4 (v2?) BIOS:

Original:


----------



## PlugFour

Hmm, we are not alone:

https://forums.evga.com/GTX-1080-running-on-a-really-slow-memory-clock-m2836846.aspx
https://www.overclockers.com/forums/showthread.php/788413-Memory-type-unknown-after-flashing-bios

By the way, the shunt mod didn't work on my card (even though it is a reference design). Many users report the same. I did the mod twice to be sure. Is there a good place for talking about shunt mods?


----------



## AlbertoM

Did you guys reinstall the drivers and reboot the machine afterwards?

The T4 BIOS I used was the 1936 MHz boost clock one (86.04.17.00.F8).

And the Palit is the 276 W TDP one (86.04.3B.00.67).

I followed exactly the flash procedure described here: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html

That is:

First: Disable GPU device on Device Manager

Then run those commands as ADMIN:

-- Disable protection:

nvflash64 --protectoff

-- Then to backup original bios:

nvflash64 --save filename.rom

-- To flash bios:

nvflash64 -6 biosfilename.rom

Press 'y' to all the warnings.

-- After that reboot, enable device, windows will auto reinstall drivers, then another reboot.

That's what mine shows with the Palit BIOS (with the render test on and default clocks):
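Side note: before flashing anything with -6 (which skips the subsystem-ID safety check), it's worth at least confirming the .rom file is structurally sane. A PCI expansion ROM starts with the 0x55 0xAA signature, byte 2 gives the image length in 512-byte blocks, and the image bytes sum to 0 mod 256. A minimal Python sketch; the filename is hypothetical, and passing this check says nothing about whether the BIOS actually suits your board:

```python
def looks_like_option_rom(rom: bytes) -> bool:
    """Structural sanity check on a VBIOS dump: PCI expansion ROM
    signature (0x55 0xAA), a plausible image length, and a zero
    byte-sum checksum over the image."""
    if len(rom) < 3 or rom[0] != 0x55 or rom[1] != 0xAA:
        return False
    length = rom[2] * 512  # image length is stored in 512-byte blocks
    if length == 0 or length > len(rom):
        return False
    return sum(rom[:length]) % 256 == 0

# usage (hypothetical filename):
# rom = open("gp104_backup.rom", "rb").read()
# print(looks_like_option_rom(rom))
```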


----------



## PlugFour

*Thank you for your flash summary*

I did this except "disable/enable GPU in device manager". *But* the nvflash version I use does this for me: disable before flashing and enable after.
I didn't clean/reinstall the Nvidia drivers. But neither did you. Maybe I should.

*EDIT*: another guy got this problem on only one of his 6 identical Palit cards, and says Palit support solved it. He doesn't clearly explain how Palit solved the problem, but he provides a specific version of nvflash, together with a Palit ROM version (11718) that worked, which is attached in the last post.


----------



## AlbertoM

Forgot to mention: run the CMD prompt as Administrator for the flash commands.

And it's really important to use nvflash64 with the '-6' parameter, or the '--overridesub' parameter.

And when flashing you have to press 'Y' to override everything.

nvflash64 -6 biosfilename.rom

Press 'y' to all the warnings.

And I found this guy who solved the problem:

https://www.techpowerup.com/forums/...after-vbios-update-from-official-site.249166/

Edit: I tried the MSI 291W BIOS on my FE. No good, the Palit is waaaaaaaay better.


----------



## PlugFour

*BIOS alternatives for people having the Unknown Memory problem*

This post is aimed at people whose memory frequency is stuck at 202/405/810 MHz when flashing any other 1080 BIOS, with GPU-Z saying "Unknown Memory".

Please find below a list of working BIOSes, based on people who solved the same problem. For now, @kmeel confirms that the Palit SuperJetstream shows the best performance, since it gives more juice and less throttling than his stock FE BIOS and the other BIOSes. Use it with caution, as this involves more power consumption. Don't forget that FE cards only have an 8-pin power connector: keep your case cool in order to prevent the connector from overheating.

*--------------------------
BIOS Palit SuperJetstream
--------------------------*
See here
Download : https://www.techpowerup.com/vgabios/196736/196736
--------
Specs 
-------- 
VBIOS Version:	86.04.60.00.48
BIOS Build date:	2017-09-18 00:00:00
Board power limit
Target: 200.0 W
Limit: 240.0 W
Adj. Range: -55%, +20%
Thermal limits
Rated: 83.0C
Max: 92.0C
Memory Support
GDDR5X, Micron
Boost Clock: 1848 MHz

----
GPU
----
Device Id:	10DE 1B80
Subsystem Id:	10DE 1B80
GPU Clock:	1709 MHz
Boost Clock:	1848 MHz
Memory Clock:	1251 MHz

@kmeel's FE flashed with this BIOS shows a 3% score increase *clock to clock* in Firestrike.

*-----
BIOS Gigabyte GTX 1080 8 GB 
-----*
https://www.techpowerup.com/vgabios/203217/gigabyte-gtx1080-8192-180128

(Turbo OC)
VBIOS Version:	86.04.60.00.DB
BIOS Build date:	2018-01-28 00:00:00
GPU Device Id: 0x10DE 0x1B80
GV-N1080TTOC-8GD/F20/065F
Board power limit
Target: 180.0 W
Limit: 216.0 W
Adj. Range: -50%, +20%
Thermal limits
Rated: 83.0C
Max: 92.0C
Memory Support
GDDR5X, Micron
Boost Clock: 1772 MHz

--
GPU INFO
--
Manufacturer:	Gigabyte
Device Id:	10DE 1B80
Subsystem Id:	1458 3730
GPU Clock:	1633 MHz
Boost Clock:	1772 MHz
Memory Clock:	1251 MHz

*-----
BIOS Strix 1080 (STRIX-GTX1080-A8G-GAMING)
-----*
See : https://www.overclockers.com/forums/showthread.php/788413-Memory-type-unknown-after-flashing-bios
Working BIOS : https://www.techpowerup.com/vgabios/197719/asus-gtx1080-8192-171016

VBIOS Version:	86.04.60.00.BF
BIOS Build date:	2017-10-16 00:00:00
GPU Device Id: 0x10DE 0x1B80
GTX1080 VB Ver 86.04.60.00.AS03
Board power limit
Target: 198.0 W
Limit: 238.0 W
Adj. Range: -55%, +20%
Thermal limits
Rated: 83.0C
Max: 92.0C
Memory Support
GDDR5X, Micron
Boost Clock: 1810 MHz


--
GPU INFO
--
Manufacturer:	Asus
Device Id:	10DE 1B80
Subsystem Id:	1043 85AA
GPU Clock:	1671 MHz
Boost Clock:	1810 MHz
Memory Clock:	1251 MHz

*---------
EVGA FTW identical Problem 
---------*
No solution found.
See : https://forums.evga.com/MY-GTX-1080-FTW-DT-Mem-clock-Stuck-405-MHz-m2808644.aspx


*------
Gigabyte G1 identical problem
------*
Working BIOS here, said to be F30 version : https://www.techpowerup.com/vgabios/196623/196623
See : https://linustechtips.com/main/topic/876627-bad-gpu-performance-after-bios-update-gtx-1080/?page=2
--
BIOS
--
VBIOS Version:	86.04.60.00.B4
BIOS Build date:	2017-10-12
GPU Device Id: 0x10DE 0x1B80
GV-N1080G1 GAMING-8GD/F30/0653
Board power limit
Target: 200.0 W
Limit: 216.0 W
Adj. Range: -50%, +8%
Thermal limits
Rated: 83.0C
Max: 92.0C
Memory Boost Clock: 1835 MHz

--
GPU
--
Device Id:	10DE 1B80
Subsystem Id:	1458 3702
Interface:	PCI-E
Memory Size:	8192 MB
GPU Clock:	1696 MHz
Boost Clock:	1835 MHz
Memory Clock:	1251 MHz


*------
NVIdia drivers related problem
------*
See here : https://forums.geforce.com/default/...ivers/nvidia-drivers-375-86-low-memory-clock/
You will learn that some BIOSes had their memory frequency limited to 202/405/810 MHz due to a driver bug, which was fixed in 2016.
One guy even says the problem persisted after the driver fix.
" I had installed 375.70 drivers and im still with the problems. I play arround 12 fps now  Maybe the 375.86 change the firmware of the video card. I dont know..."


----------



## AlbertoM

For the FE/reference 1080, I already tried all the possible BIOSes, and the best is the Palit GameRock Premium, 276 W TDP.

It's even faster than T4 in all the Firestrike tests; of course I'm on an AIO, and I don't know what the chip could do with really cold temps and the extra voltage of T4... Anyway, you can't use too much power with only one 8-pin socket.

I tried more than 5 BIOSes for sure, from all kinds of vendors, and NEVER saw this memory issue.

What you do get is BIOSes that oscillate the boost clocks and voltage all the time (most of them), and BIOSes that hold the clocks and voltage (if you set a curve), like the Palit and T4. The memory keeps its clocks all the time (under 3D load), and it's always recognized in GPU-Z.

And it doesn't make sense. All 1080s have the same memory.


----------



## PlugFour

Thank you, but the Palit is not OK for "us". I'm looking for a working alternative.


----------



## kmeel

PlugFour said:


> Please find hereunder the summary of all working BIOSes based on people who had the same problem. Maybe somebody will find the common point and find a magic bios that gives astoning performance
> ....
> ..


Nice find!! Did you try 'em?
I just flashed the first one, "BIOS Palit SuperJetstream (2_11718-G108060M0N1.rom)", with the most recent nvflash version, 5.541.0, with a positive result.
With the same clock speeds as the FE overclock I'm hitting 500 points more in Firestrike.

1080fe WC stock:
Firestrike:	18450
graphics:	21573 103.07 & 86.06 fps
physics:	15826 50.24 fps
combined:	10044 46.72 fps

1080fe WC OC @ 2121 Mhz core, 6000 mem. 
Firestrike:	20311
graphics:	24312 116.26 & 96.91 fps
physics:	15?35 50.35 fps
combined:	11201 52.10 fps

BIOS Palit SuperJetstream WC OC @ 2126 mhz core / 6000 mem
FireStrike	20727
graphics	25155 120.84 & 99.89 fps
physics 15666 49.74 fps 
Combined	11294 52.53 fps

Gonna try the other suggestions within 24hrs.
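For anyone eyeballing the gains, the graphics-score deltas above work out like this (a throwaway helper; the scores are copied from the runs listed in the post):

```python
def pct_gain(new_score, old_score):
    """Percentage change between two Firestrike graphics scores."""
    return round((new_score / old_score - 1) * 100, 1)

print(pct_gain(24312, 21573))  # FE OC vs FE stock -> 12.7
print(pct_gain(25155, 24312))  # Palit SJS vs FE OC, clock for clock -> 3.5
```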


----------



## PlugFour

I won't be able to try them soon, so be my guest and tell me how good they are. By the way, the frequencies you use are pretty high; are you on air cooling or an AIO?


----------



## kmeel

Will update here, today is play day.

I've got a Kraken G12 bracket with a Kraken X31 AIO (120 mm) mounted to the 1080.
These frequencies are stable for me. The memory is maxed out (I think I'm lucky on this one), and the core gives issues when pushed any higher. Also with the Palit BIOS.

After looking further into the 'Palit SuperJetstream' BIOS, I'm seeing it ramp up to higher power and higher TDP settings than the FE. The information in the original post was _incorrect_, now corrected, and I'm happy with that, as it has a raised power limit, which explains the boost in performance I'm experiencing. It is now purely limited by the power consumption of the vcore.
As the source link describes, this ROM was sent by Palit to the user to fix his 5 cards that were having the same issue. I wonder which card this BIOS is actually from.

Edit: after looking up the VBIOS version, this is a match:
https://www.techpowerup.com/vgabios/196736/196736


----------



## PlugFour

I updated my summary using your findings. Very interesting. If you test other BIOSes please use the same frequency and give your firestrike score, so that I can add this to the post. I think the F30 BIOS can be a good competitor to the Palit one. I'm curious to know how it will behave on your setup.


----------



## kmeel

Alright, that went pretty quick: all BIOSes worked, so let's jump right into it.
(For new readers, I'm referring to this post.)

-----
*BIOS Gigabyte GTX 1080 8 GB *
-----
Working BIOS for me, it's running fine. Although, due to the not-so-interesting power/TDP settings, I didn't bother with a Firestrike benchmark.
But here's an indication of where it was running in Heaven @ 2126 MHz on the core, 6000 MHz on the memory. It's throttling as expected.
-----
*BIOS Strix 1080 (STRIX-GTX1080-A8G-GAMING)*
-----
Working BIOS for me, it's running fine; although it has higher power/TDP settings, somehow it's still throttling @ 2126 MHz on the core, 6000 MHz on the memory.
Firestrike: 20372 (graphics score: 24533). So this is just a little better than the overclocked original BIOS, with +61 points, but 440-ish points lower vs the Palit Jetstream BIOS.

------
*Gigabyte G1 identical problem (Gigabyte GTX 1080 8 GB BIOS)*
------
The info in the original post is incorrect; the BIOS in the link is different from the card having the issue: VBIOS version 86.04.60.00.B4 instead of 86.04.60.00.BF.
The BIOS is working on my card, but here I also didn't bother with a benchmark due to the power settings. Also note the Adj. Range: -50%, +8%.

The Palit Jetstream BIOS seems to be the best one out there for us thus far.


----------



## white owl

I used the 8-pin Palit BIOS and it was great; I did get better scores at the same clocks and could clock a little higher, but the BIOS also has a really low fan speed, which allowed my GPU to thermal throttle...not downclock...throttle.
I'd like to fit some 120 mm fans to the stock cooler and just run them off the CPU temps, but that would be pretty fugly. Replacing the cooler costs over $100 no matter what you use (air or water), so that doesn't seem worth it.


I haven't come across another reference BIOS that's similar but with faster fans.


----------



## PlugFour

Why don't you simply set a custom fan curve under msi afterburner?


----------



## AlbertoM

Weird that the Strix BIOS works and T4 doesn't. They are the same BIOS for the same card, just with different settings.


----------



## white owl

PlugFour said:


> Why don't you simply set a custom fan curve under msi afterburner?


I don't use Afterburner, and the fan speed is -30% in the BIOS; AB can't do anything about it.
I seriously hate what they did to this arch.


----------



## doom3crazy

Hi everyone. I am thinking about moving from my 980 Ti to a GTX 1080, and if possible I would like to get the fastest (or rather, one of the fastest) from the start, for overclocking potential to close the gap between the 1080 and 1080 Ti. 1080 cards on average are going for around $300-350 on eBay, and that seems like a pretty good price.

I was just curious if some of you could chime in on what 1080 card you have and what kind of max over clocks you have achieved.


----------



## PlugFour

white owl said:


> PlugFour said:
> 
> 
> 
> Why don't you simply set a custom fan curve under msi afterburner?
> 
> 
> 
> I don't use afterburner and the fan speed is -30% in the BIOS, AB can't do anything about it.
> I seriously hate what they did to this arch.

I have the same problem; I can't use it. The F30 is power limited on my card, unusable as well. Will try the Strix again.


----------



## white owl

doom3crazy said:


> Hi everyone. I am thinking about moving from my 980 ti to a gtx 1080 card and if possible I would like to get the fastest(or rather one of the fastest) from the start for overclocking potential to close the gap between the 1080 and 1080 ti. The 1080 cards on average are going around 300-350$ on ebay and that seems like a pretty good price.
> 
> I was just curious if some of you could chime in on what 1080 card you have and what kind of max over clocks you have achieved.


I've mentioned the G1 and FTW already. Depending on where you live, Palit and Galax also have similar GPUs.
If you can find a Windforce card, it can be flashed to a G1, but the G1s are a higher bin.
Not sure what your budget is but:
https://www.overclock.net/forum/14779-video/1717168-fs-4-evga-1080ti.html


----------



## combatfun

Incredible how many MHz some guys can already achieve without a voltage increase. My Palit SuperJetstream holds 2063-2076/5505 @ 1.081/1.093 V. With the Asus T4 mod BIOS, after days of testing, I could achieve 2114-2126/5505 @ 1.31 V. That's my 24/7 gaming setup, 100% stable, running at ~60-70°C (temp target 70°C, so the fans go ~70-100%). I can't hold 2150-2164 100% stable, not even with higher voltages; that's only good for passing benchmarks :/
But yeah, I never noticed that I ran into power limits with my Palit SuperJetstream default BIOS with OC in Afterburner. If that was the case, with the T4 mod BIOS it isn't anymore ^^


----------



## PlugFour

Actually, using my stock BIOS and stock cooler on my FE card, it runs rock stable at [email protected] without throttling. This is why tweaking the BIOS is just a hobby to me; I wanted to see the limits of the card.


----------



## TwilightRavens

Just thought I'd ask a question, since I just bought a 1080 SC from eBay to replace my aging 290X (haven't received it yet though). I hear about 99.9% of 1080s will roughly hit 2 GHz with little to no effort; is that true? Just curious, because that's about where my goal for overclocking it is.


----------



## derx

TwilightRavens said:


> Just thought I’d ask a question since I just bought a 1080 SC from ebay to replace my aging 290X (haven’t received it yet though) but I hear about 99.9% of 1080’s will roughly hit 2GHz with little to no effort, is that true? Just curious because that’s about where my goal is as far as overclocking it will go.


I think most of them hit between 2.0 and 2.1 GHz. If you're lucky, your card hits 2.2 GHz.
I'm one of the lucky ones. I got my card to 2254 on the Asus T4 BIOS on water. Now I'm back on the Palit Jetstream BIOS, as it crashes in the game I play (iRacing) when I'm at 2.2 GHz, and the Palit has a bit better performance at lower clocks. It just can't hit the 1.2 volts like the T4 BIOS does. As I only went to the Palit BIOS today, I've not yet tested it to the limit, but it does [email protected] stable. Memory I've kept at 5700 for now.


----------



## white owl

TwilightRavens said:


> Just thought I’d ask a question since I just bought a 1080 SC from ebay to replace my aging 290X (haven’t received it yet though) but I hear about 99.9% of 1080’s will roughly hit 2GHz with little to no effort, is that true? Just curious because that’s about where my goal is as far as overclocking it will go.


My SC only runs 1990 100% stable. Sadly, even at 100% fan speed the cooler keeps the core in the 70s under full load, so the boost just trails off to 1960MHz and stays there.
I had better results with the Palit BIOS, but since that card's fans are much slower, its BIOS wouldn't let my fans go to 100%; it was closer to 70%, and the card ran very hot like that. I don't know what the next best BIOS to try on a reference card would be. I'd steal Jensen's jacket to enable custom BIOSes again.

I miss OCing in the BIOS so much. My 980 couldn't OC very far (+40-60MHz) with AB, but after modding the stock BIOS it would run just under 1500MHz 100% stable.


----------



## TwilightRavens

white owl said:


> My SC only runs 1990 100% stable. Sadly the cooler even at 100% fan speed will keep the core in the 70s under 100% load so the boost just trails off into 1960mhz where it stays.
> I had better results with the Palit BIOS but since the fans for that card were much slower the BIOS prevented the fans from going to 100%, it was closer to 70% and it ran very hot like that. I don't know what the next best BIOS would be to try on a reference card though. I'd steal Jensen's jacket to enable custom BIOS again.
> 
> I miss OCing in the BIOS so much, my 980 couldn't OC very far (+40-60Mhz) with AB but after modding the stock BIOS it would run just under 1500mhz 100% stable.


I mean I am debating throwing my Kraken G10 and Corsair H75 on there (they're currently cooling my 290X), but if I do that it leaves the 290X without a cooler, as I lost some of the parts to its stock cooler. The reason that's an issue is that I had planned on moving it to my 790i board with my Core 2 Extreme, as it would be a heck of an upgrade from 660 SLI. I may try flashing a BIOS from a higher-clocked model and see what happens, but I've always been skeptical of cross-vendor flashes.

Edit: Do you think a Founders Edition BIOS would allow a higher fan curve, or would it be about the same? I would think it would allow it to go higher, but I could see Nvidia limiting it because blower coolers can get loud af, ex: my Galaxy blower-model 660 at 100% sounds a lot like a jet turbine.


----------



## white owl

TwilightRavens said:


> I mean I am debating on throwing my Kraken G10 and Corsair H75 on there which is currently cooling my 290X, but if I do that it leaves the 290X without a cooler as I lost some of the parts to its stock cooler. Reason that is an issue is because I had planned on moving it to my 790i board with my Core 2 Extreme as it would be a heck of an upgrade form 660 SLI. I may try flashing a BIOS from a higher clocked model and see what happens, but I've always been skeptical of cross vendor flashes.
> 
> Edit: Do you think a Founders Edition bios would allow for a higher fan curve or would it be about the same? I would think it would allow for it to go higher but I could see Nvidia limiting it because blower coolers can get loud af ex: my Galaxy blower model 660 at 100% sounds a lot like a jet turbine.


Well the SC has a good default fan profile IMO; it's a pretty quiet card. Not sure what the fan parameters are on the FE cards, but it would have the same TDP the SC already has, so you'd only be in it for the fan profile... I think blowers are lower RPM, but I'm not 100% sure. You can cross flash, but I'm pretty sure it only works if you have a reference design and flash a reference-design BIOS. I'm pretty sure all 8-pin 1080s are reference as well; some do have different video outputs though, so if you used one of those you'd probably lose one of the DP ports.


I'm actually fine with the GTX 1080 at 2000MHz or above; it runs 1440p really well. Adding 100MHz would only be a 5% difference in speed and would equate to an even smaller percentage increase in games. The extra TDP gave me an extra 300 points in Superposition medium even though it was thermal throttling (and I don't mean losing boost lol). 300 is a lot in that bench.
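That 5% figure is just the clock ratio; a quick sketch to sanity-check it (the numbers are the ones from this post, nothing card-specific):

```python
def clock_gain_pct(base_mhz: float, new_mhz: float) -> float:
    """Percent increase in core clock going from base_mhz to new_mhz."""
    return (new_mhz - base_mhz) / base_mhz * 100.0

# 100MHz on top of a 2000MHz baseline:
print(clock_gain_pct(2000, 2100))  # 5.0
```

Game framerates scale sub-linearly with core clock, which is why the in-game gain ends up smaller than this.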


----------



## PlugFour

*2080Mhz*



derx said:


> I think most of them hit between 2 and 2.1Ghz. If you're lucky your card hits 2.2Ghz.
> I'm one of the lucky ones. Got my card to 2254 on the Asus T4 bios on water. Now I'm back to the Palit Jetstream Bios as it crashes in the game I play (iRacing) when I'm at 2.2Ghz, and the Palit has a bit better performance with lower clocks. It just can't hit the 1.2volts like the T4 bios does. As I went to the Palit bios today I've not yet tested it to the limit, but it does [email protected] stable. Memory I've kept at 5700 for now.


Hello, indeed you're lucky.
Previously I said I could reach 2.15GHz with my 1080 FE; in fact that wasn't stable. The maximum frequency I can get is 2.08GHz, and it's exactly like a wall. At this frequency it is rock stable; if I add 25MHz, it crashes quickly in some games, even if I keep the GPU cool in a cold room. 
After some googling, it looks like the 1070, 1080 and 1080 Ti all have this 2080MHz wall. Many people are stuck at this frequency, and several mention a wall but never explain why. If somebody knows the reason... It seems to be something other than the silicon lottery.


----------



## TwilightRavens

Yeah, as long as I can get 2GHz I’m alright with that, even if it's 2001MHz. The card arrived today; I'll probably test later tonight and post my results.

Edit: Did some testing on the card. Without doing anything in Afterburner except creating a fan curve, it boosted straight to 2012MHz, at stock power limit, stock voltage, etc.


----------



## TwilightRavens

Furthest I’ve had mine boost on stock volts with a +100MHz overclock is 2100MHz; it fluctuates between about 2066 and 2088MHz depending on the temp it stays at. Afterburner does let me do 100% fan speed if I want, but it's freaking loud, so 75% is as high as I can go without hearing it over everything. I’d say I’m pretty lucky I guess. I’ve heard that Pascal is mostly limited to what, 1.09v or something, so I'm assuming moving the voltage slider wouldn’t really do a whole lot, and I opted not to even bother with “overvolting”. If I need more GPU horsepower in the future then I may do it.

It is pretty nice to actually see my i7-5775C above 30% utilization for once in its life during games, for the longest time I thought I was CPU bottlenecked but my old 290X really was holding me back way worse than I thought. 

Also, I'd like to add in case anyone was wondering: don’t use driver 418.81. That driver prevented my 1080 from going into its idle state of 139MHz; it just stayed right at 1709MHz with the fans not spinning at all, so temps would creep up into the 60-70C range. I dropped down to 417.71 (I think that was the version) and it works as it should.


----------



## naveediftikhar

hey guys, need a little help and a heads up.

i am buying a new Gigabyte Windforce 1080 OC edition today or tomorrow (depending on time). can anyone guide me regarding:

what sort of base performance i should be expecting with Boost 3.0, and also

since this card comes with only one 8-pin, what BIOSes can i flash to ensure maximum performance (i mean Palit, Strix T4, or any other)?

how should i play with the voltage curve? i am not new to overclocking, but since i was on a 980 Ti earlier i have no idea how the voltage curve behaves here. i know that keeping the card cool gives the best results, but since i can't do a loop i'd settle for the best temp-to-performance ratio.

and last, how good is this Gigabyte version actually? i can buy a Palit ($40 less) or a Zotac Mini ($10 less), but both are well used. with this one the seller says it's brand new, box sealed, hence i'm kinda inclining towards it.

thx.


----------



## naveediftikhar

so guys... i ended up buying the Gigabyte Windforce just because it actually was box-sealed new as the seller said. but, as is my luck always, i ended up losing the silicon lottery. my card stock-boosted up to 1911MHz and settled in the late 1800s, with temps around 64C on my custom fan curve.

i then started to play with the clocks. adding 25MHz per increment and running Heaven, i started to crash at anything above 2012MHz. raising the voltage or power limit only showed that 2012 ran with higher volts + power throttling. and since the chip itself is as bad as my ******* LUCK (i really like oc'ing for the fun of it), i don't think it can handle even 2050. however, since i had my stable core clock, i thought i might as well lower the volts for the sake of it.

playing with the v/f curve for that challenge, starting from max volt and power, i figured i can at least achieve 2000MHz at 1.05v. at this point the card doesn't power throttle much and stays between 1950-2000 (2000 90% of the time, thanks to the lower volts). i couldn't find time to test the same freq at even lower volts.

i might as well mention all these tests were done with memory clocked at +550.

what do u guys think should be the lowest volt for 2000MHz? i mean i really haven't got the energy to test every voltage between 1.05 and 1. so any ideas would be nice.

another strange thing i noticed was that with the memory oc'd, my card power throttles harder than with only a core oc. any ideas on that as well?

as i said earlier, i'm not new to oc'ing and did my fair part on a 970 and a 980 Ti.

another thing to add is that i can't water cool. my custom fan curve at 65% gives me 65C. i don't wanna raise the fans much due to noise.

closing request: how can i raise my performance under the above conditions?

ty.

lolz i'm really sry this post is getting long, but i thought i should mention: i honestly read like the last 100 pages, but i really couldn't find much about the Windforce. so if anyone can actually link me to a possible high-power-limit BIOS for my 8-pin
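On finding the lowest stable voltage without testing every step: a binary search cuts the number of runs to a handful. A minimal sketch; the `is_stable` probe is hypothetical and stands in for you actually running Heaven at that voltage point:

```python
def lowest_stable_mv(lo_mv, hi_mv, is_stable, step_mv=7):
    """Binary-search the lowest voltage (millivolts) passing a stability
    probe. lo_mv is assumed unstable, hi_mv assumed stable; step_mv is
    the granularity you care about (Pascal V/F points are ~6mV apart)."""
    while hi_mv - lo_mv > step_mv:
        mid = (lo_mv + hi_mv) // 2
        if is_stable(mid):
            hi_mv = mid      # stable: try lower
        else:
            lo_mv = mid      # crashed: need more voltage
    return hi_mv

# Toy probe: pretend the card is stable from 1043mV upward.
floor = lowest_stable_mv(1000, 1093, lambda mv: mv >= 1043)
print(floor)  # lands within one step of the true 1043mV floor
```

With a ~90mV window and ~6mV steps, that's about 4 Heaven runs instead of 15.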


----------



## white owl

FYI, you need to do the core and the memory separately; typically you dial in the core and get it totally stable before messing with the memory. Since these cards have ECC, you NEED to run the Heaven or Valley benchmark (windowed, high res) and enable "walk" in the camera settings so you render the exact same thing over and over. Now put AB on top of Heaven/Valley and bring the memory up while watching the framerate. Once you see the framerate start dropping, take the speed back about 20MHz or so.
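That step-and-watch routine is simple enough to write down as a loop. Everything here is simulated: `fps_at` is a hypothetical probe standing in for you reading Heaven's framerate at a given memory offset:

```python
def max_mem_offset(fps_at, start=0, step=25, limit=1000, margin=20):
    """Walk the memory offset up until error-correction retries start
    eating framerate, then back off by `margin` MHz (the ~20MHz safety
    step suggested above)."""
    best_fps = fps_at(start)
    offset = start
    while offset + step <= limit:
        offset += step
        fps = fps_at(offset)
        if fps < best_fps:   # framerate regressed: past the sweet spot
            return offset - step - margin
        best_fps = fps
    return offset - margin

# Toy framerate curve: small gains up to +500, retries cost fps beyond.
curve = lambda off: 100 + off * 0.01 - max(0, off - 500) * 0.05
print(max_mem_offset(curve))  # 480
```

The reason the manual version works at all is exactly what's described above: GDDR5/5X error handling masks instability as lost performance rather than a crash, so the framerate is the signal.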




Not sure how accurate this is today, but in the past we had the G1 Gaming and the Windforce: the same exact cards, except the G1s were binned higher with fewer limitations in the BIOS. You might try flashing the G1 BIOS before anything else, since you know it will work. When cross flashing you need to flash from a GPU with the same power-connector count, or there's a good chance of bricking the card or getting bad performance.
2000MHz is really good; I can't even get 2000MHz in games. Mine usually only runs at around 1960MHz.


----------



## naveediftikhar

white owl said:


> Fyi you need to do the core or the memory first. Typically you go for core and get it totally stable before messing with memory, since these cards have ECC you NEED to run heaven or valley benchmark (windowed, high res) and enable "walk" in the camera settings so you can render the exact same thing over and over. Now put AB over top Heaven/Valley and bring the memory up while watching the frame rate. Once you see the frame rate start dropping take the speed back about 20Mhz or so.
> 
> 
> 
> 
> Not sure how accurate this is today but in the past we had the G1 Gaming and the Windforce. Same exact cards but the G1s were binned higher with less limitation in the BIOS. You might try flashing the G1 before anything else since you know it will work 100%. When cross flashing you need to flash from a GPU with the same pin count or there's a good chance of bricking it or getting bad performance.
> 2000Mhz is really good, I can't even get 2000mhz in games. Usually only runs at around 1960mhz.


ty for replying man. what u said is exactly what i do: i benched using Heaven in windowed mode with AB, and the core oc's were found without a mem oc. i haven't really found the max mem oc, but thought +550 was ok enough (i did bench +700 on mem but dialed it back coz i was tired and didn't want to chase the max; i might do it today and find lower volts as well).

yeah, i kinda realized that 8-pin to 8-pin is the best BIOS bet. but do u really think the G1, with the better-binned chip, would be any different at the BIOS level? i'm not saying maxwell and pascal are the same, but i don't really think the basics of BIOS engineering differ a lot. i mean, i've gone through a lot of BIOSes with my 980 Ti, and the only differences i could ever find were power draw and higher base clocks. and since i'm sure the WF3 chip is no champ, higher base clocks might cripple my sad little chip, plus the G1 has the same power draw as the WF. that's all just my guess... maybe i'm wrong. but does the BIOS level really differ beyond what is evident, i.e. power draw and base clocks?


----------



## naveediftikhar

ok, another update. i'm gonna keep this thread alive... lolz, kidding. i really appreciate any input i can get. anyways:

played with lower volts at 2000MHz. went to 1.043, which was stable, then to 1.032, which crashed immediately. hence i went back up to 1.05 at 2000MHz for a little headroom.

tweaked the v/f curve a bit more for more stable clocks. tried checking my overclock with the AB OC scanner; it gave me 90% confidence.

then later, not having satisfied my hunger to extract the max, i flashed the latest G1 Gaming BIOS from TechPowerUp using the latest nvflash (i guess it's 5.XXX). it flashed fine, but upon reboot my memory was stuck at 405MHz... gave me a scare. flashed my backed-up stock BIOS and all went fine again.

so i guess this is it. unless someone can tell me what went wrong with my flashing, or whether flashing a G1 BIOS would even prove beneficial, i'm ok with my clocks: getting 64 fps in Shadow of the Tomb Raider @ 1440p, all settings maxed out (stock AA).


----------



## white owl

Seriously, 2000MHz on air is not bad; I don't even get that, and I have great framerates at 1440p. As for the memory, just read back a few pages; your card could have the same BIOS issue theirs did.


----------



## naveediftikhar

white owl said:


> Seriously 2000mhz on air is not bad, I don't even get that and have great framerate at 1440p. As for the memory just read a few pages back, you're card could have the same BIOS issue theirs did.


yeah man, i guess... thx for helping. will keep updating if i find anything useful for others.


----------



## doom3crazy

Hi guys. I am now officially an owner of an EVGA SC GTX 1080. I am a little late to the game (came from a 980 Ti) but I am excited. I wanted to go over a couple things with you guys.

1. From what I remember reading, because there are no custom BIOSes with Pascal and it's pretty locked down voltage-wise, the best way to get the best overclocks is keeping thermals down. Is this true?

2. ^^ If true, outside of keeping my side panel off and putting the card on water, does anyone have any cool DIY cooling tips/tricks they wanna share? I currently have an old 3000rpm 80mm fan sitting atop the card over the VRM area (a little tip someone gave me back in the Maxwell days with my 980 Ti).

3. Does anyone have a certain driver they love and feel performs the best / is most stable with the 1080? I am on the latest 418.91 and it seems fine, but of course if there's something better out there I want it. 

4. Does core clock or memory clock make the bigger difference in added fps? Right now, without much tweaking, I am stable at 2100MHz on the core and 5200MHz on the memory. 

5. Just double checking: there aren't any worthwhile custom BIOSes for the GTX 1080, correct? Lol.


----------



## white owl

You can run almost any memory speed without crashing because of the ECC. To see where your RAM speed should be, simply run Heaven in "walk" mode while looking at something (I use the dragon), tab over to your OC software, then bump the memory until FPS drops and set it about 20MHz under that.
Pascal uses a voltage curve; you're not limited in speed, just in the voltage you can stabilize with. With Maxwell we could bypass this in the BIOS so the card would always stay in the highest voltage range until it throttled. With Pascal the BIOS is locked, so you can't change the voltage curve, but you can increase the voltage and TDP. The SC doesn't have a great cooler, so I re-pasted my card with MX-4.
Each card is different, so you'll need to figure out whether adding voltage helps in your situation; in my case it didn't do much.
There are no custom BIOSes for these cards, but you can cross flash the (8-pin reference) Palit BIOS for more TDP. That will limit your fan speed a lot though, so it's not useful on the SC.
I've never had any issues with any drivers that were clean installed. I'm using 417.71 because it was the first with VRR enabled; if that hadn't come out I'd still be using something from months ago. Glad to FINALLY have VRR after all this time.
To keep the card cool you simply need good case airflow: get cool air into the card and get the hot air out of the case. What are you getting now while benching?


----------



## doom3crazy

white owl said:


> You can run almost any memory speed without crashing because of the ECC. To see where your ram speed should be simply run heaven in "walk" while looking at something ( I use the dragon), tab over to your OC software then bump the mem until FPS drops, set it about 20mhz under that.
> Pascal uses a voltage curve, you're not limited in speed, just voltage to stabilize with. With Maxwell we could bypass this in the BIOS so it would always stay in the highest voltage range until it throttled. With Pascal the BIOS is locked you you can't change the voltage curve but your can increase the voltage and TDP. The SC doesn't have a great cooler so I re-pasted my card with MX4.
> Each card is different so you'll need to figure out if adding voltage helps your situation, in my case it didn't do much.
> There are no custom BIOS for the cards but you can cross flash the (8 pin reference) Palit BIOS for more TDP, this will limit your fan speed a lot though so it's not useful on the SC.
> I've never had any issues with any drivers that were clean installed. I'm using 417.71 becasue it was the first with VRR enabled, if that hadn't come out I'd still be using something from months ago. Glad to FINALLY have VRR after all this time.
> To keep the card cool you simply need good case flow. Get cool air into the card and get the hot air out of the case. What are you getting now while benching?


ya know, I am not sure to be honest. I was just doing the typical thing: load up the Heaven benchmark, go do something, and come back 15-20 minutes later to see where it's at. I set the fan speed to 100%; if I remember right, at one point it throttled to like 2088MHz, but it stayed around 2100MHz for most of it. 

I did think about replacing the thermal paste with something like Thermal Grizzly Kryonaut, maybe even Conductonaut (although liquid metal scares me lol). 
I know you said you didn't think the SC had a very good cooler. Do you think maybe your cooler was faulty, or maybe just needed a re-application of thermal paste? I've certainly gotta do more testing to come to anything conclusive, but mine seems to do a pretty good job cooling. At 100% the fans are a small jet engine, but it spins at a higher rpm and stays quite a bit cooler than my old MSI Gaming 980 Ti ever did, heh.

Also, there's a guy locally selling a Gigabyte Aorus GTX 1080 Xtreme for $350. I was thinking about trying to flip my current card and pick up that one, but I was trying to decide if it would really be worth the time/effort/price (I got my SC 1080 for $300). Not to mention that Aorus card is a BEAST: a three-slot card that looks like it weighs as much as an actual brick LOL


----------



## white owl

The cooler isn't very good; that's not really just my opinion. Better than a blower, but worse than most other coolers. A cooler can't really be faulty and make it past testing unless it's damaged later on. It's pretty hard to mess up re-pasting yet another card, especially since the cooling performance was better afterward. I don't even think it's possible to mess up re-pasting a GPU: the smallest dot will push out the sides of the die, and the cooler mounting isn't user configurable. Not sure why you'd think something is wrong with my card just because I said the cooler isn't great; the SC is the cheapest of the AIB cards and has historically had one of the worst coolers if you compare thermals.

Obviously you can't use LM on these coolers, they're aluminum. There's nothing scary about LM, it's not explosive or cursed... user error is a totally different thing though.

You shouldn't trade a 2100MHz card for anything but a 1080 Ti and up. Best case for trading for a different card, you get one that does 2150MHz on air (which makes almost no difference in performance over 2100MHz or 2050MHz). Worst case, you get one that clocks worse.


----------



## TwilightRavens

doom3crazy said:


> Hi guys. I am now officially an owner of a EVGA SC gtx 1080. I am a little late to the game(came from a 980 ti) but I am excited. I wanted to go over a couple things with you guys.
> 
> 1. From what I remember reading that because there are no custom bios with pasal and it's pretty locked down voltage wise etc that the best way to get the best overclocks is keeping thermals down. Is this true?
> 
> 2. ^^ If true, outside of keeping my side panel off and putting the card on water, does anyone have any cool diy cooling tips/tricks they wanna share? I currently have an old 3000rpm 80mm fan sitting atop the card over the VRM area(it was a little tip someone gave me from the maxwell days with my 980 ti)
> 
> 3. Does anyone have a certain driver they love and feel like performs the best/ is most stable with the 1080? I am on the latest 418.91 and it seems to be fine but of course if there's something better out there I want it.
> 
> 4. Does core clock or memory clock make the biggest difference in added fps? Right now without much tweaking I am stable at 2100mhz on the core and 5200mhz on the memory clock.
> 
> 5. Just double checking, there aren't any worthwhile custom bios for the gtx 1080, correct? Lol.


Yeah, gone are the days of BIOS mods, as Pascal requires the BIOS to be signed by Nvidia, and no one that I know of has broken the signing on these cards, outside of the “handful of people” who claim they have a tool to do it but show no evidence of it actually being done.

To add on to the part about memory overclocking: it's not entirely true that you can increase it without crashing. While you may be able to add even 1000MHz and be fine in Valley or Heaven, I have had it crash at +350MHz in 3DMark Fire Strike between loading the second graphics test and actually running it. It will crash, but Heaven and Valley won't usually show it, in my experience. Even when 3DMark hadn't detected anything, I have had Mass Effect Andromeda crash to desktop a few times, which is how I found out my VRAM will only go to +300MHz, even though the memory overclock scales up to about +400-450MHz before it starts to degrade performance, then again at +850-ish. I assume it's because after those thresholds the timings relax quite a bit.


----------



## doom3crazy

white owl said:


> The cooler isn't very good, not really my opinion. Better than blower but worse than most other coolers. A cooler can't really be faulty and make it past testing unless it's damaged later on. Pretty hard to mess up re-pasting yet another card, especially since the cooling performance was better afterward. I don't even think it's possible to mess up re-pasting a GPU, the smallest dot will push out the sides of the die and the cooler isn't user configurable. Not sure why you'd think something is wrong with my card just becasue I said the cooler isn't great, the SC is the cheapest of the AIB cards and has historically had one of the worst coolers if you compare thermals. Obviously you can't use LM on these coolers, they're aluminum. There's nothing scary about LM, it's not explosive or cursed...user error is a totally different thing though. You shouldn't trade a 2100mhz card for anything but a 1080Ti and up, best case for trading for a different card is you get 2150mhz on air (which makes almost no difference in performance over 2100mhz or 2050Mhz). Worst case, you get one that clocks worse.


Oh, I hope I didn’t offend you; that wasn’t the intention of the post. I was more thinking out loud, not questioning you. You know way more about these cards than I do haha. 

So can I ask: do you remember your before and after from applying new thermal paste? It seems like some people get really good results, and for others it ends up making no difference, or like a 1C difference heh.

Oh, I was also gonna ask: does your card have the thermal pad mod and vBIOS update? I guess when the FTW and SC cards first launched they were having major thermal issues. The vBIOS update changed the fan curve, and the added pads did something with the VRM I think. The kit is free from EVGA if your card qualifies. I guess my card is new enough that it shipped with the thermal pads and vBIOS update already installed, or at least that’s what the website told me when I put in my serial #.

Also, I ran the Heaven benchmark again. I let it go for a full half hour and here's what I concluded:
My card boosted up to 2100MHz sometimes, with my average between 2066MHz-2088MHz. In terms of cooling (and I am eager to hear what your thermals are like), with whatever stock paste is on it I never broke above 56C. Average was around 54C with the fan at 100%. I do live in a pretty chilly basement and it's winter time, so I am sure those numbers will come up some during the summer. I also took your advice on the voltage and didn't give it any extra in AB, and it seemed stable. So, like your card, it may not really make any difference.


----------



## doom3crazy

TwilightRavens said:


> Yeah gone are the days of BIOS mods as Pascal requires the BIOS be signed by Nvidia and no one that I know of has broken the encryptions on the cards outside of the “handful of people” that claim they have a tool to do it but show no evidence of it actually being done.
> 
> To add on to the part about memory overclocking, It not entirely true that you can increase it without crashing, while you may be able to add even 1000MHz to it and it be fine in Valley or Heaven, I have had it crash at +350MHz in 3dmark Firestrike between loading the second graphics test and actually going through it. It will crash but Heaven and Valley won’t usually show it that I have noticed. If 3dMark hasn’t detected it I have had Mass Effect Andromeda crash to desktop a few times finding out my VRAM will only go to +300MHz even though memory overclock will scale up to about 400-450MHz before it starts to degrade performance, then again at +850 ish. I assume it’s because after those thresholds the timings relax quite a bit.


Yeah, even though this card is quite a bit more powerful than my old 980 Ti, I do kinda miss it haha. It was a beast on air: I had it loaded with a custom BIOS and was stable at just over 4000MHz on the memory and 1516MHz on the core. One thing I do like about my 1080 compared to Maxwell is the power draw; I was definitely pulling some wattage on my 980 Ti with that overclock.


----------



## TwilightRavens

doom3crazy said:


> Yeah even though this card is quite a bit more powerful than my old 980 ti , I do kinda miss it haha. It was a beast on air. I had it loaded with a custom bios and I was stable at just over 4000mhz on the memory and 1516mhz on the core. One thing I do like about my 1080 card though compared to maxwell is the power draw. I was def pulling some wattage on my 980 ti with that overclock I had on it.


I feel you there, it was fun tweaking the BIOS on the pair of spare 660's I had in my other build before I turned it into a server. I can relate on the power draw: try a slightly modified LN2 BIOS on an R9 290X lol, 525W on just the card alone, and it dumped about 90% of that heat right back into the case. The 1080 is a godsend compared to Maxwell and Hawaii in terms of performance per watt. On the 1080 I recommend setting the fan speed as high as you can stand starting at 55C-60C if you want the best boost clock in the long run; I did that and the max mine gets to is 2100MHz, but it usually hangs around 2066MHz-2088MHz at 63-65C ish.
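A custom curve like that is just a handful of (temp, duty%) points with linear interpolation between them, which is essentially what Afterburner's curve editor does. A sketch, with made-up points in the spirit of the "ramp hard from 55-60C" advice above:

```python
def fan_duty(temp_c, points):
    """Linearly interpolate fan duty (%) over a sorted list of
    (temp_c, duty_pct) points, clamping at both ends."""
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# Hypothetical curve: quiet until 55C, then ramp hard through the 60s.
curve = [(30, 20), (55, 40), (65, 75), (80, 100)]
print(fan_duty(60, curve))  # 57.5, halfway up the 55->65 ramp
```

The steep middle segment is the point: the fans get ahead of the temperature before GPU Boost starts shedding clock bins in the high 60s.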


----------



## doom3crazy

TwilightRavens said:


> I feel you there, it was fun tweaking the bios on the pair of spare 660's I had in my other build before I turned it into a server. I can relate to the power draw, try a slightly modified LN2 bios on a R9 290X lol, 525W on just the card alone and it dumped about 90% of that heat right back into the case. The 1080 is a godsend compared to Maxwell and Hawaii in terms of performance per watt, though on the 1080 I recommend setting the fan speed as high as you can stand starting at 55C-60C if you want to have the best boost clock in the long run, did that and the max mine gets to is 2100MHz but usually hangs around 2066MHz-2088MHz at 63-65C ish.


Yeah, that's one thing AMD has got to get sorted out: their cards' power draw is too much. I saw a really interesting video by GamersNexus where he took a Vega 56 with a modified power table that put it at like 220% or something crazy, and he was able to overclock it to match and sometimes beat an RTX 2070, and it was about on par with a GTX 1080, BUT MANNNN, the power draw on that was just dumb stupid haha. 

So this is a random question for you (and anyone else reading) while somewhat staying on the topic of the "1080": there is a guy locally selling an EVGA FTW3 1080 Ti for $400. Mad cheap, right? So far he's seemed really genuine and upfront about using it for mining; he said for about 8 months. Would that be worth trying to pick up, orrrr... I don't know enough about mining to decide whether that card has had too much use on it or whatever.


----------



## TwilightRavens

doom3crazy said:


> Yeah that's one thing AMD has got to get sorted out. Their cards power draw is too much. I saw a really interesting video by gamernexus where he took a vega 56 with a modified power table that put it at like 220% or something crazy and he was able to overclock it to match and sometimes beat a rtx 2070 and it was about on par with a gtx 1080 BUT MANNNN, the power draw on that was just dumb stupid haha.
> 
> So this is a random q for you(and anyone else that reads this) while somewhat staying on topic of "1080" there is a guy locally that is selling a evga ftw3 1080 ti for 400$. mad cheap right? for the moment, he's seemed really genuine and upfront about using it for mining. He said for about 8 months. Would that be worth trying to pick up orrrr.... I don't know enough about mining to decide whether that card has had too much use on it or whatever.


I mean, it's about a 50/50 shot of getting a good or bad card. On one hand you could end up with a perfectly fine 1080 Ti for $400; on the other hand you could buy it and have it die a day, a week, or a month down the road. I have bought 4 cards that were used for mining and none of them has given me a single issue. I can't speak for everyone else, but if you have $400 I say go for it. If it doesn't work, or isn't exactly as described, you're covered by eBay's buyer protection and can get a full refund, regardless of whether the seller accepts refunds.

I shouldn’t say zero issues. My R9 290X, when I bought it (got it for $274), would throttle on the stock MSI cooler (it was not a blower style), and I ended up buying an AIO and turning it into a hybrid card. The Galaxy GTX 660 I got works fine (bought it for $79), except it overclocks like balls (most 660's do though). The EVGA 660 was an auction listed as not working (for parts) and I didn't expect it to work at all (won the auction for $22.50); I took it apart without testing it, gave it a bath in 91% isopropyl alcohol, repasted it, and put it back together and into my system. Loaded up Heaven, ran it for hours, and had zero issues whatsoever. It still works, and it even has a BIOS mod to allow a 150% power limit instead of the default 110%. The 1080 I bought ($371 shipped) was a hair pricey for the condition it arrived in, but it works nonetheless. You can tell the card had been mined on at full load for at least a year (dark stains on the PCB, dust caked in the heatsink fans, TIM like chalk), but it worked fine as far as I can tell, and still does. So if you do buy it, don't go in expecting a 100% perfect card or you will not be impressed. Keep your expectations low, and when you get it, clean it up before you even test it (just in case; dust can kill circuitry really quickly because of the heat it holds in). Do that and you'll be fine.

By clean I don't mean take it all the way apart yet; just get as much dust out as possible without disassembly, test it, and if it works then you can take it apart and deep clean it as I mentioned above. That way, if it doesn't work afterwards, you know it's on you instead of thinking the seller ripped you off. My rule for any hardware (used or brand new): run it at 100% stock settings for a week to verify it works exactly as stated, then do whatever you want to it.

Lastly, the reason Vega has such high power draw is that the architecture wasn't designed around high clocks; it's a really efficient chip at lower frequencies, and it draws so much because AMD had to clock it high to even compete with the mid-to-high-end 10-series SKUs. Look at the Vega APUs: the entire package (including the CPU itself) is what, 65W? And it beats a GT 1030 most of the time. Had they not clocked it so high it would be a much more efficient GPU, with performance right around where Polaris sits. Polaris is kind of the same way, honestly; a high-end Polaris model probably would have drawn even more power. As for Hawaii (290X/390X), those draw so much power because they weren't on a FinFET process (which drastically reduces leakage) and they had a 512-bit memory bus, twice the bus width of a GTX 1080, which lands them at about the same total memory bandwidth even though their GDDR5 runs at half the data rate of the 1080's GDDR5X.
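To put rough numbers on that bandwidth point (back-of-the-envelope math from the cards' stock spec-sheet numbers, nothing measured):

```python
# Peak theoretical memory bandwidth = bus width (in bytes) * effective data rate.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# R9 290X:  512-bit bus,  5 Gbps GDDR5
# GTX 1080: 256-bit bus, 10 Gbps GDDR5X
print(bandwidth_gbs(512, 5.0))   # 320.0
print(bandwidth_gbs(256, 10.0))  # 320.0
```

Same 320 GB/s either way, which is exactly why Hawaii needed that huge bus despite the slower GDDR5.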

Maxwell is basically in the same boat as Vega: at appropriate clock speeds it's really efficient, but its voltage scales much better with clocks than Vega's does. Pascal (for the sake of simplicity) is a die shrink of Maxwell; going from 28nm to 16nm did a lot for power consumption, since the process was designed for power savings while still being able to clock high. Turing, not so much: 12nm did almost nothing for it in terms of power consumption, and it mainly increased IPC while retaining roughly the same clock speeds, plus you have all those extra features (RT cores, Tensor cores).


----------



## white owl

doom3crazy said:


> Yeah that's one thing AMD has got to get sorted out. Their cards power draw is too much. I saw a really interesting video by gamernexus where he took a vega 56 with a modified power table that put it at like 220% or something crazy and he was able to overclock it to match and sometimes beat a rtx 2070 and it was about on par with a gtx 1080 BUT MANNNN, the power draw on that was just dumb stupid haha.
> 
> So this is a random q for you(and anyone else that reads this) while somewhat staying on topic of "1080" there is a guy locally that is selling a evga ftw3 1080 ti for 400$. mad cheap right? for the moment, he's seemed really genuine and upfront about using it for mining. He said for about 8 months. Would that be worth trying to pick up orrrr.... I don't know enough about mining to decide whether that card has had too much use on it or whatever.


AMD cards, when OC'd properly, usually end up with similar power draw and thermals to a comparable Nvidia card. AMD hasn't put solid effort into the gaming GPU side for some time, so it's not surprising that their cards ship with way too much voltage and TDP from the factory and can go faster on less than stock power. Stock vs stock, yeah, it looks pretty bad.





Mining is only bad for GPUs if it was done with extra voltage. Anyone trying to maximize profits is likely running around 70% TDP with less voltage and a lower clock speed. Even at stock, two _years_ of mining still isn't that hard on a GPU. I'd rather buy a mining card from a smart person than a gaming card from an idiot.


----------



## white owl

What's a typical offset for memory on these cards?
I blew my 1080 apart trying to reduce thermals (success), and after going back to overclocking I found I could take the memory up to 5600MHz and still gain FPS in Heaven. Is that normal? I feel like it didn't do that last time. Does this look right to you?


----------



## TwilightRavens

white owl said:


> What's a typical offset for memory on these cards?
> I blew my 1080 apart trying to reduce thermals (success) and after going back into the overclocking I found that I could get up to 5600mhz and still gain FPS in heaven. It that normal? I feel like it didn't do that last time. This look right to you?


Test it in Firestrike and Valley if you haven’t already, those for me anyway have been the most reliable in determining scaling.


----------



## white owl

That did it. Does 1325MHz sound right? That's what GPU-Z shows.


----------



## TwilightRavens

white owl said:


> That did it. Does 1325Mhz sound right? Shown in GPUz.


That sounds more like it; that works out to a 10.6Gbps effective data rate (5.3GHz as some tools show it), which is right in line with where most of them sit.
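For anyone confused by the three different numbers people quote for GDDR5X memory, the arithmetic (as I understand the conventions, worth double-checking against your own tools) is just:

```python
# GDDR5X clock conventions: GPU-Z reports the real memory clock; some OC
# tools display a 4x figure; the marketing/effective data rate is 8x.
def gddr5x_rates(gpuz_mhz):
    """Return (tool-shown MHz, effective Mbps per pin) for a GPU-Z reading."""
    return gpuz_mhz * 4, gpuz_mhz * 8

shown, effective = gddr5x_rates(1325)
print(shown, effective)  # 5300 10600, i.e. "5.3GHz" and "10.6Gbps"
```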


----------



## white owl

Ok thanks. I can never remember which speed everyone quotes, so I just say what's shown lol

What's weird is when I got this card the previous owner couldn't get much speed out of it. I got it and took it past 1900. I repasted and it ran cooler but didn't clock higher. Now I've pasted again, but with a lot of paste, and it drops like 10°C compared to how it was, and I get 2000MHz under load with the stock (super quiet) fan profile.


----------



## doom3crazy

white owl said:


> What's a typical offset for memory on these cards?
> I blew my 1080 apart trying to reduce thermals (success) and after going back into the overclocking I found that I could get up to 5600mhz and still gain FPS in heaven. It that normal? I feel like it didn't do that last time. This look right to you?



What kind of temps are you getting now? And what exactly did you do to get your temps lower?


----------



## white owl

Nothing really. I was taking measurements for a custom shroud, repasted, slapped the shroud back together and hastily stuck it back on, all immediately after tearing the case down to the last piece so I could (literally) give it a shower and redo the fan layout. I wish I'd sanded a corner off the cold plate to see whether it's polished aluminum or nickel-plated copper. After looking at it a second time I swear they took the cooler from the 980 FTW, stuck it on the SC, and gave the BIOS a 120% limit so no one would notice... which didn't work. The stock cooler is so bad they had to issue a BIOS update after launch to ramp up the fan speed.
I'd like to get some static-pressure-oriented case fans to build a better shroud with. Half the problem is they used 2 high-speed 92mm fans in a cooler that would easily fit 3.
The first time I took it apart I just repasted it; that was good for almost 10°C on my 980. When I booted back up it was 5°C lower at the same speed. This time, instead of neatly applying the paste, I put a big blob in the middle, the amount I'd use on a CPU.


----------



## doom3crazy

white owl said:


> Nothing really. I was taking measuments for making a custom shroud, repasted, slapped the shroud back together and hastily stuck it back on. All that immediately after tearing the case down to the last piece so I could (literally) give it a shower and re-doing the fan lay out. I wish I'd sanded a corner off the cold plate to see if it were polished aluminum or nickel plated copper. After looking at it a second time I swear they took the cooler from the 980 FTW and just stuck it on the SC then gave the BIOS a 120% limit so no one would know...which didn't work. The stock cooler is so bad they had to issue a BIOS update to ramp up the fan speed after launch.
> I'd like to get some SP oriented case fans to make a better shroud with to help improve it. That's half the problem is they used 2 high speed 92mm fans in a cooler that would easily fit 3.
> The first time I took it apart I was just repasting it, it was good for almost 10c on my 980. When I booted back up it was 5c lower with the same speed. This time instead of neatly applying the past I put a big blob in the middle...the amount I use on a CPU.


How long have you had your SC 1080? Are you aware of the thermal pad mod? Mine is newer so it came pre-installed, but if your serial number qualifies (i.e. the card shipped without it), EVGA will send you the kit for free. This plus the VBIOS update they rolled out helped a ton with thermals as far as I know.


----------



## white owl

Mine has pads on the fake backplate and on the inner plate. They're already super greasy... no idea why they use pads they know do this.

I've been through a few BIOSes already lol
I might try the Palit one again since the card is cooling so much better now. With the fans up high I can keep it well under 70°C.


----------



## TwilightRavens

I mean, idk, I don't have any issues with the SC cooler on mine; a custom fan profile keeps it in the lower 60s, which is damn good for an air-cooled card (that's where my watercooled 290X sat with its LN2 BIOS). I did repaste it though, because that's something I always do when I buy second-hand GPUs. Once I get a new tube of Kryonaut I'll probably use that to replace the crappy Cooler Master paste that's on here, and it'll probably drop a few more degrees.


----------



## doom3crazy

Okay guys, I need some help with something I just can't figure out. I ran the Firestrike benchmark and scored 18,869, WAY below what it should be. I randomly watched a video from a YouTuber named Tech YES City, and with his X58 CPU and a 980 Ti Strix he scored just under 21,000. Something isn't right.

I was wondering if we might be able to troubleshoot this and see what's going on, whether it's some weird BIOS settings I have or what.

So I've got my X5675 clocked at 5GHz, 32GB of G.Skill Ripjaws at 1866MHz, an Asus Rampage III Gene, and of course my EVGA SC GTX 1080. I am so confused. Can anyone tell me why I am scoring so low?


----------



## TwilightRavens

doom3crazy said:


> Okay guys I need some help with something I just can't figure out. I ran firestrike benchmark and scored 18,869. WAY below what it should be. I randomly watched a video from a youtuber named tech yes city and with his x58 cpu and a 980 ti strix he scored just under 21,000. Something isn't right.
> 
> I was wondering if we might be able to troubleshoot this and see whats going on. If it's some weird bios settings I have going on or what.
> 
> So I've got my x5675 clocked @5GHZ. I've got 32gb of g skill ripjaws @1866mhz. Asus rampage gene III. And then of course my evga sc gtx 1080. I am so confused. Can anyone tell me why I am scoring so low?


I mean, even at 5GHz, X58 chips will still bottleneck a GTX 1080, but not by enough to really worry about. How are the timings on your RAM? If they are running at CL11, that may be why. Is your CPU running at a decent temp? If it's overheating, that could be another reason. Lastly, are you running the RAM in triple channel? I'm not sure how 4 sticks (if that is what you have) would affect it, but I know on dual-channel platforms if you run 3 sticks it'll sometimes revert to single channel.


----------



## doom3crazy

TwilightRavens said:


> I mean even at 5GHz X58 chips will still bottleneck a GTX 1080, but not by enough to really worry about. How are the timings on your RAM? If they are running at CL11 then that may be why. Is your cpu running at a decent temp, if it’s overheating that could be another reason. Lastly are you running the RAM in triple channel? I’m not sure how 4 sticks (if that is what you have) would affect it but I know on dual channel platforms if you run 3 sticks sometimes it’ll revert to single channel.



So I think my timings are at CL10 right now? I've got 32GB of these:
https://www.newegg.com/Product/Product.aspx?Item=N82E16820231615

They're rated CL9, but I think I loosened the timings to get a higher overclock. Also, at 5GHz I am definitely pumping some volts into the CPU, but it's on water and I'm seeing high 80s during the Firestrike test; Tj max is 96°C, so I shouldn't be throttling, right?

Lastly, according to this diagram I followed, my 4 sticks should be running in triple channel:
https://www.overclock.net/photopost/data/711476/e/ec/ece8148a_DDR3TRIDUALCHANNEL.jpeg


With all this information, do you have any suggestions for me? I'm happy to try tighter timings, a lower core clock on the CPU, or even removing a stick of memory, etc. Thinking about it, I was having this issue with my 980 Ti too; I remember running Firestrike and getting just over 17,000 with that heavy overclock I mentioned before (1500MHz+ on the core, 4000MHz on memory).

Anyways, I appreciate the help. I am quite annoyed right now and I hope we can figure this out. It actually occurred to me that something was wrong when I was playing the brand new Metro Exodus and getting less-than-stellar FPS at 1080p and 1440p despite following an optimization guide. I know the game is a heavy hitter, but my numbers didn't match what other people seem to be getting.


----------



## TwilightRavens

doom3crazy said:


> So I think my timings are in cl10 right now? I've got 32gb of these:
> https://www.newegg.com/Product/Product.aspx?Item=N82E16820231615
> 
> They have a CL9 but I think I loosened the timings to get a higher overclock. Also, at 5ghz I am def pumping some volts into the cpu but it's on water and I am getting like high 80's during firestrike test but the tj maxx is 96c so I shouldn't be throttling right?
> 
> Lastly, according to this diagram I followed my 4 sticks should be running in triple channel:
> https://www.overclock.net/photopost/data/711476/e/ec/ece8148a_DDR3TRIDUALCHANNEL.jpeg
> 
> 
> With all this information, do you have any suggestions for me? I am good to try tighter timings and lower core clock on the cpu or even removing a stick of memory etc. When I think about it, I was having this issue with my 980 ti too. I remember running firestrike and getting just over 17,000 with that heavy overclock I mentioned before(1500mhz+ on the core, 4000mhz on memory)
> 
> Anyways, I appreciate the help. I am quite annoyed right now and I hope we can figure this out. It actually occurred to me that there was an issue when I was playing the brand new Metro Exodus game and in 1080p and 1440p getting less than stellar fps despite following an optimization guide. I know the game is a heavy hitter but my numbers didn't match what other people seem to be getting.


If you feel like you can go for tighter timings, I say go for it; or, even better, try a higher RAM frequency. If that kit can do 2133 at CL10 or CL11 it may bump you up some more; if not, see if you can tighten the timings. I have noticed that in a lot of games RAM speed and timings affect the 1% and 0.1% low FPS ranges. And yeah, if it isn't hitting 96°C then it shouldn't be throttling, but even hitting it for a split second can tank results.


----------



## doom3crazy

TwilightRavens said:


> If you feel like you can go for tighter timings I say go for it, or even better try for a higher RAM frequency if that kit can do 2133 at CL10 or CL11 it may bump you up some more, if not try seeing if you can tighten them. I have noticed in a lot of games RAM speed + timings can affect 1% and 0.1% low fps ranges. And yeah if it isn't hitting 96C then it shouldn't be throttling, but even if it hitting it for a split second that can tank results.



Man... I am stumped! I got a memory overclock of 2165 at CL10, ran Firestrike, and gained like 30 points. Then I tried going back to 1866 with tighter timings... and got a worse score. I can't seem to break even 19,000. Am I expecting too much from this system? I have my 1080 overclocked to +100 on the core (brings it to around 2100MHz) and +440 on the memory, which brings it to 5400MHz. What's an average Firestrike score for a 1080? I watched an old JayzTwoCents video where he was testing a Gigabyte G1 Gaming 1080 that scored 24,000 with a mild overclock, and he even had the FE card for comparison and it was still getting 23,000 at stock.

I just can't decide if this is a limit of my current hardware or if there's something acting up or not working how it's supposed to. I am currently running Win 7, if that helps. DirectX 11.


----------



## TwilightRavens

doom3crazy said:


> Man.. I am stumped! I got a memory overclock of 2165 with CL10, ran firestrike, gained like 30 points. Then I tried slowing it down back to 1866 but tighter timings.... got a worse score. I can't seem to break even 19,000. Am I expecting too much from this system? I have my 1080 overclock to +100 on the core(brings it to around 2100mhz) and then +440 on the memory which brings it to 5400mhz. Whats an average firestrike score for a 1080? I watched an old video with jayztwocents and in the video he was testing a gigabyte g1 gaming 1080 that scored 24,000 with a mild overclock but he even had the FE card for comparison and it was still getting 23,000 at stock.
> 
> I just can't decide if this is a limit of my current hardware or if there's something acting up or not working how it's suppose to. I am currently running win 7 if that helps. Direct X 11.


Here's mine on my Broadwell i7. I know it's not exactly helpful since it's 1st gen vs 5th gen, but this is what I get.


----------



## doom3crazy

TwilightRavens said:


> Heres mine on my Broadwell i7, I know not exactly helpful as its 1st gen vs 5th gen but this is what I get.



Oh okay. Well, that does help and makes me feel less crazy. I had a score of 18,869 haha. So I guess my system isn't doing half bad  lol

I am an idiot. I kept comparing my 18,000 overall score against other people's graphics scores. My graphics score was actually 23,200, and my CPU score was 17,000.


----------



## naveediftikhar

doom3crazy said:


> Oh okay. Well that does help and make me not feel as crazy. I had a score of 18,869 haha. So I guess my system isn't doing half bad  lol
> 
> I am an idiot. I keep referencing my 18,000 number to the actual graphics number. My graphics score was actually 23,200. And then my cpu was 17,000.


Haven't run Firestrike to test my card yet; maybe I will later today and post my results. However, I would suggest checking the PCI-E speed of the slot your card is installed in; your first-gen platform would honestly be bottlenecking there. I don't know what PCI-E configuration first gens are rated at, i.e. gen 2/3 at x8/x16, but I have a Xeon 1650 (i7 3930K equivalent) and out of the box it doesn't support gen 3. I had to force-enable it and saw a constant one-FPS difference... that's not much, but it is what it is. My Superposition scores are around 43xx. Maybe test with that, since it tests pure GPU prowess and you can avoid the bottlenecks brought forward by your first-gen hardware. Just some advice.


----------



## doom3crazy

naveediftikhar said:


> havent ran firestrike to test my card yet, maybe will to it later today and post my results...however i would suggest you checking the pcie speeds of the slot your card is installed into. i say cause your first gen would honestly be bottlenecking there. i dont know what pcie multiple first gens are rated at i.e gen 2/[email protected]/16 but i have an xeon 1650 (i7 3930k eq) and out of the box it doesnt support gen 3. i had to force enable it and saw a single fps constant difference...thats not much but is what it is. my superposition score are within 43XX. maybe test that as there the pure gpu proves is tested and u can avoid bottlenecks brought forward by ur first gen hardware. just an advice.



Thank you, I appreciate it. I did think of that. The X58 platform only has PCI-E 2.0 vs 3.0, so that could be affecting it some. I am pretty sure it's in the full x16 slot though.


----------



## naveediftikhar

doom3crazy said:


> Thank you I appreciate it. I did think of that. The x58 platform does only have pcie 2.0 vs 3.0. so that could be effecting it some. I am pretty sure its in the full 16 slot though.


You can check it in GPU-Z. However, since we are already here... I'm going through a stupid testing phase of my own. No matter what I have tried, it hasn't worked. In Far Cry New Dawn, with my specs, I'm getting a stupid CPU bottleneck (or maybe something else, but CPU is my guess). In the built-in benchmark I'm getting a measly 65 FPS average on my 1080, all Ultra at 1440p, while all the benchmarks show I should be getting 88 FPS. In Far Cry 5 I get 78 average, as is the norm, and the games are built on the same engine and everything. What do you think could be the issue?

I'm using the Afterburner OSD and the CPU never goes beyond 50-60% usage, same for the GPU.

Xeon 1650
16GB RAM, dual channel
1TB 7200RPM HDD
Gigabyte Windforce 1080 (2000MHz core, 5600MHz memory)


----------



## TwilightRavens

naveediftikhar said:


> you can check it in gpu z. however since we are already here...i going through a stupid testing phase of my own. no matter what i have tried it hasnt worked. in far cry new dawn with my specs m getting a stupid cpu bottleneck (or maybe something else, but cpu is my guess)..in the built in benchmark m getting a measly 65 fps avg on my 1080 all ultra @ 1440p while all the benchmarks show that i should be getting 88 fps. in far cry 5 i would get 78 avg as is the norm, and since the game is built on the same engine and everything. what do you think could be the issue.
> 
> m using the ab osd and cpu never goes beyond 50-60% usage, same for gpu.
> 
> xeon 1650
> 16 gb ram dual channel
> 1 tb 7200 rpm hdd
> gigabyte windforce 1080 (2000 mhz 5600 mhz)


Yeah, it is PCI-E 2.0; 3.0 didn't come around until Ivy Bridge (3rd gen) on Z77, and Sandy Bridge-E on X79 could be forced into it. My 790i board (LGA 775) also does PCI-E 2.0, and that has relatively little effect on the 1080: maybe a 5% difference in pure GPU benchmarks. The only GPUs really powerful enough to be bottlenecked by PCI-E 2.0 would be something like a Titan V or the RTX cards (2080 and 2080 Ti), and even those wouldn't be badly bottlenecked by it; even a 1080 Ti shows little difference between PCI-E 2.0 and 3.0. Most of the score discrepancy comes from lower IPC compared to newer generations. For example, you'd see a higher score on an 8700K or 7700K at 5GHz vs, say, a Ryzen 7 2700X at 4.2GHz in the Firestrike graphics department because of the better single-threaded performance; hell, even a moderately clocked 4790K would keep right up with those. But anything Sandy Bridge and prior starts to fall off on the single-threaded side. That doesn't mean they aren't usable anymore, you just won't see as high a score as you'd expect.
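For context on how small that 2.0-vs-3.0 gap is at x16, here's the per-direction theoretical bandwidth worked out from the spec's transfer rates and line encodings (a sketch that ignores protocol overhead beyond the encoding):

```python
# (transfer rate in GT/s, line-encoding efficiency) per PCI-E generation
SPECS = {
    "2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def pcie_gbs(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s."""
    gtps, eff = SPECS[gen]
    return gtps * eff * lanes / 8  # bits -> bytes

print(pcie_gbs("2.0", 16))            # 8.0 GB/s
print(round(pcie_gbs("3.0", 16), 2))  # 15.75 GB/s
```

So roughly double on paper, but as noted above, a single 1080-class card rarely saturates even the 2.0 x16 link.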


----------



## white owl

Maybe this will make you feel better.
Card was probably in the high 1900s with 1325MHz RAM, a 4790K at 4.7/4.4, 16GB 2400MHz CAS 11, STUCK AT PCI-E 3.0 x8... no idea why. It's been that way for a while now.
Firestrike isn't indicative of gaming performance, which is why you really shouldn't care about it and why this is the first time I've run it in years. Installed it just for you. Just look at the way the RTX cards and the Radeon VII score on it.

Gaming performance = gaming performance; if you wanna test it, get Fraps going.


----------



## TwilightRavens

white owl said:


> Maybe this will make you feel better.
> Card was probably in the high 1900s with 1325Mhz ram, [email protected]/4.4, 16gb 2400mhz cas11 STUCK IN 8x 3.0...no idea why. Been that way for a while now.
> Firesrike isn't indicative of gaming performance which is why you really shouldn't care about it and why this is the first time I've ran it in years. Installed it just for you. Just look at the way RTX and Vega7 scores on it.
> 
> Gaming performance=Gaming performance, if you wanna test it get fraps going.


Pretty much. This is a benchmark, so we've got to expect benchmark results; in no way is it directly comparable to actual gameplay.

To add on to mine, because I forgot to list what I was running: GTX 1080 at 2100MHz with VRAM at +300, i7 5775C at 4.4/4.0 (eDRAM at 2100MHz), 32GB of 2400MHz DDR3 CL11, and the card ran at full PCI-E 3.0 x16 bandwidth.


----------



## doom3crazy

white owl said:


> Maybe this will make you feel better.
> Card was probably in the high 1900s with 1325Mhz ram, [email protected]/4.4, 16gb 2400mhz cas11 STUCK IN 8x 3.0...no idea why. Been that way for a while now.
> Firesrike isn't indicative of gaming performance which is why you really shouldn't care about it and why this is the first time I've ran it in years. Installed it just for you. Just look at the way RTX and Vega7 scores on it.
> 
> Gaming performance=Gaming performance, if you wanna test it get fraps going.


I just started thinking, in regards to you trying to cool your card better and not being happy with the SC cooler: have you ever thought about picking up something like an Accelero cooler? I watched a couple of reviews and most people saw a 15-20°C drop. Seems pretty legit.

https://www.arctic.ac/us_en/accelero-twin-turbo-ii.html


----------



## doom3crazy

Well, now that I understand what's going on and I am not being a tard, I would say I am doing alright with my 1080 lol.
https://www.3dmark.com/3dm/34039668


----------



## white owl

doom3crazy said:


> I just started thinking in regards to you trying to cool your card better and not being happy with the SC cooler. Have you ever thought about picking up something like an Accelero cooler? I watched a couple reviews and most people had between a 15-20c difference. Seems pretty legit.
> 
> https://www.arctic.ac/us_en/accelero-twin-turbo-ii.html


Nah, you couldn't get enough extra speed out of it to make up the difference between a 1080 and a 1080 Ti, but selling the card and spending roughly another $100 would. Might gain 20MHz? Not worth it.


----------



## white owl

Anyone wanna do Superposition?
17058, 1080p Medium


----------



## naveediftikhar

white owl said:


> Anyone wanna do superposition?
> 17058, 1080p Medium


LOLZ... let's do Superposition 1080p Extreme. Mine is 43xx; it goes to 44xx when the card is cooler and the PC has just booted.

Gigabyte Windforce 1080, specs 2000MHz core, +600 on the VRAM.

BTW, has anyone noticed that the more you OC the VRAM, the more the card power-throttles? I tried +700 and yes, it did produce better results (1 FPS more), but the card power-throttled a lot. Just makes me wonder if I could ever remove the power limit. Sadly I don't know if there are any compatible BIOSes out there for a single 8-pin card (tried the G1 BIOS... literally did not get a single FPS of difference).


----------



## TwilightRavens

white owl said:


> Anyone wanna do superposition?
> 17058, 1080p Medium


Yeah, I'll run it when I get home today, if I remember to.


----------



## doom3crazy

white owl said:


> Nah, you couldn't get enough speed to make up the difference between a 1080 and a 1080TI but spending -/+$100 and selling the card would. Might gain 20Mhz? Not worth it.



Ah okay, yeah, that makes sense. So I had a question for you regarding the Palit BIOS you mentioned before. I thought about giving it a whirl, since my card is cooling pretty well at the moment, and just wanted to know more about it.

I noticed I can add +120MHz to the core before I crash; if I go 125-130 I crash. Am I not able to reach those numbers because of voltage? Or is it a power limit? I remember you mentioning the Palit BIOS increases the total TDP. You did say something about fan speed though. Does the Palit BIOS basically put a hard limit on the max RPM the fans can spin up to, as opposed to what they can do on the stock EVGA BIOS?


----------



## TwilightRavens

New personal record! I finally stabilized DDR3 2400MHz with all 32GB and gained 200 points over what I was getting at 2000MHz, finally breaking 19,000. Here is the site verification for more details: https://www.3dmark.com/3dm/34070935?


----------



## TwilightRavens

@white owl there's mine.


----------



## naveediftikhar

So I was running Superposition 1080p Medium last night and found that this thing uses the CPU as well... like, woah. :S Anyways, I ended up with 15k-something on my Xeon 1650 clocked at 3.9 all-core.


----------



## TwilightRavens

naveediftikhar said:


> so was running superposition 1080p medium last night, and found that this *****  uses cpu as well...n like..woah..:S...anyways..i ended up with 15k something on my xeon 1650 clocked at 3.9 all...


Yeah, I think it favors clock speed over cores, but I am not 100% on that, because white owl has me beat by 300MHz on his 4790K and I have 2MB less L3, but I still came out ahead. And the IPC gap from Haswell to Broadwell is only like 3-5%, if even that.


----------



## naveediftikhar

TwilightRavens said:


> Yeah I think it favors clockspeed over cores, but I am not 100% on that because white owl has me beat by 300MHz on his 4790K and I have 2MB less L3 but still came out ahead. And the IPC gap from Broadwell to Haswell is only like 3-5% if even that.


You are right... it does favor clock speed. I can say that because I constantly monitor my system and performance through Afterburner, and it only really stresses a single core, with very little impact on the second one and absolutely nothing on the remaining 4 cores. You pulling ahead could be due to your GPU's sustained clocks.


----------



## TwilightRavens

naveediftikhar said:


> you are right...it does favor clock speed. i can say that cause i monitor my system and performance constantly through AB and it only really stresses 1 single core with very little impact on the second one and absolutely nothing on the remaining 4 cores. while you pulling ahead cut be due to the gpu sustained clocks.


Could be part of it, but compare our Physics scores in Firestrike; I very highly doubt it's the L4 cache on mine doing that there.


----------



## white owl

doom3crazy said:


> Ah okay yeah that makes sense. So I had a question for you regarding the Palit bios you mentioned before. I thought about maybe giving it a whirl because my card was cooling pretty well at the moment and just wanted to know more about it.
> 
> I noticed I can add +120mhz to the core before I crash. If I go 125-130 I crash. Am I not able to reach those numbers because of voltage? Or is it a power limit? I remember you mentioning the palit bios increases the total TDP. You did say something about fan speed though. Does the palit bios basically put a hard limit on the max RPM the fans can speed up to as opposed to what they can spin up to on the stock evga bios?



FYI, to know what speeds you're talking about, you need to quote the effective speed instead of your offset. +200 in NV Inspector is different from +200 in Afterburner. Same for RAM: some tools offset against the effective speed and some against the actual speed.
Even if we were all using Afterburner, +200 on your card isn't the same as +200 on my identical card. Plus, boost changes based on temperature, so you might not even be getting the full +200.
The speed most of us go by is what's reported in the GPU-Z sensors tab after being under load for a while, so boost and thermals have leveled off.

The SC cooler could probably handle the Palit BIOS if the fan speed were there. The Palit is limited to a 2400RPM fan speed while the SC fans spin up to 3000RPM; they'll still work, but they'll never spin faster than what the Palit BIOS allows. To my ear, the Palit BIOS limited the EVGA fans to about 72%.
The extended TDP might enable higher clocks, but the real benefit is that the card runs effectively TDP-unlimited, so it will bench higher at the same speeds.
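That offset ambiguity is easy to see in the arithmetic. A quick sketch (the base clock is the 1080's stock GPU-Z reading; the offset value here is made up for illustration):

```python
GDDR5X_STOCK_MHZ = 1251  # what GPU-Z reports for a stock GTX 1080

def memory_readings(real_offset_mhz):
    """The same physical clock bump expressed in the three common conventions."""
    real = GDDR5X_STOCK_MHZ + real_offset_mhz
    return {
        "gpu-z": real,             # actual memory clock
        "4x shown": real * 4,      # what some OC tools display
        "8x effective": real * 8,  # the Gbps-style marketing number
    }

print(memory_readings(74))  # real 1325 -> shown 5300 -> effective 10600
```

Which is why "+200 on the memory" in one tool isn't "+200" in another: the same real-MHz bump shows up 4x or 8x larger depending on the convention the tool uses.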


----------



## doom3crazy

TwilightRavens said:


> New personal record! I finally stabilized the DDR3 2400MHz with all 32GB and gained 200 points compared to before I was running it at 2000MHz, finally breaking 19000 points. Here is the website verification for more details: https://www.3dmark.com/3dm/34070935?


Dude, that's awesome. What speeds were you running on the core and memory? You gained 200 points with just 400 extra MHz on the RAM?



white owl said:


> Fyi to know what speeds you're talking about you need to say the effective speed instead of your offset. +200 in NVinspector is different than +200 in Afterburner. Same for ram, some are offsetting according to the effective speed and some are doing the actual speed.
> Even if we were all using afterburner +200 on your card isn't the same as +200 on my identical card. Plus there's the fact that boost changes based on temperature so you might not even be getting the full +200.
> The speed most of us go by is what's reported in GPUz sensors tab while under load for a while so boost/thermals have leveled off.
> 
> 
> 
> The SC cooler could probably handle the Palit BIOS if the fan speed was there. The Palit is limited to a 2400RPM fan speed while the SC fans spin up to 3000RPM, they'll still work but they'll never spin faster than what the Palit BIOS allows. To my ear the Palit BIOS limited the EVGA fans to about 72%.
> The extended TDP might enable higher clocks but the real benefit is allowing the card to be TDP unlimited so it will bench higher with the same speeds.


Ah okay. I will have to run some benchmarks again and watch that in GPU-Z. Thanks for letting me know.

Do you know where I might be able to download and try the Palit BIOS? I am familiar with nvflash and all that, as I used to flash the BIOS on my old 980 Ti all the time. Is there anything special about putting the Palit BIOS on the SC card?


----------



## white owl

Same process.


----------



## TwilightRavens

doom3crazy said:


> Dude thats awesome. What speeds were you running on the core and memory clocks? You gained 200 points with just 400 extra mhz on the ram?
> 
> 
> 
> Ah okay. I will have to run some benchmarks again and open gpuz and look at that. Thanks for letting me know.
> 
> Do you know where I might be able to download and try the Palit bios? I am familiar with nvflash and all that as I used to flash bios on my old 980 ti all the time. Is there anything special in regards to putting the palit bios on the SC card?


2100MHz on the core (+100MHz in Afterburner) and 1325MHz on the RAM (+300MHz in Afterburner). Yeah, I didn’t think the RAM would matter, but it appears it does on DDR3 platforms; can’t speak for the DDR4 ones.


----------



## doom3crazy

TwilightRavens said:


> 2100MHz on the core (+100MHz in Afterburner) and 1325MHz on the RAM (+300MHz in Afterburner). Yeah, I didn’t think the RAM would matter, but it appears it does on DDR3 platforms; can’t speak for the DDR4 ones.


I've been trying to break 19k overall and 24k on the graphics score. Did overclocking your RAM help mostly the graphics score, or did it help the combined and/or physics scores as well? I've got G.Skill Ripjaws 1866 but I can't seem to get above 2000MHz on all 32GB. Any advice for getting a stable overclock on RAM?


----------



## TwilightRavens

doom3crazy said:


> I've been trying to break 19k overall and over 24k on the graphics score. Did overclocking your ram help mostly the graphics score? Or did it help combined and or physics as well? I've got g skill ripjaws 1866 but I can't seem to get above 2000mhz on all 32gb's. Any advice for getting a stable overclock on ram?


No effect on the graphics score (within margin of error); the physics and combined scores went up. RAM overclocking varies from platform to platform: on Z97 (Haswell and Broadwell) it's all about System Agent voltage, IOA, IOD, cache voltage and input voltage; on Z77 (Ivy Bridge) it's VTT and northbridge voltage. But really, I'd start with VTT (if you have it on yours) and memory controller voltage before increasing DRAM voltage; 9 times out of 10 that will stabilize it. The thing is, too much voltage can make it just as unstable as too little. If neither those voltages nor DRAM voltage can do it, then your IMC just isn't up to snuff for it.


----------



## doom3crazy

TwilightRavens said:


> No effect on Graphics score (within margin of error), Physics score and combined score went up. For RAM overclocking it varies from platform to platform, on Z97 (Haswell and Broadwell) its all about System Agent Voltage, IOA, IOD, Cache Voltage and Input Voltage. For Z77 (Ivy Bridge) VTT and Northbridge voltage. But really I’d start with VTT if you have it on yours and Memory Controller voltage first before increasing DRAM voltage, 9 times out of 10 that will stabilize it, thing about it is too much voltage can make it just as unstable as too little voltage. If neither those voltages or DRAM voltage can do it then that just means your IMC just isn’t up to snuff for it.


After all my reading and research it does seem most 1080 cards clock about the same, BUT Pascal is all about temps: keeping the card cool gets you the most consistent speeds. Now I am trying to think of ways to keep my card cooler short of putting it under water. I've thought about repasting it with some Kryonaut, since it seems that liquid metal is out of the question. I've even thought about building one of those DIY AC units I've seen tutorials for on the internet and having it blow into my case. I live in a very dry climate, so I can get away with it without any condensation collecting inside the case.
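The "Pascal is all about temps" behaviour can be roughly modelled: GPU Boost 3.0 steps the boost clock down in ~13 MHz bins as the core crosses certain temperatures. The bin size and thresholds below are approximate community observations, not an NVIDIA specification:

```python
# Rough model of GPU Boost 3.0 temperature behaviour: the boost clock
# drops by ~13 MHz each time the core crosses a temperature threshold.
# Bin size and thresholds are approximate community observations only.

BIN_MHZ = 13
TEMP_STEPS_C = [37, 46, 54, 63, 72]  # each crossing costs roughly one bin

def boosted_clock(max_boost_mhz: int, core_temp_c: float) -> int:
    """Estimate the sustained boost clock at a given core temperature."""
    bins_lost = sum(1 for t in TEMP_STEPS_C if core_temp_c >= t)
    return max_boost_mhz - bins_lost * BIN_MHZ
```

Under this model, shaving 15 °C off load temps can be worth a couple of bins of sustained clock, which is why cooling mods pay off on Pascal even when the offset stays the same.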


----------



## TwilightRavens

doom3crazy said:


> After all my reading and research it does seem most the 1080 cards clock about the same BUT Pascal is all about temps and keeping the card cool to get the most consistent speeds. Now I am trying to think of ways to keep my card cooler outside of putting it under water. I've thought about re pasting it with some kryonaut since it seems that the liquid metal is out of the question. I've even thought about making one of these diy ac units ive seen tutorials for on the internet and have it just blow into my case. I live in a very dry climate so I can get away doing it without any condensation collecting inside the case.


Or you could go the Kraken G10/12 + AIO route, did that with my 290X and went from 88C to 52C at stock speeds and voltages, 62C with a modified LN2 BIOS.


----------



## doom3crazy

TwilightRavens said:


> Or you could go the Kraken G10/12 + AIO route, did that with my 290X and went from 88C to 52C at stock speeds and voltages, 62C with a modified LN2 BIOS.


I thought of that for sure. The only issue would be space in my case. I've currently got a Corsair H115i AIO for my CPU with the rad mounted to the top of the case, and the front part of my case has all the hard drive bay slots.


----------



## doom3crazy

So as I was saying, I've got a Corsair H115i mounted to the top of my case for the CPU. Do you guys have any idea where I might be able to mount a rad for cooling my GPU via the AIO method? Here's my case:
https://www.bequiet.com/en/case/1182

It's the be quiet! Silent Base 600 series. I definitely have the hard drive bay slots used up, but I don't need the optical drive bays, so if those are removable I'd be down to take them out. The question is: where to exhaust the air?


----------



## white owl

Put the rad in the bottom, use both rads as intakes, and use the factory exhaust = best thermals.
Though I'd probably build an AliExpress loop with copper rads and a D5 first. Or simply sell the 1080 and upgrade to a 1080 Ti for about the same as it would cost to buy the CLC and the mounting bracket.


----------



## doom3crazy

white owl said:


> Put the rad in the bottom, use both rads as intakes and use the factory exhaust = best thermals
> Though I'd probably build an AliExpress loop with copper rads and a D5 first. Or simply sell the 1080 and upgrade to a 1080ti for about the same as it would cost to buy the CLC and the mounting thing.


I thought about that, but where do I put the power supply if I mount the rad on the bottom?

Also, I thought about the 1080 Ti thing, but I think it's a little more expensive. You can get a Corsair H50 for around $30 on eBay and then the Kraken G12 for $30, or go with the older G10 and it's even cheaper; I saw one on eBay for $15. I think the average 1080 Ti is still going for around $550.


----------



## white owl

Oh yeah I guess if you're buying used it's not that bad. 

In front of the PSU. You can't relocate the drives to the optical bay? Is there no fan mount there?


If you can't do that you can just use the exhaust mount. Running all positive pressure isn't a bad thing unless there's barely anywhere for it to get out.


----------



## doom3crazy

white owl said:


> Oh yeah I guess if you're buying used it's not that bad.
> 
> In front of the PSU. You can't relocate the drives to the optical bay? Is there no fan mount there?
> 
> 
> If you can't do that you can just use the exhaust mount. Running all positive pressure isn't a bad thing unless there's barely anywhere for it to get out.


Oh yeah, I guess I could do that on that bottom fan mount. I always forget about it because it's where I have all the wires coming from my PSU tucked away lol. I totally get what you're saying about spending money on something like that for the 1080. I just figured it could be a good long-term investment because I'll be able to use it on a 1080 Ti as well. Funny enough, I am keeping my eye out for the perfect 1080 Ti price and I will probably jump on it, BUT until the price is right I am pretty happy with the 1080, so I figure why not keep it cooler haha. I'd love to snag an EVGA SC2 1080 Ti for like $450, but that's probably wishful thinking (at least right now anyways).


----------



## TwilightRavens

On mine I had it mounted at the rear as an intake (where exhaust fans usually go) in push/pull on a Corsair H75i, with the top fan as an exhaust. CPU temps were a hair warmer that way, but by no means terrible: 83C under extreme loads (loads I'd never usually see), and now they're in the mid 70s. It's best to have the rad for the GPU as an intake, though, since Pascal sets boost clocks via temps; that nets the GPU a few degrees cooler at the expense of a few degrees on the CPU.


----------



## doom3crazy

TwilightRavens said:


> On mine I had it mounted at the rear as an intake (where exhaust fans usually go) in push/pull on a Corsair H75i and had the top fan as an exhaust, CPU temps were a hair warmer that way but by all means not terrible 83C under extreme loads (loads i'd never see usually) but now they are mid 70's. Best to have the rad for the GPU as an intake though since Pascal deals with boost clocks via temps and that would net a few degrees cooler at the expense of a few degrees on the CPU.


I was thinking about getting an H75 and, since it's a 120mm rad, putting it as an intake on the bottom 120mm fan mount. So did you have your GTX 1080 on an AIO? What kind of temps were you getting?


----------



## TwilightRavens

doom3crazy said:


> I was thinking about getting a h75 and then because its a 120mm rad fan size & putting that as an intake on the bottom 120mm fan intake. So did you have your gtx 1080 on an aio? what kind of temps were you getting?


Nah the H75 is still on my 290X, I haven't needed it on my 1080 to be honest after I gave the card an alcohol bath and repasted it.


----------



## doom3crazy

Yo, you guys are gonna laugh. So I did a thing, and then another thing, got lucky, and then this happened.


----------



## TwilightRavens

doom3crazy said:


> Yo you guys are gonna laugh. So I did a thing, and then another thing, got lucky, then this happen.


I was going to get a 1080 ti but $500 is still a bit much for my tastes lol, though 1070's are a really freaking good deal right now as some can be had for less than $200.


----------



## doom3crazy

TwilightRavens said:


> I was going to get a 1080 ti but $500 is still a bit much for my tastes lol, though 1070's are a really freaking good deal right now as some can be had for less than $200.


Haha, I totally hear you. I got really lucky, I think. After selling my 980 Ti for $220, then getting a 1080 for $330, then selling the 1080 for $400 and getting this Ti Black Edition for $500, my actual out-of-pocket cost was under $150, so that seemed like a no-brainer to me. I think $140?


----------



## white owl

Told ya lol
Congrats!


----------



## TwilightRavens

I thought about getting a 1070 for my old 775 build since they are cheap but it would almost not even be worth it.


----------



## doom3crazy

white owl said:


> Told ya lol
> Congrats!


Haha, thanks man. I am still thinking I am gonna put an AIO on this thing, because I figure I've gone this far; might as well try and give the card a good life with low temps and consistent performance.



TwilightRavens said:


> I thought about getting a 1070 for my old 775 build since they are cheap but it would almost not even be worth it.


Ya know, honestly, I am still impressed by that old platform. I have an Asus Maximus Formula II board that I did the LGA 775-to-771 mod on, and I am currently rocking a Xeon X5470 clocked at 4.5GHz with 16GB of DDR2. Just for the hell of it, I stuck in my 1080 before I sold it and it wasn't nearly as bad as I thought. There's some bottleneck right away, guaranteed (and I didn't try a game like GTA 5), BUT playing games like Far Cry 5, The Witcher, Doom, etc., I was pleasantly surprised to see the CPU wasn't fully bottlenecked. Not bad for a 4-core/4-thread CPU heh.

At 4.5GHz it was hovering around 70-80% usage on average.
I thought straight away it was just gonna be at 100%, straight up choking the life out of the graphics card. I even managed to get it stable at 4.7GHz. I went for 4.8 but it just wasn't having it. All on air too! (Granted, I do have one of those giant be quiet! heatsinks in that system.) And this was all at 1080p. I am sure the CPU usage would drop going to higher resolutions.

Anyways, it still blows me away. There was a user on here, I think his name was levithan? He was (and probably still is) rocking two 980 Tis in SLI with a heavy overclock and a Xeon X5470, and he was totally happy with the results playing in 4K haha.


----------



## TwilightRavens

doom3crazy said:


> Haha thanks man. I am still thinking I am gonna put an aio on this thing cause I figured I've gone this far, might as well try and give the card a good life with low temps and consistent performance.
> 
> 
> 
> Ya know honestly, I am still impressed by that old platform. I have an Asus Maximus Formula II board that I did the lga 775 to 771 mod to and I am currently rocking a Xeon x5470 clocked at 4.5ghz with 16gb of ddr2. Just for the hell of it, I stuck in my 1080 before I sold it and it wasn't nearly as bad as I thought. There's some bottle neck right away guaranteed(and I didn't try a game like GTA 5) BUT, playing games like far cry 5, the witcher, doom, etc I was pleasantly surprised to see the CPU wasn't quite fully bottle necked. Not bad for a 4core/4 thread cpu heh
> 
> At 4.5ghz it was hovering around like 70-80% usage on average.
> I thought straight away it was just gonna be at 100% straight up choking the life out of the graphics card. I even managed to get it stable at 4.7ghz. I went for 4.8 but it just wasn't having it. All on air too!(granted I do have one of those giant be quiet heatsinks in that system) And this was all at 1080p. I am sure the cpu usgage would drop going to higher resolutions.
> 
> Anyways, it still blows me away. There was a user on here I think his name was levithan? He was(and probably still is) rocking two 980 ti's in SLI with a heavy overclock and a Xeon x5470 and he was totally happy with the results playing in 4k haha.


Yeah, I am going to throw my Hybrid 290X into it once I get a much better PSU for it; otherwise it'll more than likely trigger overvolt protection. If the CPU still has some headroom available for a more powerful GPU, then maybe in the future I'll consider something a tad more powerful *cough cough* Fury X *cough cough*.


----------



## TwilightRavens

Has anyone tried the 11Gbps VBIOS from the EVGA GTX 1080 SC2 on the original 10Gbps EVGA GTX 1080 SC with any success? I was thinking about trying it if others have been successful in getting it stable.


----------



## thauch

TwilightRavens said:


> Has anyone tried the 11GBps vbios from the EVGA GTX 1080 SC2 on the og 10GBps EVGA GTX 1080 SC with any success? Was thinking about trying it if others have been successful in it being stable.




I want to know the same thing.




----------



## TwilightRavens

Okay, so after looking into it some more I don't think it will work, because it looks as if the SC2 has an 8-pin + 6-pin vs. the SC, which just has an 8-pin. But I'd love to be wrong about it.


----------



## kikimaru024

Upgraded from a water-cooled 980 Ti to a Gigabyte G1 Gaming 1080.

Not sure if I'll move the WC stuff over - think it's worth it? 
Or is Pascal good enough with heat/auto-OC to not bother with water?


----------



## confed

kikimaru024 said:


> Upgraded from a water-cooled 980 Ti to Gigabyte G1 Gaming.
> 
> Not sure if I'll move the WC stuff over - think it's worth it?
> Or is Pascal good enough with heat/auto-OC to not bother with water?


I had my air-cooled Zotac AMP 1080 for about 2.5 years before I decided to put it under water. Temps were too high for my liking and I was building a new rig (new CPU/mobo/memory/case), so I figured that was the best time. Very happy that I did. Temps are much lower, and the noise from my full loop is equal to or less than the noise from my GPU alone under heavy load.

To me, it's worth it from an aesthetic and temperature standpoint. Performance-wise it does OC slightly better, but nothing drastic, nor is the OC worth the $$ for me personally. Moving forward, I think I'll continue to put my cards under water for the visual and temperature benefits.


----------



## Carillo

Hi guys. I just bought an EVGA GTX 1080 SC ACX 3.0, onto which I tried to flash the XOC T4, Palit GameRock, and FTW2 BIOSes. My big problem now is that the memory is stuck at 405MHz on all three of those BIOSes. I have tried DDU and reflashed several times. I have also googled it and seen that many have had the same problem, but I found no proper explanation or solution, just RMA. When I flash back to the original BIOS, the memory boosts as normal. This is not the first time I've used nvflash; I have done this with at least 8 Pascal cards (1080 and 1080 Ti) before and have never experienced this. Does anyone have a clue?


----------



## TwilightRavens

Carillo said:


> Hi guys. I just bought an EVGA GTX 1080 SC ACX 3.0, onto which I tried to flash the XOC T4, Palit GameRock, and FTW2 BIOSes. My big problem now is that the memory is stuck at 405MHz on all three of those BIOSes. I have tried DDU and reflashed several times. I have also googled it and seen that many have had the same problem, but I found no proper explanation or solution, just RMA. When I flash back to the original BIOS, the memory boosts as normal. This is not the first time I've used nvflash; I have done this with at least 8 Pascal cards (1080 and 1080 Ti) before and have never experienced this. Does anyone have a clue?


You have an early GDDR5X card that can't handle the 11Gbps the newer chips can, which is what those BIOSes all target. The memory is reverting to a recovery state because it can't handle those speeds.


----------



## Carillo

TwilightRavens said:


> You have an early GDDR5X card that can't handle the 11GB/s that the newer chips can handle, which is what those BIOS all have. The memory is reverting to a recovery state because it can't handle those speeds.


Older than the T4 BIOS from 2016?


----------



## TwilightRavens

Carillo said:


> Older than the T4 bios from 2016 ?


Well, there are the older launch 10Gbps GDDR5X chips, which were in the first batch of Pascal cards. Then there were newer cards, still clocked at 10Gbps but capable of 11Gbps, which appeared right around when the 1080 Ti launched; some manufacturers offered 11Gbps BIOSes for the cards that had the newer chips. My guess, though, is that if you have a 2016 card you have first-gen GDDR5X, and that's why it reverts to recovery mode when you try to flash that. Try a BIOS with the VRAM running at 10Gbps (10GHz effective) instead of one that's 11Gbps (11GHz effective) and see if that works.
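A quick sanity check before flashing: which GDDR5X grade a card or BIOS targets can be estimated from the memory clock GPU-Z reports, since GDDR5X moves 8 bits per pin per command-clock cycle. The reference clocks used here (~1251 MHz for 10 Gbps, ~1376 MHz for 11 Gbps) are nominal figures that may vary by a few MHz per BIOS:

```python
# Estimate the effective GDDR5X data rate from the GPU-Z memory clock
# (8 transfers per command-clock cycle for GDDR5X).
def gbps_from_gpuz_clock(mhz: float) -> float:
    return round(mhz * 8 / 1000, 1)

# Launch 1080s read ~1251 MHz (10 Gbps grade); the refreshed parts
# read ~1376 MHz (11 Gbps grade). Exact figures differ slightly per BIOS.
launch_grade = gbps_from_gpuz_clock(1251)
refresh_grade = gbps_from_gpuz_clock(1376)
```

If the target BIOS's memory clock converts to 11 Gbps and your card reads 10 Gbps stock, the recovery-state symptom above is the likely outcome.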


----------



## PlugFour

This is a plausible explanation for the few cards failing to flash. BUT my last search shows that all the reports about capped GDDR5X start from the end of 2017 onward. For instance, my 1080 OEM (= FE) was built in Q3 2017. Furthermore, my 1080 accepts a very good overclock on the memory. So I'm not sure it's the right explanation.
By the way, the voltage mod didn't work; I tried twice. Coincidence?

I'm OK with flashing a 10Gbps BIOS, if someone can find one. I've already flashed at least 10 different BIOSes so far.


----------



## Monk83x

Hello guys. I have a ROG Strix GTX 1080 Advanced Edition and I want to flash the OC edition's BIOS onto my card. Is that possible?
I've tried different ways, but I failed.


----------



## ArturoH4L

Excuse me, I'm new to this, but will it be OK if I join the T4 bandwagon with my MSI GTX 1080 Sea Hawk X?

https://es.msi.com/Graphics-card/GeForce-GTX-1080-SEA-HAWK

It won't get bricked?

Thanks.


----------



## TwilightRavens

ArturoH4L said:


> Excuse me. im new in this, but it will be OK if i flash this T4 Bandwagon on my GTX MSI SEAHAWK X?
> 
> https://es.msi.com/Graphics-card/GeForce-GTX-1080-SEA-HAWK
> 
> it wont get bricked?
> 
> Thanks..


As long as they both have the 10GHz effective memory clock it should be alright.


----------



## Bride

Lovely Zotac BIOS, flashed on my EVGA... stable frequencies

Zotac GTX 1080 8GB (AMP Extreme Plus)

https://www.techpowerup.com/vgabios/194143/zotac-gtx1080-8192-170321


----------



## AlbertoM

Bride said:


> Lovely Zotac BIOS, flashed on my EVGA... stable frequencies
> 
> Zotac GTX 1080 8GB (AMP Extreme Plus)
> 
> https://www.techpowerup.com/vgabios/194143/zotac-gtx1080-8192-170321


I read that you have the 1080 FTW model... yeah, this Zotac BIOS should be better for that card, since it has 2 power connectors and more power phases than the FE model.

But for the FE I would still use the Palit BIOS, so there's no fear of burning anything on the card or the single power connector/cable.


----------



## technodanvan

*August 2019 Foldathon!*

Hello 1080 Owner's Club! 

I apologize for potentially derailing otherwise productive conversation, but this will only take a moment (yes this is a canned message I'm using in other clubs too, sorry)

You might have noticed the banner link to the upcoming Foldathon, August 2019 Edition. The OCN Folding team has been losing 24/7 membership for some time, and to make up for that we encourage other OCN users to jump in for a couple of days and donate a little bit of CPU and GPU time to help find cures for a variety of diseases, notably various cancers but also Alzheimer's and others. We'd really appreciate it if you would join us!

Link to the Foldathon is here: August 2019 Foldathon

Link to the Stanford Folding@home website is here: Folding@home Windows Download Page

I promise it's easy to set up! You don't need to worry about changing drivers or anything like that. Just do the following:

1a. If you don't already have an FAH passkey, create one here: Passkey Request Form

1b. If you haven't had a passkey before, you might want to run this for a day or two leading up to the Foldathon in order to qualify for bonus points.

2. Download/install the installer: Folding@home Windows Download Page

3. Once installed it'll autorun, enter your account and passkey you just created, and enter "37726" as your team number for OCN

4. Once you have it working, head to the Foldathon page (linked above) to register!

After that, you're pretty much done! If you don't want your CPU folding, you can either pause it individually or remove it completely in the advanced control: Configure -> Slots -> select CPU -> Remove -> Done.

This whole process will take less than ten minutes, I promise. Put that studly computer to work and help us out! You can always shut down or remove the FAH controller afterwards.


----------



## TwilightRavens

So I get to have fun with a real bottleneck for the next few months.

The reason is that my main rig is too bulky to use as a daily for now, so yes, bottleneck galore lol. But damn it, I bought a GTX 1080; I at least want to use it.


----------



## Bride

AlbertoM said:


> I read that you have the 1080 FTW model... Yeah this Zotac BIOS should be better for that card since it has 2 power connectors and more power phases than FE model.
> 
> But for FE i would still use the Palit BIOS so no fear of burning anything on the card or the single power connector/cable.


That's right. Actually, for me this BIOS is pushed a little bit, and it's giving me great results in games.


----------



## Hokies83

Been gone a while.
My GTX 1080 FE does 2125MHz boost with the power limit at 100% and +600MHz on the memory.
Is that a fairly good OC?

It would be nice if the voltage were unlocked, though...


----------



## TwilightRavens

Hokies83 said:


> Been gone awhile,
> My Gtx 1080 FE does 2125mhz boost with power limit at 100% and +600mhz on the memory,
> Is that fairly good OC?
> 
> It would be nice if the voltage was unlocked tho....


Yes, but as I and many others have tested, the memory on the older launch cards sees no benefit from going past +457MHz; performance actually degrades past that in almost everything. Source: *here*.


----------



## Hokies83

TwilightRavens said:


> Hokies83 said:
> 
> 
> 
> Been gone awhile,
> My Gtx 1080 FE does 2125mhz boost with power limit at 100% and +600mhz on the memory,
> Is that fairly good OC?
> 
> It would be nice if the voltage was unlocked tho....
> 
> 
> 
> Yes but as I and many others have tested, the memory on the older launch cards see's no benefit from going past +457MHz, performance actually degrades past that in almost everything, Source: * here*.
Click to expand...

Thanks for the info, I'll bump it down to +457MHz.


----------



## TwilightRavens

Hokies83 said:


> Thx for the info, I’ll bump it down to +457mhz


Yeah no problem.


----------



## Skye12977

I was wondering if anyone owned a 1080 Strix and had issues with one of the fans coming loose and thus making noise.
I can simply push the fan back up into place, but it's only a matter of time until it happens again...


----------



## TwilightRavens

Woo! Got 2126MHz stable (in Firestrike at least) at the highest performance state (1.093v); I finally learned how to use the voltage curve graph in Afterburner, and Pascal overclocking suddenly makes sense. I'll try and post some pics once I get the graph cleaned up (clocked higher at the lower states). So far she only drops to 2050 worst case; I might be able to do 2113 at the next state down (I think it's 1.081v, right?) and then 2100MHz at the state below that. As it is, my curve is kind of a mess, since I didn't adjust anything below 1.043v, and I'm not sure whether I should mess with the states lower than that (I haven't seen it go below 1.043v under actual load, though).
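The curve trick described above amounts to flattening the voltage/frequency curve at a chosen point, so the card never requests a higher voltage than the one you validated while every state at or above it runs your tested clock. A minimal sketch, with made-up curve points that aren't from any real card:

```python
# Sketch of flattening an Afterburner-style V/F curve: every point at or
# above the target voltage gets pinned to one validated clock. The stock
# curve values below are illustrative only.

def flatten_curve(curve: dict, target_v: float, clock_mhz: int) -> dict:
    """curve maps voltage (V) -> clock (MHz); returns a flattened copy."""
    return {v: (clock_mhz if v >= target_v else c) for v, c in curve.items()}

stock = {1.000: 1975, 1.043: 2012, 1.062: 2050, 1.081: 2088, 1.093: 2100}
locked = flatten_curve(stock, 1.093, 2126)  # pin the top state to 2126 MHz
```

Points below the target are left alone, which mirrors leaving the sub-1.043v states untouched: the card only sits there at light load, where the stock curve is already fine.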


----------



## derx

TwilightRavens said:


> Yes but as I and many others have tested, the memory on the older launch cards see's no benefit from going past +457MHz, performance actually degrades past that in almost everything, Source: * here*.


Darn... I missed this post big time. Too bad I don't really have time in the next couple of days, but I'm sure going to test this out. I think I'm at +700 on the mem at the moment, and I never even thought about going lower. I've been bumping the core to oblivion and got it stable for benching at around 2240 @ 1.08V. More voltage doesn't really give me any boost in clock speed, at least not stable enough to run any significant benchmark.
For gaming I'm down to around 2179. Above that I get the occasional CTD, and it's quite frustrating when that happens, so better a bit slower and stable. I do realise that I've got a heck of a 1080 (Gigabyte GTX 1080 G1 Gaming with an EK full-cover block), and 2170 stable under heavy gaming is quite rare.


----------



## TwilightRavens

derx said:


> Darn... I've missed this post bigtime. Too bad I don't really have time in the next couple of days, but I'm sure going to test this one out. Think I'm at +700 on the mem at the moment, and never even thought about going slower  I've been bumping the core to oblivion, and got it stable for benching at around 2240 @1.08V. More voltage doen't really give me any boost in clockspeed, at least not stable enough to run any significant benchmark.
> For gaming I'm down to around 2179. Above that I get the incidental C2D, and it's quite frustrating when that happens, so better a bit slower and stable. I do realise that I've got a heck of a 1080 (GB GTX1080 G1 Gaming with an EK full cover block), and 2170 stable under heavy gaming is quite rare.


Also take note that only applies to the 10GHz GDDR5X cards, the refreshed 11GHz chips (ones launched after the 1080 ti release) don’t have that same limit


----------



## AlbertoM

TwilightRavens said:


> Also take note that only applies to the 10GHz GDDR5X cards, the refreshed 11GHz chips (ones launched after the 1080 ti release) don’t have that same limit


For sure.

The point is, these cards have error detection on the VRAM, so if you overclock the VRAM past a limit it will retry transfers and bring FPS down, while you still won't see artifacts like on older cards.
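This is why a memory overclock sweep has to score every offset and keep the best one, rather than stop at the highest setting that doesn't artifact: past some point the error detection silently retries transfers and FPS falls. A sketch of the sweep-and-pick approach, with made-up benchmark scores:

```python
# Silent error-retry means "stable" and "fastest" diverge: score each
# memory offset with a benchmark run and keep the peak. The scores
# below are made-up illustration data, not real measurements.

def best_offset(results: dict) -> int:
    """results maps memory offset (MHz) -> benchmark score; returns the
    offset with the highest score."""
    return max(results, key=results.get)

scores = {0: 21500, 200: 21900, 400: 22100, 457: 22200, 600: 21800}
peak = best_offset(scores)  # the +600 run scores lower despite "working"
```

In practice each entry would come from a full benchmark pass at that offset, with thermals allowed to level off first so boost bins don't skew the comparison.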


----------



## TwilightRavens

AlbertoM said:


> For sure.
> 
> The point is, these cards have error correcting code to the VRAM, so if you overclock the VRAM more than a limit, it will bring FPS down, and you still will not see artifacts like older cards.


Yeah, I’ve heard people say the same thing even about the GDDR5 on 980 Tis. With the 290X I had before I upgraded to the 1080, you knew when you pushed the VRAM too far; it didn’t have the error detection/correction, and it looked like terrifying purple Space Invaders on top of whatever you were doing. I guess Nvidia started it with Maxwell, because neither of my 660s did that; they just outright crashed if I went too far, without even artifacting.


----------



## TwilightRavens

So I just ordered a new monitor (1080p 144Hz) a few days ago, and I figured I'd ask here before it arrives: would a 1080 be able to push those kinds of frames in a lot of games? I don't play a lot of super new, super demanding games; my most played are probably Forza Horizon 4, Final Fantasy XIV: A Realm Reborn, The Sims 4, Fallout 4 (I know the physics are tied to the FPS in that and Skyrim), Mass Effect: Andromeda, etc. I was just curious whether anyone was able to get well north of 100 fps in games similar to those. The CPU isn't really a bottleneck in my case; it's pretty much overclocked to the max my cooling allows (the only game that really pegs it on all 8 threads is Fallout 4).


----------



## MorrisDaddel

*ZOTAC GTX 1080 ArcticStorm 8GB*

Hi im new here and wanted to submit my results. i had extreme problems finding information about my card and so decided to try it anyway. here i post all my searches to make it easy to find for the next one.

ZOTAC GTX 1080 ArcticStorm 8GB Crossflash strix1080xoc_t4version2 known working custom BIOS remove power limit 

now the results  it is working wonderfull. my ZOTAC GTX 1080 ArcticStorm 8GB accepted the T4 BIOS without any problems i got around 100mhz more in terms of overclocking. the original bios was pretty near to the hardwarelimit anyway. i guess i could rise the voltages to get higher results but do not want to push it to far. the cooler is the original waterblock im using a custom loop (gpu, cpu and cpu power phases). the cpu is not really a golden dye so i get him bearly to 5 ghz. temperatures are as always ill include screenshots with all infos.

My delta temp is 13 degrees Celsius, from 31 C (idle) to 44 C (FurMark); it goes to 45 C or maybe 46 C when I add Prime95.
The overclock is stable at 2100 MHz core (+190) and 5500 MHz RAM (+500), with voltage at 1075 mV.

I let it heat up for like 30 minutes to make sure the water gets to working temperature.
I hope the next Arctic Storm owner will find this and have an easy upgrade without worrying about bricking the card.

Cheers,

MorrisDaddel


Ah, and I got 1st place now against the same CPU+GPU combination:
https://www.3dmark.com/newsearch#ad...lScore&hofMode=false&showInvalidResults=false


----------



## AlbertoM

Just a quick update.

After some time I realized that my FE 1080 with the Palit BIOS is more stable for gaming at 1.063 V, and I'm pushing 2164 MHz under water (EVGA Hybrid kit); after it reaches its max at 60-63 C it downclocks to 2138-2126 MHz minimum. That temp is because I set my push/pull fans to a very low speed, around 1000-1500 rpm max.

I was testing from 1.075 V to 1.081 V and was having some random crashes at both voltages, even at lower clocks (screen freezing on a random color, typical of a GPU crash). Then I decided to try 1.063 V and have been playing for like 6 months at my highest clock speed without a single crash.

And as this Palit BIOS (GameRock Premium) is limited to 276 W, I guess I even get more performance with the lower voltage, since the current can increase further before hitting the power cap.

I think what they say is true for this card: less voltage is always better if you can keep your temps in check.
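The voltage/power-cap trade-off above can be sketched with some quick napkin math. This is only an illustration, assuming the simplistic model P = V × I for core power (real boards have multiple rails and VRM losses); the 276 W figure comes from the post, everything else is made up for the example.

```python
# Back-of-envelope sketch (not a measurement): at a fixed board power cap,
# a lower core voltage leaves more current headroom before the limiter
# engages, so the card can hold a higher clock before power-throttling.
# Core power is treated as simply P = V * I, ignoring other rails/losses.

POWER_CAP_W = 276.0  # GameRock Premium BIOS limit mentioned above

def current_headroom_a(voltage_v: float, cap_w: float = POWER_CAP_W) -> float:
    """Max sustainable current (amps) before power throttling kicks in."""
    return cap_w / voltage_v

for v in (1.081, 1.075, 1.063):
    print(f"{v:.3f} V -> {current_headroom_a(v):.1f} A under the {POWER_CAP_W:.0f} W cap")
```

Dropping from 1.081 V to 1.063 V buys a few extra amps under the same cap, which is consistent with holding higher clocks longer at the lower voltage.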


----------



## TwilightRavens

Did a run on my new 3900X system and this thing is freaking awesome (figured I'd run it with the 1080 because my 1080 Ti's shipping is delayed by the COVID-19 crap going on), but holy hell, that physics score is absolutely mental.


----------



## TwilightRavens

So I finally made the jump to a 1080 Ti and it's pretty awesome, even if I did get a "less than ideal" partner model version.


----------



## adversary

Is the EK full waterblock (meaning it covers everything it needs to, including RAM and VRM) still available for the 1080 Ti Gigabyte Aorus?

I have tried to contact EK but still no reply. When I use the configurator on their site, it only offers me a block for the GPU chip, not the full waterblock. But I have seen on their site that a full waterblock exists for this card, and on eBay there is exactly the same product for sale.

I have never done watercooling before, so any help here is welcome.


----------



## TwilightRavens

adversary said:


> Is the EK full waterblock (meaning it covers everything it needs to, including RAM and VRM) still available for the 1080 Ti Gigabyte Aorus?
> 
> I have tried to contact EK but still no reply. When I use the configurator on their site, it only offers me a block for the GPU chip, not the full waterblock. But I have seen on their site that a full waterblock exists for this card, and on eBay there is exactly the same product for sale.
> 
> I have never done watercooling before, so any help here is welcome.



You might try performancepcs; I've had better luck with them sometimes.


----------



## adversary

TwilightRavens said:


> You might try performancepcs; I've had better luck with them sometimes.



Hey 

First, I know you from the Broadwell thread; we are both owners of that CPU and I was posting there too. Nice to see you here again 

But this company is from the USA, right? I'm in the EU and would like to avoid ordering parts from the USA for a few reasons.

Anyway, looking at their site, I can also see they use/sell water chillers. I have some interest in that option too (and I can get the same units in the EU, as far as I can tell). I'm not going extremely cold, to avoid condensation, but from my research so far it is superior to classic watercooling, right? And for a GPU, every degree C lower is worth it.

I just read a thread on the TechPowerUp forum; it seems one guy is using a Hailea HC-500A for both CPU and GPU with great results.

But on overclock.net someone said that these units are not made for continuous work. Because, as I said, I have never done watercooling before (but I now intend to build a very good custom system), I need every input about that. Does it mean you can't run it 24/7 (which I never do anyway), or that it can't run while the GPU is under load? Are such units capable of cooling down the water despite the GPU heating it during, for example, gaming?

As I understand it, you do not need a radiator and fans if you use a chiller, but you still need a water pump. Of course I would get a quality pump that is strong enough.

Now, am I going overkill for a 1080 Ti? Yes. But if I can pay for it, why not? I plan to keep the card for some more time, as it is enough for my needs now. I would like to have a superior cooling system ready for the Nvidia 3000 series and would switch to that later (especially if they bring a superior NVENC encoder like they did with the 2000 series). No rush; I never buy tech when it is just released. Once it matures, the initial problems are resolved (the 2080 Ti had some, right?), other people's experiences with the cards are out there so I can decide which top model to pick, and maybe some refresh comes out.. that is the right time for me to upgrade. Until then, let's boost this 1080 Ti to max power.

If it is better to open a new thread about this in Watercooling, let me know.

But I would really like to hear everybody's opinions.


----------



## TwilightRavens

adversary said:


> Hey
> 
> First, I know you from the Broadwell thread; we are both owners of that CPU and I was posting there too. Nice to see you here again
> 
> But this company is from the USA, right? I'm in the EU and would like to avoid ordering parts from the USA for a few reasons.
> 
> Anyway, looking at their site, I can also see they use/sell water chillers. I have some interest in that option too (and I can get the same units in the EU, as far as I can tell). I'm not going extremely cold, to avoid condensation, but from my research so far it is superior to classic watercooling, right? And for a GPU, every degree C lower is worth it.
> 
> I just read a thread on the TechPowerUp forum; it seems one guy is using a Hailea HC-500A for both CPU and GPU with great results.
> 
> But on overclock.net someone said that these units are not made for continuous work. Because, as I said, I have never done watercooling before (but I now intend to build a very good custom system), I need every input about that. Does it mean you can't run it 24/7 (which I never do anyway), or that it can't run while the GPU is under load? Are such units capable of cooling down the water despite the GPU heating it during, for example, gaming?
> 
> As I understand it, you do not need a radiator and fans if you use a chiller, but you still need a water pump. Of course I would get a quality pump that is strong enough.
> 
> Now, am I going overkill for a 1080 Ti? Yes. But if I can pay for it, why not? I plan to keep the card for some more time, as it is enough for my needs now. I would like to have a superior cooling system ready for the Nvidia 3000 series and would switch to that later (especially if they bring a superior NVENC encoder like they did with the 2000 series). No rush; I never buy tech when it is just released. Once it matures, the initial problems are resolved (the 2080 Ti had some, right?), other people's experiences with the cards are out there so I can decide which top model to pick, and maybe some refresh comes out.. that is the right time for me to upgrade. Until then, let's boost this 1080 Ti to max power.
> 
> If it is better to open a new thread about this in Watercooling, let me know.
> 
> But I would really like to hear everybody's opinions.



As of a few months ago I am a “former Broadwell user” lol, because my board finally died; I ended up jumping on the Ryzen train with X570/3900X because Z97 boards on the used market (at least here in the US) are going for stupid prices. And around the same time I became a former GTX 1080 user, since I upgraded to a 1080 Ti and gave my wife my 1080. But this should still apply because it's still a Pascal-based GPU.

If I remember correctly, performancepcs has an EU branch, but I could be thinking of another company. Either way, that's the only other one I know of that would still have stock.

You won’t really need to go all out with a chiller; if you just have enough radiators, a loop should be able to cool a 1080 down to near ambient on its own. Hell, when I bought my 1080 Ti Armor OC it was hitting 81 C under load and I was considering water, but for the time being I ended up going with an Arctic Accelero III, and worst-case temps go up to 55 C give or take, which is cool enough for me to sustain 2.1 GHz without it dropping much below that.
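For the "enough radiators" part, a rough sizing sketch helps. This assumes the common community rule of thumb (not a vendor spec) that a 120 mm radiator section sheds very roughly 100 W per 10 C of water-over-ambient delta at moderate fan speed, scaling about linearly; the 250 W heat load is just an illustrative figure for a heavily overclocked card.

```python
# Back-of-envelope radiator sizing, NOT a manufacturer spec.
# Assumed rule of thumb: one 120 mm radiator section dissipates roughly
# 100 W per 10 C of water-over-ambient delta, scaling ~linearly.

W_PER_120MM_PER_10C = 100.0  # assumption; varies a lot with rad thickness/fans

def water_delta_c(heat_load_w: float, rad_120mm_sections: int) -> float:
    """Estimated water temperature rise over ambient, in degrees C."""
    capacity_per_10c = rad_120mm_sections * W_PER_120MM_PER_10C
    return 10.0 * heat_load_w / capacity_per_10c

# e.g. an illustrative ~250 W card on a 360 mm radiator (3 sections):
print(f"{water_delta_c(250, 3):.1f} C over ambient")
```

By this estimate a single 360 rad keeps the water within single digits of ambient under load, which matches the "near ambient with enough radiators" point above.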

I've got a friend who could probably help you more with Pascal temps, as he has a pretty overkill loop. @kithylin, lend us your wisdom.


----------



## adversary

Under heavy load, I doubt you can keep a 1080 Ti OCed (and even worse, with an XOC BIOS and the voltage bumped a little) at ambient temperatures. That would mean there was almost no difference between idle and load temperatures, which would be impossible.

A chiller can go even lower. You have to pay attention to how low you can go because of condensation.
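On the "how low before condensation" question, the usual safe floor is the dew point of the room air. A quick sketch using the standard Magnus approximation (the b = 17.62, c = 243.12 C parameterization; the 24 C / 50% RH room is just an example):

```python
import math

# Dew-point check before chilling a loop: keep the coolant (and any cold
# tubing/fittings) above this temperature or condensation will form.
# Uses the Magnus approximation with the common b=17.62, c=243.12 constants.

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (degrees C) for a given air temp and RH%."""
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return (c * gamma) / (b - gamma)

# e.g. a 24 C room at 50% RH -> coolant should stay above roughly 13 C
print(f"dew point: {dew_point_c(24.0, 50.0):.1f} C")
```

So in a typical room a chiller set a few degrees above the computed dew point gets most of the benefit with no condensation risk; going below it needs insulation.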

I do not know if it is allowed to post links, but I will, because there is interesting info about exactly this.

The first one is from a chiller user:

https://www.techpowerup.com/forums/threads/loving-new-water-chiller.228975/

This is a review of the HC-500A chiller:

https://bit-tech.net/reviews/tech/cooling/hailea-hc-500a-water-chiller-review/3/


In both you can see temps. Lower is always better for GPU OC.


----------



## TwilightRavens

adversary said:


> Under heavy load, I doubt you can keep a 1080 Ti OCed (and even worse, with an XOC BIOS and the voltage bumped a little) at ambient temperatures. That would mean there was almost no difference between idle and load temperatures, which would be impossible.
> 
> A chiller can go even lower. You have to pay attention to how low you can go because of condensation.
> 
> I do not know if it is allowed to post links, but I will, because there is interesting info about exactly this.
> 
> The first one is from a chiller user:
> 
> https://www.techpowerup.com/forums/threads/loving-new-water-chiller.228975/
> 
> This is a review of the HC-500A chiller:
> 
> https://bit-tech.net/reviews/tech/cooling/hailea-hc-500a-water-chiller-review/3/
> 
> In both you can see the temps. Lower is always better for GPU OC.



At 4K resolution, yeah, I hit a power limit and it will throttle back to 2037 MHz, but I play at 1080p, so 4K isn't a realistic scenario for me.


----------



## adversary

I will open a thread in Watercooling to discuss my plan in every detail before I start ordering parts.

But I can ask one more question here.

This block:

https://www.ekwb.com/shop/ek-fc1080-gtx-ti-aorus-rgb-nickel

Have any of you used this on your Gigabyte Aorus 1080 Ti?

Can I be sure it will fit mine?

And somewhere I read that "it doesn't have contact with and doesn't cool everything that needs it; some of the MOSFETs are left without contact", or something like that. Is that true? I would like to cool every part that benefits from cooling. If the EK block does not cool something, is that just a matter of thermal pads not being placed there? If so, can I add thermal pads on my own to cover the parts I'd also like cooled?

Thanks!


----------

